Forums

The forums ran from 2008-2020 and are now closed and viewable here as an archive.

Back End: traq – Duplicate Key Entry Error [1062]

Viewing 3 posts - 1 through 3 (of 3 total)
  • #46421
    chrisburton
    Participant

    @traq You’re going to kill me, I’m sure. This is what happened:

    I tried to add an article to my readability feed and my page didn’t update (I’m using Ajax); I even tried refreshing. So I checked the input.php file that contains the INSERT statement. That’s when I found the following error:

    failed to INSERT: [1062] Duplicate entry 'Sneak preview: Syncing fonts to your desktop-http://blog.typekit' for key 'PRIMARY'

    But this isn’t the article I tried adding. Remember when we talked about how readability only saves the last 15 articles? Well, the article in the error is the beginning of the next 15. So basically the “first”.

    To test this, I dropped the table and re-imported it. I refreshed the input.php file that has the INSERT code, and no error showed up. So I checked my database, and now I’m getting a bunch of duplicates. It grabs my whole feed and re-inserts it. I’m not sure if it’s supposed to do that when refreshing a page with an INSERT, or why it didn’t throw an error like the one above. I’m really lost.
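
    For anyone reading along, a quick way to check both symptoms described here: SHOW CREATE TABLE confirms whether the re-imported table kept its primary key, and a GROUP BY query lists any duplicate rows (table and column names are taken from the CREATE TABLE statement later in the thread):

    SHOW CREATE TABLE `read`;
    -- If the primary key survived the re-import, duplicate (title, url) pairs cannot exist.
    -- Any rows returned here mean the constraint was lost when the table was recreated.
    SELECT `title`, `url`, COUNT(*) AS copies
    FROM `read`
    GROUP BY `title`, `url`
    HAVING COUNT(*) > 1;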

    #142995
    chrisburton
    Participant

    @traq

    >are you still using INSERT IGNORE?

    Nope. I’m just using INSERT INTO.

    $SQL = "INSERT INTO `read` (`title`,`url`) VALUES\n " . implode( "\n,", array_reverse( $sql_values ) );
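
    $sql_values itself isn’t shown in the thread; here is a minimal sketch of how such an array might be built, assuming a mysqli connection in $DB and an array of parsed feed entries in $feed_items (both names are placeholders):

    // Hypothetical: build one quoted "(title, url)" tuple per feed entry,
    // escaping each value through the mysqli connection before concatenating.
    $sql_values = array();
    foreach ( $feed_items as $item ) {
        $title = $DB->real_escape_string( $item['title'] );
        $url   = $DB->real_escape_string( $item['url'] );
        $sql_values[] = "('{$title}','{$url}')";
    }
    $SQL = "INSERT INTO `read` (`title`,`url`) VALUES\n " . implode( "\n,", array_reverse( $sql_values ) );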

    >as far as duplicates go, are you sure you recreated the table from the correct definition (with the primary key constraint)?

    When I dropped the table, I used the SQL statement from the backup.

    CREATE TABLE IF NOT EXISTS `read` (
    `order` int(11) NOT NULL AUTO_INCREMENT,
    `title` varchar(255) NOT NULL,
    `url` varchar(255) NOT NULL,
    PRIMARY KEY (`title`,`url`),
    UNIQUE KEY `order` (`order`)
    ) DEFAULT CHARSET=utf8;
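
    With that definition, MySQL reports duplicate-key violations on the compound primary key by joining both column values with a hyphen, which is exactly the shape of the error quoted above (the values below are made up):

    INSERT INTO `read` (`title`,`url`) VALUES ('Example title','http://example.com/post');
    INSERT INTO `read` (`title`,`url`) VALUES ('Example title','http://example.com/post');
    -- ERROR 1062 (23000): Duplicate entry 'Example title-http://example.com/post' for key 'PRIMARY'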

    >Are the records exactly identical (no extra whitespace on one, for example)?

    Clicking through the records and also checking the sql file, I’m not seeing any whitespace issues.
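
    If whitespace were the culprit, a TRIM comparison would surface any rows that differ only by leading or trailing spaces (a sketch against the same table):

    SELECT `order`, `title`, `url`
    FROM `read`
    WHERE `title` <> TRIM(`title`) OR `url` <> TRIM(`url`);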

    Take a look at this screenshot. Whenever I add a new article, whichever previous article is at the bottom of that var_dump throws a duplicate error. So in this case, the ‘Futuristic Apartment’ article throws the error. If I add a new article to the feed, ‘Best Dorm Room Ever’ will throw an error, and so on.

    #143021
    chrisburton
    Participant

    @traq

    > put the IGNORE back in there. That way, MySQL will simply skip duplicate entries, rather than complain about them and abort.

    >>(I think this will solve most of your current problems.)

    Done, and that solved it. I also tried `REPLACE INTO` before you commented, and that worked as well. I read on Stack Overflow that it’s not a good idea to use `INSERT IGNORE`; instead, I should use `ON DUPLICATE KEY UPDATE` or something.
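
    For comparison, the three approaches mentioned behave differently against the table’s compound primary key (the values below are placeholders): INSERT IGNORE silently skips the conflicting row, REPLACE INTO deletes it and inserts a fresh one (so the AUTO_INCREMENT `order` value changes), and ON DUPLICATE KEY UPDATE keeps the existing row and updates only the listed columns.

    INSERT IGNORE INTO `read` (`title`,`url`)
    VALUES ('Example title','http://example.com/post');

    REPLACE INTO `read` (`title`,`url`)
    VALUES ('Example title','http://example.com/post');

    -- A no-op update is a common way to get "insert or do nothing"
    -- without suppressing unrelated errors the way IGNORE does.
    INSERT INTO `read` (`title`,`url`)
    VALUES ('Example title','http://example.com/post')
    ON DUPLICATE KEY UPDATE `title` = VALUES(`title`);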

    >I did notice that the encoding screwiness issue seems to be back. Make sure you’re doing `mysqli->set_charset( 'UTF8' );`.

    I have `$DB->set_charset( 'utf8' );` set. What you’re seeing is what the XML feed shows. When it’s inserted into the database, everything converts normally.
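
    For completeness, set_charset() is normally called right after the connection is opened; the connection details here are placeholders:

    // Placeholder credentials -- only the set_charset() call appears in the thread.
    $DB = new mysqli( 'localhost', 'db_user', 'db_password', 'db_name' );
    $DB->set_charset( 'utf8' );   // matches the table's DEFAULT CHARSET=utf8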

    Thank you!

  • The forum ‘Back End’ is closed to new topics and replies.