@traq You’re going to kill me, I’m sure. This is what happened:
I wanted to add an article to my readability feed, but my page didn’t update (I’m using Ajax), even after refreshing. So I checked the input.php file that contains the INSERT statement, and that’s when I found the following error:
failed to INSERT: Duplicate entry 'Sneak preview: Syncing fonts to your desktop-http://blog.typekit' for key 'PRIMARY'
But this isn’t the article I tried adding. Remember when we talked about how readability only saves the last 15 articles? Well, the article in the error is the first one of the next batch of 15 — so basically the “first”.
To test this, I dropped the table and re-imported it. I refreshed input.php (the file with the INSERT code) and no error showed up. But when I checked the database, it was full of duplicates: it grabs my whole feed and re-inserts it. I’m not sure whether that’s supposed to happen when refreshing a page that runs an INSERT, or why it didn’t throw an error like the one above. I’m really lost.
as far as duplicates go, are you sure you recreated the table from the correct definition (with the primary key constraint)? Are the records _exactly_ identical (no extra whitespace on one, for example)?
>as far as duplicates go, are you sure you recreated the table from the correct definition (with the primary key constraint)?
When I dropped the table, I used the SQL statement from the backup.
```sql
CREATE TABLE IF NOT EXISTS `read` (
  `order` int(11) NOT NULL AUTO_INCREMENT,
  `title` varchar(255) NOT NULL,
  `url` varchar(255) NOT NULL,
  PRIMARY KEY (`title`,`url`),
  UNIQUE KEY `order` (`order`)
) DEFAULT CHARSET=utf8;
```
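With that composite primary key on (`title`, `url`), MySQL should refuse a second row with the same title/URL pair. A minimal sketch of the expected behavior (the values here are made-up examples, not the actual feed data):

```sql
-- First insert succeeds; `order` auto-increments:
INSERT INTO `read` (`title`, `url`)
VALUES ('Example Article', 'http://example.com/article');

-- Re-running the exact same statement hits the composite primary key,
-- producing an error in the same shape as the one above:
-- ERROR 1062 (23000): Duplicate entry 'Example Article-http://example.com/article' for key 'PRIMARY'
```

Note how MySQL reports a composite-key duplicate as the column values joined with a hyphen, which matches the `title-url` shape of the error message quoted earlier.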
>Are the records exactly identical (no extra whitespace on one, for example)?
Clicking through the records and also checking the SQL file, I’m not seeing any whitespace issues.
Take a look at this screenshot. Whenever I add a new article, the previous article at the bottom of that var_dump throws a duplicate error. So in this case, the ‘Futuristic Apartment’ article throws the error. If I add a new article to the feed, ‘Best Dorm Room Ever’ will throw the error next, and so on.
> put the IGNORE back in there. That way, MySQL will simply skip duplicate entries, rather than complain about them and abort.
>>(I think this will solve most of your current problems.)
Done, and that solved it. I also tried `REPLACE INTO` before you commented, and that worked too. I read on Stack Overflow that it’s not a good idea to use `INSERT IGNORE`; instead I should use `ON DUPLICATE KEY UPDATE` or something.
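For anyone comparing the three options being discussed, here’s a rough sketch (the article title is taken from this thread, but the URL is just a placeholder):

```sql
-- 1. INSERT IGNORE: silently skip any row that would collide
--    with an existing (title, url) pair:
INSERT IGNORE INTO `read` (`title`, `url`)
VALUES ('Futuristic Apartment', 'http://example.com/futuristic-apartment');

-- 2. REPLACE INTO: delete the colliding row, then insert the new one.
--    Note this assigns a fresh AUTO_INCREMENT value to `order`,
--    so the row's position changes:
REPLACE INTO `read` (`title`, `url`)
VALUES ('Futuristic Apartment', 'http://example.com/futuristic-apartment');

-- 3. ON DUPLICATE KEY UPDATE: keep the existing row, but update
--    chosen columns when a collision occurs:
INSERT INTO `read` (`title`, `url`)
VALUES ('Futuristic Apartment', 'http://example.com/futuristic-apartment')
ON DUPLICATE KEY UPDATE `url` = VALUES(`url`);
```

The practical difference is what happens to the existing row: `IGNORE` leaves it alone, `REPLACE` destroys and recreates it, and `ON DUPLICATE KEY UPDATE` modifies it in place.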
>I did notice that the encoding screwiness issue seems to be back. Make sure you’re doing `mysqli->set_charset('utf8');`.
I have `$DB->set_charset('utf8');` set. What you’re seeing is what the XML feed shows. When it’s inserted into the database, everything converts normally.
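For reference, you can verify what the connection is actually using from the SQL side (assuming a live MySQL session on the same connection):

```sql
-- Inspect the connection's effective character sets:
SHOW VARIABLES LIKE 'character_set%';
-- After a successful set_charset('utf8') call, character_set_client,
-- character_set_connection, and character_set_results should all be utf8.
```

Those three variables are the ones `mysqli::set_charset()` affects, so if any of them still shows latin1, the charset call isn’t being applied on the connection doing the inserts.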
>Done, and that solved it. I also tried REPLACE INTO before you commented, and that worked too. I read on Stack Overflow that it’s not a good idea to use INSERT IGNORE; instead I should use ON DUPLICATE KEY UPDATE or something.
It depends on what you’re trying to accomplish. In this case, I think `IGNORE` is the most appropriate option – there’s no point in spending the overhead on replacing/updating records when you _know_ what’s causing the error (and you _know_ that the new data is no more or less “correct” than the old).
>What you’re seeing is what the XML feed shows. When it’s inserted into the database, everything converts normally.