
Okay, I've been banging my head against this one.

I'll try to keep things short; if you need more info, don't hesitate to ask.

I've written an import repository for an external firm, so we can import their data into our service.

Quick overview of the implemented logic:

FTP in, grab the XML file, parse it with SimpleXML, and do the database work using Laravel's Eloquent component.
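For reference, a minimal, self-contained sketch of the parse step, assuming a made-up feed structure (`<records>`/`<record>`/`<id>`/`<name>` are placeholders, not the real field names; the real script hands each row to an Eloquent model instead of collecting it in an array):

```php
<?php
// Sketch of the SimpleXML parse step described above. The XML structure
// here is an assumption standing in for the real feed.
$xml = simplexml_load_string(
    '<records>'
    . '<record><id>1</id><name>first</name></record>'
    . '<record><id>2</id><name>second</name></record>'
    . '</records>'
);

$rows = array();
foreach ($xml->record as $record) {
    // Cast SimpleXMLElement values to plain strings before handing them
    // to the database layer.
    $rows[] = array('id' => (string) $record->id, 'name' => (string) $record->name);
}
```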

On my dev machine, every run parses fully and all data is inserted correctly into the database.

Problem

When I try the same thing on my production server, I get a duplicate entry error, always on the exact same record (unless I use another file).

Pre-script setup to help detect the error

On each run I do the following:

  • Make sure I'm using the exact same files in both the dev and prod environments (I've disabled the FTP grab and uploaded manually to the correct location).
  • Truncate all the related tables, so I'm always starting with empty tables.
  • Manually triple-check the XML for duplicates; there are none, and the fact that my dev machine parses the file correctly confirms this.
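The manual duplicate check in the last step can be scripted. A sketch, using stand-in key values (the real keys would come from the parsed XML):

```php
<?php
// Count every key from the parsed XML and flag any that occurs more than
// once. The values here are stand-ins for the real record keys.
$ids = array('r1', 'r2', 'r3', 'r2');

$dupes = array();
foreach (array_count_values($ids) as $id => $count) {
    if ($count > 1) {
        $dupes[] = $id;
    }
}
// An empty $dupes array means the file itself contains no duplicate keys.
```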

What I tried

At this point, I've got no more clues as to how to debug this properly.

By now I've checked so many things (most of which I can't even remember), all of which seemed pretty unrelated to me, but I had to try them. Those include:

  • Automatic disconnects due to the browser
  • MySQL wait timeouts
  • PHP script timeouts
  • Memory settings

None of them helped (which is exactly what I expected).

Another fact

The PHP version on my dev machine is 5.4.4 and the version on the production server is 5.3.2. (I know this is bad practice, but I'm not using any new features; it's really dead-easy code, though it has quite a few lines :-) )

I've been suspecting this to be the cause, but:

I've now switched to 5.3.14 on my dev machine, and the import still runs without an issue.

The changes from 5.3.2 to 5.3.14 are probably pretty minor.

I've tried to manually compile the exact same PHP version, but I'm too inexperienced to do this properly. Moreover, it probably wouldn't have the exact same build configuration anyway (I think it's pretty much impossible to ./configure identically across macOS and Ubuntu, especially for a noob like me), so I've abandoned this path.

I've tried to find the differences between the PHP versions, but I can't seem to stumble upon anything that might be the cause of all this. There was a change related to non-numeric keys in arrays (or strings, for that matter) in version 5.4.4, I think, but since I've now concluded that 5.3.14 also works, this definitely isn't the issue. (Looking around nervously, hoping I haven't said anything downright stupid.)

Quick thought while writing this:

The thing is, even though I'm getting the duplicate entry error, the record did get inserted into the database. Moreover, the error triggers after about 2700 of the 6000 total records have been processed, yet the data bound to the failing query is actually the data of the second record in the XML file.
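One way to pin that observation down would be to wrap the insert loop so that the failing record index and the bound values get captured together when the duplicate-key error fires. A hedged sketch (SQLite stands in for MySQL here, and the table and column names are hypothetical):

```php
<?php
// Capture exactly where the insert loop blows up and with which bound value.
// SQLite stands in for MySQL; table/column names are hypothetical.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE records (external_id TEXT PRIMARY KEY)');

$stmt = $pdo->prepare('INSERT INTO records (external_id) VALUES (?)');
$batch = array('x1', 'x2', 'x1'); // the third value re-uses the first key
$failedAt = null;
$failedValue = null;

foreach ($batch as $i => $externalId) {
    try {
        $stmt->execute(array($externalId));
    } catch (PDOException $e) {
        // Log both the loop index and the bound value: if the value belongs
        // to an early record while $i is around 2700, the statement is being
        // re-executed with stale bindings somewhere.
        $failedAt = $i;
        $failedValue = $externalId;
        break;
    }
}
```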

I'm sincerely hoping someone can put me on the right track with this issue :( If you made it this far but don't have a clue what's going on, thanks for reading and sticking with it.

If you have a clue, please enlighten me!

Bodybag
  • I had the same problem, and just added 'ON DUPLICATE KEY UPDATE' to my insert script, and the problem was gone. – YamahaSY Feb 10 '14 at 15:16
  • Or you can just use INSERT IGNORE; almost the same thing, but without overwriting values. – YamahaSY Feb 10 '14 at 15:18
  • Did you perhaps miss the part stating that I have a completely empty table before running the script? I don't see how that advice would help; I'm trying to solve this rather than suppress an error. The thing is, the error shouldn't be there, as there are no duplicates... or am I missing something? – Bodybag Feb 10 '14 at 15:25
  • I mean that I had the same problem with an empty table, just importing from XML. Sometimes (I don't know why) the script tried to insert an already inserted value (a row with the same id). When you add 'INSERT IGNORE' or 'ON DUPLICATE KEY UPDATE' it will skip or overwrite that row and the script will insert all values. – YamahaSY Feb 10 '14 at 15:40
  • Do you perhaps have any documentation suggesting this might be SimpleXML related? At least, that's what I understand from what you're saying :-) – Bodybag Feb 10 '14 at 15:41
  • I didn't find any explanation of that problem; for the MySQL query changes you can take a look at http://www.electrictoolbox.com/mysql-insert-ignore/ and http://stackoverflow.com/questions/14383503/on-duplicate-key-update-same-as-insert – YamahaSY Feb 10 '14 at 15:50

0 Answers