• [MYSQL] SQL Error: 1062 duplicate entry

    From J.O. Aho@21:1/5 to All on Thu Dec 5 22:32:46 2019
    On 05/12/2019 22.02, ^Bart wrote:

    > Like I said above, I started with an empty table and began importing
    > data from a *.csv in phpMyAdmin, and I immediately got the error, so
    > the import didn't work...

    If I recall correctly, phpMyAdmin uses transactions with InnoDB
    tables, so if an error occurs while importing, it will roll back the
    transaction and thus the table will be empty when you look at it.

    I do suggest you take a look at your csv file; I'm quite sure you
    will find two entries with recipe_id=5 and subfoodgroup_id=413.

    I suggest you load it in LibreOffice and sort it first on the recipe_id
    column and then on the subfoodgroup_id column; you will easily find
    your duplicates.
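
    The same duplicate hunt can also be done from a shell; a minimal
    sketch, assuming the file is called recipes.csv (a hypothetical name)
    and the two id columns are the first two fields:

    ```shell
    # print every (recipe_id, subfoodgroup_id) pair that occurs more than once
    cut -d, -f1,2 recipes.csv | sort | uniq -d
    ```

    Any pair this prints violates the table's unique key, so it is a line
    the import will choke on.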

    --

    //Aho

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From ^Bart@21:1/5 to All on Thu Dec 5 23:05:33 2019
    > If I recall correctly, phpMyAdmin uses transactions with InnoDB
    > tables, so if an error occurs while importing, it will roll back the
    > transaction and thus the table will be empty when you look at it.

    If I force the error I can insert a single record, but it doesn't work
    with the *.csv import... :\

    > I do suggest you take a look at your csv file; I'm quite sure you
    > will find two entries with recipe_id=5 and subfoodgroup_id=413.
    >
    > I suggest you load it in LibreOffice and sort it first on the recipe_id
    > column and then on the subfoodgroup_id column; you will easily find
    > your duplicates.

    My *.csv file:

    ,,,,,,,,,,,,,,,,,,1,52,Show
    ,,,,,,,,,,,,,,,,,,1,54,Show
    ,,,,,,,,,,,,,,,,,,1,35,Show
    ,,,,,,,,,,,,,,,,,,1,37,Hide
    ,,,,,,,,,,,,,,,,,,1,39,Hide
    ,,,,,,,,,,,,,,,,,,1,34,Show

    I don't know why I have a lot of ,,,,, I happened just with this file, I
    just tried with:

    1,52,Show
    1,54,Show
    1,35,Show
    1,37,Hide
    1,39,Hide
    1,34,Show

    It works (I tried it in the import area of phpMyAdmin); maybe there's
    something wrong with my *.ods file or with how I save it as *.csv.
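
    One quick way to check whether an export added stray empty columns is
    to count the fields per line; a sketch, with export.csv as a stand-in
    name for the exported file:

    ```shell
    # print the distinct comma-separated field counts in the file;
    # a clean three-column csv prints only "3"
    awk -F, '{ print NF }' export.csv | sort -un
    ```

    If more than one number comes out, some lines carry extra (empty)
    columns from the spreadsheet.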

    Regards.
    ^Bart

  • From ^Bart@21:1/5 to All on Thu Dec 5 23:07:35 2019
    > I don't know why I have a lot of ,,,,, I happened just with this file, I
    > just tried with:

    "It happened" I'm sorry but it's quite late and... I'm very tired! :)

    ^Bart

  • From J.O. Aho@21:1/5 to All on Fri Dec 6 06:53:37 2019
    On 05/12/2019 23.05, ^Bart wrote:
    >> If I recall correctly, phpMyAdmin uses transactions with InnoDB
    >> tables, so if an error occurs while importing, it will roll back the
    >> transaction and thus the table will be empty when you look at it.

    > If I force the error I can insert a single record, but it doesn't
    > work with the *.csv import... :\

    >> I do suggest you take a look at your csv file; I'm quite sure you
    >> will find two entries with recipe_id=5 and subfoodgroup_id=413.
    >>
    >> I suggest you load it in LibreOffice and sort it first on the recipe_id
    >> column and then on the subfoodgroup_id column; you will easily find
    >> your duplicates.

    > My *.csv file:
    >
    > ,,,,,,,,,,,,,,,,,,1,52,Show
    > ,,,,,,,,,,,,,,,,,,1,54,Show
    > ,,,,,,,,,,,,,,,,,,1,35,Show
    > ,,,,,,,,,,,,,,,,,,1,37,Hide
    > ,,,,,,,,,,,,,,,,,,1,39,Hide
    > ,,,,,,,,,,,,,,,,,,1,34,Show

    sed 's/^,*\([0-9].*\)/\1/' -i faultyfile.csv

    this will remove the leading commas

    if you want to remove the duplicate lines, you can run

    sort -u faultyfile.csv > noduplicates.csv

    Of course, this is how you do it in the world's most used OS or a
    Unix; if you happen to have that Redmond-based company's OS, you need
    to install a lot of extra stuff.
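
    The two steps can also be combined into one pipeline; a sketch with
    the pattern anchored to the start of the line, so only the leading
    commas are touched:

    ```shell
    # strip leading commas, then drop exact duplicate lines in one pass
    sed 's/^,*//' faultyfile.csv | sort -u > noduplicates.csv
    ```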

    > I don't know why I have a lot of ,,,,, I happened just with this file, I
    > just tried with:
    >
    > 1,52,Show
    > 1,54,Show
    > 1,35,Show
    > 1,37,Hide
    > 1,39,Hide
    > 1,34,Show
    >
    > It works (I tried it in the import area of phpMyAdmin); maybe there's
    > something wrong with my *.ods file or with how I save it as *.csv.

    Then you solved your issue.

    --

    //Aho

  • From ^Bart@21:1/5 to thanks to on Sat Dec 7 10:24:39 2019
    > sed 's/^,*\([0-9].*\)/\1/' -i faultyfile.csv
    >
    > this will remove the leading commas
    >
    > if you want to remove the duplicate lines, you can run
    >
    > sort -u faultyfile.csv > noduplicates.csv
    >
    > Of course, this is how you do it in the world's most used OS or a
    > Unix; if you happen to have that Redmond-based company's OS, you need
    > to install a lot of extra stuff.

    I'm a Debian user; I didn't know the sed command, thanks for writing it! :)

    > Then you solved your issue.

    In my old *.ods file I tried selecting all cells and clearing the
    formatting to get a default layout, but in the end I solved it by
    making a new *.ods file, and after that the *.csv export came out right.

    Thanks for all your replies! :)

    ^Bart
