CSV Import Listing Duplication

This topic contains 17 replies, has 2 voices, and was last updated by Stiofan O’Connor 7 years, 1 month ago.

    #377192

    Alistair Williamson
    Full Member
    Post count: 415

    Hi Stiofan,

    Major problem with GD duplicating listings on import.

    Please advise.

    Thanks

    Al

    #377194

    Alistair Williamson
    Full Member
    Post count: 415
    This reply has been marked as private.
    #377225

    Stiofan O’Connor
    Site Admin
    Post count: 22956

    If you import a listing with no post ID it will be created; if you import the same listing again with no ID it will be imported again as a new listing. My guess is that this is what happened.

    There are a few ways to delete the duplicates, but you would need some unique aspect to match on. For example, are all titles 100% unique apart from the duplicates?

    Stiofan

    #377229

    Alistair Williamson
    Full Member
    Post count: 415
    This reply has been marked as private.
    #377230

    Alistair Williamson
    Full Member
    Post count: 415
    This reply has been marked as private.
    #377234

    Stiofan O’Connor
    Site Admin
    Post count: 22956

    You would have to run a query that joins the two tables and returns only the listings that share the same key, then do a foreach over the results to delete them the proper way.

    Stiofan
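
    A minimal sketch of what that could look like, for anyone comfortable running a PHP snippet (for example via a temporary admin-only page). It assumes the duplicates share a unique key stored in a column called businesskey in wp_geodir_gd_place_detail; the table and column names are assumptions and would need adjusting to your own setup. It keeps the oldest listing in each group and deletes the rest with wp_delete_post() so the normal WordPress delete hooks run. Test on a backup first.

    <?php
    global $wpdb;

    // Group listings by the shared key and collect the post IDs in each group.
    $rows = $wpdb->get_results(
        "SELECT businesskey, GROUP_CONCAT(post_id ORDER BY post_id) AS ids
         FROM {$wpdb->prefix}geodir_gd_place_detail
         GROUP BY businesskey
         HAVING COUNT(*) > 1"
    );

    foreach ( $rows as $row ) {
        $ids = array_map( 'intval', explode( ',', $row->ids ) );
        array_shift( $ids ); // keep the first (lowest ID) listing in each group

        foreach ( $ids as $post_id ) {
            // Deleting via wp_delete_post() fires the usual hooks, which lets
            // GD clean up its own detail-table rows and attachments.
            wp_delete_post( $post_id, true ); // true = skip the bin
        }
    }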

    #377236

    Alistair Williamson
    Full Member
    Post count: 415

    How do I run a query in GD?

    Al

    #377239

    Stiofan O’Connor
    Site Admin
    Post count: 22956

    There is no easy way to do this, as you need to know which ones to delete. I don’t want to give you a query that deletes things only for it to turn out they were the wrong ones.

    A query like this should identify the duplicates:

    SELECT GROUP_CONCAT(post_id), post_title, COUNT(*) AS dup_count FROM wp_geodir_gd_place_detail GROUP BY post_title HAVING dup_count > 1;

    From there you could do a few things but only a dev could write something to do it automatically.

    Stiofan
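
    As a rough illustration, that identification query could first be run as a dry run from a small PHP snippet, so the duplicate groups can be reviewed before anything is deleted. Nothing below deletes anything; the table name and title-based grouping are taken from the query above, and the first ID in each group would normally be the one to keep.

    <?php
    global $wpdb;

    $groups = $wpdb->get_results(
        "SELECT post_title, GROUP_CONCAT(post_id ORDER BY post_id) AS ids, COUNT(*) AS dup_count
         FROM {$wpdb->prefix}geodir_gd_place_detail
         GROUP BY post_title
         HAVING dup_count > 1"
    );

    // Print each duplicate group for review: title, number of copies, post IDs.
    foreach ( $groups as $group ) {
        printf( "%s (%d copies): %s\n", $group->post_title, $group->dup_count, $group->ids );
    }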

    #377249

    Alistair Williamson
    Full Member
    Post count: 415
    This reply has been marked as private.
    #377250

    Stiofan O’Connor
    Site Admin
    Post count: 22956

    I do not think this is a GD problem, as no one else has reported it, but I have added a task to add a duplicates tool. It is logged as an enhancement, meaning it will be worked on once all higher-priority items are done: https://github.com/GeoDirectory/geodirectory/issues/394

    The query I gave you will let you get the IDs of the duplicates, and there may be a plugin or similar that can delete posts by ID (I have not checked).

    Thanks,

    Stiofan

    #377251

    Alistair Williamson
    Full Member
    Post count: 415

    Thanks Stiofan.

    Appreciated and understood.

    Have a great eve.

    Bests

    Alistair 🙂

    #377253

    Stiofan O’Connor
    Site Admin
    Post count: 22956

    You too Al 🙂

    #377514

    Alistair Williamson
    Full Member
    Post count: 415

    Hi Stiofan,

    I came up with a way of deleting duplicates for non-Devs like me.

    I identified the duplicates outside of GD using MS Access, created a ‘Duplicate CSV’, and updated the published date for each listing to a future month.

    I can now see and select the dupes as the posts appear as ‘Scheduled’ in the backend.

    I have been selecting and moving the duplicate listings to the bin. However, I keep getting the 414 error message – please see the image attached. Using ‘Screen Options’ I have limited my selection to 100 listings, but I still keep getting the error message ‘Request URL too long.’

    Please can you assist?

    Thx

    Al

    #377515

    Alistair Williamson
    Full Member
    Post count: 415

    Please see Attachment

    #377517

    Alistair Williamson
    Full Member
    Post count: 415

    Limiting selections to 50 listings works, but it is very time-consuming when there are 12,000 listings to delete.
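
    For reference, the same ‘Scheduled’ trick can be finished off programmatically instead of through the bulk actions screen (the 414 error happens because every selected post ID ends up in the request URL, which quickly becomes too long). A rough sketch, assuming the duplicates are the only listings with a future publish date and that the post type is GeoDirectory’s default gd_place (adjust if you use a different CPT):

    <?php
    // Collect the IDs of all "Scheduled" (future-dated) place listings.
    $duplicate_ids = get_posts( array(
        'post_type'   => 'gd_place', // assumption: default GD place post type
        'post_status' => 'future',   // the duplicates re-dated to a future month
        'numberposts' => -1,
        'fields'      => 'ids',
    ) );

    // Move each one to the bin, the same as the bulk action but without the URL limit.
    foreach ( $duplicate_ids as $post_id ) {
        wp_trash_post( $post_id );
    }

    The same could be done from WP-CLI (wp post list piped into wp post delete) if shell access is available.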

