Jreth Ws
Forum Replies Created
Thanks very much Guust.
I really don't mean to bug you; you have a really good script. This is constructive feedback so that all your new users can benefit from it.
I know the Location pages are good, and that they are the main entrance page for each location, but on a new site these pages confuse users, because there are already location archives for each CPT.
Normal users don't see differences in URLs the way we do.
They can't differentiate these two:
http://www.example.com/dir/type/food/america/newyork
http://www.example.com/dir/america/newyork
(Even as a technical person, it took me days to spot the difference between the two URLs.)
In short, I just hope you can assist me with hiding the code that generates all the location pages. You can ignore me if you think I'm stubborn. Sorry for the trouble.
One more question: how do I hide these main location pages from Google? I mean, how did Google find these pages in the first place? Is it from the source code (below)?
A GD2 setting should let us hide these location pages from Google.
My directory is still not ready to have so many location pages, and these pages are very confusing to users and to Google as well. Don't you think so? I see this in the source:
<script type='application/ld+json'>{"@context":"https://schema.org","@type":"BreadcrumbList","itemListElement":[{"@type":"ListItem","position":1,"item":
Thanks
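That snippet is truncated, but it is schema.org BreadcrumbList structured data, which search engines parse to discover and index URLs even when no visible link points at them. For reference, a complete BreadcrumbList payload looks roughly like the following sketch; the names and URLs here are hypothetical placeholders, not actual GD2 output:

```python
import json

# Hypothetical breadcrumb trail for a location page; the names and
# URLs are illustrative placeholders, not GeoDirectory's real markup.
breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {
            "@type": "ListItem",
            "position": 1,
            "name": "Directory",
            "item": "http://www.example.com/dir/",
        },
        {
            "@type": "ListItem",
            "position": 2,
            "name": "America",
            "item": "http://www.example.com/dir/america/",
        },
        {
            "@type": "ListItem",
            "position": 3,
            "name": "New York",
            "item": "http://www.example.com/dir/america/newyork",
        },
    ],
}

# Serialized exactly as it would appear inside a
# <script type="application/ld+json"> tag in the page head.
print(json.dumps(breadcrumbs))
```

Every `item` URL in markup like this is a URL Google can pick up, which is one plausible way the location pages got discovered.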
Ok I got it. Thanks Paolo and Alex.
From Google's point of view, on a GD2 site:
Location = Archive
Please think about it: one website ends up with two different indexed pages for the same location. I know the location page shows listings of all types while the archive only shows its own CPT's listings, but for categories with few listings this can be an issue. Google doesn't like two results from the same site, especially when there are so many of them; this will hurt the site's ranking.
I really hope there's a solution to this, as I can't start using GD2.
Thanks
What are these pages created for? We already have complete locations with their own meta and content.
This reply has been marked as private.
Thanks for your reply.
Robots.txt is not doing the job well; Google still picks up those URLs, as I have tested. Robots.txt is not guaranteed.
Can you implement this? It's really important. For now I'll pay freelance programmers to solve this.
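That matches how crawlers treat robots.txt: a `Disallow` rule only blocks crawling, not indexing, so a URL discovered through links or structured data can still appear in results. The standard way to keep a page out of the index is a `noindex` directive, either as a meta tag in the page head or as an `X-Robots-Tag` HTTP header. A minimal Apache sketch, assuming a hypothetical `/dir/america/` path pattern for the location pages (adjust to your own permalinks):

```apache
# .htaccess sketch: send a noindex header for location-archive URLs.
# The /dir/america/ pattern is a placeholder, not a GD2 setting.
<IfModule mod_headers.c>
  SetEnvIf Request_URI "^/dir/america/" LOCATION_PAGE
  Header set X-Robots-Tag "noindex, follow" env=LOCATION_PAGE
</IfModule>
```

Note that the page must remain crawlable for the `noindex` to be seen; blocking it in robots.txt at the same time prevents Google from ever reading the directive.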
Got it. Solved 🙂
Can you try the import again? It failed to import. I'm at the last stage. I've attached the zip file here.
Thanks so much
This reply has been marked as private.
This reply has been marked as private.
Got it working. I changed it in the permalink settings. Thanks
Thanks Stiofan, but the replacement doesn't solve the problem.
I'm using GDV2 with the Supreme theme. Am I missing anything? Thanks
Tried that, but WP doesn't allow me to save.
Can I manually change it in the DB? Which table should I edit?
Thanks