Andrew Fletcher · 16 December 2021 · 1 minute read
Have you noticed through your Google account that items are being indexed that shouldn't be? For me it was taxonomy terms, which aren't important enough to be followed. They were:
- Being indexed in my sitemap (do I just erase those entries in the XML file?)
- Being crawled (I'm guessing this is handled with a robots.txt file, though I have never created one before)
- Being viewed (stumbled upon): is it even possible to block this?
Resolve these quickly by adding a Disallow rule to your robots.txt file:
Disallow: /taxonomy/
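Note that a Disallow line only takes effect inside a User-agent group, so the full rule is `User-agent: *` followed by the Disallow above. A quick way to sanity-check the rule before deploying is Python's standard-library robots.txt parser; the URLs below are hypothetical examples:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from the article, wrapped in the required
# User-agent group so the Disallow rule applies to all crawlers.
robots_txt = """\
User-agent: *
Disallow: /taxonomy/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Taxonomy term pages are now blocked for all crawlers...
print(parser.can_fetch("*", "https://example.com/taxonomy/term/12"))  # False
# ...while regular content stays crawlable.
print(parser.can_fetch("*", "https://example.com/blog/my-post"))      # True
```

Keep in mind that robots.txt only stops well-behaved crawlers from fetching the pages; it doesn't remove URLs already in the index, which is what the meta tag below handles.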
Then add noindex and nofollow to taxonomy pages using the Metatag module (/admin/config/search/metatag), which outputs:
<meta name="robots" content="noindex, nofollow" />
Clear your cache, rebuild your sitemap, and cross-check that the taxonomy entries are gone.