Andrew Fletcher · published 16 December 2021
Have you noticed through your Google account that items are being indexed that shouldn't be? One culprit for me was taxonomy terms: pages that aren't important enough to be followed or indexed. The problem terms were:
- Being indexed via my sitemap (do I just erase those entries from the XML file?)
- Being crawled (I am guessing this is handled with a robots.txt file, though I have never created one before)
- Being viewed (stumbled upon), if it is even possible to block that
Resolve these quickly by adding a Disallow rule to your robots.txt file:
Disallow: /taxonomy/
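A Disallow rule only takes effect inside a User-agent group, and Drupal core ships with a robots.txt file in the web root, so you can add the line under the existing User-agent: * block. If you are creating the file from scratch, a minimal version looks like this:

User-agent: *
Disallow: /taxonomy/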
Then add noindex and nofollow to taxonomy pages using the Metatag module (/admin/config/search/metatag), so that each taxonomy term page outputs:
<meta name="robots" content="noindex, nofollow" />
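If you would rather handle this in code than through the Metatag UI, a small hook in a custom module can attach the same tag on taxonomy term pages. This is a minimal sketch, where mymodule is a placeholder for your own module's machine name:

/**
 * Implements hook_page_attachments().
 */
function mymodule_page_attachments(array &$attachments) {
  // Target canonical taxonomy term pages only.
  if (\Drupal::routeMatch()->getRouteName() === 'entity.taxonomy_term.canonical') {
    $attachments['#attached']['html_head'][] = [
      [
        '#tag' => 'meta',
        '#attributes' => [
          'name' => 'robots',
          'content' => 'noindex, nofollow',
        ],
      ],
      // Unique key identifying this head element.
      'mymodule_taxonomy_robots',
    ];
  }
}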
Clear your cache, rebuild your sitemap, and cross-check that the taxonomy entries are gone.
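If you use Drush, both steps can be run from the command line. The second command assumes the Simple XML Sitemap (simple_sitemap) module; substitute the equivalent command for whichever sitemap module you use:

drush cr
drush simple-sitemap:generate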