There are several ways to remove URLs from Google, but there's no one-size-fits-all approach. It all depends on your circumstances.

That's an important point to understand. Not only can using the wrong method sometimes result in pages not being removed from the index as intended, it can also have a negative effect on SEO.

To help you quickly decide which method of removal is best for you, we made a flowchart so you can skip to the relevant section of the article.

Flowchart to help you decide how to remove your pages from Google.

In this post, you'll learn:

  • How to check if a URL is indexed
  • Five options for removing URLs from Google's index
  • How to prioritize removals
  • Common removal mistakes to avoid

How to check if a URL is indexed

What I typically see SEOs do to check whether content is indexed is use a site: search in Google (e.g., site:https://ahrefs.com). While site: searches can be useful for identifying the pages or sections of a site that may be problematic if they show in search results, you have to be careful, because they aren't normal queries and won't actually tell you whether a page is indexed. They may show pages that are known to Google, but that doesn't mean those pages are eligible to show in normal search results without the site: operator.

For example, site: searches can still show pages that redirect or are canonicalized to another page. When you query a specific site, Google may show a page from that domain with the content, title, and description from another domain. Take moz.com, which used to be seomoz.org. Any regular user queries that lead to pages on moz.com will show moz.com in the SERPs, whereas site:seomoz.org will show seomoz.org in the search results, as shown below.


The reason this is an important distinction is that it can lead SEOs to make mistakes, such as actively blocking or removing URLs from the index for the old domain, which prevents the consolidation of signals like PageRank. I've seen many cases with domain migrations where people think they made a mistake during the migration because those pages still show for site:old-domain.com searches, and they end up actively harming their site while trying to "fix" the problem.

The better way to check indexation is to use the Index Coverage report in Google Search Console, or the URL Inspection tool for an individual URL. These tools tell you whether a page is indexed and provide additional information on how Google is treating the page. If you don't have access to Search Console, simply search Google for the full URL of your page.


Screenshot of the URL Inspection tool in Google Search Console.

In Marketing Media Wizard, if you find the page in our "Top pages" report or ranking for organic keywords, it usually means we saw it ranking for normal search queries, which is a good indication that the page was indexed. Note that the pages were indexed when we saw them, but that may have changed. Check the date we last saw the page for a query.


If there's a problem with a particular URL and it needs removing from the index, follow the flowchart at the start of the article to find the right removal option, then jump to the appropriate section below.

Removal option 1: Delete the content

If you remove the page and serve either a 404 (not found) or 410 (gone) status code, the page will be removed from the index shortly after it is re-crawled. Until it's removed, the page may still show in search results. And even if the page itself is no longer available, a cached version of the page may be temporarily accessible.
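To confirm a deleted page actually serves a removal-friendly status code, and not a "soft 404" (a page that says "not found" but returns 200 and may stay indexed), you can check it directly. Here is a minimal sketch using Python's standard library; the URL is a placeholder:

```python
import urllib.error
import urllib.request

REMOVAL_CODES = {404, 410}  # "not found" and "gone"

def page_status(url: str) -> int:
    """Return the HTTP status code a URL serves."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses raise, but still carry a status code

def will_drop_from_index(status: int) -> bool:
    """True if the status code tells Google to deindex the page after a re-crawl."""
    return status in REMOVAL_CODES

# A deleted page should serve 404 or 410:
print(will_drop_from_index(410))  # True
print(will_drop_from_index(200))  # False (a soft 404 may stay indexed)
```

You could run `will_drop_from_index(page_status("https://example.com/deleted-page"))` against your own URLs after deleting content.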

When you might need a different option:

  • I need more immediate removal. See the URL removal tool section.
  • I need to consolidate signals like links. See the canonicalization section.
  • I need the page to stay available for users. See whether the noindex or restricting access sections fit your situation.

Removal option 2: Noindex

A noindex meta robots tag or x-robots header response will tell search engines to remove a page from the index. The meta robots tag works for pages, while the x-robots header response works for pages and additional file types like PDFs. For these tags to be seen, a search engine needs to be able to crawl the pages, so make sure they aren't blocked in robots.txt. Also, note that removing pages from the index may prevent the consolidation of link and other signals.

Example of a meta robots noindex tag:

<meta name="robots" content="noindex">

Example of an x-robots noindex tag in the header response:

HTTP/1.1 200 OK
X-Robots-Tag: noindex
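If your server runs Apache with mod_headers enabled, one common way to attach that header to all PDFs is a Files block in the server config or .htaccess. This is a sketch; adapt the pattern to your own file layout:

```apacheconf
<Files ~ "\.pdf$">
  Header set X-Robots-Tag "noindex"
</Files>
```

Other servers have equivalents (e.g., add_header in nginx); the point is simply that the header, unlike the meta tag, can cover non-HTML files.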

When you might need a different option:

  • I don't want users to access these pages. See the restricting access section.
  • I need to consolidate signals like links. See the canonicalization section.

Removal option 3: Restricting access

If you want the page to be accessible to some users but not to search engines, then what you probably want is one of these three options:

  • some kind of login system;
  • HTTP authentication (where a password is required for access);
  • IP whitelisting (which only allows specific IP addresses to access the pages)

This type of setup is best for things like internal networks, member-only content, or staging, test, or development sites. It allows a group of users to access the pages, but search engines will not be able to reach them and will not index them.
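As one example of the HTTP authentication option, here is a minimal nginx sketch that password-protects a staging area. The path and credentials file are assumptions for illustration:

```nginx
# Require a username/password for everything under /staging/
location /staging/ {
    auth_basic           "Restricted";
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```

Because crawlers can't supply credentials, they receive a 401 and can neither crawl nor index the protected pages.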

When you might need a different option:

  • I need more immediate removal. See the URL removal tool section. In this particular case, you may want more immediate removal if the content you're trying to hide has been cached and you need to prevent users from seeing it.

Removal option 4: URL removal tool

The name of this tool from Google is slightly misleading: what it actually does is temporarily hide the content. Google will still see and crawl the content, but the pages won't appear for users. This temporary effect lasts six months in Google, while Bing has a similar tool that lasts three months. These tools should be reserved for the most extreme cases, such as security issues, data leaks, and personally identifiable information (PII). For Google, use the Removals Tool; for Bing, see how to block URLs.

You still need to apply another method alongside the removal tool to actually have the pages removed for a longer period (noindex or delete), or to prevent users from accessing the content if they still have the links (delete or restrict access). The tool just gives you a faster way of hiding the pages while the removal has time to process. The request can take up to a day to process.

Removal option 5: Canonicalization

When you have multiple versions of a page and want to consolidate signals like links to a single version, what you want is some form of canonicalization. This is mostly used to prevent duplicate content while consolidating multiple versions of a page to a single indexed URL.

You have several canonicalization options:

  • Canonical tag. This specifies another URL as the canonical version, the one you want shown. If pages are duplicates or very similar, this should be fine. When pages are too different, the canonical may be ignored, since it's a hint and not a directive.
  • Redirects. A redirect takes a user and a search bot from one page to another. The 301 is the redirect most commonly used by SEOs; it tells search engines that you want the final URL to be the one shown in search results and where signals are consolidated. A 302, or temporary redirect, tells search engines that you want the original URL to remain in the index and to consolidate signals there.
  • URL parameter handling. A parameter is appended to the end of a URL and usually includes a question mark, like ahrefs.com?this=parameter. This tool from Google lets you tell them how to handle URLs with specific parameters. For instance, you can specify whether the parameter changes the page content or is just meant to track usage.
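As a quick illustration of the canonical tag (the URL is a placeholder), it goes in the head of the duplicate page and points at the version you want indexed:

```html
<!-- On the duplicate page, in the <head>: -->
<link rel="canonical" href="https://example.com/preferred-page/">
```

A 301, by contrast, is configured on the server rather than in the page markup; the exact syntax depends on your server software.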

How to prioritize removals

If you have multiple pages to remove from Google's index, they should be prioritized accordingly.

Highest priority: These pages are usually security-related or concern confidential data. This includes content containing personal data (PII), customer data, or proprietary information.

Medium priority: This usually involves content meant for a specific group of users: company intranets or employee portals, member-only content, and staging, test, or development environments.

Low priority: These pages usually involve duplicate content of some kind. Examples include pages served from multiple URLs, URLs with parameters, and, again, possibly staging, test, or development environments.

Common removal mistakes to avoid

I want to cover a few of the ways I usually see removals done incorrectly, and what happens in each scenario, to help people understand why they don't work.

Noindex in robots.txt

While Google used to unofficially support noindex in robots.txt, it was never an official standard, and they have now formally removed support. Many of the sites that were doing this were doing it incorrectly and harming themselves.

Blocking from crawling in robots.txt

Crawling is not the same thing as indexing. Even if Google is blocked from crawling pages, it can still index a page if there are any internal or external links to it. Google won't know what's on the page, because it can't crawl it, but it knows the page exists and will even write a title to show in search results based on signals like the anchor text of links to the page.

Nofollow

This commonly gets confused with noindex, and some people use it at the page level expecting the page not to be indexed. Nofollow is a hint, and while it originally stopped links on the page, and individual links with the nofollow attribute, from being crawled, that's no longer the case. Google can now crawl those links if it wants to. Nofollow was also used on individual links to try to stop Google from crawling through to specific pages, and for PageRank sculpting. Again, this no longer works, since nofollow is a hint. In the past, if the page had another link to it, Google could still discover it from that alternate crawl path.
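To make the distinction concrete, here is what each tag looks like; only the last one removes a page from the index (the URL is a placeholder):

```html
<!-- Page-level nofollow: a hint not to crawl the page's links; does NOT deindex the page -->
<meta name="robots" content="nofollow">

<!-- Link-level nofollow: a hint about a single link -->
<a href="https://example.com/page/" rel="nofollow">anchor text</a>

<!-- What's usually intended instead: remove the page from the index -->
<meta name="robots" content="noindex">
```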

Note that you can find nofollowed pages in bulk using this filter in the Page Explorer in Marketing Media Wizard's Site Audit.


As it rarely makes sense to nofollow all links on a page, the number of results should be zero or close to zero. If there are matching results, I urge you to check whether the nofollow directive was accidentally added in place of noindex, and to choose a more appropriate method of removal if need be.

You can also find individual links marked nofollow using this filter in Link Explorer.


Noindex and canonical to another URL

These signals conflict. Noindex says to remove the page from the index, while canonical says another page is the version that should be indexed. This may actually work for consolidation, as Google will usually choose to ignore the noindex and instead use the canonical as the main signal. However, this behavior isn't guaranteed: there's an algorithm involved, and there's a risk that the noindex tag is the signal that gets counted. If that happens, the pages won't consolidate properly.
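The conflicting combination looks like this in a page's head (the URL is a placeholder):

```html
<!-- These two signals contradict each other on the same page: -->
<meta name="robots" content="noindex">
<link rel="canonical" href="https://example.com/preferred-page/">
```

Pick one: noindex if you want the page gone, or the canonical alone if you want signals consolidated.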

Note that you can find noindexed pages with non-self-referential canonicals using this set of filters in the Page Explorer in Site Audit:


Noindex, wait for Google to crawl, then block from crawling

There are a couple of ways this usually happens:

  1. Pages are already blocked but are indexed; people add noindex and unblock them so that Google can crawl and see the noindex, then block the pages from crawling again.
  2. People add noindex tags for the pages they want removed and, after Google has crawled and processed the noindex tag, block the pages from crawling.

Either way, the final state is blocked from crawling. If you remember, earlier we talked about how crawling is not the same as indexing. Even though these pages are blocked, they can still end up in the index.

What if it's your content, but not on a site you own?

If you own content that's being used on another site, you may be able to file a claim under the Digital Millennium Copyright Act (DMCA). You can use Google's Copyright Removal tool to file what's called a DMCA takedown, which requests the removal of copyrighted material.

What if it's content about you, but not on a site you own?

If you're in the EU, you can have content removed that contains information about you, thanks to a court ruling establishing the right to be forgotten. You can request to have personal information removed using the EU Privacy Removal form.

How to remove images from Google

To remove images from Google, the easiest way is with robots.txt. While the unofficial support for removing pages via robots.txt was dropped, as we mentioned earlier, simply disallowing the crawl of images is the correct way to remove them.

For a single image:

User-agent: Googlebot-Image
Disallow: /images/dog.jpg

For all images:

User-agent: Googlebot-Image
Disallow: /

Final thoughts

How you remove URLs is fairly situational. We've covered several options, but if you're still unsure which is right for you, refer back to the flowchart at the start of the article.

You can also go through the legal troubleshooter provided by Google for content removal.

Have questions? Let me know on Twitter.