Sometimes web pages become outdated, irrelevant, or undesirable. When this happens, it isn’t enough for webmasters to delete them from the server: they must also make sure the pages are removed from Google’s index so that they stop appearing in search results. This can be easier said than done given the complexities of the Web, so how should one go about it properly?

According to Matt Cutts, removing a single page requires the server to return a genuine 404 (Not Found) status. Double-check this, because sometimes the HTTP status is actually a 200 code, a so-called soft 404 that gives Googlebot the impression the page is still up. URL removal requests will be denied if the page appears to be live. To remove an entire site from Google’s index, the best practice is to block it through robots.txt; this allows a faster and more thorough removal than the page-by-page method.
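To see what status code the server actually returns for a deleted page, you can request the URL and inspect the response yourself. Below is a minimal sketch using only Python’s standard library; the URL https://www.example.com/old-page.html is a hypothetical placeholder for a page you have deleted.

```python
import urllib.error
import urllib.request

def check_status(url: str) -> int:
    """Return the HTTP status code the server sends for `url`."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request) as response:
            return response.status
    except urllib.error.HTTPError as err:
        # urllib raises on 4xx/5xx responses; the status code is on the error.
        return err.code

# Hypothetical URL of a page that was already deleted from the server.
status = check_status("https://www.example.com/old-page.html")
if status == 200:
    print("Soft 404: the server still answers 200, so a removal request would be denied.")
elif status == 404:
    print("Genuine 404: Googlebot will treat the page as gone.")
else:
    print(f"Server returned {status}.")
```

For whole-site removal, the standard robots.txt rule that blocks all crawlers, including Googlebot, from the entire site is:

```
User-agent: *
Disallow: /
```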

The techniques above should work 90% of the time. For the rest, a deeper understanding of the underlying issues may be in order. Matt suggests going to the Webmaster Forum to seek expert advice.

Video Link:
How can I remove old content from Google’s index?