Tuesday, September 16, 2008

How To Find Secret Sites And Articles

Many site owners hide some of a site's pages, or even the whole site, from search engines. With robots.txt you can find those hidden pages.

Robots.txt is a text file in the root directory of a site that tells search-engine robots which pages they may index. With the 'Disallow' directive, a site owner can ask search engines not to index parts of the site.
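As a minimal sketch of how a Disallow rule behaves, here is Python's standard `urllib.robotparser` checking a hypothetical robots.txt (the sample rules are made up for illustration):

```python
# Parse a tiny, hypothetical robots.txt and see which paths a
# well-behaved crawler would skip.
from urllib.robotparser import RobotFileParser

sample = [
    "User-agent: *",
    "Disallow: /private/",
    "Disallow: /drafts/notes.html",
]

rp = RobotFileParser()
rp.parse(sample)

# Crawlers that honor robots.txt skip the disallowed paths...
print(rp.can_fetch("*", "/private/secret.html"))  # False
# ...but everything else stays crawlable.
print(rp.can_fetch("*", "/index.html"))           # True
```

Note that robots.txt is only a request to crawlers, not access control: the disallowed pages are still publicly reachable, which is exactly what this trick relies on.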

1. Open http://www.google.com and search for:

"robots.txt" "disallow:" filetype:txt


2. The results are robots.txt files from sites that use the Disallow directive.
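You can also skip the search and check a specific site directly, since robots.txt always lives at the site root. A small sketch of building that lookup URL with the standard library (the example site is just the one used below):

```python
# robots.txt is always served from the root of a site, so the lookup
# URL is easy to build from any page URL with urljoin.
from urllib.parse import urljoin

def robots_url(base_url):
    """Return the robots.txt URL for the site hosting base_url."""
    return urljoin(base_url, "/robots.txt")

print(robots_url("https://www.whitehouse.gov/some/page"))
# https://www.whitehouse.gov/robots.txt
```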

3. As an example, open the first result, the WhiteHouse site. You can see that a lot of pages have been hidden from search engines.

4. To open a 'forbidden' page, copy the path from whichever Disallow line you want.

5. In your browser's address bar, replace /robots.txt with the path you copied and press Enter. The page will open.

This is the hidden page from the WhiteHouse site.

Of course you can find more interesting pages; this was just one example.
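The whole walk-through can be sketched as one small script: pull the Disallow paths out of a robots.txt body and turn them into full page URLs. The fetch step is left out so the example works offline, and the sample robots.txt text is hypothetical:

```python
# Extract every Disallow path from a robots.txt body and build the
# full URL for each one, mirroring the manual steps above.
from urllib.parse import urljoin

def disallowed_urls(robots_txt, base_url):
    """Return full URLs for every non-empty Disallow path."""
    urls = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # a bare "Disallow:" blocks nothing, so skip it
                urls.append(urljoin(base_url, path))
    return urls

sample = """User-agent: *
Disallow: /secret/
Disallow: /staff-only/memo.html
Disallow:
"""

for url in disallowed_urls(sample, "https://www.example.gov"):
    print(url)
```

Running this prints the two hidden URLs, which you can then open in a browser just as in step 5.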

Now you can play the modern online Sherlock Holmes and find what others have hidden.

Check the link to the main site for more detailed instructions.