I remember someone mentioning a command you append to a site's URL that lists the subdirectories of a website. It would go like this... google.com/BLANK.txt
Thanks
erm... it would be a hole/bug/exploit if that were true.
Directory listings are shown only when an index or default page doesn't exist.
Otherwise, it depends on the web server's configuration or on whatever scripts it executes.
Well, you have the robots.txt file. It can be used to exclude certain
subdirectories from search engine spiders. Take www.whitehouse.gov, for example.
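To illustrate how a well-behaved crawler reads those exclusions, here is a minimal sketch using Python's standard `urllib.robotparser`. The rules and the example.com URLs below are hypothetical, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, as a site might serve it
# at https://example.com/robots.txt
sample = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(sample.splitlines())

# A compliant spider checks each URL before fetching it:
print(rp.can_fetch("*", "https://example.com/private/notes.txt"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that robots.txt is only a request to crawlers, not access control: the file itself is public, so listing a directory there actually advertises its existence to anyone who reads it.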