I am getting ready to put up my website and would like it listed on Google. Can I do this for free? Do I need a robots.txt file, and if so, what needs to go into it? So far my robots.txt file looks like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /privatefolder/
Disallow: /downloads/
Disallow: /important/members.html

is that all I need?

I have never used a robots.txt file.

2006-11-22 22:16:34 · 3 answers · asked by damien_black4 3 in Computers & Internet > Internet

3 answers

Check this article out

http://www.google.com/support/webmasters/bin/answer.py?answer=34397

Consider creating and submitting a detailed site map of your pages using Google Sitemaps. Google Sitemaps is an easy way for you to submit all your URLs to the Google index and get detailed reports about the visibility of your pages on Google. With Google Sitemaps, you can automatically keep us informed of all of your current pages and of any updates you make to those pages. Please note that submitting a Sitemap doesn't guarantee that all pages of your site will be crawled or included in our search results.
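
If you do create a sitemap, it can also be referenced from your robots.txt file with a Sitemap: line. A minimal sketch (the URL here is only an example, not your real address):

Sitemap: http://www.example.com/sitemap.xml

Search engines that support the sitemaps protocol will pick it up from there, or you can submit the same URL directly through Google Sitemaps.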

What is your website anyway?

2006-11-22 22:31:51 · answer #1 · answered by Hugo V 3 · 0 0

The robots.txt file is used to tell Google's search bot (and other crawlers) not to crawl and index certain files. You can read more about it on w3c.org.
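
As a rough illustration (reusing the folders from the question above), a robots.txt file with comments might look like this:

# this block applies to every crawler, including Googlebot
User-agent: *
# ask crawlers not to fetch these directories and pages
Disallow: /cgi-bin/
Disallow: /privatefolder/
Disallow: /downloads/
Disallow: /important/members.html

Anything not listed under Disallow is fair game, so an empty file (or no file at all) simply means crawlers may fetch everything.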

In order to get the Google bot to crawl your web pages, you can add your site to Google's index list:

http://www.google.com/addurl/

Good luck.

2006-11-23 06:19:45 · answer #2 · answered by Ravemz 2 · 0 0

See if your website host helps promote your website in search engines.

2006-11-23 06:30:23 · answer #3 · answered by Kite 3 · 0 0
