
I run a forum, and I see that Googlebot is always visiting my pages. I know that it usually collects information to index pages, but my question is:

How does Googlebot really work? If a person searches for, say, "nature", does Googlebot go to your site at that moment, take the information you have about nature, and display it in the search results? I mean, all in real time.

2007-02-11 09:33:56 · 3 answers · asked by Trilochan Kaur 2 in Computers & Internet Internet

3 answers

From
http://news.softpedia.com/news/Webmaster-Tips-How-Googlebot-Works-40346.shtml

Many web developers use various tricks to improve their site rankings, but when it comes to Googlebot, they all worry that a badly handled crawl can hurt their visitor numbers.

Vanessa Fox, a Google employee, described how Googlebot works to help webmasters develop their websites.

So, if your site is down for maintenance, you're probably afraid that Googlebot will index the maintenance page itself. "You should configure your server to return a status of 503 (service unavailable) rather than 200 (successful). That lets Googlebot know to try the pages again later," Vanessa said.
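To make that advice concrete, here is a minimal sketch of a maintenance-mode server using Python's standard `http.server` module. It returns 503 with a `Retry-After` header (the header, the port, and the retry delay are illustrative choices, not anything Google prescribes):

```python
# Minimal sketch: a maintenance-mode server that answers every request
# with 503 Service Unavailable plus a Retry-After header, so crawlers
# like Googlebot know to come back later instead of indexing the page.
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(503)                  # Service Unavailable, not 200
        self.send_header("Retry-After", "3600")  # hint: retry in an hour
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1>")

def run(port=8000):
    """Serve the maintenance page until interrupted."""
    HTTPServer(("", port), MaintenanceHandler).serve_forever()
```

In a real deployment you would do the equivalent in your web server's configuration (Apache, nginx, etc.) rather than running a separate Python process, but the status code and header are the important parts.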

If you're wondering which is more useful, a meta robots tag or a robots.txt file, Vanessa Fox offers the answer: "Googlebot obeys either, but meta tags apply to single pages only. If you have a number of pages you want to exclude from crawling, you can structure your site in such a way that you can easily use a robots.txt file to block those pages (for instance, put the pages into a single directory)."
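As a sketch of the directory approach she describes, a robots.txt like this (the directory name is just an example) blocks a whole section of the site in one rule:

```
# robots.txt at the site root
User-agent: Googlebot
Disallow: /private/     # blocks every page placed under /private/

User-agent: *
Disallow: /private/
```

The per-page alternative is a `<meta name="robots" content="noindex">` tag in each page's `<head>`, which is why grouping excluded pages into one directory saves work.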

In conclusion, if you have more questions about how Googlebot works, read the documentation Google provides and, if you can't find the answer there, ask the company's employees for help.

2007-02-11 09:40:41 · answer #1 · answered by The Nerd 4 · 1 0

No:
"Googlebots" are effectively automated crawlers for the Internet. At scheduled times (say once a day, week, or month) the bots are sent through every URL link on any page they find. These bots scan the Internet for new or updated pages and report back to Google's servers, which categorize each page by its text content.
Each page is then associated with search keywords, so the site can be returned as a match when a Google user types one of those keywords in.
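The crawl loop described above can be sketched in a few lines of Python. This is a toy model only: it extracts links from an HTML string with the standard-library `html.parser` and queues unseen URLs, whereas a real crawler also fetches pages over the network, respects robots.txt, and throttles itself. The function and class names are my own.

```python
# Toy sketch of a crawl step: given a fetched page, extract its links
# and queue any URL not seen before. Illustrates "follow every link"
# without network access, politeness, or ranking.
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl_step(url, html, seen, queue):
    """Mark url as crawled; queue links from html that are new to us."""
    seen.add(url)
    parser = LinkExtractor(url)
    parser.feed(html)
    for link in parser.links:
        if link not in seen and link not in queue:
            queue.append(link)
```

For example, feeding `crawl_step` a page containing `<a href="/about">` queues `http://example.com/about` while links back to already-seen pages are skipped, which is exactly why the bot eventually stops instead of looping forever.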

2007-02-11 09:40:51 · answer #2 · answered by Chεεrs [uk] 7 · 1 0

It's impossible to have Googlebot spider only parts of a page. If you don't want some text shown to Googlebot, turn the text into a graphic or delete it from the page.

2016-05-23 22:30:50 · answer #3 · answered by ? 4 · 0 0
