A search engine consists of three subsystems:
2. The crawler (also known as the search bot) indexes the site: it fetches the home page, records its contents in a storage facility (usually in both the original form and as plain text), follows every link it can find, and records all the pages it can reach (a rough sketch in Go follows this list). The crawler is usually written in a compiled language and is often implemented as a daemon (on Unix) or a service (on Windows), so it runs continuously.
2. The storage facility can be a database or a flat-file archive with a full-text search utility.
3. The front end (what the uninitiated mistake for the whole search engine) takes search queries from users and runs them against the storage facility (sketched below, after the Google Mini note).
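For illustration, here is a minimal sketch in Go of the crawler from item 1, assuming a hypothetical start URL (www.example.com) and an in-memory map standing in for the storage facility; a real crawler would run as a daemon or service, persist pages to disk or a database, and use a proper HTML parser.

package main

import (
	"fmt"
	"io"
	"net/http"
	"regexp"
)

// Crude way to pull absolute href targets out of a page; a real crawler
// would parse the HTML and resolve relative links as well.
var linkRe = regexp.MustCompile(`href="(http[^"]+)"`)

func main() {
	toVisit := []string{"http://www.example.com/"} // hypothetical home page
	visited := map[string]bool{}
	store := map[string]string{} // URL -> original page contents (stand-in for the storage facility)

	for len(toVisit) > 0 && len(store) < 100 { // cap the sketch at 100 pages
		url := toVisit[0]
		toVisit = toVisit[1:]
		if visited[url] {
			continue
		}
		visited[url] = true

		resp, err := http.Get(url)
		if err != nil {
			continue
		}
		body, err := io.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			continue
		}
		store[url] = string(body) // record the page's contents

		// Follow every link the page contains.
		for _, m := range linkRe.FindAllStringSubmatch(string(body), -1) {
			toVisit = append(toVisit, m[1])
		}
	}
	fmt.Printf("crawled and stored %d pages\n", len(store))
}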
It is considered good practice to deploy the search engine on a separate physical machine optimized for this workload. Google actually sells such appliances: the Google Mini in its most basic version supports up to 50,000 documents and costs $1,995:
http://www.google.com/enterprise/mini/
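And a matching sketch of the front end from item 3, assuming the same in-memory store map the crawler above filled; a real front end would send the query to a database or full-text index instead of scanning every stored page.

package main

import (
	"fmt"
	"net/http"
	"strings"
)

// Assumed to be filled by the crawler: URL -> page text.
var store = map[string]string{}

// searchHandler answers requests like /search?q=term by listing every
// stored page whose text contains the query.
func searchHandler(w http.ResponseWriter, r *http.Request) {
	q := strings.ToLower(r.URL.Query().Get("q"))
	if q == "" {
		http.Error(w, "missing query", http.StatusBadRequest)
		return
	}
	for url, text := range store {
		if strings.Contains(strings.ToLower(text), q) {
			fmt.Fprintln(w, url)
		}
	}
}

func main() {
	http.HandleFunc("/search", searchHandler)
	http.ListenAndServe(":8080", nil)
}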
__________
2006-11-09 04:44:26 · answer #1 · answered by NC 7
It takes anywhere from 30-45 days for the search engines to index the site. You can try re-submitting it every other working day, and sometimes that speeds up the process. After that, look into having your site optimized for search.
2015-01-08 06:41:00 · answer #2 · answered by Jenna 1