
http://www.smarterdeals.com/Software-price-comparision/Xbox-4701-415.htm

Thousands of HTML pages (like the one above) get refreshed twice a month, but Googlebot is still unable to reach the internal pages. Please advise.

Please let me know if there are any flaws in the HTML pages of the website. Thanks.

2007-07-06 01:55:34 · 5 answers · asked by humsubka 1 in Business & Finance Advertising & Marketing Search Engine Optimization

5 answers

Ya, it's because it's an HTML file. I don't know what to do; all I can suggest is a file converter.

2007-07-06 02:03:34 · answer #1 · answered by Prophet. 4 · 0 2

Google may not crawl as deep as you want. If the pages are generated on the fly from a database, Google will not find them relevant. Google likes pages that are static, informative, and different from what it already has. I suggest you make sure you have a Google XML sitemap, and have a static sitemap as well. Database-driven pages need a static HTML page that leads to them, too. A minimal sketch of what that XML sitemap might look like is below.
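
For reference, a Google XML sitemap is just a plain XML file listing the URLs you want crawled, following the sitemaps.org protocol. A minimal sketch might look like this; the date, change frequency, and priority values here are placeholders, and the URL is taken from the question above:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Minimal sitemap sketch; lastmod/changefreq/priority are placeholder values -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.smarterdeals.com/Software-price-comparision/Xbox-4701-415.htm</loc>
        <lastmod>2007-07-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
      <!-- one <url> entry per internal page you want crawled -->
    </urlset>

Save it as sitemap.xml at the site root and submit it to Google so the crawler can discover internal pages that no static link reaches.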

2007-07-06 04:38:42 · answer #2 · answered by Consultant 3 · 0 0

I opened 3 of the internal links without any problem.
I am using Mozilla Firefox; I don't know what browser you are using, or whether that could even be the problem.

This is my help from Washington, D.C.

2007-07-06 02:06:37 · answer #3 · answered by Anonymous · 0 0

Because of the excessive number of pop-ups, many browsers will simply lock out your opening webpage. Check back with your web designer to reduce their number. Search robots literally get "stuck" when they encounter pop-up scripts!

Good luck!

2007-07-06 06:11:20 · answer #4 · answered by Anonymous · 0 0

One major flaw is that you have no meta tags on the page. Try adding meta tags for keywords and description to the page, as sketched below.
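
For illustration, these tags go in the <head> of each page. The title, description, and keywords here are placeholder values guessed from the page URL above; each page should get its own:

    <head>
      <title>Xbox software price comparison | SmarterDeals</title>
      <!-- Placeholder values; write unique ones per page -->
      <meta name="description" content="Compare Xbox software prices across online stores.">
      <meta name="keywords" content="xbox, software, price comparison">
    </head>

A unique description per page also gives search engines something distinct to show in results, which matters for thousands of near-identical product pages.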

2007-07-06 05:40:28 · answer #5 · answered by jt66250 7 · 0 0
