5/08/2013



Submit your site free to Google, Yahoo or MSN


Search engines like Google, AltaVista and AlltheWeb crawl the Internet and find your web site through links from other web sites. This is another reason to build a strong network of relevant, quality sites that link to yours.

Free add-URL search engines list

If other sites already link to your web site, there may be no need to submit. If your web site has no links, use the free submit pages here only once, then check your listing in 45 to 60 days. CAUTION: submitting again before 45 days will not help, and may hurt your ranking chances if the search engine considers it abuse.

You can also submit your site free by adding your URL to a search engine and requesting to be indexed. Below is an easy one-stop list of some of the top search engines where you can submit your site free.

Avoid software that automatically submits your web site. Submit manually, and only to the top search engines; that is where 80 to 90% of your traffic normally comes from.

Have you seen Google™ Trends yet? It lets you compare the traffic of different keyword phrases.

Submit your site to Google for inclusion in Google's index

Live Search and MSN Search became Bing on 05/31/09. Submit your URL free to Bing / MSN here. Submit your Ohio site to OhioBiz (Ohio sites only).

Keep us in mind when your SEO time comes and you need affordable professional search engine optimization services.

What is a search engine robots.txt file?


How-To use a search engine robots.txt file

The robots.txt file is a must for your web site. The search engine robots / web crawlers / spiders come looking for this file, and when the robot doesn't find it, he's not a happy spider crawling your site. :-)

If you don't have one, a 404 (page not found) error will show up in your logs, a situation you can easily avoid.

Inviting the search engine robots into your site

To ALLOW ALL robots complete access, simply copy & paste this code into Notepad and save it as robots.txt, so it is reachable at http://www.yourdomain.com/robots.txt

User-agent: *
Disallow:

To EXCLUDE ALL robots from the entire server, simply paste this code into Notepad and save it as robots.txt, so it is reachable at http://www.yourdomain.com/robots.txt

User-agent: *
Disallow: /
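Between those two extremes, you can also block just certain directories while leaving the rest of the site open. A sketch, assuming your server has a /cgi-bin/ folder and a hypothetical /private/ folder you don't want indexed:

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

Every other page on the site stays crawlable; only URLs starting with those paths are excluded.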

NOTE: the robots.txt file must be on the root level, as shown in the sample URLs above. Then you can check it with a robots.txt validator.
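If you'd rather test your file yourself before uploading it, here is a minimal sketch using Python's standard urllib.robotparser to check what the two sample files above actually allow (the page URL is just a placeholder):

```python
# Sanity-check robots.txt rules locally with Python's standard library.
from urllib.robotparser import RobotFileParser

def can_crawl(rules, url="http://www.yourdomain.com/page.html", agent="*"):
    """Parse robots.txt text and ask whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())  # parse rules from text, no network needed
    return rp.can_fetch(agent, url)

allow_all = "User-agent: *\nDisallow:"      # empty Disallow = full access
exclude_all = "User-agent: *\nDisallow: /"  # "/" blocks the whole server

print(can_crawl(allow_all))    # True: every robot may crawl the page
print(can_crawl(exclude_all))  # False: every robot is excluded
```

Paste your own file's contents into the same function to confirm it blocks only what you intend.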

