SEO concepts #1: Crawlers, spiders and robots

This post is part of a series on concepts used in SEO (Search Engine Optimization).

Crawler – a crawler is a program or automated script (also called a bot, short for robot) that navigates the Web, visiting URLs by following the links on the pages it visits.
Search engines send their crawlers across the Internet to copy the text and code of the pages they find. They store these copies in their indexes through a process called spidering. This huge index is a database of all the pages on all the websites a search engine crawler can successfully visit; it is what search engines use to provide lightning-fast results when you search.
When you enter a query into a search engine such as Google, what you are really querying is the search engine's entire index, not the Internet as it exists at that very instant in time.
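The link-following step described above can be sketched in a few lines of Python using only the standard library. This is a simplified illustration of how a crawler extracts the links from a fetched page, not a real crawler; the HTML snippet and URLs are made-up placeholders.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag, resolved against a base URL.

    A real crawler would fetch each collected URL in turn, repeating
    this extraction step to discover ever more pages.
    """

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links are resolved against the page's URL.
                    self.links.append(urljoin(self.base_url, value))

# Hypothetical page content a crawler might have downloaded:
page = '<html><body><a href="/about">About</a> <a href="https://example.org/">Elsewhere</a></body></html>'
parser = LinkExtractor("https://example.com/")
parser.feed(page)
print(parser.links)  # ['https://example.com/about', 'https://example.org/']
```

The crawler would then queue each discovered URL for its next visit, which is how following links from page to page gradually maps out a site.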

Different search engines have their own individual crawlers.

One very basic way to help the search engine crawlers is to create a sitemap. A sitemap is a file (often in XML, or Extensible Markup Language) that provides a crawler with a listing of all the URLs on a site. This information helps search engines crawl the site more intelligently. Google, Yahoo, MSN, and Ask all accept sitemap submissions.
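For illustration, here is a minimal XML sitemap following the sitemaps.org protocol. The URLs and date are placeholders, not taken from any real site.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2010-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

Only the `loc` element is required for each URL; `lastmod`, `changefreq`, and `priority` are optional hints that help a crawler decide how often to revisit a page.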

Synonyms for the word crawler: spider and robot.

Excerpts of this text have been copied from the following source:
the book “The truth about SEO” by Rebecca Lieb.

