How Search Engine Robots Work:

Think of search engine robots as simple, automated data retrieval programs that travel the web to find information and links. They only absorb what they can see: while a picture may be worth a thousand words to a person, it is worth zero to a search engine. Robots can only read and understand text, and even then only if it is laid out in a format tuned to their needs. Ensuring that they can access and read all the content within a website must be a core part of any search engine optimization strategy.
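As a rough illustration, here is a minimal Python sketch of the text-only view a bot has of a page, using the standard library's html.parser. The sample HTML is invented for this example; notice that the <img> tag contributes nothing to what the parser collects.

from html.parser import HTMLParser

class TextOnlyExtractor(HTMLParser):
    """Collects only the visible text of a page, the way a simple bot would."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

html = """
<html><body>
  <h1>Welcome</h1>
  <img src="chart.png">
  <p>Only this text is indexable.</p>
</body></html>
"""

parser = TextOnlyExtractor()
parser.feed(html)
# The image is invisible to the parser; only the headline and paragraph survive.
print(" ".join(parser.chunks))  # -> Welcome Only this text is indexable.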

How Search Engine Robots Operate:
When a web page is submitted to a search engine, its URL is added to the search engine bot's queue of sites to visit. Even if you never directly submit a website, or the web pages within it, most robots will still find your content if other websites link to it; that is part of a process referred to as building reciprocal links. This is one of the reasons why it is crucial to build link popularity for a website and to get links from other topical sites back to yours, and it should be part of any website marketing strategy you adopt.
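To make the queue idea concrete, here is a hedged sketch of a breadth-first crawl frontier in Python. The fetch_links callback and the toy_web link graph are stand-ins invented for this example; a real bot would fetch pages over HTTP and parse out their links.

from collections import deque

def crawl(seed_urls, fetch_links, max_pages=100):
    """Breadth-first crawl: seed URLs go into the queue, and every link
    discovered on a visited page is appended behind them."""
    queue = deque(seed_urls)
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        for link in fetch_links(url):  # links found on the page
            if link not in visited:
                queue.append(link)
    return visited

# Hypothetical link graph standing in for the live web: site B is never
# submitted directly, but the bot still reaches it through A's link.
toy_web = {
    "https://site-a.example/": ["https://site-b.example/"],
    "https://site-b.example/": [],
}
print(sorted(crawl(["https://site-a.example/"], lambda u: toy_web.get(u, []))))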

When a search engine bot arrives at a website, it is supposed to check for a robots.txt file. This file tells robots which areas of your site are off-limits to them; typically these are directories containing files the robot does not need to concern itself with. Some bots will ignore the rules in this file, but all the major search engine bots do look for it. Every website should have one, even if it is blank, as it is one of the things the search engines look for.
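Python's standard library ships a parser for exactly this file, urllib.robotparser, which a well-behaved bot can use to check a URL before fetching it. The robots.txt contents shown in the comment and the "MyBot" user agent are hypothetical.

from urllib.robotparser import RobotFileParser

# Suppose https://example.com/robots.txt contained:
#
#   User-agent: *
#   Disallow: /private/
#
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetches and parses the file over HTTP

# With the rules above, the first check would fail and the second would pass.
print(rp.can_fetch("MyBot", "https://example.com/private/report.html"))
print(rp.can_fetch("MyBot", "https://example.com/index.html"))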

Robots store a list of all the links they find on each page they visit, and follow those links through to other websites. The original concept behind the Internet was that everything would be organically linked together, like a giant relationship model, and this principle is still a core part of how robots get around.
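Link extraction itself is straightforward. Below is a small sketch, again using Python's html.parser, that records every <a href> on a page and resolves relative links against the page's own URL; the sample page and URLs are invented for the example.

from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Records the target of every <a href="..."> the bot encounters."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL
                    self.links.append(urljoin(self.base_url, value))

page = '<p>See <a href="/about.html">about</a> and <a href="https://other.example/">a partner site</a>.</p>'
collector = LinkCollector("https://site-a.example/")
collector.feed(page)
print(collector.links)
# -> ['https://site-a.example/about.html', 'https://other.example/']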

The smart part of a search engine actually comes in the next step: compiling all the data the bots have retrieved to build the search engine index, or database. This indexing of websites and web pages is shaped by the search engine engineers, who devise the rules and algorithms used to evaluate and score the information the bots retrieve. Once a website is added to the search engine's database, its information becomes available to customers querying the search engine.
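At its simplest, that index is an inverted index: a map from each word to the pages containing it. The sketch below builds one from a toy set of pages (the URLs and text are illustrative); real engines layer parsing, scoring, and ranking algorithms on top of this basic structure.

from collections import defaultdict

def build_index(pages):
    """Map every word to the set of URLs it appears on: the core of
    a search engine's index."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

pages = {
    "https://site-a.example/": "search engine robots read text",
    "https://site-b.example/": "robots follow links between pages",
}
index = build_index(pages)
# A query is just a lookup: both toy pages mention "robots".
print(sorted(index["robots"]))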