A "spider" is the computer software (algorithm) used by search engines to analyse pages on the world wide web. Spiders roam the web, following links and indexing every page they find.
When a potential visitor enters keywords into Google, for example, they are not searching the live web but a database compiled by Google.
This database consists of the text and links from the web pages that Google's "spider" has visited.
When a website is submitted to Google, Google's spider is supplied with a starting point for its automatic journey.
The spider follows links, discovering the other pages on your website and visiting the other sites your site links to.
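The crawl described above can be sketched in a few lines. This is a minimal illustration, not any real search engine's code: the in-memory "web" (a dictionary of hypothetical URLs mapping to page text and outbound links) stands in for live pages, and a breadth-first walk from the submitted starting point plays the role of the spider building its database.

```python
from collections import deque

# Hypothetical in-memory "web": URL -> (page text, outbound links).
# All names here are illustrative, not real sites.
WEB = {
    "example.com/home":  ("Welcome", ["example.com/about", "example.com/blog"]),
    "example.com/about": ("About us", ["example.com/home"]),
    "example.com/blog":  ("Our blog", ["example.com/post1"]),
    "example.com/post1": ("First post", []),
}

def crawl(start):
    """Breadth-first crawl from a starting URL, indexing each page found."""
    index = {}              # URL -> page text: the spider's "database"
    queue = deque([start])
    seen = {start}
    while queue:
        url = queue.popleft()
        text, links = WEB.get(url, ("", []))
        index[url] = text   # store the page's text in the index
        for link in links:  # follow links to discover new pages
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl("example.com/home")
print(sorted(index))
```

Given only the single starting page, the crawler reaches all four pages by following links, which is why one submission is enough for a well-linked site.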
Search Engine Submission
Submissions to the "spidering" search engines are no longer necessary!
Search engine submissions are only needed for new websites that have not previously been submitted.
Manual submissions to the directories are still important and are the starting point for your link building efforts.
We have determined which of the "spidering" search engines are the most important, as they deliver the vast majority of traffic. The smaller search engines use these databases by default.