Search engines use software and algorithms to deliver rankings, and they rely on spiders (sometimes called robots) to crawl and read your site. The robots (plural, because some search engines, such as Google, run several) are software applications that travel the Internet by following links, finding websites to read and add to their databases. These robots will reach your website only if they find a link that leads to it.
Once a robot finds your site and indexes it, it moves along to other sites via your outbound links. After your site is indexed, a second process determines its ranking for specific keyword or key-phrase searches performed on the search engine. All of the major search engines that use robots to index websites also rely on complicated algorithms to determine the page rank of your site.
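The crawl-and-index cycle above can be sketched as a simple breadth-first traversal. This is a toy illustration, not any search engine's actual crawler: the in-memory `WEB` dictionary and the page names are hypothetical stand-ins for real pages a spider would fetch over HTTP.

```python
from collections import deque

# A toy "web": page name -> (page text, outbound links).
# Hypothetical data standing in for real pages fetched over HTTP.
WEB = {
    "site-a": ("welcome to site a", ["site-b", "site-c"]),
    "site-b": ("all about site b", ["site-a"]),
    "site-c": ("site c home page", []),
}

def crawl(start):
    """Breadth-first crawl from `start`: read each page, add it to the
    index, then follow its outbound links to find more sites."""
    index = {}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if url in index:
            continue  # already indexed; skip repeat visits
        text, links = WEB[url]
        index[url] = text            # "read" the page into the database
        queue.extend(links)          # move along via outbound links
    return index
```

Note that `site-c` has no outbound links, so a crawl starting there indexes only that one page; this is why a site with no links pointing to it, and none pointing out, stays isolated.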
The search engine's algorithm examines, among other factors, the title and description of the website along with keyword density and prominence to determine which keyword or phrase your site is optimized for (if any). Each of these components, together with inbound links, is given a certain weight, and the page is ranked accordingly.
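Real ranking algorithms are proprietary and far more elaborate, but the two factors named above can be sketched with assumed definitions: density as how often a keyword appears relative to page length, and prominence as how near the start of the page it appears. The scoring formula here is illustrative only.

```python
def keyword_score(words, keyword):
    """Toy relevance score combining keyword density and prominence.
    `words` is a page's text split into a list of lowercase words."""
    positions = [i for i, w in enumerate(words) if w == keyword]
    if not positions:
        return 0.0
    # Density: fraction of the page's words that are the keyword.
    density = len(positions) / len(words)
    # Prominence: 1.0 for the first word, falling toward 0 at the end;
    # averaged over every occurrence of the keyword.
    prominence = sum(1 - p / len(words) for p in positions) / len(positions)
    return density * prominence

page = "cheap flights book cheap flights today".split()
```

Under this sketch, a keyword that appears twice and early ("cheap") outscores one that appears once at the end ("today"), matching the intuition that both frequency and placement carry weight.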
The advantage of spider-driven search engines is that the robots continue to re-index your site on a regular basis. In addition, the volume (and popularity) of people using search engines far outweighs the traffic from other web locations such as directories.