In short: A bot is a software tool for digging through data. You give a bot directions and it brings back answers.

The word is short for robot, of course, which is derived from the Czech word robota, meaning work.

The idea of robots as humanoid machines was introduced in Karel Capek's 1921 play "R.U.R.," in which the playwright conceived Rossum's Universal Robots. Sci-fi writer Isaac Asimov made them famous, beginning with his 1950 story collection I, Robot and continuing through a string of books known as the Robot Series (see the Isaac Asimov FAQ for more details, including "The Three Laws of Robotics").

On the Web, robots have taken on a new form of life. Since all Web servers are connected, robot-like software is the perfect way to perform the methodical searches needed to find information.

For example, Web search engines send out robots that crawl from one server to another, compiling the enormous lists of URLs that are the heart of every search engine. Shopping bots build similar databases of products sold at online stores.
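
To make that crawling loop concrete, here is a minimal sketch in Python. The seed URL is hypothetical, and the code assumes the third-party requests and beautifulsoup4 packages rather than any particular search engine's software:

    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    def crawl(seed, limit=100):
        """Fetch pages breadth-first, recording every URL encountered."""
        queue, seen = [seed], set()
        while queue and len(seen) < limit:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                page = requests.get(url, timeout=5)
            except requests.RequestException:
                continue  # skip unreachable servers and move on
            # Queue every link on the page for a later visit.
            soup = BeautifulSoup(page.text, "html.parser")
            for link in soup.find_all("a", href=True):
                queue.append(urljoin(url, link["href"]))
        return seen  # the compiled list of URLs

A real crawler adds politeness delays, duplicate-content checks, and persistent storage, but the fetch-extract-queue cycle above is the core of the job.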

The term bot has become interchangeable with agent, indicating software that can be sent out on a mission, usually to find information and report back. Strictly speaking, though, an agent is a bot that goes out on a mission; some bots operate in place. A bot in Microsoft FrontPage, for example, automates work on a Web page.

Bots have great potential in data mining, the process of finding patterns in enormous amounts of data. Because data mining often requires a long series of searches, bots save labor by persisting in a search and refining it as they go. Intelligent bots that can make decisions based on past experience will become an important tool for data miners trying to perfect complex searches that delve into billions of data points.
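
Purely as an illustration of that persist-and-refine loop, the sketch below assumes the caller supplies a search() function returning text results and a score() relevance measure; neither stands for any real product:

    def mine(query, search, score, rounds=3):
        """Run several search passes, folding the top hit's first word
        back into the query to narrow each subsequent pass."""
        best = []
        for _ in range(rounds):
            results = search(query)
            best = sorted(results, key=score, reverse=True)[:10]
            if best:
                # Refine: borrow a term from the best result so far.
                query = query + " " + best[0].split()[0]
        return best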

Bots were not invented on the Internet, however. Robotic software is generally traced back to Eliza, one of the first public demonstrations of artificial intelligence. Eliza is a computer program that can engage a human in conversation: it asks the user a question, then uses the answer to formulate the next question. Artificial intelligence is a branch of computer science that aims to develop software capable of processing information on its own, without the need for human direction.
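
The fragment below is a toy illustration of that technique, not Eliza's actual script; the patterns and canned questions are invented for the example:

    import re

    # Each rule pairs a pattern for the user's answer with the next question.
    RULES = [
        (re.compile(r"\bi feel (.+)", re.I), "Why do you feel {0}?"),
        (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
        (re.compile(r"(.*)"), "Can you tell me more about that?"),
    ]

    def next_question(answer):
        """Turn the user's last answer into the bot's next question."""
        for pattern, question in RULES:
            match = pattern.search(answer)
            if match:
                return question.format(*match.groups())

    while True:
        answer = input("> ")
        if answer.lower() in ("bye", "quit"):
            break
        print(next_question(answer))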

At times, Webmasters regard some robots as a nuisance. A spider may uncover information the Webmaster would prefer to keep private, and occasionally a bot misbehaves as it crawls through a Web site, requesting URLs over and over and slowing down the server's performance. As a result, search engine developers have agreed on standards for how robots should behave and how they can be excluded from Web sites (the robots exclusion standard, implemented through a site's robots.txt file).
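
As a brief illustration, a well-behaved bot can check a site's exclusion rules with Python's standard-library robotparser module; the site address and robot name here are hypothetical:

    from urllib import robotparser

    parser = robotparser.RobotFileParser()
    parser.set_url("http://www.example.com/robots.txt")
    parser.read()  # fetch and parse the site's exclusion rules

    # can_fetch() reports whether the named robot may request the URL.
    if parser.can_fetch("MyBot", "http://www.example.com/private/"):
        print("MyBot may crawl this URL.")
    else:
        print("The site excludes MyBot from this URL.")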

BotSpot classifies Bots and Intelligent Agents by subject. Most of the bots you'll find discussed at BotSpot can be downloaded and used on your computer; some require a fee for permanent registration, while others are completely free. Browse through Bots by Category to begin your journey into the brave new world of bots.