INTERNET 201

Editor's note: Part I of Search Strategies appeared in Washington Technology's June 13 issue.

In devising an Internet search strategy, these current tools make the process of gathering data and information more efficient and the results more effective.
Here are three more steps (Steps 1 and 2 appeared in the last issue) that Internet searchers should consider when starting to collect information. I assume the researcher knows about URLs (uniform resource locators), feels comfortable using the Internet and possesses sufficient background, skill and knowledge to conduct the initial research and to evaluate the quality of the results.

Step 3. Ftp, telnet, Gopher and the World Wide Web are standard repositories. Ftp (file transfer protocol) is one of the earliest Internet tools, along with telnet and its close cousin rlogin. With ftp, you can send and retrieve files on networked computers throughout the world. In most cases, logging in as "anonymous" and using your e-mail address as the password allows access to files on the host machine. For a compressed listing of all ftp sites, connect to ftp://oak.oakland.edu/pub/simtelnet/simdos-l.zip.
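
The anonymous login convention is simple enough to automate. Here is a minimal sketch using Python's standard ftplib to retrieve the listing cited above; the host, path and file name are the addresses from this column and may no longer answer, and the e-mail address is a placeholder you should replace with your own.

```python
from ftplib import FTP

# Connect to the archive and log in using the anonymous convention:
# username "anonymous", your e-mail address as the password.
with FTP("oak.oakland.edu") as ftp:
    ftp.login(user="anonymous", passwd="you@example.com")
    ftp.cwd("/pub/simtelnet")                 # directory holding the listing
    with open("simdos-l.zip", "wb") as fh:    # save the compressed listing locally
        ftp.retrbinary("RETR simdos-l.zip", fh.write)
```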

Telnet is a protocol offering access to an array of databases, including library catalogs. Hytelnet (http://library.usask.ca/hytelnet/), created by Peter Scott, offers researchers an outstanding tool to explore diverse telnet sites.
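
Telnet sessions can also be scripted. Here is a minimal sketch using Python's telnetlib (shipped in the standard library through version 3.12 and removed in 3.13); the catalog host and the guest login below are hypothetical stand-ins for whatever address and login you find through Hytelnet.

```python
import telnetlib

# Open a telnet session to a (hypothetical) library catalog.
tn = telnetlib.Telnet("catalog.example.edu", 23, timeout=30)

# Wait for the login prompt, then send a guest login --
# many catalogs accept something like "library".
print(tn.read_until(b"login:", timeout=10).decode("ascii", "replace"))
tn.write(b"library\r\n")

# Read and display whatever the catalog sends back, then disconnect.
print(tn.read_some().decode("ascii", "replace"))
tn.close()
```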

Gopher, a popular tool in the early 1990s, lets you search for a wide variety of data, mainly text. The only address you need, gopher://liberty.uc.wlu.edu/, points you to the excellent work of John Doyle. At this URL, you find references to 4,987 public gopher servers worldwide sorted by topic and geographic location.
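
Gopher's wire protocol is simple: the client connects to port 70, sends a selector string terminated by a carriage return and line feed, and reads back a menu. Here is a minimal sketch using a raw socket, since current Python no longer ships a gopher module; the host is the 1996 address above and may no longer answer.

```python
import socket

# Connect to the gopher server on its standard port, 70.
with socket.create_connection(("liberty.uc.wlu.edu", 70), timeout=30) as s:
    s.sendall(b"\r\n")                 # an empty selector requests the root menu
    menu = b""
    while chunk := s.recv(4096):       # read until the server closes the connection
        menu += chunk

print(menu.decode("latin-1"))          # each line of the reply is one menu item
```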

The Web contains an estimated 20 million to 50 million unique URLs. For researchers interested in quality versus quantity, one of the key resources is the WWW Virtual Library (http://www.w3.org/pub/DataSources/bySubject/Overview.html).

Step 4. With the current popularity of the Web and the increasing use of browsers such as Netscape Navigator (current version: 3.0b4) and Internet Explorer (current version: 3.0b1), researchers must understand how to use a number of so-called search engines. RTFM, or Read The Fantastic Manual (or help pages), is the best advice. While the different engines search different areas of Web pages, the more popular ones are HotBot (http://www.hotbot.com/; 50 million URLs), Alta Vista (http://www.altavista.digital.com/; 44 million URLs) and Lycos (http://www.lycos.com/; 39 million URLs); you can find more than 45 engines in all. A new book by one of the on-line industry's top researchers, Mary Ellen Bates, is a must-have for those serious about tracking down information. "The Online Deskbook: Essential Desk Reference for Online and Internet Searchers," published by Pemberton, costs $29.95.
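
Most engines accept a query as part of the URL, which means a script can fetch results the same way a browser does. Here is a minimal sketch in Python; the endpoint and parameter name are hypothetical, so read each engine's help pages for its actual query syntax.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Encode the search terms as a URL query string.
params = urlencode({"q": "internet search strategy"})

# Hypothetical endpoint -- substitute the engine's real search URL.
url = "http://www.example-engine.com/search?" + params

with urlopen(url, timeout=30) as resp:
    page = resp.read().decode("latin-1", "replace")

print(page[:500])    # first part of the raw results page
```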

Step 5. A handful of miscellaneous search commands, for example finger, name server lookup, whois, ping and traceroute, let you find information about people, servers and hosts. For those seeking a shortcut, I have placed links to many of the sites mentioned in this column at X Marks the Spot (http://www.cais.com/makulow/x.html).
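
Several of these commands ride on protocols simple enough to speak directly. Whois, for instance, expects a single query line on port 43 and returns plain text. Here is a minimal sketch in Python; the InterNIC server name is a long-standing whois host, and the query is purely illustrative.

```python
import socket

def whois(query, server="whois.internic.net"):
    """Send one whois query and return the server's plain-text reply."""
    with socket.create_connection((server, 43), timeout=30) as s:
        s.sendall(query.encode("ascii") + b"\r\n")   # one query line, CRLF-terminated
        reply = b""
        while chunk := s.recv(4096):                 # read until the server closes
            reply += chunk
    return reply.decode("latin-1")

print(whois("example.com"))
```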

John Makulowich writes, talks and trains on the Internet. You can reach him at john@trainer.com. For a list of his upcoming free seminars, see http://www.cais.com/makulow/workshop.html