With today’s networked PCs and the use of e-mail attachments it is easy to send a copy of a file or files from one computer to another, because networks already include all the facilities for doing so. In earlier years, many PCs were not networked but could be connected via a dial-up modem. To establish the connection, a terminal program running on one PC had to negotiate with its counterpart on the other machine, agreeing on whether data would be sent in 7- or 8-bit chunks and whether a parity bit would be included for error checking (see error correction). The sending program would inform the receiving program of the name and basic type of the file. For binary files (files intended to be interpreted as literal binary codes, as with executable programs, images, and so on) the contents would be sent unchanged. For text files, there might be the issue of which character set (7-bit or 8-bit ASCII) was being used, and whether the ends of lines were to be marked with a CR (carriage return) character, an LF (linefeed), or both (see characters and strings).
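To make the text-file issues concrete, here is a minimal Python sketch of the kind of preparation a sending program might do: normalizing line endings to an agreed convention and checking that the content fits in 7-bit ASCII. The file name and parameter choices are hypothetical, and this is an illustration of the idea rather than any particular terminal program's behavior.

    # Illustrative sketch: prepare a text file for transfer by normalizing
    # line endings and (optionally) verifying that it is 7-bit ASCII.
    def prepare_text_file(path, line_ending=b"\r\n", seven_bit=True):
        with open(path, "rb") as f:
            data = f.read()

        # Normalize any mix of CR, LF, or CR+LF to the agreed line ending.
        data = data.replace(b"\r\n", b"\n").replace(b"\r", b"\n")
        data = data.replace(b"\n", line_ending)

        if seven_bit and any(byte > 0x7F for byte in data):
            raise ValueError("file contains bytes outside 7-bit ASCII")
        return data

    if __name__ == "__main__":
        payload = prepare_text_file("notes.txt")   # hypothetical file name
        print(f"{len(payload)} bytes ready to send")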
Implementations
Once the programs agree on the basic parameters for a file transfer, the transfer has to be managed to ensure that it completes correctly. Typically, files are divided into blocks of data (such as 1K, or 1,024 bytes, each). During the 1970s, Ward Christensen developed Xmodem, the first widely used file transfer program for PCs running CP/M (and later, MS-DOS and other operating systems). Xmodem was quite reliable because it incorporated a checksum (and later, a more advanced CRC) to check the integrity of each data block. If an error was detected, the receiving program would request a retransmission.
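The sketch below shows the per-block integrity check in Python, assuming the original Xmodem's additive checksum (the sum of the data bytes modulo 256) and its 128-byte blocks; later variants used 1,024-byte blocks and a 16-bit CRC. The ACK and NAK byte values are the standard ASCII control codes; the rest of the code is illustrative.

    # Xmodem-style per-block integrity check (simplified, not the full protocol).
    ACK = b"\x06"   # acknowledge: block received intact
    NAK = b"\x15"   # negative acknowledge: please retransmit

    def checksum(block: bytes) -> int:
        """Additive checksum used by the original Xmodem: sum of bytes mod 256."""
        return sum(block) & 0xFF

    def receive_block(block: bytes, received_checksum: int) -> bytes:
        """Return the response the receiver would send for this block."""
        return ACK if checksum(block) == received_checksum else NAK

    if __name__ == "__main__":
        data = bytes(128)                               # one 128-byte block
        print(receive_block(data, checksum(data)) == ACK)              # True
        print(receive_block(data, (checksum(data) + 1) & 0xFF) == NAK)  # True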
The Ymodem program adds the capability of specifying and sending a batch of files. Zmodem, the latest in this line of evolution, automatically adjusts to the error rate caused by line conditions by changing the size of the data blocks used, and it also includes the ability to resume an interrupted file transfer. Another widely used file transfer protocol is Kermit, which has been implemented for virtually every platform and operating system. Besides file transfer, Kermit software offers terminal emulation and scripting capabilities. However, despite their robustness and capability, Zmodem and Kermit have been largely supplanted by the ubiquitous Web download link.
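The resume feature rests on a simple idea: the receiver reports how many bytes it already holds, and the sender seeks to that offset and continues from there. The Python sketch below illustrates only that concept, not the Zmodem wire protocol; the send_block callback and file path are hypothetical.

    # Conceptual sketch of resuming an interrupted transfer by byte offset.
    import os

    BLOCK_SIZE = 1024  # 1K blocks, as mentioned above

    def resume_send(path, bytes_already_received, send_block):
        """Send the remainder of a file starting at the receiver's reported offset."""
        total = os.path.getsize(path)
        with open(path, "rb") as f:
            f.seek(bytes_already_received)    # skip what the receiver already has
            sent = bytes_already_received
            while sent < total:
                block = f.read(BLOCK_SIZE)
                send_block(block)             # hypothetical transport callback
                sent += len(block)
        return sent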
In the UNIX world, the ftp (file transfer protocol) program has been a reliable workhorse for almost 30 years. With ftp, the user at the PC or terminal connects to an ftp server on the machine that has the desired files. A variety of commands are available for specifying the directory, listing the files in the directory, specifying binary or text mode, and so on. While the traditional implementation uses typed text commands, there are now many ftp clients available for PCs that use a graphical interface with menus and buttons and allow files to be selected and dragged between the local and remote machines. Even though many files can now be downloaded through HTML links on Web pages, ftp remains an efficient way to transfer batches of files, such as for uploading content to a Web server.
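The same workflow can be scripted. The sketch below uses Python's standard ftplib module to connect, log in, change directory, list files, and retrieve a file in binary mode; the host name, credentials, directory, and file name are hypothetical placeholders.

    # Minimal ftp session using Python's standard ftplib module.
    from ftplib import FTP

    def fetch_file(host, user, password, directory, filename):
        """Connect, change directory, list files, and download one file in binary mode."""
        with FTP(host) as ftp:
            ftp.login(user=user, passwd=password)
            ftp.cwd(directory)                  # specify the remote directory
            print(ftp.nlst())                   # list the files in that directory
            with open(filename, "wb") as out:
                # RETR transfers the file; retrbinary uses binary (image) mode
                ftp.retrbinary(f"RETR {filename}", out.write)

    if __name__ == "__main__":
        fetch_file("ftp.example.com", "anonymous", "guest@example.com",
                   "/pub", "README")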