This section compares antique modems with modern ones. Read it if you are interested in modem history or intend to actually use an antique modem.
Before V.32, modems typically had speeds of 300 to 2400 bps. Some much faster ones (such as 19.2 kbps) used non-standard protocols, so to use these "fast" modems, both modems on a connection usually needed to be of the same brand.
Prior to the V.42 standard for error correction and the V.42bis (1990) standard for data compression, the MNP standards were usually used for both error correction and data compression. The X.PC error correction standard was used on some commercial data networks. Compression and error correction were available on some 2400 bps modems.
Around 1980 many modems ran at only 300 bps (which was also 300 baud). That is only 0.3 kbps; modern modems are over 100 times faster. Some of these old, slow modems are still in use, so they are not really "antique" quite yet.
The term "autobauding" has a few different meanings. In general it means the automatic adjustment of either the modem-to-modem speed or the modem-to-serial_port speed.
Modern modems negotiate the modem-to-modem speed and protocol when they first connect to each other. If one side can't negotiate, the other side should accept whatever speed and protocol the antique modem has available. Sometimes this is called "autobauding". When both modems automatically lower their speed due to a noisy line it's called "fallback". Thus users of modern modems (or computer programs in your PC) don't need to deal with this (unless the S37 register has been set so as to disable autobauding).
But many old modems didn't have such autobauding (although many had fallback). If you have such a modem, it will likely work OK if the other modem you connect to is a modern one that can adjust its speed and protocol to yours. But a problem arises when both modems that want to communicate are antique and neither supports autobauding. How was this handled?
In olden days, a computer dial-in site might have a number of phone lines, each with a modem of a specific speed on it. For example, if you had a 1200 bps modem, you simply dialed in only to the telephone numbers that supported that speed. Once a site obtained modems that could support various speeds on the same modem and automatically detect the caller's speed (do autobauding), then people could call in using modems that didn't do this autobauding (provided that their speed and protocol were supported).
When a modern modem is sent an init string (or a dial command), it detects the speed of the serial port and sets its modem-to-serial_port speed to this value. It does this by sensing the speed of the "AT" at the beginning of the string. This too is sometimes called autobauding. This same speed is retained once the modem connects to another modem.
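The idea behind sensing the speed of the "AT" can be sketched as follows. Since "A" (0x41) is sent LSB-first, its low start bit lasts exactly one bit time, so measuring that pulse width reveals the baud rate. This is a hedged illustration, not any real modem's firmware; the rate table and function name are made up for the example.

```python
# Sketch of "AT" autobaud detection: the width of the start bit of
# the first character ("A") is one bit time, so it maps directly to
# a baud rate (bit time in microseconds = 1e6 / rate).
STANDARD_RATES = [300, 1200, 2400, 4800, 9600, 19200, 38400]

def detect_baud(start_bit_us: float) -> int:
    """Map a measured start-bit width (microseconds) to the nearest
    standard baud rate."""
    return min(STANDARD_RATES, key=lambda r: abs(1e6 / r - start_bit_us))

# A 1200 bps sender has ~833 microsecond bit times:
print(detect_baud(833))   # -> 1200
print(detect_baud(104))   # -> 9600
```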
Old modems couldn't retain the serial-port speed after connection, so one would set the serial port speed (with stty or the like) to exactly the modem-to-modem speed (such as 1200 bps). If the modem had a choice of speeds, the AT register S37 could select one. But for dial-in, when there was a choice of speeds (via modem-to-modem autobauding), a connection made at say 2400 bps would change the modem-to-serial_port speed to 2400 bps. Then one would need to switch getty to 2400 bps.
How could getty determine this speed? One way was to read the "CONNECT" message when the call arrived, which might show the modem-to-modem speed. But how could getty read this "CONNECT" message? What made this possible was that the modem sent the "CONNECT" message at whatever serial-port speed was already in effect (as detected from the original "AT" sent to the modem), and only afterwards changed the serial-port speed to equal the modem-to-modem speed. Thus getty could determine what speed to switch to from the "CONNECT" message (provided it showed the modem-to-modem speed).
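The parsing step a getty would need can be sketched like this. The fallback of 300 bps for a bare "CONNECT" follows the old Hayes convention; treat the function name and default as assumptions for illustration.

```python
import re

# Sketch: extract the new modem-to-modem speed from the modem's
# "CONNECT" result code, which arrives at the old serial-port speed.
def speed_from_connect(line, default=300):
    """Return the speed from a CONNECT message, the Hayes-style
    default for a bare "CONNECT", or None for other result codes."""
    m = re.match(r"\s*CONNECT(?:\s+(\d+))?\s*$", line)
    if not m:
        return None                 # e.g. "NO CARRIER", "BUSY"
    return int(m.group(1)) if m.group(1) else default

print(speed_from_connect("CONNECT 2400"))  # -> 2400
print(speed_from_connect("CONNECT"))       # -> 300
print(speed_from_connect("NO CARRIER"))    # -> None
```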
Another (but cruder) way to set the serial port speed to match is as follows. The person trying to log in over a modem connection doesn't see any login prompt (because getty is sending it out at the wrong speed), so s/he hits a "break" key to send a break signal over the phone line (via the modem) to getty. Getty gets this signal and switches to the next baud rate specified in its configuration file. This continues until getty finally gets the baud rate correct and a login prompt displays. Note that PC keyboards have no "break" key but dumb-terminal keyboards did. Also note that serial ports can communicate break signals even if they are set to different baud rates. Mgetty, agetty, and uugetty can do this break "manual bauding".
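The "manual bauding" loop above amounts to cycling through a configured list of rates, one step per break received, wrapping around. This is a toy sketch, not how any particular getty is implemented; the rate list mimics a gettydefs-style cycle.

```python
import itertools

# Sketch: each received break makes getty try the next rate from its
# configuration, wrapping around until the login prompt is readable.
def baud_cycler(rates):
    """Yield the next baud rate to try each time a break arrives."""
    return itertools.cycle(rates)

cycle = baud_cycler([2400, 1200, 300])
print(next(cycle))  # first attempt        -> 2400
print(next(cycle))  # after one break      -> 1200
print(next(cycle))  # after another break  -> 300
```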
In Linux, there's a problem if the speed is set to one not supported by Linux's serial port (for example 7200 bps). You may dial out and connect at 7200 bps (both modem-to-modem and modem-to-serial_port speed) but see only garbage, since Linux doesn't support 7200 bps on the serial port. Once you connect there is no simple way to hang up, because even the "+++" escape sequence can't be sent to the modem over a serial port that can't run at 7200 bps.
To dial out by the antique method using a modern modem, set &Q0 N0 and S37=5 (if you want 1200 bps). The meanings of the S37 settings vary with the make of modem, except that S37=0 is the default, which connects the modern way at the highest supported speed.
Modern modems can use almost any serial port speed and it doesn't depend at all on modem-to-modem speed. To do this they employ speed buffering and flow control. Speed buffering means that modems have buffers so that there can be a difference between the modem-to-modem speed and the modem-to-serial_port speed. If the flow entering the modem is faster than the flow exiting it, the excess flow is simply stored in a buffer in the modem. Then to prevent the buffer from overflowing, the modem sends a flow control signal to stop the input flow to the modem. This is true for either direction of flow. See Flow Control for more details.
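Speed buffering and flow control can be modeled as a bounded buffer with high- and low-water marks: the modem asserts a "stop" signal (like dropping CTS, or sending XOFF) before the buffer overflows, and releases it once the phone-line side has drained enough. The class, sizes, and thresholds below are made up for illustration.

```python
from collections import deque

# Toy model of speed buffering: the serial-port side fills the buffer
# faster than the modem-to-modem side drains it, so the modem uses
# flow control to pause the input before the buffer overflows.
class SpeedBuffer:
    def __init__(self, capacity=8, high_water=6, low_water=2):
        self.buf = deque()
        self.capacity, self.high, self.low = capacity, high_water, low_water
        self.stop = False            # True = "stop sending" asserted

    def from_serial(self, byte):
        """Byte arriving at the (fast) serial-port side."""
        if len(self.buf) >= self.capacity:
            raise OverflowError("flow control was ignored")
        self.buf.append(byte)
        if len(self.buf) >= self.high:
            self.stop = True         # assert flow control

    def to_phone_line(self):
        """Drain one byte at the (slow) modem-to-modem side."""
        b = self.buf.popleft() if self.buf else None
        if self.stop and len(self.buf) <= self.low:
            self.stop = False        # release flow control
        return b

m = SpeedBuffer()
for b in b"ATDT555":          # 7 bytes pour in quickly
    m.from_serial(b)
print(m.stop)                 # -> True (past the high-water mark)
while m.to_phone_line() is not None:
    pass
print(m.stop)                 # -> False (drained below low-water mark)
```

The same mechanism works in the other direction too: when the phone line delivers data faster than the serial port accepts it, the modem buffers it and signals the remote side to pause.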
Hayes introduced the AT command set and other modem manufacturers adopted it as a de facto standard. Before AT commands, many modems were configured with DIP switches. Another command set is the CCITT V.25bis command set; some modems supported both CCITT and AT commands. CCITT V.25bis also specifies how synchronous modem-to-serial_port communication is to take place, using either the ASCII or 8-bit EBCDIC character set.
MNP 2, 3, or 4 were used for error correction; MNP 5 added compression. Modern modems generally use V.42 (error correction) and V.42bis (compression). Many modems support both MNP and V.42.
END OF Modem-HOWTO