Difference between analog and digital computers

Analog computers (there are still some out there) form linear combinations of continuous input quantities (voltage amplitudes, currents, frequencies, or phases) to produce the required output; digital computers use two-state variables that are either "on" or "off" (true or false, 1 or 0, yes or no) as logical building blocks to derive a digital output sequence from a digitized input. Because digital signals can also be converted to analog, and vice versa, you can chain A/D converters, a digital computer, and D/A converters to simulate an analog computer or a communication channel, as digital telephones do.
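
To make that A/D > digital > D/A chain concrete, here is a minimal Python sketch. The sample rate, bit depth, and voltage range are illustrative assumptions, not from the text above: it samples a sine wave as the "analog" input, quantizes each sample to an 8-bit binary code (the A/D step), and maps each code back to a voltage (the D/A step).

    import math

    SAMPLE_RATE = 8000   # samples per second (8 kHz, a common telephony rate)
    BITS = 8             # resolution of the hypothetical converter
    LEVELS = 1 << BITS   # 2**8 = 256 quantization levels
    FULL_SCALE = 1.0     # assume the analog input spans -1.0 .. +1.0 volts

    def adc(voltage):
        """A/D conversion: quantize an analog voltage to an integer code."""
        v = max(-FULL_SCALE, min(FULL_SCALE, voltage))   # clamp to input range
        # Map [-FULL_SCALE, +FULL_SCALE] onto the codes 0 .. LEVELS - 1.
        return round((v + FULL_SCALE) / (2 * FULL_SCALE) * (LEVELS - 1))

    def dac(code):
        """D/A conversion: map an integer code back to a voltage."""
        return code / (LEVELS - 1) * 2 * FULL_SCALE - FULL_SCALE

    # "Analog" input: a 440 Hz sine wave, sampled at SAMPLE_RATE.
    for n in range(8):
        t = n / SAMPLE_RATE
        analog_in = math.sin(2 * math.pi * 440 * t)
        code = adc(analog_in)    # two-state (binary) representation
        analog_out = dac(code)   # reconstructed analog value
        print(f"t = {t * 1000:.3f} ms  in = {analog_in:+.4f}  "
              f"code = {code:3d}  out = {analog_out:+.4f}")

Running it shows that the reconstructed value differs slightly from the input: that quantization error is the price of representing a continuous quantity with a finite number of two-state digits.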