Dennis Ritchie and John McCarthy
Dennis Ritchie and John McCarthy, machine whisperers, died on October 8th and 24th respectively, aged 70 and 84
Nov 5th 2011 | from the print edition
NOW that digital devices are fashion items, it is easy to forget what really accounts for their near-magical properties. Without the operating systems which tell their different physical bits what to do, and without the languages in which these commands are couched, the latest iSomething would be a pretty but empty receptacle. The gizmos of the digital age owe part of their calculating souls to Dennis Ritchie and John McCarthy.
As was normal in the unformed days of computer science in the 1950s and 1960s, both men came to the discipline through maths. They were rather good with numbers. As a teenager Mr McCarthy taught himself calculus from textbooks found at the California Institute of Technology in balmy Pasadena, where his family had moved from Boston because of his delicate health. Mr Ritchie was not quite as precocious. He breezed through school in New Jersey and went on to Harvard to study physics. After receiving a bachelor’s degree, however, he decided, with typical modesty, that he was “not smart enough to be a physicist”.
When Mr McCarthy and Mr Ritchie first developed an urge to talk to machines, people still regarded the word “digital” as part of the jargon of anatomy. If they no longer do, that is because of the new vernaculars invented to cajole automatons into doing man’s bidding. In 1958 Mr McCarthy came up with the list-processing language, or LISP. It is the second-oldest high-level programming language still in use today, after Fortran, and one whose grammar and vocabulary were more perspicuous and versatile than the machine code early programmers had to use. A little over a decade later Mr Ritchie created C.
C fundamentally changed the way computer programs were written. For the first time it enabled the same programs to work, without too much tweaking, on different machines; before, they had to be tailored to particular models. Much of modern software is written using one of C’s more evolved dialects. These include Objective-C (which Apple favours), C# (espoused by rival Microsoft) and Java (the choice for a host of internet applications). Mr Ritchie and his life-long collaborator, Ken Thompson, then used C to rewrite UNIX, an operating system whose powerful simplicity endeared it to the operators of the minicomputers which were starting to proliferate in universities and companies in the 1970s. Nowadays its iterations undergird much of the internet and breathe life into most mobile devices, whether based on Google’s Android or Apple’s iOS.
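A taste of what that portability meant: the opening example from Mr Ritchie’s own book with Brian Kernighan, “The C Programming Language” (1978), lightly modernised here to current standard C, builds and runs unchanged on virtually any machine with a C compiler.

    /* "hello, world": the first example in Kernighan and Ritchie's
       "The C Programming Language" (1978), in modern standard form.
       The same source compiles unmodified wherever a C compiler runs,
       from 1970s minicomputers' descendants to today's phones. */
    #include <stdio.h>

    int main(void)
    {
        printf("hello, world\n");
        return 0;
    }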
Mr McCarthy has had less direct impact. That is partly because he believed, wrongly, that minicomputers were a passing fad. In the late 1950s, while at the Massachusetts Institute of Technology (MIT), he pioneered “time-sharing”, by which multiple users could work on a single mainframe simultaneously. Mr Ritchie, who moonlighted as a mainframe operator at MIT while a graduate student at nearby Harvard, also dabbled in time-sharing. Yet unlike his younger colleague, whose UNIX spurred the development of mini- and later microcomputers, Mr McCarthy always argued that the future lay in simple terminals hooked up remotely to a powerful mainframe which would both store and process data: a notion vindicated only recently, as cloud computing has spread.
Needed: 1.8 Einsteins
As for LISP, Mr McCarthy created it with an altogether different goal in mind—one that was, in a way, even more ambitious than Mr Ritchie’s. Whereas Mr Ritchie was happy giving machines orders, Mr McCarthy wanted them—perhaps because he had never suffered human fools gladly—to talk back. Intelligently. LISP was designed to spark this conversation, and with it “artificial intelligence”, a term Mr McCarthy coined hoping it would attract money for the first conference on the subject at Dartmouth in 1956.
In 1962 Mr McCarthy left MIT for Stanford, where he founded its Artificial Intelligence Laboratory. He set himself the goal of building a thinking machine in ten years, something he would later admit was hubristic. Not that the technology wasn’t up to it. The problem lay elsewhere: in the fact that “we understand human mental processes only slightly better than a fish understands swimming.” An intelligent computer, he quipped, would require “1.8 Einsteins and one-tenth of the resources of the Manhattan Project” to construct.
Neither was forthcoming, though the Department of Defence did take an interest in Mr McCarthy’s work at Stanford from the start. Mr Ritchie, too, was briefly on the Pentagon’s payroll, at Sandia National Laboratory. He did not stay long, though. “It was nearly 1968,” he later recalled, “and somehow making A-bombs for the government didn’t seem in tune with the times.” So in 1967 he moved to AT&T’s Bell Laboratories in Murray Hill, New Jersey, where his father had worked for many years, and where both C and UNIX were born. He never left.
For his part, Mr McCarthy continued to tinker away at a truly thinking machine at Stanford. He never quite saw his dream realised. Mr Ritchie had more luck. “It’s not the actual programming that’s interesting,” he once remarked. “It’s what you can accomplish with the end results.” Amen to that, Mr McCarthy would have said.