One of those is Peter G. Neumann, now an 80-year-old computer scientist at SRI International, a pioneering engineering research laboratory here.
As an applied-mathematics student at Harvard, Dr. Neumann had a two-hour breakfast with Einstein on Nov. 8, 1952. What the young math student took away was a deeply held philosophy of design that has remained with him for six decades and has been his governing principle of computing and computer security.
For many of those years, Dr. Neumann (pronounced NOY-man) has remained a voice in the wilderness, tirelessly pointing out that the computer industry has a penchant for repeating the mistakes of the past. He has long been one of the nation’s leading specialists in computer security, and early on he predicted that the security flaws that have accompanied the pell-mell explosion of the computer and Internet industries would have disastrous consequences.
“His biggest contribution is to stress the ‘systems’ nature of the security and reliability problems,” said Steven M. Bellovin, chief technology officer of the Federal Trade Commission. “That is, trouble occurs not because of one failure, but because of the way many different pieces interact.”
Dr. Bellovin said that it was Dr. Neumann who originally gave him the insight that “complex systems break in complex ways” — that the increasing complexity of modern hardware and software has made it virtually impossible to identify the flaws and vulnerabilities in computer systems and ensure that they are secure and trustworthy.
It is remarkable, then, that years after most of his contemporaries have retired, Dr. Neumann is still at it and has seized the opportunity to start over and redesign computers and software from a “clean slate.”
He is leading a team of researchers in an effort to completely rethink how to make computers and networks secure, in a five-year project financed by the Pentagon’s Defense Advanced Research Projects Agency, or Darpa. His partner in the effort is Robert N. Watson, a computer security researcher at Cambridge University’s Computer Laboratory.
“I’ve been tilting at the same windmills for basically 40 years,” said Dr. Neumann recently during a lunchtime interview at a Chinese restaurant near his art-filled home in Palo Alto, Calif. “And I get the impression that most of the folks who are responsible don’t want to hear about complexity. They are interested in quick and dirty solutions.”
An Early Voice for Security
Dr. Neumann, who left Bell Labs and moved to California as a single father with three young children in 1970, has occupied the same office at SRI for four decades. Until the building was recently modified to make it earthquake-resistant, the office had attained notoriety for the towering stacks of computer science literature that filled every cranny. Legend has it that colleagues who visited the office after the 1989 earthquake were stunned to discover that while other offices were in disarray from the 7.1-magnitude quake, nothing in Dr. Neumann’s office appeared to have been disturbed.
A trim and agile man, with piercing eyes and a salt-and-pepper beard, Dr. Neumann has practiced tai chi for decades. But his passion, besides computer security, is music. He plays a variety of instruments, including bassoon, French horn, trombone and piano, and is active in a variety of musical groups. At computer security conferences it has become a tradition for Dr. Neumann to lead his colleagues in song, playing tunes from Gilbert and Sullivan and Tom Lehrer.
Until recently, security was a backwater in the world of computing. Today it is a multibillion-dollar industry, though one of dubious competence, and safeguarding the nation’s computerized critical infrastructure has taken on added urgency. President Obama cited it in the third debate of the presidential campaign, which focused on foreign policy, as something “we need to be thinking about” as part of the nation’s military strategy.
Dr. Neumann reasons that the only workable and complete solution to the computer security crisis is to study the past half century’s research, cherry-pick the best ideas and then build something new from the bottom up.
“Fundamentally all of the stuff we’re doing to secure networks today is putting bandages on and putting our fingers in the dike, and the dike springs a leak somewhere else,” said Richard A. Clarke, the former White House counterterrorism and cybersecurity adviser.
“We have not fundamentally redesigned our networks for 45 years,” he said. “Sure, it would cost an enormous amount to rearchitect, but let’s start it and see if it works better and let the marketplace decide.”
Dr. Neumann is one of the most qualified people to lead such an effort to rethink security. He has been there for the entire trajectory of modern computing, from its earliest days onward. He took his first computing job in the summer of 1953, when he was hired as a programmer working on an I.B.M. punched-card calculator.
Today the SRI-Cambridge collaboration is one of several dozen research projects financed by Darpa’s Information Innovation Office as part of a “cyber resilience” effort started in 2010.
Run by Dr. Howard Shrobe, an M.I.T. computer scientist who is now a Darpa program manager, the effort began with a premise: If the computer industry got a do-over, what should it do differently?
The program includes two separate but related efforts: Crash, for Clean-Slate Design of Resilient Adaptive Secure Hosts; and MRC, for Mission-Oriented Resilient Clouds. The idea is to reconsider computing entirely, from the silicon wafers on which circuits are etched to the application programs run by users, as well as the cloud services that are placing ever more private and personal data in remote data centers.
Clean Slate is financing research to explore how to design computer systems that are less vulnerable to computer intruders and recover more readily once security is breached.
Dr. Shrobe argues that because the industry is now in a fundamental transition from desktop to mobile systems, it is a good time to completely rethink computing. But among the biggest challenges, he said, is the monoculture of the computer “ecosystem” of desktops, servers and networks.
“Nature abhors monocultures, and that’s exactly what we have in the computer world today,” said Dr. Shrobe. “Eighty percent are running the same operating system.”
Lessons From Biology
To combat uniformity in software, designers are now pursuing a variety of approaches that make computer system resources moving targets. Already some computer operating systems scramble internal addresses much the way a magician might perform the trick of hiding a pea in a shell. The Clean Slate project is taking that idea further, essentially creating software that constantly shape-shifts to elude would-be attackers.
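The existing address-scrambling trick is easy to observe. Here is a minimal C sketch, assuming a system with address-space layout randomization enabled (the usual default on modern operating systems); run it twice and the printed address typically changes, so an attacker can no longer count on data sitting at a known location.

    #include <stdio.h>

    int main(void) {
        int local = 0;   /* a variable that lives on the stack */
        /* With address-space layout randomization, the stack is placed
           at a different base address on each run, so this location is
           not predictable from one execution to the next. */
        printf("stack variable lives at %p\n", (void *)&local);
        return 0;
    }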
Because the Internet enables almost any computer in the world to connect directly to any other, an attacker who identifies a single vulnerability can almost instantly compromise a vast number of systems.
But borrowing from another science, Dr. Neumann notes that biological systems have multiple immune systems — not only are there initial barriers, but a second system consisting of sentinels like T cells has the ability to detect and eliminate intruders and then remember them to provide protection in the future.
In contrast, today’s computer and network systems were largely designed with security as an afterthought, if at all.
One design approach that Dr. Neumann’s research team is pursuing is known as a tagged architecture. In effect, each piece of data in the experimental system must carry “credentials” — an encryption code ensuring that it is something the system trusts. If the data or program’s papers are not in order, the computer won’t process them.
A related approach is called a capability architecture: every software object in the system carries special information describing its access rights on the computer, and a dedicated part of the processor checks those rights before the object can be used.
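A loose software analogy of both ideas, with made-up names purely for illustration (in the actual research designs the checking is done by the hardware itself, not by C code), might look like this:

    #include <stdio.h>
    #include <stdbool.h>

    /* Illustrative only: a "capability" bundles a pointer with the
       rights its holder has over the memory it names. */
    typedef struct {
        char        *base;       /* where the object lives            */
        unsigned int length;     /* how much of it may be touched     */
        bool         may_write;  /* whether the holder may write here */
    } capability;

    /* Every access is checked against the credentials first; if the
       "papers" are not in order, the access is simply refused. */
    static bool cap_write(capability cap, unsigned int offset, char value) {
        if (!cap.may_write || offset >= cap.length)
            return false;
        cap.base[offset] = value;
        return true;
    }

    int main(void) {
        char buffer[8];
        capability cap = { buffer, sizeof buffer, true };
        printf("write inside bounds:  %s\n", cap_write(cap, 3, 'x') ? "allowed" : "refused");
        printf("write outside bounds: %s\n", cap_write(cap, 99, 'x') ? "allowed" : "refused");
        return 0;
    }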
For Dr. Neumann, one of the most frustrating parts of the process is watching problems that were solved technically as long as four decades ago continue to plague the computer world.
A classic example is the “buffer overflow” vulnerability, a design flaw that permits an attacker to send data containing a long string of characters that overruns an area of a computer’s memory, causing the program to fail and making it possible for the intruder to execute a malicious program.
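In code, the flaw is as simple as it is old. A minimal C sketch of the flawed pattern (not the worm’s actual code) and its long-known remedy:

    #include <stdio.h>
    #include <string.h>

    /* The flawed pattern: 16 bytes are reserved, but the copy never
       checks the input's length.  Anything longer spills past the end
       of the buffer and overwrites adjacent memory. */
    void handle_input(const char *input) {
        char buffer[16];
        strcpy(buffer, input);            /* unchecked copy: the vulnerability */
        printf("got: %s\n", buffer);
    }

    /* The long-known remedy is simply to respect the buffer's size. */
    void handle_input_safely(const char *input) {
        char buffer[16];
        strncpy(buffer, input, sizeof buffer - 1);
        buffer[sizeof buffer - 1] = '\0'; /* ensure the string is terminated */
        printf("got: %s\n", buffer);
    }

    int main(void) {
        handle_input_safely("a string far longer than the sixteen bytes reserved for it");
        return 0;
    }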
Almost 25 years ago, Robert Tappan Morris, then a graduate student at Cornell University, used the technique to make his worm program spread throughout an Internet that was then composed of only about 50,000 computers.
Dr. Neumann had attended Harvard with Robert Morris, Robert Tappan Morris’s father, and then worked with him at Bell Laboratories in the 1960s and 1970s, where the elder Mr. Morris was one of the inventors of the Unix operating system. Dr. Neumann, a close family friend, was prepared to testify at the trial of the young programmer, who carried out his hacking stunt with no real malicious intent. He was convicted and fined, and is now a professor at M.I.T.
By the time the Morris worm ran amok on the Internet, the buffer overflow flaw had already been identified and controlled in the Multics operating system research project, which Dr. Neumann helped lead from 1965 to 1969.
An early Pentagon-financed design effort, Multics was the first systematic attempt to grapple with how to secure computer resources that are shared by many users. Yet many of the Multics innovations were ignored at the time because I.B.M. mainframes were quickly coming to dominate the industry.
Hope and Worry
The experience left Dr. Neumann — who had coined the term “Unics” to describe a programming effort by Ken Thompson that would lead to the modern Unix operating system — simultaneously pessimistic and optimistic about the industry’s future.
“I’m fundamentally an optimist with regard to what we can do with research,” he said. “I’m fundamentally a pessimist with respect to what corporations who are fundamentally beholden to their stockholders do, because they’re always working on short-term appearance.”
That dichotomy can be seen in the Association for Computing Machinery’s Risks Forum newsgroup, a collection of e-mails reporting computer failures and foibles that Dr. Neumann has edited since 1985. With hundreds of thousands, and possibly millions, of followers, it is one of the most widely read mailing lists on the Internet — an evolving compendium of computer failures, flaws and privacy issues that he has maintained and annotated with wry comments and the occasional pun. In 1995 the list became the basis for his book “Computer-Related Risks” (Addison-Wesley/ACM Press).
While the Risks list is a reflection of Dr. Neumann’s personality, it also displays his longtime interest in electronic privacy. He is deeply involved in the technology issues surrounding electronic voting — he likes to quote Stalin on the risks: “It’s not who votes that counts, it’s who counts the votes” — and has testified, served on panels and written widely on the subject.
Dr. Neumann grew up in New York City, in Greenwich Village, but his family moved to Rye, N.Y., where he attended high school. J. B. Neumann, Dr. Neumann’s father, was a noted art dealer, first in Germany and then in New York, where he opened the New Art Circle gallery after moving to the United States in 1923. Dr. Neumann recalls his father’s tale of eating in a restaurant in Munich, where he had a gallery, and finding that he was seated next to Hitler and some of his Nazi associates. He left Germany for the United States soon afterward.
His mother, Elsa Schmid Neumann, was an artist. His two-hour breakfast with Einstein took place because she had been commissioned to create a colorful mosaic of Einstein and had become friendly with him. The mosaic is now displayed in a reference reading room in the main library at Boston University.
That breakfast conversation with Einstein was the start of a lifelong romance with both the beauty and the perils of complexity, something Einstein himself hinted at as they talked.
“What do you think of Johannes Brahms?” Dr. Neumann asked the physicist.
“I have never understood Brahms,” Einstein replied. “I believe Brahms was burning the midnight oil trying to be complicated.”