MOUNTAIN VIEW, California--A boot drive and disk stack from a 1956 IBM RAMAC won't set every heart racing. Likewise, to many people, a 1959 Telefunken RAT 700/2 analog computer looks like a cross between an antique telephone switchboard and the instrument panel of a German U-boat. Those of us who are a little older and technically inclined still smile at the memory of the exquisitely clever slide rule. But can a 1981 Osborne computer, the size of a briefcase yet almost too heavy to carry, with its glowing 5-inch screen, stir even a flicker of nostalgia?
[Image: IBM PS/2 Model 30 ad]
Twenty-five
years ago, IBM announced the Personal System/2 (PS/2), a new line of
IBM PC-compatible machines that capped an era of profound influence on
the personal computer market.
By the time of the PS/2's launch in 1987, IBM PC clones--unauthorized
work-alike machines that could utilize IBM PC hardware and
software--had eaten away a sizable portion of the market for IBM's own PC platform.
Compare the numbers: In 1983, IBM controlled roughly 76 percent of the
PC-compatible market, but in 1986 its share slipped to 26 percent.
IBM devised a plan to regain control of the PC-compatible market by
introducing a new series of machines--the PS/2 line--with a proprietary
expansion bus, operating system, and BIOS that would require clone
makers to pay a hefty license if they wanted to play IBM's game.
Unfortunately for IBM, PC clone manufacturers had already been playing
their own game.
In the end, IBM failed to reclaim a market that was quickly slipping
out of its grasp. But the PS/2 series left a lasting impression of
technical influence on the PC industry that continues to this day.
Attack of the Clones
When IBM created the PC in 1981, it used a large number of easily
obtainable, off-the-shelf components to construct the machine. Just
about any company could have put them together into a computer system,
but IBM added a couple of features that would give the machine a flavor
unique to IBM. The first was its BIOS,
the basic underlying code that governed use of the machine. The second
was its disk operating system, which had been supplied by Microsoft.
When Microsoft signed the deal to supply PC-DOS to IBM, it included a
clause that allowed Microsoft to sell that same OS to other computer
vendors--which Microsoft did (labeling it "MS-DOS") almost as soon as
the PC launched.
[Image: Ad from the April 1987 launch, featuring the former cast of the 'M*A*S*H' TV show]
That
wasn't a serious problem at first, because those non-IBM machines,
although they ran MS-DOS, could not legally utilize the full suite of
available IBM PC software and hardware add-ons.
As the IBM PC grew in sales and influence, other computer
manufacturers started to look into making PC-compatible machines. Before
doing so, they had to reverse-engineer IBM's proprietary BIOS code
using a clean-room technique to avoid infringing on IBM's copyrights and trademarks.
First PC Clone: MPC 1600
In June 1982, Columbia Data Products did just that, and it introduced
the first PC clone, the MPC 1600. Dynalogic and Compaq followed with PC
work-alikes of their own in 1983, and soon, companies such as Phoenix
Technologies developed IBM PC-compatible BIOS products that they freely
licensed to any company that came calling. The floodgates had opened,
and the PC-compatible market was no longer IBM's to own.
At least in the early years, though, that market did not develop free of IBM's
influence. IBM's PC XT (1983) and PC AT (1984) both brought with them
considerable innovations in PC design that cloners quickly copied.
[Image: Compaq DeskPro 386 ad. Courtesy of ToplessRobot.com]
But
that lead would not last forever. A profound shift in market leadership
occurred when Compaq released its DeskPro 386, a powerful 1986 PC
compatible that beat IBM to market in using Intel's 80386 CPU. It was an
embarrassing blow to IBM, and Big Blue knew that it had to do something
drastic to solidify its power.
That something was the PS/2. The line launched in April 1987 with a
high-powered ad campaign featuring the former cast of the hit M*A*S*H TV show, and a new slogan: "PS/2 It!"
Critics, who had seen more-powerful computers at lower prices,
weren't particularly impressed, and everyone immediately knew that IBM
planned to use the PS/2 to pull the rug out from beneath the
PC-compatible industry. But the new PS/2 did have some tricks up its
sleeve that would keep cloners busy for another couple of years in an
attempt to catch up.
Four Initial Models
IBM announced four PS/2 models during its April 1987 launch: the
Model 30, 50, 60, and 80. They ranged dramatically in power and price;
on the low end, the Model 30 (roughly equivalent to a PC XT) contained
an 8MHz 8086 CPU, 640KB of RAM, and a 20MB hard drive, and retailed for
$2295 (about $4642 in 2012 dollars when adjusted for inflation).
The most powerful configuration of the Model 80 came equipped with a
20MHz 386 CPU, 2MB of RAM, and a 115MB hard drive for a total cost of
$10,995 (about $22,243 today). Neither configuration included an OS--you
had to buy PC-DOS 3.3 for an extra $120 ($242 today).
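Those inflation adjustments boil down to a simple ratio of consumer price indexes. Here is a rough sketch, using approximate U.S. CPI annual averages (assumed values, not the exact methodology behind the figures above), which lands within a fraction of a percent of the numbers quoted:

    # Approximate U.S. CPI-U annual averages (assumptions for illustration).
    CPI_1987 = 113.6
    CPI_2012 = 229.6

    def to_2012_dollars(price_1987):
        # Scale a 1987 price by the ratio of the two price indexes.
        return price_1987 * CPI_2012 / CPI_1987

    print(round(to_2012_dollars(2295)))    # ~4638, close to the $4642 quoted above
    print(round(to_2012_dollars(10995)))   # ~22222, close to the $22,243 quoted above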
The following chart from IBM offers a more detailed view of the
systems available during the 1987 launch, and illustrates just how
complex the variety could be.
[Image: IBM chart explaining the four PS/2 models announced in April 1987]
Every unit in the line included at least one feature new to IBM's PC
offerings--and the market in general. In the following sections, I'll
discuss those new features and how they affected the PC industry.
Integrated I/O Functionality, New Memory Standard
From the IBM PC in 1981 through the PC AT in 1984, IBM preferred to
keep a minimum of features in the base unit. Instead, it allowed users
to extend their systems with expansion cards that plugged into the
internal slots. This meant that a 1981 PC, which shipped with five
slots, left little room for expansion when it already contained a
graphics card, a disk controller, a serial card, and a printer card--a
common configuration at the time.
With the PS/2, IBM chose to integrate many of those commonly used I/O
boards into the motherboard itself. Each model in the PS/2 line
included a built-in serial port, parallel port, mouse port, video
adapter, and floppy controller, which freed up internal slots for other purposes.
Computers in the PS/2 series also had a few other built-in advancements, such as the 16550 UART, a chip that allowed faster serial communications (useful when using a modem), as well as 72-pin RAM SIMM (single in-line memory module) sockets. Both items became standard across the industry over time.
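To see why the 16550 mattered for modem use, a little arithmetic helps. The sketch below is my own illustration, not anything from IBM's spec sheets; it compares the interrupt load of an older FIFO-less UART, which interrupts the CPU for every byte received, against a 16550 set to interrupt only after its 16-byte receive FIFO has filled to a trigger level:

    # Interrupts per second during a sustained serial transfer,
    # assuming 8-N-1 framing (10 bits per byte on the wire).
    baud = 9600
    bytes_per_second = baud / 10           # ~960 bytes/s at 9600 bps

    irqs_without_fifo = bytes_per_second   # 8250/16450 UARTs: one interrupt per byte
    fifo_trigger = 14                      # one of the 16550's selectable trigger levels
    irqs_with_fifo = bytes_per_second / fifo_trigger

    print(irqs_without_fifo)        # 960.0 interrupts per second
    print(round(irqs_with_fifo))    # ~69 interrupts per second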
PS/2 Keyboard and Mouse Ports
An ad describing the IBM Personal System/2.The
built-in mouse port I mentioned earlier is worth noting in more detail.
Each machine in the PS/2 line included a redesigned keyboard port and a
new mouse port, both of which used 6-pin mini-DIN connectors.
IBM intended the mouse, as a peripheral, to play a major part in the
PS/2 system. The company promised a new graphical OS (which I'll talk
about later) that would compete with the Macintosh in windowing capabilities.
Even today, many new PCs ship with "PS/2 connectors" for mice and
keyboards, although they have been steadily falling out of fashion in
favor of USB ports.
New Floppy Drives
Every model in the PS/2 line contained a 3.5-inch microfloppy drive, a
Sony-developed technology that, until then, had been featured most
prominently in Apple Macintosh computers.
The low-end PS/2 Model 30 shipped with a drive that could read and
write 720KB double-density disks. Other models introduced something
completely new: a 1440KB high-density floppy drive that would become the
PC floppy drive standard for the next 20 years.
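The 1440KB capacity falls straight out of the standard high-density disk geometry; a quick calculation shows where the number comes from:

    # Standard geometry of the 3.5-inch high-density PC floppy.
    cylinders = 80
    heads = 2
    sectors_per_track = 18
    bytes_per_sector = 512

    total_bytes = cylinders * heads * sectors_per_track * bytes_per_sector
    print(total_bytes)          # 1474560 bytes
    print(total_bytes // 1024)  # 1440 KB -- marketed as "1.44MB"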
IBM's use of the 3.5-inch floppy drive was new in the PC-compatible
world. Up to that point, IBM itself had favored traditional 5.25-inch
disk drives. This drastic format shift initially came as a great
annoyance to PC users with large libraries of software on 5.25-inch disks.
Although IBM did offer an external 5.25-inch drive option for the
PS/2 line, cloners quickly followed suit with their own 3.5-inch drives,
and many commercial software applications began shipping with both
5.25-inch and 3.5-inch floppies in the box.
VGA and MCGA
In many ways, the PS/2 line is most notable, historically, for its introduction of the Video Graphics Array standard.
Among its many modes, VGA could display 640-by-480-pixel resolution
with 16 colors on screen, and a resolution of 320 by 200 pixels with 256
colors, which was a significant improvement for PC-compatible systems
at the time. It was also fully backward-compatible with the earlier
Enhanced Graphics Adapter and Color Graphics Adapter standards from IBM.
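A bit of memory arithmetic, sketched below, shows why those two modes were laid out the way they were: the 256-color mode stores one byte per pixel and just fits in the PC's 64KB video memory window, while the 640-by-480 16-color mode needs 4 bits per pixel split across four bit planes:

    # Framebuffer sizes for the two headline VGA modes.
    bytes_640x480x16  = 640 * 480 * 4 // 8   # 4 bits per pixel
    bytes_320x200x256 = 320 * 200 * 1        # 8 bits per pixel

    print(bytes_640x480x16)    # 153600 bytes, split across four 38400-byte bit planes
    print(bytes_320x200x256)   # 64000 bytes -- fits in the 64KB window at segment 0xA000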
In addition, the PS/2 line introduced what we now colloquially call a "VGA connector"--a 15-pin D-sub socket that also became an industry standard.
The low-end Model 30 shipped with an integrated MCGA
graphics adapter that could display a resolution of 320 by 200 pixels
with 256 colors as well, but could display only 640 by 480 pixels in
monochrome and was not backward-compatible with EGA. MCGA met its end
after IBM included it in only a few low-end versions of the PS/2;
cloners never favored it.
Micro Channel Architecture
The crowning glory of the PS/2 line's hardware improvements was supposed to be its new expansion bus, dubbed Micro Channel Architecture. Every initial PS/2 model except the low-end Model 30 shipped with internal MCA slots for use with expansion cards.
The Model 30 included three ISA expansion slots--the type used in the original IBM PC and extended for the PC AT line.
Not surprisingly, the rest of the PC-compatible industry utilized the
ISA expansion bus as well, so any PC-compatible machine could use almost
all the cards created for other PC compatibles. MCA NIC IBM 83X9648 16-bit-card. Image: Courtesy of Appaloosa, Wikimedia CommonsWith
the PS/2, IBM saw the opportunity to create an entirely new and
improved expansion bus whose design it would strictly control and
license, thus limiting the industry's ability to clone the PS/2 machines
without paying a toll to IBM.
ISA had become slow and limiting by mid-1980s standards. MCA improved
on it by widening the data path from ISA's maximum of 16 bits to either 16 or
32 bits (which allowed more data to move over the bus at a time) and by
raising the bus speed from 8MHz to 10MHz.
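As a rough comparison (my own simplified arithmetic; real throughput on both buses was lower because transfers took multiple clock cycles and wait states), the theoretical ceiling scales with bus width times clock speed:

    # Theoretical peak transfer rates, ignoring cycle overhead and wait states.
    def peak_mb_per_sec(bus_width_bits, clock_mhz):
        # One transfer per clock: bytes per transfer times transfers per microsecond.
        return (bus_width_bits / 8) * clock_mhz

    print(peak_mb_per_sec(16, 8))    # ISA (AT bus): 16.0 MB/s ceiling
    print(peak_mb_per_sec(16, 10))   # 16-bit MCA:   20.0 MB/s ceiling
    print(peak_mb_per_sec(32, 10))   # 32-bit MCA:   40.0 MB/s ceiling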
MCA also introduced a limited form of plug-and-play functionality,
wherein each expansion card carried with it a unique 16-bit ID number
that a PS/2 machine could read to help it automatically configure the card.
In theory, that method sounded much easier than the jumper-setting
necessary on earlier ISA cards; but in practice, it turned out to be a bit
unwieldy. Older IBM Reference Disks (the utilities that set the system's
basic CMOS settings) would not know the IDs for newer cards, which
required IBM to release frequent Reference Disk updates. So unless you
always had the latest version (which was impossible in the
pre-Internet-update era), you probably needed a specially designed disk
to use your new MCA expansion card.
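Conceptually, that auto-configuration step was a table lookup: read the card's ID, then search the Reference Disk for a matching description. The sketch below is purely illustrative--the IDs and table entries are invented, and real machines matched the ID against adapter description files supplied on the Reference Disk--but it shows why an out-of-date disk left a new card unconfigured:

    # Illustrative sketch of MCA-style auto-configuration (hypothetical IDs).
    known_adapters = {
        0x6042: "example SCSI controller",
        0x8EF0: "example token-ring adapter",
    }

    def configure(card_id):
        if card_id in known_adapters:
            print("Configuring %s (ID 0x%04X)" % (known_adapters[card_id], card_id))
        else:
            # The failure mode described above: the Reference Disk predates the card.
            print("Unknown adapter ID 0x%04X -- update the Reference Disk" % card_id)

    configure(0x6042)   # recognized card
    configure(0x7FFF)   # a newer card the old disk has never heard of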
The PC clone industry did not take kindly to the power play
represented by IBM's new MCA bus. Just one year after its introduction, a
consortium of nine PC clone manufacturers introduced its own rival standard,
EISA (Extended Industry Standard Architecture), which extended the earlier ISA
bus to 32 bits with minimal licensing cost. Ultimately, few desktop PCs
utilized EISA. The de facto standard remained the 16-bit ISA slot until
Intel's introduction of PCI, yet another new bus standard, in the early 1990s.
MCA did not help the PS/2's fortunes, but another major factor worked to sink the PS/2 as a successful platform.
[Image: The high-end PS/2 Model 80]
As
previously mentioned, IBM planned to release the PS/2 with a completely
new, proprietary operating system called OS/2, which would take
advantage of new features of the 386 CPU in the high-end Model 80,
utilize the built-in mouse port, and also provide a graphical windowing
environment comparable to that of the Apple Macintosh.
There was only one problem: IBM hired Microsoft, creator of PC-DOS (and MS-DOS and Windows), to make it.
At the time, Microsoft was enjoying a boom in business from all the
MS-DOS licenses it was selling to PC clone vendors, and a proprietary PC
OS was most definitely not in its best interest.
So, when IBM announced that the full version of OS/2 would be delayed
until late 1988 (with a simple DOS-like preview version coming in late
1987), more than a few conspiracy theories flew around the industry.
Meanwhile, Microsoft was prepping a launch of Windows 2.0, which
would have most of the features of OS/2, in late 1987--over a year
before IBM would launch OS/2. The situation was a painful lesson in
letting your competitor create products for you. Amazingly, IBM did not
recognize (and act against) that potential conflict of interest.
The End of IBM's PC Dominance
After launch, the IBM PS/2 line sold well for a short time (about 1.5
million units sold by January 1988), but its comparatively high cost
versus PC-compatible brands steered most consumer-level users away from the line.
Even worse for IBM, just about every advance it made in the PS/2
ended up being matched (or cloned) and then surpassed by the clone
vendors. Sales of the PS/2 slipped dramatically through the rest of the
1980s, and the PS/2 line became an embarrassing public disaster for IBM.
By 1990, it was abundantly clear that IBM no longer guided the
PC-compatible market. And in 1994, Compaq replaced IBM as the number one
PC vendor in the United States.
IBM stuck with the PC market until 2004, when it sold its PC division to Lenovo.
By that time IBM had scored a few more consumer PC innovations with
graphics standards and portable computers (especially with the ThinkPad
line), but none of its machines after the PS/2 would have the same
impact as those it released in the early and mid-1980s.