Wednesday, November 28, 2012

Helicobacter pylori "Sequential Therapy" (National Taiwan University Hospital)

NTU Hospital Study Set to Rewrite the Treatment Guidelines for Helicobacter pylori

The National Taiwan University Helicobacter pylori research team
A large-scale, multicenter Helicobacter pylori trial led by a National Taiwan University Hospital medical team has confirmed that "sequential therapy" eradicates the bacterium more effectively than conventional "triple therapy." The results show that both the 14-day and the 10-day sequential regimens outperform the current 14-day triple therapy in eradication rates. The findings were published on November 16, 2012 in the leading international medical journal The Lancet.
(2012-11-28 18:04:31, Lin Yi-chen)

Monday, November 19, 2012

"Revolution: The First 2000 Years of Computing" / The IBM PS/2: 25 Years of PC History

 

Exhibition Review

The Ancestors of the Computer, Gathered in One Place

Heidi Schumann for The New York Times
The Computer History Museum in Mountain View, Calif., presents "Revolution: The First 2000 Years of Computing."

MOUNTAIN VIEW, Calif. — Not everyone's pulse quickens at the sight of the actuator and disk stack from a 1956 IBM Ramac. Likewise, to many people the 1959 Telefunken RAT 700/2 analog computer looks like a cross between an antique telephone switchboard and the instrument panel of a German U-boat. Those of us of a certain age and a certain technical bent may still smile at the memory of the exquisitely clever slide rule. But can the glowing 5-inch screen of a 1981 Osborne computer, the size of a briefcase yet almost too heavy to carry, stir even a flicker of nostalgia?

These and more than 1,000 other venerable objects are now on display at the Computer History Museum, which bills itself as the world's leading institution devoted to preserving and presenting the computer revolution and its impact. Because it tells that kind of history, and in this particular place, it manages to draw in visitors with no prior connection to the subject, however specialized or arcane the exhibits may be.

The museum occupies an enormous 120,000-square-foot space, formerly the headquarters of Silicon Graphics (SGI), beside the freeway that runs through Silicon Valley and not far from the Google campus. It is, in part, a monument to the local industry, much as a coal-mining museum might be in Newcastle, England. Its donors include corporations as well as individuals, and to some extent they are themselves part of what is on display. The objects reflect the history not only of technology but of business. The museum clearly hopes to attract a broad audience, yet every so often an insider's perspective shows through, more interested in the trees than in the forest. The trees, however, come in rich variety, and the exhibits are so abundant that after a while visitors may start to feel like insiders themselves.

The institution first opened to the public in 1984, though in Boston, in very different surroundings, sharing a space with a children's museum. As its collection grew, the Computer Museum moved to Silicon Valley in 1999 and acquired its current building in 2002. But it only truly came into its own in 2011, when, after a $19 million renovation, it inaugurated 25,000 square feet of new space with a permanent exhibition called "Revolution: The First 2000 Years of Computing."
To its existing collection of 75,000 items the museum has added further donations of documents and artifacts, and its ambitions have grown; it has supplemented the core exhibition with small rotating shows, tributes to industry pioneers, and educational programs.

As for the insider's perspective, it turns out to be an advantage. Who but an insider could grasp the epic dimensions of the subject, the tangled origins of its developments and the interweaving of its successes and failures? The history of computing records how great ideas were rapidly turned into first artifacts, and to the insider's eye the speed of that transformation is breathtaking.

We see the milestones of the mid-20th century: Univac and Eniac, the idea of programming, the development of digital storage. We also see carefully preserved fragments of that history: vacuum-tube circuit boards, tangles of twisted cable, the logos of vanished companies, and row upon row of cabinets.
The museum is not flawless, but the history it tells is enthralling, especially when we consider that today's advances have become so tightly woven into daily life as to be almost imperceptible. The older technology was never like that.

The opening galleries are arranged to make us aware of how important calculating machines have been throughout recorded history. The abacus, "possibly the oldest continuously used calculating tool aside from the fingers," gets considerable attention. There is a tutorial on how to use one, and we learn that in Japan in 1946 an American Army soldier skilled with the new electric calculator competed against a Japanese postal clerk who was a virtuoso of the abacus. The abacus won four rounds out of five.

The oldest computer in the exhibition is a picture of the "stepped reckoner," the four-function calculator invented by Leibniz in the late 17th century, which influenced calculator design for nearly 300 years. The lobby holds the ultimate mechanical calculator: a working "Difference Engine" designed by Charles Babbage 150 years ago. Weighing 5 tons, 11 feet long and built from 8,000 parts, it can evaluate complex numerical expressions and print the results. Babbage himself never built the machine, leaving only his drawings; the construction project, carried out by the Science Museum in London, took nearly 20 years and was not completed until 2000 (the machine shown here is a replica commissioned in 2008).
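For readers curious how a pile of gears can evaluate "complex numerical expressions," the Difference Engine relies on the method of finite differences, which reduces tabulating a polynomial to repeated addition. Here is a minimal sketch of that idea in Python — an illustration of the mathematics, not a model of Babbage's mechanism; the function names are mine:

```python
# Illustrative sketch of the method of finite differences that the Difference
# Engine mechanizes (not a model of Babbage's hardware): once the initial
# differences of a polynomial are set up, every further table value needs
# only addition.

def difference_table(coeffs, start, count):
    """Tabulate the polynomial with coefficients [a0, a1, ..., an]
    at x = start, start+1, ..., using additions only after the setup step."""
    degree = len(coeffs) - 1

    def evaluate(x):
        return sum(a * x**i for i, a in enumerate(coeffs))

    # Seed the difference columns from the first degree+1 values.
    col = [evaluate(start + i) for i in range(degree + 1)]
    diffs = []
    for _ in range(degree + 1):
        diffs.append(col[0])
        col = [b - a for a, b in zip(col, col[1:])]

    # "Turn the crank": each new table entry is produced by additions alone.
    results = []
    for _ in range(count):
        results.append(diffs[0])
        for i in range(1, degree + 1):
            diffs[i - 1] += diffs[i]
    return results

# Example: tabulate x^2 + x + 41, a classic demonstration polynomial.
print(difference_table([41, 1, 1], 0, 8))  # [41, 43, 47, 53, 61, 71, 83, 97]
```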

Next, the journey through history takes us a step beyond the old mechanical calculators to a technical innovation that proved decisive for computing over the following decades, yet used the most ordinary of materials: pieces of paper punched with holes. This was the invention of Herman Hollerith, who won a competition for the chance to tabulate the 1890 census data for the United States Census Bureau. He punched holes in more than 60 million cards to encode each person's characteristics, and then sorted the data by the positions of the holes.
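The scheme is easy to picture in modern terms: each card is simply a set of punched positions, and tabulating means counting the cards that have a hole in a given spot. A hypothetical sketch — the attribute positions below are invented, not Hollerith's actual card layout:

```python
# Illustrative only: each census record becomes a "card" whose punched
# positions encode categorical attributes; tallying is counting which cards
# have a hole at a given position, as a tabulating machine would.

# Hypothetical positions for a few attributes on a simplified card.
POSITIONS = {
    ("sex", "male"): 0, ("sex", "female"): 1,
    ("married", "yes"): 2, ("married", "no"): 3,
}

def punch(record):
    """Turn a census record into the set of punched hole positions."""
    return frozenset(POSITIONS[(field, value)] for field, value in record.items())

def tally(cards, position):
    """Count the cards with a hole at the given position."""
    return sum(1 for card in cards if position in card)

records = [
    {"sex": "female", "married": "yes"},
    {"sex": "male", "married": "no"},
    {"sex": "female", "married": "no"},
]
cards = [punch(r) for r in records]
print(tally(cards, POSITIONS[("sex", "female")]))  # 2
```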

It sounds low-tech, and the idea that first inspired Hollerith was lower-tech still: the Jacquard looms of the early 19th century used punched cards to weave patterns into fabric. It is often these old, simple, yet astonishingly useful ideas that end up driving technology forward.

As the chronology advances, that practical, elemental feeling gradually fades from the exhibition. Partly this is a problem of the material. But at times the significance of an innovation needs more explanation. The difference between analog and digital computers is explained too casually. And compared with breakthroughs in hardware, advances in programming are underrepresented, though admittedly they are harder to present.

By any measure, the most effective exhibits are those tied directly to historical events. The Second World War tapped the potential of technology perhaps more than any war in history; the twin demands of code breaking and weapons ballistics calculation, in particular, drove computing forward enormously. That research transformed the face of postwar computing.

The museum's historical inquiry also covers the patent battles over who invented the electronic computer, the evolution of data storage, the development of the microprocessor, and increasingly sophisticated computer graphics.

We see components of the hand-built supercomputers designed by Seymour Cray (the Cray-1 was the fastest computer in the world from 1976 to 1982). We see computerized consumer products, including a 1969 Honeywell minicomputer that Neiman Marcus marketed as the "Kitchen Computer" for $10,600, on the theory that it would help wealthy customers manage recipes that even their cooks could not handle. (The catalog copy read: "If only she could cook as well as Honeywell can compute.") The development of video games, mobile computing, networking and the Internet is presented as well.
As the material approaches our own time, the exhibition's tone becomes less sure of itself. Where is the emphasis, and why? Some of the later galleries already feel dated, saying too much about some things (the dot-com explosion, for instance) and not enough about others (the tablets and smartphones displacing the PC). So much has changed in the past 15 years that the Justice Department's antitrust suit against Microsoft now feels as strange as the earlier one against IBM (which was dropped in 1982 for lack of evidence).

The museum might find inspiration in widening its lens: exploring further the interplay between technological development and science fiction, the popular image of the computer, the utopian fantasies that have grown up around the Internet, or the mutations in political culture.

To some extent, though, an exhibition like this is bound to divide its viewers — some dissatisfied, some enraptured — and different people will see it very differently, and differently again over time. There is so much here that you find yourself following one thread, hoping it will take you further, or another, suspecting it may be a dead end. The museum's framework also leaves plenty of room for innovation and change. In the end, that is the nature of the field it explores: only the past is truly static. When it upgrades to version 2.0, I will be back.

The Computer History Museum is at 1401 North Shoreline Boulevard, Mountain View, Calif.; (650) 810-1010; computerhistory.org.
This article was originally published on September 29, 2012.
Chinese translation by Jing Lei.

 

 

The IBM PS/2: 25 Years of PC History

Here's a fond look back at the Personal System/2 series of PCs, which embarrassed IBM in the late 1980s but shaped the modern PC you know today.

IBM PS/2 Model 30 ad
Twenty-five years ago, IBM announced the Personal System/2 (PS/2), a new line of IBM PC-compatible machines that capped an era of profound influence on the personal computer market.
By the time of the PS/2's launch in 1987, IBM PC clones--unauthorized work-alike machines that could utilize IBM PC hardware and software--had eaten away a sizable portion of IBM's own PC platform. Compare the numbers: In 1983, IBM controlled roughly 76 percent of the PC-compatible market, but in 1986 its share slipped to 26 percent.
IBM devised a plan to regain control of the PC-compatible market by introducing a new series of machines--the PS/2 line--with a proprietary expansion bus, operating system, and BIOS that would require clone makers to pay a hefty license if they wanted to play IBM's game. Unfortunately for IBM, PC clone manufacturers had already been playing their own game.
In the end, IBM failed to reclaim a market that was quickly slipping out of its grasp. But the PS/2 series left a lasting impression of technical influence on the PC industry that continues to this day.

Attack of the Clones

When IBM created the PC in 1981, it used a large number of easily obtainable, off-the-shelf components to construct the machine. Just about any company could have put them together into a computer system, but IBM added a couple of features that would give the machine a flavor unique to IBM. The first was its BIOS, the basic underlying code that governed use of the machine. The second was its disk operating system, which had been supplied by Microsoft.
When Microsoft signed the deal to supply PC-DOS to IBM, it included a clause that allowed Microsoft to sell that same OS to other computer vendors--which Microsoft did (labeling it "MS-DOS") almost as soon as the PC launched.
Ad from the April 1987 launch, featuring the former cast of the 'MASH' TV show.
That wasn't a serious problem at first, because those non-IBM machines, although they ran MS-DOS, could not legally utilize the full suite of available IBM PC software and hardware add-ons.
As the IBM PC grew in sales and influence, other computer manufacturers started to look into making PC-compatible machines. Before doing so, they had to reverse-engineer IBM's proprietary BIOS code using a clean-room technique to spare themselves from infringing upon IBM's copyright and trademarks.

First PC Clone: MPC 1600

In June 1982, Columbia Data Products did just that, and it introduced the first PC clone, the MPC 1600. Dynalogic and Compaq followed with PC work-alikes of their own in 1983, and soon, companies such as Phoenix Technologies developed IBM PC-compatible BIOS products that they freely licensed to any company that came calling. The floodgates had opened, and the PC-compatible market was no longer IBM's to own.
At least in the early years, that market did not exist without IBM's influence. IBM's PC XT (1983) and PC AT (1984) both brought with them considerable innovations in PC design that cloners quickly copied.
Compaq DeskPro 386 ad. Image: Courtesy of ToplessRobot.com
But that lead would not last forever. A profound shift in market leadership occurred when Compaq released its DeskPro 386, a powerful 1986 PC compatible that beat IBM to market in using Intel's 80386 CPU. It was an embarrassing blow to IBM, and Big Blue knew that it had to do something drastic to solidify its power.
[Related: The Computer Hardware Hall of Fame]
That something was the PS/2. The line launched in April 1987 with a high-powered ad campaign featuring the former cast of the hit MASH TV show, and a new slogan: "PS/2 It!"
Critics, who had seen more-powerful computers at lower prices, weren't particularly impressed, and everyone immediately knew that IBM planned to use the PS/2 to pull the rug out from beneath the PC-compatible industry. But the new PS/2 did have some tricks up its sleeve that would keep cloners busy for another couple of years in an attempt to catch up.

Four Initial Models

IBM announced four PS/2 models during its April 1987 launch: the Model 30, 50, 60, and 80. They ranged dramatically in power and price; on the low end, the Model 30 (roughly equivalent to a PC XT) contained an 8MHz 8086 CPU, 640KB of RAM, and a 20MB hard drive, and retailed for $2295 (about $4642 in 2012 dollars when adjusted for inflation).
The most powerful configuration of the Model 80 came equipped with a 20MHz 386 CPU, 2MB of RAM, and a 115MB hard drive for a total cost of $10,995 (about $22,243 today). Neither configuration included an OS--you had to buy PC-DOS 3.3 for an extra $120 ($242 today).
The following chart from IBM offers a more detailed view of the systems available during the 1987 launch, and illustrates just how complex the variety could be.
IBM chart explaining the four PS/2 models announced in April 1987.
Every unit in the line included at least one feature new to IBM's PC offerings--and the market in general. In the following sections, I'll discuss those new features and how they affected the PC industry.

Integrated I/O Functionality, New Memory Standard

From the IBM PC in 1981 through the PC AT in 1984, IBM preferred to keep a minimum of features in the base unit. Instead, it allowed users to extend their systems with expansion cards that plugged into the internal slots. This meant that a 1981 PC, which shipped with five slots, left little room for expansion when it already contained a graphics card, a disk controller, a serial card, and a printer card--a common configuration at the time.
With the PS/2, IBM chose to integrate many of those commonly used I/O boards into the motherboard itself. Each model in the PS/2 line included a built-in serial port, parallel port, mouse port, video adapter, and floppy controller, which freed up internal slots for other uses.
Computers in the PS/2 series also had a few other built-in advancements, such as the 16550 UART, a chip that allowed faster serial communications (useful when using a modem), as well as 72-pin RAM SIMM (single in-line memory module) sockets. Both items became standard across the industry over time.

PS/2 Keyboard and Mouse Ports

An ad describing the IBM Personal System/2.
The built-in mouse port I mentioned earlier is worth noting in more detail. Each machine in the PS/2 line included a redesigned keyboard port and a new mouse port, both of which used 6-pin mini-DIN connectors.
IBM intended the mouse, as a peripheral, to play a major part in the PS/2 system. The company promised a new graphical OS (which I'll talk about later) that would compete with the Macintosh in windowing functionality.
Even today, many new PCs ship with "PS/2 connectors" for mice and keyboards, although they have been steadily falling out of fashion in favor of USB ports.

New Floppy Drives

Every model in the PS/2 line contained a 3.5-inch microfloppy drive, a Sony-developed technology that, until then, had been featured most prominently in Apple Macintosh computers.
The low-end PS/2 Model 30 shipped with a drive that could read and write 720KB double-density disks. Other models introduced something completely new: a 1440KB high-density floppy drive that would become the PC floppy drive standard for the next 20 years.
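The 1440KB figure follows directly from the format's geometry; a quick back-of-the-envelope check, assuming the standard high-density layout of 80 tracks per side, 2 sides, 18 sectors per track and 512-byte sectors:

```python
# Quick check on the "1440KB" figure for the high-density 3.5-inch format.
tracks, sides, sectors, sector_bytes = 80, 2, 18, 512
total_bytes = tracks * sides * sectors * sector_bytes
print(total_bytes)          # 1474560 bytes
print(total_bytes // 1024)  # 1440 KB; the "1.44MB" label divides this by 1000
```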
IBM's use of the 3.5-inch floppy drive was new in the PC-compatible world. Up to that point, IBM itself had favored traditional 5.25-inch disk drives. This drastic format shift initially came as a great annoyance to PC users with large libraries of software on 5.25-inch disks.
Although IBM did offer an external 5.25-inch drive option for the PS/2 line, cloners quickly followed suit with their own 3.5-inch drives, and many commercial software applications began shipping with both 5.25-inch and 3.5-inch floppies in the box.

VGA and MCGA

In many ways, the PS/2 line is most notable, historically, for its introduction of the Video Graphics Array standard.
Among its many modes, VGA could display 640-by-480-pixel resolution with 16 colors on screen, and a resolution of 320 by 200 pixels with 256 colors, which was a significant improvement for PC-compatible systems at the time. It was also fully backward-compatible with the earlier Enhanced Graphics Adapter and Color Graphics Adapter standards from IBM.
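Some rough arithmetic shows why those two modes are a natural pairing: at 4 bits per pixel for 16 colors and 8 bits per pixel for 256 colors, both fit comfortably within the 256KB of video memory a standard VGA board carried. A small illustrative calculation:

```python
# Rough framebuffer arithmetic for the two headline VGA modes (illustrative):
# 16 colors needs 4 bits per pixel, 256 colors needs 8 bits per pixel.
def framebuffer_bytes(width, height, colors):
    bits_per_pixel = (colors - 1).bit_length()   # 16 -> 4 bits, 256 -> 8 bits
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(640, 480, 16))   # 153600 bytes (~150KB)
print(framebuffer_bytes(320, 200, 256))  # 64000 bytes  (~62.5KB)
# Both fit within the 256KB of video memory on a standard VGA.
```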
In addition, the PS/2 line introduced what we now colloquially call a "VGA connector"--a 15-pin D-subminiature socket that also became an industry standard.
The low-end Model 30 shipped with an integrated MCGA graphics adapter that could display a resolution of 320 by 200 pixels with 256 colors as well, but could display only 640 by 480 pixels in monochrome and was not backward-compatible with EGA. MCGA met its end after IBM included it in only a few low-end versions of the PS/2; cloners never favored it.

Micro Channel Architecture

The crowning glory of the PS/2 line's hardware improvements was supposed to be its new expansion bus, dubbed Micro Channel Architecture. Every initial PS/2 model except the low-end Model 30 shipped with internal MCA slots for use with expansion cards.
The Model 30 included three ISA expansion slots--the type used in the original IBM PC and extended for the PC AT line. Not surprisingly, the rest of the PC-compatible industry utilized the ISA expansion bus as well, so any PC-compatible machine could use almost all the cards created for other PC compatibles.
MCA NIC IBM 83X9648 16-bit expansion card. Image: Courtesy of Appaloosa, Wikimedia Commons
With the PS/2, IBM saw the opportunity to create an entirely new and improved expansion bus whose design it would strictly control and license, thus limiting the industry's ability to clone the PS/2 machines without paying a toll to IBM.
ISA had become slow and limiting by mid-1980s standards. MCA improved on it by widening the data path from ISA's 16 bits to 16 or 32 bits (allowing more data to move across the bus at a time) and by raising the bus speed from 8MHz to 10MHz.
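Ignoring wait states, arbitration and protocol overhead, the raw numbers alone show the gap IBM was aiming for; a simple theoretical-ceiling calculation (real-world throughput on both buses was considerably lower):

```python
# Back-of-the-envelope peak bus bandwidth, ignoring wait states and protocol
# overhead (actual sustained throughput on both buses was lower):
def peak_mb_per_s(width_bits, clock_mhz):
    return width_bits / 8 * clock_mhz  # bytes per cycle x million cycles/s

print(peak_mb_per_s(16, 8))   # 16-bit ISA: 16.0 MB/s theoretical ceiling
print(peak_mb_per_s(32, 10))  # 32-bit MCA: 40.0 MB/s theoretical ceiling
```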
MCA also introduced a limited form of plug-and-play functionality, wherein each expansion card carried with it a unique 16-bit ID number that a PS/2 machine could identify to help it automatically configure the card.
In theory, that method sounded much easier than the jumper-setting necessary on earlier ISA cards; but in practice, it turned a bit unwieldy. Older IBM Reference Disks (the utilities that set the system's basic CMOS settings) would not know the IDs for newer cards, which required IBM to release frequent Reference Disk updates. So unless you always had the latest version (which was impossible in the pre-Internet-update era), you probably needed a specially designed disk to use your new MCA expansion card.
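The underlying idea was simple, even if the Reference Disk workflow wasn't: read an ID from the slot, look it up in a table of known adapters, and apply the stored settings. A hypothetical sketch of that lookup — the IDs and options below are invented, not real MCA adapter data:

```python
# Illustrative sketch of the idea behind MCA's software setup: the machine
# reads a 16-bit ID from each slot and looks up matching setup data shipped
# on the Reference Disk. IDs and option values here are made up.
KNOWN_ADAPTERS = {
    0x8EFC: {"name": "Token-Ring adapter", "options": {"irq": 2, "io_base": 0x0A20}},
    0x6042: {"name": "SCSI controller",    "options": {"irq": 14, "io_base": 0x3540}},
}

def configure_slot(slot, card_id):
    entry = KNOWN_ADAPTERS.get(card_id)
    if entry is None:
        # The failure mode described above: an older Reference Disk simply
        # doesn't know the new card and cannot configure it.
        return f"Slot {slot}: unknown adapter ID {card_id:#06x}, update Reference Disk"
    return f"Slot {slot}: {entry['name']} configured with {entry['options']}"

print(configure_slot(1, 0x8EFC))
print(configure_slot(2, 0xBEEF))
```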
The PC clone industry did not take kindly to the power play represented by IBM's new MCA bus. Just one year after its introduction, a consortium of nine PC clone manufacturers introduced its own rival standard, EISA, which extended the earlier ISA bus to 32 bits with minimal licensing cost. Ultimately, few desktop PCs utilized EISA. The standard remained 16-bit ISA slots until Intel's introduction of PCI, yet another new bus standard, in the early 1990s.

OS/2

MCA did not help the PS/2's fortunes, but another major factor worked to sink the PS/2 as a successful platform.
The high-end Model 80 PS/2
As previously mentioned, IBM planned to release the PS/2 with a completely new, proprietary operating system called OS/2, which would take advantage of new features of the 386 CPU in the high-end Model 80, utilize the built-in mouse port, and also provide a graphical windowing environment comparable to that of the Apple Macintosh.
There was only one problem: IBM hired Microsoft, creator of PC-DOS (and MS-DOS and Windows), to make it.
At the time, Microsoft was enjoying a boom in business from all the MS-DOS licenses it was selling to PC clone vendors, and a proprietary PC OS was most definitely not in its best interest.
So, when IBM announced that the full version of OS/2 would be delayed until late 1988 (with a simple DOS-like preview version coming in late 1987), more than a few conspiracy theories flew around the industry.
Meanwhile, Microsoft was prepping a launch of Windows 2.0, which would have most of the features of OS/2, in late 1987--over a year before IBM would launch OS/2. The situation was a painful lesson in letting your competitor create products for you. Amazingly, IBM did not recognize (and act against) that potential conflict of interest.

The End of IBM's PC Dominance

After launch, the IBM PS/2 line sold well for a short time (about 1.5 million units sold by January 1988), but its comparatively high cost versus PC-compatible brands steered most consumer-level users away from the systems.
Even worse for IBM, just about every advance it made in the PS/2 ended up being matched (or cloned) and then surpassed by the clone vendors. Sales of the PS/2 slipped dramatically through the rest of the 1980s, and the PS/2 line became an embarrassing public disaster for IBM.
By 1990, it was abundantly clear that IBM no longer guided the PC-compatible market. And in 1994, Compaq replaced IBM as the number one PC vendor in the United States.
IBM stuck with the PC market until 2004, when it sold its PC division to Lenovo. By that time IBM had scored a few more consumer PC innovations with graphics standards and portable computers (especially with the ThinkPad line), but none of its machines after the PS/2 would have the same impact as those it released in the early and mid-1980s.

Wednesday, November 14, 2012

A Copier That Can Erase Its Own Print

Japan to Launch a Copier That Erases Printed Text

2012/11/13
 
Toshiba Tec's new copier can instantly erase printed text so that paper can be reused
Japan's Toshiba Tec announced on November 12 that it will be the first in the world to commercialize a copier that can instantly erase printed text, with shipments in Japan beginning in February 2013. With this copier, a single sheet of paper can be reused about five times on average, substantially cutting carbon dioxide (CO2) emissions across the purchase, disposal and even production of copy paper. Toshiba Tec plans to market the machine to environmentally conscious companies and government agencies, aiming to sell 30,000 units worldwide over the next three years.
 
The company developed a new erasable dry toner for the system, which pairs a multifunction machine combining copier and fax functions with a decolorizing unit that restores paper to blank by heating it. The decolorizer also includes a scanner, so that before the text is erased the document can be read and stored electronically on a server.

In addition, the product can automatically sort out sheets that can no longer be reused, such as those with folded or damaged corners. The multifunction machine and the decolorizer together are priced at 1.41 million yen (excluding tax, roughly RMB 111,800).

Tuesday, November 13, 2012

Artificial Intelligence Takes On College Entrance Exams and Novel Writing

Artificial Intelligence Takes On College Entrance Exams and Novel Writing

[Compiled report by Lin Tsui-yi] According to the Nikkei, Japanese researchers hoping to explore the potential of artificial intelligence plan to have an AI-equipped computer sit the entrance exam for the University of Tokyo, and to have a computer write a 4,000-character novel within five years.
Artificial intelligence, also called machine intelligence, generally refers to intelligence exhibited by an artificial system through computation. Scientists began working on AI in the 1950s, hoping to create intelligent robots that could serve as a labor force, but the results never advanced far beyond the "toy" stage.
AI research stagnated in the 1970s, until 1997, when IBM's Deep Blue defeated the reigning world chess champion in a six-game match. In 2011, IBM's Watson won the million-dollar first prize on an American quiz show, and AI development again drew wide attention.
In Japan, a shogi AI system defeated a professional player in 2010, and researchers at the National Institute of Informatics have gone further, with the ambitious plan of having an AI-equipped computer take the entrance exam for the University of Tokyo, Japan's most prestigious university.
The institute is now working with Fujitsu Laboratories, a developer of AI software, to have the computer attempt university entrance exam questions. The researchers say the computer can currently answer about 50 to 60 percent of the questions; the hardest part is mathematics, because unlike a human, the computer cannot read a word problem, immediately grasp what is being asked, and then carry out the calculation. The researchers nevertheless hope the system will score highly on the national standardized exam by 2016 and pass the University of Tokyo entrance exam by 2021.
In addition, because AI has long been considered to "lack sensibility," the researchers are also attempting to have it write fiction. The initial plan is for the computer to produce a science fiction story of about 4,000 characters and to enter it in a writing contest in five years.

2012年11月12日 星期一

In politics, the era of big data has arrived.



Inside the Secret World of the Data Crunchers Who Helped Obama Win


image "The cave" at President Obama's Election headquarters in Chicago
DANIEL SHEA FOR TIME
"The cave" at President Obama's campaign headquarters in Chicago
In late spring, the backroom number crunchers who powered Barack Obama’s campaign to victory noticed that George Clooney had an almost gravitational tug on West Coast females ages 40 to 49. The women were far and away the single demographic group most likely to hand over cash, for a chance to dine in Hollywood with Clooney — and Obama.
So as they did with all the other data collected, stored and analyzed in the two-year drive for re-election, Obama’s top campaign aides decided to put this insight to use. They sought out an East Coast celebrity who had similar appeal among the same demographic, aiming to replicate the millions of dollars produced by the Clooney contest. “We were blessed with an overflowing menu of options, but we chose Sarah Jessica Parker,” explains a senior campaign adviser. And so the next Dinner with Barack contest was born: a chance to eat at Parker’s West Village brownstone.
For the general public, there was no way to know that the idea for the Parker contest had come from a data-mining discovery about some supporters: affection for contests, small dinners and celebrity. But from the beginning, campaign manager Jim Messina had promised a totally different, metric-driven kind of campaign in which politics was the goal but political instincts might not be the means. “We are going to measure every single thing in this campaign,” he said after taking the job. He hired an analytics department five times as large as that of the 2008 operation, with an official “chief scientist” for the Chicago headquarters named Rayid Ghani, who in a previous life crunched huge data sets to, among other things, maximize the efficiency of supermarket sales promotions.
Exactly what that team of dozens of data crunchers was doing, however, was a closely held secret. “They are our nuclear codes,” campaign spokesman Ben LaBolt would say when asked about the efforts. Around the office, data-mining experiments were given mysterious code names such as Narwhal and Dreamcatcher. The team even worked at a remove from the rest of the campaign staff, setting up shop in a windowless room at the north end of the vast headquarters office. The “scientists” created regular briefings on their work for the President and top aides in the White House’s Roosevelt Room, but public details were in short supply as the campaign guarded what it believed to be its biggest institutional advantage over Mitt Romney’s campaign: its data.
On Nov. 4, a group of senior campaign advisers agreed to describe their cutting-edge efforts with TIME on the condition that they not be named and that the information not be published until after the winner was declared. What they revealed as they pulled back the curtain was a massive data effort that helped Obama raise $1 billion, remade the process of targeting TV ads and created detailed models of swing-state voters that could be used to increase the effectiveness of everything from phone calls and door knocks to direct mailings and social media.
How to Raise $1 Billion
For all the praise Obama’s team won in 2008 for its high-tech wizardry, its success masked a huge weakness: too many databases. Back then, volunteers making phone calls through the Obama website were working off lists that differed from the lists used by callers in the campaign office. Get-out-the-vote lists were never reconciled with fundraising lists. It was like the FBI and the CIA before 9/11: the two camps never shared data. “We analyzed very early that the problem in Democratic politics was you had databases all over the place,” said one of the officials. “None of them talked to each other.” So over the first 18 months, the campaign started over, creating a single massive system that could merge the information collected from pollsters, fundraisers, field workers and consumer databases as well as social-media and mobile contacts with the main Democratic voter files in the swing states.
The new megafile didn’t just tell the campaign how to find voters and get their attention; it also allowed the number crunchers to run tests predicting which types of people would be persuaded by certain kinds of appeals. Call lists in field offices, for instance, didn’t just list names and numbers; they also ranked names in order of their persuadability, with the campaign’s most important priorities first. About 75% of the determining factors were basics like age, sex, race, neighborhood and voting record. Consumer data about voters helped round out the picture. “We could [predict] people who were going to give online. We could model people who were going to give through mail. We could model volunteers,” said one of the senior advisers about the predictive profiles built by the data. “In the end, modeling became something way bigger for us in ’12 than in ’08 because it made our time more efficient.”
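In modern terms, this is predictive scoring: assign each voter a number from a handful of weighted attributes and sort the call list by it. A toy sketch of the idea, with invented fields and weights rather than anything from the campaign's actual models:

```python
# Toy persuadability scoring -- not the campaign's actual model. Weights and
# features are invented for illustration; real models would be fit to data.
WEIGHTS = {"age_under_35": 0.8, "sparse_vote_history": 1.2,
           "swing_precinct": 1.5, "magazine_subscriber": 0.3}

def persuadability(voter):
    """Sum the weights of the features this voter exhibits."""
    return sum(w for feature, w in WEIGHTS.items() if voter.get(feature))

voters = [
    {"name": "A. Smith", "age_under_35": True, "swing_precinct": True},
    {"name": "B. Jones", "sparse_vote_history": True},
    {"name": "C. Davis", "age_under_35": True, "sparse_vote_history": True,
     "swing_precinct": True, "magazine_subscriber": True},
]
call_list = sorted(voters, key=persuadability, reverse=True)
print([v["name"] for v in call_list])  # ['C. Davis', 'A. Smith', 'B. Jones']
```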
Early on, for example, the campaign discovered that people who had unsubscribed from the 2008 campaign e-mail lists were top targets, among the easiest to pull back into the fold with some personal attention. The strategists fashioned tests for specific demographic groups, trying out message scripts that they could then apply. They tested how much better a call from a local volunteer would do than a call from a volunteer from a non–swing state like California. As Messina had promised, assumptions were rarely left in place without numbers to back them up.

The new megafile also allowed the campaign to raise more money than it once thought possible. Until August, everyone in the Obama orbit had protested loudly that the campaign would not be able to reach the mythical $1 billion fundraising goal. “We had big fights because we wouldn’t even accept a goal in the 900s,” said one of the senior officials who was intimately involved in the process. “And then the Internet exploded over the summer,” said another.
A large portion of the cash raised online came through an intricate, metric-driven e-mail campaign in which dozens of fundraising appeals went out each day. Here again, data collection and analysis were paramount. Many of the e-mails sent to supporters were just tests, with different subject lines, senders and messages. Inside the campaign, there were office pools on which combination would raise the most money, and often the pools got it wrong. Michelle Obama’s e-mails performed best in the spring, and at times, campaign boss Messina performed better than Vice President Joe Biden. In many cases, the top performers raised 10 times as much money for the campaign as the underperformers.
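Mechanically, this kind of e-mail testing comes down to comparing dollars raised per recipient across variants before committing the full list. A stripped-down sketch with invented numbers:

```python
# Stripped-down e-mail A/B test comparison; sample sizes and dollar figures
# below are invented for illustration, not campaign data.
variants = {
    "Subject A / sender: Michelle": {"sent": 10_000, "raised": 14_200.0},
    "Subject B / sender: Messina":  {"sent": 10_000, "raised": 9_800.0},
    "Subject C / sender: Biden":    {"sent": 10_000, "raised": 1_400.0},
}

def dollars_per_recipient(stats):
    return stats["raised"] / stats["sent"]

winner = max(variants, key=lambda v: dollars_per_recipient(variants[v]))
for name, stats in variants.items():
    print(f"{name}: ${dollars_per_recipient(stats):.2f} per recipient")
print("Send to full list:", winner)
```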
Chicago discovered that people who signed up for the campaign’s Quick Donate program, which allowed repeat giving online or via text message without having to re-enter credit-card information, gave about four times as much as other donors. So the program was expanded and incentivized. By the end of October, Quick Donate had become a big part of the campaign’s messaging to supporters, and first-time donors were offered a free bumper sticker to sign up.
Predicting Turnout
The magic tricks that opened wallets were then repurposed to turn out votes. The analytics team used four streams of polling data to build a detailed picture of voters in key states. In the past month, said one official, the analytics team had polling data from about 29,000 people in Ohio alone — a whopping sample that composed nearly half of 1% of all voters there — allowing for deep dives into exactly where each demographic and regional group was trending at any given moment. This was a huge advantage: when polls started to slip after the first debate, they could check to see which voters were changing sides and which were not.
It was this database that helped steady campaign aides in October’s choppy waters, assuring them that most of the Ohioans in motion were not Obama backers but likely Romney supporters whom Romney had lost because of his September blunders. “We were much calmer than others,” said one of the officials. The polling and voter-contact data were processed and reprocessed nightly to account for every imaginable scenario. “We ran the election 66,000 times every night,” said a senior official, describing the computer simulations the campaign ran to figure out Obama’s odds of winning each swing state. “And every morning we got the spit-out — here are your chances of winning these states. And that is how we allocated resources.”
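Running "the election 66,000 times" describes a Monte Carlo simulation: draw each swing state's outcome from an estimated win probability, tally electoral votes, and repeat. A minimal sketch of the technique, with invented probabilities rather than the campaign's models:

```python
# Minimal Monte Carlo election simulation. The state win probabilities and
# the "safe" electoral-vote count are invented assumptions for illustration.
import random

SWING_STATES = {"OH": (18, 0.60), "FL": (29, 0.50), "VA": (13, 0.55),
                "CO": (9, 0.57), "IA": (6, 0.62), "NV": (6, 0.64)}
SAFE_ELECTORAL_VOTES = 237   # assumed votes already locked up
TARGET = 270

def simulate_once():
    """One simulated election night: flip each swing state independently."""
    votes = SAFE_ELECTORAL_VOTES
    for electoral_votes, win_prob in SWING_STATES.values():
        if random.random() < win_prob:
            votes += electoral_votes
    return votes >= TARGET

def win_probability(runs=66_000):
    wins = sum(simulate_once() for _ in range(runs))
    return wins / runs

print(f"Estimated chance of reaching {TARGET}: {win_probability():.1%}")
```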
Online, the get-out-the-vote effort continued with a first-ever attempt at using Facebook on a mass scale to replicate the door-knocking efforts of field organizers. In the final weeks of the campaign, people who had downloaded an app were sent messages with pictures of their friends in swing states. They were told to click a button to automatically urge those targeted voters to take certain actions, such as registering to vote, voting early or getting to the polls. The campaign found that roughly 1 in 5 people contacted by a Facebook pal acted on the request, in large part because the message came from someone they knew.
Data helped drive the campaign’s ad buying too. Rather than rely on outside media consultants to decide where ads should run, Messina based his purchases on the massive internal data sets. “We were able to put our target voters through some really complicated modeling, to say, O.K., if Miami-Dade women under 35 are the targets, [here is] how to reach them,” said one official. As a result, the campaign bought ads to air during unconventional programming, like Sons of Anarchy, The Walking Dead and Don’t Trust the B—- in Apt. 23, skirting the traditional route of buying ads next to local news programming. How much more efficient was the Obama campaign of 2012 than 2008 at ad buying? Chicago has a number for that: “On TV we were able to buy 14% more efficiently … to make sure we were talking to our persuadable voters,” the same official said.
The numbers also led the campaign to escort their man down roads not usually taken in the late stages of a presidential campaign. In August, Obama decided to answer questions on the social news website Reddit, which many of the President’s senior aides did not know about. “Why did we put Barack Obama on Reddit?” an official asked rhetorically. “Because a whole bunch of our turnout targets were on Reddit.”
That data-driven decisionmaking played a huge role in creating a second term for the 44th President and will be one of the more closely studied elements of the 2012 cycle. It’s another sign that the role of the campaign pros in Washington who make decisions on hunches and experience is rapidly dwindling, being replaced by the work of quants and computer coders who can crack massive data sets for insight. As one official put it, the time of “guys sitting in a back room smoking cigars, saying ‘We always buy 60 Minutes’” is over. In politics, the era of big data has arrived.


Saturday, November 10, 2012

MIT Technology Review

http://www.technologyreview.com/






Friday, November 9, 2012

Public Hearing on the Release of 4G Licenses



Do all-you-can-eat data users cause network congestion? Consumers' Foundation's Lin Tsung-nan: a mistaken notion


By reporter Hung Sheng-yi, Taipei
In response to the recently heated debate over abolishing all-you-can-eat mobile data plans and the supporting measures for the 4G license release, legislators Lin Chia-lung, Wei Ming-ku and Liu Chao-hao convened a Public Hearing on the Release of 4G Licenses at the Legislative Yuan. At the hearing, Lin Tsung-nan, deputy secretary-general of the Consumers' Foundation, presented evidence that heavy users on unlimited plans are not the culprits behind network congestion, and stressed the importance of strengthening basic infrastructure.
The hearing, convened jointly on the 8th by legislators Lin Chia-lung, Liu Chao-hao and Wei Ming-ku, brought together telecom operators, academics, Minister without Portfolio Chang San-cheng and NCC vice chairman Yu Hsiao-cheng to discuss supporting measures for mobile Internet service, the policy for releasing 4G licenses, and the future of WiMAX, in the hope of reaching a working consensus among the government, the industry and technical bodies.
Lin Tsung-nan said that from the standpoint of technical evolution, 4G LTE is indeed more efficient than 3G; it can genuinely relieve the pressure on bandwidth and meets the public's demand for Internet access, which is why countries around the world are investing in LTE build-outs.
Lin, however, presented data to rebut the claim that heavy users on 3G all-you-can-eat plans are the culprits behind network congestion, calling it a mistaken argument that could mislead the government's policy direction.
Using the example of users near a single base station staying online continuously for two hours, Lin pointed out that heavy users on unlimited plans did consume 83 percent of the bandwidth while ordinary users accounted for the other 17 percent, yet the base station's actual peak-hour speed was 7.3 times worse than its off-peak speed. In short, the factor that matters most for speed is how many people are using the cell.

▲Lin Tsung-nan presented real-world data to rebut the claim that heavy users on 3G all-you-can-eat plans are the culprits behind network congestion, and argued that Taiwan's 4G policy is lagging behind rather than moving too fast, reminding the government that it must also expand network infrastructure and guard against spectrum squatters looking to profit from the release. (Photo: Hung Sheng-yi)
Lin explained that although a base station's data rate has an upper bound, the rate each user actually gets depends on how many people are sharing it. Taking an iPhone 5, which supports download speeds of up to 14Mbps, as an example: with few users on the cell, the base station can deliver close to that full rate, but as more users join, the capacity is divided so that everyone gets the same, smaller share, and speeds change accordingly.
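The arithmetic behind that point is straightforward: a cell's capacity is shared among whoever is active on it, so per-user speed falls as the user count rises, regardless of anyone's billing plan. A toy illustration (real cellular scheduling is far more involved):

```python
# Toy illustration of shared cell capacity; real cellular schedulers weigh
# signal quality, fairness and traffic class, not just a simple equal split.
CELL_CAPACITY_MBPS = 14.0  # the peak rate cited above

def per_user_rate(active_users):
    return CELL_CAPACITY_MBPS / max(active_users, 1)

for users in (1, 2, 7, 20):
    print(f"{users:>2} active users -> {per_user_rate(users):.1f} Mbps each")
```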
In addition, Lin cited the user survey he released through the Consumers' Foundation in June, in which the average connection speed in Taitung County was only half that of Taipei City, roughly 0.8Mbps versus 1.6Mbps. If the logic that "all-you-can-eat users are the culprits of congestion" held, he asked, would that mean Taitung County has more unlimited-plan subscribers than Taipei City?
The same logic applies to the speed-test report NCC released in early October, which stated that "the nation's average connection speed rose from 1.7Mbps to 2.1Mbps," an improvement of about 23 percent. If heavy users cause congestion, did those heavy users somehow shrink by 23 percent between July and October?
Lin argued that the speed increase of course comes from added network infrastructure, and that this is where the government and the operators must direct their joint efforts.
On the same issue, Chang San-cheng, the minister without portfolio who had suggested ending all-you-can-eat plans, said cautiously that "the government has no authority to demand that operators abolish unlimited data plans"; rather, the government's responsibility is to press operators to build out infrastructure and deliver good connection quality, and, on the other hand, to make the license release as effective as possible (referring to the allocation plan), so that everyone can keep using unlimited plans.

▲Minister without Portfolio Chang San-cheng said cautiously that "the government has no authority to demand that operators abolish unlimited data plans," while still stressing that unlimited plans are hard to sustain and that the government's responsibility is to press operators to build out infrastructure and deliver good connection quality.
Chang nonetheless emphasized that all-you-can-eat plans are not easy to sustain and urged the public to share the burden and use resources sparingly. As for the earlier claim that 14 percent of users consume 67 percent of the bandwidth, he said it was not meant to suggest that those 14 percent were taking advantage of the other 86 percent, or to deliberately pit groups against one another and stir online debate; it was based on overseas trends and intended to give operators data with which to carefully evaluate ending unlimited plans.
On unlimited mobile data plans, NCC vice chairman Yu Hsiao-cheng said: "If operators offer all-you-can-eat plans, I am in no position to stop them, and operators keep introducing new plans. If an operator has signed a contract with a consumer, it should not change the plan during the contract period in a way that harms the consumer's rights."

▲NCC vice chairman Yu Hsiao-cheng said the NCC's position is to encourage operators to deploy more small-cell base stations and add transmission capacity, giving the public better connection quality.
Yu said the NCC's position is to encourage operators to deploy more microcell base stations, which can relieve the load on large cells, and to add transmission capacity so that the physical network runs faster. The coming release covers three bands at 700/900/1800MHz, totaling 270MHz of spectrum, all of which will be planned for GSM/LTE technologies. With more spectrum on the air, much like widening a highway, everyone's traffic can be relieved.
Wang Ting-chun, deputy director-general of the Ministry of Transportation and Communications' Department of Posts and Telecommunications, argued that no matter how much spectrum is released and how far the technology advances, there is always a ceiling, and adding base stations runs into its own problems, especially protests in metropolitan areas, so the supply side also has its limits. He summed up his view in the phrase "segment the traffic, charge by tier," which he called a shared principle among telecom operators worldwide and the only common direction for solving network congestion. As for opening WiMAX spectrum so that the public has more spectrum to use, opinions are being collected from all sides, including the incumbent operators, but nothing has been decided.
Legislator Lin Chia-lung argued that a policymaker must have a blueprint in mind; this is definitely not something that can be solved by outsourcing. On infrastructure investment, Lin said telecom operators of course have a duty to invest, but that does not mean the government bears no responsibility: channeling the proceeds of the spectrum release into infrastructure, or drafting legislation to manage base station sites, are all areas where the government should actively step in.

▲Legislator Lin Chia-lung argued that a policymaker must have a blueprint in mind and that this is not something outsourcing can solve; on infrastructure investment, the government is not free of responsibility, and measures such as legislation to manage base station sites are areas where it should actively step in. (Photo: Hung Sheng-yi)
As for all-you-can-eat plans, Lin Chia-lung also favored tiered pricing: how to ensure consumers' rights are not harmed, how to reward lighter users with discounts, and what rates should apply to heavy users who exceed the average, much as people can choose among provincial highways, county roads, freeways and high-speed rail at different prices; otherwise the arrangement simply fails the test of fairness.
In any case, the release of Taiwan's 4G LTE licenses can no longer wait. Lin Tsung-nan noted that in Asia, Hong Kong, Japan, South Korea, China, Singapore and the Philippines all offer 4G LTE service, that Laos will launch LTE this November, and that even Tanzania, Angola and Namibia, far behind Taiwan technologically, have announced commercial LTE launches as early as April and May of this year.

By contrast, Taiwan released its 2G licenses promptly, following the lead of European countries, yet while Northern Europe has long since issued 4G LTE licenses, Taiwan will not have LTE service until 2015, four years behind the Philippines and three years behind the African countries. Lin argued that Taiwan's LTE rollout has only a problem of being too late, not of being too fast, let alone any question of whether it should be built at all.

Judging from the attitude of the telecom operators present, however, apart from Lin Kuo-feng, president of Chunghwa Telecom's mobile business group, and Ko Chih-hung, assistant vice president of regulatory affairs and quality assurance at Asia Pacific Telecom, the representatives of Taiwan Mobile, FarEasTone, Vibo Telecom and Vee Time expressed no opinions at all during the meeting and offered no response to the views of the legislators and the Executive Yuan; they appeared to remain in a passive posture of listening and reporting back, which was rather disheartening.