Sunday, May 28, 2017

When Will You Come Again, AlphaGo (何日君再來)


Hanching Chung

Really now. You, AlphaGo, Go player number one, needed only a single round of play to leave the Go champions in tears, conceding defeat.
You drove the nationalists and the philosophers to sour grapes: "Artificial intelligence may be clever, but it still lacks morality and feeling."
You leave wisdom to humankind, and this regret stretches on and on... You take with you not a single wisp of cloud from the unfinished song; without even a "bye bye", you are heading into "retirement".
May I ask: will anyone in this world write an "After the AlphaGo"?
(For this brain-mimicking calculator/program, in the manner of: Clouston, B.; Stansfield, K., eds. (1979). After the Elm. London: Heinemann. A general introduction, with a history of Dutch elm disease and proposals for re-landscaping in the aftermath of the pandemic. Illustrated.)


Apple Daily editorial: The Go-playing computer trounces humanity's geniuses

 
The artificial-intelligence (AI) Go player AlphaGo announced its retirement the day before yesterday, after beating Chinese 9-dan professional Ke Jie once again in their third game.
During the decisive game Ke Jie fell into long spells of thought, scratching his head, his face pained; after three and a half hours of hard fighting he resigned. Afterwards Ke lamented, "AlphaGo is too perfect. During the game I could only guess half of its moves, and from here the gap will only grow"; "it never lets you feel any hope of winning." Demis Hassabis, founder of DeepMind, which created AlphaGo, said: "We have witnessed the genius Ke Jie push AlphaGo to its limit. The match showed AI at its highest level and let humanity explore AI's potential as a tool."

The human brain's growth has limits

People are astonished that an AI Go player has repeatedly beaten human geniuses, and their minds are full of questions: Why does the AI win? How can the AI teach itself to such a profound level? Will AI robots one day, as in the movies, control and manage humans in a way we cannot escape? Will robots ever simulate human emotions, our joys and sorrows?
Wang Ming-wan (王銘琬), once called a child prodigy and now 56, has lived in Japan for many years and won multiple Go titles, including two Honinbo titles and the Oza title. He has long been involved in computer Go, and in 2014 served as an adviser on Trend Micro's Go software. Go is seen as the first battlefield on which AI marches into everyday life, and Wang is one of that battlefield's analysts. In his book just published by Yuan-Liou, 《迎接AI新時代:用圍棋理解人工智慧》 (roughly, "Welcoming the New AI Era: Understanding Artificial Intelligence Through Go"), he offers partial answers to these questions.
Wang concludes that the human brain will no longer be able to beat the computer. Human players follow a growth curve: the closer they get to their limit, the more their improvement flattens, and the curve approaches horizontal. Once deep learning was added, the computer began climbing a different curve of playing strength: starting from a three-stone handicap against humans, it quickly overtook them and has kept rising. Wang therefore concludes: "The contest between computers and human brains at Go has largely moved past the question of who is stronger; it is now a time to appreciate what AI can do while thinking about the relationship between AI and humanity."
Each move in Go offers hundreds of possible choices. A computer can evaluate these possibilities at enormous speed and settle on the most advantageous continuation, while a human brain cannot come close to exhausting them in the time available, which makes it very hard for a human to win.
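To give a sense of the scale involved, here is a rough back-of-the-envelope sketch (my own illustration, not from Wang's book; the branching factor and game length are commonly cited approximations) of why no machine can simply enumerate every continuation, and why selective search and learned evaluation matter:

```python
# Rough order-of-magnitude estimate of Go's game-tree size.
# Assumed figures: ~250 plausible moves per position, games of ~150 moves.
branching_factor = 250
game_length = 150

positions = branching_factor ** game_length
print(f"~10^{len(str(positions)) - 1} possible move sequences")  # roughly 10^359

# Even at a generous trillion positions per second, exhaustive search is hopeless.
rate_per_second = 10**12
seconds_per_year = 60 * 60 * 24 * 365
years_needed = positions // (rate_per_second * seconds_per_year)
print(f"years to enumerate everything: ~10^{len(str(years_needed)) - 1}")
```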

"Resonance" cannot be replaced

Wang also draws on the Go concepts of "life and death", "big and small", "playing strength" and "resonance" to explain why AI beats the human brain, yet on the spiritual level he affirms human superiority. He says: "Resonance exists only between human beings... When the age arrives in which AI surrounds us, simply being aware that one is human may itself be a kind of happiness. People can only feel and think according to their own nature, and only other people can understand and respond to that. None of us can live alone, and what we share with other human beings can never be replaced by anything else."

Friday, May 19, 2017

Google thinks it has cracked the VR adoption problem

When Apple first unveiled its digital music player, the iPod, in 2001, founder Steve Jobs introduced it this way: the iPod "puts 1,000 songs in your pocket".
Suppose Jobs had instead said, "This music player weighs only 185 grams and holds 5 GB." Would the iPod's appeal have grown or shrunk?




For most consumers, virtual reality is still a technology of the future. Google hopes that by making the virtual world more convenient and accessible, more people will want to dive in.

It’s launching a high-end wireless headset and new software improvements…
TECHNOLOGYREVIEW.COM

Thursday, May 18, 2017

The AI-chip war between Google and NVIDIA (Volta, the 7th-generation GPU architecture); Intel on the outside: The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel

A while ago I discussed with my students Google's TPU paper, to be presented at the ISCA conference in June, and at the end I reminded them that the paper describes technology that is two years old; imagine how powerful the TPUs in Google's data centres are now...
The player most likely to be affected by TPU technology is NVIDIA. Many people now buy GPUs to accelerate deep learning because NVIDIA developed this market first, so the hardware is easy to obtain and the user experience is relatively mature. For large-scale deployment, however, price-performance and computational efficiency have to be weighed.
The price-performance of GPUs has been propped up by the gaming market. As NVIDIA gradually adds deep-learning-oriented optimisations to its GPU architecture, such as half precision, large matrix operations, and the high-speed NVLink interconnect, features ordinary games may never use, then if gamers are unwilling to pay for them, the profit will have to come from deep-learning customers.
Google developed the TPU to drive down the cost of deep learning and to turn deep learning into an affordable, democratised cloud service. That trend is unfavourable for NVIDIA, but it is too early to say who will prevail. Competition speeds up progress, and that much we can look forward to.
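As a concrete illustration of the half-precision point above, here is a minimal sketch (my own example; it assumes PyTorch and a CUDA-capable GPU, and the actual speedup depends heavily on the hardware generation) comparing FP32 and FP16 matrix multiplication:

```python
import time
import torch

def time_matmul(dtype, n=4096, repeats=10):
    """Average time of an n x n matrix multiplication on the GPU in the given precision."""
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()
    start = time.time()
    for _ in range(repeats):
        c = a @ b
    torch.cuda.synchronize()
    return (time.time() - start) / repeats

if torch.cuda.is_available():
    print("FP32:", time_matmul(torch.float32), "s per matmul")
    print("FP16:", time_matmul(torch.float16), "s per matmul")  # typically faster on deep-learning-oriented GPUs
```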

----
How powerful is Volta? As NVIDIA's 7th-generation GPU architecture, it packs 21 billion transistors and 5,120 CUDA cores, and its deep-learning performance is said to rival that of 100 CPUs. Compared with the previous-generation Pascal it delivers a 5x performance improvement, and compared with the Maxwell architecture of two years earlier, a 15x improvement.



NVIDIA drops another bombshell on the industry: performance equal to 100 CPUs as the 7th-generation GPU "Volta" debuts! - INSIDE




Google recently published a report comparing the speed and bandwidth of its deep-learning chip, the TPU, with NVIDIA's graphics chip, the GPU, prompting a public response from NVIDIA CEO Jensen Huang and further heating up the two companies' AI-chip war.





Intel on the outside
The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel

The success of Nvidia and its new computing chip signals rapid change in IT architecture

“WE ALMOST went out of business several times.” Usually founders don’t talk about their company’s near-death experiences. But Jen-Hsun Huang, the boss of Nvidia, has no reason to be coy. His firm, which develops microprocessors and related software, is on a winning streak. In the past quarter its revenues increased by 55%, reaching $2.2bn, and in the past 12 months its share price has almost quadrupled.
A big part of Nvidia’s success stems from quickly growing demand for its chips, called graphics processing units (GPUs), which turn personal computers into fast gaming devices. But the GPUs also have new destinations: notably data centres, where artificial-intelligence (AI) programmes gobble up the vast quantities of computing power that these chips provide.






Soaring sales of these chips (see chart) are the clearest sign yet of a secular shift in information technology. The architecture of computing is fragmenting because of the slowing of Moore’s law, which until recently guaranteed that the power of computing would double roughly every two years, and because of the rapid rise of cloud computing and AI. The implications for the semiconductor industry and for Intel, its dominant company, are profound.
Things were straightforward when Moore’s law, named after Gordon Moore, a founder of Intel, was still in full swing. Whether in PCs or in servers (souped-up computers in data centres), one kind of microprocessor, known as a “central processing unit” (CPU), could deal with most “workloads”, as classes of computing tasks are called. Because Intel made the most powerful CPUs, it came to rule not only the market for PC processors (it has a market share of about 80%) but the one for servers, where it has an almost complete monopoly. In 2016 it had revenues of nearly $60bn.
This unipolar world is starting to crumble. Processors are no longer improving quickly enough to be able to handle, for instance, machine learning and other AI applications, which require huge amounts of data and hence consume more number-crunching power than entire data centres did just a few years ago. Intel’s customers, such as Google and Microsoft together with other operators of big data centres, are opting for more and more specialised processors from other companies and are designing their own to boot.
Nvidia’s GPUs are one example. They were created to carry out the massive, complex computations required by interactive video games. GPUs have hundreds of specialised “cores” (the “brains” of a processor), all working in parallel, whereas CPUs have only a few powerful ones that tackle computing tasks sequentially. Nvidia’s latest processors boast 3,584 cores; Intel’s server CPUs have a maximum of 28.
The company’s lucky break came in the midst of one of its near-death experiences during the 2008-09 global financial crisis. It discovered that hedge funds and research institutes were using its chips for new purposes, such as calculating complex investment and climate models. It developed a coding language, called CUDA, that helps its customers program its processors for different tasks. When cloud computing, big data and AI gathered momentum a few years ago, Nvidia’s chips were just what was needed.
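To make the "many simple cores working in parallel" point concrete: CUDA lets a programmer write one small function (a kernel) that thousands of GPU threads execute simultaneously over different elements of the data. A minimal sketch, assuming Python with the Numba library and a CUDA-capable GPU (my own example, not code from Nvidia):

```python
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    # Each GPU thread handles one element; thousands of threads run in parallel.
    i = cuda.grid(1)
    if i < out.size:
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.ones(n, dtype=np.float32)
b = np.full(n, 2.0, dtype=np.float32)
out = np.zeros(n, dtype=np.float32)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](a, b, out)  # arrays are copied to and from the GPU automatically

print(out[:5])  # [3. 3. 3. 3. 3.]
```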
Every online giant uses Nvidia GPUs to give its AI services the capability to ingest reams of data from material ranging from medical images to human speech. The firm’s revenues from selling chips to data-centre operators trebled in the past financial year, to $296m.
And GPUs are only one sort of “accelerator”, as such specialised processors are known. The range is expanding as cloud-computing firms mix and match chips to make their operations more efficient and stay ahead of the competition. “Finding the right tool for the right job”, is how Urs Hölzle, in charge of technical infrastructure at Google, describes balancing the factors of flexibility, speed and cost.
At one end of the range are ASICs, an acronym for “application-specific integrated circuits”. As the term suggests, they are hard-wired for one purpose and are the fastest on the menu as well as the most energy-efficient. Dozens of startups are developing such chips with AI algorithms already built in. Google has built an ASIC called “Tensor Processing Unit” for speech recognition.
The other extreme is field-programmable gate arrays (FPGAs). These can be programmed, meaning greater flexibility, which is why even though they are tricky to handle, Microsoft has added them to many of its servers, for instance those underlying Bing, its online-search service. “We now have more FPGAs than any other organisation in the world,” says Mark Russinovich, chief technology officer at Azure, the firm’s computing cloud.
Time to be paranoid
Instead of making ASICs or FPGAs, Intel focused in recent years on making its CPUs ever more powerful. Nobody expects conventional processors to lose their jobs anytime soon: every server needs them and countless applications have been written to run on them. Intel’s sales from the chips are still growing. Yet the quickening rise of accelerators appears to be bad news for the company, says Alan Priestley of Gartner, an IT consultancy. The more computing happens on them, the less is done on CPUs.
One answer is to catch up by making acquisitions. In 2015 Intel bought Altera, a maker of FPGAs, for a whopping $16.7bn. In August it paid more than $400m for Nervana, a three-year-old startup that is developing specialised AI systems ranging from software to chips. The firm says it sees specialised processors as an opportunity, not a threat. New computing workloads have often started out being handled on specialised processors, explains Diane Bryant, who runs Intel’s data-centre business, only to be “pulled into the CPU” later. Encryption, for instance, used to happen on separate semiconductors, but is now a simple instruction on the Intel CPUs which run almost all computers and servers globally. Keeping new types of workload, such as AI, on accelerators would mean extra cost and complexity.
If such integration occurs, Intel has already invested to take advantage. In the summer it will start selling a new processor, code-named Knights Mill, to compete with Nvidia. Intel is also working on another chip, Knights Crest, which will come with Nervana technology. At some point, Intel is expected also to combine its CPUs with Altera’s FPGAs.
Predictably, competitors see the future differently. Nvidia reckons it has already established its own computing platform. Many firms have written AI applications that run on its chips, and it has created the software infrastructure for other kinds of programmes, which, for instance, enable visualisations and virtual reality. One decades-old computing giant, IBM, is also trying to make Intel’s life harder. Taking a page from open-source software, the firm in 2013 “opened” its processor architecture, which is called Power, turning it into a semiconductor commons of sorts. Makers of specialised chips can more easily combine their wares with Power CPUs, and they get a say in how the platform develops.
Much will depend on how AI develops, says Matthew Eastwood of IDC, a market researcher. If it turns out not to be the revolution that many people expect, and ushers in change for just a few years, Intel’s chances are good, he says. But if AI continues to ripple through business for a decade or more, other kinds of processor will have more of a chance to establish themselves. Given how widely AI techniques can be applied, the latter seems likely. Certainly, the age of the big, hulking CPU which handles every workload, no matter how big or complex, is over. It suffered, a bit like Humpty Dumpty, a big fall. And all of Intel’s horses and all of Intel’s men cannot put it together again.





This article appeared in the Business section of the print edition under the headline "Silicon crumble"

Saturday, May 13, 2017

How to avoid ransomware attacks; Officials Expect Cyberattacks to Spread on Monday; 'Accidental hero' finds kill switch to stop spread of ransomware cyber-attack


Microsoft blames the US government

"WannaCry" is believed to be built on a cyber tool stolen from the US National Security Agency (NSA); it mainly exploits a security flaw in older versions of Microsoft's operating systems. Microsoft released a patch back in March, but many users never installed the update and became targets of the attack. US President Trump has ordered homeland security adviser Tom Bossert to take charge of containing the damage.
Microsoft President Brad Smith blamed the US government, criticising the CIA and NSA for stockpiling exploits that hackers can abuse instead of reporting the vulnerabilities to software vendors: "This attack provides yet another example of why the stockpiling of vulnerabilities by governments is such a problem"; "The governments of the world should treat this attack as a wake-up call."

How to avoid ransomware attacks

1. Back up important data to a USB flash drive, an external hard disk, or cloud storage
2. Close TCP port 445 on Windows and turn off shared network folders (see the port-check sketch after this list)
3. Do not click on websites, files, or emails of unknown origin
4. Install Microsoft's security patch (MS17-010) for the flaw exploited by EternalBlue
5. Do not pay the ransom: once attackers know you are able to pay, the odds of being extorted again are high, and there is no guarantee they will keep their word
6. Turn on Windows Update and keep the system patched at all times
7. If the infection screen has just appeared and files are still being encrypted, disconnect from the network immediately and force the machine to shut down
8. If every file has already been encrypted, consider trying the decryption tools released by antivirus vendors; do not run antivirus software against the malware head-on first, or the files may remain unopenable even after decryption
9. If decryption still fails, keep the hard disk, or clone it directly to another disk
Source: Central News Agency (中央社), Institute for Information Industry (資策會)
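As a small illustration of item 2, the sketch below (my own example, not from the listed sources; the host address is a hypothetical placeholder) uses Python's standard library to check whether TCP port 445, the Windows SMB port abused by this worm, is still reachable on a machine you administer:

```python
import socket

def port_445_open(host: str, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to port 445 (SMB) succeeds."""
    try:
        with socket.create_connection((host, 445), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    host = "192.168.1.10"  # hypothetical address of a machine on your own network
    print(f"Port 445 open on {host}: {port_445_open(host)}")
```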


Officials Expect Cyberattacks to Spread on Monday 


Cybersecurity experts are expecting another wave, on Monday, of computer-system attacks that encrypt files and demand ransom to unlock them, as companies and government agencies seek to restore normal operations and figure out the roots of the attack.




The man who spent £8.50 and stopped the spread of ransomware that hit IT systems around the world.

Kill switch - Wikipedia

https://en.wikipedia.org/wiki/Kill_switch

A kill switch, also known as an emergency stop (e-stop) or emergency power off (EPO), is a safety mechanism used to shut off a device or machinery in an emergency.


A cyber-attack is wreaking havoc around the world – but a British man has halted its spread by registering a web domain for $10.69.
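As widely reported, the "kill switch" worked because the ransomware tried to reach a long, hard-coded, unregistered web domain and stopped spreading once that domain answered. The sketch below approximates the idea with a plain DNS lookup in Python; the domain shown is a hypothetical placeholder, not the actual WannaCry domain:

```python
import socket

# Hypothetical placeholder; the real malware used a long, hard-coded gibberish domain.
KILL_SWITCH_DOMAIN = "example-kill-switch-domain.invalid"

def kill_switch_tripped(domain: str) -> bool:
    """Return True if the domain resolves, i.e. someone has registered it."""
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

# Once the researcher registered the domain, a check like this began returning True,
# and newly infected machines stopped propagating the payload.
if kill_switch_tripped(KILL_SWITCH_DOMAIN):
    print("Kill switch active: propagation halted.")
else:
    print("Domain unregistered: the malware would keep spreading.")
```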



Spread of malware curtailed by expert who simply registered a domain…
THEGUARDIAN.COM

Wednesday, May 10, 2017

Baby brain scans reveal trillions of neural connections

Baby brain scans reveal trillions of neural connections http://bbc.in/2q0QbGt

Scientists release groundbreaking medical scans that reveal how the human brain develops.
BBC.CO.UK | By BBC News

HIV life expectancy 'near normal' thanks to new drugs. Eliminating the HIV epidemic is possible, says UCLA. Experimental HIV Vaccine Gives Some Protection

HIV life expectancy 'near normal' thanks to new drugs http://bbc.in/2r3mBhU



Newer medications have fewer side effects and are more efficient at stopping the virus.
BBC.CO.UK | By BBC News

----2016.5

A nearly 20-year analysis by researchers yields the first proof that the "treatment as prevention" approach could eliminate the HIV epidemic.


A new study shows that effective treatment of patients in Denmark has virtually contained the disease.
UNIVERSITYOFCALIFORNIA.EDU


*****

2009.9 An experimental vaccine regimen has shown a modest ability to protect people exposed to HIV, the first time an investigational HIV vaccine has been shown to have this effect.

The results from the trial, which involved more than 16,000 adults in Thailand, indicated that the vaccine regimen lowered the rate of contracting HIV by 31% compared with those taking a placebo, according to the U.S. National Institutes of Health, which helped fund the study.

'Additional research is needed to better understand how this vaccine regimen reduced the risk of HIV infection, but certainly this is an encouraging advance for the HIV vaccine field,' said Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, which is part of the NIH.

It is a rare piece of good news for the field of AIDS vaccine research, which has sponsored more than 100 vaccine trials since 1987 but without any significant success.

The regimen consists of two vaccines. One is a primer dose made by Sanofi Pasteur, the vaccine division of French drug maker Sanofi-Aventis; the other is a booster dose developed by Vaxgen Inc. and now licensed to Global Solutions for Infectious Diseases, in South San Francisco, Calif.

Gautam Naik

Tuesday, May 9, 2017

New iPhones put Japanese OLED-screen suppliers in the spotlight; Sony relaunching OLED TVs using LG-made panels





Sony will sell OLED TVs again by sourcing panels from LG Display. The TV features proprietary image-processing technology and a screen that also acts as the speaker.



Sony relaunching OLED TVs using LG-made panels
TOKYO -- Sony's high-definition OLED television, which hits domestic stores next month, will mark a return after sales were halted in 2010. But instea
ASIA.NIKKEI.COM

Bloomberg Businessweek / Chinese edition
[Breaking news] Apple's new iPhone puts Japanese suppliers in the spotlight
The iPhone's success has transformed the fortunes of more than a dozen suppliers, from glass makers to the manufacturers of the robots that machine its metal casings.
Now, as Apple prepares to introduce a new smartphone with an OLED screen, a Japanese petrochemical company best known for its chain of petrol stations is about to join that list.
Idemitsu Kosan began experimenting with organic light-emitting diodes in the mid-1980s, as it sought to reduce its dependence on oil after the global oil shocks. Today, whether in Google's latest Pixel smartphone or Samsung's Galaxy, the blue pixels on those OLED screens are most likely built with Idemitsu Kosan's materials or patents.
As OLED displays, with their sharper images and lower power consumption, come into wider use, Japanese suppliers that until now had little connection to iPhone sales are suddenly in the spotlight. Canon Tokki Corp., for instance, holds a near monopoly on the large vacuum machines used to manufacture OLED screens. Dai Nippon Printing and Toppan Printing are the leading makers of the fine metal masks needed to deposit OLED pixels.
"Samsung has been using OLED screens for a few years now, but Apple joining in gives these screens a huge push," said Alberto Moel, a technology analyst at Sanford C. Bernstein & Co. "It makes everyone else want to do the same thing."
People familiar with the matter say that in the iPhone's tenth-anniversary year, Apple plans to launch at least one new iPhone with an OLED screen. By industry estimates, Apple's adoption of OLED displays will add demand for hundreds of millions of panels. IHS Markit estimates that OLED will overtake LCD in smartphone panels this year.
For decades OLED has held out the promise of better displays, even as LCDs dominated. OLED screens can be thinner and more power-efficient and show deeper blacks, because their organic pixels emit their own light, whereas LCDs need a backlight. OLED displays can also be built on flexible plastic, allowing many shapes and broader applications. The challenge has been to produce screens that last long enough and are bright enough, at a low enough price.
When Idemitsu Kosan began developing the technology in 1985, OLED was little more than a promising future science. For a decade, Japan's second-largest oil refiner struggled to make the material last more than a few seconds, until a breakthrough extended its lifetime. That technology allowed Pioneer to put the world's first commercial OLED display into a car stereo in 1999.
"We have been involved from the very beginning," said Yuichiro Kawamura, chief researcher at Idemitsu Kosan's electronic-materials R&D centre.
More advances followed. South Korean display makers invested heavily to produce OLED panels that could compete with LCDs on price. Samsung bet the future of its smartphones on OLED and now holds the overwhelming share of the small-size OLED screen market. LG Electronics Inc. allied with Idemitsu Kosan in 2009, focusing on television screens.
"We held all the important patents for a decade," said Takamitsu Nagase, general manager for business strategy in Idemitsu Kosan's electronic-materials division. "But only in the past three or four years has the technology become a real business."
Although Idemitsu Kosan gained a first-mover advantage in developing OLED technology, competitors such as Dow Chemical and Merck have entered the field, pouring money into R&D in search of better ways to make the organic materials. By Pavel Alpeyev and Takako Taniguchi
#NewiPhone #Japan