Wednesday, April 26, 2017

Google and NVIDIA's artificial-intelligence chip war. Intel on the outside: The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel



Google recently published a report comparing the speed and bandwidth of its deep-learning chip, the TPU, with NVIDIA's graphics processing unit (GPU). The report prompted a written response from NVIDIA CEO Jen-Hsun Huang, further intensifying the two companies' war over artificial-intelligence chips.



Intel on the outside: The rise of artificial intelligence is creating new variety in the chip market, and trouble for Intel

The success of Nvidia and its new computing chip signals rapid change in IT architecture

“WE ALMOST went out of business several times.” Usually founders don’t talk about their company’s near-death experiences. But Jen-Hsun Huang, the boss of Nvidia, has no reason to be coy. His firm, which develops microprocessors and related software, is on a winning streak. In the past quarter its revenues increased by 55%, reaching $2.2bn, and in the past 12 months its share price has almost quadrupled.
A big part of Nvidia’s success comes from fast-growing demand for its chips, called graphics processing units (GPUs), which turn personal computers into fast gaming devices. But the GPUs also have new destinations: notably data centres, where artificial-intelligence (AI) programmes gobble up the vast quantities of computing power that these chips generate.



Soaring sales of these chips (see chart) are the clearest sign yet of a secular shift in information technology. The architecture of computing is fragmenting because of the slowing of Moore’s law, which until recently guaranteed that the power of computing would double roughly every two years, and because of the rapid rise of cloud computing and AI. The implications for the semiconductor industry and for Intel, its dominant company, are profound.
Things were straightforward when Moore’s law, named after Gordon Moore, a founder of Intel, was still in full swing. Whether in PCs or in servers (souped-up computers in data centres), one kind of microprocessor, known as a “central processing unit” (CPU), could deal with most “workloads”, as classes of computing tasks are called. Because Intel made the most powerful CPUs, it came to rule not only the market for PC processors (it has a market share of about 80%) but the one for servers, where it has an almost complete monopoly. In 2016 it had revenues of nearly $60bn.
This unipolar world is starting to crumble. Processors are no longer improving quickly enough to be able to handle, for instance, machine learning and other AI applications, which require huge amounts of data and hence consume more number-crunching power than entire data centres did just a few years ago. Intel’s customers, such as Google and Microsoft together with other operators of big data centres, are opting for more and more specialised processors from other companies and are designing their own to boot.
Nvidia’s GPUs are one example. They were created to carry out the massive, complex computations required by interactive video games. GPUs have hundreds of specialised “cores” (the “brains” of a processor), all working in parallel, whereas CPUs have only a few powerful ones that tackle computing tasks sequentially. Nvidia’s latest processors boast 3,584 cores; Intel’s server CPUs have a maximum of 28.
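The architectural contrast can be sketched loosely in plain Python. This is an illustration of the data-parallel idea only, not real GPU code: a CPU-style loop walks the data one element at a time, while a GPU-style computation breaks the same job into many tiny, independent per-element tasks, each of which could run on its own core (a thread pool merely mimics that decomposition here).

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixel):
    """Per-element task: the kind of tiny, independent job a GPU core runs."""
    return min(pixel + 50, 255)

pixels = [10, 120, 200, 250, 90, 30]

# CPU-style: one core processes the elements sequentially.
sequential = [brighten(p) for p in pixels]

# GPU-style: each element is an independent task; hundreds of cores
# could handle them all at once.
with ThreadPoolExecutor(max_workers=len(pixels)) as pool:
    parallel = list(pool.map(brighten, pixels))

assert sequential == parallel  # same answer; only the execution model differs
print(parallel)  # [60, 170, 250, 255, 140, 80]
```

The point of the sketch is that the two approaches compute identical results; the GPU's advantage is purely in throughput when the number of independent tasks is large.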
The company’s lucky break came in the midst of one of its near-death experiences during the 2008-09 global financial crisis. It discovered that hedge funds and research institutes were using its chips for new purposes, such as calculating complex investment and climate models. It developed a coding language, called CUDA, that helps its customers program its processors for different tasks. When cloud computing, big data and AI gathered momentum a few years ago, Nvidia’s chips were just what was needed.
Every online giant uses Nvidia GPUs to give its AI services the capability to ingest reams of data from material ranging from medical images to human speech. The firm’s revenues from selling chips to data-centre operators trebled in the past financial year, to $296m.
And GPUs are only one sort of “accelerator”, as such specialised processors are known. The range is expanding as cloud-computing firms mix and match chips to make their operations more efficient and stay ahead of the competition. “Finding the right tool for the right job”, is how Urs Hölzle, in charge of technical infrastructure at Google, describes balancing the factors of flexibility, speed and cost.
At one end of the range are ASICs, an acronym for “application-specific integrated circuits”. As the term suggests, they are hard-wired for one purpose and are the fastest on the menu as well as the most energy-efficient. Dozens of startups are developing such chips with AI algorithms already built in. Google has built an ASIC called “Tensor Processing Unit” for speech recognition.
The other extreme is field-programmable gate arrays (FPGAs). These can be programmed, meaning greater flexibility, which is why even though they are tricky to handle, Microsoft has added them to many of its servers, for instance those underlying Bing, its online-search service. “We now have more FPGAs than any other organisation in the world,” says Mark Russinovich, chief technology officer at Azure, the firm’s computing cloud.
Time to be paranoid
Instead of making ASICs or FPGAs, Intel focused in recent years on making its CPU processors ever more powerful. Nobody expects conventional processors to lose their jobs anytime soon: every server needs them and countless applications have been written to run on them. Intel’s sales from the chips are still growing. Yet the quickening rise of accelerators appears to be bad news for the company, says Alan Priestley of Gartner, an IT consultancy. The more computing happens on them, the less is done on CPUs.
One answer is to catch up by making acquisitions. In 2015 Intel bought Altera, a maker of FPGAs, for a whopping $16.7bn. In August it paid more than $400m for Nervana, a three-year-old startup that is developing specialised AI systems ranging from software to chips. The firm says it sees specialised processors as an opportunity, not a threat. New computing workloads have often started out being handled on specialised processors, explains Diane Bryant, who runs Intel’s data-centre business, only to be “pulled into the CPU” later. Encryption, for instance, used to happen on separate semiconductors, but is now a simple instruction on the Intel CPUs which run almost all computers and servers globally. Keeping new types of workload, such as AI, on accelerators would mean extra cost and complexity.
If such integration occurs, Intel has already invested to take advantage. In the summer it will start selling a new processor, code-named Knights Mill, to compete with Nvidia. Intel is also working on another chip, Knights Crest, which will come with Nervana technology. At some point, Intel is also expected to combine its CPUs with Altera’s FPGAs.
Predictably, competitors see the future differently. Nvidia reckons it has already established its own computing platform. Many firms have written AI applications that run on its chips, and it has created the software infrastructure for other kinds of programmes, which, for instance, enable visualisations and virtual reality. One decades-old computing giant, IBM, is also trying to make Intel’s life harder. Taking a page from open-source software, the firm in 2013 “opened” its processor architecture, which is called Power, turning it into a semiconductor commons of sorts. Makers of specialised chips can more easily combine their wares with Power CPUs, and they get a say in how the platform develops.
Much will depend on how AI develops, says Matthew Eastwood of IDC, a market researcher. If it turns out not to be the revolution that many people expect, and ushers in change for just a few years, Intel’s chances are good, he says. But if AI continues to ripple through business for a decade or more, other kinds of processor will have more of a chance to establish themselves. Given how widely AI techniques can be applied, the latter seems likely. Certainly, the age of the big, hulking CPU which handles every workload, no matter how big or complex, is over. It suffered, a bit like Humpty Dumpty, a big fall. And all of Intel’s horses and all of Intel’s men cannot put it together again.


This article appeared in the Business section of the print edition under the headline "Silicon crumble"

Tuesday, April 25, 2017

An Old Rock Could Lead to Next Generation Solar Cells


Perovskite can absorb sunlight and turn it into electricity.
Perovskite technology could disrupt the world's solar market, currently dominated by China
SCIENTIFICAMERICAN.COM


Perovskite
From Wikipedia, the free encyclopedia

Perovskite (鈣鈦礦)

Basic data
Category: oxide mineral
Chemical formula: CaTiO3
Strunz classification: 04.CC.30

Properties
Molar mass: 135.96 u
Color: black, reddish brown, pale yellow, yellow-orange
Crystal habit: pseudo-cubic; crystals show a cubic outline
Crystal system: orthorhombic (2/m 2/m 2/m), space group Pnma
Twinning: complex penetration twins
Cleavage: [100] good, [010] good, [001] good
Fracture: conchoidal
Mohs hardness: 5–5.5
Luster: adamantine to metallic; may be dull
Streak: grayish white
Diaphaneity: transparent to opaque
Specific gravity: 3.98–4.26
Optical properties: biaxial (+)
Refractive index: nα = 2.3, nβ = 2.34, nγ = 2.38
Other characteristics: non-radioactive, non-magnetic
Perovskite refers to a class of ceramic oxides with the general formula ABO3. The first compound of this class to be discovered was calcium titanate (CaTiO3), found in perovskite ore, which gave the family its name. Because these compounds have many structural peculiarities, they are widely studied and applied in condensed-matter physics, so physicists and chemists often refer to them by the 1:1:3 ratio of elements in the formula, hence the nickname "113 structure".
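The molar mass tabulated above (about 135.96 u for CaTiO3) can be checked from standard atomic weights. The figures below are the usual IUPAC values rounded to three decimals; small discrepancies against the table come from rounding:

```python
# Standard atomic weights (IUPAC, rounded to three decimals).
ATOMIC_WEIGHT = {"Ca": 40.078, "Ti": 47.867, "O": 15.999}

def molar_mass(composition):
    """Molar mass of a compound given as {element: atom count}."""
    return sum(ATOMIC_WEIGHT[el] * n for el, n in composition.items())

# CaTiO3: one Ca, one Ti, three O — the 1:1:3 "113 structure" ratio.
catio3 = molar_mass({"Ca": 1, "Ti": 1, "O": 3})
print(round(catio3, 2))  # 135.94, close to the 135.96 u in the table
```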

Monday, April 24, 2017

Pasteurization, 1862



[On This Day in History]
Without the Frenchman Louis Pasteur, one thing is certain: the world's population would never have exploded as it has, because human health would still be largely threatened and dominated by the invisible "empire of microbes". Pasteur's research struck at the heart of that empire. He proposed the epoch-making theory of vaccination and went on to develop vaccines against rabies and anthrax, foiling the microbes at every turn. A born nemesis of microorganisms, he came to be known as the "father of microbiology".

Pasteur also invented a low-temperature sterilization method, "pasteurization", which is widely used to disinfect milk, wine, beer and fruit juice. Without Pasteur, these fresh beverages could never have grown into huge industries, or even reached supermarket shelves; they would have been limited to the "pick-your-own orchard" model, squeezed fresh and drunk on the spot.


Pasteur also founded the Pasteur Institute to carry on the gains of his "war against microbes". In 1983 the Pasteur Institute was the first to successfully isolate the human immunodeficiency virus (HIV), and it has made many revolutionary discoveries concerning diphtheria, tetanus, tuberculosis, polio, influenza, yellow fever and plague. Since 1908, eight of the institute's scientists have won Nobel Prizes.


[Min Pao] [On This Day in History] 1862.4.20: First test of pasteurization completed
The world-famous process of "pasteurization", widely used to extend the shelf life of milk, wine and other foods, was first tested on April 20, 1862, by the French scientist and "father of micro…

Saturday, April 22, 2017

Kengo Kuma’s Carbon Fiber Curtain Makes Buildings Earthquake-Proof


This innovative reinforcement at Komatsu Seiren's showroom facility in Nomi, Japan, provides added stability to the building.
Earthquakes? No longer a problem for fabric manufacturer Komatsu Seiren
ARCHITIZER.COM

March for Science: Stand Up for Science


Ten thousand march for science in London
Financial Times



Stand Up for Science: A Worldwide March of Scholars
This Sunday, scientists in more than 600 cities around the world took to the streets to rally against political interference in science and to voice their demand for freedom of research.

Friday, April 21, 2017

The Brain Science of Conformity

Why are we so quick to follow along when people around us embrace a falsehood? Because of the anxiety, unease and disgust caused by standing apart.

It’s a problem in schoolyards, board rooms and, especially lately, in politics, where social media has proven a powerful tool for rallying people to untruths.
WSJ.COM