Thursday, May 15, 2008

The latest processing chips require “parallel programming”

Software

Cores of the problem

May 14th 2008
From Economist.com

The latest processing chips require a new approach to writing software


COMPUTER makers talk a lot about a coming wave of software that will change the way people interact with their machines. Rich three-dimensional virtual worlds and multimedia applications that mimic the experience of a live concert in a living room will, they say, become commonplace. But there is a problem. Although hardware makers are producing PCs, laptops and portable devices with ever-increasing processing power, the software industry is falling behind in its capacity to write programs that can make use of all this power.

Everyone is familiar with how Intel, AMD and other chipmakers churn out faster and faster processors. But in the past few years the design of these chips has changed. Instead of making chips faster by making their components smaller and running them at higher speeds, makers have started building multiple processing engines, or “cores”, onto each chip. Each core can run at a lower speed, which requires less energy and produces less heat, and the overall number-crunching power of the chip continues to increase.

But this change requires programmers to write code that can split the processing tasks efficiently between the cores. Such “parallel programming” is a classic problem in computer science, but not enough programmers have mastered the necessary techniques. Even so, the chipmakers have no intention of slowing down. Dual-core and four-core chips are already available, and Intel plans to launch six-core chips later this year. Chips with even more cores will follow in 2009.
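To make the idea concrete (this example is not from the article), here is a minimal sketch of parallel programming in Python, assuming only the standard multiprocessing module: a simple numeric job is split into chunks, one worker process per core crunches its chunk, and the partial results are combined at the end.

# A minimal sketch of parallel programming: splitting one job across all
# available cores. Uses only Python's standard library.
from multiprocessing import Pool, cpu_count

def sum_of_squares(chunk):
    # Number-crunching on one slice of the data; runs on a single core.
    return sum(n * n for n in chunk)

if __name__ == "__main__":
    data = list(range(10_000_000))
    cores = cpu_count()  # e.g. 2, 4 or 6, depending on the chip

    # Split the data into roughly equal chunks, one per core.
    size = -(-len(data) // cores)  # ceiling division
    chunks = [data[i:i + size] for i in range(0, len(data), size)]

    # Each chunk is handled by a separate worker process in parallel.
    with Pool(processes=cores) as pool:
        partial = pool.map(sum_of_squares, chunks)

    print(sum(partial))  # combine the partial results

The hard part, as the article suggests, is that most real programs (video, graphics, simulations) do not split into independent chunks this neatly, which is why the technique remains a classic problem in computer science.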

To help their colleagues in the software industry catch up, companies are dipping into their own funds. In March Microsoft and Intel teamed up to give $10m each to the University of California at Berkeley and the University of Illinois to finance work on parallel programming. At Berkeley, researchers will develop new types of software for computers and mobile devices. This could include a browser for mobile phones that can handle demanding video applications. Almost 50 researchers at Illinois will tackle similar projects.

Intel, Sun Microsystems, NVIDIA, AMD, HP and IBM are paying for a similar effort at Stanford University. A centre called the Pervasive Parallelism Lab has been created at Stanford to bring together software developers working on parallel programming. One of its first tasks is to create a programming framework that can be applied to virtual worlds, robotics and the analysis of vast amounts of scientific and financial data.

The virtual-world research will, in theory, produce online destinations with graphics and interactive capabilities as good as those from today's video-game consoles. The robotics research will try to create more life-like systems. The output of all three schools will be published under licences that enable others to build upon their work.

Similar gaps between the performance of processors and software have arisen in the past. Each time, the software industry has eventually caught up, thanks to better and more sophisticated programs. The hardware firms are hoping their grants will help programmers catch up once again, by spreading the load—just as their processors are supposed to do.
