Monday, March 22, 2010

Cisco's CRS-3 router

Cisco's CRS-3 router made a bit of a splash when it was announced on March 9, but the power of this new device hasn't yet sunk in. Consider: The CRS-3, a network routing system, is able to stream every film ever made, from Hollywood to Bombay, in under four minutes. That's right — the whole universe of films digested in less time than it takes to boil an egg. That may sound like good news for consumers, but it could be the business equivalent of an earthquake for the likes of Universal Studios and Paramount Pictures.

Most people are familiar with routers, or desktop boxes used to provide connectivity between PCs, laptops and printers in a home or small office. These are tiny geckos compared with the T. rexes used by telcos such as Verizon and AT&T to distribute data among computer networks and provide Internet connectivity to millions of homes and wireless subscribers.

As it turns out, these megarouters sitting inside the data centers of major telcos and cable companies are among the Internet's biggest bottlenecks: bandwidth to end users has shot up in recent years, but router technology has not kept pace, resulting in traffic jams that can slow or freeze downloads.

Cisco's superrouter is expected to turn what is now the equivalent of a country road into an eight-lane superhighway for Internet data traffic, including 3-D video, university lectures and feature films such as Harry Potter and the Half-Blood Prince and The Twilight Saga: New Moon. "Video is the big driver behind all this," says analyst Akshay Sharma of technology-research company Gartner Inc., noting that voice and texting will soon be overtaken by richer multimedia content and applications.

While it's already possible to stream a feature film in real time, in the best-case scenario it takes about two hours to download to a personal film archive, at home or on a mobile device, for repeat viewing. With the predictable slowdowns and interruptions now so common, the process can eat up four hours or more of computer time — to say nothing of time lost managing the process.

But routers are not the only cause of bottlenecks, and Cisco is not alone in working to maximize the Internet's full potential. Google is also concerned about the speed limitations imposed by wires that run to the home. Last month, Google, best known for its search engine, announced plans to test ultra-high-speed broadband networks that would deliver Internet content to residential subscribers at speeds of 1 gigabit per second — 100 times as fast as the top speed available today. This would allow consumers to complete a PC download of a Hollywood blockbuster like Avatar in about 72 seconds.
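The arithmetic behind the 72-second figure is easy to check. Here is a rough sketch in Python; the article gives only the speed and the time, so the 9 GB file size for an HD copy of the film is an assumption chosen to make the numbers work:

```python
# Back-of-the-envelope check of the download-time claim.

def transfer_seconds(file_gigabytes: float, link_gigabits_per_s: float) -> float:
    """Time to move a file of the given size over an ideal link."""
    bits = file_gigabytes * 8 * 1e9           # decimal gigabytes -> bits
    return bits / (link_gigabits_per_s * 1e9)  # divide by link rate in bits/s

# An assumed 9 GB film over a 1 Gbit/s pipe:
print(round(transfer_seconds(9, 1)))      # 72 seconds, matching the article
# The same film at one-hundredth of that speed:
print(round(transfer_seconds(9, 0.01)))   # 7200 seconds, i.e. about two hours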

"If Google has real success with this trial, it will percolate, and people will need to copy it," says Sharma, who is based in Fort Lauderdale, Fla. However, such a quantum leap in bandwidth would need the support of Cisco-style routers in the background to deliver on its promise beyond the pilot stage.

The ability to download albums and films in a matter of seconds is a harbinger of deep trouble for the Motion Picture Association of America (MPAA) and the Recording Industry Association of America (RIAA), which would prefer to turn the clock back, way back.

Consider that the MPAA, whose members include Disney and Universal, attacked the VCR in congressional hearings in the 1980s with a Darth Vader–like zeal, predicting box-office receipts would collapse if consumers were allowed to freely share and copy VHS tapes of Hollywood movies. A decade later, the MPAA fought to block the DVD revolution, mainly because digital media could be copied and distributed even more easily than videocassettes.

Today the film and recording industries maintain an iron grip over distribution of their intellectual property through megaplexes and national retailers such as Best Buy, Tower Records and Walmart. These bricks-and-mortar distribution channels take a share of the profit, but they provide a steady and predictable stream of revenue.

By contrast, studios and music labels have experienced limited success and even less profitability in the few instances when they have grudgingly embraced the Internet bogeyman. The prospect of tying their future success to online distribution scares them because it means they will need to develop new distribution and pricing models. (For example, Netflix can stream an unlimited number of Hollywood films for a monthly subscription fee, but this does not include new releases.) They will also need to figure out how to stop people from setting up clone video and music stores with pirated content.

The MPAA declined to comment specifically on the Cisco breakthrough but said it supports technological innovation. Meanwhile, both the MPAA and the RIAA continue to fight emerging technologies like peer-to-peer file sharing with costly court battles rather than figuring out how to appeal to the next generation of movie enthusiasts and still make a buck. These younger consumers prefer to shop for movies online, watch them at their leisure on mobile devices and desktops and share them with friends. The studios and music labels have to figure out how to fit into that lifestyle, or else risk becoming obsolete.

The hard fact is that the latest developments at Cisco, Google and elsewhere may do more than kill the DVD and CD and further upset entertainment-business models that have changed little since the Mesozoic Era. With superfast streaming and downloading, indie filmmakers will soon be able to effectively distribute feature films online and promote them using social media such as Facebook and Twitter.

The upshot is that the high castle walls built over the past 100 years by the film industry to establish privilege and protect monopolistic profits may soon come tumbling down, just as they have for the music industry. In keeping with the old storyline, the nimble David looks set to vanquish the myopic and overconfident Goliath.




It is time to stop thinking of cyberspace as a new medium or an agglomeration of new media. It is a new continent, rich in resources but in parts most perilous. Until 30 years ago, it had lain undiscovered, unmined and uninhabited.

The first settlers were idealists and pioneers who set out from San José, Boston and Seattle before sending back messages about the exciting virgin lands that awaited humanity in the realm of the net. They were quickly followed by chancers and adventurers who were able to make fortunes by devising their own version of the South Sea Bubble.

It was inevitable that the wondrous materials found all over this territory would attract the interest of nation states. Now, the scramble for cyberspace has begun. Military and intelligence agencies are already staking their claim for the web's high ground as civilian powers lay down boundaries to define what belongs to whom and who is allowed to wander where.

Cyberspace is being nationalised rapidly. In some parts of the world, this has been going on for a while. Russia has been running a programme known by the delightfully sinister acronym Sorm-2 (System of operational investigative activities) since the late 1990s. This ensures that a copy of every single data byte that goes into, out of or around the country ends up in a vast storage vault run by the Federal Security Service. You can read about atrocities committed in Chechnya if you wish but you can be confident that somebody will be looking over your digital shoulder.

China, of course, has its “great firewall”, filtering politically incorrect sites along with pornography and other forms of cultural contamination. But of even greater import is China's demand, effectively conceded, that the US relinquish control of the internet's language and domain names through the Californian non-profit organisation Icann. This is being transformed into a United Nations-style regulatory operation. China will soon have absolute say over the internet's structure within its borders.

The legal mapping of cyberspace in the west is more chaotic. But we are now witnessing the establishment of myriad laws and rules by legislators and in the courts. In a hearing this week at Blackfriars Crown Court in London following a major cybercrime trial, Harendra de Silva QC put his finger on it when he argued that “we are entering a world where almost any human interaction of any kind will require use of the internet”.

So while there is clearly a pressing need to define rules that apply in cyberspace, they are emerging at speed with little coherent strategy behind them. Nobody knows where this process will lead for two central reasons. The speed of technological change means that the traditional tools of state used to carve up the world in the 19th century, such as laws and treaties, are often inadequate, if not entirely irrelevant, when applied to this new domain.

Law enforcement agencies such as the FBI and the Serious Organised Crime Agency in Britain have invested considerable time and money in bringing down criminal networks on the web. But as the Internet Crime Complaints Centre in the US has just reported, the losses from cybercrime continue to climb at a staggering rate because criminals adapt at lightning speed to new policing methods.

In the commercial world, major legislation concerning copyright, such as Britain's Digital Economy Bill, is unlikely to withstand the second great variable – the coming of age of the net generation. Laws banning file-sharing are likely to prove as unpopular as the poll tax that helped bring down the Thatcher government. They also look utterly unenforceable.

As a harbinger of change, we are seeing political parties springing up throughout Europe with names such as the Internet party or the Pirate party, which understand the web as simply part of human DNA. “In the collision between the old and the new on the web,” argues Rex Hughes, a Chatham House fellow who is leading a cybersecurity project, “the old always wins the first few rounds but eventually they die off.”

But the greatest battle is happening in the area of cyberwarfare and cyberespionage. Symbolically, the US designated cyberspace as the “Fifth Domain” last June and the first man-made one after land, sea, air and space. Nato lawyers are trying to work out how the laws of war operate in cyberspace. Hysteria is accompanying this new arms race, as when Admiral Mike McConnell, former director of US National Intelligence, claimed at a Senate hearing last month that “if the nation went to war today in a cyberwar, we would lose”.

Meanwhile, the phenomenon of “anonymisation”, so useful for cybercrime, is a gift to intelligence agencies as they sniff into every corner of the web to find out who is up to what.

None of this would amount to a hill of beans were it not for Mr de Silva's point that everything we do is somehow mediated by the web. Governments are becoming obsessed about the need to control the internet but have yet to work out how to do this without suffocating the noble goal of those pioneers who merely wanted to facilitate communication between ordinary people. Heaven forbid!

The writer, Misha Glenny, is the author of McMafia: A Journey through the Global Criminal Underworld





Tuesday, March 16, 2010


A first for Japan! Tokyo opens solar-powered bicycle sheds

[Compiled by Cheng Hsiao-lan / wire reports] Solar-powered electric-bicycle rental stations (see photo, AFP) set up jointly by the major Japanese electronics maker Sanyo Electric and Tokyo's Setagaya ward officially opened on March 16. At the two stations, both located near train stations, everything from the bicycles to the night lighting is powered entirely by solar panels, giving residents a green new way to travel to the station or tour nearby sights.

Sanyo Electric spent about 90 million yen (roughly NT$31.6 million) to install 36 solar panels, each 142 centimeters long and 89 centimeters wide, on the roofs of the bicycle lots at Sakurajosui Station on the Keio Line and Sakura-shinmachi Station on the Tokyu Den-en-toshi Line. The 36 panels have a maximum output of 7.56 kilowatts. Each of the two lots holds 40 electric bicycles, while a third bicycle lot at Kyodo Station on the Odakyu Line has 20 electric bicycles.



Friday, March 12, 2010

Driving by the Numbers



Published: March 11, 2010

Cambridge, Mass.

IN the wake of the Congressional hearings on the Toyota recalls, we have heard various proposals for countering unintended acceleration in automobiles.

Transportation Secretary Ray LaHood recently said the federal government may recommend that carmakers install “smart pedals” that give brakes priority when both brake and accelerator pedals are pressed simultaneously. Meanwhile, Toyota has said that, in contested acceleration accidents, it will give regulators access codes to data recorders — essentially, onboard black boxes being installed in some new cars.

But sometimes the solution to a safety problem is simply more transparency. Indeed, there is a relatively easy solution that would help identify problems before they affect thousands of cars, or kill and injure dozens of people: allow drivers and carmakers real-time access to the data that’s already being monitored.

Current federal law requires annual emissions and safety inspections for all cars. A mechanic plugs an electronic reader into what’s known as the onboard diagnostic unit, a computer that sits under your dashboard, monitoring data on acceleration, emissions, fuel levels and engine problems. The mechanic can then download the data to his own computer and analyze it.
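For a sense of what that reader does, here is a minimal decoding sketch. The two conversion formulas (engine RPM and vehicle speed) are the standard OBD-II "mode 01" ones from SAE J1979; the hex frames below are invented examples, and a real reader talks to the port over a hardware interface rather than parsing strings:

```python
# Decode raw OBD-II mode-01 response frames of the kind the onboard
# diagnostic unit emits. Frame layout: response mode byte (0x41),
# PID byte, then the data payload.

def decode_obd(frame: str):
    """Decode a mode-01 response frame like '41 0C 1A F8'."""
    data = [int(b, 16) for b in frame.split()]
    mode, pid, payload = data[0], data[1], data[2:]
    assert mode == 0x41, "not a mode-01 response"
    if pid == 0x0C:  # engine RPM: (256*A + B) / 4
        return "rpm", (256 * payload[0] + payload[1]) / 4
    if pid == 0x0D:  # vehicle speed: A, already in km/h
        return "speed_kmh", float(payload[0])
    raise ValueError(f"PID {pid:#04x} not handled in this sketch")

print(decode_obd("41 0C 1A F8"))   # ('rpm', 1726.0)
print(decode_obd("41 0D 3C"))      # ('speed_kmh', 60.0)
```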

Because carmakers believe such diagnostic data to be their property, much of it is accessible only by the manufacturer and authorized dealers and their mechanics. And even then, only a small amount of the data is available — most cars’ computers don’t store data, they only monitor it. Though newer Toyotas have data recorders that gather information in the moments before an air bag is deployed, the carmaker has been frustratingly vague about what kind of data is collected (other manufacturers have been more forthcoming).

But what if a car’s entire data stream was made available to drivers in real time? You could use, for instance, a hypothetical “analyze-my-drive” application for your smart phone to tell you when it was time to change the oil or why your “check engine” light was on. The application could tell you how many miles you were getting to the gallon, and how much yesterday’s commute cost you in time, fuel and emissions. It could even tell you, say, that your spouse’s trips to the grocery store were 20 percent more fuel-efficient than yours.
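A sliver of that hypothetical "analyze-my-drive" application might look like the sketch below; the trip figures and fuel price are invented purely for illustration:

```python
# Summarize one day's driving from logged totals: fuel economy,
# fuel cost and time spent -- the kind of report the imagined
# app would derive from the car's data stream.

def commute_report(miles: float, gallons: float, minutes: float,
                   dollars_per_gallon: float) -> dict:
    return {
        "mpg": round(miles / gallons, 1),
        "fuel_cost": round(gallons * dollars_per_gallon, 2),
        "minutes": minutes,
    }

print(commute_report(miles=24, gallons=1.2, minutes=55, dollars_per_gallon=2.80))
# {'mpg': 20.0, 'fuel_cost': 3.36, 'minutes': 55}
```

Comparing two such reports is all it would take to tell you whose grocery runs are 20 percent more fuel-efficient.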

Carmakers could collect the data, too. Aberrant engine and driving behavior would leap out of the carmakers’ now-large data set, allowing them, if necessary, to conduct recalls much earlier. And, in exchange for your contribution of anonymous data, carmakers could send you driving benchmarks aggregated from your peers; then your app could tell you how your driving compares with the average of all drivers of the same car.

Having such readily accessible data streaming from your car might raise fears of a Big Brother scenario, in which carmakers would know where you are and how you are using (or misusing) your vehicle. But you would still decide whether you wanted to tap into the data, how you would use it and with whom you’d share it.

Allowing drivers and carmakers access to real-time performance data wouldn’t prevent every future mechanical failure. But it would allow carmakers and entrepreneurs to develop analytical tools to help catch developing problems in both individual cars and entire model lines. Cars would continue to break down and even cause accidents, but it wouldn’t take a Congressional hearing to figure out why.

Robin Chase, the founder and former chief executive of Zipcar, is on the Intelligent Transportation Systems Program Advisory Committee for the United States Department of Transportation.

Saturday, March 6, 2010

A sheet of glass: spherical solar cells



Photo: A sheet of glass embedded with spherical solar cells developed by Kyosemi Corp. attracts attention at a Tokyo trade show. (NOBORU TOMURA/ THE ASAHI SHIMBUN)

One of the brightest stars at a trade show for solar power and fuel-cell technology at the Tokyo Big Sight convention center turned out to be one of the smallest there.

Shown off in the three-day event were spherical solar cells with a diameter of 1.8 millimeters developed by Kyoto-based semiconductor maker Kyosemi Corp.

The Sphelar solar modules can be embedded in large numbers in window glass or sheets of flexible plastic.

Among other popular items were solar-charged panels that display in eight colors, including gold, purple and green. They were developed by Taiwanese maker Jintec Corp.

About 1,300 companies exhibited their products in the show, which ended Friday.

Wednesday, March 3, 2010

The data deluge


Businesses, governments and society are only starting to tap its vast potential

EIGHTEEN months ago, Li & Fung, a firm that manages supply chains for retailers, saw 100 gigabytes of information flow through its network each day. Now the amount has increased tenfold. During 2009, American drone aircraft flying over Iraq and Afghanistan sent back around 24 years' worth of video footage. New models being deployed this year will produce ten times as many data streams as their predecessors, and those in 2011 will produce 30 times as many.

Everywhere you look, the quantity of information in the world is soaring. According to one estimate, mankind created 150 exabytes (billion gigabytes) of data in 2005. This year, it will create 1,200 exabytes. Merely keeping up with this flood, and storing the bits that might be useful, is difficult enough. Analysing it, to spot patterns and extract useful information, is harder still. Even so, the data deluge is already starting to transform business, government, science and everyday life (see our special report in this issue). It has great potential for good—as long as consumers, companies and governments make the right choices about when to restrict the flow of data, and when to encourage it.
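The article's own estimates already imply a striking growth rate. A one-line check, assuming smooth compound growth between 2005 and 2010:

```python
# 150 exabytes created in 2005 versus 1,200 in 2010 implies
# roughly 50% compound growth per year over the five years.
growth = (1200 / 150) ** (1 / 5) - 1
print(f"{growth:.0%} per year")   # 52% per year
```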

Plucking the diamond from the waste

A few industries have led the way in their ability to gather and exploit data. Credit-card companies monitor every purchase and can identify fraudulent ones with a high degree of accuracy, using rules derived by crunching through billions of transactions. Stolen credit cards are more likely to be used to buy hard liquor than wine, for example, because it is easier to fence. Insurance firms are also good at combining clues to spot suspicious claims: fraudulent claims are more likely to be made on a Monday than a Tuesday, since policyholders who stage accidents tend to assemble friends as false witnesses over the weekend. By combining many such rules, it is possible to work out which cards are likeliest to have been stolen, and which claims are dodgy.
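A toy sketch of that rule-combining idea is below; the rules and weights are invented for illustration, and real issuers derive far richer models by crunching through billions of transactions:

```python
# Score a transaction by summing the weights of every fraud rule
# it triggers; higher totals rank as likelier to be fraudulent.

RULES = {
    "hard_liquor_purchase": 2.0,   # easier to fence than wine
    "filed_on_monday":      1.5,   # staged claims cluster on Mondays
    "far_from_home":        1.0,
}

def fraud_score(transaction: set) -> float:
    """Sum the weights of every rule the transaction triggers."""
    return sum(w for rule, w in RULES.items() if rule in transaction)

print(fraud_score({"hard_liquor_purchase", "far_from_home"}))  # 3.0
print(fraud_score({"wine_purchase"}))                          # 0
```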

Mobile-phone operators, meanwhile, analyse subscribers' calling patterns to determine, for example, whether most of their frequent contacts are on a rival network. If that rival network is offering an attractive promotion that might cause the subscriber to defect, he or she can then be offered an incentive to stay. Older industries crunch data with just as much enthusiasm as new ones these days. Retailers, offline as well as online, are masters of data mining (or "business intelligence", as it is now known). By analysing "basket data", supermarkets can tailor promotions to particular customers' preferences. The oil industry uses supercomputers to trawl seismic data before drilling wells. And astronomers are just as likely to point a software query-tool at a digital sky survey as to point a telescope at the stars.
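A toy version of that basket-data mining simply counts which items are bought together; the baskets here are invented examples:

```python
# Count co-purchased item pairs across shopping baskets -- the raw
# material a supermarket uses to tailor promotions.
from collections import Counter
from itertools import combinations

baskets = [
    {"bread", "butter", "jam"},
    {"bread", "butter"},
    {"bread", "milk"},
]

pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pair is a natural promotion target:
print(pair_counts.most_common(1))   # [(('bread', 'butter'), 2)]
```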

There's much further to go. Despite years of effort, law-enforcement and intelligence agencies' databases are not, by and large, linked. In health care, the digitisation of records would make it much easier to spot and monitor health trends and evaluate the effectiveness of different treatments. But large-scale efforts to computerise health records tend to run into bureaucratic, technical and ethical problems. Online advertising is already far more accurately targeted than the offline sort, but there is scope for even greater personalisation. Advertisers would then be willing to pay more, which would in turn mean that consumers prepared to opt into such things could be offered a richer and broader range of free online services. And governments are belatedly coming around to the idea of putting more information—such as crime figures, maps, details of government contracts or statistics about the performance of public services—into the public domain. People can then reuse this information in novel ways to build businesses and hold elected officials to account. Companies that grasp these new opportunities, or provide the tools for others to do so, will prosper. Business intelligence is one of the fastest-growing parts of the software industry.

Now for the bad news

But the data deluge also poses risks. Examples abound of databases being stolen: disks full of social-security data go missing, laptops loaded with tax records are left in taxis, credit-card numbers are stolen from online retailers. The result is privacy breaches, identity theft and fraud. Privacy infringements are also possible even without such foul play: witness the periodic fusses when Facebook or Google unexpectedly change the privacy settings on their online social networks, causing members to reveal personal information unwittingly. A more sinister threat comes from Big Brotherishness of various kinds, particularly when governments compel companies to hand over personal information about their customers. Rather than owning and controlling their own personal data, people very often find that they have lost control of it.

The best way to deal with these drawbacks of the data deluge is, paradoxically, to make more data available in the right way, by requiring greater transparency in several areas. First, users should be given greater access to and control over the information held about them, including whom it is shared with. Google allows users to see what information it holds about them, and lets them delete their search histories or modify the targeting of advertising, for example. Second, organisations should be required to disclose details of security breaches, as is already the case in some parts of the world, to encourage bosses to take information security more seriously. Third, organisations should be subject to an annual security audit, with the resulting grade made public (though details of any problems exposed would not be). This would encourage companies to keep their security measures up to date.

Market incentives will then come into play as organisations that manage data well are favoured over those that do not. Greater transparency in these three areas would improve security and give people more control over their data without the need for intricate regulation that could stifle innovation. After all, the process of learning to cope with the data deluge, and working out how best to tap it, has only just begun.

Smart house, battery-powered boat, plastic-sorting robot developed

Plastic sorting robot developed



Photo: Visitors to a supermarket in Ikoma, Nara Prefecture, watch a robot sort plastic bottles by material. (HIDEAKI ISHIYAMA/ THE ASAHI SHIMBUN)

Researchers have developed a robot that can sort plastics in garbage using five different types of laser beams.

The robot, which was developed in a joint project by Osaka University, IDEC Corp. and Mitsubishi Electric Engineering Corp., uses lasers with differing wavelengths to distinguish between types of plastic. It can also help in sorting plastic bottles.

A spokesperson for the research group said the device, measuring 1.7 meters by 2.1 meters, can distinguish between most types of plastic currently in use, including polyethylene and polypropylene. Sorting different plastics is a major hurdle to effective recycling.

The robot was recently put on trial in Ikoma, Nara Prefecture. The group aims to have it on the market in the near future.


A team of academic and private-sector researchers is jointly developing a battery-powered boat that is expected to emit only half the carbon dioxide of a vessel that operates on diesel.

The team said Monday that test cruises are to begin as early as summer.

Under the project by the Tokyo University of Marine Science and Technology, Yamaha Motor Co., Tokyo Electric Power Co. and other companies, the vessel will be equipped with a rapid charger, similar to those used by electric vehicles.

Operating as a water taxi, the 10-meter-long craft will carry up to 10 passengers for short-distance travel in Tokyo Bay. It will be able to operate for 45 minutes between charges at full speed.

The researchers will evaluate the boat's performance on a 7-kilometer, 20-minute route between the university's two campuses facing the bay in July or later.

Efforts to reduce carbon dioxide, which has been blamed for global warming, come at a price. The battery-powered boat costs 50 percent more to build than a conventional boat.

OSAKA--The house of the future will spew out 80 percent less carbon dioxide by using a combination of solar panels, fuel cells and storage batteries, according to a study by a home builder and a gas supplier.

Officials at Sekisui House Ltd. and Osaka Gas Co. said they are working to come up with a marketable "smart house" by 2015 at the earliest, based on data gathered from the government-commissioned study.

The experiment found that annual carbon dioxide emissions from a 150-square-meter house can be cut by up to 5 tons, or 80 percent, through a combination of green technologies. It used a two-story prototype in Kizugawa, Kyoto Prefecture, which is capable of generating electricity around the clock by using solar and fuel cells while saving surplus energy in storage batteries.

The industry ministry has set a goal of cutting carbon dioxide emissions from households by half in the future.

Tuesday, March 2, 2010

Microsoft wins against spammer; It's Not Just Microsoft Against Google

Computer giant Microsoft has emerged triumphant in its battle against a major source of internet spam.

A US court ruled that Microsoft can shut down the Waledac botnet, a network of hacker-controlled computers.

The firm says this botnet can send out up to 1.5 billion unwanted emails a day.

But is the court ruling a victory for all internet users?

Hugh Thompson is a New York-based internet security expert and Program Committee Chair of the RSA Conference, the world's biggest gathering of security professionals.



Media Cache
It's Not Just Microsoft Against Google

Published: March 1, 2010

PARIS — After a 30-year career in the law, Dominique Barella left his job as president of the main union for French judges in 2006 and started a Web site, Ejustice.fr, that lets users search for legal resources in France.

Mr. Barella developed the site, including the search technology, with an investment of about €20,000, or $27,000, and help from a friend who is an engineer. Aside from a video interview in which an unsmiling Mr. Barella explains how Ejustice.fr works, the site is short on bells and whistles. Nonetheless, within a few months, it was attracting up to 20,000 visitors a day and selling a modest amount of advertising.

That, Mr. Barella says, is when the trouble with Google began.

Overnight, traffic plunged — because, Mr. Barella says, the company stopped indexing pages from Ejustice.fr for inclusion in Google’s search engine.

“We asked Google, ‘Why are you doing this? We have no more money,”’ Mr. Barella said. “They didn’t want to work with us, they didn’t want to help us, they didn’t want us to exist.”

More than three years later, with traffic to Ejustice.fr stuck at about 700 users a day, Mr. Barella has taken his grievances to the European Commission in Brussels. Google disclosed last week that the commission had begun a preliminary investigation of antitrust complaints from Ejustice.fr and two companies, called Ciao and Foundem, that offer online price comparisons.

In its initial comments on the investigation, Google suggested that Microsoft, its archrival, lay behind its troubles in Brussels. Google noted that Foundem was a member of a Microsoft-financed lobbying group, and said Google’s relationship with Ciao had gone downhill only after that company was acquired by Microsoft in 2008.

Mr. Barella, 54, says he is no Microsoft lackey. “They want to say they are fighting Microsoft,” he said of Google. “But I have no connection to Microsoft. Perhaps on my computer I have Microsoft Windows — that’s my only connection.”

Yet Mr. Barella’s complaints do echo those of Foundem, which says it, too, was penalized by Google because it offered competing services.

Both Foundem and Ejustice.fr are so-called vertical search engines, which provide searches focused on a particular subject, rather than the broader, Web-wide sweep that Google offers. Foundem says it was downgraded by Google several years ago, which pushed it into Web obscurity, though it says it has since managed to persuade Google to restore its previous search status.

Google declined to comment on Mr. Barella’s specific allegations. The company has said that it penalizes some, but not all, vertical search engines because they are essentially spam, gathering content and links from other sites to generate traffic and ad revenue.

“In order to maintain the high quality of Google search, we flag or remove sites that we detect have malware and viruses or don’t comply with Google’s quality guidelines,” it said in a statement. “The guidelines under which we will take action are publicly documented, and this is standard industry practice among search engines.”

Mr. Barella insists that he did not set up Ejustice.fr to make a fast euro. Even on its best days, he says, the site was barely generating enough revenue to cover the cost of his computers.

He is especially bitter because he says that after meeting with Google, he took the company’s advice and replaced his own search technology with custom software provided by Google. But he says Ejustice.fr’s woes only deepened.

Mr. Barella portrayed his fight with the search giant as an effort to defend free speech — borrowing a theme that Google itself has sounded in its standoff with the Chinese government over censorship, and in an adverse court ruling in Italy, where three Google executives last week were convicted of violating privacy laws.

“How could we be a problem for Google? It’s a joke,” Mr. Barella said. “I think we were an example among other examples that Google wanted to make.”

Human Culture, an Evolutionary Force

Genes enabling lactose tolerance, which probably resulted in more surviving offspring, were detected in cultures like this Kenyan shepherd’s. (Per-Anders Pettersson/Getty Images)

Published: March 1, 2010

As with any other species, human populations are shaped by the usual forces of natural selection, like famine, disease or climate. A new force is now coming into focus. It is one with a surprising implication — that for the last 20,000 years or so, people have inadvertently been shaping their own evolution.

Maasai tribesmen are among the cultures with adult lactose tolerance. (Radu Sigheti/Reuters)

The force is human culture, broadly defined as any learned behavior, including technology. The evidence of its activity is the more surprising because culture has long seemed to play just the opposite role. Biologists have seen it as a shield that protects people from the full force of other selective pressures, since clothes and shelter dull the bite of cold and farming helps build surpluses to ride out famine.

Because of this buffering action, culture was thought to have blunted the rate of human evolution, or even brought it to a halt, in the distant past. Many biologists are now seeing the role of culture in a quite different light.

Although it does shield people from other forces, culture itself seems to be a powerful force of natural selection. People adapt genetically to sustained cultural changes, like new diets. And this interaction works more quickly than other selective forces, “leading some practitioners to argue that gene-culture co-evolution could be the dominant mode of human evolution,” Kevin N. Laland and colleagues wrote in the February issue of Nature Reviews Genetics. Dr. Laland is an evolutionary biologist at the University of St. Andrews in Scotland.

The idea that genes and culture co-evolve has been around for several decades but has started to win converts only recently. Two leading proponents, Robert Boyd of the University of California, Los Angeles, and Peter J. Richerson of the University of California, Davis, have argued for years that genes and culture were intertwined in shaping human evolution. “It wasn’t like we were despised, just kind of ignored,” Dr. Boyd said. But in the last few years, references by other scientists to their writings have “gone up hugely,” he said.

The best evidence available to Dr. Boyd and Dr. Richerson for culture being a selective force was the lactose tolerance found in many northern Europeans. Most people switch off the gene that digests the lactose in milk shortly after they are weaned, but in northern Europeans — the descendants of an ancient cattle-rearing culture that emerged in the region some 6,000 years ago — the gene is kept switched on in adulthood.

Lactose tolerance is now well recognized as a case in which a cultural practice — drinking raw milk — has caused an evolutionary change in the human genome. Presumably the extra nutrition was of such great advantage that adults able to digest milk left more surviving offspring, and the genetic change swept through the population.

This instance of gene-culture interaction turns out to be far from unique. In the last few years, biologists have been able to scan the whole human genome for the signatures of genes undergoing selection. Such a signature is formed when one version of a gene becomes more common than other versions because its owners are leaving more surviving offspring. From the evidence of the scans, up to 10 percent of the genome — some 2,000 genes — shows signs of being under selective pressure.
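The dynamic behind such a signature — carriers of one version of a gene leaving more surviving offspring until that version dominates the population — can be sketched with a toy Wright-Fisher simulation. This is a standard population-genetics model, not anything described in the article; the population size, starting frequency, and selection strength below are illustrative assumptions:

```python
import random

def simulate_sweep(pop_size=1000, s=0.05, start_freq=0.1,
                   generations=500, seed=42):
    """Toy Wright-Fisher model of a favored allele.

    Each generation, carriers of the favored allele contribute
    proportionally more gene copies to the next generation
    (relative fitness 1 + s versus 1), and random drift is modeled
    by sampling the next generation's gene pool.
    """
    random.seed(seed)
    freq = start_freq
    for _ in range(generations):
        # Selection: weight the favored allele's contribution by 1 + s.
        weighted = freq * (1 + s)
        expected = weighted / (weighted + (1 - freq))
        # Drift: sample pop_size gene copies for the next generation.
        count = sum(1 for _ in range(pop_size) if random.random() < expected)
        freq = count / pop_size
        if freq in (0.0, 1.0):  # allele lost or fixed
            break
    return freq

# Under these assumed parameters, even a modest per-generation
# advantage lets the allele sweep to high frequency.
print(simulate_sweep())
```

Even a small fitness edge compounds quickly over evolutionary time, which is why advantages like adult milk digestion can sweep through a population in a few thousand years.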

These pressures are all recent, in evolutionary terms — most probably dating from around 10,000 to 20,000 years ago, in the view of Mark Stoneking, a geneticist at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Biologists can infer the reason for these selective forces from the kinds of genes that are tagged by the genome scans. The roles of most of the 20,000 or so genes in the human genome are still poorly understood, but all can be assigned to broad categories of likely function depending on the physical structure of the protein they specify.

By this criterion, many of the genes under selection seem to be responding to conventional pressures. Some are involved in the immune system, and presumably became more common because of the protection they provided against disease. Genes that cause paler skin in Europeans or Asians are probably a response to geography and climate.

But other genes seem to have been favored because of cultural changes. These include many genes involved in diet and metabolism and presumably reflect the major shift in diet that occurred in the transition from foraging to agriculture that started about 10,000 years ago.

Amylase is an enzyme in the saliva that breaks down starch. People who live in agrarian societies eat more starch and have extra copies of the amylase gene compared with people who live in societies that depend on hunting or fishing. Genetic changes that enable lactose tolerance have been detected not just in Europeans but also in three African pastoral societies. In each of the four cases, a different mutation is involved, but all have the same result — that of preventing the lactose-digesting gene from being switched off after weaning.

Many genes for taste and smell show signs of selective pressure, perhaps reflecting the change in foodstuffs as people moved from nomadic to sedentary existence. Another group under pressure is that of genes that affect the growth of bone. These could reflect the declining weight of the human skeleton that seems to have accompanied the switch to settled life, which started some 15,000 years ago.

A third group of selected genes affects brain function. The role of these genes is unknown, but they could have changed in response to the social transition as people moved from small hunter-gatherer groups a hundred strong to villages and towns inhabited by several thousand, Dr. Laland said. “It’s highly plausible that some of these changes are a response to aggregation, to living in larger communities,” he said.

Though the genome scans certainly suggest that many human genes have been shaped by cultural forces, the tests for selection are purely statistical, being based on measures of whether a gene has become more common. To verify that a gene has indeed been under selection, biologists need to perform other tests, like comparing the selected and unselected forms of the gene to see how they differ.

Dr. Stoneking and his colleagues have done this with three genes that score high in statistical tests of selection. One of the genes they looked at, called the EDAR gene, is known to be involved in controlling the growth of hair. A variant form of the EDAR gene is very common in East Asians and Native Americans, and is probably the reason that these populations have thicker hair than Europeans or Africans.

Still, it is not obvious why this variant of the EDAR gene was favored. Possibly thicker hair was in itself an advantage, retaining heat in Siberian climates. Or the trait could have become common through sexual selection, because people found it attractive in their partners.

A third possibility comes from the fact that the gene works by activating a gene regulator that controls the immune system as well as hair growth. So the gene could have been favored because it conferred protection against some disease, with thicker hair being swept along as a side effect. Or all three factors could have been at work. “It’s one of the cases we know most about, and yet there’s a lot we don’t know,” Dr. Stoneking said.

The case of the EDAR gene shows how cautious biologists have to be in interpreting the signals of selection seen in the genome scans. But it also points to the potential of the selective signals for bringing to light salient events in human prehistory as modern humans dispersed from the ancestral homeland in northeast Africa and adapted to novel environments. “That’s the ultimate goal,” Dr. Stoneking said. “I come from the anthropological perspective, and we want to know what the story is.”

With archaic humans, culture changed very slowly. The style of stone tools called the Oldowan appeared 2.5 million years ago and stayed unchanged for more than a million years. The Acheulean stone tool kit that succeeded it lasted for 1.5 million years. But among behaviorally modern humans, those of the last 50,000 years, the tempo of cultural change has been far brisker. This raises the possibility that human evolution has been accelerating in the recent past under the impact of rapid shifts in culture.

Some biologists think this is a possibility, though one that awaits proof. The genome scans that test for selection have severe limitations. They cannot see the signatures of ancient selection, which get washed out by new mutations, so there is no base line by which to judge whether recent natural selection has been greater than in earlier times. There are also likely to be many false positives among the genes that seem favored.

But the scans also find it hard to detect weakly selected genes, so they may be picking up just a small fraction of the recent stresses on the genome. Mathematical models of gene-culture interaction suggest that this form of natural selection can be particularly rapid. Culture has become a force of natural selection, and if it should prove to be a major one, then human evolution may be accelerating as people adapt to pressures of their own creation.