Tuesday, April 28, 2009

A Look Back at Human Flu Pandemics

Current Affairs | 2009.04.28

As the swine flu outbreak spreads further, the World Health Organization has raised its swine flu alert to phase four, and WHO Director-General Margaret Chan has said the virus has the potential to break out worldwide. Experts point out that a global pandemic can erupt when a novel virus spreads widely while most people still lack natural immunity to it. Since at least the 16th century, influenza pandemics have broken out again and again at irregular intervals.

The World Health Organization declared that the swine flu outbreak in North America constitutes a public health emergency of international concern and raised the human pandemic influenza alert from phase three to phase four, the stage at which sustained human-to-human transmission of the virus appears. The pandemic alert scale has six phases in all; the WHO said that moving to phase four means a human influenza pandemic has become more likely, but is not inevitable.

Over the past century, global influenza pandemics, what the World Health Organization calls human pandemic influenza, have broken out at least five times. In 1918-1919 came the Spanish flu, often called the deadliest flu in history. That virus, itself a strain of the H1N1 type, struck mainly young adults; it infected roughly 20 to 40 percent of the world's population at the time and killed at least 40 to 50 million people, nearly five times the death toll of the First World War. The virus was first identified in the United States, but it became known as the "Spanish flu" because the Spanish press covered the epidemic extensively, while the media of many other countries, under strict censorship during the First World War, could not report freely.

The second great global influenza outbreak of the 20th century was the so-called Asian flu of 1957. The virus, of the H2N2 type, was first identified in China. Early on it infected mainly children, but later it struck chiefly the elderly. The Asian flu killed roughly two million people worldwide.

The Hong Kong flu of 1968 was the mildest human pandemic of the last century. The H3N2 virus first appeared in Hong Kong and spread across the world over the following two years. Its victims were mainly the elderly, and the global death toll was about one million.

In the not quite ten years since the start of the 21st century, the world has already seen several epidemic outbreaks, but their severity and death tolls have been far lower than those of the 20th century. In November 2002, the first case of atypical pneumonia appeared in China's Guangdong province. This dangerous pneumonia, known as SARS, spread through much of China with great speed in the first half of 2003, and the cover-up by the Chinese health authorities and government agencies in the early stage of the outbreak caused widespread panic. According to the statistics, about 8,000 people worldwide were infected with the SARS virus and roughly 800 died, nearly 350 of them in China.

In 2003, the highly pathogenic H5N1 avian influenza virus spread from Vietnam to other countries in Southeast Asia, and China was affected to some degree as well. Bird flu has so far killed more than 250 people. Unlike the swine flu that has broken out in Mexico, the bird flu virus passes to humans only from birds; it does not spread directly from person to person.

Influenza outbreaks of varying severity appear almost every year. Precisely because flu viruses mutate so rapidly, they pose a serious challenge to the medical community's efforts to develop treatments and vaccines. According to the World Health Organization, three to five million people worldwide contract severe influenza every year, and roughly 250,000 to 500,000 of them die of it.

Author: 雨涵

Editor: 乐然

Friday, April 24, 2009

Milk protein clue to big babies

[Image: a baby being bottle-fed. How babies are fed is the subject of much debate.]

Breast milk has less protein than formula, which could be why bottle-fed babies grow faster, a study suggests.

There has been concern that formula-fed babies, who tend to be bigger, are "programmed" to store fat and so have a higher risk of childhood obesity.

The international study of 1,000 babies, published in the American Journal of Clinical Nutrition, suggests protein levels in formula should fall.

But UK manufacturers said action had already been taken to cut levels.

Measures

The study was carried out in Belgium, Italy, Germany, Poland and Spain on babies born between 2002 and 2004.

Parents were recruited to take part in the first few weeks of their babies' lives.

A third were given a low protein content formula milk (around 2g per 100kcal), a third had a formula with a higher level of protein (3-4g per 100kcal), while the rest were breast-fed during their first year.
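To make the per-100kcal figures concrete, here is a rough back-of-the-envelope calculation; the assumed daily intake of about 650 kcal is an illustration, not a figure from the study.

```python
# Rough daily protein intake implied by the study's formula arms.
# Assumption (not from the article): an infant drinks about 650 kcal/day.
DAILY_KCAL = 650

low = 2.0 * DAILY_KCAL / 100        # g/day on the low-protein formula
high_min = 3.0 * DAILY_KCAL / 100   # g/day, lower bound of the high-protein arm
high_max = 4.0 * DAILY_KCAL / 100   # g/day, upper bound of the high-protein arm

print(f"low-protein arm : ~{low:.0f} g protein/day")
print(f"high-protein arm: ~{high_min:.0f} to {high_max:.0f} g protein/day")
```

On that assumption the high-protein arm delivers roughly 7 to 13 g more protein per day, which is the gap the study links to faster weight gain.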

To qualify as breast-fed, babies had to be either exclusively given breast milk, or have a maximum of three bottles per week.

The infants were all then followed up to the age of two with regular weight, height and body mass index measurements taken.

At the age of two, there was no difference in height between the groups, but the high protein group were the heaviest.

The researchers suggest lower protein intakes in infancy might protect against later obesity.

The children are being followed up further to see whether those given the lower protein formulas have a reduced risk of obesity later on.

Changes needed?

Professor Berthold Koletzko, from the University of Munich, Germany, who led the study, said: "These results from the EU Childhood Obesity Programme underline the importance of promoting and supporting breastfeeding because of the long-term benefits it brings.

"They also highlight the importance of the continual development and improvement in the composition of infant formula.

"Limiting the protein content of infant and follow-on formula can normalise early growth and might contribute greatly to reducing the long-term risk of childhood overweight and obesity."

But writing in the American Journal of Clinical Nutrition, Dr Satish Kalhan of the Case Western Reserve University in Cleveland, US, said: "On the basis of these data, should we consider prescribing low protein formula to infants?

"The answer most likely is a categorical no."

A spokeswoman for the UK's Infant and Dietetic Food Association said companies had already reduced protein levels to well below those mentioned in the study.

She added: "The scientific evidence reviewing the role of infant formula in the development of obesity in later life is unclear.

"Most studies in this area are short-term and very few look at the long-term effect into adulthood."

But she added: "Clearly further research is required and this is an area we follow closely to ensure that the product we represent are based on generally accepted scientific evidence."

New infant growth charts, to be introduced in the UK this summer, have been changed so they relate more closely to the growth patterns of breast-fed babies.

Existing charts are based on a 1970s study into the growth patterns of formula-fed babies, and many breast-fed babies fall short - often causing concern to their parents and to health visitors.

Sunday, April 19, 2009

The adoption gap in certain technology products

In recent years I have not kept up with some of the technology products of countries like the US and Canada, such as Blackberry, Facebook and Twitter.
That gap includes moments like this one:
I still have vivid memories of many such moments: clicking on my first Web hyperlink in 1994 and instantly transporting to a page hosted on a server in Australia; using Google Earth to zoom in from space directly to the satellite image of my house; watching my 14-month-old master the page-flipping gesture on the iPhone's touch interface.

How the E-Book Will Change the Way We Read and Write

Medicine goes digital

A special report on health care and technology

Apr 16th 2009
From The Economist print edition

The convergence of biology and engineering is turning health care into an information industry. That will be disruptive, says Vijay Vaitheeswaran, but also hugely beneficial to patients


[Illustration by Otto Steininger]

INNOVATION and medicine go together. The ancient Egyptians are thought to have performed surgery back in 2750BC, and the Romans developed medical tools such as forceps and surgical needles. In modern times medicine has been transformed by waves of discovery that have brought marvels like antibiotics, vaccines and heart stents.

Given its history of innovation, the health-care sector has been surprisingly reluctant to embrace information technology (IT). Whereas every other big industry has computerised with gusto since the 1980s, doctors in most parts of the world still work mainly with pen and paper.


But now, in fits and starts, medicine is at long last catching up. As this special report will explain, it is likely to be transformed by the introduction of electronic health records that can be turned into searchable medical databases, providing a “smart grid” for medicine that will not only improve clinical practice but also help to revive drugs research. Developing countries are already using mobile phones to put a doctor into patients’ pockets. Devices and diagnostics are also going digital, advancing such long-heralded ideas as telemedicine, personal medical devices for the home and smart pills.

The first technological revolution in modern biology started when James Watson and Francis Crick described the structure of DNA half a century ago. That established the fields of molecular and cell biology, the basis of the biotechnology industry. The sequencing of the human genome nearly a decade ago set off a second revolution which has started to illuminate the origins of diseases.

The great convergence

Now the industry is convinced that a third revolution is under way: the convergence of biology and engineering. A recent report from the Massachusetts Institute of Technology (MIT) says that physical sciences have already been transformed by their adoption of information technology, advanced materials, imaging, nanotechnology and sophisticated modelling and simulation. Phillip Sharp, a Nobel prize-winner at that university, believes that those tools are about to be brought to bear on biology too.

Robert Langer, a biochemist at MIT who holds over 500 patents in biotechnology and medical technologies and has started or advised more than 100 new companies, thinks innovation in medical technologies is about to take off. Menno Prins of Philips, a Dutch multinational with a big medical-technology division, explains that, “like chemistry before it, biology is moving from a world of alchemy and ignorance to becoming a predictable, repeatable science.” Ajay Royyuru of IBM, an IT giant, argues that “it’s the transformation of biology into an information science from a discovery science.”

This special report will ask how much of this grand vision is likely to become reality. Some of the industry’s optimism appears to be well-founded. As the rich world gets older and sicker and the poor world gets wealthier and fatter, the market for medical innovations of all kinds is bound to grow. Clever technology can help solve two big problems in health care: overspending in the rich world and under-provisioning in the poor world.

But the chances are that this will take time, and turn out to be more of a reformation than a revolution. The hidebound health-care systems of the rich world may resist new technologies even as poor countries leapfrog ahead. There is already a backlash against genomics, which has been oversold to consumers as a deterministic science. And given soaring health-care costs, insurers and health systems may not want to adopt new technologies unless inventors can show conclusively that they will produce better outcomes and offer value for money.

If these obstacles can be overcome, then the biggest winner will be the patient. In the past medicine has taken a paternalistic stance, with the all-knowing physician dispensing wisdom from on high, but that is becoming increasingly untenable. Digitisation promises to connect doctors not only to everything they need to know about their patients but also to other doctors who have treated similar disorders.

The coming convergence of biology and engineering will be led by information technologies, which in medicine means the digitisation of medical records and the establishment of an intelligent network for sharing those records. That essential reform will enable many other big technological changes to be introduced.

Just as important, it can make that information available to the patients too, empowering them to play a bigger part in managing their own health affairs. This is controversial, and with good reason. Many doctors, and some patients, reckon they lack the knowledge to make informed decisions. But patients actually know a great deal about many diseases, especially chronic ones like diabetes and heart problems with which they often live for many years. The best way to deal with those is for individuals to take more responsibility for their own health and prevent problems before they require costly hospital visits. That means putting electronic health records directly into patients’ hands.

Saturday, April 18, 2009

Thursday, April 16, 2009

Genes Show Limited Value in Predicting Diseases

The era of personal genomic medicine may have to wait. The genetic analysis of common disease is turning out to be a lot more complex than expected.

[Photo: David B. Goldstein of Duke University is among the geneticists who are debating which path to follow in disease research. Credit: Ken Cedeno for The New York Times]

Since the human genome was decoded in 2003, researchers have been developing a powerful method for comparing the genomes of patients and healthy people, with the hope of pinpointing the DNA changes responsible for common diseases.

This method, called a genomewide association study, has proved technically successful despite many skeptics’ initial doubts. But it has been disappointing in that the kind of genetic variation it detects has turned out to explain surprisingly little of the genetic links to most diseases.

A set of commentaries in this week’s issue of The New England Journal of Medicine appears to be the first public attempt by scientists to make sense of this puzzling result.

One issue of debate among researchers is whether, despite the prospect of diminishing returns, to continue with the genomewide studies, which cost many millions of dollars apiece, or switch to a new approach like decoding the entire genomes of individual patients.

The unexpected impasse also affects companies that offer personal genomic information and that had assumed they could inform customers of their genetic risk for common diseases, based on researchers’ discoveries.

These companies are probably not performing any useful service at present, said David B. Goldstein, a Duke University geneticist who wrote one of the commentaries appearing in the journal.

“With only a few exceptions, what the genomics companies are doing right now is recreational genomics,” Dr. Goldstein said in an interview. “The information has little or in many cases no clinical relevance.”

Unlike the rare diseases caused by a change affecting only one gene, common diseases like cancer and diabetes are caused by a set of several genetic variations in each person. Since these common diseases generally strike later in life, after people have had children, the theory has been that natural selection is powerless to weed them out.

The problem addressed in the commentaries is that these diseases were expected to be promoted by genetic variations that are common in the population. More than 100 genomewide association studies, often involving thousands of patients in several countries, have now been completed for many diseases, and some common variants have been found. But in almost all cases they carry only a modest risk for the disease. Most of the genetic link to disease remains unexplained.

Dr. Goldstein argues that the genetic burden of common diseases must be mostly carried by large numbers of rare variants. In this theory, schizophrenia, say, would be caused by combinations of 1,000 rare genetic variants, not of 10 common genetic variants.

This would be bleak news for those who argue that the common variants detected so far, even if they explain only a small percentage of the risk, will nonetheless identify the biological pathways through which a disease emerges, and hence point to drugs that may correct the errant pathways. If hundreds of rare variants are involved in a disease, they may implicate too much of the body’s biochemistry to be useful.

“In pointing at everything,” Dr. Goldstein writes in the journal, “genetics would point at nothing.”

Two other geneticists, Peter Kraft and David J. Hunter of the Harvard School of Public Health, also writing in the journal, largely agree with Dr. Goldstein in concluding that probably many genetic variants, rather than few, “are responsible for the majority of the inherited risk of each common disease.”

But they disagree with his belief that there will be diminishing returns from more genomewide association studies.

“There will be more common variants to find,” Dr. Hunter said. “It would be unfortunate if we gave up now.”

Dr. Goldstein, however, said it was “beyond the grasp of the genomewide association studies” to find rare variants with small effects, even by recruiting enormous numbers of patients. He said resources should be switched away from these highly expensive studies, which in his view have now done their job.

“If you ask what is the fastest way for us to make progress in genetics that is clinically helpful,” he said, “I am absolutely certain it is to marshal our resources to interrogate full genomes, not in fine-tuning our analyses of common variations.”

He advocates decoding the full DNA of carefully selected patients.

Dr. Kraft and Dr. Hunter say that a person’s genetic risk of common diseases can be estimated only roughly at present but that estimates will improve as more variants are found. But that means any risk estimate offered by personal genomics companies today is unstable, Dr. Kraft said, and subject to upward or downward revision in the future.

Further, people who obtain a genomic risk profile are likely to focus with horror on the disease for which they are told they are at highest risk. Yet this is almost certain to be an overestimate, Dr. Kraft said.

The reason is that the many risk estimates derived from a person’s genomic data will include some that are too high and some that are too low. So any estimate of high risk is likely to be too high. The phenomenon is called the “winner’s curse,” by analogy to auctions in which the true value of an item is probably the average of all bids; the winner by definition has bid higher than that, and so has overpaid.
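The winner's curse is easy to demonstrate numerically. Below is a minimal simulation sketch; the numbers (20 diseases, a true relative risk of 1.0 for each, Gaussian estimation noise) are illustrative assumptions, not figures from the article.

```python
import random

random.seed(1)

TRUE_RISK = 1.0   # assumed true relative risk for every disease
NOISE = 0.3       # assumed standard deviation of the estimation error
N_DISEASES = 20   # assumed number of risk estimates per customer
TRIALS = 10_000

top_estimates = []
for _ in range(TRIALS):
    # Each disease's estimated risk = truth + noise.
    estimates = [random.gauss(TRUE_RISK, NOISE) for _ in range(N_DISEASES)]
    # The customer fixates on the disease flagged as "highest risk".
    top_estimates.append(max(estimates))

print(f"true risk of every disease     : {TRUE_RISK:.2f}")
print(f"average reported 'highest' risk: {sum(top_estimates) / TRIALS:.2f}")
```

Even though every true risk here is 1.0, the average "highest" estimate comes out around 1.5: the maximum of noisy estimates systematically overshoots, just as the winning bid at an auction tends to overshoot an item's value.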

Dr. Kari Stefansson, chief executive of deCODE Genetics, an Icelandic gene-hunting company that also offers a personal genome testing service, said deCODE alerted clients to pay attention to diseases for which testing shows their risk is three times as great as average, not to trivial increases in risk.

Dr. Stefansson said his company had discovered 60 percent of the disease variants known so far.

“We have beaten them in every aspect of the game,” he said of rival gene hunters at American and British universities.

The undiscovered share of genetic risk for common diseases, he said, probably lies not with rare variants, as suggested by Dr. Goldstein, but in unexpected biological mechanisms. DeCODE has found, for instance, that the same genetic variant carries risks that differ depending on whether it is inherited from the mother or the father.

Monday, April 13, 2009

Disney Expert Uses Science to Draw Boy Viewers


Published: April 13, 2009

ENCINO, Calif. — Kelly Peña, or “the kid whisperer,” as some Hollywood producers call her, was digging through a 12-year-old boy’s dresser drawer here on a recent afternoon. Her undercover mission: to unearth what makes him tick and use the findings to help the Walt Disney Company reassert itself as a cultural force among boys.

[Photo: Kelly Peña, a Walt Disney Company expert on youth trends. Credit: Monica Almeida/The New York Times]


Ms. Peña, a Disney researcher with a background in the casino industry, zeroed in on a ratty rock ’n’ roll T-shirt. Black Sabbath?

“Wearing it makes me feel like I’m going to an R-rated movie,” said Dean, a shy redhead whose parents asked that he be identified only by first name.

Jackpot.

Ms. Peña and her team of anthropologists have spent 18 months peering inside the heads of incommunicative boys in search of just that kind of psychological nugget. Disney is relying on her insights to create new entertainment for boys 6 to 14, a group that Disney used to own way back in the days of “Davy Crockett” but that has wandered in the age of more girl-friendly Disney fare like “Hannah Montana.”

Children can already see the results of Ms. Peña’s scrutiny on Disney XD, a new cable channel and Web site (disney.go.com/disneyxd). It’s no accident, for instance, that the central character on “Aaron Stone” is a mediocre basketball player. Ms. Peña, 45, told producers that boys identify with protagonists who try hard to grow. “Winning isn’t nearly as important to boys as Hollywood thinks,” she said.

Actors have been instructed to tote their skateboards around with the bottoms facing outward. (Boys in real life carry them that way to display the personalization, Ms. Peña found.) The games portion of the Disney XD Web site now features prominent trophy cases. (It’s less about the level reached in the game and more about sharing small achievements, research showed.)

Fearful of coming off as too manipulative, youth-centric media companies rarely discuss this kind of field research. Disney is so proud of its new “headquarters for boys,” however, that it has made an exception, offering a rare window onto the emotional hooks that are carefully embedded in children’s entertainment. The effort is as outsize as the potential payoff: boys 6 to 14 account for $50 billion in spending worldwide, according to market researchers.

Thus far, Disney’s initiative is limited to the XD channel. But Disney hopes that XD will produce a hit show that can follow the “High School Musical” model from cable to merchandise to live theater to feature film, and perhaps even to a Disney World attraction.

With the exception of “Cars,” Disney — home to the “Princesses” merchandising line; the Jonas Brothers; and “Pixie Hollow,” a virtual world built around fairies — has been notably weak on hit entertainment franchises for boys. (“Pirates of the Caribbean” and “Toy Story” are in a type of hibernation, awaiting new big-screen installments.) Disney Channel’s audience is 40 percent male, but girls drive most of the related merchandising sales.

Rivals like Nickelodeon and Cartoon Network have made inroads with boys by serving up rough-edged animated series like “The Fairly Oddparents” and “Star Wars: The Clone Wars.” Nickelodeon, in particular, scoffs at Disney’s recent push.

“We wrote the book on all of this,” said Colleen Fahey Rush, executive vice president for research of MTV Networks, which includes Nickelodeon.

Even so, media companies over all have struggled to figure out the boys’ entertainment market. News Corporation infamously bet big on boys in the late 1990s with its Fox Kids Network and a digital offering, Boyz Channel. Both failed and drew criticism for segregating the sexes (there was also a Girlz Channel) and reinforcing stereotypes.

The guys are trickier to pin down for a host of reasons. They hop more quickly than their female counterparts from sporting activities to television to video games during leisure time. They can also be harder to understand: the cliché that girls are more willing to chitchat about their feelings is often true.

The people on Ms. Peña’s team have anthropology and psychology backgrounds, but she majored in journalism and never saw herself working with children. Indeed, her training in consumer research came from working for a hotel operator of riverboat casinos.

“Children seemed to open up to me,” said Ms. Peña, who does not have any of her own.

Sometimes the research is conducted in groups; sometimes it involves Ms. Peña’s going shopping with a teenage boy and his mother (and perhaps a videographer). The subjects, who are randomly selected by a market research company, are never told that Disney is the one studying them. The children are paid $75.

Walking through Dean’s house in this leafy Los Angeles suburb on the back side of the Hollywood Hills, Ms. Peña looked for unspoken clues about his likes and dislikes.

“What’s on the back of shelves that he hasn’t quite gotten rid of — that will be telling,” she said beforehand. “What’s on his walls? How does he interact with his siblings?”

One big takeaway from the two-hour visit: although Dean was trying to sound grown-up and nonchalant in his answers, he still had a lot of little kid in him. He had dinosaur sheets and stuffed animals at the bottom of his bed.

“I think he’s trying to push a lot of boundaries for the first time,” Ms. Peña said later.

This kind of intensive research has paid dividends for Disney before. Anne Sweeney, president of the Disney ABC Television Group, noted it in her approach to rebuilding Disney Channel a decade ago.

“You have to start with the kids themselves,” she said. “Ratings show what boys are watching today, but they don’t tell you what is missing in the marketplace.”

While Disney XD is aimed at boys and their fathers, it is also intended to include girls. “The days of the Honeycomb Hideout, where girls can’t come in, have long passed,” said Rich Ross, president of Disney Channels Worldwide.

In Ms. Peña’s research boys across markets and cultures described the television aimed at them as “purposeless fun” but expressed a strong desire for a new channel that was “fun with a purpose,” Mr. Ross said. Hollywood has been thinking of them too narrowly — offering all action or all animation — instead of a more nuanced combination, he added. So far results have been mixed.

Disney XD, which took over the struggling Toon Disney channel, has improved its predecessor’s prime-time audience by 27 percent among children 6 to 14, according to Nielsen Media Research. But the bulk of this increase has come from girls. Viewership among boys 6 to 14 is up about 10 percent.

“We’ve seen cultural resonance, and it doesn’t come overnight,” Mr. Ross said.

Which is one reason Ms. Peña is still out interviewing. At Dean’s house her team was quizzing him about what he meant when he used the word “crash.” Ben, a 12-year-old friend who had come over to hang out, responded, “After a long day of doing nothing, we do nothing.”

Growing self-conscious, Ben added, “Am I talking too much?”

Not even close.

Thursday, April 9, 2009

Public Library of Science (PLoS): Chimpanzees trade meat for mating opportunities

Chimpanzees trade meat for mating opportunities

[Image: a chimpanzee. High-protein meat matters a great deal to chimpanzees.]

An article published in the latest issue of PLoS ONE says that chimpanzees make "meat-for-mating" trades.

Scientists found that male chimpanzees willing to share meat from their kills with females had twice as many mating opportunities as stingier males.

Observing the chimpanzees of the Taï National Park in Côte d'Ivoire, the scientists found that sharing meat increased the number of times a male mated.

Interestingly, the scientists say, if a male shares meat with one particular female over time, the number of times he mates with her doubles, which may also raise the chance that she conceives.

High-protein meat matters a great deal to chimpanzees, the scientists say, but because females usually do not hunt for themselves, they rarely get the chance to eat it.

Similarities with humans

Scientists had earlier proposed a "meat-for-mating" hypothesis to explain why males share their kills with females.

Despite that speculation, researchers had never managed to record the phenomenon, because they were looking for a direct exchange: mating immediately after meat was shared.

But scientists have now observed that the trade is not a one-off event; it is a fairly long-term relationship.

The males still share their kills with females when the females are not in oestrus, or mate with them only a day or two after sharing.

The scientists believe the findings demonstrate a link between success in hunting and success in mating and reproduction.

And this long-term "meat-for-mating" exchange among chimpanzees has parallels in human pair bonding.

The Public Library of Science (PLoS) is a nonprofit open-access scientific publishing project aimed at creating a library of open access journals and other scientific literature under an open content license. As of January 2008 it publishes PLoS Neglected Tropical Diseases, PLoS Biology, PLoS Medicine, PLoS Computational Biology, PLoS Genetics and PLoS Pathogens. PLoS ONE was launched at the end of 2006.

History

The Public Library of Science began in early 2001 as an online petition initiative by Patrick O. Brown, a biochemist at Stanford University, and Michael Eisen, a computational biologist at the University of California, Berkeley and the Lawrence Berkeley National Laboratory. The petition called for all scientists to pledge that from September 2001 they would stop submitting papers to journals that did not make the full text of their papers available to all, free and unfettered, either immediately or after a delay of several months. Some journals now do this immediately, as open access journals such as the BioMed Central stable of journals; others, now known as delayed open access journals, do so after a delay of six months or less from publication, such as the Proceedings of the National Academy of Sciences. Many others continue to rely on self-archiving.

Joined by Nobel-prize winner and former NIH-director Harold Varmus, the PLoS organizers next turned their attention to starting their own journal, along the lines of the UK-based BioMed Central which has been publishing open-access scientific papers in the biological sciences in journals such as Genome Biology and the Journal of Biology since late 1999.

As a publishing company, the Public Library of Science began full operation on October 13, 2003, with the publication of a peer reviewed print and online scientific journal, entitled PLoS Biology, and have since launched six more peer-reviewed journals. The PLoS journals are what they describe as "open access content"; all content is published under the Creative Commons "attribution" license (Lawrence Lessig, of Creative Commons, is also a member of the Advisory Board). The project states (quoting the Budapest Open Access Initiative) that: "The only constraint on reproduction and distribution, and the only role for copyright in this domain, should be to give authors control over the integrity of their work and the right to be properly acknowledged and cited."

Business model

To fund the journal, PLoS charges a publication fee to be paid by the author or the author's employer or funder. In the United States, institutions such as the National Institutes of Health and the Howard Hughes Medical Institute have pledged that recipients of their grants will be allocated funds to cover such author charges. PLoS still relies heavily on donations from foundations to cover the majority of its operating costs. PLoS was launched with large grants from the Gordon and Betty Moore Foundation and the Sandler Family Supporting Foundation, which together came to 13 million US dollars.[1]

One criticism of charging author-side fees is that it fails to recognize the high cost of filtering and evaluating the large number of submissions that high-impact journals receive. To maintain standards, a strict review system is used, which in general leads to the rejection of a large proportion of papers that do not meet a journal's high standards. Setting up and maintaining a review system requires substantial effort from editors, the editorial office and reviewers, and is hence one of the most costly elements of scientific publication.

Impact

The initiatives of the Public Library of Science in the United States have inspired similar proposals in Europe, most notably the "Berlin Declaration" developed by the German Max Planck Society, which has also pledged grant support for author charges (see also the “Budapest Open Access Initiative”).

Tuesday, April 7, 2009

Aircraft hit birds 62% more since '90s

WASHINGTON — Dangerous collisions between aircraft and large birds — like the one that forced a commercial airliner to make an emergency landing on the Hudson River in January — have risen dramatically, according to government data obtained by USA TODAY.

The Federal Aviation Administration's database tracking bird strikes shows reports of collisions with geese and other large birds increased from an average of 323 a year in the 1990s to 524 per year from 2000 to 2007, a 62% surge.

These birds are big enough to potentially cripple a large jet.

The most serious reported cases in which large birds damaged aircraft also were up. Large birds damaged at least one engine on aircraft an average of 10 times a year in the 1990s. Since 2000, that number has climbed to more than 12 per year.

The data, complete through 2007, come from records that the FAA has refused to release since a flock of Canada geese ruined the engines on a US Airways jet on Jan. 15. Last month, the aviation agency proposed permanently barring the data's release, maintaining that the records could be misleading and could prompt airports and others not to report bird incidents.

However, the agency acknowledged Monday that it had previously released the data to several individuals who had made requests under the Freedom of Information Act. Spokeswoman Laura Brown said that the agency had not denied a request for access to the data in the past.

Brown confirmed that the data show an increase in aircraft strikes involving large birds but said the risks to planes remain very low. "Significant strikes are still a very small part of the total bird strike numbers," she said.

Out of 58 million flights in 2007, there were 550 instances of aircraft hitting large birds and only 190 of the strikes caused damage. Even fewer, 15, caused damage to an aircraft's engine.
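Those 2007 figures translate into per-flight odds, which is what "very low" means in practice. A quick sketch using only the numbers quoted above:

```python
# Per-flight odds implied by the 2007 figures quoted above.
FLIGHTS = 58_000_000

for label, count in [
    ("large-bird strike", 550),
    ("strike causing damage", 190),
    ("strike causing engine damage", 15),
]:
    print(f"{label:<28}: about 1 in {FLIGHTS // count:,} flights")
```

That works out to roughly one large-bird strike per 100,000 flights and one engine-damage event per four million flights.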

Two airline jets have been downed by birds since November, including the dramatic Hudson landing. In the second incident, a Ryanair jet in Rome struck a massive flock of starlings as it attempted to land on Nov. 10. No one died in the accidents.

The findings on large birds are a concern because their populations are increasing, said Richard Dolbeer, a retired Department of Agriculture wildlife biologist who created the FAA database in 1990.

"In most cases it's going to be these large birds that are going to cause a catastrophe or a significant strike event," Dolbeer said.

Most jet engines are not required to withstand an impact from a bird weighing more than 4 pounds, according to federal standards.

Wildlife experts such as Dolbeer have been raising concern about the surge in populations of geese, cormorants, pelicans and other large bird species in recent decades.

The database contains thousands of records a year. Most are relatively minor incidents, such as when planes hit tiny birds that pose little risk. But the data also contain hundreds of incidents in which serious damage to large jets was reported.

On Oct. 12, 2007, a SkyWest Airlines CRJ-700 regional jet leaving Denver International Airport struck a flock of as many as 100 sandhill cranes, large birds that weigh an average of 13 pounds.

About 20 miles out, the jet hit the birds and the pilots felt several thuds. One of the engines began running rough and a pilot declared an emergency in a radio call to controllers. The pilot said he "didn't think he was going to be able to make it back to the airport," according to the FAA report. The jet returned to Denver and no one was injured.

On Nov. 11, 2007, a Jetblue Airways Embraer EMB-190 jet flying at 4,000 feet as it headed toward John F. Kennedy International Airport in New York struck a large flock of Canada geese, the same type of bird that brought down the US Airways jet into the Hudson in January.

One of the jet's two engines was severely damaged, requiring a two-week repair, according to the database. "Plane looked bloody," the report said. "Cabin had horrible burning smell."

The FAA has estimated that only about 20% of the incidents involving birds are contained in the database because reporting bird strikes is voluntary. As a result, it's difficult to know the full extent of the risks that birds create.

The National Transportation Safety Board, which investigates accidents, urged the FAA in 1999 to require airports, airlines and others to report bird strikes. But the FAA has declined.

Maggots and Modern Medicine

Spectrum | 07.04.2009 | 17:30

Over the past few years modern medicine has been turning its attention to medieval practices to see whether those therapies can fit into today’s healthcare requirements.

A study by a team of doctors published in the Science Journal lends support to the use of maggots in high-tech healthcare. The team, led by Prof. Nicky Cullum of the University of York, carried out a study comparing maggots with a standard “hydrogel” in treating leg ulcers. Surprisingly enough, there have been two – widely differing – interpretations of the study in the media. One says that maggots can clean wounds that fail to heal five times faster than conventional treatments, while the other says that there’s little difference really between maggots and hydrogel. So…what exactly did the scientists conclude in their study? Rajiv Sharma put the question to Professor Nicky Cullum.

Taiwan, Japan researchers say invent quake sensing tool

Taiwan, Japan scientists claim breakthrough in earthquake warning (Roundup)

Asia-Pacific News

Apr 6, 2009, 9:31 GMT


Source: Monsters and Critics - http://www.monstersandcritics.com/news/asiapacific

Taipei - Taiwanese and Japanese scientists have made a breakthrough in earthquake early warnings, allowing the public to be alerted 10 to 30 seconds before a major quake causes destruction, a newspaper said Monday.

The breakthrough was achieved by Wu Yih-min, associate professor in the Department of Geosciences of the National Taiwan University, and Professor Hiroo Kanamori of the Seismological Laboratory at the California Institute of Technology, the China Times said.

It can give people more time to seek safety, as currently the quickest alert Taiwan's Seismological Observation Centre can give is 30 seconds after a quake has struck, it added.

'Our research is to give the early warning as early as possible so that people can take precaution. Currently seismologists still cannot predict an earthquake, but we hope our research can help seismologists of future generations to predict earthquakes,' Wu told the German Press Agency dpa.

According to Wu, after an earthquake has occurred, it sends out P-waves and S-waves. P-waves are the less destructive vertical waves, while S-waves are devastating horizontal waves.

As P-waves travel 1.73 times faster than S-waves, Wu worked out the correlation between the P-waves and the magnitude of the quake, which means that by analyzing the characteristics of P-waves, he can gauge the destructive force of the quake.
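The arithmetic behind the 10-to-30-second claim can be sketched directly from those wave speeds. In the Python sketch below, the 1.73 speed ratio and the roughly seven seconds needed to characterise the P-wave come from the article; the 6 km/s P-wave speed and the 3-second transmission delay are my assumptions.

```python
# Back-of-the-envelope warning times for P-wave-based early warning.
VP = 6.0                 # km/s, assumed crustal P-wave speed
VS = VP / 1.73           # km/s, S-wave speed implied by the 1.73 ratio
ALERT_DELAY = 7.0 + 3.0  # s after the quake: P-wave analysis + assumed transmission

def warning_seconds(distance_km: float) -> float:
    """Seconds between the alert and the damaging S-wave reaching a site."""
    return distance_km / VS - ALERT_DELAY

for d in (40, 80, 120):
    print(f"site {d:>3} km from the epicentre: {warning_seconds(d):5.1f} s of warning")
```

Under these assumptions a site 80 to 120 km from the epicentre gets the 10 to 30 seconds of warning described above, while sites very close to the epicentre get little or none.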

Wu and Kanamori's study, which began in 1999, is financed by Taiwan's National Science Council and Seismological Observation Centre. They tested the method at the Southern California earthquake monitoring network in 2007 and have been testing it at the Pacific Tsunami Warning Centre in Hawaii since 2008.

Tests in California showed that after an earthquake, four to six monitoring stations recorded features of the P-wave within seven seconds after the quake had struck. Adding the time of computer transmission, a warning could be sent out 10 seconds before a quake starts to cause damage.

The China Times said that Wu and Kanamori's method can prevent or cut property damage or loss of life because 10 seconds is enough time for someone to switch off a gas stove, for a bullet train to slow down or for a nuclear power plant's reactors to be shut.

Currently the quickest earthquake early warning in the world is issued by Japan's Meteorological Agency, which - by using the Nowcast early-warning method - can issue the alert three to four seconds after an earthquake has occurred.

Taiwan's Seismological Observation Centre can measure earthquake magnitude no sooner than 18 seconds after the quake begins. Measuring the magnitude and sending out the warning takes at least 30 seconds.

Wu and Kanamori plan to develop a beeper-like gadget, called a MEMS sensor, so that people can receive early warnings from the Seismological Observation Centre after a strong quake has struck.

Wu and Kanamori published their paper, Development of an Earthquake Early Warning System Using Real-Time Strong Motion Signals, in the Swiss journal Sensors in 2008, and the research has attracted attention from many countries.

Taiwan's Seismological Observation Centre said it will not adopt Wu's method until the accuracy of his earthquake early-warning system can be proven, the China Times said.

The centre uses the more conservative front-detection method which takes longer to issue a warning but provides more accurate data.

Taiwan sits on the circum-Pacific seismic belt and experiences about 18,500 earthquakes each year. Of these, only some 1,000 quakes can be felt by human beings.

On September 21, 1999, an earthquake measuring 7.3 on the Richter scale struck in Taichung County, central Taiwan, killing 2,400 people and injuring more than 10,000.


Taiwan researchers say invent quake sensing tool

Mon Apr 6, 2009 6:09am BST



TAIPEI, April 6 (Reuters) - A research team at Taiwan's top university has rolled out a tiny low-budget device that can sense earthquakes within 30 seconds, enough time to issue crucial disaster warnings, the lead inventor said on Monday.

The metal tool the size of a tape deck can detect an oncoming quake's speed and acceleration in time to estimate its eventual magnitude and warn trains to slow down or natural gas companies to shut off supplies, said Wu Yih-min, a researcher at the National Taiwan University Department of Geosciences.

The tool is more precise than similar technology used overseas, and could cost as little as T$10,000 ($302) once it reaches the market, said Wu, whose skeleton research team invented the tool after about five years of study.

"We can tell within 30 seconds whether it's going to be a big or small quake," Wu told reporters. "We can sense the scale and how much damage it's likely to cause."

The tool, which should be fastened to a place unlikely to be shaken by forces other than earthquakes, uses a chip that costs just a few U.S. dollars, Wu said.

Schools, railway systems and nuclear power plants would benefit from the technology, said Kuo Kai-wen, seismological centre director with Taiwan's Central Weather Bureau, which helped the university test its device.

But before it can be used, researchers must figure out how to link it to computerised alert systems, Kuo said.

The university has not yet applied for a patent, Wu said.

Taiwan is prone to earthquakes, logging 20 minor ones in the past two and a half weeks.

In May 2008, a 7.9 magnitude quake hit Sichuan province of southwest China, killing about 70,000 people and leaving more than 10 million homeless. (Reporting by Ralph Jennings; Editing by Jerry Norton)

Monday, April 6, 2009

Editing Memory

Brain Power

Brain Researchers Open Door to Editing Memory


Published: April 5, 2009

Suppose scientists could erase certain memories by tinkering with a single substance in the brain. Could make you forget a chronic fear, a traumatic loss, even a bad habit.

[Photo: André A. Fenton studies spatial memory in mice and rats. Credit: Fred R. Conrad/The New York Times]

Brain Power: For all that scientists have studied it, the brain remains the most complex and mysterious human organ, and, now, the focus of billions of dollars’ worth of research to penetrate its secrets. This is the first article in a series that will look in depth at some of the insights these projects are producing.

[Photo: Research by Dr. Todd C. Sacktor, above, and André A. Fenton has demonstrated a chemical’s effect on memory with potential implications for treatment of trauma, addiction and other conditions. Credit: Fred R. Conrad/The New York Times]

Researchers in Brooklyn have recently accomplished comparable feats, with a single dose of an experimental drug delivered to areas of the brain critical for holding specific types of memory, like emotional associations, spatial knowledge or motor skills.

The drug blocks the activity of a substance that the brain apparently needs to retain much of its learned information. And if enhanced, the substance could help ward off dementias and other memory problems.

So far, the research has been done only on animals. But scientists say this memory system is likely to work almost identically in people.

The discovery of such an apparently critical memory molecule, and its many potential uses, are part of the buzz surrounding a field that, in just the past few years, has made the seemingly impossible suddenly probable: neuroscience, the study of the brain.

“If this molecule is as important as it appears to be, you can see the possible implications,” said Dr. Todd C. Sacktor, a 52-year-old neuroscientist who leads the team at the SUNY Downstate Medical Center, in Brooklyn, which demonstrated its effect on memory. “For trauma. For addiction, which is a learned behavior. Ultimately for improving memory and learning.”

Artists and writers have led the exploration of identity, consciousness and memory for centuries. Yet even as scientists sent men to the moon and spacecraft to Saturn and submarines to the ocean floor, the instrument responsible for such feats, the human mind, remained almost entirely dark, a vast and mostly uncharted universe as mysterious as the New World was to explorers of the past.

Now neuroscience, a field that barely existed a generation ago, is racing ahead, attracting billions of dollars in new financing and throngs of researchers. The National Institutes of Health last year spent $5.2 billion, nearly 20 percent of its total budget, on brain-related projects, according to the Society for Neuroscience.

Endowments like the Wellcome Trust and the Kavli Foundation have poured in hundreds of millions of dollars more, establishing institutes at universities around the world, including Columbia and Yale.

The influx of money, talent and technology means that scientists are at last finding real answers about the brain — and raising questions, both scientific and ethical, more quickly than anyone can answer them.

Millions of people might be tempted to erase a severely painful memory, for instance — but what if, in the process, they lost other, personally important memories that were somehow related? Would a treatment that “cleared” the learned habits of addiction only tempt people to experiment more widely?

And perhaps even more important, when scientists find a drug to strengthen memory, will everyone feel compelled to use it?

The stakes, and the wide-open opportunities possible in brain science, will only accelerate the pace of discovery.

“In this field we are merely at the foothills of an enormous mountain range,” said Dr. Eric R. Kandel, a neuroscientist at Columbia, “and unlike in other areas of science, it is still possible for an individual or small group to make important contributions, without any great expenditure or some enormous lab.”

Dr. Sacktor is one of hundreds of researchers trying to answer a question that has dumbfounded thinkers since the beginning of modern inquiry: How on earth can a clump of tissue possibly capture and store everything — poems, emotional reactions, locations of favorite bars, distant childhood scenes? The idea that experience leaves some trace in the brain goes back at least to Plato’s Theaetetus metaphor of a stamp on wax, and in 1904 the German scholar Richard Semon gave that ghostly trace a name: the engram.

What could that engram actually be?

The answer, previous research suggests, is that brain cells activated by an experience keep one another on biological speed-dial, like a group of people joined in common witness of some striking event. Call on one and word quickly goes out to the larger network of cells, each apparently adding some detail, sight, sound, smell. The brain appears to retain a memory by growing thicker, or more efficient, communication lines between these cells.

The billion-dollar question is how?

In the decades since this process was described in the 1960s and 1970s, scientists have found scores of molecules that play some role in the process. But for years the field struggled to pinpoint the purpose each one serves. The problem was not that such substances were so hard to find — on the contrary.

In a 1999 paper in the journal Nature Neuroscience, two of the most prominent researchers in brain science, Dr. Jeff W. Lichtman and Joshua R. Sanes of Harvard, listed 117 molecules that were somehow involved when one cell creates a lasting speed-dial connection with a neighbor, a process known as “long-term potentiation.”

They did not see these findings as necessarily clarifying the picture of how memories are formed. But an oddball substance right there on their own list, it turned out, had unusual properties.

A Helpful Nudge

“You know, my dad was the one who told me to look at this molecule — he was a scientist too, my dad, he’s dead now but he had these instincts — so anyway that’s how it all started,” Dr. Sacktor was saying. He was driving from his home in Yonkers to his laboratory in the East Flatbush neighborhood of Brooklyn, with three quiches and a bag of bagels bouncing in the back seat. Lunch for the lab.

The father’s advice led the son, eventually, to a substance called PKMzeta. In a series of studies, Dr. Sacktor’s lab found that this molecule was present and activated in cells precisely when they were put on speed-dial by a neighboring neuron.

In fact, the PKMzeta molecules appeared to herd themselves, like Army Rangers occupying a small peninsula, into precisely the fingerlike connections among brain cells that were strengthened. And they stayed there, indefinitely, like biological sentries.

In short: PKMzeta, a wallflower in the great swimming party of chemicals that erupts when one cell stimulates another, looked as if it might be the one that kept the speed-dial function turned on.

“After that,” Dr. Sacktor said, “we began to focus solely on PKMzeta to see how critical it really was to behavior.”

Running a lab is something like fielding a weekend soccer team. Players come and go, from Europe, India, Asia, Grand Rapids. You move players around, depending on their skills. And you bring lunch, because doctoral students logging 12-hour days in a yellowing shotgun lab in East Flatbush need to eat.

“People think that state schools like ours are low-key, laid back, and they’re right, we are,” said Robert K. S. Wong, chairman of the physiology and pharmacology department at SUNY Downstate, who brought Dr. Sacktor with him from Columbia. “You have less pressure to apply for grants, and you can take more time, I think, to work out your ideas.”

To find out what, if anything, PKMzeta meant for living, breathing animals, Dr. Sacktor walked a flight downstairs to the lab of André A. Fenton, also of SUNY Downstate, who studies spatial memory in mice and rats.

Dr. Fenton had already devised a clever way to teach animals strong memories for where things are located. He teaches them to move around a small chamber to avoid a mild electric shock to their feet. Once the animals learn, they do not forget. Placed back in the chamber a day later, even a month later, they quickly remember how to avoid the shock and do so.

But when injected — directly into their brain — with a drug called ZIP that interferes with PKMzeta, they are back to square one, almost immediately. “When we first saw this happen, I had grad students throwing their hands up in the air, yelling,” Dr. Fenton said. “Well, we needed a lot more than that one study.”

They now have it. Dr. Fenton’s lab repeated the experiment, in various ways; so has a consortium of memory researchers, each using a different method. Researchers led by Yadin Dudai at the Weizmann Institute of Science in Israel found that one dose of ZIP even made rats forget a strong disgust they had developed for a taste that had made them sick — three months earlier.

A Conscience Blocker?

“This possibility of memory editing has enormous possibilities and raises huge ethical issues,” said Dr. Steven E. Hyman, a neurobiologist at Harvard. “On the one hand, you can imagine a scenario in which a person enters a setting which elicits traumatic memories, but now has a drug that weakens those memories as they come up. Or, in the case of addiction, a drug that weakens the associations that stir craving.”

Researchers have already tried to blunt painful memories and addictive urges using existing drugs; blocking PKMzeta could potentially be far more effective.

Yet any such drug, Dr. Hyman and others argue, could be misused to erase or block memories of bad behavior, even of crimes. If traumatic memories are like malicious stalkers, then troubling memories — and a healthy dread of them — form the foundation of a moral conscience.

For those studying the biology of memory, the properties of PKMzeta promise something grander still: the prospect of retooling the engram factory itself. By 2050 more than 100 million people worldwide will have Alzheimer’s disease or other dementias, scientists estimate, and far more will struggle with age-related memory decline.

“This is really the biggest target, and we have some ideas of how you might try to do it, for instance to get cells to make more PKMzeta,” Dr. Sacktor said. “But these are only ideas at this stage.”

A substance that improved memory would immediately raise larger social concerns, as well. “We know that people already use smart drugs and performance enhancers of all kinds, so a substance that actually improved memory could lead to an arms race,” Dr. Hyman said.

Many questions in the science remain. For instance, can PKMzeta really link a network of neurons for a lifetime? If so, how? Most molecules live for no more than weeks at a time.

And how does it work with the many other substances that appear to be important in creating a memory?

“There is not going to be one, single memory molecule, the system is just not that simple,” said Thomas J. Carew, a neuroscientist at the University of California, Irvine, and president of the Society for Neuroscience. “There are going to be many molecules involved, in different kinds of memories, all along the process of learning, storage and retrieval.”

Yet as scientists begin to climb out of the dark foothills and into the dim light, they are now poised to alter the understanding of human nature in ways artists and writers have not.

Sunday, April 5, 2009

Sir Charles Vernon Boys

Sir Charles Vernon Boys, FRS (15 March 1855 - 30 March 1944) was a British physicist, known for his careful and innovative experimental work.

Boys was the eighth child of the Revd Charles Boys, Anglican vicar of Wing, Rutland. He was educated at Marlborough College and the Royal School of Mines, where he learned physics from Frederick Guthrie and taught himself higher mathematics while completing a degree in mining and metallurgy. As a student at the School of Mines he invented a mechanical device (which he called the "integraph") for plotting the integral of a function. He worked briefly in the coal industry before accepting Guthrie's offer of a position as "demonstrator."

Boys achieved recognition as a scientist for his invention of the fused quartz fibre torsion balance, which allowed him to measure extremely small forces. He used his invention to build a radiomicrometer capable of responding to the light of a single candle more than one mile away, and used that device for astronomical observations. In 1895 he published a measurement of the gravitational constant G that improved upon the accuracy achieved by Cavendish.
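In outline, a torsion-balance determination of G rests on two textbook relations: the fibre's torsion constant follows from the period of the beam's free oscillation, and the measured gravitational deflection then yields G. The block below is a generic sketch of the method, not the specifics of Boys's apparatus.

```latex
% Generic torsion-balance relations (textbook form, not Boys's exact setup).
% kappa: torsion constant of the fibre;  I: moment of inertia of the beam;
% T: free oscillation period;            theta: equilibrium deflection;
% M, m: large and small masses;          r: their separation;
% d: lever arm;                          n: number of attracting mass pairs.
\kappa = \frac{4\pi^{2} I}{T^{2}}, \qquad
\kappa\,\theta = n\,\frac{G M m}{r^{2}}\,d
\quad\Longrightarrow\quad
G = \frac{4\pi^{2} I\,\theta\,r^{2}}{n\,T^{2} M m\,d}
```

The virtue of Boys's quartz fibres is their tiny torsion constant: for a given gravitational torque, a weaker fibre gives a larger, more measurable deflection, so the whole apparatus could be made much smaller.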

Boys' work on calorimetry was used by the government to price natural gas by energy content rather than volume. He also worked on high-speed photography and conducted public lectures on the properties of soap films, which were gathered into the book Soap Bubbles: Their Colours and the Forces Which Mould Them, a classic of scientific popularization which remains in print today. The first edition of Soap Bubbles appeared in 1890, and the second in 1911. The book deeply impressed French writer Alfred Jarry, who in 1898 wrote the absurdist novel Exploits and Opinions of Dr. Faustroll, pataphysician, in which the title character, who was born at the age of 63 and sails in a sieve, is described as a friend of C.V. Boys (see also Pataphysics).

Boys was a professor at the Royal College of Science (now Imperial College London) in South Kensington from 1889 to 1897, as well as an examiner at the University of London. In 1899 he presented the Royal Institution Christmas Lectures. He was elected to the Royal Society in 1888 and knighted in 1935. He was awarded the Royal Medal in 1896 and the Rumford Medal in 1924.

He married Marion Amelia Pollock in 1892. She caused a scandal by having an affair with the Cambridge mathematician Andrew Forsyth, as a result of which Forsyth was forced to resign his chair. Marion divorced Boys in 1910 and married Forsyth.


Saturday, April 4, 2009

Small Company Offers Web-Based Competition for Microsoft Word

Digital Domain


Published: April 4, 2009

WITHIN Microsoft’s Office group, the calendar on the wall appears to be 1983, the year the company introduced Microsoft Word. The company still expects customers to buy its software applications as products and install and run them on PCs.

[Image: Zoho.com’s home page includes Zoho Writer, an online word processor.]

Recognition of the Internet has been slow in coming. Microsoft is finally preparing Web versions of its Office suite, though these are intended as supplements, not as replacements. The company maintains that Web versions of a Word or Excel will never match the functionality and responsiveness that software installed on one’s own machine provides.

It may be wrong.

Granted, Microsoft’s largest competitor, Google, has not yet marched up to the bulwarks guarding Microsoft Office and blown a gaping hole into its adversary’s complacency. Google Apps, its Office-like suite, contains an uneven bunch of services. I find Google Calendar far superior to Microsoft Outlook’s calendar, but Google’s word processor, Docs, lacks many features in Word that I rely on.

The best online word processor, however, may be the one from a tiny company, Zoho, a nimble innovator. Zoho Writer runs close enough to Word that it is easy to imagine it, and other online word processors, eventually doing most everything that Word can do, and more.

Zoho Writer handles the basics and provides many advanced functions without breaking a sweat — like the ability to edit a document when page breaks are displayed. Google Docs can’t. Writer works even when one is offline, thanks to open source technology developed by Google, and used by Zoho in its word processor four months before Google used it.

Zoho Writer also provides some esoteric features, like a choice of footnotes or endnotes, with note numbers in superscript, placed in the text. Google Docs does only footnotes and puts in a pound sign as a placeholder. You may never need to create the most complex mathematical equations, but Zoho Writer makes it easy to do so.

Writer is offered free to individual users and to the first 10 users in a business. (So are 9 of Zoho’s 18 other online services at Zoho.com.) And free means free of advertising, too. “We don’t do advertising at all. We don’t believe in advertising,” says Raju Vegesna, a Zoho marketing executive.

Zoho hopes that word will spread and that larger businesses will sign up, willing to pay $50 per user per year for access to the 10 productivity applications, like Writer, and separate monthly fees for business applications.

Microsoft Office comes in various configurations and prices, and Microsoft doesn’t disclose its lowest price for volume purchases. But Office Professional 2007 is available from retailers for about $400.

Zoho is a division of AdventNet, which provides online software services to corporate I.T. departments and is based in Pleasanton, Calif. AdventNet, privately held, says its I.T. software is profitable but doesn’t claim the same for Zoho, which AdventNet created in 2005.

At Microsoft, Chris Capossela, senior vice president in Microsoft’s Business Division who manages its Office product line, explained to me how the company was preparing for “the future of computing — a combination of the best of software and the best of Internet services.” The next version of Office — being prepared for release in 2010 or after, he said — will have three incarnations, beginning with what Microsoft calls the “rich client” (“rich” refers to features), installed on the user’s PC, and the mobile version for smartphones. Both of those exist today. The third, and new, form will be the Web-based service.

Mr. Capossela sees the Web version of Office as only a stopgap for users who are away from home or office and, in a pinch, must use a machine that isn’t their own. With Office on the Web, “users can do a little bit of work, between classes, or at the airport,” he said.

Asked whether Microsoft was interested in making the Web version as fully featured as the desktop version, he scoffed at the notion that a “browser experience” could be equivalent to a “rich client,” at least without the graphical help of an add-in like Adobe’s Flash.

Adobe’s Web site offers its own free Flash-equipped online word processor, Buzzword. But to my taste, Flash is visual overkill for word processing. Zoho Writer manages perfectly well without Flash.

Mr. Capossela sounded confident when he described the lead that Microsoft enjoys over online challengers like Zoho. “A lot of our competitors have to spend a huge amount of energy copying features that we already have in the Office suite,” he said. “We don’t have that burden to bear.”

Zoho, however, doesn’t seem burdened at all. It has moved well ahead of Word in some areas, such as offering multiple users the ability to edit the same document simultaneously.

Zoho Writer is not completely polished — I lose double-spacing when exporting to Word, and there’s an irksome extra step required to print a clean copy of a final draft, without the browser’s header. In all, though, these are small irritations, balanced by gaining the ability to edit and share documents online effortlessly, in many different ways.

Microsoft estimates that 500 million copies of Office are running on the world’s one billion Windows machines. Those were the easy wins, before the Web was ready to compete against installed software. The next 500 million copies, if won, will require staying ahead of what rivals can accomplish within the unassuming frame of the Web browser.

Randall Stross is an author based in Silicon Valley and a professor of business at San Jose State University. E-mail: stross@nytimes.com.