
Supercomputers Try to Keep Pace

June 26, 2001 — ROCKVILLE, Md. — Cloaked in government secrecy, the world of supercomputing was populated through the 1980s chiefly by nuclear scientists developing weapons and doomsday simulations.

The industry suffered post-Cold War pangs as government contracts diminished — even as computers themselves gained immensely in power.

Now, in a swords-to-ploughshares twist, biologists mining genetic data for life-saving drugs are driving a renaissance in supercomputing — and the machines may have a difficult time keeping pace.

"The whole area of genomics and biological sciences is fundamentally transforming our industry," said Ty Rabe, director of high-performance computing for Compaq Computer Corp.

The burgeoning field, known as bioinformatics, grew out of the marriage between biology and high-powered computing that was instrumental in deciphering the human genome.

Craving vastly more speed and power to isolate the genetic origins of myriad afflictions, the biotechnology industry is expected to have an insatiable appetite for ever more muscular computers.

It was high-speed computers that enabled researchers from Celera Genomics and the federal Human Genome Project to make a blueprint of the 3 billion chemical base pairs that make up the human genome.

But researchers now must study the roughly 30,000 genes and hundreds of thousands of proteins that are the keys to future drugs that could attack many diseases at their biological origin.

That requires computer simulations of how drugs may act on humans, or how they may interact with other drugs — and vast amounts of storage space for all that data.

"It's great to have all this genomics, but when you can actually model how a cell works, what causes it to malfunction and how to correct that malfunction, that is where the real value will come," said Marshall Peterson, head of infrastructure technology at Celera. "There isn't enough computer power on the planet yet to do that."

Beyond the laboratory, the demand for processing and storage muscle may well come from what is becoming known as personal medicine.

In personal medicine, an individual's genetic data would be stored digitally, providing doctors with information needed to develop individualized care — from attacking inherited diseases to designing rehabilitation therapies.

"If the DNA of 6 billion people has to be stored or deciphered, the magnitude of that would be enormous," said Sia Zadeh, group manager of Sun Microsystems' life sciences division.

Such a task would require a now unimaginable amount of computing power.

It's no wonder companies including Sun, Compaq and IBM have in the past year made major investments in life sciences. IBM estimates the overall market could be worth $40 billion annually by 2004.

Keeping up with the number-crunching demands of biotech could help restore some shine to a computer industry battered by this year's devastating slump in PC sales.

Compaq has partnered with Celera Genomics and the federal Sandia National Laboratories nuclear facility in New Mexico to create a computer that can handle 100 trillion computations per second.

That's 100,000 times faster than the average desktop PC.

IBM is building Blue Gene, which promises to perform 1 quadrillion operations per second — a rate known as one petaflop — when finished in 2004.

The $100 million experimental project will be 2 million times more powerful than today's desktop PCs. It will focus on how proteins fold and change their shape, a key to understanding their biological function.
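The desktop comparisons in these figures can be checked with a little arithmetic. The machine speeds below are taken from the article; the desktop speeds are implied by its ratios, not stated, and this quick sketch simply derives them:

```python
# Sanity-check the speed ratios quoted in the article.
# Machine speeds come from the text; desktop speeds are derived from the ratios.

compaq_celera = 100e12  # 100 trillion ops/sec (Compaq/Celera/Sandia machine)
blue_gene = 1e15        # 1 quadrillion ops/sec (1 petaflop, IBM's Blue Gene)

# "100,000 times faster than the average desktop PC" implies:
desktop_implied_compaq = compaq_celera / 100_000   # -> 1e9, about 1 gigaflop

# "2 million times more powerful than today's desktop PCs" implies:
desktop_implied_ibm = blue_gene / 2_000_000        # -> 5e8, about 500 megaflops

print(desktop_implied_compaq)  # 1000000000.0
print(desktop_implied_ibm)     # 500000000.0
```

The two comparisons assume slightly different desktop speeds — roughly 1 gigaflop and 500 megaflops — both plausible for a 2001-era PC.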

A single protein can perform a much different — and often harmful — role in the body depending on its contortions.

Blue Gene is the showpiece of IBM's new Life Sciences business unit, created last year with a $100 million initial investment and headed by Caroline Kovac.

Among clients for customized IBM systems is NuTec Sciences, an Atlanta-based bioinformatics company. NuTec's cluster of 1,250 IBM servers can perform 7.5 trillion calculations per second, making it the world's fastest commercial computer.

NuTec will lease its system to academic researchers, including the Winship Cancer Institute at Emory University. Winship says it plans to tailor cancer treatments to individual patients by next year.

Biopsies will be loaded into the computer, which will analyze the genes of cancer cells. It will then help researchers find the most effective drug or treatment for exploiting a tumor's weaknesses.

"We're the first to actually hook a supercomputer into a clinic taking care of patients," said Winship director Dr. Jonathan Simons.

To help decipher the human genome, Celera used about 200 high-end AlphaServer systems from Compaq, whose supercomputing clients also include biotech giant Genentech and MIT's Whitehead Institute.

Celera estimates it will take a system capable of 500 trillion operations per second just to model the activity of a single cell, Peterson said.

Simulating tissues and organs, meanwhile, will require "several tens of petaflops" — computing power that will be out of reach of even IBM's Blue Gene.

"This will require a revolution in computing. The current evolutionary steps will not make it," said Peterson.

by The Associated Press
