It would be hard to exaggerate the angst that has gripped the U.S. in recent months as the election nears, markets churn and assets melt. But the headlines that have made us dread picking up the newspaper mask a long-term problem that may shape the future of America more than John McCain's plan for Iraq, Barack Obama's health care ideas or Uncle Sam's heroic efforts to rescue the economy.
By most measures, the U.S. is in a decade-long decline in global technological competitiveness. The reasons are many and complex, but central among them is the country's retreat from long-term basic research in science and technology, coupled with a surge in R&D by countries such as China.
R&D has two parts, of course, and published figures showing a rise in "research and development" hide a troubling trend. Companies still spend billions annually on development, typically aimed at the next product cycle or two. But the kind of pure research that led to the invention of the transistor and the Internet has steadily declined as companies bow to the pressure for quarterly and annual results.
To take but one example: Bell Laboratories was founded in 1925 and went on to "help weave the technological fabric of modern society," as its Web site today rightly claims. Its "top 10 innovations," according to parent Alcatel-Lucent, include the transistor, data networking, cellular telephony, digital switching, communications satellites and the Unix operating system. Although Bell Labs continues to innovate in most of those areas, all of the top 10 had their origins in the 1970s or earlier.
In January 1982, Time magazine reported: "With 22,500 people on its payroll (3,000 of them Ph.D.s), 19,000 patents and an annual budget of $1.6 billion, Bell Laboratories is a mighty engine of research and development. It is possibly the finest, and certainly the largest, private operation of its kind anywhere."
But since then, beginning with the breakup of AT&T in the 1984 divestiture and continuing through subsequent sales and restructurings by its parent companies, Bell Labs has shifted steadily from pure research toward advanced development. On Sept. 4 this year, The Star-Ledger of New Jersey reported that Bell Labs was disbanding a group of scientists doing basic research in areas such as material science and device physics. The paper reported that research director Gee Rittenhouse had explained that "the team was going to have a hard time integrating its research into product development."
Not only has industry cut back on research, but it has also moved much of what remains offshore, says David Farber, a computer science professor at Carnegie Mellon University. That deprives U.S. scientists -- as well as non-U.S. scientists who were educated here and want to stay in the country -- of some of the best jobs, he says.
And the jobs of university researchers aren't so hot these days either, as professors and graduate students scramble for federal funds. "Faculty spend their careers writing proposals now. They don't get funded. The hit rates are low. People put in 20 proposals in a year," Farber says.
"Once you reduce university research, you are really mortgaging your future, because the way you train new scientists is by apprenticeships at graduate schools," he adds.
Where will the apprentices turn? "Eventually, we could all be hamburger flippers, or Wall Street brokers, if there are any left," Farber says.