"You can see the computer age everywhere but in the productivity statistics," the economist Robert Solow famously quipped in 1987.
Today, Solow's paradox is back.
U.S. productivity, or output per worker hour, just registered another dismal performance. In the first quarter, it was up a bare 0.3 percent from a year earlier.
That has unfortunately become the norm. Productivity has risen just 0.6 percent a year on average over the past five years.
"This is the worst five-year run for productivity since the early 1980s, and the worst five-year performance on record outside of a recession," J.P. Morgan economists observed in a client note.
Clearly, there is a problem. The trouble is determining what exactly it is—and what, if anything, to do about it.
Economists at Goldman, Sachs & Co., for instance, suspect it's the numbers themselves that might be the problem.
They argue precisely what Silicon Valley has been crowing about for years now: that software is eating the world.
The "measured price of computer hardware has plunged by 91.5 percent since 1995," they note, but "measured software prices [have] only edged down slightly over the past two decades."
This is important because America's information technology "center of gravity" has shifted "away from hardware, to software and digital," whose products now make up more than half of the output and market valuation of the sector, says Goldman.
It may be well more than half, actually, given that Apple—America's biggest tech company—is still included as hardware in the consumer price index, despite selling a growing mix of software and digital products.
The price of these products matters greatly for gross domestic product, or GDP, which is reported in real, or inflation-adjusted, terms. The lower the price of a given product amid steady demand, therefore, the bigger the boost to real GDP growth.
In other words, if hardware's falling prices tend to boost real GDP, software and digital content's flat prices, especially as those products grow as a proportion of tech spending, "will result in a spurious slowdown in real GDP growth," Goldman observes.
If software and digital prices were instead falling at the roughly 5 percent annual pace hardware prices have averaged over the past 20 years, that would mean an understatement of GDP growth of about 0.2 percentage-points per year.
Throw in another 0.75 percentage-points of "consumer surplus" from new software and digital products that is otherwise un-captured in the data, as Erik Brynjolfsson and JooHee Oh have estimated, and "we walk away persuaded by the notion that productivity mismeasurement could be a significant issue," says Goldman.
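The arithmetic behind Goldman's conclusion can be sketched in a few lines. This is a back-of-the-envelope illustration using only the figures cited above; simply summing the two effects is an assumption for clarity, not Goldman's exact methodology.

```python
# Back-of-the-envelope sketch of the mismeasurement argument,
# using the figures cited in the text.

software_deflation_gap = 0.002   # ~0.2 pp/yr understatement of real GDP growth
                                 # if software prices fell ~5%/yr like hardware
consumer_surplus = 0.0075        # ~0.75 pp/yr of uncaptured "consumer surplus"
                                 # (the Brynjolfsson and Oh estimate)

# Treating the two effects as additive (an illustrative simplification):
total_understatement = software_deflation_gap + consumer_surplus
print(f"Implied understatement of real GDP growth: "
      f"{total_understatement:.2%} per year")   # ~0.95% per year
```

On that rough math, nearly a full percentage point of annual growth could be going uncounted — enough, if true, to close most of the gap between the measured slowdown and the pre-2005 trend.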
If this argument is correct, it means real GDP growth in this country may be much healthier than thought, that the standard of living in fact may be growing as much as in the past, that true inflation is actually lower than measured inflation, and that gauges like employment, which has shown steady improvement, may be more reliable.
In sum, "it would be better for Fed officials to delay monetary tightening until 2016," Goldman argues.
That, however, is a far more upbeat assessment than many others are putting forth.
"Welcome to life in the slow lane," is how J.P. Morgan chief U.S. economist Michael Feroli put it recently in a client note.
He was among the first outside academia to argue America's growth rate has fallen for good. Productivity rates being this low, plus labor force growth of about 0.5 percent a year, together point to trend growth of only about 1 percent for the U.S. economy, by his reckoning.
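Feroli's trend-growth arithmetic follows the standard decomposition: potential GDP growth is roughly productivity growth plus labor force growth. A minimal sketch, using the round figures cited in the article:

```python
# Trend (potential) GDP growth ~= productivity growth + labor force growth.
# Inputs are the approximate figures cited in the text.

productivity_growth = 0.006   # ~0.6%/yr, the five-year average
labor_force_growth = 0.005    # ~0.5%/yr

trend_growth = productivity_growth + labor_force_growth
print(f"Implied trend growth: {trend_growth:.1%}")   # ~1.1%, i.e. "about 1 percent"
```

Which is why, with the economy actually growing 2.4 percent, something other than trend growth — namely the absorption of slack — has to be doing the work.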
That means America's 2.4 percent average GDP growth over the past two years has perhaps only been possible because of "the massive using-up of slack labor market resources." Catch-up, in other words, from the financial crisis and deep recession.
Ethan Harris, head of North American Economics for Bank of America Merrill Lynch, said companies may even "have gotten a little ahead of themselves in hiring." Any slowdown in jobs growth from here would reinforce that conclusion.
If in fact America's potential growth rate remains this weak, it means, as J.P. Morgan's market team observed, earlier and perhaps more frequent recessions, and higher inflation. And that is more likely to cause lower interest rates over time, as opposed to higher ones.
"Lower productivity growth ultimately means lower interest rates," the firm said, "as the latter should reflect the return on capital."
It also puts high stock-market valuations at risk, and it is a negative for commodities and for emerging markets, whose productivity growth has been even weaker than that of developed markets but which are typically bought on the premise of much higher growth.
The risk, in the U.S. and overseas, is a vicious negative feedback loop: "Low productivity is brought about by low capital spending, while that, in turn, lowers productivity again."
This issue of too-low capital investment already has many concerned.
"The labor productivity trend has been crushed by lack of investment leading to an unprecedented decline in capital intensity," said Morgan Stanley economist Ted Wieseman.
The ratio of capital services per worker hour, he observed in a client note, fell in 2011, 2012, and 2013; "the first run of three declines on record."
It likely fell again last year, he said, as net investment as a share of GDP remained depressed while hours worked rose 3.5 percent, "and it's probably falling still so far in 2015" for the same reason.
This marks a major downshift from America's super-productive years of roughly 1995-2005—the same boom era during which Solow's paradox was thought to be ancient history.
"The contribution of capital intensity to labor productivity growth has thus turned negative in the post-recession period, after adding 1.0 percentage-point a year from 2000 to 2007, and 1.2 percentage-points a year from 1995 to 2000," Wieseman observed.
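The scale of that swing is easy to put in perspective with standard growth accounting, in which labor productivity growth is roughly the capital-intensity contribution plus everything else (total factor productivity and the like). This is a generic decomposition, not Wieseman's exact model, and it uses only the contributions he cites:

```python
# Growth-accounting sketch (a standard decomposition, not Wieseman's model):
# labor productivity growth ~= capital-intensity contribution + TFP, etc.

# Capital-intensity contributions to productivity growth cited in the text:
contrib = {"1995-2000": 0.012, "2000-2007": 0.010}  # +1.2 and +1.0 pp/yr

# Post-recession the contribution turned negative. Even if it had merely
# fallen to zero, the drag versus the late-1990s boom would be at least:
min_drag = contrib["1995-2000"] - 0.0
print(f"Minimum drag vs. 1995-2000: {min_drag:.1%} per year")   # 1.2%
```

In other words, the collapse in capital deepening alone can account for most of the gap between today's sub-1-percent productivity readings and the boom-era pace.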
Ethan Harris of BAML agrees. "Capital deepening has stopped in the last eight years or so in the U.S.," he said.
Why? The theories range from blaming "zombie companies" propped up by low interest rates to investors rewarding other uses of corporate cash, like buying back stock and paying dividends, instead of investment.
Plus, contrary to the "start-up nation" image projected by Silicon Valley, there is also a general lack of new business formation, as everyone from academics to Wells Fargo chairman and chief executive officer John Stumpf, who raised the issue recently on CNBC, has noted.
Robert Gordon, a professor at Northwestern University, has been arguing for years that America's growth rate will remain depressed. "The digital electronics revolution has begun to encounter diminishing returns," he said at a recent economics confab, and so we are witnessing "a decline in the 'dynamism' of the economy as measured by the rate of creation of new firms."
Put simply, Gordon thinks the returns from the digital revolution have been and will continue to be far lower than from prior industrial revolutions.
Finally, there may also be a "crowding-out" effect.
"The cause currently is pretty clear," said former Federal Reserve chairman Alan Greenspan in an interview.
The growth in America's entitlement programs—chiefly, Medicare, Medicaid, and Social Security—today soaks up a far larger share of the gross savings available to fund productivity-providing investment.
The sum of entitlement payments was less than 5 percent of GDP back in the mid-1960s, he notes. Today, that figure is approaching 15 percent.
Other factors ranging from the digital revolution to changes to America's supply chain, increasingly a "just-in-time" delivery system, may contribute, said Greenspan, but they aren't key to the slowdown story.
And with entitlements, if anything, continuing to grow as a portion of the U.S. economy, does Greenspan today see—as the "Maestro" famously did back in the mid-1990s—any signs of productivity picking up?
"I've seen no evidence of it," he said.