My colleague Steve Liesman has published an analysis of the government's quarterly GDP reports. Summed up, he found a large, persistent error between the initial and final GDP estimates. Not only are the numbers off significantly, the government even gets the direction of growth wrong 30 percent of the time!
Why is economic forecasting still so bad? Many feel that the tools being used to make the forecasts are simply inadequate. There are people trying to bring economics into the 21st century. They're using big data to make "the dismal science" less dismal.
One of them is Giselle Guzman, CEO of Now-Cast Data Corp, which is applying machine learning, big data, and crowdsourcing to economic forecasting. Her team is trying to fuse economics and computer science.
"From my perspective, both the government and most human economists have the same problem—they are using old methods from the 1950's!, Guzman told me.
Giselle has a PhD in finance and economics and worked for 17 years with Nobel Prize winner Lawrence Klein, a pioneer of modern economic forecasting. She was also a research assistant to Nobel Prize winner Joseph Stiglitz.
I wrote about her—and her track record—in Trader Talk in October.
On her website, subscribers can pull up a dashboard of roughly 4,000 indicators on virtually every kind of economic activity, which provides tables of all the predictions and a graph of the predictions placed over the actual numbers.
The coolest part is that you can watch predictions for several indicators, including the Consumer Price Index, the Producer Price Index, and consumer spending, update in real time as part of a program called LiveWire. And I mean real time: by the second.
Think about that. The Fed updates its estimates once a quarter, and most federal economic data comes out once a month.
What's the "secret sauce?" Guzman tells me there are several components:
1) A "wisdom of crowds" approach. The Internet is the perfect vehicle for exploring what people are really worried about and how they really feel, rather than what they say they are worried about or feel.
2) Sophisticated algorithms that quantify those behaviors, but also calculate the economic impact of geopolitical events, natural disasters, and even acts of terrorism.
3) The constant improvement of forecasts using machine learning. It's like a self-learning system. The inputs are constantly changing. Traditional economists have a "model" that is static. This is dynamic. The formulas are changing depending on what happens. Think of data as evolving, as a living organism. As the data evolves, the relationships between data change.
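To make the third point concrete: the contrast between a static model and one whose coefficients keep adapting can be sketched with a standard technique, recursive least squares with a "forgetting factor" that discounts old data. This is my own minimal illustration of the general idea, not Now-Cast's actual method, which is proprietary. The variable names and the simulated regime change are invented for the example.

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One recursive-least-squares step: fold a new observation (x, y)
    into the coefficient estimate theta, discounting old data by lam < 1
    so the model can "forget" relationships that no longer hold."""
    x = x.reshape(-1, 1)
    K = P @ x / (lam + x.T @ P @ x)         # gain: how much this point moves the fit
    theta = theta + (K * (y - x.T @ theta)).ravel()
    P = (P - K @ x.T @ P) / lam             # update the inverse-covariance proxy
    return theta, P

# Simulate a relationship whose true slope changes halfway through,
# the way relationships between economic series can shift over time.
rng = np.random.default_rng(0)
theta = np.zeros(2)                          # [intercept, slope], start agnostic
P = np.eye(2) * 100.0
for t in range(500):
    true_slope = 1.0 if t < 250 else 2.0     # regime change at t = 250
    x = np.array([1.0, rng.normal()])
    y = true_slope * x[1] + 0.1 * rng.normal()
    theta, P = rls_update(theta, P, x, y)

print(theta)  # the slope estimate has drifted toward the new regime (~2.0)
```

A static model fit once on all 500 points would land somewhere between the two slopes; the recursive version tracks the current regime because the formulas themselves keep changing as new data arrives, which is the distinction Guzman is drawing.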
Guzman claims her predictions are far more accurate than traditional economic forecasting methods.
For example, the consensus estimate for the January Consumer Price Index (CPI) was negative 0.1 percent month over month. Now-Cast was at 0.0 percent. It came in at 0.0 percent.
The February CPI consensus was negative 0.3 percent month over month; Now-Cast was at negative 0.14 percent. The reported number was negative 0.16 percent.
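Put in terms of absolute errors, the two CPI prints above work out like this (a quick check using only the figures cited in the column):

```python
# Absolute forecast errors for the two CPI prints cited above (percent, m/m).
months = {
    "January": {"consensus": -0.1, "nowcast": 0.0, "actual": 0.0},
    "February": {"consensus": -0.3, "nowcast": -0.14, "actual": -0.16},
}
for month, f in months.items():
    err_c = abs(f["consensus"] - f["actual"])   # how far off the Street was
    err_n = abs(f["nowcast"] - f["actual"])     # how far off Now-Cast was
    print(f"{month}: consensus off by {err_c:.2f}, Now-Cast off by {err_n:.2f}")
```

On these two data points the consensus missed by 0.10 and 0.14 percentage points while Now-Cast missed by 0.00 and 0.02, though two prints are far too small a sample to judge a forecasting method.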
When I wrote about Now-Cast in October, I concluded by saying that what I would like to see is analysis of her forecasts against the predictions of the top strategists on Wall Street.
That hasn't happened yet, but a post on the company's website claims to have nailed more than 40 economic indicators for February.
"Now we have enough data to treat economics like a real science, not a pseudo-science," Guzman says.
Programming note: Guzman will speak on Closing Bell at 3:10pm ET Thursday.