Data Economy

Google can't solve the Keystone pipeline test

A weld inspector during construction of the Gulf Coast pipeline in Prague, Okla.
Daniel Acker | Bloomberg | Getty Images

Some problems of society lend themselves to a savior's arriving in the form of processing power. Google search activity can help catch deadly viruses before they spread, satellites and supercomputers can monitor deforestation, and data analysis can identify and eliminate inefficiencies in legacy bureaucracies.

Ask an algorithm a straightforward question that requires massive amounts of data and it can help find an answer. But what about the seemingly straightforward test President Barack Obama has mandated for the Keystone XL Pipeline, which would take fossil fuels from the Canadian oil sands to the Gulf Coast?

"Our national interest will be served only if this pipeline does not significantly exacerbate the climate problem," he said in June. "The net effects of climate impact will be absolutely critical to determining whether this project will go forward. It is relevant."

Relevant, but involving a good deal of guesswork. Even though the Intergovernmental Panel on Climate Change recently released its most conclusive report to date, it's impossible to enter Obama's imperative into a machine and get a "correct" answer.

Much of the discussion about Keystone, from backers and critics alike, is still rhetoric wrapped in conviction and difficult to assess quantitatively.


Shortly after Obama's June speech, energy consultant IHS issued a report saying it had examined his definition and concluded that the pipeline passed the test.

Oil and gas investor T. Boone Pickens told CNBC last week that the Keystone XL Pipeline would make OPEC obsolete. No mincing words from a capitalist known for his bold, if not always accurate, predictions (consider the NatGas Act).

Hedge fund manager Tom Steyer, recently profiled in The New Yorker as the unlikely big money behind the anti-Keystone XL campaign, seems confident that environmentalists have the upper hand. In the same article, former White House chief of staff John Podesta put the odds at 50-50.


Environmental consulting can uncover surprising inefficiencies in a business, but capturing the carbon question beyond one corporation is still elusive for those seeking a single analytical framework.

Bertrand Revenaz, chief product officer at environmental impact consulting firm Ecometrica, gives the example of a potato chip manufacturer that was using an "insane amount" of dryers. Analysts were able to determine that farmers, paid by weight, were spraying the potatoes with water.

Carbon accounting is no potato problem.

Revenaz said Ecometrica's database contains 40,000 unique emissions factors and 80,000 potential conversions, and even that falls short of complete global coverage. Still, when it comes to measuring carbon emissions and picking the right emissions factors, he added, "it becomes a data analysis problem."
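Carbon accounting of the kind Revenaz describes generally boils down to multiplying activity data by the appropriate emissions factor and summing the results. A minimal sketch of the idea (the activities and factors below are illustrative placeholders, not values from Ecometrica's database):

```python
# Minimal sketch of factor-based carbon accounting. All factors and
# activity amounts are made-up illustrations, not sourced values.
EMISSION_FACTORS = {
    # kg CO2e per unit of activity
    ("air_travel", "km"): 0.15,
    ("electricity_uk", "kWh"): 0.25,
    ("natural_gas", "kWh"): 0.18,
}

def footprint(activities):
    """Sum kg CO2e over (activity, unit, amount) records.

    In practice, choosing the right factor for each record is the
    hard part: factors vary by geography, year, and methodology.
    """
    total = 0.0
    for activity, unit, amount in activities:
        total += EMISSION_FACTORS[(activity, unit)] * amount
    return total

ledger = [
    ("air_travel", "km", 120_000),       # employee flights
    ("electricity_uk", "kWh", 50_000),   # office power
]
print(f"{footprint(ledger):,.0f} kg CO2e")  # 30,500 kg CO2e
```

The multiplication is trivial; the 40,000-factor database exists because the lookup table above has to cover every fuel, grid, and geography a client operates in.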


"Whenever there is a big data story, all of it glosses over the nitty-gritty of how getting data to work together in a comprehensive system is a pretty big deal," Revenaz said. "If you are collecting from loyalty cards it's one thing, but creating new data is a bit of a hurdle."

He said carbon data already exists within most companies (the number of airline miles employees fly each year, for example), but it's hard enough to account for all the operations of a major corporation and its satellite offices across geographies, let alone a question like Keystone, which is likely to end up caught between competing analyses.

"All stopping points in analysis are not created equal," Revenaz said. "It becomes essentially competing groups debating the metrics of their analysis."

The only consistent data point, he said, might be how much carbon is contained in the Canadian oil sands.


For Dan Kessler, a communications manager and staunch opponent of the pipeline, that's all that's needed.

"Suncor in Canada is the biggest producer, at 436,000 barrels a day," he said. "If you extrapolate that to 5.9 billion barrels by 2050, those barrels contain 3.4 gigatons of CO2. That's 0.6 percent of the remaining carbon emission budget per the IPCC guidelines for 2050, but it's just one company."
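Kessler's figures hang together arithmetically. A back-of-envelope check, using only the numbers he quotes (the per-barrel emissions factor below is derived from those numbers, not independently sourced):

```python
# Back-of-envelope check of the figures quoted above. All inputs are
# Kessler's numbers; the per-barrel factor is implied by them.
DAILY_BARRELS = 436_000        # Suncor production, barrels/day
CUMULATIVE_BARRELS = 5.9e9     # extrapolated total by 2050
TOTAL_CO2_TONNES = 3.4e9       # 3.4 gigatons of CO2

# Years of production implied by the extrapolation
years = CUMULATIVE_BARRELS / (DAILY_BARRELS * 365)

# Implied well-to-wheels emissions per barrel
tonnes_per_barrel = TOTAL_CO2_TONNES / CUMULATIVE_BARRELS

print(f"{years:.0f} years of production")       # 37 years of production
print(f"{tonnes_per_barrel:.2f} t CO2/barrel")  # 0.58 t CO2/barrel
```

Roughly 37 years at current output gets you from 2013 to 2050, and the implied factor of about 0.58 tonnes of CO2 per barrel is the kind of well-to-wheels figure oil sands critics typically cite.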

Overall, Canada wants to increase production from 2 million to 6 million barrels a day, with Keystone the critical component.


But even Kessler has trouble responding to Obama.

"My guess is he says no because he is committed to the climate problem—it's a legacy issue," Kessler said. (That's Steyer's argument in The New Yorker piece.) "On data ... I would say the oil pipeline by definition creates more emissions, and on a common sense test say no to it," he added.

That answer doesn't satisfy Michael Lazarus, senior scientist at the Stockholm Environment Institute, who recently co-wrote a paper on the various approaches to carbon emissions accounting and the psychology behind each approach.

Take another energy transport debate, in British Columbia, where there is a battle over exporting liquefied natural gas to Asia. Liquefying the gas and keeping it refrigerated is energy-intensive, but hydropower is abundant in British Columbia, making the process less carbon-intensive, and China's government could conceivably favor natural gas over coal. "That's where it gets tricky," Revenaz said.

Lazarus and his colleagues wrote: "In general, as with Keystone XL, the few analyses that quantify emissions impacts of adding or removing fossil fuel supplies from the market diverge widely in perspectives taken, methods used, and results obtained."

"There are all sorts of possibilities, and when it comes to data and predictability with fossil fuel assets there are large uncertainties and deliberate misinformation," Lazarus said. The global oil market leads to supply misinformation, and the proprietary nature of most fossil fuel assets makes sound analysis difficult.

"We can do it within certain ranges," he said, "but I think when trying to make informed decisions on this, there isn't abundant data. We don't understand emissions consequences as fully as we might like to get to that level of precision."

In other words, data can stop where you want them to stop, especially with an issue such as Keystone, which involves values-based judgments.

Are emissions measured only in terms of the pipeline's impact as a logistics project, or based on oil sands drilling versus alternate extraction methods; on the viability of other projects if Keystone XL is rejected; on the argument that the oil will "get to China anyway"; on the cost of what it will replace in the energy supply; or on the amount of CO2 currently trapped in the oil sands, which critics say should simply never come out?

According to economist Julie Gorte, senior vice president of sustainable investing at Pax World Management, Keystone reflects a basic market limitation.

"Ask any corporate counsel what is material and it's like nailing jelly to a wall," she said. "It's always a judgment call, some things more obvious than not."

By Eric Rosenbaum