Finding new sources of oil underground is an expensive and risky undertaking. Now IBM is working with energy company Repsol to look for ways in which new cognitive computing techniques could help reduce the uncertainty and improve production.
“This is a whole new time where people and machines will be working better together,” said Brian Gaucher, IBM Research senior manager and research staff member at the company’s Cognitive Environments Laboratory.
The oil and gas industry is fertile ground for testing new kinds of decision-making applications. Companies such as Repsol can spend up to a billion dollars to find and then acquire a new location to drill for oil. Once a potential location is found, the company can spend hundreds of millions more to drill.
Today, only one in four or five wells returns a profit, so any technique that improves those odds could save a lot of money. As one of the smaller global energy companies, Repsol looks for any creative edge it can find.
The two companies are collaborating on a pair of cognitive-computing applications. One application will help Repsol engineers determine the best potential oil fields for the company to acquire. The other will look for ways to get the most oil from existing properties.
For these applications, the researchers plan to use a variety of cognitive computing techniques to filter and analyze data from multiple sources. They also will investigate new forms of user interface, so managers and engineers can interact with data in more intuitive and natural ways.
Cognitive computing technology, for instance, could do the preliminary work of reviewing seismic imaging data, relieving engineers of hours of study. It could also analyze thousands of reports, aggregating the content and drawing conclusions. A cognitive computing system could digest and fuse not only geological and scientific data, but pertinent political and economic indicators as well.
“Having the ability to mine, in real time, news articles, and look at trends and then pull those into your decision making process could reduce uncertainty and give you tremendous insights into what is happening right now,” Gaucher said.
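The article doesn't describe how such news mining would work, but the basic idea of pulling trends out of a stream of articles can be sketched as a simple keyword-frequency tally over incoming headlines. All term lists and headlines below are illustrative, not Repsol's or IBM's:

```python
from collections import Counter

# Hypothetical watch list: terms whose frequency in recent news might
# feed an exploration or acquisition decision.
WATCH_TERMS = {"sanctions", "pipeline", "opec", "drilling", "spill"}

def trend_counts(headlines):
    """Tally watched terms across a batch of headlines."""
    counts = Counter()
    for headline in headlines:
        tokens = set(headline.lower().split())
        counts.update(tokens & WATCH_TERMS)
    return counts

headlines = [
    "OPEC weighs output cut as prices slide",
    "New pipeline project faces sanctions risk",
    "Offshore drilling permits approved",
]
print(trend_counts(headlines).most_common())
```

A production system would of course use far richer language analysis, but even this toy version shows how unstructured news text can be reduced to signals a decision-maker can watch in real time.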
One technique the group is looking at is the ability to build applications on the fly from reusable components, in order to meet some particular need of the moment.
Computing services of the future may not be “monolithic” but rather multi-agent distributed systems, from which different context-aware applications can be assembled, Gaucher said.
“Some of these programs could live forever. Or they may live only for an hour,” Gaucher said.
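IBM hasn't published how these assemble-on-demand systems are built, but the pattern Gaucher describes, reusable components composed into short-lived applications, can be sketched with a small registry of callable "agents." Every name here is hypothetical:

```python
# Hypothetical sketch: each agent is a plain callable registered under a
# name; a pipeline is assembled at runtime from whichever agents the
# task of the moment needs, then discarded.

AGENTS = {}

def register(name):
    """Decorator that adds a callable to the shared agent registry."""
    def wrap(fn):
        AGENTS[name] = fn
        return fn
    return wrap

@register("normalize")
def normalize(records):
    """Clean up raw string records."""
    return [r.strip().lower() for r in records]

@register("dedupe")
def dedupe(records):
    """Drop duplicates while preserving order."""
    return list(dict.fromkeys(records))

def assemble(*agent_names):
    """Compose named agents into a short-lived pipeline."""
    steps = [AGENTS[n] for n in agent_names]
    def pipeline(data):
        for step in steps:
            data = step(data)
        return data
    return pipeline

# Build an application for the need of the moment, use it, let it go.
app = assemble("normalize", "dedupe")
print(app(["  Brent  ", "brent", "WTI"]))  # ['brent', 'wti']
```

The registry is what makes the system non-monolithic: new agents can be registered by independent services, and different context-aware applications, some long-lived, some lasting an hour, are just different compositions over the same pool of components.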