January 2021, Vol. 248, No. 1

Features

Early Adopters of AI Reaping Integrity Management Benefits

By Danielle C. Roberts, Energy Writer  

Robots that inspect pipelines are continually improving – at the same time, they are gathering ever-increasing amounts of data.  

In fact, today it’s estimated that every 150,000 miles of pipeline generates about 10 terabytes of data on everything from pressure and temperature to flow.   

To cope with this increasing data volume, experts say, big data technologies – combined with the potential of artificial intelligence (AI) and machine learning – are a must for robust, consistent and accurate analysis of pipeline damage, for improving data quality, and not only for predicting failures but also for recommending actions to maximize pipeline performance.

While the traditionally conservative oil and gas industry may not be among AI’s first adopters, experts agree that it is taking the first steps of its journey.

As the whole world has begun to lean on technology and virtual tools during the pandemic, COVID-19 has also accelerated the adoption of AI across the oil and gas value chain, said Vinod Raghothamarao, director of consulting for energy transition and cleantech at IHS Markit.

For the early adopters, AI also offers an alluring prospect: a tool that could bring operators ever closer to the elusive goal of zero defects.   

What AI Can Do  

The task of combing over miles of pipeline, inch by inch, takes patience, attention to detail and a methodical mind. It also takes time – typically weeks or months for humans to accomplish a single pipeline inspection.  

But all of these tasks have the potential to be automated through AI: defining areas where suspicious anomalies are located; classifying defect types; identifying the length, width and depth of an anomaly; and improving the quality of data by eliminating outliers, which makes analysis easier and more accurate. Human engineers are then able to focus their time on the Level 2 analyses that require logical thinking and problem-solving.
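As a rough illustration of the data-quality piece alone, the sketch below flags outlier depth readings in a handful of made-up ILI anomaly records using a simple median-absolute-deviation rule; the field names and threshold are assumptions for illustration, not any vendor’s actual method.

```python
# Minimal sketch of automated data-quality screening for ILI anomaly records.
# Field names and the MAD-based threshold are illustrative assumptions, not a
# vendor's actual method.
from statistics import median

anomalies = [
    {"id": 1, "depth_pct": 12.0},   # metal-loss depth as % of wall thickness
    {"id": 2, "depth_pct": 14.5},
    {"id": 3, "depth_pct": 13.1},
    {"id": 4, "depth_pct": 68.0},   # likely sensor spike / outlier
    {"id": 5, "depth_pct": 11.8},
]

depths = [a["depth_pct"] for a in anomalies]
med = median(depths)
mad = median(abs(d - med) for d in depths)  # median absolute deviation

# Flag readings far from the median (threshold chosen purely for illustration)
for a in anomalies:
    a["outlier"] = abs(a["depth_pct"] - med) / (mad or 1.0) > 3.5

print([a["id"] for a in anomalies if a["outlier"]])  # -> [4]
```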

Automating these laborious tasks and developing new methodologies has clear efficiency benefits, although the multiple parameters involved can make the time saved difficult to quantify. But the monetary benefits could be enormous. According to IHS Markit, operators report an average of about 10 days per year of downtime or reduced throughput due to mechanical issues, integrity inspection or other causes, along with an average daily financial impact of $230,000 from downtime or reduced throughput. Multiplied out, that adds up to an average downtime cost of $2.3 million per year.
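Expressed as a quick back-of-the-envelope calculation using those IHS Markit averages:

```python
# Back-of-the-envelope annual downtime cost from the figures quoted above
downtime_days_per_year = 10
daily_impact_usd = 230_000

annual_downtime_cost = downtime_days_per_year * daily_impact_usd
print(f"${annual_downtime_cost:,} per year")  # -> $2,300,000 per year
```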

“Overall, midstream pipeline operators report an average of $27 million a year on all forms of pipeline maintenance and repair costs,” said Raghothamarao. “According to recent estimates, effective prediction of pipeline leakage incidents using AI and digital transformation could avoid a potential cost of nearly $30 billion per annum for U.S. pipeline companies.”  

AI in Practice  

Six years ago, Xavier Valero, head of big data and AI at NDT Global, was tasked with examining what, if any, business and customer value could be gained through AI.   

It was a big vision, and the team started small, with a simple solution that could be scaled. At the same time, they built a platform able to store large amounts of data for processing.   

For Valero, starting small was key, since the oil and gas industry is cautious about adopting new processes and technologies.

“Companies should not be afraid of [starting small]. There is no need to wait years to collect petabytes of data to kick off activities,” he said. “It’s a good way to start because you learn as you go.” A pilot approach, coupled with gradual adoption and scaling, can help companies gain experience as well as buy-in on what AI can accomplish through data, and then gradually develop more complex predictive models.   

NDT Global is now actively using AI, focusing on specific use cases such as improving data quality, leveraging data analysis, and developing new methodologies for improved sizing accuracy. Its integrity management program also benefits from reconstructions of pipeline wall defects and corrosion through AI.  

With the use of technological tools exploding in nearly every industry, Valero said there is much excitement but also caution.

“We have all these opportunities because the market is growing, and available technologies – the AI solutions – are manifold. We are adapting them to ILI [inline inspection], but always keeping in mind the boundaries and the constraints,” he said. “First, we develop models that improve our efficiency, but at no point compromise the safety, so the first boundary condition is zero risk for pipeline integrity.  

“Second, we adopt responsible AI and explainable AI. That’s really important for our customers. We don’t adopt black boxes that cannot explain the analysis, but rather we use methodologies where we can reproduce the results and really go to the customer and explain how the decision was made,” he said.   

For other oil and gas companies embarking on digital transformation, Valero breaks down that journey into two steps. First is to understand the data sources available and to collect the data, placing it in a central platform where it can be accessed – yet at the same time with controls in place to protect sensitive data.  

Next is putting a team in place to make good use of the data; at the outset, that would involve building predictive models to improve pipeline integrity management, but the ultimate goal is to develop a prescriptive model that “not only predicts but also recommends the best way to go forward,” said Valero.
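As a hedged sketch of those two steps (data pooled in one place, then a first simple predictive model on top of it), consider the toy example below; the source names, fields and linear growth model are illustrative assumptions, not NDT Global’s actual platform.

```python
# Hedged sketch of the two steps described above: (1) pull data sources into
# one place, (2) fit a first, simple predictive model on top of them.
# Source names, fields and the linear growth model are illustrative assumptions.
from statistics import linear_regression  # Python 3.10+

# Step 1: consolidate records from different (hypothetical) sources
ili_runs = [{"year": 2015, "max_depth_pct": 18}, {"year": 2018, "max_depth_pct": 24}]
field_surveys = [{"year": 2020, "max_depth_pct": 29}]
central_store = sorted(ili_runs + field_surveys, key=lambda r: r["year"])

# Step 2: a first predictive model, here linear corrosion growth over time
years = [r["year"] for r in central_store]
depths = [r["max_depth_pct"] for r in central_store]
model = linear_regression(years, depths)

projected_2025 = model.slope * 2025 + model.intercept
print(f"Projected max depth in 2025: {projected_2025:.1f}% of wall thickness")
```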

Ultimately, though, the journey to AI is data-driven.

Going All In  

In the 1990s, Tim Edward and Dwayne Kushniruk worked together at a company called Baseline Technologies to help solve the problem of geolocation of pipeline defects for the oil and gas industry using 3-D modeling. In 2008, Edward dusted off the software in hopes of rebuilding it with the technology of the day in mind. But the necessary computing power wasn’t available.   

Then cloud computing arrived a few years later. Kushniruk decided to go all in, reorganizing his company and reaching back out to his friend to resurrect the old idea.   

That, in short, is how OneBridge Solutions was founded in 2015; one year later, it became the only oil and gas technology startup to be accepted into Microsoft’s first Accelerator incubator for machine learning and data science.

At the outset, OneBridge set itself three challenges. “With AI, everything starts with an experiment, and you have a hypothesis, then you run a number of experiments on the data to determine whether or not you can achieve the outcome that’s desired,” said Edward.   

The first challenge was using machine learning to solve the labor issue of ingesting legacy data sets into a database.   

“The industry was challenged by the herculean task of normalizing ILI data from the vendors,” explained Edward. “There are 40-plus vendors out there in the industry today, and all of them have decades of data. As you can imagine, those formats have changed with every new report.”  
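A minimal sketch of what that normalization step involves, assuming two made-up vendor layouts and a hypothetical common schema (real vendor formats, units and mappings vary widely):

```python
# Hedged sketch: mapping two hypothetical vendor ILI report layouts onto one
# common schema. Real vendor formats, units and column names vary widely.
COLUMN_MAPS = {
    "vendor_a": {"Log Dist (ft)": "odometer_ft", "O'clock": "clock", "Depth %": "depth_pct"},
    "vendor_b": {"odometer_m": "odometer_ft", "orientation_deg": "clock", "metal_loss": "depth_pct"},
}

def normalize(record: dict, vendor: str) -> dict:
    """Rename vendor-specific columns to the common schema and convert units."""
    mapping = COLUMN_MAPS[vendor]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    if vendor == "vendor_b":
        out["odometer_ft"] = out["odometer_ft"] * 3.28084     # metres -> feet
        out["clock"] = round(out["clock"] / 30) % 12 or 12    # degrees -> clock position
    return out

print(normalize({"Log Dist (ft)": 3600.0, "O'clock": 10, "Depth %": 22}, "vendor_a"))
print(normalize({"odometer_m": 1036.3, "orientation_deg": 300, "metal_loss": 22}, "vendor_b"))
```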

Second was resolving inherent differences in tool runs due to orientation and linear issues, even in identical pieces of pipe, to accurately geolocate problem spots.  

“That was what we called the pit-to-pit challenge,” said Edward. “We wanted to make sure we were contextualizing the exact spot on the pipe with all of the recorded data.” When one run might find a defect at 10 o’clock at 3,600 feet and a second would locate it at 12 o’clock at 3,400 feet, “the challenge was to orientate those things but do that using a machine process.”  
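One hedged way to picture the machine side of that pit-to-pit matching is the toy alignment below: estimate the systematic odometer and clock offsets between two runs, then pair each anomaly with its nearest counterpart. It is an illustration only, not OneBridge’s actual machine learning process.

```python
# Toy sketch of pit-to-pit matching between two ILI runs: estimate a constant
# odometer and clock offset, then pair each anomaly with its nearest neighbour.
# An illustration only, not OneBridge's actual machine learning process.
from statistics import median

run_2015 = [{"id": "A", "odo_ft": 3600, "clock": 10}, {"id": "B", "odo_ft": 5210, "clock": 3}]
run_2020 = [{"id": "X", "odo_ft": 3400, "clock": 12}, {"id": "Y", "odo_ft": 5012, "clock": 5}]

# Estimate the systematic offsets between the runs (this toy version assumes
# both lists happen to be in corresponding order for the estimate)
odo_shift = median(b["odo_ft"] - a["odo_ft"] for a, b in zip(run_2015, run_2020))
clock_shift = median(b["clock"] - a["clock"] for a, b in zip(run_2015, run_2020))

def match(anomaly: dict) -> dict:
    """Find the later-run anomaly closest to this one after removing the offsets."""
    return min(
        run_2020,
        key=lambda c: abs(c["odo_ft"] - anomaly["odo_ft"] - odo_shift)
        + 10 * abs(c["clock"] - anomaly["clock"] - clock_shift),
    )

for a in run_2015:
    print(a["id"], "->", match(a)["id"])   # A -> X, B -> Y
```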

The third was to provide the information to the operator in a useful format – “how we can demonstrate this data that eclipses the Excel experience,” said Edward.   

And the overarching challenge, he said, was to be able to accomplish all of this with zero human interface and remove these backbreaking, time-consuming tasks from the operator.   

ASME B31.1a states that, in the absence of enough data, operators and ILI vendors must assume the worst-case scenario in pipeline management – typically a half-life corrosion growth model or some other fixed corrosion growth rate. The industry standard is to select a few anomalies on a pipe for in-depth analysis.
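As a hedged illustration of how that half-life assumption plays out, with made-up numbers: the anomaly’s full measured depth is assumed to have accumulated over half the pipeline’s service life, which yields a deliberately conservative growth rate.

```python
# Illustrative half-life corrosion growth calculation (all numbers are made up).
# The full measured depth is assumed to have accumulated over half the
# pipeline's service life, giving a deliberately conservative growth rate.
pipe_age_years = 40
measured_depth_pct = 30            # depth as % of wall thickness at last ILI
years_to_next_inspection = 7

growth_rate = measured_depth_pct / (pipe_age_years / 2)    # % of wall per year
projected_depth = measured_depth_pct + growth_rate * years_to_next_inspection

print(f"Assumed growth rate: {growth_rate:.2f} % of wall per year")    # 1.50
print(f"Projected depth at next inspection: {projected_depth:.1f} %")  # 40.5
```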

OneBridge, on the other hand, applies the machine learning process to conduct pit-to-pit matching on all anomalies, which gives operators a deeper understanding of what’s happening within their pipelines, and the ability to detect potentially problematic anomalies more accurately and sooner.   

With cloud computing, that comparison takes just less than an hour, said Edward, compared to weeks spent comparing Excel spreadsheets, “which frees up those engineering hours to really do that Level 2 analysis … basically as soon as you get the log. It really changes the perspective of engineering and how it’s done on the pipeline.”

Even further, the technology is then able to take the equivalent of a million and a half lines of Excel records to “paint” anomalies onto a 3D picture of a pipeline – and “the information that the human can absorb from that is significantly better,” said Edward.   
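That “painting” step amounts to mapping each flat record, meaning its distance along the pipe and its clock position, onto the pipe’s cylindrical surface; below is a hedged geometric sketch with made-up dimensions and field names.

```python
# Hedged sketch of projecting flat anomaly records onto a 3D pipe surface.
# Pipe dimensions and record fields are made up for illustration.
import math

PIPE_RADIUS_FT = 1.0   # a 24-inch diameter line, for illustration

def to_3d(odometer_ft: float, clock: float) -> tuple[float, float, float]:
    """Convert (distance along pipe, clock position) to x, y, z on the pipe wall."""
    theta = (clock % 12) / 12 * 2 * math.pi   # 12 o'clock = top of pipe
    x = odometer_ft                           # distance down the line
    y = PIPE_RADIUS_FT * math.sin(theta)
    z = PIPE_RADIUS_FT * math.cos(theta)
    return x, y, z

print(to_3d(3600.0, 10))   # a defect at 3,600 feet, 10 o'clock
```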

While that image may have a huge wow factor, Edward describes it just as another process. “It’s not so much a technology process – it’s more a cultural process where people have to figure out how to use this new tool,” he said. “And that’s really what we’ve built, it’s a new tool … It’s like moving from a slide rule to Excel and moving from Excel to cloud compute.”   

Moving to Adoption   

For a company, moving to adopt AI is really about management of change. There’s often initial skepticism, even outright revolt – then reluctant use and finally engagement after witnessing the quantifiable impact on operations.

Trusting a computer algorithm to properly recognize and code a defect isn’t going to happen overnight, agrees Raghothamarao. “Overall industry acceptance of such a new and dynamic technology will also be a hurdle that AI will have to overcome,” he said. “Given the resistance to change by different stakeholders in the value chain, the acceptance will definitely take time.  

“Reduced spending due to low oil prices combined with opposition to change management and not necessarily immediate returns is a hurdle to widespread adoption, but once the companies begin to realize the tangible benefits, then the adoption rate will steadily increase,” he said.  

While AI has potential for cost savings, pipeline inspection will always remain a high-cost budget line item. “At the end of the day, we are talking about a huge business impact,” said Valero. “We are not an industry that is at the forefront of AI adoption, but I think that we are catching up. Midstream operators are having digital transformation plans, and they all have AI in their heads.”  

OneBridge has also seen some cost savings through an improved dig-to-repair ratio. Using standard techniques, that ratio is typically 50% effective – OneBridge has been able to move that up to 70%.   

Ultimately, however, it’s all about the data, and whether that’s through the cloud or other controls that ensure privacy, experts continue to agree that data sharing is key for companies to become data-driven.  
