This guest blog post is part of a series written by Edward J. Farmer, PE, ISA Fellow and author of the new ISA book Detecting Leaks in Pipelines. To download a free excerpt from Detecting Leaks in Pipelines, click here. If you would like more information on how to purchase the book, click this link. To read all the posts in this series, scroll to the bottom of this post for the link archive.
In Detecting Leaks in Pipelines, I discuss a concept referred to as “coherence” as being important in interpreting the underlying message within a set of observations (refer to page 149). My attraction to the coherence concept likely stems from my BSEE studies, in which we used it extensively in assessing the “information” within communication “signals.”
It also showed up in some military intelligence work involving analyzing the likely outcome indicated by multiple sets of observations. In the first case, the usual motivation was assessing whether a communications stream contained particular characteristics. The classic textbook problem is detecting a pulse in an otherwise stochastic stream.
In military intelligence work we were usually trying to discern whether a stream of observations meant anything of consequence, or, if they did, which of several possible “consequences” was most likely. As it turns out, both points of view are useful in thinking one’s way through pipeline leak detection and many similar process management and control issues.
Coherence of a set of observations suggests a logical “fitting together,” implying a common source, a common purpose, the result of common processing, or all of them. Establishing the interconnected linkage uniting what appear to be puzzle parts into some discernible picture is the analysis process. It can be tedious and opaque, and success often seems to be the result of an “aha!” experience.
Observations are the set of things required to discern likely (or sometimes pertinent) outcomes. If the problem were finding the beautiful landscape image in a sea of jigsaw puzzle parts, one might begin with some sort of algorithmic approach, perhaps putting all the pieces with straight edges into a pile, then sorting the other pieces based on some persistent characteristic, such as color. As the observations are processed, categorized, and fit together insofar as possible, elements that could be parts of several pictures emerge.
As more fitting together is done by various methods, the picture improves and begins to show a small set of likely outcomes. Eventually, confidence reaches a comfortable level and fitting the remaining minor pieces together is simple and declines in value – you see and experience the result.
There are almost always “outliers,” perhaps a black piece of the right shape and apparent connection methodology that would come from a discernible area if it were blue. That might invoke thinking about the “blue area” assumption or perhaps the unique shape of the subject piece. Maybe it’s from another region, or maybe the key to confluence of other regions? Who knows at this point? An open mind and a logical process will soon make it all perfectly (or at least statistically adequately) clear.
It should be easy to close one’s eyes at this point and see a collection of Venn diagrams, all leading to the ultimate categorization of the pieces, then groups of pieces, and finally convergence on some likely image. Some process work is much easier than puzzles. In real life, especially in intelligence work, there may be apparent reasons why things seem to go one way rather than another. The challenge becomes: What do we need to observe in order to establish another categorization criterion?
As I’ve previously discussed, in process control and analysis we usually have but a few possible outcomes. Sometimes it's easier to discern whether a particular set of conditions suggests that a situation we should worry about is emerging. It quickly becomes practical to focus effort on the things that matter, following the trail marked by the indicators as we observe and analyze them.
Consider a reach of pipe and what we observe about it. If we know upstream and downstream pressure and flow, we can make some assumptions, reading by reading, about what is happening on that piece of pipe.
- Matched flow indicates the line is stable and free of transients.
- Inflow greater than outflow at decreasing pressure suggests a leak.
- A decrease in pressure and flow at the downstream end warns there may be a problem there.
- An increase in upstream flow with a decrease in pressure suggests a possible leak.
- In a time-series of data, the onset of a decrease at an end suggests the precipitating event (e.g., the leak) may be closer to that observation than the “other end” of the line.
- The time between when a change is seen at opposite ends of the line can indicate the precipitating event’s actual location. If the times are the same the event is near the middle of the segment. Otherwise, it is calculably closer to the end where it is first seen.
There is more, but you get the idea. Also, remember that the quality, and thus the dependability, of our conclusions improves as the stream of readings grows.
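As a loose illustration of the reading-by-reading checks in the list above (not the author's fielded algorithm), the logic can be sketched in Python. The function name, the inputs, and the flow-balance tolerance are assumptions made for this example:

```python
def classify_reading(q_in, q_out, dq_in, dq_out, dp_up, dp_dn, tol=0.01):
    """Interpret one set of observations on a pipe segment.

    q_in, q_out   : upstream and downstream flow rates
    dq_in, dq_out : change in each flow since the previous reading
    dp_up, dp_dn  : change in pressure at each end (negative = falling)
    tol           : flow-balance tolerance, as a fraction of inflow
    """
    # Matched flow: the line is stable and free of transients.
    if abs(q_in - q_out) <= tol * max(abs(q_in), 1e-9):
        return "stable"
    # Inflow greater than outflow at decreasing pressure: leak suspected.
    if q_in > q_out and dp_up < 0 and dp_dn < 0:
        return "possible leak"
    # Falling flow and pressure at the downstream end: trouble there.
    if dq_out < 0 and dp_dn < 0:
        return "possible downstream problem"
    # Rising upstream flow with falling pressure: another leak signature.
    if dq_in > 0 and dp_up < 0:
        return "possible leak"
    return "ambiguous: gather more readings"
```

For instance, `classify_reading(100.0, 92.0, 0.0, -8.0, -2.0, -3.0)` (inflow exceeding outflow while pressure falls at both ends) flags a possible leak, while matched flows report a stable line.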
A pipeline hydraulic event (a change in flow or in pressure) propagates along the pipeline at the speed of sound in the fluid in the pipe. The speed of sound can be estimated or can be updated automatically as needed. The transit time is easily calculated as the length of the line divided by the acoustic velocity.
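As a numerical illustration of the transit-time calculation, and of locating an event from the arrival-time difference described earlier, here is a small sketch. The 10 km segment length and the roughly 1000 m/s sound speed are assumed numbers for the example, not values from the book:

```python
def transit_time(length_m, sound_speed_mps):
    """Time for a hydraulic event to traverse the whole segment."""
    return length_m / sound_speed_mps

def leak_location(length_m, sound_speed_mps, dt_s):
    """Distance from end A of an on-segment event.

    dt_s is (arrival time at B) - (arrival time at A).  With the event
    at distance x from A:  dt = ((L - x) - x) / c,  so  x = (L - c*dt) / 2.
    """
    return (length_m - sound_speed_mps * dt_s) / 2.0

# Assumed numbers: a 10 km segment, sound speed ~1000 m/s.
print(transit_time(10_000.0, 1000.0))        # 10.0 s end to end
print(leak_location(10_000.0, 1000.0, 0.0))  # 5000.0 m: equal times => midpoint
print(leak_location(10_000.0, 1000.0, 2.0))  # 4000.0 m: seen first at end A
```

Note how equal arrival times place the event at the midpoint, exactly as the list above states, and a positive difference pulls the estimate toward the end that saw the change first.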
I’ll dwell on this a bit more in a future blog. From this we know that any two events separated by more than the transit time are not coherent with a single event on the segment. There is much we can surmise from such a situation by looking into it and sorting out how it could happen.
- If the interval is less than or equal to the transit time the source is either a leak on the segment or at one of its ends.
- If the interval is exactly the transit time, the event may be at an end but is more likely beyond the end at which it is seen first.
- If we’re willing to consider more than one leak within the time-frame then there are even more options.
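A minimal sketch of this interval reasoning, assuming a single event; the function name and the equality tolerance are illustrative assumptions:

```python
def interpret_interval(dt_s, length_m, sound_speed_mps, tol_s=0.05):
    """Classify the gap between detections at opposite ends of a segment."""
    transit = length_m / sound_speed_mps
    gap = abs(dt_s)
    # A gap longer than the transit time cannot be coherent with a
    # single event on this segment.
    if gap > transit + tol_s:
        return "not coherent with a single on-segment event"
    # A gap equal (within tolerance) to the transit time: at an end,
    # or more likely beyond the end where the change is seen first.
    if abs(gap - transit) <= tol_s:
        return "at or beyond the end seen first"
    # A shorter gap: the source lies on the segment or at one of its ends.
    return "on the segment or at one of its ends"
```

With the assumed 10 km segment and 1000 m/s sound speed from earlier, a 3-second gap points to an on-segment source, a 10-second gap to the ends, and a 12-second gap rules out a single on-segment event.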
Given a specific situation, what possibilities could be coherent? What do we need to know to separate the possibilities? From this thought process we can discern the requirements for confidently separating leak detection (or some other event of interest) from unlikely spoofing. Note that understanding coherence frames our problem, and observations provide what is needed to resolve the ambiguity.
How to Optimize Pipeline Leak Detection: Focus on Design, Equipment and Insightful Operating Practices
What You Can Learn About Pipeline Leaks From Government Statistics
Is Theft the New Frontier for Process Control Equipment?
What Is the Impact of Theft, Accidents, and Natural Losses From Pipelines?
Can Risk Analysis Really Be Reduced to a Simple Procedure?
Do Government Pipeline Regulations Improve Safety?
What Are the Performance Measures for Pipeline Leak Detection?
What Observations Improve Specificity in Pipeline Leak Detection?
Three Decades of Life with Pipeline Leak Detection
How to Test and Validate a Pipeline Leak Detection System
Does Instrument Placement Matter in Dynamic Process Control?
Condition-Dependent Conundrum: How to Obtain Accurate Measurement in the Process Industries
Are Pipeline Leaks Deterministic or Stochastic?
How Differing Conditions Impact the Validity of Industrial Pipeline Monitoring and Leak Detection Assumptions
How Does Heat Transfer Affect Operation of Your Natural Gas or Crude Oil Pipeline?
Why You Must Factor Maintenance Into the Cost of Any Industrial System
Raw Beginnings: The Evolution of Offshore Oil Industry Pipeline Safety
How Long Does It Take to Detect a Leak on an Oil or Gas Pipeline?
Book Excerpt + Author Q&A: Detecting Leaks in Pipelines
About the Author
Edward Farmer has more than 40 years of experience in the “high tech” part of the oil industry. He originally graduated with a bachelor of science degree in electrical engineering from California State University, Chico, where he also completed the master’s program in physical science. Over the years, Edward has designed SCADA hardware and software, practiced and written extensively about process control technology, and worked at length in pipeline leak detection. He is the inventor of the Pressure Point Analysis® leak detection system as well as the Locator® high-accuracy, low-bandwidth leak location system. He is a Registered Professional Engineer in five states and has worked on a broad scope of projects worldwide. His work has produced three books, numerous articles, and four patents. Edward has also worked in military communications, where he has authored many papers for military publications and participated in the development and evaluation of two radio antennas currently in U.S. inventory. He is a graduate of the U.S. Marine Corps Command and Staff College. He is the owner and president of EFA Technologies, Inc., manufacturer of the LeakNet family of pipeline leak detection products.