BY G2 INTEGRATED SOLUTIONS

“An ounce of prevention is worth a pound of cure.” – Benjamin Franklin

In the previous post we briefly discussed how simple lessons from several schools of process improvement can be used to improve pipeline data governance. It should be obvious by this point that data governance is primarily about process management. Here we introduce Eagle’s gisgap product, which is essentially a robust process management framework for data governance.

gisgap is kind of an odd name, so first a basic definition and a note on pronunciation:

gis·gap \ˈgiz-gap\ noun: an incredibly powerful data governance process management framework.

That’s a hard, lowercase ‘g’, as in ‘go,’ and a soft ‘s,’ as in ‘fizz.’ gisgap is actually a two-part acronym. ‘gis‘ stands for ‘Global Input Surge,’ and refers to the plethora of data inputs that gisgap can consume. ‘gap‘ stands for ‘Governance Automation Protocols,’ and refers to the process management engine at the heart of gisgap. The gisgap framework incorporates the traditional components of an enterprise GIS: the GIS data repository, the mapping and reporting applications, and the field data capture tools. To these, gisgap adds an extremely flexible and robust process management framework, tools for just-in-time data verification, validation, review and approval, and various connectors that tie the whole framework together.
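If it helps to picture how those pieces relate, here’s a toy sketch in Python. The names and the flow are ours, invented purely for illustration; they are not gisgap’s actual interfaces.

```python
# Illustrative only: a diagram-in-code of the components described above.
# These identifiers are invented for this sketch, not gisgap's real API.

COMPONENTS = {
    "field_tools":  "field data capture tools",
    "validation":   "just-in-time verification, validation, review, approval",
    "process_mgmt": "the process management engine (the 'gap')",
    "repository":   "the GIS data repository",
    "apps":         "mapping and reporting applications",
}

# The process management engine sits between the inputs (the 'gis') and
# the repository, so nothing reaches production unvetted.
DATA_FLOW = ["field_tools", "validation", "process_mgmt", "repository", "apps"]

for step in DATA_FLOW:
    print(f"{step}: {COMPONENTS[step]}")
```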

If we examine pipeline data management from a historical perspective, we see that the first milestones in digital pipeline data management were the development of “industry standard” data models (ISAT, PODS, the APDM, and PODS Esri Spatial), and applications to populate them (the Load & Publish component). Most vendors initially concentrated on these items. The second wave of functionality came in the form of mapping, reporting and analysis tools, and many vendors offer solutions here, as well (the Report & Distribute component). With the advent of mobile GPS devices and electronic forms technology, some vendors have concentrated on field data capture offerings (the Collect & Send component).

For the most part, overall process management has received short shrift. Not too surprisingly, many operators have ended up with defective data in their GIS databases. In fact, recent incidents prompted Ms. Debbie Hersman, the NTSB chairman, to ask the following at the April 18, 2011 National Pipeline Safety Forum:

“Unfortunately in the San Bruno accident, we found that the company’s underlying records were not accurate… My question is that if your many efforts to improve safety are predicated on identifying risk, and if your baseline understanding of your infrastructure is not accurate, how confident are you that your risks are being assessed appropriately?”

This question is scary, and you can bet you’ll soon hear it at a PHMSA audit near you. Most operators are scrambling to make sure they can answer it. And many would tell you, sadly, that this is not their first bull ride at the data cleanup rodeo. Some operators have been through multiple rounds of data cleanup in recent years – a clear indication of a lack of focus on process uniformity and control.

Vendors have, of course, reacted to this penchant for data cleanup, and most industry solutions therefore focus on cleaning up the bad data already in your GIS database. Esri’s ArcGIS Data Reviewer is a prime example: a fine product, but most people use it to detect and correct existing errors. Our process management lessons from the manufacturing sector tell us this approach is bass ackwards. The goal is to detect and correct defects before they enter your GIS database. It’s far less expensive to correct data defects at, or close to, the point of collection than it is to ferret them out and correct them long after (in some cases decades after) the fact. An ounce of prevention is worth a pound of cure.
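To make the point concrete, here’s a toy point-of-collection check in Python. The field names and rules are hypothetical, not drawn from any published pipeline data model, but they show how a defect can be caught while the crew is still standing next to the pipe:

```python
# Hypothetical example: reject a defective record at the point of
# collection, instead of hunting for it in the database years later.

from datetime import date


def validate_pipe_segment(record: dict) -> list[str]:
    """Return the defects found in a single field-collected record."""
    defects = []
    if not record.get("material"):
        defects.append("material is missing")
    if record.get("wall_thickness_in", 0) <= 0:
        defects.append("wall thickness must be positive")
    if record.get("install_year", 0) > date.today().year:
        defects.append("install year is in the future")
    return defects


record = {"material": "steel", "wall_thickness_in": 0.375, "install_year": 1987}
defects = validate_pipe_segment(record)
if defects:
    # Correct the defect now, while the crew is still in the field.
    print("Rejected at point of collection:", "; ".join(defects))
else:
    print("Accepted into the staging area.")
```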

The Receive & Validate and Review & Approve components comprise the core functionality of gisgap. All gisgap receive/validate/review/approve activities take place in a data staging area, so no data enters the production GIS database before being fully vetted. Validation processes implement “autonomation” following Lean Six Sigma concepts: automated data processing is suspended when a data inconsistency is detected, and the offending data is set aside for human intervention. Review and approval processes are entirely configurable. Depending on Management of Change (MOC) requirements, multiple steps of “eyes on” human review and approval can be applied, or not, as desired.
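For the technically inclined, here’s a minimal sketch of that autonomation pattern, in Python and under entirely hypothetical names (this is not gisgap code, just the shape of the idea): automation stops the line when a record fails validation, the record lands in a human review queue, and the number of approval steps is configuration, not code.

```python
# Minimal sketch of autonomation in a staging workflow; all names are
# hypothetical, invented for this illustration.

from queue import Queue

REVIEW_STEPS = 2              # set per MOC requirements; could be 0 for low-risk edits
human_review_queue = Queue()  # defective records wait here for a person


def process_batch(records, validate):
    """Stage each clean record; suspend automation for any defective one."""
    staged = []
    for record in records:
        problems = validate(record)
        if problems:
            # Autonomation: stop the line for this record, hand it to a human.
            human_review_queue.put((record, problems))
            continue
        staged.append(record)
    return staged


def approve(record):
    """Apply the configured number of 'eyes on' approval steps."""
    for step in range(1, REVIEW_STEPS + 1):
        print(f"Approval step {step} of {REVIEW_STEPS} for record {record['id']}")
    return record  # only now may it leave staging for production


bad = {"id": 1, "material": ""}
good = {"id": 2, "material": "steel"}
staged = process_batch(
    [bad, good],
    lambda r: [] if r["material"] else ["material is missing"],
)
for record in staged:
    approve(record)
print(f"{len(staged)} record(s) staged, {human_review_queue.qsize()} held for review")
```

The point is the division of labor: the machines handle the routine flow, and people see only the exceptions.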

In the following posts we’ll examine each of the gisgap components in more detail. Meanwhile, if you want to learn more about gisgap and data governance, click here.