By Robert C. White
Warren Buffett once said, “Risk comes from not knowing what you’re doing.” In our world, where a well can be a high-stakes gamble, calculated risk is the norm. It’s what we do. And so, to paraphrase Mr. Buffett, more knowledge equals less risk.
How does this segue into a discussion about data quality and risk mitigation? Rather nicely, as a matter of fact. The more accurate your source information is, the less chance you and your colleagues have of making an incorrect decision.
If you’ve been in GIS for some time, you’ve likely seen your work spread far and wide throughout the organization. And those end users have ever more sophisticated tools at their disposal to analyze and interpret the products you produce. There are more eyes than ever looking critically at spatial data and making very expensive decisions based on it.
But what is the actual, concrete risk associated with this? Let’s take the case of the positional accuracy of a leasehold boundary. Particularly in the case of irregular parcels, the grid data you are using to decide where to drill may bear little resemblance to the real world. Worst-case scenario? The mistake doesn’t get noticed until after the hole is in the ground – and you’ve just drilled a well for the other guy. Believe it or not, it has happened. Inaccurate locations can result in unpleasant surprises that range from fines to firings.
Data cleanliness plays a role as well. With automated land mapping systems aggregating and dissolving geometries, clean data is a must – both to reduce the frequency of manual intervention and the potential introduction of error, and to provide an accurate basis for further statistical and positional analysis.
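To make the point concrete, here is a minimal sketch (pure Python, with hypothetical lease IDs and acreages, not any particular land mapping system) of how a single dirty attribute can derail an automated dissolve: a lease ID with a stray case difference and trailing space splits what should be one aggregated tract into two, and any acreage statistics computed from the result are wrong until someone intervenes by hand.

```python
# Sketch (hypothetical data): why attribute cleanliness matters when an
# automated system aggregates parcels into leasehold tracts by lease ID.
from collections import defaultdict

def dissolve_by_lease(parcels, normalize=False):
    """Group parcel acreage by lease ID; returns {lease_id: total_acres}."""
    tracts = defaultdict(float)
    for lease_id, acres in parcels:
        if normalize:
            # Trim stray whitespace and unify case before grouping.
            lease_id = lease_id.strip().upper()
        tracts[lease_id] += acres
    return dict(tracts)

parcels = [
    ("LEASE-001", 160.0),
    ("lease-001 ", 160.0),   # dirty duplicate: lowercase plus trailing space
    ("LEASE-002", 320.0),
]

dirty = dissolve_by_lease(parcels)                   # 3 tracts: the lease is split
clean = dissolve_by_lease(parcels, normalize=True)   # 2 tracts, acreage summed correctly
```

The normalization step here is deliberately trivial; the point is that every such manual or ad hoc fix is an opportunity to introduce new error, which is why clean source data matters in the first place.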
But enough with the glass-half-empty outlook – the point that we are coming to is that we know these risks are out there. It’s part of doing business, and it’s just plain responsible to take all the steps possible to reduce mistakes. That’s why we do what we do, and constantly strive for excellence in our data management practices and quality control procedures.
We’re proud to say that WhiteStar Grid, WhiteStar Lots and Tracts, and the rest of our suite of digital products are of the highest quality we can make them – in accuracy, topology, cleanliness, currency, and comprehensive coverage.
And that can help reduce risk.
How Good Is Your Grid?
Six questions to ask when evaluating a dataset
When you’re ready to turn the magnifying glass on a land grid dataset – whether you want a better understanding of the data you’re using now, or are comparing alternatives – here are a few questions to ask yourself and your data vendor. The answers are important, and will give you a clearer picture of the dataset’s accuracy and reliability, and of the vendor’s data management practices.
- Can you query each individual polygon and see when (and why) the geometry and attributes were last changed?
- What was the digitizing process used, and were the points collected within a known margin of error with respect to the source material? What is that margin?
- Can you query any polygon and establish where it came from and why a particular digitization method was used to create it? If the source material was not public information, can you still determine those answers?
- What process is used to update the dataset, and how frequently? How is new information incorporated – and can you easily detect changes to see where updates have occurred?
- Is the data topologically clean – free of gaps, slivers, and overlaps, with contiguous coverage? Are all polygons uniquely attributed, or do duplicates exist that will stall automated mapping processes?
- Has the dataset been adopted by leading vendors as the basis for creating their own digital products?
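Two of the checks above – duplicate attributes and overlapping polygons – lend themselves to simple automated screening. The following is an illustrative sketch only (hypothetical section IDs, axis-aligned bounding boxes standing in for real parcel geometry, not a full topology engine), showing the kind of test a reader could run against any grid dataset:

```python
# Illustrative sketch: screen a parcel list for duplicate IDs and for
# pairwise overlaps, using axis-aligned boxes (xmin, ymin, xmax, ymax)
# as stand-ins for real polygon geometry.
from itertools import combinations

def find_duplicate_ids(parcels):
    """Return the set of IDs that appear more than once."""
    seen, dups = set(), set()
    for pid, _ in parcels:
        if pid in seen:
            dups.add(pid)
        seen.add(pid)
    return dups

def boxes_overlap(a, b):
    """True if two rectangles share interior area (a shared edge is OK)."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def find_overlaps(parcels):
    """Return ID pairs whose boxes overlap."""
    return [
        (id_a, id_b)
        for (id_a, box_a), (id_b, box_b) in combinations(parcels, 2)
        if boxes_overlap(box_a, box_b)
    ]

parcels = [
    ("T1N-R60W-SEC1", (0.0, 0.0, 1.0, 1.0)),
    ("T1N-R60W-SEC2", (1.0, 0.0, 2.0, 1.0)),    # shares an edge only: clean
    ("T1N-R60W-SEC2", (0.5, 0.0, 1.5, 1.0)),    # duplicate ID, and it overlaps both
]

dup_ids = find_duplicate_ids(parcels)   # {"T1N-R60W-SEC2"}
overlaps = find_overlaps(parcels)       # two offending pairs
```

Real topology validation (slivers, gaps, contiguity on true polygon boundaries) requires a proper GIS library, but even a screen this simple will stall-proof an automated mapping run by flagging duplicates before they reach it.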
WhiteStar is happy to answer all of these questions for you – and any others you may have.