Module 1.1: Calculating Metrics for Spatial Data Quality
In this lab, we calculated precision and accuracy metrics for a set of provided GPS waypoints, along with the root-mean-square error (RMSE) and a cumulative distribution function of the errors. For geospatial data, precision describes how close repeated measurements are to one another, while accuracy describes how close a measurement is to the true - or reference - value. Data can be precise without being accurate, and vice versa. GIS data is held to specific accuracy and precision standards, with the values expressed as differences, or errors; accuracy requirements are commonly evaluated using the RMSE as a guide.
For this lab assignment, precision was determined as the distance (in meters) from the average waypoint location that contains 68% of the observations, while accuracy was determined by measuring the distance between the average location and the accepted reference point. In both cases, a larger value indicates lower precision or accuracy.
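The steps above can be sketched in a few lines of Python. The coordinates and reference point here are made-up values for illustration, not the actual lab data; the coordinates are assumed to be in a projected system measured in meters.

```python
import math

# Hypothetical waypoint offsets in meters (projected coordinates), for illustration only.
points = [(3.1, 1.2), (-2.0, 4.5), (0.5, -3.3), (4.0, 0.8), (-1.5, -2.2),
          (2.2, 3.1), (-3.8, 1.0), (1.1, -4.0), (0.0, 2.5), (-2.4, -1.1)]
reference = (0.0, 0.0)  # hypothetical accepted reference point

# Average (mean) waypoint location.
avg_x = sum(x for x, _ in points) / len(points)
avg_y = sum(y for _, y in points) / len(points)

# Precision: the distance from the average that contains 68% of the observations.
dists_to_avg = sorted(math.hypot(x - avg_x, y - avg_y) for x, y in points)
idx = max(0, math.ceil(0.68 * len(dists_to_avg)) - 1)
precision_68 = dists_to_avg[idx]

# Accuracy: distance between the average location and the reference point.
accuracy = math.hypot(avg_x - reference[0], avg_y - reference[1])

# RMSE: square root of the mean squared distance from each point to the reference.
rmse = math.sqrt(sum((x - reference[0]) ** 2 + (y - reference[1]) ** 2
                     for x, y in points) / len(points))
```

Note that the RMSE about the reference is never smaller than the accuracy distance, since it combines that systematic offset with the spread (precision) of the points.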
The horizontal precision at the 68% level is 4.4 meters, meaning most of the collected points fall within that distance of the average waypoint (as shown below). If 68% is the standard being applied, the measurements can be considered precise. The measured distance between the average location and the reference point is 3.27 meters, which places the average location within the 68% precision buffer. For the horizontal measurements, we can therefore conclude that the data are both precise and accurate at the 68% level. Whether this level of quality is sufficient depends on how the data will be used and on the standards set for the project.