Data accuracy is the degree to which the data represents real-world features. It is an important dimension of data quality and relates to how well the data performs when visualised and how accurate the measurements are. Accuracy can be measured by the number of errors present; a lower error count increases trust in the measurements and broadens the possible use cases.
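The idea of measuring accuracy by the number of errors can be sketched as a simple error-rate calculation: measurements are compared against trusted reference values, and any difference beyond a tolerance counts as an error. The function name, the depth values, and the 0.5 m tolerance below are illustrative assumptions, not part of any standard.

```python
def error_rate(measured, reference, tolerance=0.5):
    """Fraction of measurements differing from the reference by more
    than `tolerance` (same units as the values, e.g. metres)."""
    if len(measured) != len(reference):
        raise ValueError("measured and reference must align one-to-one")
    errors = sum(
        1 for m, r in zip(measured, reference) if abs(m - r) > tolerance
    )
    return errors / len(measured)

# Example: two of five depths deviate from the reference by more than 0.5 m.
rate = error_rate([10.0, 12.3, 9.1, 14.9, 8.0],
                  [10.1, 11.0, 9.0, 15.0, 9.2])
print(rate)  # → 0.4
```

An error rate like this only captures accuracy relative to the chosen reference and tolerance; both would need to be justified for the dataset at hand.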
Questions to answer for accuracy:
Has the data been subjected to quality assurance processes (checked for errors at the point of collection and throughout processing)?
Are the temporal quality and positional accuracy high, is there a high level of confidence in the measurements, and is the version history provided when the datasets are made available to the public?
Has the sound speed been checked and corrected, especially at the outer beams of adjacent files?
Is the sounding density high, and is there a high level of confidence in the coverage?
Have fliers (spurious soundings) been checked for, and is there a high level of depth and resolution accuracy?
Is there any information available about adjustments or changes that could affect the validity of the data (changes or flaws in the collection or verification methods)? Are any adjustments identified in caveats attached to the dataset?
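Two of the checks above, sounding density and flier detection, lend themselves to a simple automated sketch. The thresholds (a minimum of 5 soundings per square metre, a 2 m deviation from the local median) and all function names below are hypothetical assumptions chosen for illustration; real surveys would take their limits from the applicable survey specification.

```python
import statistics

def density_ok(num_soundings, cell_area_m2, min_per_m2=5.0):
    """True if the sounding density meets a minimum coverage target.
    The 5-per-m² default is an illustrative assumption."""
    return num_soundings / cell_area_m2 >= min_per_m2

def find_fliers(depths, max_dev_m=2.0):
    """Indices of depths deviating from the median of the sample by
    more than max_dev_m — a crude screen for spurious soundings."""
    med = statistics.median(depths)
    return [i for i, d in enumerate(depths) if abs(d - med) > max_dev_m]

depths = [20.1, 20.3, 19.8, 35.0, 20.0, 20.2]  # 35.0 is a likely flier
print(find_fliers(depths))    # → [3]
print(density_ok(600, 100))   # → True (6 soundings per m²)
```

A median-based screen like this flags gross outliers but cannot distinguish a genuine seafloor feature from noise, so flagged soundings would still need manual review.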