Moored instrument data processing steps

Moored instrument data go through several steps before they are incorporated into the National Oceanographic Database (NODB). Our aim is to ensure the data are of a consistent standard and to guarantee their long-term security and utilisation.

1. Archive original data. When the data are first received they go through our Accession procedure. The data are securely archived in their original form along with any associated documentation.

2. Transfer data into standard format. Data arrive in a variety of formats and are transferred into our standard format, netCDF. This binary format has the advantage of being able to handle multi-dimensional data from instruments such as moored Acoustic Doppler Current Profilers (ADCPs), and it is platform independent. MATLAB software, including the netCDF toolbox, is used for the transfer.

Eight-byte parameter codes are assigned to each data channel and data are converted to standard BODC units. Missing data are set to the BODC absent data value for that parameter.
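As an illustration only (BODC carries out this step with MATLAB and the netCDF toolbox), the sketch below shows the general idea in Python: a single data channel is written to a netCDF file with an eight-character parameter code, standard units and an absent-data value. The file name, code, units and values are all hypothetical.

    # Illustration only: writing one data channel to netCDF.
    # The file name, parameter code and absent-data value are hypothetical;
    # BODC's actual transfer is done with MATLAB and the netCDF toolbox.
    import numpy as np
    from netCDF4 import Dataset

    ABSENT = -9999.0                              # placeholder absent-data value

    time = np.arange(0, 6 * 3600, 3600)           # six hourly samples (seconds)
    speed = np.array([0.12, 0.15, ABSENT, 0.11, 0.14, 0.13])  # current speed, m/s

    with Dataset("mooring_example.nc", "w", format="NETCDF4") as nc:
        nc.createDimension("time", len(time))

        t = nc.createVariable("time", "f8", ("time",))
        t.long_name = "time of measurement"
        t.units = "seconds since 2000-01-01 00:00:00"
        t[:] = time

        # One data channel, named with a hypothetical eight-character code
        v = nc.createVariable("EXMPL001", "f4", ("time",), fill_value=ABSENT)
        v.long_name = "current speed (example channel)"
        v.units = "m/s"
        v[:] = speed      # missing samples already hold the absent-data value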

3. Compile metadata. The metadata (data about data), e.g. collection date and times, mooring position, instrument type, instrument depth and sea floor depth, are loaded into Oracle relational database tables. These are carefully checked for errors and consistency with the data. The data originators may be contacted if any problems cannot be clearly resolved.
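The checks themselves run against the Oracle tables, but the kind of consistency test involved can be sketched as below; the field names, ranges and example values are hypothetical.

    # Hypothetical sketch of metadata consistency checks; BODC's checks are
    # run against Oracle relational database tables, not a Python dictionary.
    from datetime import datetime

    metadata = {
        "mooring_latitude": 57.3,                        # degrees north
        "mooring_longitude": -10.2,                      # degrees east
        "instrument_depth_m": 450.0,
        "sea_floor_depth_m": 1200.0,
        "deployment_start": datetime(2000, 5, 1, 12, 0),
        "deployment_end": datetime(2000, 11, 3, 9, 30),
    }

    problems = []
    if not -90 <= metadata["mooring_latitude"] <= 90:
        problems.append("latitude out of range")
    if not -180 <= metadata["mooring_longitude"] <= 180:
        problems.append("longitude out of range")
    if metadata["instrument_depth_m"] > metadata["sea_floor_depth_m"]:
        problems.append("instrument depth exceeds sea floor depth")
    if metadata["deployment_end"] <= metadata["deployment_start"]:
        problems.append("deployment end precedes deployment start")

    # Problems that cannot be resolved are referred back to the data originator.
    for p in problems:
        print("metadata check failed:", p)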

4. Screen data using BODC's in-house visualisation software. This software can be used to display different types of data.

Parameters can be plotted concurrently and records from different instruments can be compared. Data values are NOT changed or removed, but may be flagged if they appear suspect. Data values that the originator regards as suspect are flagged with 'L'; BODC uses 'M' to flag data it considers suspect and 'N' to indicate absent data.
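A minimal sketch of this flagging convention follows, assuming a simple threshold test and illustrative values; the threshold is not one of BODC's actual screening rules.

    # Minimal sketch of quality flagging: values are never altered or removed;
    # a parallel flag channel is set instead. Thresholds and data are illustrative.
    import numpy as np

    ABSENT = -9999.0
    values = np.array([0.12, 0.15, 7.80, ABSENT, 0.14])   # 7.80 looks suspect
    originator_suspect = np.array([False, True, False, False, False])

    flags = np.full(values.shape, " ", dtype="U1")         # blank = no flag
    flags[originator_suspect] = "L"                         # suspect per the originator
    flags[values == ABSENT] = "N"                           # absent data
    flags[(values > 5.0) & (flags == " ")] = "M"            # suspect per BODC screening

    for v, f in zip(values, flags):
        print(f"{v:10.2f}  flag='{f}'")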

5. Document all data sets to ensure they can be used in the future without ambiguity or uncertainty. Comprehensive documentation is compiled from information supplied by the data originator together with any information gained during BODC screening.

6. Quality check data before loading into the database. The netCDF files and metadata are thoroughly checked using MATLAB software to ensure they conform to stringent BODC standards.
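The conformance checks themselves are MATLAB-based; purely as a sketch of the idea, the example below verifies that a transferred netCDF file carries a time dimension and that every variable holds the attributes a (hypothetical) standard requires.

    # Sketch only: the required attributes listed here are hypothetical, and the
    # real conformance checks are performed with MATLAB software at BODC.
    from netCDF4 import Dataset

    REQUIRED_VARIABLE_ATTRS = ("long_name", "units")

    def check_file(path):
        issues = []
        with Dataset(path) as nc:
            if "time" not in nc.dimensions:
                issues.append("no 'time' dimension")
            for name, var in nc.variables.items():
                for attr in REQUIRED_VARIABLE_ATTRS:
                    if attr not in var.ncattrs():
                        issues.append(f"variable '{name}' is missing '{attr}'")
        return issues

    for issue in check_file("mooring_example.nc"):
        print("QC issue:", issue)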

7. Archive the final data set. This is performed by the BODC Database Administrator.

8. Distribute and deliver data. Currently, data are available on request, but they will soon be available via web delivery services.


Related BODC pages

Overview of all data processing steps at BODC
General data processing for projects
Specific CTD data processing for projects
Specific underway data processing for projects
BODC parameter codes
Code and format definitions

Related external links

NetCDF at Unidata