# Using HDF5/zlib Compression in NetCDF4

Not too long ago, I posted an entry on writing NetCDF files in C and loading them in R.  In that post, I mentioned that the latest and greatest version of NetCDF includes HDF5/zlib compression, but I didn’t say much more beyond that.  In this post, I’ll explain briefly how to use this compression feature in your NetCDF4 files.

Disclaimer: I’m not an expert in any sense on the details of compression algorithms.  For more details on how HDF5/zlib compression is integrated into NetCDF, check out the NetCDF Documentation.  Also, I’ll be assuming that the NetCDF4 library was compiled on your machine to enable HDF5/zlib compression.  Details on building and installing NetCDF from source code can be found in the documentation too.

I will be using code similar to what was in my previous post.  The code generates three variables (x, y, z) each with 3 dimensions.  I’ve increased the size of the dimensions by an order of magnitude to better accentuate the compression capabilities.

```c
// Loop control variables
int i, j, k;

// Define the dimension sizes for
// the example data.
int dim1_size = 100;
int dim2_size = 50;
int dim3_size = 200;

// Define the number of dimensions
int ndims = 3;

// Allocate the 3D vectors of example data
float x[dim1_size][dim2_size][dim3_size];
float y[dim1_size][dim2_size][dim3_size];
float z[dim1_size][dim2_size][dim3_size];

// Generate some example data
for(i = 0; i < dim1_size; i++) {
    for(j = 0; j < dim2_size; j++) {
        for(k = 0; k < dim3_size; k++) {
            x[i][j][k] = (i+j+k) * 0.2;
            y[i][j][k] = (i+j+k) * 1.7;
            z[i][j][k] = (i+j+k) * 2.4;
        }
    }
}
```

Next, we set up the various IDs, create the NetCDF file, and apply the dimensions to the NetCDF file.  This has not changed since the last post.

```c
// Allocate space for netCDF dimension ids
int dim1id, dim2id, dim3id;

// Allocate space for the netcdf file id
int ncid;

// Allocate space for the data variable ids
int xid, yid, zid;

// Setup the netcdf file
int retval;
if((retval = nc_create(ncfile, NC_NETCDF4, &ncid))) { ncError(retval); }

// Define the dimensions in the netcdf file
if((retval = nc_def_dim(ncid, "dim1_size", dim1_size, &dim1id))) { ncError(retval); }
if((retval = nc_def_dim(ncid, "dim2_size", dim2_size, &dim2id))) { ncError(retval); }
if((retval = nc_def_dim(ncid, "dim3_size", dim3_size, &dim3id))) { ncError(retval); }

// Gather the dimids into an array for defining variables in the netcdf file
int dimids[ndims];
dimids[0] = dim1id;
dimids[1] = dim2id;
dimids[2] = dim3id;
```

Here’s where the magic happens.  The next step is to define the variables in the NetCDF file.  The variables must be defined in the file before you can tag them for compression.

```c
// Define the netcdf variables
if((retval = nc_def_var(ncid, "x", NC_FLOAT, ndims, dimids, &xid))) { ncError(retval); }
if((retval = nc_def_var(ncid, "y", NC_FLOAT, ndims, dimids, &yid))) { ncError(retval); }
if((retval = nc_def_var(ncid, "z", NC_FLOAT, ndims, dimids, &zid))) { ncError(retval); }
```

Now that we’ve defined the variables in the NetCDF file, let’s tag them for compression.

```c
// OPTIONAL: Compress the variables
int shuffle = 1;
int deflate = 1;
int deflate_level = 4;
if((retval = nc_def_var_deflate(ncid, xid, shuffle, deflate, deflate_level))) { ncError(retval); }
if((retval = nc_def_var_deflate(ncid, yid, shuffle, deflate, deflate_level))) { ncError(retval); }
if((retval = nc_def_var_deflate(ncid, zid, shuffle, deflate, deflate_level))) { ncError(retval); }
```

The function nc_def_var_deflate() performs this.  It takes the following parameters:

• int ncid – The NetCDF file ID returned from the nc_create() function
• int varid – The variable ID associated with the variable you would like to compress.  This is returned from the nc_def_var() function
• int shuffle – Enables the shuffle filter before compression.  Any non-zero integer enables the filter.  Zero disables the filter.  The shuffle filter rearranges the byte order in the data stream to enable more efficient compression. See this performance evaluation from the HDF group on integrating a shuffle filter into the HDF5 algorithm.
• int deflate – Enable compression at the compression level indicated in the deflate_level parameter.  Any non-zero integer enables compression.
• int deflate_level – The level to which the data should be compressed.  Levels are integers in the range [0-9].  Zero results in no compression whereas nine results in maximum compression.

The rest of the code doesn’t change from the previous post.

```c
// OPTIONAL: Give these variables units
if((retval = nc_put_att_text(ncid, xid, "units", 2, "cm"))) { ncError(retval); }
if((retval = nc_put_att_text(ncid, yid, "units", 4, "degC"))) { ncError(retval); }
if((retval = nc_put_att_text(ncid, zid, "units", 1, "s"))) { ncError(retval); }

if((retval = nc_enddef(ncid))) { ncError(retval); }

// Write the data to the file
if((retval = nc_put_var(ncid, xid, &x[0][0][0]))) { ncError(retval); }
if((retval = nc_put_var(ncid, yid, &y[0][0][0]))) { ncError(retval); }
if((retval = nc_put_var(ncid, zid, &z[0][0][0]))) { ncError(retval); }

// Close the netcdf file
if((retval = nc_close(ncid))) { ncError(retval); }
```
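If you want to confirm that the compression settings actually took, ncdump's -s flag prints the special per-variable storage attributes alongside the header (the file name below is just a placeholder for whatever `ncfile` points to):

```shell
# Show the header plus special storage attributes;
# compressed variables report _Shuffle and _DeflateLevel.
ncdump -h -s example.nc
```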

So the question now is whether or not it’s worth compressing your data.  I performed a simple experiment with the code presented here and the resulting NetCDF files:

1. Generate the example NetCDF file from the code above using each of the available compression levels.
2. Time how long the code takes to generate the file.
3. Note the final file size of the NetCDF.
4. Time how long it takes to load and extract data from the compressed NetCDF file.

Below is a figure illustrating the results of the experiment (points 1-3). Before I say anything about these results, note that individual results may vary.  I used a highly stylized data set to produce the NetCDF file, which likely benefits greatly from the shuffle filter and compression.  The compressed files came out 97% – 99% smaller than the uncompressed file.  While the run time did increase, the difference was barely noticeable until the highest compression levels (8, 9).  As for point 4, there was only a small difference in load/read times (0.2 seconds) between the uncompressed file and any of the compressed files (using ncdump and the ncdf4 package in R), and no noticeable difference among the compressed files themselves.  Again, this could be a result of the highly stylized data set used as an example in this post.

For something more practical, I can only offer anecdotal evidence about the compression performance.  I recently added compression to my current project because of the potentially large number of multiobjective solutions and states-of-the-world (SOW).  The uncompressed file my code produced was on the order of 17.5 GB (for 300 time steps, 1000 SOW, and about 3000 solutions).  I enabled compression for all variables (11 variables – 5 with three dimensions and 6 with two dimensions – at compression level 4).  The next run produced just over 7000 solutions, yet the compressed file size was only 9.3 GB.  The downside is that it took nearly 45 minutes to produce the compressed file, as opposed to 10 minutes for the previous run.  There are many factors in these differences that I did not control for, but the results are promising…if you’ve got the computer time.

I hope you found this post useful in some fashion.  I’ve been told that compression performance can be increased if you also “chunk” your data properly.  I’m not too familiar with chunking data for writing in NetCDF files…perhaps someone more clever than I can write about this?

Acknowledgement:  I would like to acknowledge Jared Oyler for his insight and helpful advice on some of the more intricate aspects of the NetCDF library.