Large research infrastructures, such as synchrotron facilities, generate vast amounts of data (up to a few tens of terabytes) every day. This data is highly valuable, as it results from elaborate scientific experiments that are likely to be performed only once.
Storing this data efficiently alongside previously accumulated data, transferring it quickly, and accessing it efficiently for visualization and scientific analysis is both a necessity and a challenge that digital data compression can address.
In this review of lossless data compression, I will present the metrics to use when considering compression from a temporal perspective, some strategies for improving compression, and a few tools for evaluating compression algorithms, with an example based on tomography data obtained at Soleil.
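As a minimal sketch of what "metrics from a temporal perspective" can mean in practice, the snippet below measures not only the compression ratio but also compression and decompression throughput, since for daily terabyte-scale acquisition the time spent compressing matters as much as the space saved. The `evaluate` helper and the synthetic payload are illustrative assumptions, not the benchmark used in the review; `zlib` merely stands in for any lossless codec.

```python
import time
import zlib


def evaluate(compress, decompress, data):
    """Return compression ratio and (de)compression throughput in MB/s.

    Ratio alone ignores the time dimension; throughput captures whether
    a codec can keep up with the data acquisition rate.
    """
    t0 = time.perf_counter()
    compressed = compress(data)
    t_compress = time.perf_counter() - t0

    t0 = time.perf_counter()
    restored = decompress(compressed)
    t_decompress = time.perf_counter() - t0

    assert restored == data  # lossless round trip
    megabytes = len(data) / 1e6
    return {
        "ratio": len(data) / len(compressed),
        "compress_MB_per_s": megabytes / t_compress,
        "decompress_MB_per_s": megabytes / t_decompress,
    }


# Hypothetical payload standing in for a detector frame (~1 MB,
# highly structured, so it compresses well).
payload = bytes(range(256)) * 4096
stats = evaluate(zlib.compress, zlib.decompress, payload)
```

In a real evaluation one would run several codecs (and compression levels) over representative tomography frames and compare the resulting ratio/throughput trade-offs.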