As data rates and experimental complexity increase, it is critical that facilities reduce the burden on users of reducing and processing raw experimental data. While this statement is clear and simple, the reality of implementing generic real-time auto-processing is not.
Broadly, the problem can be split into two categories: Infrastructure and User Experience. Infrastructure requirements include data and metadata storage and access, cross-process communication between different systems, and access to High Performance Computing resources. User Experience deals with how these separate systems can come together to provide a flexible and usable system, and, most critically, with how we give facility users the confidence that their data is processed correctly, and with full provenance, so that they will use it for real-time experimental decision making or as the basis for publication.
Here we present the system deployed for the Physical Sciences at Diamond Light Source and show how technologies such as HDF5 (with SWMR), message brokers (such as ActiveMQ), and information management systems (such as ISPyB) can be used to build a versatile system for generic real-time data processing.