Atomic magnetometers offer high sensitivity for measuring magnetic fields in areas such as biomagnetic monitoring, non-destructive testing, geological exploration, and fundamental physics research. Unlike superconducting quantum interference devices (SQUIDs), they do not require cryogenic operating temperatures, making them more practical under a range of real-world conditions, and they typically surpass nitrogen-vacancy magnetometers (NVMs) in sensitivity. As industry increasingly demands compact, cost-effective, and robust solutions, optimizing the choice of optically pumped magnetometer (OPM) scheme, laser source, and vapor cell type becomes critical for meeting diverse operational constraints. Identifying the optimal trade-offs is thus essential for transitioning these systems from the laboratory to real-world applications.
To explore the optimal balance of performance metrics for diverse applications, we compared four OPM schemes—free-induction decay (FID), nonlinear magneto-optical rotation (NMOR), amplitude-modulated Bell–Bloom (AMBB), and a dual-beam amplitude-modulated dead-zone-free (DZF) approach—employing different laser sources (a vertical-cavity surface-emitting laser (VCSEL), a distributed Bragg reflector (DBR) laser, and an external-cavity diode laser (ECDL)) and vapor cell types (paraffin-coated, and buffer-gas cells filled with ~50 Torr of neon or ~50 Torr of nitrogen). Experiments were conducted under controlled conditions inside a four-layer magnetic shield to minimize external interference. We systematically measured magnetometer performance at an applied field of 1 µT, focusing on key metrics such as sensitivity, bandwidth, dynamic range, and overall system complexity, and obtained sensitivities ranging from hundreds of fT/√Hz to hundreds of pT/√Hz.
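To make the sensitivity figures concrete, the minimal sketch below converts the frequency resolution of an FID-style measurement into a field sensitivity via the atomic gyromagnetic ratio. It is an illustration only, not a description of the actual analysis: it assumes a cesium vapor (ground-state γ/2π ≈ 3.5 Hz/nT) and a hypothetical frequency-noise level of 1 mHz/√Hz, neither of which is stated in the abstract.

```python
# Hedged illustration: relate FID frequency resolution to field sensitivity.
# Assumes a Cs vapor; the alkali species and noise level are not given in
# the abstract, so all numbers here are purely illustrative.

GAMMA_HZ_PER_NT = 3.5  # Cs ground-state gyromagnetic ratio, Hz per nT (assumed)

def field_sensitivity_T(freq_noise_hz: float) -> float:
    """Convert frequency noise (Hz/sqrt(Hz)) to field sensitivity (T/sqrt(Hz))."""
    return (freq_noise_hz / GAMMA_HZ_PER_NT) * 1e-9  # nT -> T

b_applied_nT = 1000.0                          # 1 uT applied field, as in the text
larmor_hz = GAMMA_HZ_PER_NT * b_applied_nT     # Larmor frequency ~3.5 kHz at 1 uT
print(f"Larmor frequency at 1 uT: {larmor_hz:.0f} Hz")

sens_fT = field_sensitivity_T(1e-3) * 1e15     # 1 mHz/sqrt(Hz) example noise
print(f"1 mHz/sqrt(Hz) frequency noise -> {sens_fT:.0f} fT/sqrt(Hz)")
```

Under these assumed numbers, a 1 mHz/√Hz frequency resolution corresponds to roughly 0.3 pT/√Hz, consistent with the best-performing end of the reported sensitivity range.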
The results underscore important trade-offs in sensitivity, bandwidth, dynamic range, and complexity across the different OPM implementations. They also provide guidelines for selecting an optimal configuration tailored to various measurement requirements in atomic, molecular, and optical physics, as well as in on-site applications.