tagger
A suite of tools for instrument control, interactive and automated data acquisition, and data analysis of time tag patterns, all written in Rust.
Overview
In photonics experiments, single photon detectors produce a "click", a short analog pulse that is fed to a time-to-digital converter, which tags events with a ~100 picosecond resolution timestamp and sends these tags to a computer for analysis. We are especially interested in the time correlation between clicks in different channels, since the patterns in these correlations are the signature of the quantum nature of light.
The hardware that records single photon events, together with the software that controls it and processes data in real time, forms a critical section of our data pipeline. This suite, written in Rust, aims to surpass the functionality of our existing LabVIEW control code while enabling features and flexibility that would be very challenging to add to that application. Some key numbers:
- Keeps up with the hardware's maximum sustained event rate of 10 MHz, removing a software bottleneck which was dropping up to 1/3 of our data
- Saves data in a format that uses less than a quarter of the disk space of the old CSV format
- Enables real-time calculation and display of coincidence rates between channels, as well as fast calculation of coincidence histograms, doing in milliseconds what the reference code took minutes to do
- Multiple experiments can share the same hardware, provided they agree on a small set of global configuration settings
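To give a flavor of the coincidence calculation mentioned above, here is a minimal sketch: since each channel's tags arrive time-ordered, coincidences between two channels can be counted in a single linear pass. This is illustrative only; the function name and the batch (rather than streaming) interface are assumptions, not the server's actual code.

```rust
/// Count pairs of tags from two sorted channels whose timestamps
/// (in time-tagger ticks) differ by at most `window` ticks.
/// Illustrative sketch; the real server works on streaming tag batches.
fn count_coincidences(ch1: &[u64], ch2: &[u64], window: u64) -> u64 {
    let mut count = 0;
    let mut j = 0;
    for &t1 in ch1 {
        // Skip ch2 tags that are too early to be coincident with t1.
        while j < ch2.len() && ch2[j] + window < t1 {
            j += 1;
        }
        // Count ch2 tags inside [t1 - window, t1 + window].
        let mut k = j;
        while k < ch2.len() && ch2[k] <= t1 + window {
            count += 1;
            k += 1;
        }
    }
    count
}

fn main() {
    let ch1 = [100, 200, 300];
    let ch2 = [102, 250, 301];
    // 100~102 and 300~301 fall within a 5-tick window.
    println!("coincidences: {}", count_coincidences(&ch1, &ch2, 5)); // coincidences: 2
}
```

Because both pointers only move forward, the pass is linear in the number of tags, which is what makes millisecond-scale histogramming feasible at these rates.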
To begin with, I wrapped the vendor's cross-platform C++ library in a modern smart-pointer style. This let me painlessly generate safe Rust bindings using CXX. I had to learn more than I wanted about constructors and aggregate initialization in order to write the wrapper library, but once this was done I had unlocked the ability to basically ignore C++ and the idiosyncrasies of the vendor library and do the rest in Rust.
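The resulting bridge declaration looks roughly like the sketch below. Every name here (`TaggerHandle`, `new_tagger`, `read_tags`) is hypothetical, not the actual taghelper.h API; the sketch only shows the shape of a CXX bridge over a smart-pointer wrapper.

```rust
// Illustrative CXX bridge declaration (hypothetical names, not the
// real taghelper.h interface).
#[cxx::bridge]
mod ffi {
    unsafe extern "C++" {
        include!("taghelper.h");

        type TaggerHandle;

        // Returning a UniquePtr keeps ownership explicit across the
        // FFI boundary: Rust drops it, C++ destructs it.
        fn new_tagger() -> UniquePtr<TaggerHandle>;

        // The wrapper fills growable vectors instead of exposing the
        // vendor library's raw buffers.
        fn read_tags(
            tagger: Pin<&mut TaggerHandle>,
            channels: &mut Vec<u8>,
            times: &mut Vec<i64>,
        );
    }
}
```

With the wrapper holding all the C++ ownership rules, the Rust side only ever sees safe types.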
I decided to split the previously-monolithic application into a server and multiple clients. This let me build a simple interactive TUI client (a dashboard essential for tuning up an experiment and monitoring progress) and introduce an automated CLI client that acquires data specified in a declarative JSON file. This architecture also allows multiple clients to take data off of the same hardware simultaneously, removing a previous limitation.
Client-server communication uses Cap'n Proto RPC, which also lets future researchers implement specialized clients against a defined API in whatever language they are comfortable with. I chose it over e.g. gRPC because we use the same serialization as the on-disk format for the data, and unlike Protocol Buffers or FlatBuffers, Cap'n Proto is a 64-bit format; we regularly acquire many gigabytes of data in a single run, so this is a necessity. Benchmarking both the serialization and the compression (zstd) was critical: every downstream step (Rust FFI, serialization, RPC overhead, writing to disk) proved able to saturate the hardware's full sustained rate of 10 million counts per second, which is itself limited by USB 2.0 throughput.
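For flavor, an interface in the Cap'n Proto schema language might look like the sketch below. The actual tag_server.capnp API differs; the struct, method, and field names here are all hypothetical.

```capnp
# Hypothetical sketch in the Cap'n Proto schema language;
# the real tag_server.capnp interface differs.
@0xbf5147cbbecf40c1;

struct Tag {
  time    @0 :Int64;   # timestamp in tagger ticks
  channel @1 :UInt8;
}

interface TagServer {
  setDelay @0 (channel :UInt8, delayPs :UInt32);
  getTags  @1 () -> (tags :List(Tag));
}
```

Because Cap'n Proto messages need no decoding step, the same bytes can go over the RPC connection and straight into the zstd-compressed on-disk file.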
Screen captures
Since this code is for working with expensive hardware, you won't be able to meaningfully run it yourself unless you also have a photonics lab. I've captured some terminal sessions demonstrating how to start the server, use the interactive client to tune measurement settings, and use the automated client to save data.
tagstream server
This is the session that the client demonstrations below connected to. Note that idle time is compressed to 2 seconds so that this doesn't take forever to play. You can see two connections from the TUI client, then one connection from the CLI client.
tagview TUI client
Here, I connect to the server, and modify measurement settings (adjusting delays until the 1-2 coincidence rate is maximized). I then save the settings, copy them into my original runfile, and connect again to confirm they work.
tagsave CLI client
Finally, with measurement settings tuned up, it's time to take data! After using tagsave to acquire data, I briefly examine the JSON file with measurement settings and summary statistics, then decode the binary format to display some of the timetags.
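As a sketch of what a declarative runfile in this style might contain, consider the fragment below. Every field name here is hypothetical, chosen to illustrate the idea, not tagsave's actual schema.

```json
{
  "note": "hypothetical runfile sketch; field names are illustrative",
  "duration_secs": 60,
  "channels": [1, 2],
  "delays_ps": { "1": 0, "2": 1250 },
  "coincidence_window_ps": 2000,
  "output_prefix": "myexpt"
}
```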
Architecture
+-----------------+                   +----------------+
|   Time tagger   |                   |   CTimeTag.h   |
| vendor hardware |<---- USB 2.0 ---->| vendor library |
|     (FPGA)      |                   |     (C++)      |
+-----------------+                   +----------------+
                                              ^
      vendor                                  |
 ------8<------                               v
  this project  +-----------------------------------+
                |            taghelper.h            |
                | Smart pointer/std::vector wrapper |
                |               (C++)               |
                +-----------------------------------+
                                ^
                                | CXX FFI
                                v
+----------------------------+    +---------------------------+
|         tagstream          |    |          timetag          |
| Time tagger control server |<-->| Rust bindings for library |
|                            |    +---------------------------+
|   async runtime: tokio     |
|   RPC: Cap'n Proto         |
+----------------------------+        control computer
               ^                   ----------8<-----------
               |                   control comp. or remote
               | tag_server.capnp
               | RPC API
               +-----------------------------------+
               v                                   v
+------------------------------+    +------------------------------+
|           tagview            |    |           tagsave            |
| tui-rs terminal ui/dashboard |    | automated instrument control |
| interactive monitor/control  |    | and data acquisition         |
+------------------------------+    +------------------------------+
  ^                                    ^                     |
  | interactively tune delays,         | load specification  | save summary data,
  | thresholds, etc.                   | of data to take     | metadata and raw tags
  v                                    |                     v
+-------------+                 +-------------+ +------------------------------+
| myexpt.json |---------------->| myexpt.json | | 20220119T123501Z_myexpt.json |
+-------------+  finalize data  +-------------+ +------------------------------+
                 run parameters                 +----------------------------------+
                                                | 20220119T123501Z_myexpt.tags.zst |
                                                +----------------------------------+