CATKit2 (Control and Automation for Testbeds Kit 2) is a high-performance software framework designed for controlling complex laboratory hardware systems. Developed primarily for adaptive optics and high-contrast imaging testbeds in astronomy, it provides a robust infrastructure for hardware synchronization, real-time data streaming, and distributed process management. The framework enables researchers to orchestrate multiple hardware devices—such as cameras, deformable mirrors, motorized stages, light sources, and sensors—into cohesive, synchronized experimental setups.
The software employs a service-oriented architecture where each hardware device or computational task runs as an independent service process. Services communicate through high-speed, low-latency data streams implemented via shared memory, enabling real-time data exchange between components with minimal overhead. A central testbed server manages service lifecycle, configuration distribution, and provides discovery mechanisms for clients. The framework supports both hardware operation and comprehensive simulation modes, allowing researchers to develop and test control algorithms without physical hardware.
not sure what "discovery mechanisms for clients" means here
That was short for "service discovery" which is standard nomenclature for service-oriented architectures. I would expand on it here, but it feels out of scope for this paragraph this early on in the paper.
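The single-writer shared-memory stream pattern described above can be sketched in plain Python using only the standard library. This is an illustrative minimal sketch, not catkit2's actual API; the `FrameStream` class, its methods, and the header layout are all assumptions for demonstration (catkit2 itself implements this in C++ with richer synchronization).

```python
import struct
from multiprocessing import shared_memory

class FrameStream:
    """Minimal single-writer shared-memory frame stream (illustrative only,
    not catkit2's implementation)."""

    HEADER = struct.Struct('<Q')  # 8-byte frame counter at the start of the buffer

    def __init__(self, name, frame_size, create=False):
        size = self.HEADER.size + frame_size
        self.shm = shared_memory.SharedMemory(name=name, create=create, size=size)
        self.frame_size = frame_size
        if create:
            # Initialize the frame counter to zero.
            self.HEADER.pack_into(self.shm.buf, 0, 0)

    def submit(self, data):
        # Write the payload first, then bump the counter so readers
        # polling the counter only ever see complete frames.
        offset = self.HEADER.size
        self.shm.buf[offset:offset + len(data)] = data
        count, = self.HEADER.unpack_from(self.shm.buf, 0)
        self.HEADER.pack_into(self.shm.buf, 0, count + 1)

    def latest(self):
        # Return the frame counter and a copy of the newest frame.
        count, = self.HEADER.unpack_from(self.shm.buf, 0)
        offset = self.HEADER.size
        return count, bytes(self.shm.buf[offset:offset + self.frame_size])

    def close(self, unlink=False):
        self.shm.close()
        if unlink:
            self.shm.unlink()
```

A writer process would create the stream (`create=True`) and call `submit()`, while any number of reader processes attach by name and poll `latest()`; because the frame counter is written after the payload, a reader can detect torn reads by re-checking the counter.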
# Statement of need
Modern astronomical instrumentation relies increasingly on sophisticated laboratory testbeds to develop and validate technologies before deployment to observatories. These testbeds, such as the High-contrast Imager for Complex Apertures Testbed (HiCAT) at the Space Telescope Science Institute, require precise coordination of numerous hardware components operating at high speeds with strict timing requirements. Control frameworks must handle diverse hardware interfaces while maintaining microsecond-level synchronization and gigabyte-per-second data throughput.
Configuration management is performed by the centralized TestbedServer. This configuration is distributed as JSON to all services and clients, ensuring consistent views of the testbed state. Services receive only their own configuration section, promoting encapsulation and reducing unintended cross-dependencies that can complicate maintenance.
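The per-service configuration slicing described above can be sketched as follows. The configuration schema here is hypothetical (the service names, keys, and the `config_for_service` helper are illustrative assumptions, not catkit2's actual format):

```python
import json

# Illustrative testbed configuration; this schema is an assumption
# for demonstration, not catkit2's actual configuration format.
config_json = '''
{
    "services": {
        "science_camera": {"service_type": "camera", "exposure_time": 0.001},
        "dm1": {"service_type": "deformable_mirror", "num_actuators": 952}
    }
}
'''

def config_for_service(full_config, service_id):
    """Return only the named service's configuration section, mimicking
    the encapsulation described above: a service never sees the
    configuration of its neighbors."""
    return full_config['services'][service_id]

full_config = json.loads(config_json)
cam_config = config_for_service(full_config, 'science_camera')
```

Handing each service only its own section means a change to one device's configuration cannot silently alter the behavior of another.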
Safety—here meaning protection of the hardware rather than of personnel—is a primary concern in hardware control systems. CATKit2 includes a safety service mechanism in which designated services monitor testbed conditions and trigger fail-safe states when unsafe conditions are detected. Services can declare safety dependencies, ensuring that they will not be started, and will be stopped, if a safety condition is triggered. This design provides defense in depth for expensive or delicate hardware components.
Define what we mean by safety here - i.e. safety for the hardware, not for the people in the lab
Rewrote this paragraph to make things a bit more clear.
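The safety-dependency policy described above can be sketched as a simple start/stop rule. This is an illustrative sketch; the dependency table, `may_start`, and `services_to_stop` are hypothetical names, not catkit2's implementation:

```python
# Illustrative sketch of safety dependencies; names and structure are
# assumptions for demonstration, not catkit2's actual implementation.

# Each service lists the safety conditions it depends on.
SAFETY_DEPENDENCIES = {
    'deformable_mirror': ['humidity_ok', 'temperature_ok'],
    'light_source': [],
}

def may_start(service_id, safe_conditions):
    """A service may only be started while all of its declared safety
    dependencies are currently satisfied."""
    return all(c in safe_conditions for c in SAFETY_DEPENDENCIES[service_id])

def services_to_stop(running, safe_conditions):
    """Running services whose safety dependencies are no longer satisfied
    are stopped, driving the testbed toward a fail-safe state."""
    return [s for s in running if not may_start(s, safe_conditions)]
```

For example, if a humidity monitor clears its `temperature_ok` condition, the deformable mirror is shut down while an independent light source keeps running.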
By providing a robust simulation environment where hardware services can be replaced with software equivalents, CATKit2 has enabled researchers to validate control strategies before hardware implementation, significantly reducing development time and risk. This capability is particularly valuable for iterative algorithm development where rapid testing cycles are essential.
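The hardware/simulator swap described above rests on both implementations exposing the same service interface, so control code cannot tell them apart. A minimal sketch of that idea (the `Camera` classes and `make_camera` factory are hypothetical names, not catkit2's API):

```python
# Illustrative sketch of swapping a hardware service for a simulated one;
# class and function names are assumptions, not catkit2's actual API.
from abc import ABC, abstractmethod

class Camera(ABC):
    """Common interface shared by the hardware and simulated services."""

    @abstractmethod
    def take_exposure(self):
        ...

class HardwareCamera(Camera):
    def take_exposure(self):
        # Would talk to the vendor SDK on a real testbed.
        raise RuntimeError('requires physical hardware')

class SimulatedCamera(Camera):
    """Software stand-in that returns a synthetic flat frame."""

    def take_exposure(self):
        return [0.0] * 16

def make_camera(simulated):
    # The same control algorithm runs unchanged against either backend.
    return SimulatedCamera() if simulated else HardwareCamera()
```

Because the control loop only depends on the `Camera` interface, switching the testbed between simulation and hardware modes is a configuration change rather than a code change.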
The package is released under the BSD 3-Clause license and is publicly available on GitHub. Documentation is hosted on GitHub Pages, including API references and configuration guides for bundled services. The modular service architecture has enabled contributions from multiple institutions, with new hardware services contributed by testbed operators at collaborating facilities. This collaborative development model has expanded the hardware support ecosystem while maintaining code quality through peer review and automated testing.
Not sure we need the first two sentences
Agreed. Removed the two sentences.
@ehpor I was confused by what you wanted for figure 3 and just saw in the description here that you wanted the trace figure, which I added for figure 4 - so maybe we can just delete one figure, or keep them if you'd like and if we have room
A paper draft for submitting to JOSS.

Figure 1: system architecture showing communication structure.
Figure 2: event performance for different OSes (latency figures for futex/semaphore/spinlock; maybe a histogram of latencies).
Figure 3: zoom-in on the trace viewer for an AO loop.

Fix ROR of the institutes (AI hallucinated all of them).