Create documentation describing how Kafka works within Orthanc/Sonador
Apache Kafka provides data streaming capabilities for medical imaging workflows. Within the Sonador platform, it connects Sonador to downstream systems that need to consume its data, supporting streaming-based data ingestion, event coordination for workflows, and AI/ML integration.
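As a rough sketch of the downstream-consumer pattern described above (the event schema, field names, and topic name in the comments are illustrative assumptions, not Sonador's actual message contract), a consumer-side handler might deserialize imaging events like this:

```python
import json

# Hypothetical event payload: the schema below is an assumption for
# illustration, not Sonador's actual message format.
SAMPLE_EVENT = json.dumps({
    "event_type": "study.stored",
    "study_uid": "1.2.840.113619.2.55.3",
    "modality": "CT",
    "source": "orthanc",
}).encode("utf-8")

def parse_study_event(raw: bytes) -> dict:
    """Deserialize a JSON-encoded imaging event from a Kafka message value."""
    event = json.loads(raw.decode("utf-8"))
    # Downstream systems typically dispatch on the event type.
    if event.get("event_type") == "study.stored":
        return {"action": "ingest", "study_uid": event["study_uid"]}
    return {"action": "ignore", "study_uid": event.get("study_uid")}

# In a real deployment this handler would run inside a consumer loop,
# e.g. with kafka-python (topic name is hypothetical):
#   for msg in KafkaConsumer("imaging-events", bootstrap_servers="..."):
#       parse_study_event(msg.value)
print(parse_study_event(SAMPLE_EVENT))
# → {'action': 'ingest', 'study_uid': '1.2.840.113619.2.55.3'}
```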
Kafka is used pervasively throughout the Sonador platform, but there is limited information about the integration, its data schemas, and how the components work together. Documentation is needed to address these shortfalls.
Needed documentation:

- What is Kafka and what role does it play within the Sonador ecosystem?
- Architecture: how does Sonador integrate with Kafka, what data is exported, and how can messages be triggered?
- Orthanc modules and class structure
- Sonador IO client and tools
- Available integrations and examples
Resources:
Edited by Rob Oakes