One long-standing dream of enterprise algorithm developers is the ability to run specific analytic algorithms against a defined data stream. Encapsulating the data-feed ingestion, the operational dynamics of the algorithm, and the visualization definition has been the dream of altruistic clinical data architects ever since commercial clinical predictive algorithms started being deployed. I know I’ve had at least a few conversations with teams about a Kubernetes-style approach to algorithm containers, so that algorithms could be “traded.” At the forefront of that effort was the Excel Medical (now Hillrom) WAVE platform, which provided a socket-like connection to a stream of consistent, high-speed data interface (HSDI) data, similar to what GE and Philips natively offer from their Carescape and PIIC iX servers, respectively. This yields continuous streaming data, including waveform data, so that your algorithm receives the actual raw signal, not just an HL7 feed.
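To make the distinction concrete, here is a minimal sketch of what consuming raw waveform data (as opposed to an HL7 message feed) might look like on the receiving end of such a socket. The frame layout, field widths, and function name are purely illustrative assumptions, not the actual WAVE, Carescape, or PIIC iX wire format:

```python
import struct

def parse_waveform_frame(frame: bytes) -> list[int]:
    """Parse one hypothetical HSDI-style frame: a little-endian 16-bit
    sample count followed by that many signed 16-bit waveform samples.
    (Illustrative framing only -- real vendor protocols differ.)"""
    (n,) = struct.unpack_from("<H", frame, 0)
    return list(struct.unpack_from(f"<{n}h", frame, 2))

# A hypothetical 3-sample frame, as it might arrive off the socket.
frame = struct.pack("<H3h", 3, 120, 118, 121)
print(parse_waveform_frame(frame))  # [120, 118, 121]
```

The point of the sketch is simply that the algorithm gets numeric samples at signal rate, rather than pre-digested observation messages.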

The early “heavy iron” implementations of the past have been replaced by more agile cloud-based AI tools, which offer similarly agile UI tools such as qbo, their innovative human-contextual reporting agent, as well as innovative query-definition tools including structured queries and Python hooks. This makes it possible to modularly construct enterprise predictive-algorithm surveillance tools fairly readily, patient monitoring gateway license fees notwithstanding.
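The “modular construction” idea those Python hooks enable can be sketched in a few lines: an ingestion source, an algorithm stage, and an output sink wired together as interchangeable parts. The function and stage names here are illustrative assumptions, not any vendor’s API:

```python
from typing import Callable, Iterable

def run_pipeline(source: Iterable[float],
                 algorithm: Callable[[float], float],
                 sink: Callable[[float], None]) -> None:
    """Hypothetical surveillance pipeline: stream each sample from the
    ingestion source through the algorithm stage into the sink."""
    for sample in source:
        sink(algorithm(sample))

# Example: a toy "risk score" that rescales heart-rate samples to 0..1.
scores: list[float] = []
run_pipeline([72.0, 75.0, 140.0],
             algorithm=lambda hr: round(hr / 200.0, 3),
             sink=scores.append)
print(scores)  # [0.36, 0.375, 0.7]
```

Because each stage is just a callable, swapping the ingestion feed or the scoring algorithm leaves the rest of the chain untouched, which is exactly the modularity the hooks are for.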

The missing piece has always been the visualization object. How do you convey a dynamic object that responds in real time to physiological changes across 30 patients in an ICU, or 50 patients in a NICU? Creating the visualization, or ‘viz,’ object itself is hard enough; as the “grandfather of data” Edward Tufte said, “There is no such thing as information overload. There is only bad design.” Perhaps true, but exemplary clinical visualization designs such as Dr. Randall Moorman’s CoMET score would benefit from being encapsulated and inserted into the appropriate data-feed chain as well. The concept of storing algorithms in UML is not new, but encapsulating them in a high-level descriptive language would provide an easier mechanism for conveying the entire predictive analytics loop, including its ingestion, operational, and visual components. Model-based systems engineering is an obvious choice for architecting the nuances of data flow, analysis, and reporting.
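One way to picture the proposed encapsulation, before reaching for a modeling language, is as a single container contract covering all three legs of the loop: ingestion, operation, and visualization. The interface below is a hypothetical sketch of that idea, not an existing standard or product API:

```python
from abc import ABC, abstractmethod

class AnalyticsModule(ABC):
    """Hypothetical contract for a tradeable algorithm container that
    bundles ingestion, computation, and visualization together."""

    @abstractmethod
    def ingest(self, raw: list[str]) -> list[float]:
        """Normalize raw feed values into numeric samples."""

    @abstractmethod
    def compute(self, data: list[float]) -> float:
        """Run the predictive algorithm over the samples."""

    @abstractmethod
    def render(self, result: float) -> str:
        """Produce the visualization payload for the result."""

class ToyScore(AnalyticsModule):
    """Trivial example implementation: mean of the samples."""
    def ingest(self, raw):
        return [float(x) for x in raw]
    def compute(self, data):
        return sum(data) / len(data)
    def render(self, result):
        return f"score={result:.1f}"

m = ToyScore()
print(m.render(m.compute(m.ingest(["60", "80"]))))  # score=70.0
```

Anything honoring this contract could, in principle, be dropped into the feed chain and swapped like a container image, which is the encapsulation the paragraph above is arguing for.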

Enter Dassault Systèmes’ No Magic – a systems-modeling toolset that provides a toolbox for everything SysML. Built around a high-level modeling language, it becomes a unique tool in the product development arsenal, one that allows intangible, dynamic visualization tools to be conveyed. SysML is already used across a broad range of applications, and this one might not catch your eye in a sea of design and manufacturing tools, but explaining very large, complex data operations is certainly its forte. Moving forward, even complex data visualizations – say, those that use Unity for dynamic three-dimensional visual object generation – could be described and conveyed the same way.