Quality Measures for Model-Based Systems Engineering
Model-Based Systems Engineering (MBSE) is experiencing a renaissance. The rise of MBSE parallels the rise of cyber-physical systems, with MBSE serving as a cornerstone for solutions based on Internet of Things (IoT) and Digital Twin technology.
Models are used to document the structure and simulate the behavior of large, complex IT systems in industries such as aerospace, automotive, and financial services. Models allow us to test and validate system characteristics, discuss aspects of the system with stakeholders, and support reuse in design work.
Yet, the value of MBSE depends entirely on the quality of the model. If the model is weak or vulnerable, any analyses or decisions based on it are flawed, as is the system built from it.
CISQ has launched a joint Working Group with OMG to create a specification for measuring model quality. Currently, there are no standards for analyzing a software model during the design phase. The earlier that systems engineers can detect vulnerabilities and weaknesses in a model, the less expensive and risky they are to repair. The objective of this Working Group is to define quality measures based on counting severe architectural and design weaknesses that can be detected through automated analysis of formal models developed in MBSE languages and technologies.
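To illustrate the general shape of such a measure, here is a minimal sketch of a quality score defined as a count of severe weaknesses found in a model. The weakness categories, severity scale, and threshold shown are hypothetical placeholders, not taken from the specification under development:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One weakness detected in a model element by an analysis tool."""
    element: str    # model element where the weakness was detected
    category: str   # hypothetical category label, e.g. "unsecured-interface"
    severity: int   # hypothetical scale: 1 (low) through 4 (critical)

def quality_measure(findings, min_severity=3):
    """A measure defined as the count of severe weaknesses in the model."""
    return sum(1 for f in findings if f.severity >= min_severity)

# Example findings from a hypothetical model analysis
findings = [
    Finding("PumpController", "unsecured-interface", 4),
    Finding("SensorBus", "unbounded-queue", 3),
    Finding("EventLogger", "missing-timeout", 2),
]

print(quality_measure(findings))  # counts the two findings at severity >= 3
```

In practice, a standardized measure would fix the weakness catalog and severity definitions so that counts are comparable across tools and models; the sketch above only conveys the counting idea.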
While much of the focus on MBSE quality to date has been on the functional fit of the model, MBSE will reach its full potential only when other quality characteristics, such as security weaknesses and vulnerabilities, maintainability, and reliability, also receive attention, so that the models produced are robust and trustworthy.
The Quality Measures for MBSE Working Group started its research in March 2019 and continues to work on the specification in 2021.
- Recap of Quality Measures for MBSE Project - David Norton, Executive Director, CISQ
- U.S. Department of Defense Digital Engineering Strategy and Vision - Philomena Zimmerman, Deputy Director for Engineering Tools and Environments
- Results of MBSE Benchmarking Survey led by SERC, NDIA, INCOSE - Tom McDermott, Deputy Director, Systems Engineering Research Center
- Dr. Bill Curtis, CISQ
- David Norton, CISQ
- Paul Seay, Northrop Grumman
- Philippe-Emmanuel Douziech, CAST
- Joe Jarzombek, Synopsys
- Robert Martin, MITRE
- Paul Rainey, CGI
- Kavitha Sridhar, CGI
- Michael McFarren, MITRE
- Bill Nichols, SEI
- Jerome Hugues, SEI
- Girish Seshagiri, ISHPI
- Dr. Barry Boehm, University of Southern California
- LaMont McAliley, Aerospace Corporation
- Travis Lenhart, Lockheed Martin
- Michael Vinarcik, University of Detroit Mercy
We are seeking Working Group members to contribute to this specification and our roadmap.