M&S Definitions
Term | Meaning |
---- | ------- |
Abstraction | The process of simplifying, focusing, or transforming aspects of an RWS (or referent system) represented in an M&S. (Note: Simplifying includes selecting aspects of the RWS to reduce in complexity in, or exclude from, the model. Focusing includes either emphasizing or deemphasizing certain aspects of the RWS when including them in the model. Transforming includes any change in the appearance, character, composition, configuration, expression, or structure of aspects of the RWS when including them in the model (e.g., rotation, translation, mapping, scaling, mathematics). Any modeling abstraction carries with it the assumption that it does not significantly affect the intended uses of the M&S.) |
Accepted Use | The successful outcome of a use assessment designating the M&S is sufficient for a proposed use. |
Accuracy | The closeness of a parameter or variable (or a set of parameters or variables) within a model, simulation, or experiment to the true value or the assumed true value. |
Accreditation | Info for…?? |
Actual Use | The specific purpose and domain of application for which an M&S is being, or was, used. |
Aleatory Uncertainty | The inherent variation in the physical system; it is stochastic and irreducible without changes to the system or how it operates. |
Analysis | The examination of a situation or problem in order to understand the item in question and make appropriate recommendations. (Note: Analysis spans the whole extent of the M&S process from the study of the RWS or its referents, the gathering and reduction of data from the RWS or accepted referents for incorporation into a model, the development of simulation scenarios, and the study and reduction of data from use of the M&S into recommendations for the RWS.) |
Architecture | The essential elements of any system and their interrelationships, functions, and behaviors, including the influences of the environment and other (interfacing) systems. |
Architectural Diagram | Any one of the possible visual (graphical) representations (viewpoints) depicting select aspects (features) of a system. (See definition of Architecture.) |
Assumption | Information asserted as a basis for reasoning about a system. (Note: In modeling and simulation, assumptions are made to simplify or focus certain aspects of a model with respect to the RWS or to presume values for certain parameters in a model.) |
Calibration | The process of adjusting numerical or modeling parameters in the model to improve agreement with a referent. (Note: Calibration can also be known as “tuning.”) |
Caveat | “An explanation to prevent misinterpretation, or a modifying or cautionary detail to be considered when evaluating, interpreting, or doing something.” (http://www.merriam-webster.com/dictionary/caveat) |
Certification | Info for…?? |
Computational Model | The operational or usable implementation of the conceptual model, including all mathematical, numerical, logical, and qualitative representations. This may also be known as “simulation model.” |
Conceptual Model | The collection of abstractions, assumptions, and descriptions of physical components and processes representing the reality of interest, which includes the RWS, its environment, and their relevant behaviors. (Note: The conceptual model provides the source information for conceptual validation with respect to the RWS, model construction, and model verification. It may consist of flow charts, schematic drawings, written descriptions, math models, etc., that explain the RWS and its interaction with the surrounding/interfacing environment. The conceptual model should be independent of any specific model implementation.) |
Conceptual Validation | The process of determining the degree to which a conceptual model (as defined in this NASA Technical Handbook) or model design adequately represents the real world from the perspective of the intended uses of the model or the simulation. |
Configuration Management (CM) | A management discipline applied over the product’s life cycle to provide visibility into and to control changes to performance and to functional and physical characteristics. (Note: NPR 7120.5, NASA Space Flight Program and Project Management Requirements.) |
Correlated (as in an M&S correlated with an RWS) | The extent to which an M&S and RWS, or some aspect of an M&S and RWS, behave similarly due to a particular change in some set of input variables, parameters, perturbations, etc. |
Credibility | “The quality to elicit belief or trust in M&S results.” (NASA-STD-7009A.) |
Critical Decision | The selection of a course-of-action related to design, development, manufacturing, ground, or flight operations that may significantly impact human safety, mission success, or program success, as measured by program/project-defined criteria. |
Data Pedigree | A record of traceability from the data’s source through all aspects of its transmission, storage, and processing to its final form used in the development of an M&S. (Note: Any changes from the real-world source data may be of significance to its pedigree. Ideally, this record includes important quality characteristics of the data at every stage of the process.) |
Design of Experiments (or Experimental Design) | A series of tests in which purposeful changes are made to the input variables of a system or process and the effects on response variables are measured. (Note: It is applicable to both physical processes and computer simulation models.) |
Deterministic | A term describing a system whose time evolution can be predicted exactly. (Note: For comparison, see definition of “Probabilistic.”) |
Domain of Validation | The region enclosing all sets of model inputs for which the M&S’s responses compare favorably with the referent. |
Domain of Verification | The region enclosing all sets of model inputs for which the solution is determined to be correct and satisfy requirements for computational accuracy. |
Empirical Validation | The process of determining the degree to which an operating model or simulation is or provides an accurate representation of the real world from the perspective of the intended uses of the model or the simulation. |
Environment of the System (or RWS) | The set of elements external to a system. The RWS and its environment may interact through the exchange of properties. (Note: Only the interactions relevant to an analysis should be included in the M&S.) |
Epistemic Uncertainty | A lack of knowledge of the quantities or processes identified with the system; it is subjective, is reducible, and comprises both model and parameter uncertainty. |
Expanded Diagram | An illustration or diagram of a construction showing its parts separately but in positions that indicate their proper relationships to the whole. |
Framework | A set of assumptions, concepts, values, and practices constituting a way of viewing reality. (Note: For M&S, this may be a computing environment that integrates multiple interacting components on a single computer or across a distributed network.) |
Human Safety | The condition of being protected from death, permanently disabling injury, severe injury, and severe occupational illness. In the NASA context, this refers to safety of the public, astronauts, pilots, and the NASA workforce. (Note: Adapted from NPR 8000.4 and the NASA Safety Hierarchy.) |
Input Pedigree | A record of the traceability from the input data’s source through all aspects of its transmission, storage, and processing to its final form when using an M&S. (Note: Any changes from the real-world source data may be of significance to its pedigree. Ideally, this record includes important quality characteristics of the data at every stage of the process.) |
Intended Use | The expected purpose and application of an M&S. |
Kriging | An interpolation technique in which the surrounding measured values are weighted to derive a predicted value for an unmeasured location. Weights are based on the distance between the measured points, the prediction locations, and the overall spatial arrangement among the measured points. |
Limits of Operation | The boundary of the set of parameters for an M&S, based on the outcomes of verification, validation, and uncertainty quantification, beyond which the accuracy, precision, and uncertainty of the results are indeterminate. (Note: NASA-STD-7009A.) |
M&S Risk | The potential for shortfalls with respect to sufficiently representing an RWS. |
Margin | The allowances carried in budget, projected schedules, and technical performance parameters (e.g., weight, power, or memory) to account for uncertainties and risks. (Note: NASA-SP-2016-6105, NASA Systems Engineering Handbook.) |
Mathematical Model | The mathematical equations, boundary values, initial conditions, and modeling data needed to describe the conceptual model. (Note: Adapted from ASME V&V 10, Guide for Verification and Validation in Computational Solid Mechanics.) |
Mission Success Criteria | Specifications against which the program or project will be deemed to have achieved operational objectives. |
Model | A description or representation of a system, entity, phenomenon, or process. (Note: A model may be constructed from multiple sub-models; the sub-models and the integrated sub-models are all considered models. Likewise, any data that go into a model are considered part of the model.) |
Model Capability | The potential or ability (of a model) to represent an RWS, entity, phenomenon, or process. |
Model Uncertainty | Variation in M&S results due to assumptions, formulas, and representations, and not due to factors inherent in the RWS. |
Model Uncertainty Factor (MUF) | A semi-quantitative (i.e., a quantitative magnitude based on past experience rather than data) adjustment, additive, multiplicative, or both, made to the results of an M&S-based analysis to account for uncertainty. (Note: The MUF is also likely to have some associated confidence or coverage range.) |
Modeling | (a) The act of creating a system representation (i.e., the act of creating a model); (b) The act of utilizing a system representation (i.e., utilizing a model) as an approach for analyses. |
Numerical Errors | Errors traceable to various sources, including but not limited to: floating-point precision, inherent in all computer systems, which leads to round-off, underflow, and overflow; truncation of infinite series expansions; and approximations of exact solutions inherent in all numerical methods (e.g., approximation of derivatives and integrals by algebraic operations on sampled continuous functions). |
Peer Review | A technical assessment conducted by one or more persons of equal technical standing to person(s) responsible for the work being reviewed. |
Permissible Use | The purposes for which an M&S is formally allowed. |
Probabilistic | Pertaining to non-deterministic events, the outcome of which is described by a measure of likelihood. |
Proposed Use | A desired specific application of an M&S. |
Real World System (RWS) | The reality of interest the model is representing, which may include relevant operating conditions or aspects of its environment. (Note: The RWS may interact with its environment, i.e., a set of relevant elements external to the RWS, through the exchange of properties. The term RWS is used to differentiate between the “system represented” and the “modeling system” used for the analysis.) |
Recommended Practices | Guidelines developed by professional societies, best practices documented for specific simulation codes, and NASA Handbooks and Guidebooks. |
Referent | Data, information, knowledge, or theory against which simulation results can be compared. (NASA-STD-7009A; adapted from ASME V&V 10, Guide for Verification and Validation in Computational Solid Mechanics.) (Note: A referent may be the RWS to which the analysis is directed, or it could be a similar or analogous system, whereby the closeness of the referent to the RWS becomes pertinent, or a higher fidelity model.) |
Regression Testing | Selective checking of the quality, performance, or reliability of an M&S system or component to verify that modifications have not caused unintended effects and that the M&S still complies with its requirements. (Note: Adapted from ISO/IEC/IEEE 24765:2010 Systems and software engineering—Vocabulary. This term is in no way related to statistical regression analysis.) |
Responsible Party | The group or individual identified as accountable for complying with requirements in NASA-STD-7009A. (Note: Different parties may be identified for the various requirements.) |
Results Robustness | The characteristic whereby the results of an M&S do not change in a meaningful way in response to as-designed variations in control parameters or input variables. (Note: Key sensitivities are parameters and variables shown to produce large changes in results from relatively small perturbations to inputs.) |
Risk | The potential for shortfalls with respect to achieving explicitly established and stated objectives. (Note: This definition has been updated from the definition found in NASA-STD-7009A, to the recently revised definition found in NPR 8000.4B, 12/06/2017.) |
Scenario | The description or definition of the relevant system and environmental assumptions, conditions, or parameters used to drive the course of events during the run of a simulation model. (Note: The scenario may include, but is not limited to the set of initial conditions, a set of assumptions, the values of relevant parameters [including system and environmental conditions, locations and quantities of objects, entities, or resources], or a sequence of actions, which may be specified in the model itself. Running the model with the given scenario is the simulation.) |
Sensitivity Analysis | The study of how variation in the output of an M&S can be apportioned to different sources of variation in the model input and parameters. (Note: The Results Robustness of an M&S-based analysis is obtained via sensitivity analysis (NASA-STD-7009A, adapted from Saltelli, 2005).) |
Simulation | The imitation of the behavioral characteristics of a system, entity, phenomenon, or process. |
Stochastic | Involving or containing a random variable or variables. Pertaining to chance or probability. (Note: http://mathworld.wolfram.com/Stochastic.html.) |
Subject Matter Expert (SME) | An individual having education, training, or experience in a particular technical or operational discipline, system, or process and who participates in an aspect of M&S requiring their expertise. |
Tailoring | The process used to adjust or modify a prescribed requirement to better meet the needs of a specific program/project task or activity. |
Uncertainty | (a) The estimated amount or percentage by which an observed or calculated value may differ from the true value; (b) A broad and general term used to describe an imperfect state of knowledge or a variability resulting from a variety of factors, including but not limited to lack of knowledge, applicability of information, physical variation, randomness or stochastic behavior, indeterminacy, judgment, and approximation (adapted from NPR 8715.3, NASA General Safety Program Requirements); (c) A non-negative parameter characterizing the dispersion of values attributed to a measured quantity. |
Uncertainty Characterization | The process of identifying all relevant sources of uncertainties and describing their relevant qualities (qualitatively or quantitatively) in all models, simulations, and experiments (inputs and outputs). |
Uncertainty Quantification | The process of identifying all relevant sources of uncertainties; characterizing them in all models, experiments, and comparisons of M&S results and experiments; and quantifying uncertainties in all relevant inputs and outputs of the simulation or experiment. (Note: NASA-STD-7009A.) |
Unit Testing | Any type of software testing conducted on the smallest meaningful, testable fragments of code to ensure the code behaves exactly as intended under various conditions. For procedural programming languages, such code fragments are generally functions or subroutines. |
Use Assessment | The process of determining if an M&S is accepted for a Proposed Use. |
Validation | The process of determining the degree to which a model or a simulation is an accurate representation of the real world from the perspective of the intended uses of the M&S. |
Verification | The process of determining the extent to which an M&S is compliant with its requirements and specifications as detailed in its conceptual models, mathematical models, or other constructs. |
Voluntary Consensus Standards (VCS) | Standards developed or adopted by VCS bodies, both domestic and international, that include provisions requiring that owners of relevant intellectual property have agreed to make that intellectual property available on a non-discriminatory, royalty-free, or reasonable royalty basis to all interested parties. (Note: OMB Circular No. A-119.) |
Waiver | A documented authorization intentionally releasing a program or project from meeting a requirement. (Note: NPR 7120.5D. Deviations and exceptions are considered special cases of waivers.) |
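The Design of Experiments entry describes purposeful changes to input variables with measured responses. A minimal full-factorial sketch in Python follows; the factor names, levels, and response function are illustrative placeholders, not taken from the glossary:

```python
from itertools import product

# Full-factorial design: every combination of the chosen input levels is run,
# and the response is recorded for each run. Applicable equally to a physical
# test campaign or a computer simulation model.
factors = {"temperature": [250, 300], "pressure": [1.0, 2.0], "flow": [5, 10]}

def response(temperature, pressure, flow):
    # Stand-in for a physical test or a simulation run (hypothetical model).
    return 0.01 * temperature + 3.0 * pressure - 0.2 * flow

names = list(factors)
design = [dict(zip(names, levels)) for levels in product(*factors.values())]
results = [(run, response(**run)) for run in design]
# Three two-level factors give 2**3 = 8 runs.
```

Fractional-factorial or space-filling designs reduce the run count when the number of factors grows; the full-factorial form above is the simplest case.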
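The Kriging entry describes weighting surrounding measured values by their distances to derive a prediction at an unmeasured location. A minimal one-dimensional simple-kriging sketch, assuming a Gaussian covariance model and a known mean (both illustrative modeling choices, not prescribed by the definition):

```python
import numpy as np

def simple_kriging(x_obs, z_obs, x_pred, mean, sill=1.0, length=1.0):
    """Predict z at x_pred from observations via simple kriging with the
    Gaussian covariance model C(h) = sill * exp(-(h/length)**2).
    The weights depend on the distances among the observed points and
    between the observed points and the prediction location."""
    def cov(h):
        return sill * np.exp(-(h / length) ** 2)

    d_obs = np.abs(x_obs[:, None] - x_obs[None, :])   # obs-to-obs distances
    d_pred = np.abs(x_obs - x_pred)                   # obs-to-prediction distances
    weights = np.linalg.solve(cov(d_obs), cov(d_pred))
    return mean + weights @ (z_obs - mean)

x_obs = np.array([0.0, 1.0, 2.0])
z_obs = np.sin(x_obs)
z_hat = simple_kriging(x_obs, z_obs, x_pred=1.5, mean=0.0)
```

Note that kriging interpolates exactly at the measured points: when x_pred coincides with an observation, the solved weights reduce to a unit vector selecting that observation.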
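The Numerical Errors entry lists three distinct sources; each can be demonstrated in a few lines of Python (IEEE 754 binary64 floats, so the behaviors are reproducible on any conforming platform):

```python
import math

# Round-off: 0.1 and 0.2 have no exact binary representation, so the sum
# differs from 0.3 by a small but nonzero amount.
roundoff = (0.1 + 0.2) - 0.3

# Truncation of an infinite series: exp(1) approximated by a partial Taylor sum.
def exp_taylor(x, n_terms):
    """Partial sum of the Taylor series for exp(x), truncated at n_terms."""
    return sum(x**k / math.factorial(k) for k in range(n_terms))

truncation = abs(math.exp(1.0) - exp_taylor(1.0, 5))

# Approximation of a derivative by an algebraic operation on sampled values
# of a continuous function (forward finite difference, error O(h)).
def forward_diff(f, x, h):
    return (f(x + h) - f(x)) / h

approx_error = abs(forward_diff(math.sin, 0.0, 1e-3) - math.cos(0.0))
```

All three errors are small here, but in long computations they can accumulate or amplify, which is why the domain of verification bounds where computational accuracy is acceptable.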
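The Sensitivity Analysis entry describes apportioning output variation to input variation. A minimal one-at-a-time (OAT) sketch in Python; the toy model and perturbation size are illustrative, and a full variance-based method (e.g., Sobol indices, as in Saltelli's work) would go further:

```python
import numpy as np

def oat_sensitivity(model, nominal, delta=0.01):
    """One-at-a-time sensitivity: perturb each input by +/- delta (relative)
    about the nominal point and estimate d(output)/d(input_i) by central
    differences. 'model' maps an input array to a scalar output."""
    base = model(nominal)
    sens = {}
    for i, x in enumerate(nominal):
        up = nominal.copy(); up[i] = x * (1 + delta)
        dn = nominal.copy(); dn[i] = x * (1 - delta)
        sens[i] = (model(up) - model(dn)) / (2 * x * delta)
    return base, sens

# Toy model whose output is dominated by the first parameter.
model = lambda p: 10.0 * p[0] + 0.1 * p[1]
base, sens = oat_sensitivity(model, np.array([1.0, 1.0]))
```

Here the first parameter is a key sensitivity (large output change per unit perturbation), while the second barely matters; relatively insensitive results across the as-designed parameter range are what the Results Robustness entry describes.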
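The Unit Testing entry calls for testing the smallest meaningful fragments of code under various conditions. A minimal sketch using Python's standard `unittest` module; the `clamp` function is a hypothetical example of such a fragment:

```python
import unittest

def clamp(x, lo, hi):
    """Restrict x to the closed interval [lo, hi]."""
    return max(lo, min(x, hi))

class TestClamp(unittest.TestCase):
    # Exercise the smallest testable unit (one function) under the
    # conditions that matter: inside, below, and above the interval.
    def test_inside(self):
        self.assertEqual(clamp(5, 0, 10), 5)
    def test_below(self):
        self.assertEqual(clamp(-1, 0, 10), 0)
    def test_above(self):
        self.assertEqual(clamp(99, 0, 10), 10)
```

Rerunning such tests after every modification is the regression testing described above: it verifies that changes elsewhere have not caused unintended effects in code that previously passed.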