Results
Decision characteristics
| Decision need | Clinical example | Defense example |
|---|---|---|
| Urgency, need for rapid initial response | Emergency department triage. Triage requires near-instantaneous decisions to avoid delaying critical treatment for high-priority patients[49]. | Command centers. The air force command center, which controls all available aircraft, must decide which requests for air support from ground forces in trouble get priority; timeliness is sometimes essential. |
| Adaptation to changing circumstances, new information | Ventilator management. As patient respiratory function changes, ventilator settings must be adjusted accordingly[50]. | Improvised explosive device (IED) disarmament. IED tactics and technology change over a period of weeks, so disarmament strategies must change as well. DS includes real-time surveillance and computer models that help anticipate adversary actions. |
| High consequence, life-or-death implications | Many; for example, cancer chemotherapy order sets, dose checking, and radiation therapy planning. | Mission choice. Sending troops on a dangerous mission from which they may not return. The decision to send a SEAL team in to get bin Laden risked the lives of the team and an international crisis, but was in pursuit of a compelling objective. |
| Uncertain possibilities due to incomplete, imperfect information | Diagnostic expert systems. Diagnosis relies upon accurate assessment of signs and symptoms, but these do not always provide reliable information, such as when the patient is unable to communicate effectively. | War planning. Enemy behavior cannot be predicted with full precision; information on the enemy is incomplete[20] and imperfect, with deep uncertainty[18]. War planners must anticipate and be ready to deal with many adversary tactics. In the 2003 war with Iraq, U.S. forces prepared for chemical-weapon attacks, mass movement of refugees, burning of oil facilities, etc. |
| Balancing disparate types of risks and benefits | Treatment selection. The effectiveness of adjuvant chemotherapy, which has its own risks and side effects, depends on patient factors and tumor stage[51]. | Attacks near population centers. Air force attacks must weigh the weapons used against target accuracy, the risk of civilian casualties, and mission effectiveness, balancing the risks of collateral damage and international incident against the intended effect. DS includes accurate computer maps, weapon-effects models, and rigid doctrine and discipline. |
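The urgency row above turns on rapid prioritization of competing demands. A minimal sketch of how a triage decision aid might rank patients is shown below; the vital-sign fields, thresholds, and weights in `severity_score` are hypothetical illustrations, not a clinical scoring system from the source.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class TriageEntry:
    # heapq is a min-heap, so the severity score is negated
    # to pop the highest-priority patient first.
    neg_severity: float
    patient_id: str = field(compare=False)

def severity_score(vitals: dict) -> float:
    """Hypothetical scoring rule: higher means more urgent."""
    score = 0.0
    if vitals["heart_rate"] > 120 or vitals["heart_rate"] < 40:
        score += 2.0
    if vitals["spo2"] < 90:
        score += 3.0
    if vitals["systolic_bp"] < 90:
        score += 2.5
    return score

patients = {
    "A": {"heart_rate": 80, "spo2": 98, "systolic_bp": 120},
    "B": {"heart_rate": 130, "spo2": 88, "systolic_bp": 85},
    "C": {"heart_rate": 95, "spo2": 89, "systolic_bp": 110},
}

queue: list[TriageEntry] = []
for pid, vitals in patients.items():
    heapq.heappush(queue, TriageEntry(-severity_score(vitals), pid))

# Pop patients in order of decreasing severity.
order = [heapq.heappop(queue).patient_id for _ in range(len(queue))]
print(order)  # ['B', 'C', 'A']
```

A priority queue keeps the insertion and extraction steps fast, which matters when the decision aid must respond in near-real time as new patients arrive.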
Principles of decision-making
Decision support system design features
| Design feature and problems addressed | Key lessons from the literature |
|---|---|
| Feature: Broad, system-level views of the big picture. Problems addressed: tunnel vision; cognitive biases that prevent consideration of the full range of options (e.g., representativeness heuristic, anchoring and adjustment). Parallels in CDS literature: none. | ● Provide a broad overview so that the decision-maker can see the entire environment, what is known, and what is not[22]. Develop a comprehensive view of operations and interconnected systems by identifying key nodes within each system (persons, places, or things), establishing relationships, emphasizing baseline data for the current situation and how it relates to known solutions, and categorizing information objectively[17]. Provide multiple levels of detail (i.e., the broad view with zooms)[45]. ● Frame problems with all the relevant factors and friendly/opposing viewpoints, posing questions throughout the process that prompt users to search for the root of the problem and to think about what is not known[17]. Continue the problem-formulation process until an opposing view has been considered[37, 42]. |
| Feature: Customized to address specified problems and user needs. Problems addressed: generic systems with too much complexity, which are not user-friendly and do not handle any single problem well. Parallels in CDS literature: addressed somewhat by Bates[9]. | ● Development should balance the virtues of careful initial design and rapid prototyping[47]. Tools that are simplified and customized for niche uses may, in some instances, be developed rapidly and avoid unnecessary complexity. Expanding a niche system to other user groups then requires a significant jump and should be done only after the processes, data formats, and availability are evaluated[27]. ● Commercial, off-the-shelf systems may work, but they need to be adapted appropriately to the targeted users[25, 32]. In some situations, fully customized systems are required[22]. They should be part of an integrated information system, follow standard software development processes (development, testing, maintenance), and use standards-consistent hardware and software platforms for acceptability, reliability, and maintenance[35]. ● Different situations may demand different tools. A defense operation involves many phases – planning, deployment, execution, recovery, and post-operation work – and different tools are needed at each phase[19]. |
| Feature: Involving users in system design. Problems addressed: poor adoption of the system, user trust, ease of use. Parallels in CDS literature: addressed in many studies; see Kawamoto[7]. | |
| Feature: Transparency that documents the underlying methodologies and decision processes. Problems addressed: user acceptance and over- or under-trust of system recommendations, “satisficing” behavior, ethical biases. Parallels in CDS literature: partial; Kawamoto addressed “justification of decision support via provision of reasoning and evidence”[7]. | ● Ensure that users can apply their own judgment and explore trade-offs by using interactive tools and visuals to show likely/unlikely possibilities, short- and long-term trends, etc. – give “better answers, not just the answer,” including supporting evidence and key drivers of outcomes. Show how trade-offs between competing objectives affect outcomes[27], and provide the right level of granularity to back up recommendations[23]. “Build insight, not black boxes”[27]. ● Collect metadata – data that describe the nature of the data, such as user actions and date/time stamps. Build in system capabilities that show what actions are recommended, when they were taken, and what criteria were satisfied to justify those actions. This facilitates tracking how a decision was made and can be used to improve decisions or provide liability protection[19, 22]. ● Elicit the decision-making structure[39]. Provide information about the reliability of the decision aid, and about the reliability of human judgment, to encourage appropriate use of systems – e.g., avoiding blind adherence (overuse) and distrust (underuse)[24]. Restate issues and build flow diagrams that challenge the user to consider how each piece of evidence supports their decision[37]. |
| Feature: Effective organization and presentation of data. Problems addressed: cognitive limits on processing large volumes of data; meaningful application of naturalistic-intuitive decision-making within rational-analytic DS systems. Parallels in CDS literature: partial; the topic of “relevant data display” in Wright[6]. | ● Use presentation methods such as summary dashboards, graphics and visuals, interactive simulations and models, storyboards, matrices, spreadsheets, qualitative data fields, and customized interfaces[25, 26, 33, 37, 39, 42]. The most effective presentation format depends on the situation, and research does not consistently indicate which format is right in which situation[39]. ● Display data as patterns, such as trends, which humans recognize better than computers do, and avoid asking users to extract extra information from unformatted text[22]. ● Provide well-conceived default formats and easy restoration, but allow users to control and customize displays using scatter diagrams, bar charts, dashboards, statistical analyses, reports, etc.[32]. Organize data using filtering and retrieval functions that allow users to change the aggregation level from highly detailed to overall summaries, but add alerts in case users filter out important information[22, 32] – i.e., allow users to “pull” extra information as desired. ● “Push” key information and updates to users – deliver prompts when critical new pieces of information arrive, tailored to the action requirements of specific users, and develop pre-programmed sets of plans that can be applied in response to new information[21]. Good DS design pushes out only the key information that facilitates the task, without overwhelming the user. ● Use consistent standards and terminology so that words, situations, and actions are clear, and to increase user friendliness[21]. |
| Feature: Multi-scenario, multi-option generation. Problems addressed: co-existing presence of rational-analytic and naturalistic-intuitive decision-making; unreliable nature of optimization-based models. Parallels in CDS literature: none. | ● Use multi-scenario generation, portfolio analysis, foresight methods, and branch-and-sequel methods to educate the decision-maker on the implications of uncertainty and ways to hedge, including planned adaptation[18]. Use rational-analytic structures to assure the presence of alternative choices (and possibly to apply probabilities and weights), but avoid making a single recommendation about the final choice – instead, show how changes in variables or criteria affect assessments[18, 39]. ● Allow the user to explore various outcomes by generating a distribution of all plausible outcomes, accounting for both desired and undesired effects[20]. Simplify by grouping assumptions (including those about values and disagreements), so that users can more readily see how choice depends on “perspective”[45]. ● Work backward from the observed outcome, mapping out the possible chains of events that could have led to it[28]. Alternatively, identify the potential outcomes, then examine all the branches that could lead to those outcomes. Use a hierarchical/nested design to show the DS rules that lead to different results[29]. Functionally, the point is to show what one would have to believe to get different results. |
| Feature: Collaborative, group, and web-based systems. Problems addressed: de-centralized information sources; team collaboration in decision-making; interoperability of systems; need for a broad range and depth of expertise from individuals in disparate locations. Parallels in CDS literature: none. | ● Recognize that expert opinion is often not nearly as reliable as assumed; its reliability depends heavily on the details of knowledge elicitation[54]. ● For “wicked problems” with unclear solutions, use cognitive, dialogue, and process mapping methods to encourage brainstorming and to organize a group’s ideas[34]. |
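The multi-scenario lesson above – generate a distribution of plausible outcomes and show how changes in criteria shift the assessment, rather than issuing one final recommendation – can be sketched with a simple Monte Carlo exploration. The outcome model, parameter values, and risk weight below are hypothetical placeholders for illustration, not from the source.

```python
import random
import statistics

def outcome(effectiveness: float, risk: float, weight_risk: float) -> float:
    """Hypothetical scoring model: benefit minus weighted risk."""
    return effectiveness - weight_risk * risk

def explore(option_params: dict, weight_risk: float, n: int = 10_000) -> dict:
    """Sample uncertain inputs and return a distribution summary,
    not a single answer, so the user sees the spread of outcomes."""
    rng = random.Random(42)  # fixed seed so the sketch is reproducible
    samples = []
    for _ in range(n):
        eff = rng.gauss(option_params["mean_eff"], option_params["sd_eff"])
        risk = rng.gauss(option_params["mean_risk"], option_params["sd_risk"])
        samples.append(outcome(eff, risk, weight_risk))
    samples.sort()
    return {
        "median": statistics.median(samples),
        "p10": samples[int(0.10 * n)],   # 10th percentile
        "p90": samples[int(0.90 * n)],   # 90th percentile
    }

option_a = {"mean_eff": 0.7, "sd_eff": 0.1, "mean_risk": 0.3, "sd_risk": 0.05}

# Show how a change in the risk weight (the user's "perspective")
# shifts the assessment, instead of recommending one final choice.
for w in (0.5, 1.0, 2.0):
    summary = explore(option_a, weight_risk=w)
    print(f"risk weight {w}: median {summary['median']:.2f}, "
          f"10th-90th pct [{summary['p10']:.2f}, {summary['p90']:.2f}]")
```

Presenting the 10th–90th percentile band alongside the median is one way to give “better answers, not just the answer”: the user sees both the central estimate and how much the outcome depends on uncertain inputs and on the weight placed on risk.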