Assessing Responsibility in Digital Solutions that Operate with or without AI: A Review for Policymakers

APR 2024

Abstract

Background

Digital health solutions that operate with or without artificial intelligence (D/AI) raise several responsibility challenges. Although many frameworks and tools have been developed, determining which principles should be translated into practice remains under debate. This scoping review aims to provide policymakers with a rigorous body of knowledge by asking: 1) What kinds of practice-oriented tools are available? 2) On what principles do they predominantly rely? 3) What are their limitations?

Methods

We searched six academic and three grey literature databases for practice-oriented tools, defined as frameworks and/or sets of principles with clear operational explanations, published in English or French from 2015 to 2021. Characteristics of the tools were qualitatively coded, and variations across the dataset were identified through descriptive statistics and a network analysis.

Findings

A total of 56 tools met our inclusion criteria: 19 health-specific tools (33.9%) and 37 generic tools (66.1%). They adopt a normative (57.1%), reflective (35.7%), operational (3.6%), or mixed (3.6%) approach to guide developers (14.3%), managers (16.1%), end users (10.7%), policymakers (5.4%), or multiple groups (53.6%). The frequency of the 40 principles varies greatly across tools (from 0% for ‘environmental sustainability’ to 83.8% for ‘transparency’). While 50% or more of the generic tools promote up to 19 principles, 50% or more of the health-specific tools promote only 10, and 50% or more of all tools disregard 21 principles. In contrast to the scattered network of principles proposed by academia, the business sector emphasizes closely connected principles. Few tools (17.9%) rely on a formal methodology.

Conclusion

Despite a lack of consensus, there is a solid knowledge base for policymakers to anchor their role in such a dynamic field. Because several tools lack rigour and ignore key social, economic, and environmental issues, an integrated and methodologically sound approach to responsibility in D/AI solutions is warranted.


Contributors

Carl Mörch

