Objectives While ethicists have largely underscored the risks raised by digital health solutions operating with or without artificial intelligence (AI), limited research has addressed the need to also mitigate their environmental footprint and to equip health innovators and organisation leaders to meet responsibility requirements that go beyond clinical safety, efficacy and ethics. Drawing on the Responsible Innovation in Health framework, this qualitative study asks: (1) what practice-oriented tools are available for innovators to develop environmentally sustainable digital solutions, and (2) how are organisation leaders supposed to support them in this endeavour?
Methods Focusing on a subset of 34 tools identified through a comprehensive scoping review spanning the health sciences, computer sciences, engineering and social sciences, our qualitative thematic analysis identifies and illustrates how two responsibility principles (environmental sustainability and organisational responsibility) are meant to be put into practice.
Results Guidance on making digital solutions environmentally sustainable is found in 11 tools, whereas organisational responsibility is described in 33 tools. The former tools focus on reducing energy and materials consumption as well as pollution and waste production. The latter tools highlight executive roles for data risk management, data ethics and AI ethics. Only four tools translate environmental sustainability issues into tangible organisational responsibilities.
Conclusions Recognising that key design and development decisions in the digital health industry are largely shaped by market considerations, this study indicates that significant work lies ahead for medical and organisation leaders to support the development of solutions fit for climate change.
Data availability statement
Data are available upon reasonable request.
Contributors LR and PL contributed substantially to the design of the study. LR and RRdO were responsible for data collection. LR, PL, RRdO and HA were involved in data analyses. LR wrote the first draft of the article. All authors contributed to the interpretation of the findings, critically revised preliminary versions of the paper and contributed important intellectual content. All authors reviewed and edited the manuscript and approved the final version. The guarantor, PL, accepts full responsibility for the conduct of the study, had access to the data, and controlled the decision to publish.
Funding This study was funded through a peer-reviewed call for proposals of the International Observatory of the Societal Impacts of Artificial Intelligence and Digital Technologies (OBVIA) (no award/grant number). Our research team benefits from funding by the Canadian Institutes of Health Research (CIHR; #FDN-143294). Our research group infrastructure is supported by the Fonds de la recherche en santé du Québec (FRQ-S) (no award/grant number).
Competing interests None declared.
Provenance and peer review Not commissioned; externally peer reviewed.
Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.