Ethical Considerations in AI Simulations for Designing Assistive Technologies

Evin Miser
Orcun Sarioguz


Current ethical debates on the use of artificial intelligence (AI) in healthcare approach AI technology in three primary ways. First, they assess the risks and potential benefits of current AI-enabled products using ethical checklists. Second, they propose ex ante lists of ethical values relevant to the design and development of assistive technologies. Third, they advocate for incorporating moral reasoning into AI's automation processes. These three perspectives dominate the discourse, as a brief literature summary shows. We propose a fourth approach: viewing AI as a methodological tool to aid ethical reflection. This involves an AI simulation concept informed by three elements: 1) stochastic human behavior models based on behavioral data for simulating realistic scenarios, 2) qualitative empirical data on value statements regarding internal policy, and 3) visualization components to illustrate the impact of variable changes. This approach aims to inform an interdisciplinary field about anticipated ethical challenges or trade-offs in specific settings, prompting a re-evaluation of design and implementation plans. It is particularly useful for applications involving complex values and behaviors or limited communication resources, such as dementia care or care for individuals with cognitive impairments. While simulation does not replace ethical reflection, it allows for detailed, context-sensitive analysis during the design process and before implementation. Finally, we discuss the quantitative analysis methods enabled by stochastic simulations and the potential for these simulations to enhance traditional thought experiments and future-oriented technology assessments.
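To make the first element of the simulation concept concrete, a stochastic human behavior model can be as simple as a Markov chain over behavior states. The sketch below is a minimal illustration, not the authors' implementation: the states, transition probabilities, and the notion of "trade-off-relevant" states (where an assistive system might face an autonomy-versus-safety conflict) are all hypothetical assumptions chosen for the example.

```python
import random

# Hypothetical behavior states for a care recipient; in practice these
# would be estimated from behavioral data, as the article proposes.
STATES = ["resting", "wandering", "requesting_help", "distressed"]

# Illustrative transition probabilities (rows sum to 1.0).
TRANSITIONS = {
    "resting":         {"resting": 0.70, "wandering": 0.20, "requesting_help": 0.05, "distressed": 0.05},
    "wandering":       {"resting": 0.30, "wandering": 0.40, "requesting_help": 0.10, "distressed": 0.20},
    "requesting_help": {"resting": 0.50, "wandering": 0.10, "requesting_help": 0.30, "distressed": 0.10},
    "distressed":      {"resting": 0.20, "wandering": 0.20, "requesting_help": 0.30, "distressed": 0.30},
}

def simulate(steps, seed=None, start="resting"):
    """Run one stochastic trajectory and count steps spent in states
    where an assistive system might face a value trade-off (here,
    assumed to be 'wandering' and 'distressed')."""
    rng = random.Random(seed)
    state, trace = start, []
    for _ in range(steps):
        probs = TRANSITIONS[state]
        state = rng.choices(list(probs), weights=list(probs.values()))[0]
        trace.append(state)
    conflicts = sum(s in ("wandering", "distressed") for s in trace)
    return trace, conflicts

trace, conflicts = simulate(100, seed=42)
print(f"trade-off-relevant steps: {conflicts}/100")
```

Running many such trajectories with varied parameters yields the kind of quantitative, scenario-level statistics the abstract refers to, and the per-step trace is the natural input for the visualization component (element 3).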

Article Details

How to Cite
Miser, E., & Sarioguz, O. (2024). Ethical Considerations in AI Simulations for Designing Assistive Technologies. Journal of Artificial Intelligence General Science (JAIGS), ISSN: 3006-4023, 4(1), 209–218.