Research Article
Voice conversion and cloning: psychological and ethical implications of intentionally synthesising familiar voice identities
C. McGettigan, S. Bloch, C. Bowles, T. Dinkar, N. Lavan, J.C. Reus & V. Rosi
t.dinkar@hw.ac.uk
Abstract
Voice identity conversion and cloning technologies use artificial intelligence to generate the auditory likeness of a specific human talker’s vocal identity. Given the deeply personal nature of voices, the widening availability of these technologies brings both opportunities and risks for human society. This article outlines key concepts and findings from psychological research on self-voice and other-voice perception that have a bearing on the potential impacts of synthetic voice likenesses on human listeners. Additional insights from speech and language therapy, human–computer interaction, ethics, and the law are incorporated to examine the broader implications of emergent and future voice cloning technologies.
Keywords
voice identity, voice synthesis, artificial intelligence, voice perception, psychology, ethics

Copyright statement
© The author(s) 2025. This is an open access article licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License.
Cite this article
McGettigan, C., Bloch, S., Bowles, C., Dinkar, T., Lavan, N., Reus, J.C. & Rosi, V. (2025), ‘Voice conversion and cloning: psychological and ethical implications of intentionally synthesising familiar voice identities’, Journal of the British Academy, 13(3): a31. https://doi.org/10.5871/jba/013.a31


