Apple's iPhone voice-to-text feature has come under scrutiny after a viral TikTok video showed a user encountering a glitch in which the word 'racist' momentarily displayed as 'Trump' before reverting to 'racist.'
Upon investigation, sources were able to reproduce the issue multiple times: when the word was spoken, the dictation feature briefly displayed 'Trump' instead of 'racist,' though not consistently.
Apple has acknowledged the problem and stated that it is working on a fix for the speech recognition model that powers Dictation. The company attributed the issue to phonetic overlap, which can affect words containing an 'r' consonant when dictated.
This incident is not the first time technology has been embroiled in controversy related to political figures. In a separate incident, Amazon's Alexa faced backlash for allegedly displaying bias by providing reasons to vote for certain candidates while refusing to do so for others.
Amazon later attributed the responses to an error in pre-programmed manual overrides and addressed the issue by applying overrides to all candidates and election-related prompts.
Following an audit of its system, Amazon stated that Alexa now maintains neutrality in responses to political queries.
Both Apple and Amazon have taken steps to rectify the issues with their respective technologies, underscoring the importance of unbiased and accurate functionality in voice-assisted devices.