According to a recent Associated Press report, multiple software engineers, developers, and academic researchers say that OpenAI's speech-transcription tool Whisper is prone to fabricating content ... a phenomenon known as "AI hallucination."
Although OpenAI has warned users against deploying Whisper in high-risk domains, the tool remains widely used across industries worldwide, particularly in healthcare ...
[TechWeb] A recent Associated Press report has exposed a problem with OpenAI's speech ... Researchers revealed that Whisper sometimes generates large amounts of fabricated content, known as "AI hallucinations."
IT之家, October 28 — According to an Associated Press report on October 27 local time, more than ten software engineers, developers, and academic researchers say OpenAI's speech-transcription tool Whisper has a ...
OpenAI's Whisper model, once hailed as approaching "human-level robustness and accuracy" in transcription, is now mired in "hallucinations." This tendency of the AI to make things up is prompting deep industry reflection on AI ethics, deployment safety, and the technology's future direction.
OpenAI's Whisper, an artificial intelligence (AI) speech recognition and transcription tool launched in 2022, has been found to hallucinate or make things up -- so much so that experts are worried ...
As reported by AP News, researchers and experts are sounding the alarms about Whisper, claiming that not only is it inaccurate, it often makes things up entirely. While all AI is prone to ...
OpenAI’s AI audio transcription tool Whisper is producing frequent “AI hallucinations” despite its rapid adoption in “high-risk industries” like healthcare. AI hallucination is where a ...
Software engineers, developers, and academic researchers have serious concerns about transcriptions from OpenAI’s Whisper ... discussion around generative AI’s tendency to hallucinate ...