- Title
How to Mitigate Hallucination Risk in GenAI.
- Authors
Black, Lamont; Stern, Matthew
- Abstract
The article shares strategies for accounting and finance professionals to mitigate hallucination risk and safeguard against the threat of incorrect or misleading information in generative artificial intelligence (AI). These include setting the precision parameters of large language model (LLM) chatbots, prompt engineering, checking the references provided, uploading documents and files, fine-tuning an AI model on a specialized data set, and using a retrieval-augmented generation (RAG) system.
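One mitigation the abstract names, retrieval-augmented generation, can be sketched in miniature. The toy keyword-overlap retriever, prompt wording, and document set below are illustrative assumptions, not the authors' implementation; a production RAG system would use embedding-based retrieval against a vetted corpus.

```python
# Minimal RAG sketch: retrieve grounding passages, then build a prompt that
# instructs the model to answer only from those passages. All names and the
# sample documents here are hypothetical.

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query; keep the top_k."""
    q = set(query.lower().split())
    ranked = sorted(
        documents,
        key=lambda d: len(q & set(d.lower().split())),
        reverse=True,
    )
    return ranked[:top_k]

def build_grounded_prompt(query: str, documents: list[str]) -> str:
    """Embed retrieved passages so the model answers from sources, not memory."""
    context = "\n".join(f"- {p}" for p in retrieve(query, documents))
    return (
        "Answer using ONLY the context below; if the answer is not there, "
        "say 'not found'.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

docs = [
    "FASB issued ASU 2023-09 on income tax disclosures.",
    "The office picnic is scheduled for June.",
    "ASU 2023-09 requires a disaggregated rate reconciliation.",
]
print(build_grounded_prompt("What does ASU 2023-09 require?", docs))
```

Constraining the model to retrieved context, and telling it to refuse when the context is silent, is what makes RAG a hedge against hallucinated answers.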
- Subjects
GENERATIVE artificial intelligence; HALLUCINATIONS; NATURAL language processing; ACCOUNTING standards; LANGUAGE models
- Publication
Strategic Finance, 2024, p1
- ISSN
1524-833X
- Publication type
Article