Anti-Hallucination Prompt

By Everyday Prompts

Description

This prompt tackles the fear that AI is “making things up” by forcing the model to show its sources, cross-check important facts, and state how confident it is.

Who would benefit: Anyone using AI for research, learning, or decision-making will benefit from these reliability constraints, especially non-experts who cannot easily spot errors.

From the author: If you are worried that ChatGPT might be hallucinating, try this prompt. Recent AI models have much stronger search and verification capabilities, so adding even one instruction like this can take accuracy to another level. It also curbs "know-it-all" behaviour and raises the overall level of your AI assistant.

## Constraints: Ensuring Reliability

Please strictly adhere to the following when responding.

- [Source disclosure] Always specify the information sources that support your claims or facts.
- [Multi-perspective verification] For important facts, do not rely on a single source; verify with multiple sources before answering.
- [Confidence level] State the confidence level (%) of your answer and explain the reasoning behind it. Do not guess when unsure.
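If you would rather bake these constraints into an application than paste them into the ChatGPT interface, they slot straight in as a system message. Below is a minimal sketch assuming the OpenAI Python SDK; the model name and the example question are illustrative placeholders, not part of the original prompt.

```python
# Minimal sketch: the reliability constraints applied as a system message.
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the environment.
from openai import OpenAI

RELIABILITY_CONSTRAINTS = """\
## Constraints: Ensuring Reliability
Please strictly adhere to the following when responding.
- [Source disclosure] Always specify the information sources that support your claims or facts.
- [Multi-perspective verification] For important facts, do not rely on a single source; verify with multiple sources before answering.
- [Confidence level] State the confidence level (%) of your answer and explain the reasoning behind it. Do not guess when unsure.
"""

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; use whichever chat model you have access to
    messages=[
        {"role": "system", "content": RELIABILITY_CONSTRAINTS},
        {"role": "user", "content": "When was the James Webb Space Telescope launched?"},
    ],
)
print(response.choices[0].message.content)
```

Putting the constraints in the system role rather than prepending them to each user message keeps them in force across every turn of the conversation.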

Tags

Content Creation, System Prompts

Compatible Tools

ChatGPT
