Hardening RAG Systems Against Hallucinations

When building a RAG system, one must take the risk of hallucinations into account. In this talk I will share my experience identifying sources of hallucinations and present proven hardening strategies that help detect and mitigate hallucinations in a RAG application built on the Azure stack (with OpenAI, AI Search, and Document Intelligence as key components).
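To make the topic concrete, below is a minimal sketch of one common hardening pattern for a RAG pipeline on the Azure stack: ground every answer in passages retrieved from Azure AI Search and instruct the model to refuse rather than speculate when the context does not contain the answer. The index name, field name, deployment name, and environment variables are illustrative assumptions, not details from the talk.

```python
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

# Clients for retrieval (Azure AI Search) and generation (Azure OpenAI).
search_client = SearchClient(
    endpoint=os.environ["SEARCH_ENDPOINT"],              # assumed env var
    index_name="docs-index",                             # assumed index name
    credential=AzureKeyCredential(os.environ["SEARCH_KEY"]),
)
openai_client = AzureOpenAI(
    azure_endpoint=os.environ["AOAI_ENDPOINT"],          # assumed env var
    api_key=os.environ["AOAI_KEY"],
    api_version="2024-02-01",
)

def grounded_answer(question: str, top_k: int = 3) -> str:
    # Retrieve the passages the answer must be grounded in.
    hits = search_client.search(question, top=top_k)
    context = "\n\n".join(hit["content"] for hit in hits)  # assumed field name

    # Constrain the model to the retrieved context; refusing is preferred
    # over inventing an answer.
    messages = [
        {
            "role": "system",
            "content": (
                "Answer ONLY from the context below. "
                "If the context does not contain the answer, reply "
                "'I don't know.'\n\nContext:\n" + context
            ),
        },
        {"role": "user", "content": question},
    ]
    response = openai_client.chat.completions.create(
        model="gpt-4o",    # assumed deployment name
        messages=messages,
        temperature=0,     # lower temperature reduces speculative output
    )
    return response.choices[0].message.content
```

The design choice illustrated here is that hallucination risk is reduced at two points: the prompt restricts the model to retrieved evidence, and a deterministic temperature discourages embellishment beyond that evidence.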

  • English


