From Chaos to Clarity: Building Searchable Knowledge Graphs with Local LLMs

Large Language Models aren't just for chatbots—we can use them to power sophisticated information extraction systems that run entirely on your own infrastructure. This talk demonstrates how to transform organizational documentation into searchable knowledge graphs using local LLMs, keeping all data private within UiO's network.

We'll show live how a local LLM can:

  1. Read our documentation (Markdown, GitHub, Vortex pages, Mattermost)
  2. Extract entities and relationships automatically (people, services, hardware, software, organizational structures)
  3. Build a knowledge graph with full provenance (source document, URL, line numbers) and embeddings
  4. Answer questions in natural language by querying the graph
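Steps 2 and 3 can be sketched in a few lines. This is a minimal illustration, assuming the LLM has already returned (subject, relation, object) triples for each document chunk; the triples, file names, and line numbers below are made up for the example, not real USIT data.

```python
from collections import defaultdict

def build_graph(triples):
    """Build an adjacency map, keeping per-edge provenance
    (source document and line number) alongside each relation."""
    graph = defaultdict(list)
    for subj, rel, obj, source, line in triples:
        graph[subj].append({"relation": rel, "target": obj,
                            "source": source, "line": line})
    return graph

def neighbors(graph, entity):
    """Return the entities directly connected to `entity`."""
    return [edge["target"] for edge in graph.get(entity, [])]

# Hypothetical triples an extraction pass might emit.
triples = [
    ("Autotekst", "is_a", "transcription service", "autotekst.md", 3),
    ("TSD-Autotekst", "is_a", "transcription service", "tsd.md", 12),
    ("TSD-Autotekst", "runs_in", "TSD", "tsd.md", 14),
]

graph = build_graph(triples)
print(neighbors(graph, "TSD-Autotekst"))
# prints: ['transcription service', 'TSD']
```

A real pipeline would add embeddings per node and store the graph in a database, but the core data shape is just this: entities, typed edges, and provenance on every edge.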

No external APIs. No cloud services. Everything stays on local servers.

Perfect for staff curious about practical LLM applications—I'll walk through concrete examples using real USIT documentation. You'll see how multi-hop queries can answer complex questions like *"What is the difference between Autotekst and TSD-Autotekst?"*
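The multi-hop part is ordinary graph traversal: to compare two services, the system walks relation edges until the entities meet. A minimal sketch, with a toy graph standing in for the extracted one (the node names are illustrative, not the actual graph contents):

```python
from collections import deque

def find_path(graph, start, goal):
    """Breadth-first search over relation edges;
    returns the first (shortest) path of entity names, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for _relation, target in graph.get(node, []):
            if target not in seen:
                seen.add(target)
                queue.append(path + [target])
    return None

# Toy graph: both services share a "transcription service" type node,
# but only TSD-Autotekst has a "runs_in" edge to TSD.
graph = {
    "Autotekst": [("is_a", "transcription service")],
    "TSD-Autotekst": [("is_a", "transcription service"),
                      ("runs_in", "TSD")],
}
print(find_path(graph, "Autotekst", "transcription service"))
# prints: ['Autotekst', 'transcription service']
```

In the full system an LLM would turn the question into such traversals and then summarize the paths (and their provenance) in natural language.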

Published Mar. 31, 2026 09:03 AM - Last modified Mar. 31, 2026 09:03 AM