In this upcoming series of articles, we explore Large Language Models (LLMs), Knowledge Graphs (KGs), and their integration. Our goal is to examine the popular pattern of combining them and, finally, to discuss to what extent this pattern will persist or perish in the future.
The first article focuses on KG-enhanced LLMs: LLMs enriched with structured knowledge from graphs.
The second article covers LLM-augmented KGs: LLMs assisting in constructing or extending knowledge graphs.
The third article explores the fusion of LLMs and KGs, discussing the long-term viability of their coexistence and integration.
Each article explores the core concepts, underlying mechanisms, limitations, and integration strategies of these systems—highlighting both their benefits and the challenges involved in combining them.
Hi Ole, I think this is likely to be great material. I have a question for you and Anis before you start.
A Knowledge Graph is just a way of representing stuff. I often look at tables and relationships and think they are similar to KGs, but less rich and expressive. An LLM is a probabilistic model that manipulates data (usually text) to generate new data (text, images, etc.).
So are you really talking in this series of posts about ways to represent data (KGs being one) and ways to process and generate data (LLMs being the relevant one), progressing from one-way interactions to a more conversational style of interaction?
At its simplest, a KG is basically a corpus of text describing concepts as nodes and edges. The LLM reads the text and generates output. Instead of text, we specify a KG as output, … and it goes on.
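To make that framing concrete, here is a minimal sketch of the loop being described: a KG held as (subject, predicate, object) triples, linearized into text an LLM could read, and LLM output in the same format parsed back into triples. The function names and the line format are illustrative assumptions, not any particular library's API.

```python
# Sketch only: the triple format and helper names below are
# illustrative assumptions, not a specific library's API.

KG = list[tuple[str, str, str]]  # (subject, predicate, object) edges

def kg_to_text(kg: KG) -> str:
    """Linearize the graph: each edge becomes one line of text
    an LLM can consume as context."""
    return "\n".join(f"{s} -[{p}]-> {o}" for s, p, o in kg)

def text_to_kg(text: str) -> KG:
    """Parse output written in the same line format back into triples,
    i.e. 'specify KG as output'."""
    triples = []
    for line in text.strip().splitlines():
        subj, rest = line.split(" -[", 1)
        pred, obj = rest.split("]-> ", 1)
        triples.append((subj, pred, obj))
    return triples

graph = [("Berlin", "capital_of", "Germany"),
         ("Germany", "member_of", "EU")]

context = kg_to_text(graph)      # KG as input text for an LLM
parsed = text_to_kg(context)     # KG recovered from text output
assert parsed == graph
```

The round trip is the point: once both directions exist, text and graph become interchangeable endpoints of the same generation process.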
My question is: by simplifying this down (if accurate), have I destroyed or enhanced your argument?