Every IT team has one. A knowledge base. Years of articles, written by half a dozen different authors, in slightly different styles, with slightly different terminology, mostly unloved.

The senior engineers don't use it because they remember the answers. The junior engineers don't use it because the search box is keyword-based and the article they need is titled "VPN Connectivity Troubleshooting Guide v3 (2024-Q3 Refresh)" and they searched for "vpn won't connect".

So they ask the help desk anyway. Or Slack. Or the senior engineer who's trying to focus. The knowledge base sits there gathering dust, year after year, paid for in writing time but rarely consulted.

FreeITSM's Knowledge module ships with a small but powerful change to that pattern: an AI chat assistant called Ask AI that searches your articles semantically and answers questions in plain English, with sources cited. It doesn't replace the articles — it changes how people get to the answers inside them.

From "find the right article" to "get the answer".

What it actually feels like

You open Ask AI from a button that's available on every page in FreeITSM. A chat panel appears. You type a question the way you'd ask a colleague:

You: My laptop won't connect to the VPN after the Windows update last week, what should I try?

Ask AI: After the November Windows feature update, the most common VPN failure is the IKEv2 driver getting stuck in a "registered but disabled" state. The fix that works for most users is to open Device Manager, find the WAN Miniport (IKEv2) entries under Network adapters, uninstall them with the "delete the driver software" option ticked, then reboot. Windows reinstalls them automatically on the way back up. If that doesn't resolve it, our standard secondary check is whether the AnyConnect client has a compatibility flag set in its profile — the second source below covers that.

That's the whole interaction. You typed a sentence, got an answer that synthesises information from two articles, and have direct links to both if you want to verify or read further.

The change is mostly psychological. With keyword search, you and the article need to share vocabulary — if the article calls it "VPN connectivity troubleshooting" and you call it "VPN won't connect", you get nothing. With semantic search, the system understands intent. It doesn't matter which words you used; the underlying concept is what gets matched.
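To make the vocabulary-mismatch point concrete, here's a toy sketch. The hand-made three-dimensional vectors stand in for the high-dimensional vectors a real embedding model produces, and the function names are purely illustrative, not FreeITSM's API:

```python
import math

def keyword_match(query: str, title: str) -> bool:
    """Naive keyword search: every query token must appear in the title."""
    title_tokens = set(title.lower().split())
    return all(tok in title_tokens for tok in query.lower().split())

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: how closely two vectors point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

article_title = "VPN Connectivity Troubleshooting Guide v3 (2024-Q3 Refresh)"
query = "vpn won't connect"

# Keyword search: "won't" and "connect" never appear in the title, so no hit.
print(keyword_match(query, article_title))  # False

# Semantic search: both texts embed near the same "VPN problems" concept,
# so their (pretend) embedding vectors point in nearly the same direction.
article_vec = [0.9, 0.1, 0.2]  # pretend embedding of the article
query_vec = [0.8, 0.2, 0.1]    # pretend embedding of the query
print(cosine(article_vec, query_vec) > 0.9)  # True
```

The keyword path fails on exact-token mismatch; the vector path succeeds because similarity is measured in meaning-space, not word-space.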

Under the hood: retrieval-augmented generation

The technique Ask AI uses is called retrieval-augmented generation, or RAG. Two AI providers are involved, doing two different jobs:

1. Embed articles
   Each Knowledge article is converted into a vector representation using OpenAI's embedding model, and the vector is stored alongside the article.

2. Embed the question
   When you ask, your question is embedded using the same model, so it lives in the same vector space as the articles.

3. Retrieve the closest matches
   A nearest-neighbour search picks the few most semantically similar articles — the ones whose content is closest in meaning to your question.

4. Synthesise the answer
   Anthropic Claude is given the retrieved articles as context and asked to compose a plain-English answer that draws only on them.

The clever bit isn't either AI in isolation — embeddings are commodity, big language models are commodity. The clever bit is the combination. Embeddings give you semantic search; Claude gives you a synthesis layer that actually answers the question rather than just returning links. RAG is what binds them together.

One important property: the answer is grounded in your knowledge base. Claude doesn't get to free-associate from its training data. It's instructed to answer from the retrieved articles only. If the relevant article doesn't exist or isn't retrieved, Ask AI says so — rather than inventing something plausible-sounding. That property matters more than any clever prompting trick.
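FreeITSM's actual prompt isn't published, but the grounding instruction it describes might look something like this hedged sketch — the prompt wording, function name, and message shape are all assumptions for illustration:

```python
# Hypothetical grounding prompt illustrating the two properties described:
# answer only from retrieved articles, and admit when they don't cover it.
GROUNDING_PROMPT = """\
You are a help desk assistant. Answer the user's question using ONLY the
knowledge-base articles provided below.

Rules:
- Do not use any knowledge that is not in the articles.
- If the articles do not answer the question, say that no relevant
  article covers it instead of guessing.
- End your answer with a "Sources:" line listing the titles of every
  article you drew from.
"""

def build_messages(question: str, retrieved: list[tuple[str, str]]) -> list[dict]:
    """Assemble a chat request; `retrieved` is the (title, body) pairs
    returned by the nearest-neighbour search."""
    context = "\n\n".join(f"### {title}\n{body}" for title, body in retrieved)
    return [
        {"role": "system", "content": GROUNDING_PROMPT + "\n" + context},
        {"role": "user", "content": question},
    ]
```

The key design choice is that the retrieved articles travel inside the request itself, so every answer is constrained to — and attributable to — a specific, inspectable context.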

Why citations change everything

Every Ask AI response lists the source articles it drew from. They're clickable. You can verify the answer in one click, and the citations make it easy to spot when the AI has wandered.

This is the difference between an AI assistant you can trust with operational questions and one you can't. If the citations are missing or wrong, you'll notice immediately the first time the answer doesn't match the source. If they're present and accurate, the trust compounds — the next time someone in your team asks a similar question, they're a little more likely to type it into Ask AI rather than asking you.

Citations also turn the knowledge base into self-improving infrastructure. When Ask AI says "no relevant article exists" for a question your team asks regularly, that's a strong signal you should write one. When the cited article is out of date, the answer is too — which surfaces stale articles in a way that traditional search never did.

Embedded everywhere

Ask AI isn't just a feature on the Knowledge page. It's available from a small button on every page in FreeITSM. The most useful place is from inside a ticket: from the ticket detail panel, click Ask AI and the question is pre-filled with the ticket's subject. So when a help desk analyst opens a ticket about VPN trouble, they're one click away from the relevant knowledge instead of going to find it themselves.

The same is true from change management, asset detail, and other module pages. Anywhere you might wonder "is there a related knowledge article on this", Ask AI is one click away with the right context already loaded. That dramatically lowers the bar for actually using the knowledge base versus paying lip service to it.

The article-authoring side still matters

Ask AI is only as good as the articles behind it. The Knowledge module's article-authoring tools are deliberately well-equipped to encourage good content:

  • Rich-text editor (TinyMCE) with syntax highlighting for code — so technical articles look like technical articles, not markdown blobs.
  • Article versioning and history — every edit is preserved. Restore an earlier version if a recent edit broke something.
  • Tag-based categorisation — articles can carry multiple tags for cross-cutting topics.
  • Article review workflow — assign a review cycle so articles get re-checked annually rather than rotting silently.
  • Recycle bin with restore and auto-purge — deleted articles go to a recycle bin first, with configurable auto-purge after N days.
  • View count tracking — you can see which articles are actually being consulted (versus which are being surfaced by Ask AI but not clicked into).
  • Email sharing — one-click share of an article to a user or external email address, useful when a request can be resolved with a knowledge link rather than a written reply.

Combine that with Ask AI on top and you have a knowledge base that grows in value over time rather than decaying. The articles get written and edited; Ask AI surfaces them in context; the team uses them; gaps become obvious and get filled. Each loop reinforces the next.

What changes for you

Without Ask AI, your knowledge base is a library: a collection of articles that requires a librarian's mindset to navigate. The team uses it when they remember to and skips it when they don't.

With Ask AI, your knowledge base is an oracle: ask it anything, get an answer, with sources. The barrier to using it drops to "type a sentence". The senior engineers use it because it's faster than remembering. The junior engineers use it because it speaks their language. The help desk uses it from inside a ticket because it's already there.

Time saved is one part. The bigger shift is cultural: the knowledge base stops being a chore and becomes a tool.

The articles already exist. Ask AI is the missing layer.