Conversational Knowledge

A hands-on data science research project

Introduction

One of the (so far) very few demonstrably effective applications of generative AI is text summarization and meaning abstraction. Organizations that manage topic-specific knowledge can greatly improve the educational experience of their audiences by offering a conversational interface (a chatbot) that lets users reach understanding along their own learning pathway, carved through natural-language conversations.

Tech companies (OpenAI, Google, Microsoft Azure, IBM, etc.) offer solutions that let their LLMs integrate organizational knowledge (through RAG and fine-tuning), but this comes at a price, and the educational infrastructure is on a permanent lease, never owned.

Objectives

Implement a chatbot using technologies that are free to use and to own: Linux Ubuntu, Docker, Ollama, Llama 3, OpenWebUI, and docling.

As a secondary objective, we also explore the possibility of inducing the chatbot to use the Socratic method to guide a productive conversation with the learner.
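One way to attempt this is a system prompt attached to the model in OpenWebUI. The wording below is an illustrative assumption, not a tested prompt:

```
You are a Socratic tutor for our knowledge base. Do not give the full answer
immediately. Respond with one short guiding question that helps the learner
take the next step, and only reveal the answer after the learner has attempted
it. Ground every question in the retrieved documents.
```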

The depth and breadth of the knowledge will be limited by the capacities of the RAG component.

Knowledge file formats: pdf, docx, mp3, mp4, pptx, jpg, and webp, extracted with docling into Markdown files.
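The extraction step can be sketched as a small shell helper. It assumes the docling CLI is installed and on PATH with the `--to md --output` options of current releases; the directory names are illustrative:

```shell
# Sketch: convert every supported knowledge file in ./knowledge
# into Markdown files under ./knowledge_md using the docling CLI.
convert_knowledge() {
  local src_dir="${1:-knowledge}" out_dir="${2:-knowledge_md}"
  mkdir -p "$out_dir"
  # Find the file types listed above and feed each one to docling.
  find "$src_dir" -type f \( -iname '*.pdf' -o -iname '*.docx' \
      -o -iname '*.pptx' -o -iname '*.jpg' -o -iname '*.webp' \
      -o -iname '*.mp3' -o -iname '*.mp4' \) -print0 |
  while IFS= read -r -d '' f; do
    docling "$f" --to md --output "$out_dir"
  done
}
```

The resulting Markdown files are what the RAG component ingests, so this step bounds the depth and breadth of the chatbot's knowledge.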

Methodology

Follow these steps to create a free AI chatbot that answers questions from your website visitors based on your current knowledge base.

If you don’t own a server (or a computer capable of neural-network processing), you will need a virtual private server hosting plan such as Hostinger KVM 2 at about €6/month. The container can easily be cloned and redeployed. Whoever follows the implementation holds custody of the model, ownership of the knowledge, and all access controls.

Before deploying it to an online server, we shall develop the containerized solution locally.

Install Ubuntu

Install Ollama

Install Docker

Install OpenWebUI

Install docling
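For reference, the installation steps above typically boil down to commands like the following. The OpenWebUI port mapping and volume name are assumptions chosen to match the localhost:8080 address used later; check each project's current install docs before running:

```shell
# Ollama (official install script)
curl -fsSL https://ollama.com/install.sh | sh

# Pull the Llama 3 model so it is available locally
ollama pull llama3

# Docker (Ubuntu convenience script)
curl -fsSL https://get.docker.com | sh

# OpenWebUI container, published on port 8080 and persisting its data
# in a named volume so the knowledge survives container restarts
docker run -d -p 8080:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main

# docling (Python package)
pip install docling
```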

After installation:

First, launch Ollama on Ubuntu:

sudo systemctl start ollama

Verify Ollama is running:

sudo systemctl status ollama
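Beyond systemctl, Ollama's HTTP API offers a quick sanity check: it listens on port 11434 by default, and the `/api/tags` endpoint lists the locally installed models:

```shell
# Should return JSON listing the available models (e.g. llama3)
curl -s http://localhost:11434/api/tags
```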

Launch OpenWebUI:

docker start open-webui

On a browser on the same machine, navigate to http://localhost:8080/

To stop working on it, shut down OpenWebUI and Ollama:

docker stop open-webui
sudo systemctl stop ollama
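The start/stop routine above can be wrapped in two small helper functions (the names are illustrative), so a work session begins and ends with a single command:

```shell
# Bring the stack up: Ollama first, then the OpenWebUI container.
chatbot_up() {
  sudo systemctl start ollama
  docker start open-webui
}

# Tear it down in reverse order.
chatbot_down() {
  docker stop open-webui
  sudo systemctl stop ollama
}
```

Sourcing these from ~/.bashrc makes `chatbot_up` and `chatbot_down` available in every shell.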
