
Chat & AI Assistant

Kai is Konnect’s built-in AI assistant. Ask questions in plain English and get answers, charts, and analytics from your manufacturing data.

How Kai works
When you ask Kai a question, it:

  1. Analyses your question to understand what data you need.
  2. Selects tools — database queries, MQTT data, analytics, etc.
  3. Queries your connected data sources.
  4. Returns an answer with data, charts, or action results.

Kai maintains conversation context, so you can ask follow-up questions without repeating yourself.
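The four-step pipeline above can be sketched as a simple loop. This is an illustrative sketch only; the function and tool names (`answer_question`, `sql_query`, `mqtt_read`, `chart_builder`) are assumptions, not Konnect's actual internal API:

```python
# Hypothetical sketch of Kai's question pipeline, for illustration only.
# Tool names and signatures are assumptions, not Konnect's real API.

def answer_question(question, context):
    """Analyse -> select tools -> query sources -> return an answer."""
    # 1. Analyse the question (naive keyword matching stands in here
    #    for the LLM's real intent analysis).
    wants_chart = "chart" in question.lower()
    wants_live = "mqtt" in question.lower() or "live" in question.lower()

    # 2. Select tools based on the analysis.
    tools = ["sql_query"]
    if wants_live:
        tools.append("mqtt_read")
    if wants_chart:
        tools.append("chart_builder")

    # 3. Query the connected data sources (stubbed out here).
    results = {tool: f"<{tool} result>" for tool in tools}

    # 4. Return the answer plus a tool trace, and keep the conversation
    #    context so follow-up questions work without repetition.
    context.append(question)
    return {"tools": tools, "results": results, "history": list(context)}

ctx = []
reply = answer_question("Create a line chart of motor temperature", ctx)
```

The returned `tools` list is what the chat UI surfaces under "tool transparency", so you can see which tools were used to produce an answer.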

Example questions
  • Show me all tables in the production database
  • What was the average temperature on Line 3 yesterday?
  • List the top 10 products by output this week
  • How many alarms fired in the last 24 hours?
  • Create a line chart of motor temperature over the last 8 hours
  • Show production output as a bar chart grouped by shift
  • Build a dashboard with temperature, pressure, and flow rate
  • Create a gauge showing current OEE
  • Detect anomalies in pressure readings from Pump 2
  • Calculate OEE for Machine A this week
  • Run process capability analysis (Cp/Cpk) on thickness measurements
  • Forecast energy consumption for the next 7 days
  • What are the current MQTT readings from the temperature sensors?
  • Subscribe to live data from OPC-UA node ns=2;s=Temperature
  • Show me all active Sparkplug B devices
  • Scan for available Docker containers with data services
  • Monitor CPU temperature every 5 seconds and alert if > 90°C
  • Set up continuous monitoring for pressure spikes on Line 1
  • Create a scheduled report of daily production totals

Konnect supports multiple LLM providers. You can switch between them at any time using the LLM Provider dropdown in the chat interface:

| Provider  | Model         | Notes                                          |
| --------- | ------------- | ---------------------------------------------- |
| Groq      | Llama 3.3 70B | Recommended — fast, free tier available        |
| OpenAI    | GPT-4o        | Paid — high accuracy                           |
| Anthropic | Claude 3.5    | Paid — excellent reasoning                     |
| Ollama    | Local models  | Self-hosted — run models on your own hardware  |

Your conversation history is preserved when switching providers.
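Conceptually, switching providers only swaps the model backend while the message history stays with the session. A minimal sketch of that idea (the `ChatSession` class and its methods are illustrative assumptions, not Konnect's API; provider names come from the table above):

```python
class ChatSession:
    """Keeps conversation history independent of the active LLM provider."""

    def __init__(self, provider):
        self.provider = provider  # e.g. "groq", "openai", "anthropic", "ollama"
        self.history = []         # list of (role, text) pairs

    def ask(self, text):
        self.history.append(("user", text))
        # The full history is sent to whichever provider is active,
        # so context survives a switch (response stubbed out here).
        answer = f"[{self.provider}] reply to: {text}"
        self.history.append(("assistant", answer))
        return answer

    def switch_provider(self, provider):
        # Only the backend changes; the history list is untouched.
        self.provider = provider

session = ChatSession("groq")
session.ask("What was the average temperature on Line 3 yesterday?")
session.switch_provider("anthropic")
session.ask("And the day before?")  # follow-up still has the full context
```

Because the history travels with the session rather than the provider, a follow-up question asked after a switch still resolves against everything said earlier.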

Key features
  • Conversation memory — Kai remembers context within a session and learns your preferences across sessions.
  • Inline charts — Visualisations appear directly in the chat conversation.
  • PDF export — Export any conversation as a formatted PDF report.
  • Code display — SQL queries and code snippets are syntax-highlighted.
  • Data source awareness — Kai knows which databases and brokers are connected and their schemas.
  • Tool transparency — See which tools Kai used to answer your question (query, analytics, etc.).
Step-by-step: Asking Kai a question
  1. Open the Manufacturing IDE or switch to Chat mode.
  2. Click the chat input field at the bottom of the Kai panel.
  3. Type your question — e.g., Show me all tables in the database.
  4. Press Enter or click Send.
  5. Kai will process your request and display the results.
  6. If a chart is generated, it will appear both inline in the chat and on the active dashboard tab.
Step-by-step: Switching LLM providers
  1. Look for the LLM Provider dropdown at the top of the chat panel.
  2. Click the dropdown and select your preferred provider (Groq, OpenAI, Anthropic, or Ollama).
  3. Your conversation continues with the new provider — no data is lost.

Step-by-step: Exporting a conversation as PDF

  1. Have a conversation with Kai that includes charts or data.
  2. Click the Export PDF button in the chat toolbar.
  3. A formatted PDF is generated with your conversation, charts, and data tables.
  4. Save or share the PDF with your team.
Tips for better results
  • Be specific about time ranges — Say “last 24 hours” or “this week” rather than “recently”.
  • Mention the data source — If you know the table or broker name, include it.
  • Ask follow-up questions — Kai maintains context so you can refine results iteratively.
  • Use “create a dashboard” — This generates multi-chart layouts from a single request.
  • Say “explain” — Get a description of what Kai did and what tools it used.