OpenAI ChatKit: Build Custom AI Chat Interfaces Without Starting from Scratch

Confused about the new OpenAI ChatKit? We cut through the hype to explain what it is, how it works, and how you can use it to build better conversational AI.

Updated February 2026. This article has been reviewed and updated to reflect the latest information.

If you have been following the AI tooling space, you have probably noticed that building a chat interface sounds simple until you actually try to do it properly. Streaming responses, managing conversation state, handling file uploads, rendering rich content inside messages, theming it to match your brand. The list of “small details” becomes a full engineering project fast.

That is the problem OpenAI’s ChatKit sets out to solve. It is an open-source UI library that works with any framework and gives you a production-ready chat interface out of the box, with solid customisation options when you need them.

We have been using ChatKit across several client projects since it hit general availability, and this is our honest take on what it does well, where it falls short, and whether it makes sense for your business.

What Is OpenAI ChatKit, Exactly?

ChatKit is a batteries-included framework for building AI-powered chat experiences. Think of it as the front-end layer that sits between your users and an AI backend. It handles message rendering, streaming, tool visualisation, and file attachments.

The core library ships as a web component, which means it works with any JavaScript framework. OpenAI also provides first-class React bindings through @openai/chatkit-react, which is what most teams (including ours) reach for in practice.

Here is what it is not: ChatKit is not a chatbot platform. It does not host your AI model or manage your prompts. It is purely the interface layer. You bring the AI backend (OpenAI’s API directly, a self-hosted model, or something built with the OpenAI Agents SDK) and ChatKit handles how users interact with it.

The project sits on GitHub under an Apache 2.0 licence, with active development and regular releases (v1.6.0 landed in February 2026). It has picked up over 1,800 stars and the contributor base is still growing.

Why This Matters for Australian Businesses

Every week we talk to Australian organisations that want to build custom AI solutions, whether that is internal knowledge bases, customer-facing support bots, or document analysis tools. The conversation almost always hits the same friction point: the AI part is actually the easier bit. It is the interface that eats the budget.

Building a quality chat UI from scratch typically means two to four weeks of front-end development before you have even connected it to a model. You need streaming support, markdown rendering, code highlighting, mobile responsiveness, accessibility, state management, and error handling. And that is before you start on features like file uploads or conversation history.

ChatKit shortens that timeline significantly. We have gone from zero to a functional, branded chat interface in under a day on multiple projects. That is time and budget freed up for the work that actually differentiates your product: the AI logic, the integrations with your business systems, and the domain-specific tuning that makes the tool useful.

Key Features Worth Knowing About

Streaming and Response Rendering

ChatKit handles response streaming natively. Messages appear progressively as the model generates them, which is table stakes for a modern chat experience but surprisingly fiddly to implement well from scratch. It also supports chain-of-thought visualisations, so you can show users the reasoning process behind an answer when transparency matters.

Rich Widgets

This is where ChatKit pulls ahead of simpler chat UI libraries. Widgets let your AI responses include structured, interactive content beyond plain text. The built-in widget types include:

  • Cards for displaying structured information (order summaries, ticket details, product info)
  • ListViews for search results or article recommendations
  • Forms for collecting structured input within the conversation flow
  • Buttons that trigger server-side actions
  • Date pickers, dropdowns, badges, and images

These widgets are defined as JSON and rendered inside the chat. Your backend decides what widgets to show based on context, and ChatKit handles the rendering and interaction.
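
To make that concrete, here is a rough sketch of what a card widget payload might look like. The exact schema (node type names, property names) varies by ChatKit version, so treat every field name below as illustrative rather than canonical, and check the widget reference for your release before relying on it.

```typescript
// Hypothetical card widget payload, shaped as JSON your backend might
// return. Node type names ('Card', 'Title', 'Text', 'Button') and the
// action shape are assumptions for illustration, not the exact schema.
const orderCard = {
  type: 'Card',
  children: [
    { type: 'Title', value: 'Order #10423' },
    { type: 'Text', value: 'Dispatched, arriving Thursday' },
    {
      type: 'Button',
      label: 'Track parcel',
      // Actions are handled by your backend, not the frontend.
      action: { type: 'track_order', orderId: '10423' },
    },
  ],
};
```

The point is the division of labour: the backend emits a declarative description like this based on conversation context, and ChatKit turns it into rendered, interactive UI.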

Tool Integration and Agentic Workflows

If you are building AI agent experiences, ChatKit has first-class support for tool invocation. Users can see when the AI is calling external tools and what results come back. Long-running tools can stream progress updates to the UI, which matters for tasks like document processing or database queries that take more than a couple of seconds.

File Handling

Users can upload files and images directly in the chat interface. ChatKit manages the upload UI, preview rendering, and attachment state. Your backend handles the actual file processing, but the front-end experience is handled for you.

Conversation Management

Threads, message history, and conversation organisation are built in. For internal tools where users come back to previous conversations regularly, this saves a significant amount of development time.

Getting Started: Two Deployment Paths

ChatKit offers two distinct approaches, and which one you choose depends on how much control you need.

Path 1: OpenAI-Hosted Backend (Fastest Setup)

This is the quickest route. You build your agent logic in OpenAI’s Agent Builder, embed the ChatKit component in your frontend, and OpenAI handles the backend infrastructure, message storage, and scaling.

The setup is straightforward. Install the React package:

npm install @openai/chatkit-react

Create a backend endpoint to generate session tokens (you never expose your API key to the client):

from fastapi import FastAPI
from openai import OpenAI
import os

app = FastAPI()
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

@app.post("/api/chatkit/session")
def create_chatkit_session():
    # Mint a short-lived session; only the client secret goes back
    # to the browser, while the API key itself stays server-side.
    session = client.chatkit.sessions.create()
    return {"client_secret": session.client_secret}

Then drop the component into your React app:

import { ChatKit, useChatKit } from '@openai/chatkit-react';

export function AIChatWidget() {
  const { control } = useChatKit({
    api: {
      // Fetch a short-lived session token from our own backend;
      // the OpenAI API key never reaches the browser.
      async getClientSecret() {
        const res = await fetch('/api/chatkit/session', {
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
        });
        const { client_secret } = await res.json();
        return client_secret;
      },
    },
  });

  return <ChatKit control={control} className="h-[600px] w-full" />;
}

That is really it for a basic implementation. The component renders a fully functional chat interface with streaming, markdown, and responsive design.

Path 2: Self-Hosted Backend (Maximum Control)

For businesses that need full control over data flow (particularly relevant for Australian organisations dealing with sensitive data or regulatory requirements), the self-hosted path uses the ChatKit Python SDK with your own FastAPI server.

You host the backend, control where messages are stored, manage your own authentication, and connect to whatever AI backend you want. The front-end ChatKit component connects to your server instead of OpenAI’s infrastructure.
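
On the frontend, pointing the component at your own server is broadly a matter of swapping the managed session flow for your server's URL. The option shape below is an assumption on our part (the exact key names depend on the ChatKit version you install), so confirm it against the docs before shipping.

```typescript
// Sketch only: an options object for wiring the ChatKit component to a
// self-hosted backend. The 'api.url' key name is an assumption; check
// the ChatKit documentation for your installed version.
const selfHostedOptions = {
  api: {
    // Your own FastAPI endpoint, hosted on your infrastructure.
    url: 'https://chat.example.com.au/chatkit',
  },
};
```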

This is the path we recommend for most enterprise clients, especially those in financial services, healthcare, or government. Your data stays on your infrastructure, and you have complete visibility into every interaction.

Customisation: Making It Yours

ChatKit’s theming system is more flexible than you might expect. You are not stuck with a generic OpenAI-branded chat window. The ThemeOption configuration gives you control over:

  • Colour schemes: light mode, dark mode, or fully custom accent colours
  • Typography: bring your own fonts
  • Border radius: from sharp corners to fully rounded
  • Information density: compact or comfortable layouts
  • Component-level overrides: customise the header, composer, and message areas independently
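
As a sketch, a brand theme covering those categories might look like the object below. The property names follow the categories above but are assumptions on our part; check the theming reference for the exact keys your ChatKit version expects.

```typescript
// Illustrative ThemeOption-style object. Key names ('colorScheme',
// 'radius', 'density', etc.) are assumptions for illustration, not
// the confirmed schema.
const brandTheme = {
  colorScheme: 'light',
  radius: 'round',    // e.g. 'sharp' | 'soft' | 'round'
  density: 'compact', // tighter layout for an internal tool
  color: {
    accent: { primary: '#0a7d4f' }, // your brand colour, say
  },
  typography: {
    fontFamily: '"Inter", sans-serif', // bring your own font
  },
};
```

Half an hour spent on an object like this is usually the difference between "embedded third-party widget" and "part of our product".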

For most Australian businesses building customer-facing tools, this means the chat interface can look like a natural part of your product rather than an obviously embedded third-party widget.

If you need to go further, ChatKit supports custom widgets. You can define your own widget types with custom rendering logic, so you can build domain-specific UI elements. Interactive charts for financial data, say, or property comparison cards for a real estate tool.

ChatKit vs the Alternatives

Building from Scratch

If you have a front-end team with capacity and you need something highly bespoke, building from scratch gives you total control. But it is expensive. We estimate 80 to 160 hours of development to build a chat interface that matches ChatKit’s feature set (streaming, widgets, file handling, threading, and mobile responsiveness). For most businesses, that is not a good use of engineering time.

Vercel AI SDK

The Vercel AI SDK is good for building streaming chat interfaces, particularly in Next.js applications. It is more of a toolkit than a complete solution. You get hooks and utilities for managing chat state and streaming, but you build the actual UI yourself. It also supports multiple AI providers (Anthropic, Google, and others), which ChatKit does not.

Choose Vercel AI SDK if you want provider flexibility and are comfortable building your own UI components. Choose ChatKit if you want a complete, production-ready interface with minimal front-end work and you are committed to the OpenAI ecosystem.

Other Chat UI Libraries

Libraries like chatscope/chat-ui-kit-react give you styled components but lack the AI-specific features (streaming integration, tool visualisation, widget rendering) that ChatKit includes. They are fine for simple chat interfaces but require a lot more work for agentic AI experiences.

Use Cases We Have Seen Work in Practice

Internal Knowledge Bases

An Australian professional services firm needed their team to query internal policies and project documentation through a conversational interface. ChatKit’s widget system let us surface structured document summaries and related article lists directly in the chat, instead of just dumping raw text.

Customer-Facing Support

For a mid-market SaaS company, we built a support interface that uses ChatKit’s form widgets to collect structured issue details and button widgets to trigger actions like ticket creation and escalation, all within the conversation flow. The self-hosted backend meant customer data stayed within their existing Australian infrastructure.

Document Analysis Tools

ChatKit’s file upload support, combined with custom widgets for displaying extracted data, works well for building document analysis interfaces. Invoice processing, contract review, compliance checking: the user uploads a document and gets structured results rendered as cards and tables inside the chat.

These are the kinds of projects where our AI consulting team gets the most traction, taking a capable but generic tool like ChatKit and shaping it around specific business workflows.

Security and Data Handling

Security is where the deployment path decision really matters.

With the managed backend, your conversation data flows through OpenAI’s infrastructure. OpenAI’s enterprise privacy commitments apply, and they state that API data is not used for model training. But for Australian businesses in regulated industries, “the data goes to a US server” is often a non-starter regardless of the provider’s privacy policy.

With the self-hosted backend, you control the entire data path. Messages can stay within your Australian-hosted infrastructure. You manage authentication, encryption, and access controls according to your own security requirements. This matters for organisations subject to the Privacy Act, APRA CPS 234, or sector-specific regulations.

Regardless of which path you choose, ChatKit’s token-based authentication means your OpenAI API key is never exposed to the client. Your backend generates short-lived session tokens, and the frontend uses those tokens to communicate. This is a solid security pattern, but you still need to implement proper authentication on your token endpoint. Do not leave it open.
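
To sketch what "do not leave it open" means in practice, here is generic guard logic you might put in front of a token endpoint: require an authenticated user and apply a simple per-user rate limit before minting a session. This is a pattern sketch, not ChatKit API; the function name, limit, and in-memory store are all ours, and a real deployment would use your auth middleware and a shared store like Redis.

```typescript
// Generic guard for a session-token endpoint: reject unauthenticated
// callers and cap how often each user can mint a token. Only when this
// returns true would you call sessions.create() and return the secret.
const requestLog = new Map<string, number[]>();

function allowSessionRequest(
  userId: string | null,
  now: number = Date.now(),
  maxPerMinute = 10,
): boolean {
  if (!userId) return false; // unauthenticated callers never get a token

  // Keep only requests from the last 60 seconds for this user.
  const windowStart = now - 60_000;
  const recent = (requestLog.get(userId) ?? []).filter((t) => t > windowStart);
  if (recent.length >= maxPerMinute) return false; // rate limit hit

  recent.push(now);
  requestLog.set(userId, recent);
  return true;
}
```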

One area to watch: if you are using Agent Builder with file search or MCP connectors, be aware that those data sources can potentially be exposed through prompt injection attacks. Run thorough security testing before connecting sensitive data sources.

When ChatKit Makes Sense (and When It Does Not)

ChatKit is a strong choice when:

  • You are building a chat-based interface and want to move fast
  • Your AI backend is OpenAI-based (or you are happy to use OpenAI for the conversational layer)
  • You need rich, interactive responses beyond plain text
  • Your team’s strength is backend AI logic, not front-end development
  • You want something that looks polished on day one

Consider alternatives when:

  • You need to support multiple AI providers (Anthropic, Google, etc.)
  • Your interface is fundamentally not a chat experience
  • You need deep, pixel-perfect UI customisation that goes beyond theming
  • You are building on a non-JavaScript stack

Practical Tips from Our Deployments

  1. Start with the managed backend for prototyping. Even if you plan to self-host in production, the managed path gets you a working demo fastest. It makes a real difference for stakeholder buy-in.
  2. Invest time in widget design early. The difference between a generic chatbot and a genuinely useful tool often comes down to how well you use widgets to structure information. Plan your widget types during the design phase, not as an afterthought.
  3. Theme it from the start. Do not demo a generic-looking ChatKit instance to stakeholders. Spend thirty minutes applying your brand colours and typography. It changes how people perceive the tool more than you would expect.
  4. Build your token endpoint properly. It is tempting to knock together a quick session endpoint and move on. But this is your security boundary. Add rate limiting, proper authentication, and logging from day one.
  5. Test mobile early. ChatKit is responsive, but your custom widgets might not be. Test on actual devices, not just browser DevTools.

Where to Go from Here

If you are looking at custom AI interfaces for your business, ChatKit is worth evaluating. It will not be the right fit for every project, but for the ones where it works, it saves weeks of development time and delivers a better result than most teams would build from scratch.

We help Australian businesses navigate exactly these decisions. If you are trying to figure out whether ChatKit, a custom build, or a different approach is right for your project, book a call with our team and we will give you an honest assessment.

Already have a project in mind? Get in touch and we will scope out what the build looks like, including which parts of the stack to build custom and which parts to speed up with tools like ChatKit.


Frequently Asked Questions

Is ChatKit free to use?

Yes, ChatKit itself is open-source under the Apache 2.0 licence. You can use it without any licensing fees. However, you still pay for the underlying OpenAI API usage (model inference, token costs) as normal. If you use the managed backend path through Agent Builder, OpenAI’s standard API pricing applies to the messages processed.

Can I use ChatKit with AI models other than OpenAI’s?

With the self-hosted backend, technically yes. The ChatKit frontend component communicates with your backend server, so you can connect any AI model behind it. However, the developer experience is optimised for OpenAI’s API. If multi-provider support is a core requirement, the Vercel AI SDK may be a better fit for the streaming and state management layer, with your own UI on top.

Does ChatKit support Australian data residency requirements?

With the self-hosted deployment path, absolutely. You host the backend on your own infrastructure, whether that is AWS Sydney, Azure Australia East, or on-premises servers. Conversation data never leaves your environment. With the managed backend, data is processed through OpenAI’s infrastructure, which is US-based. For organisations with strict data sovereignty requirements, self-hosting is the way to go.

How long does it take to build a production-ready chat interface with ChatKit?

Based on our project experience, a basic branded chat interface can be up and running in a day or two. A more complex implementation with custom widgets, tool integrations, and a self-hosted backend typically takes one to two weeks of development. Compare that with the two to four months we have seen teams spend building equivalent functionality from scratch. For a more specific estimate, reach out to our team.

Can ChatKit be embedded into an existing web application?

Yes. ChatKit ships as a web component, so it can be embedded into any web application regardless of the framework: React, Vue, Angular, or plain HTML. The React bindings provide the smoothest integration if you are already in the React ecosystem, but the vanilla JavaScript web component works anywhere you can add a custom element to the page.