✨ Chat Modal

A modal for AI-powered document search. Use or modify it for LLM-powered features that involve asking questions about your documents.

It includes an input field where users can type their queries. When they press [Enter], it performs a vector search across the documents you've generated embeddings for, then calls the OpenAI completions API to stream a chat-style response back to the user.

Behind the scenes, it uses the `useCompletion()` hook from the Vercel AI SDK and calls the `/api/vector-search` endpoint.
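The hook wiring can be sketched roughly like this (a simplified illustration, not the component's exact internals — the component name and markup here are placeholders):

```tsx
"use client";
import { useCompletion } from "ai/react";

// Minimal sketch: useCompletion() manages the input state, posts the
// prompt to the given endpoint, and streams the completion back.
export function SearchBox() {
  const { completion, input, handleInputChange, handleSubmit, isLoading } =
    useCompletion({ api: "/api/vector-search" });

  return (
    <form onSubmit={handleSubmit}>
      <input
        value={input}
        onChange={handleInputChange}
        placeholder="Ask a question about the docs…"
      />
      {isLoading && <p>Thinking…</p>}
      <p>{completion}</p>
    </form>
  );
}
```

`handleSubmit` fires on [Enter], and `completion` updates incrementally as tokens stream in, which is what produces the chat-like effect.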

The modal can be opened by pressing Cmd + K on your keyboard, or by defining your own opening trigger, like a button click.
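The keyboard shortcut boils down to a keydown check along these lines (a minimal sketch — the component's actual listener may differ):

```typescript
// Returns true for Cmd + K (macOS) or Ctrl + K (Windows/Linux),
// the combination that opens the modal.
function isChatShortcut(e: { key: string; metaKey: boolean; ctrlKey: boolean }): boolean {
  return e.key.toLowerCase() === "k" && (e.metaKey || e.ctrlKey);
}
```

A custom trigger, such as a button's `onClick` handler, can instead flip the same open state directly.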

[Screenshot: the chat modal opened with Cmd + K]

```tsx
import ChatModal from "@/components/ChatModal";

<ChatModal />
```

⚠️ Make sure you've followed the steps to set up the vector search feature before using this component. This includes setting your OpenAI API key and generating embeddings for the documents in the `docs/` folder.

The `/api/vector-search` endpoint can only be called by authenticated users. You can disable this by commenting out the authentication check in `@/app/api/vector-search/route.ts`.
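The guard you'd comment out looks roughly like this (a hypothetical sketch — the helper name and exact code in your template will differ):

```typescript
export async function POST(req: Request) {
  // --- authentication check (comment out to allow anonymous access) ---
  const user = await getCurrentUser(req);
  if (!user) {
    return new Response("Unauthorized", { status: 401 });
  }
  // --------------------------------------------------------------------

  // ...perform the vector search and stream the completion...
  return new Response("…");
}

// Placeholder standing in for whatever session lookup the template
// actually uses; it is not part of the real route.
async function getCurrentUser(_req: Request): Promise<{ id: string } | null> {
  return null;
}
```

With the check in place, unauthenticated requests get a 401 before any search runs; removing it makes the endpoint publicly callable, so consider rate limiting if you do.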

The ChatModal component is built for one-off Q&A, not multi-turn chat. You can adapt it for back-and-forth conversation by replacing the `useCompletion()` hook with the `useChat()` hook and rendering the list of messages, as in the ChatButton component.
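A sketch of that swap might look like the following (illustrative only — the component name and markup are placeholders, and you should check the ChatButton component for the template's actual approach):

```tsx
"use client";
import { useChat } from "ai/react";

// useChat() keeps a running message history instead of a single
// completion, enabling back-and-forth conversation.
export function MultiTurnChat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/vector-search",
  });

  return (
    <div>
      <ul>
        {messages.map((m) => (
          <li key={m.id}>
            <strong>{m.role === "user" ? "You" : "AI"}:</strong> {m.content}
          </li>
        ))}
      </ul>
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}
```

Note that `useChat()` posts a `messages` array rather than a single `prompt`, so the `/api/vector-search` endpoint would also need updating to accept the chat format.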

Last Updated: June 7