# CentML Chat App with Gradio
This Gradio-based web app allows you to interact with the CentML Serverless API. It streams the assistant's responses in real time, providing a conversational interface where users can chat with an AI assistant.

## Features
- Real-time Chat: The assistant responds in real-time to your inputs by streaming responses as they are generated.
- Easy-to-Use Interface: The Gradio interface allows for a clean and intuitive user experience.
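The real-time streaming behavior can be illustrated with a small helper that parses one line of an OpenAI-style server-sent-events stream. This is a sketch that assumes the CentML API uses that format (`data: {...}` lines terminated by `data: [DONE]`) — an assumption to verify against the actual `app.py`:

```python
import json

def parse_sse_line(line: str):
    """Extract the streamed text delta from one OpenAI-style SSE line.

    Returns the text chunk, or None for non-data lines and the [DONE]
    sentinel. The response shape is assumed, not confirmed by this repo.
    """
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None
    chunk = json.loads(payload)
    # Each streamed chunk carries an incremental "delta" for the message.
    return chunk["choices"][0]["delta"].get("content")
```

Accumulating these deltas as they arrive is what lets the UI show the reply growing token by token instead of waiting for the full response.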
## Prerequisites
Before running the app, ensure that you have the following:

- Python 3.10+ installed on your machine.
- Gradio and other necessary Python libraries. You can install these via `pip` (see instructions below).
- A valid CentML API key, stored in an environment variable called `CENTML_API_KEY`.
## Usage
### Docker
- Build and run the Docker container.
- Open your browser and navigate to http://localhost:7860.
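The repo's actual Dockerfile is not shown here; a minimal sketch of what it might look like (base image, file names, and entry point are all assumptions):

```dockerfile
# Hypothetical Dockerfile sketch for the Gradio chat app.
FROM python:3.10-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Gradio serves on port 7860 by default.
EXPOSE 7860
CMD ["python", "app.py"]
```

With such a Dockerfile in place, `docker build -t centml-gradio-chat .` followed by `docker run -p 7860:7860 -e CENTML_API_KEY centml-gradio-chat` would serve the app on port 7860 (the image name is illustrative; `-e CENTML_API_KEY` forwards the key from your host environment).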
### Local
- Enter the `chat-apps/gradio` directory.
- Set up a virtual environment (optional but recommended).
- Install the required dependencies.
- Set the CentML API key as an environment variable.
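The local setup steps above can be sketched as the following shell session (package names are assumptions — the repo may ship a `requirements.txt` to install instead):

```shell
# From inside the chat-apps/gradio directory.
python -m venv .venv                 # create a virtual environment
source .venv/bin/activate            # activate it (bash/zsh)
pip install gradio requests          # install the required dependencies
export CENTML_API_KEY="your-api-key-here"
```

Replace `your-api-key-here` with your actual CentML API key; the variable must be set in the same shell session that launches the app.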
## Running the App
To run the Gradio app, execute `python app.py` (assuming `app.py` is the entry point) from the `chat-apps/gradio` directory, then open http://localhost:7860 in your browser.

## Customization
- Model: To change the model or parameters such as `temperature` or `max_tokens`, modify the `data` dictionary in the `send_message` function in `app.py`.
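A sketch of what that `data` dictionary might look like — the model name and exact field names are assumptions, so check `send_message` in `app.py` for the real payload:

```python
# Hypothetical request payload built inside send_message (app.py).
data = {
    "model": "meta-llama/Llama-3.3-70B-Instruct",  # swap in any model CentML serves
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,   # higher values give more varied responses
    "max_tokens": 512,    # cap on the number of generated tokens
    "stream": True,       # stream tokens back as they are generated
}
```

Lowering `temperature` makes replies more deterministic, while raising `max_tokens` allows longer answers at the cost of latency.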
- UI Layout: You can adjust the layout of the interface by modifying the Gradio components in the `gr.Blocks` section.
## Troubleshooting
- Error Connecting to API: If you encounter an error connecting to the CentML API, ensure that your API key is valid and set in the environment variable CENTML_API_KEY.
- Python Version: Ensure you are using Python 3.10 or higher, as listed in the prerequisites.
- Dependencies: Make sure you have installed all the necessary dependencies (Gradio and `requests`).
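To check the first point quickly, a minimal preflight sketch (not taken from `app.py`) that reports whether the key is visible to the Python process:

```python
import os

# Preflight check: confirm the API key is visible to this process.
api_key = os.environ.get("CENTML_API_KEY")
if api_key:
    print("CENTML_API_KEY is set")
else:
    print("CENTML_API_KEY is missing; export it before launching the app")
```

If the key prints as missing even though you exported it, make sure the export happened in the same shell session (or shell profile) that launches the app.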