Meta-Llama-3-8B-Instruct Streamlit Chat
This repository contains a Streamlit application that lets users chat with a model served through CentML’s Serverless API. The application streams responses in real time in a conversation-style interface.
Features
- Real-time response streaming: as the assistant generates a response, the text is displayed in real time, simulating a natural conversation.
- Chat history: The application maintains a persistent chat history, allowing users to see the full context of their conversation.
- Simple and intuitive interface: The app uses Streamlit’s straightforward UI elements, making it easy to input messages and view responses.
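The streaming feature above can be sketched as follows. This is a minimal illustration rather than the app’s actual code: it assumes the API returns OpenAI-style server-sent-event lines (`data: {...}` chunks ending in `data: [DONE]`), which the app joins into the visible reply.

```python
import json

def iter_deltas(sse_lines):
    """Yield text deltas from OpenAI-style SSE lines ('data: {...}')."""
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip keep-alives and blank lines
        payload = line[len("data: "):]
        if payload.strip() == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            yield delta

# Canned SSE lines stand in for the HTTP response stream the real app reads:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo!"}}]}',
    "data: [DONE]",
]
print("".join(iter_deltas(sample)))  # -> Hello!
```

In the app, each delta would be appended to a Streamlit placeholder (for example, one created with st.empty() and updated via .markdown(...)) to produce the typing effect.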
Requirements
- Python 3.8+
- A valid CentML API key (set as the environment variable CENTML_API_KEY)
- Streamlit library
- Requests library
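The two libraries above can be listed in a requirements.txt so they install in one step (a sketch; the repository may already ship such a file, and no version pins are assumed here):

```
streamlit
requests
```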
Installation
- Enter the project directory:
- Install the required dependencies:
- Set up your CentML API key as an environment variable:
- Run the Streamlit app:
- Open your browser and navigate to the URL provided in the terminal (usually http://localhost:8501).
- Once the app is running, you can enter a message in the text input box and click “Send.”
- The assistant’s responses will be streamed in real time below the input box.
- Your conversation history will be displayed in order, alternating user and assistant messages.
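The installation and launch steps above, collected into one shell session. The directory name, entry-point file, and key value are placeholders, not names confirmed by this repository; substitute the ones from your checkout:

```shell
# Placeholder names throughout: adjust directory, key, and app file.
cd meta-llama-3-8b-streamlit-chat      # enter the project directory
pip install streamlit requests         # install the required dependencies
export CENTML_API_KEY="your-api-key"   # expose your CentML key to the app
streamlit run app.py                   # serves at http://localhost:8501
```

Because the key is read from the environment, it never needs to be hard-coded in the source or committed to version control.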