Quickstart
Access CentML’s Managed Models in Less Than a Minute.
CentML’s serverless endpoints provide programmatic and browser-based access to popular models.
No need to worry about the underlying infrastructure, scaling deployments, or maintenance - just log in to start testing and evaluating models to determine how they fit your business needs.
When you’re ready to move to production, you can deploy dedicated LLM endpoints that fit your performance requirements and budget, with CentML’s LLM Serving.
1. Log into the CentML Platform
To access a CentML serverless endpoint, you need to log in to the CentML Platform. Navigate to https://app.centml.com in your browser and either create a CentML user account or log in with your existing credentials.
If you need guidance, please follow the Creating an Account documentation to create an account and log in.
Once logged in, you will see the CentML Platform console home page, as shown below.
2. Create an API key
When accessing a CentML Serverless endpoint, you must authenticate with a Serverless API key, even when using the chat UI. By default, a serverless API key is generated for you when your account is created. If one is missing, you may see errors similar to those shown below.
To generate an API key, follow the Generating Serverless API Tokens and Vault Objects documentation, then move on to the next section. Note that these API keys are long-lived, so you may want to rotate them occasionally.
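Because the keys are long-lived and may be rotated, it helps to keep them out of source code. The snippet below is a minimal sketch of reading the key from an environment variable so a rotated key never requires a code change; the variable name CENTML_API_KEY is only an illustrative convention, not a requirement of the platform.

```python
# Minimal sketch: load the serverless API key from the environment rather
# than hard-coding it. CENTML_API_KEY is an assumed variable name.
import os

api_key = os.environ.get("CENTML_API_KEY")
if api_key is None:
    raise RuntimeError("Set CENTML_API_KEY to your CentML serverless API key.")
```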
3. Begin Querying Your Desired Large Language Models
Accessing the Chat UI
To access the chat UI, select the Serverless option from the sidebar menu.
Configure, Prompt, and Submit
Once you are on the appropriate Your Serverless Endpoint page, follow the instructions below to submit a request to the model.
- Configure the endpoint settings on the right side of the screen to fit your testing needs. The configuration menu is where you select the model you'd like to use in the chat interface.
- Enter a prompt into the textbox at the bottom of the screen.
- Select the arrow (pointing up) to submit the prompt to the model.
Congratulations! You’ve begun your journey with CentML and submitted your first serverless request!
Advanced Serverless Usage and Considerations
For more advanced Serverless topics, such as how to access the API programmatically or request an additional model, please view our Serverless Endpoints documentation.
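As a preview of programmatic access, here is a minimal sketch that assumes the serverless endpoint is OpenAI-compatible and uses the OpenAI Python client. The base URL, environment variable name, and model ID below are placeholders for illustration; use the values shown in your console and in the Serverless Endpoints documentation.

```python
# Hypothetical example: querying a CentML serverless endpoint through an
# OpenAI-compatible client. Base URL and model ID are placeholders.
import os

from openai import OpenAI

client = OpenAI(
    # Assumed environment variable holding your serverless API key.
    api_key=os.environ["CENTML_API_KEY"],
    # Placeholder base URL; substitute the endpoint URL from the console.
    base_url="https://api.centml.com/openai/v1",
)

response = client.chat.completions.create(
    # Placeholder model ID; pick one of the models listed in the Serverless UI.
    model="meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Summarize what CentML Serverless offers."}],
)

print(response.choices[0].message.content)
```

Because the interface follows the familiar chat-completions pattern, switching models or endpoints typically only requires changing the model ID and base URL.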
Additional Support: Billing, Sales, and/or Technical
For billing or sales support, reach out to sales@centml.ai.
You can also fill out a support request by following our Requesting Support guide. Support requests are not limited to sales and billing. They can include technical support, new model requests, and more. Please do not hesitate to reach out! We’re here to help!
What’s Next
Agents on CentML
Learn how agents can interact with CentML services.
Clients
Learn how to interact with the CentML platform programmatically.
Resources and Pricing
Learn more about CentML Pricing
Get Support
Submit a Support Request
CentML Serverless Endpoints
Dive deeper into advanced serverless configurations and patterns.
LLM Serving
Explore dedicated public and private endpoints for production model deployments.