AI with Upsun
Upsun provides powerful capabilities for hosting AI applications, agents, and services. You can deploy AI workloads using any supported runtime and integrate with various LLM APIs and services.
Note
Before you start, check out the Upsun demo app and the main Getting started guide. They provide all the core concepts and common commands you need to know before using the following materials.
AI applications and services
- AI Agents - Host conversational AI agents and chatbots using any supported runtime
- MCP Servers - Deploy Model Context Protocol servers for AI tool integration
Supported technologies
- Runtimes: Python, Node.js, PHP, Ruby, Go, Java, and any other supported runtime
- LLM APIs: OpenAI, Anthropic Claude, Google Gemini, Azure OpenAI, AWS Bedrock, and any other HTTP-based API service
- AI Frameworks: LangChain, LlamaIndex, Chainlit, and custom implementations
- Integration: REST APIs, WebSockets, and event-driven architectures
API flexibility
Upsun supports integration with any LLM service that provides an HTTP API. The services listed above are just popular examples. You can integrate with self-hosted models, specialized AI services, or any custom API endpoint that follows standard HTTP protocols.
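As a minimal sketch of what such an integration looks like, the snippet below calls an OpenAI-style chat-completions endpoint over plain HTTP using only the Python standard library. The endpoint URL, model name, and response schema are assumptions based on the common OpenAI-compatible format; substitute your provider's values.

```python
import json
import urllib.request

# Hypothetical OpenAI-compatible endpoint; replace with your provider's URL.
API_URL = "https://api.example.com/v1/chat/completions"


def build_payload(prompt: str, model: str = "gpt-4o-mini") -> bytes:
    """Serialize a chat-completion request body (OpenAI-style schema)."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")


def extract_reply(response_body: bytes) -> str:
    """Pull the assistant's text out of an OpenAI-style response body."""
    data = json.loads(response_body)
    return data["choices"][0]["message"]["content"]


def ask(prompt: str, api_key: str) -> str:
    """Send one prompt to the LLM API and return the reply text."""
    request = urllib.request.Request(
        API_URL,
        data=build_payload(prompt),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return extract_reply(response.read())
```

Because the request and response handling are plain HTTP and JSON, the same pattern works for self-hosted models or any custom endpoint; only the URL, headers, and schema change.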
Get started
- Choose your runtime: Select the programming language that best fits your AI application needs
- Configure your app: Set up your application in the `.upsun/config.yaml` file
- Integrate LLM APIs: Connect to your preferred AI service providers
- Deploy and scale: Push your code and let Upsun handle the infrastructure
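As a rough illustration of the configuration step, here is a minimal `.upsun/config.yaml` for a Python AI app. The application name (`myapp`), runtime version, and gunicorn start command are placeholder assumptions; adjust them for your runtime and framework.

```yaml
applications:
  myapp:
    type: "python:3.12"  # any supported runtime works here
    web:
      commands:
        # Hypothetical start command for a WSGI app named app.py
        start: "gunicorn -b 0.0.0.0:$PORT app:app"

routes:
  "https://{default}/":
    type: upstream
    upstream: "myapp:http"
```

Store API keys for your LLM providers as environment variables rather than hard-coding them in this file.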
For detailed examples and tutorials, see the AI and Machine Learning tutorials on DevCenter.
Find out more about the many languages Upsun supports.