# YoMo

> The YoMo documentation. YoMo is a serverless LLM function calling framework that enables developers to build and deploy AI agents with MCP quickly and easily.

## Docs

- [/v1/chat/completions](https://yomo.run/api-ref/endpoint/post.md): Creates a completion for the chat message
- [LLM Bridge API](https://yomo.run/api-ref/intro.md): LLM Bridge API
- [yomo init](https://yomo.run/cli/init.md): Generate an LLM function calling project
- [Overview](https://yomo.run/cli/overview.md): YoMo CLI
- [yomo run](https://yomo.run/cli/run.md): Start a Serverless LLM Function
- [yomo serve](https://yomo.run/cli/serve.md): Start the LLM Bridge and expose an MCP server
- [Introduction](https://yomo.run/introduction.md): Open-source serverless LLM function calling framework for AI agents
- [Quickstart](https://yomo.run/quickstart.md): Create your first LLM function calling tool in minutes
- [Chat Completions API](https://yomo.run/server-configuration/llm-bridge.md): LLM Bridge provides an OpenAI API-compatible server with support for multiple providers.
- [MCP Server](https://yomo.run/server-configuration/mcp-bridge.md): MCP Bridge creates an MCP server that can expose all Serverless LLM Functions.
- [Concept](https://yomo.run/sfn/concept.md): The core of the YoMo Serverless LLM Function

## OpenAPI Specs

- [openapi](https://yomo.run/api-ref/openapi.yaml)

## Optional

- [Documentation](https://yomo.run)
- [Community](https://discord.gg/RMtNhx7vds)
- [GitHub](https://github.com/yomorun/yomo)