1. Install the YoMo CLI

curl -fsSL "https://get.yomo.run" | sh
2. Initialize an LLM Function

yomo init tool-get-weather

An example serverless LLM function will be created in the tool-get-weather directory:

$ tree tool-get-weather
tool-get-weather
├── .env
├── .gitignore
├── package.json
├── pnpm-lock.yaml
├── src
│   └── app.ts
└── tsconfig.json
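The generated src/app.ts contains the tool implementation. As a hedged sketch only (the actual scaffold may differ; the description, Argument, and handler names below are assumptions based on a typical tool-calling function shape, not the exact generated file):

```typescript
// Hypothetical sketch of a get-weather tool function.
// The real generated src/app.ts may use different exports.

// Description the LLM uses to decide when to invoke this tool.
export const description = "Get the current weather for a city";

// Arguments the LLM extracts from the user's prompt.
export type Argument = {
  city: string;
};

// Handler: receives the extracted arguments and returns the tool result.
export async function handler(args: Argument): Promise<string> {
  // A real implementation would call a weather API here; this stub
  // returns a fixed answer purely for illustration.
  return `The weather in ${args.city} is sunny, 25°C.`;
}
```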
3. Start the YoMo Server and expose the MCP server

Create a yomo.config.yaml file in the root directory of your project. This file configures the YoMo server and the MCP server settings.

name: ai-zipper
host: 0.0.0.0
port: 9000

bridge:
  mcp:
    server:
      addr: localhost:9001
  ai:
    server:
      addr: localhost:9000
      provider: openai

    providers:
      openai:
        api_key: sk-proj-xxxxx
        model: gpt-4.1

Then start the YoMo server with this config:

yomo serve -c yomo.config.yaml

The output will look like this:

3:14PM INF Starting YoMo Zipper...
3:14PM INF using config file service=zipper zipper_name=ai-zipper file_path=./yomo.config.yaml
3:14PM INF listening SIGUSR1, SIGUSR2, SIGTERM/SIGINT...
3:14PM INF register LLM providers num=11
3:14PM INF start AI Bridge service service=llm-bridge addr=localhost:9000 provider=openai
3:14PM INF zipper is up and running service=zipper zipper_name=ai-zipper zipper_addr=[::]:9000 pid=34882 quic="[v1 v2]" auth_name=[none]
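Once the zipper is up, the AI bridge should accept OpenAI-compatible chat requests on the configured addr. Here is a hedged TypeScript sketch of building such a request; the /v1/chat/completions path and payload shape are assumptions based on the OpenAI-compatible API convention, not something this guide confirms:

```typescript
// Build an OpenAI-style chat-completions request for the AI bridge
// configured above. Endpoint path and payload shape are assumptions.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatRequest(prompt: string): { url: string; body: string } {
  const payload = {
    model: "gpt-4.1", // matches the model in yomo.config.yaml
    messages: [{ role: "user", content: prompt }] as ChatMessage[],
  };
  return {
    // addr from the ai.server section of yomo.config.yaml
    url: "http://localhost:9000/v1/chat/completions",
    body: JSON.stringify(payload),
  };
}

// Sending it once the zipper is running (requires a live server):
// const { url, body } = buildChatRequest("What is the weather in Paris?");
// const res = await fetch(url, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body,
// });
```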
4. Start the serverless LLM function

yomo run

Press Ctrl-C to stop the serverless LLM function.