1. Install CLI

curl -fsSL "https://get.yomo.run" | sh
2. Initialize an LLM Function

yomo init tool-get-weather

An example LLM function-calling app will be created in the tool-get-weather directory:

$ tree tool-get-weather
drwxr-xr-x@   - c3ylabs 15 Apr 10:00 tool-get-weather
.rw-r--r--@  53 c3ylabs 15 Apr 10:00 ├── .env
.rw-r--r--@  32 c3ylabs 15 Apr 10:00 ├── .gitignore
.rw-r--r--@ 391 c3ylabs 15 Apr 10:00 ├── package.json
.rw-r--r--@ 14k c3ylabs 15 Apr 10:00 ├── pnpm-lock.yaml
drwxr-xr-x@   - c3ylabs 15 Apr 10:00 ├── src
.rw-r--r--@ 612 c3ylabs 15 Apr 10:00 │   └── app.ts
.rw-r--r--@ 266 c3ylabs 15 Apr 10:00 └── tsconfig.json
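The generated src/app.ts holds the tool the LLM can invoke. A minimal sketch of what such a function might look like, following YoMo's TypeScript serverless convention of exporting a `description`, an `Argument` type, and a `handler` (the exact generated template may differ, and the mocked weather lookup here is purely illustrative):

```typescript
// src/app.ts — sketch of an LLM tool function for YoMo's TypeScript serverless runtime.

// Description the LLM reads when deciding whether to call this tool.
export const description = 'Get the current weather for a given city.';

// Arguments the LLM must extract from the conversation and pass to the tool.
export type Argument = {
  city_name: string;
};

// Invoked by the runtime with the LLM-extracted arguments.
// A real implementation would call a weather API; this returns mock data.
export async function handler(args: Argument): Promise<string> {
  return `The weather in ${args.city_name} is sunny, 25°C.`;
}
```

The description and argument type are what the AI bridge advertises to the model as a callable tool; the handler body is ordinary application code.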
3. Run the YoMo Server and expose the MCP server

yomo serve

The output will look like this:

3:14PM INF Starting YoMo Zipper...
3:14PM INF using config file service=zipper zipper_name=ai-zipper file_path=/Users/fanweixiao/tmp/yomo.config.ai.bridge.yaml
3:14PM INF listening SIGUSR1, SIGUSR2, SIGTERM/SIGINT...
3:14PM INF register LLM providers num=11
3:14PM INF start AI Bridge service service=llm-bridge addr=localhost:9000 provider=vllm
3:14PM INF zipper is up and running service=zipper zipper_name=ai-zipper zipper_addr=[::]:9000 pid=34882 quic="[v1 v2]" auth_name=[none]
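The log shows the AI Bridge service listening on localhost:9000. As a sketch, a client would talk to it with an OpenAI-compatible chat completion request; the endpoint path and model name below are assumptions based on that convention, so adjust them to your configured provider:

```typescript
// Build an OpenAI-compatible chat completion request body for the AI Bridge.
// NOTE: the endpoint path and model name are illustrative assumptions.
const endpoint = 'http://localhost:9000/v1/chat/completions';

const body = {
  model: 'llama-3.1-8b', // hypothetical; use whatever model your provider serves
  messages: [
    { role: 'user', content: "What's the weather in Paris today?" },
  ],
};

// A real client would POST it, e.g.:
//   const res = await fetch(endpoint, {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/json' },
//     body: JSON.stringify(body),
//   });
console.log(JSON.stringify(body, null, 2));
```

When the model decides the question needs weather data, the bridge routes the call to your serverless function and folds its result back into the reply.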
4. Start the serverless LLM function

yomo run
Press Ctrl-C to stop the serverless LLM function.