diff --git a/README.md b/README.md
index 09acbe0..8f0e25e 100644
--- a/README.md
+++ b/README.md
@@ -53,17 +53,19 @@ cp .env.example .env
 
 > **note:** droidclaw requires [bun](https://bun.sh), not node/npm. it uses bun-specific apis (`Bun.spawnSync`, native `.env` loading) that don't exist in node.
 
-edit `.env` - fastest way to start is with ollama (fully local, no api key):
+edit `.env` - fastest way to start is with groq (free tier):
+
+```bash
+LLM_PROVIDER=groq
+GROQ_API_KEY=gsk_your_key_here
+```
+
+or run fully local with [ollama](https://ollama.com) (no api key needed):
 
 ```bash
-# option a: local with ollama (no api key needed)
 ollama pull llama3.2
 LLM_PROVIDER=ollama
 OLLAMA_MODEL=llama3.2
-
-# option b: cloud with groq (free tier)
-LLM_PROVIDER=groq
-GROQ_API_KEY=gsk_your_key_here
 ```
 
 connect your phone (usb debugging on):
@@ -206,8 +208,8 @@ name: Send WhatsApp Message
 
 | provider | cost | vision | notes |
 |---|---|---|---|
+| groq | free tier | no | fastest to start |
 | ollama | free (local) | yes* | no api key, runs on your machine |
-| groq | free tier | no | fastest cloud option |
 | openrouter | per token | yes | 200+ models |
 | openai | per token | yes | gpt-4o |
 | bedrock | per token | yes | claude on aws |
diff --git a/site/index.html b/site/index.html
index ed08697..5f713f9 100644
--- a/site/index.html
+++ b/site/index.html
@@ -793,19 +793,18 @@ cp .env.example .env
 2
 
 configure an llm provider
 
-edit .env - fastest way is ollama (fully local, no api key):
-
-# local (no api key needed)
-ollama pull llama3.2
-LLM_PROVIDER=ollama
+edit .env - fastest way to start is groq (free tier):
+
+LLM_PROVIDER=groq
+GROQ_API_KEY=gsk_your_key_here
 
-# or cloud (free tier)
-LLM_PROVIDER=groq
-GROQ_API_KEY=gsk_your_key_here
+# or run fully local with ollama (no api key)
+# ollama pull llama3.2
+# LLM_PROVIDER=ollama
+
-
diff --git a/site/install.sh b/site/install.sh
index 2f0fd12..ced152c 100755
--- a/site/install.sh
+++ b/site/install.sh
@@ -168,12 +168,12 @@ fi
 printf "next steps:\n\n"
 printf " ${BOLD}1.${RESET} configure an llm provider in ${CYAN}droidclaw/.env${RESET}\n\n"
-printf " ${DIM}# local with ollama (no api key needed)${RESET}\n"
-printf " ollama pull llama3.2\n"
-printf " ${DIM}# set in .env:${RESET} LLM_PROVIDER=ollama\n\n"
-printf " ${DIM}# or cloud with groq (free tier)${RESET}\n"
+printf " ${DIM}# groq (free tier, fastest to start)${RESET}\n"
 printf " ${DIM}# set in .env:${RESET} LLM_PROVIDER=groq\n"
 printf " ${DIM}# ${RESET} GROQ_API_KEY=gsk_...\n\n"
+printf " ${DIM}# or local with ollama (no api key needed)${RESET}\n"
+printf " ollama pull llama3.2\n"
+printf " ${DIM}# set in .env:${RESET} LLM_PROVIDER=ollama\n\n"
 printf " ${BOLD}2.${RESET} connect your android phone (usb debugging on)\n\n"
 printf " adb devices\n\n"
 printf " ${BOLD}3.${RESET} run it\n\n"
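The README and install-script hunks above describe the same two quick-start setups. As a consolidated reference — contents taken from the hunks themselves, with `gsk_your_key_here` remaining a placeholder — the resulting `.env` would be one of:

```shell
# option 1: groq (free tier, fastest to start)
LLM_PROVIDER=groq
GROQ_API_KEY=gsk_your_key_here

# option 2: fully local with ollama (no api key; run `ollama pull llama3.2` first)
# LLM_PROVIDER=ollama
# OLLAMA_MODEL=llama3.2
```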
 | provider | cost | vision | notes |
 |---|---|---|---|
+| groq | free | no | fastest to start |
 | ollama | free (local) | yes* | no api key, runs on your machine |
-| groq | free | no | fastest cloud option |
 | openrouter | per token | yes | 200+ models |
 | openai | per token | yes | gpt-4o |
 | bedrock | per token | yes | claude on aws |
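For context, the provider switch these `.env` keys drive can be sketched as a small shell check. This is illustrative only — not droidclaw's actual code; the variable names mirror the `.env` keys in the hunks above, and the defaults are assumptions:

```shell
#!/bin/sh
# illustrative sketch: validate an env-driven provider choice using the
# same keys as droidclaw's .env (not its real implementation)
provider="${LLM_PROVIDER:-ollama}"   # assume the local option as a default
case "$provider" in
  groq)
    # groq is a hosted api, so an api key must be present
    [ -n "${GROQ_API_KEY:-}" ] || { echo "GROQ_API_KEY required" >&2; exit 1; }
    ;;
  ollama)
    # ollama runs locally; fall back to the model the docs pull
    : "${OLLAMA_MODEL:=llama3.2}"
    ;;
  *)
    echo "unknown provider: $provider" >&2
    exit 1
    ;;
esac
echo "provider=$provider"
```

With no variables set, the sketch falls through to the local default and fills in the model name.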