Setting Up OpenClaw with Ollama Cloud

April 27, 2026

For the last couple of years, I've been running local models on my own hardware. There's a certain satisfaction in having an AI that lives entirely on your machine: no internet connection is required to run your LLM, and you have total privacy, with nothing going to the cloud. But lately I have wanted my AI to do more, to perform tasks on my behalf. I wanted an AI that could read and write files, browse the web, research hotels, set reminders, and more. In other words, I wanted an agentic AI.

I heard about OpenClaw a couple of months ago and wanted to try it out. Since I already use Ollama to run my local models, I first tried a similar setup, with everything running locally. But my Linux computer crawled under the Qwen model I chose, and even the most powerful computer in the house, an M4 Mac Mini, took too long to respond. Since Ollama introduced cloud models several months ago, and since I was already comfortable with the tool, I decided to give that setup a try.

Below are the steps I used to set up OpenClaw using Ollama and Ollama Cloud models.

8 Steps to Setting Up OpenClaw

  1. Create an Ollama Account: Head over to ollama.com and sign up for a free account. This is your key to the cloud models.
  2. Install Ollama: Download the installer for your OS (Linux, Mac, or Windows) from the site. The instructions are pretty straightforward.
  3. Launch Ollama: Open your terminal and type ollama. This starts the engine.
  4. Choose "Chat with a Model": You'll see a menu. You can hit Enter for the default, but I recommend pressing the right arrow to explore. Look for a cloud model (they usually end in :cloud). Make sure it's multi-modal if you want it to handle images or files alongside text.
    Note: This will prompt you to log in with the account you created in Step 1.
  5. Test Drive: Chat with the model for a bit. Ask it a riddle, request a poem, or see how it handles a coding question.
  6. Exit the Chat: Type /bye to leave the raw model interface.
  7. Launch OpenClaw: Choose the "Launch OpenClaw" option.
    Note: If you haven't installed Node.js yet, you might need to do that now. OpenClaw runs on it, and the installer should guide you if it's missing.
  8. Start Your Session: Once OpenClaw is running, it will ask you to choose a model. Pick the cloud model you tested earlier. Now, start chatting naturally. Ask it what it can do!
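For reference, the terminal side of Steps 2 through 6 can be sketched as a short shell session. This is only a sketch: the model name below is an example (substitute whichever cloud-tagged model you pick in Step 4), and the OpenClaw launch in Step 7 happens through Ollama's interactive menu, so it is only noted in a comment here.

```shell
#!/bin/sh
# Condensed sketch of the terminal steps above. The model tag is an
# example only -- use the cloud model you chose in Step 4.

ollama_bin=$(command -v ollama || true)

if [ -n "$ollama_bin" ]; then
    ollama signin                   # log in with the account from Step 1
    ollama run gpt-oss:120b-cloud   # test-drive the model; type /bye to exit (Step 6)
    # Then relaunch "ollama" and choose "Launch OpenClaw" (Step 7).
else
    echo "Ollama is not installed yet; see Step 2 (https://ollama.com/download)"
fi
```

If the ollama binary is missing, the script points you back to the installer instead of failing partway through.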

Conclusion

I watched a lot of videos on OpenClaw before I decided to dive in. I took a Udemy class on it and watched YouTube videos to see how other people implemented it. In the end I used Ollama to set it up. It has been running really well for me, so if you are looking for an easy way to try the tech out on an affordable plan, this might be the option for you.