How to Install and Set Up Ollama

Follow this quick guide to set up Ollama for AI Book Writer.

Before You Start

Ollama allows you to run AI models locally on your own computer, giving you free and private access to powerful language models. This is an excellent option for users who prefer not to use cloud-based API services or want to work offline.
Step 1: Download Ollama

Visit ollama.com/download and select the version for your operating system (Windows, macOS, or Linux). The download is approximately 30MB and installs quickly on most systems.

Step 2: Install Ollama

Run the installer and follow the on-screen prompts to complete the installation. On Windows, you might need to accept a security prompt. Once installed, Ollama will run in the background with a small icon in your system tray.

Step 3: Download an AI Model

Open a command prompt or terminal and enter the command ‘ollama pull llama2’ to download the Llama 2 model. This may take several minutes depending on your internet connection. You can also download other models like ‘mistral’ or ‘phi’.
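If you prefer to script downloads instead of typing them by hand, Ollama also exposes a REST endpoint for pulling models (POST /api/pull). Below is a minimal sketch assuming the default port 11434; the helper name is my own, and it only builds the request so you can inspect it before sending:

```python
import json
import urllib.request

def build_pull_request(model: str, base_url: str = "http://localhost:11434") -> urllib.request.Request:
    """Build (but do not send) a request asking Ollama to pull a model.

    Assumes Ollama's documented REST API: POST /api/pull with a JSON
    body naming the model, e.g. {"name": "llama2"}.
    """
    body = json.dumps({"name": model}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/pull",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_pull_request("llama2")
print(req.full_url)          # http://localhost:11434/api/pull
print(json.loads(req.data))  # {'name': 'llama2'}
```

Once the Ollama server is running, `urllib.request.urlopen(req)` would send it; the command-line `ollama pull` shown above does the same thing with progress output.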

Step 4: Run Ollama

After the model download completes, Ollama will automatically start serving the API locally. To verify it’s working, you can enter ‘ollama run llama2’ in your terminal to test the model with a simple chat interface.
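You can also verify the server from a script rather than the chat interface. A minimal sketch, assuming Ollama's default port 11434 and its /api/tags endpoint (which lists installed models); the function name is my own:

```python
import json
import urllib.error
import urllib.request

def ollama_is_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama server answers on /api/tags."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            json.load(resp)  # response body is JSON describing installed models
            return True
    except (urllib.error.URLError, ValueError, OSError):
        return False

print(ollama_is_running())  # True once Ollama is serving locally
```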

Step 5: Configure AI Book Writer

Open AI Book Writer and go to Settings > Model Settings. Select ‘Ollama’ as your API provider, then open the Ollama (Local) tab and enter the API URL ‘http://localhost:11434/api’. Enter the name of the model you downloaded, save your settings, and you’re ready to create books with your local model.
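AI Book Writer sends its requests to the URL you entered. As a sanity check on that value, here is a sketch of what a generation request to Ollama's REST API looks like (POST /api/generate with a model name and a prompt); the prompt text is only an example:

```python
import json

API_URL = "http://localhost:11434/api"  # the value entered in AI Book Writer
MODEL = "llama2"                        # the model you pulled earlier

# Ollama's /api/generate endpoint expects a JSON body like this;
# "stream": False asks for one complete JSON response instead of a stream.
payload = {"model": MODEL, "prompt": "Outline a short story.", "stream": False}

endpoint = f"{API_URL}/generate"
body = json.dumps(payload)
print(endpoint)  # http://localhost:11434/api/generate
print(body)
```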

Important: Running AI models locally requires sufficient RAM and CPU resources. For optimal performance, we recommend at least 16GB RAM for smaller models and a dedicated GPU for larger models.
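To check whether your machine meets the 16GB guideline, you can query total physical memory from the Python standard library. A sketch using os.sysconf, which exposes page size and page count on Linux and macOS (the function returns None on platforms where those names are unavailable):

```python
import os

def total_ram_gb():
    """Total physical RAM in GiB, or None if the platform doesn't expose it."""
    try:
        page_size = os.sysconf("SC_PAGE_SIZE")
        page_count = os.sysconf("SC_PHYS_PAGES")
    except (ValueError, OSError, AttributeError):
        return None  # e.g. Windows, where os.sysconf does not exist
    return page_size * page_count / (1024 ** 3)

ram = total_ram_gb()
if ram is not None and ram < 16:
    print(f"{ram:.1f} GiB RAM: consider smaller models such as phi")
```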

Additional Information

  • Ollama is completely free and runs offline after initial model download
  • Multiple language models are available (Llama 2, Mistral, Phi, and others)
  • Models vary in size from 3GB to 20GB+ depending on parameters
  • To change models, open Settings > Ollama (Local) in AI Book Writer and enter the name of the model you want to use
  • Local models may be slower than cloud APIs but offer complete privacy
  • For heavy usage or large books, consider a computer with dedicated GPU

Create complete, publication-ready books with AI-powered assistance.

Contact

Email: aibookwriter [at] robertbuerger.de

© 2024 AI Book Writer. All rights reserved.