
Get Started

There are a couple of ways to start using Open Notebook, depending on your objectives and technical knowledge.

📦 Boilerplate (all you need)

There is a project on GitHub to help you get started very easily. It starts the services and also sets up the folders you need to customize your transformations.

Take a look at the Open Notebook Boilerplate repo for a sample of how to set it up to get the most out of the tool's features.

🐳 Docker Setup (quick start)

This docker-compose approach will get the services running for you.

Just create a file named docker-compose.yaml in a folder on your computer with the following content:

yaml
services:
  surrealdb:
    image: surrealdb/surrealdb:v2
    ports:
      - "8000:8000"
    volumes:
      - surreal_data:/mydata
    command: start --user root --pass root rocksdb:/mydata/mydatabase.db
    pull_policy: always
    user: root

  open_notebook:
    image: lfnovo/open_notebook:latest
    ports:
      - "8080:8502"
    environment:
        - OPENAI_API_KEY=API_KEY # replace API_KEY with your OpenAI API key
        - SURREAL_ADDRESS=surrealdb
        - SURREAL_PORT=8000
        - SURREAL_USER=root
        - SURREAL_PASS=root
        - SURREAL_NAMESPACE=open_notebook
        - SURREAL_DATABASE=open_notebook
    depends_on:
      - surrealdb
    pull_policy: always
    volumes:
      - notebook_data:/app/data

volumes:
  surreal_data:
  notebook_data:

The example above will get you set up quickly, but if you want to get the most out of the tool, there are a couple more steps.

First, you might want to use local folders for storage instead of Docker volumes.

yaml
services:
  surrealdb:
    image: surrealdb/surrealdb:v2
    ports:
      - "8000:8000"
    volumes:
      - ./surreal_data/:/mydata
    command: start --user root --pass root rocksdb:/mydata/mydatabase.db
    pull_policy: always
    user: root

  open_notebook:
    image: lfnovo/open_notebook:latest
    ports:
      - "8080:8502"
    env_file:
      - docker.env
    depends_on:
      - surrealdb
    pull_policy: always
    volumes:
      - ./notebook_data:/app/data
      - ./transformations.yaml:/app/transformations.yaml
      - ./user:/app/prompts/patterns/user
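
This version reads its settings from a docker.env file in the same folder. A minimal sketch, mirroring the environment block from the first compose example (the OPENAI_API_KEY value is a placeholder):

sh
# docker.env
OPENAI_API_KEY=your-openai-api-key
SURREAL_ADDRESS=surrealdb
SURREAL_PORT=8000
SURREAL_USER=root
SURREAL_PASS=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=open_notebook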

This will create three folders: one for the database data (for easier backups), one for the application data (uploads, downloads, podcasts), and one for your custom prompt patterns. It also mounts a transformations.yaml file from the project folder.

If you plan to create custom transformations, you need access to the transformations.yaml file, so the best way to handle this is to download it and place it in the project folder.
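
You can create the folders and fetch the file before starting the stack; otherwise Docker will create any missing bind-mount paths itself, typically as root-owned directories. The download URL below assumes transformations.yaml sits at the root of the repository's main branch, so adjust it if the layout differs:

sh
mkdir -p surreal_data notebook_data user
curl -o transformations.yaml https://raw.githubusercontent.com/lfnovo/open_notebook/main/transformations.yaml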

Then, you can just start the application:

bash
docker compose up -d

Navigate to http://localhost:8080 and start having fun.
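
If the page does not load, check that both containers are up and take a look at the logs (open_notebook is the service name from the compose file):

sh
docker compose ps
docker compose logs -f open_notebook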

📦 Installing from Source

If you really want to play with the source code, clone the repository and install the dependencies:

sh
git clone https://github.com/lfnovo/open_notebook.git
cd open_notebook
poetry install
cp .env.example .env
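
Edit .env with your credentials before starting. A minimal sketch, assuming you use OpenAI and the default SurrealDB credentials from the compose setup, with the database reachable on localhost (your .env.example may list additional variables):

sh
OPENAI_API_KEY=your-openai-api-key
SURREAL_ADDRESS=localhost
SURREAL_PORT=8000
SURREAL_USER=root
SURREAL_PASS=root
SURREAL_NAMESPACE=open_notebook
SURREAL_DATABASE=open_notebook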

Run the database with:

bash
docker compose --profile db_only up

Run the app with:

sh
poetry run streamlit run app_home.py

Initial setup after loading the app

After the app is running, you'll be asked to configure your models. Head to the Models page for more information on this.

Upgrading Open Notebook

Running from source

Just run git pull in the project root folder and then poetry install to update dependencies.
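
For example, from the root of the cloned repository:

sh
git pull
poetry install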

Running from docker

Just pull the latest image with docker pull lfnovo/open_notebook:latest and restart your containers with docker compose up -d.
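
From the folder that contains your docker-compose.yaml:

sh
docker pull lfnovo/open_notebook:latest
docker compose up -d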

System Requirements

  • Docker Engine running on your machine
  • 4GB RAM minimum
  • 2GB free disk space (the more the better)
  • An API Key for OpenAI, Anthropic, Gemini, Vertex or OpenRouter

Next Steps

  1. Create your first notebook
  2. Import some content
  3. Try out the AI features
  4. Write your notes
  5. Generate a podcast
