
Lantern CLI

Installation

Run with Docker

You can use the Docker image to quickly run Lantern CLI.

Run CPU image

```bash
docker run --rm lanterndata/lantern-cli -h
```

Run GPU image

```bash
nvidia-docker run --rm lanterndata/lantern-cli:gpu -h
```
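If you use the NVIDIA Container Toolkit instead of the legacy `nvidia-docker` wrapper, plain `docker` with the `--gpus` flag is equivalent. The sketch below echoes the command so it can be inspected on machines without a GPU; drop the `echo` to execute it.

```shell
# Equivalent GPU run via the NVIDIA Container Toolkit (drop `echo` to execute).
echo docker run --rm --gpus all lanterndata/lantern-cli:gpu -h
```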

Using volumes

When running the create-embeddings command, pass -v lantern-models:/models to Docker and --data-path /models to the CLI, so that downloaded models persist in a Docker volume. Example:

```bash
docker run -v lantern-models:/models --rm --network host lanterndata/lantern-cli create-embeddings --model 'BAAI/bge-large-en' --uri 'postgresql://postgres@host.docker.internal:5432/postgres' --table "wiki" --column "content" --out-column "content_embedding" --pk "id" --batch-size 40 --data-path /models
```

For the start-daemon command, pass -v lantern-daemon-data:/var/lib/lantern-daemon to Docker so the daemon's data persists as well.
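Combining both volume tips, a run that persists the model cache and the daemon's data could look like the sketch below. The daemon's own flags (database URI, etc.) are omitted here; see `start-daemon -h` for the full list. The command is echoed so the sketch can be inspected without Docker installed; drop the `echo` to run it.

```shell
# Sketch: persist both the model cache and daemon data across container runs.
# Drop `echo` to actually start the daemon; add its required flags first.
echo docker run \
  -v lantern-models:/models \
  -v lantern-daemon-data:/var/lib/lantern-daemon \
  --rm --network host \
  lanterndata/lantern-cli start-daemon
```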

Install ONNX Runtime

ONNX Runtime is required for Lantern CLI embedding generation. It is pre-packaged in the Docker image, but must be installed separately when installing from binaries. When installing with Cargo, it is required on all platforms except Linux.

  1. Get ONNX Runtime library

    Download the ONNX Runtime library from the GitHub releases page

    ```bash
    PLATFORM=linux
    ARCH=x64
    ONNX_VERSION=1.16.1

    wget "https://github.com/microsoft/onnxruntime/releases/download/v${ONNX_VERSION}/onnxruntime-${PLATFORM}-${ARCH}-${ONNX_VERSION}.tgz"
    ```

    Note: If you have a GPU, download the GPU build of the onnxruntime library instead.

  2. Extract archive

    Extract the downloaded archive into /usr/local/lib

    ```bash
    mkdir -p /usr/local/lib
    tar xzf "onnxruntime-${PLATFORM}-${ARCH}-${ONNX_VERSION}.tgz" -C /usr/local/lib
    ```
  3. Setup environment

    Export path to ONNX library

    ```bash
    export ORT_STRATEGY=system
    export ORT_DYLIB_PATH=/usr/local/lib/onnxruntime-${PLATFORM}-${ARCH}-${ONNX_VERSION}/lib/libonnxruntime.so
    ```

    Note: On macOS the library will have a .dylib extension instead of .so. You can export these variables from your shell profile, or add the library to the ld search path like this:

    ```bash
    echo "/usr/local/lib/onnxruntime-${PLATFORM}-${ARCH}-${ONNX_VERSION}/lib/libonnxruntime.so" > /etc/ld.so.conf.d/onnx.conf
    ldconfig
    ```
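As a quick sanity check after setup (a sketch assuming the default paths and versions used above), you can verify that the library file actually exists at the exported path:

```shell
# Sanity check: confirm the ONNX Runtime shared library is where ORT_DYLIB_PATH points.
PLATFORM=linux
ARCH=x64
ONNX_VERSION=1.16.1
ORT_DYLIB_PATH=/usr/local/lib/onnxruntime-${PLATFORM}-${ARCH}-${ONNX_VERSION}/lib/libonnxruntime.so

if [ -f "$ORT_DYLIB_PATH" ]; then
  echo "found: $ORT_DYLIB_PATH"
else
  echo "missing: $ORT_DYLIB_PATH (re-check the extract step)"
fi
```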

Install From Binaries

Prerequisites

  • ONNX Runtime - see previous section for instructions

Supported platforms

  • Linux (x86_64)

Use our releases from GitHub

You can download prebuilt binaries from our releases page.

Install With Cargo

Prerequisites

  • Rust >=1.70.0
  • Cargo
  • ONNX Runtime - see earlier section for instructions

Installation

```bash
cargo install --git https://github.com/lanterndata/lantern.git
```

Verify installation

```bash
lantern-cli -h
```
