Requirements

  1. Install Docker on your machine;
  2. Make sure Docker Compose is installed and available (it should be the case if you have chosen to install Docker via Docker Desktop); and
  3. Make sure Git is installed on your machine.

Run the app

To start using Lago, run the following commands in a shell:

# Get the code
git clone https://github.com/getlago/lago.git

# Go to Lago folder
cd lago

# Set up environment configuration
echo "LAGO_RSA_PRIVATE_KEY=\"`openssl genrsa 2048 | base64`\"" >> .env
source .env

# Start
docker-compose up

You can now open your browser and go to http://localhost to connect to the application. Once you have signed up, Lago’s API is exposed at http://localhost:3000.
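
Once the containers are up, a quick smoke test from another shell confirms that both endpoints answer. This is a sketch; the URLs assume the default ports from the configuration below:

```shell
# Smoke test: check that the front-end and the API answer.
# URLs assume the default FRONT_PORT (80) and API_PORT (3000).
FRONT_URL="http://localhost"
API_URL="http://localhost:3000"

curl -fsS -o /dev/null "$FRONT_URL" && echo "front-end is up" || echo "front-end not reachable yet"
curl -fsS -o /dev/null "$API_URL" && echo "API is up" || echo "API not reachable yet"
```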

Signing up

Signing up to create your organization is mandatory. This organization is the core object of your billing system, as it is used to invoice your customers.

  1. Write down your organization name;
  2. Use the main billing email of your company; and
  3. Define the admin password for this email.

You will be able to invite other email addresses within the application. If you already have an account, you can also log in. Once you are able to access the app, you can retrieve your API key.

Find your API Key

Your API Key can be found directly in the UI:

  1. Access the Developer section from the sidebar;
  2. The first tab of this section is related to your API keys; and
  3. Click the Copy button to copy it to the clipboard.
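
The key is then sent as a Bearer token on every API call. A minimal sketch, assuming the default API URL and a /api/v1/customers endpoint (check the API reference for the exact routes):

```shell
# LAGO_API_KEY is a placeholder; paste the key copied from the UI.
LAGO_API_KEY="${LAGO_API_KEY:-your-api-key}"
AUTH_HEADER="Authorization: Bearer $LAGO_API_KEY"

# Example request; `|| true` keeps the shell going if the stack is not up yet.
curl -s -H "$AUTH_HEADER" "http://localhost:3000/api/v1/customers" || true
```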

Configuration

Version

Docker images are always updated to the latest stable version in the docker-compose.yml file. You can use a different tag if needed by checking the releases list.

We recommend avoiding the latest tag: use the most recent tagged version instead. You can track the latest versions on Docker Hub.
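
To pin a version, edit the image tags in docker-compose.yml before starting the stack. A sketch; the image names and the v1.0.0 tag are illustrative, so take the actual tag from the releases list on Docker Hub:

```yaml
# docker-compose.yml excerpt: pin explicit tags instead of relying on the default.
# The tag v1.0.0 is illustrative; use the most recent release from Docker Hub.
api:
  image: getlago/api:v1.0.0
front:
  image: getlago/front:v1.0.0
```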

Environment variables

Lago uses the following environment variables to configure the components of the application. You can override them to customise your setup.

| Variable | Default value | Description |
|---|---|---|
| POSTGRES_HOST | db | Host name of the Postgres server |
| POSTGRES_DB | lago | Name of the Postgres database |
| POSTGRES_USER | lago | Database user for the Postgres connection |
| POSTGRES_PASSWORD | changeme | Database password for the Postgres connection |
| POSTGRES_PORT | 5432 | Port the Postgres database listens on |
| REDIS_HOST | redis | Host name of the Redis server |
| REDIS_PORT | 6379 | Port the Redis database listens on |
| LAGO_REDIS_CACHE_HOST | redis | Host name of the Redis cache server |
| LAGO_REDIS_CACHE_PORT | 6379 | Port the Redis cache server listens on |
| LAGO_FRONT_URL | http://localhost | URL of the Lago front-end application; used for the CORS configuration |
| FRONT_PORT | 80 | Port the front-end application listens on |
| LAGO_API_URL | http://localhost:3000 | URL of the Lago back-end application |
| API_PORT | 3000 | Port the back-end application listens on |
| SECRET_KEY_BASE | your-secret-key-base-hex-64 | Secret key used for session encryption |
| SENTRY_DSN | | Sentry DSN key for error and performance tracking |
| LAGO_RSA_PRIVATE_KEY | | Private key used for webhook signatures |
| LAGO_SIDEKIQ_WEB | | Activate the Sidekiq web UI; disabled by default |
| LAGO_ENCRYPTION_PRIMARY_KEY | | Encryption primary key used to secure sensitive values stored in the database |
| LAGO_ENCRYPTION_DETERMINISTIC_KEY | | Encryption deterministic key used to secure sensitive values stored in the database |
| LAGO_ENCRYPTION_KEY_DERIVATION_SALT | | Encryption key derivation salt used to secure sensitive values stored in the database |
| LAGO_WEBHOOK_ATTEMPTS | 3 | Number of failed attempts before webhook delivery stops |
| LAGO_USE_AWS_S3 | | Use AWS S3 for file storage |
| LAGO_AWS_S3_ACCESS_KEY_ID | | AWS access key ID with access to S3 |
| LAGO_AWS_S3_SECRET_ACCESS_KEY | | AWS secret access key with access to S3 |
| LAGO_AWS_S3_REGION | | AWS S3 region |
| LAGO_AWS_S3_BUCKET | | AWS S3 bucket name |
| LAGO_AWS_S3_ENDPOINT | | S3-compatible storage endpoint; set only if you are using a storage provider other than AWS S3 |
| LAGO_USE_GCS | false | Use Google Cloud Storage for file storage. ⚠️ If you want to use GCS, you have to pass the credentials JSON key file to the api and worker services |
| LAGO_GCS_PROJECT | | GCS project name |
| LAGO_GCS_BUCKET | | GCS bucket name |
| LAGO_PDF_URL | http://pdf:3000 | PDF service URL on your infrastructure |
| LAGO_DISABLE_SIGNUP | | Disable sign-up when running Lago self-hosted |
| LAGO_RAILS_STDOUT | | Set to true to activate logs on containers |

We recommend that you change POSTGRES_PASSWORD, SECRET_KEY_BASE, LAGO_RSA_PRIVATE_KEY, LAGO_ENCRYPTION_PRIMARY_KEY, LAGO_ENCRYPTION_DETERMINISTIC_KEY and LAGO_ENCRYPTION_KEY_DERIVATION_SALT to improve the security of your Lago instance:

  • SECRET_KEY_BASE can be generated using the openssl rand -hex 64 command.
  • LAGO_RSA_PRIVATE_KEY can be generated using the openssl genrsa 2048 | base64 command.
  • LAGO_ENCRYPTION_PRIMARY_KEY, LAGO_ENCRYPTION_DETERMINISTIC_KEY and LAGO_ENCRYPTION_KEY_DERIVATION_SALT can all be generated using the cat /dev/urandom | tr -dc 'a-zA-Z0-9' | fold -w 32 | head -n 1 command.
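
Putting the three commands together, you can seed a .env file with fresh secrets in one pass. This is a sketch: it appends to your existing .env, and the random-string helper is an equivalent rewrite of the command above that avoids a broken pipe:

```shell
# Generate the recommended secrets and append them to .env.
SECRET_KEY_BASE="$(openssl rand -hex 64)"
# base64 wraps long lines on Linux; strip the newlines so the value stays on one line.
LAGO_RSA_PRIVATE_KEY="$(openssl genrsa 2048 2>/dev/null | base64 | tr -d '\n')"
# Same character set as the command above, rewritten to avoid a broken pipe.
rand32() { head -c 1024 /dev/urandom | tr -dc 'a-zA-Z0-9' | cut -c1-32; }
LAGO_ENCRYPTION_PRIMARY_KEY="$(rand32)"
LAGO_ENCRYPTION_DETERMINISTIC_KEY="$(rand32)"
LAGO_ENCRYPTION_KEY_DERIVATION_SALT="$(rand32)"

{
  echo "SECRET_KEY_BASE=\"$SECRET_KEY_BASE\""
  echo "LAGO_RSA_PRIVATE_KEY=\"$LAGO_RSA_PRIVATE_KEY\""
  echo "LAGO_ENCRYPTION_PRIMARY_KEY=\"$LAGO_ENCRYPTION_PRIMARY_KEY\""
  echo "LAGO_ENCRYPTION_DETERMINISTIC_KEY=\"$LAGO_ENCRYPTION_DETERMINISTIC_KEY\""
  echo "LAGO_ENCRYPTION_KEY_DERIVATION_SALT=\"$LAGO_ENCRYPTION_KEY_DERIVATION_SALT\""
} >> .env
```

Remember to also change POSTGRES_PASSWORD to a strong value of your own.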

Components

Lago uses the following containers:

| Container | Role |
|---|---|
| front | Front-end application |
| api | API back-end application |
| api_worker | Asynchronous worker for the API application |
| api_clock | Clock worker for the API application |
| db | Postgres database engine used to store application data |
| redis | Redis database engine used as a queuing system for asynchronous tasks |
| pdf | PDF generation powered by Gotenberg |

You can also use your own database or Redis server. To do so, remove the db and redis configurations from the docker-compose.yml file and update the environment variables accordingly.
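
For example, pointing Lago at an external Postgres and Redis only requires overriding the connection variables before starting the stack (the host names and password below are illustrative):

```shell
# Illustrative values; replace them with your own servers.
export POSTGRES_HOST="db.internal.example.com"
export POSTGRES_PORT="5432"
export POSTGRES_DB="lago"
export POSTGRES_USER="lago"
export POSTGRES_PASSWORD="a-strong-password"
export REDIS_HOST="redis.internal.example.com"
export REDIS_PORT="6379"
```

docker-compose picks these up from the environment or from .env when interpolating docker-compose.yml.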

SSL Support

The Lago front-end application can be configured to support SSL certificates. You have two options to achieve this:

  • by using a self-signed certificate
  • by using a signed certificate generated by Let’s Encrypt

Self-Signed Certificate

  • Run the script to generate the certificates
# Be sure to be in your lago folder
./extra/init-selfsigned.sh

# This should create certificates in the ./extra/ssl/ folder
  • Take a look at the docker-compose.yml file and uncomment the part related to the Self-Signed certificate
volumes:
  - ./extra/nginx-selfsigned.conf:/etc/nginx/conf.d/default.conf
  - ./extra/ssl/nginx-selfsigned.crt:/etc/ssl/certs/nginx-selfsigned.crt
  - ./extra/ssl/nginx-selfsigned.key:/etc/ssl/private/nginx-selfsigned.key
  - ./extra/ssl/dhparam.pem:/etc/ssl/certs/dhparam.pem
  • You can now start the front application with self-signed SSL certificate support
docker-compose up front
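
For reference, the generation script boils down to a couple of openssl calls. A minimal sketch, with illustrative flags and paths (see the script itself for the exact ones):

```shell
# Generate a self-signed certificate and key (illustrative flags).
SSL_DIR="$(mktemp -d)"
openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
  -subj "/CN=localhost" \
  -keyout "$SSL_DIR/nginx-selfsigned.key" \
  -out "$SSL_DIR/nginx-selfsigned.crt" 2>/dev/null

# The nginx config also expects DH parameters (slow to generate):
# openssl dhparam -out "$SSL_DIR/dhparam.pem" 2048

# Inspect the result.
openssl x509 -in "$SSL_DIR/nginx-selfsigned.crt" -noout -subject
```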

Let’s Encrypt Certificate

  • Edit the file extra/init-letsencrypt.sh
    • You must replace lago.example with your domain name
    • You must enter a valid email address
  • Edit the file extra/nginx-letsencrypt.conf
    • You must replace lago.example with your domain name
  • Run the script
# Be sure to be in your lago folder
./extra/init-letsencrypt.sh

# You will be asked to provide some information
# After that you should be able to see the extra/certbot folder
  • Take a look at the docker-compose.yml file and uncomment all the parts related to the Let’s Encrypt’s support
command: '/bin/sh -c ''while :; do sleep 6h & wait $${!}; nginx -s reload; done & nginx -g "daemon off;"'''

volumes:
  - ./extra/nginx-letsencrypt.conf:/etc/nginx/conf.d/default.conf
  - ./extra/certbot/conf:/etc/letsencrypt
  - ./extra/certbot/www:/var/www/certbot
  • You can now start the front application with the signed certificate support
docker-compose up front

Storage

By default, Lago uses the internal storage of the container. You can customize it by defining different environment variables.

We currently support:

  • AWS S3
  • AWS S3-compatible endpoints
  • Google Cloud Storage

If you use an S3-compatible endpoint, you must set LAGO_AWS_S3_REGION to a default value (e.g., us-east-1); it is required for the integration to work properly.

AWS S3

You have to set these variables to use AWS S3.

| Name | Description |
|---|---|
| LAGO_USE_AWS_S3 | Set to "true" if you want to use AWS S3 |
| LAGO_AWS_S3_ACCESS_KEY_ID | AWS S3 credentials access key ID |
| LAGO_AWS_S3_SECRET_ACCESS_KEY | AWS S3 credentials secret access key |
| LAGO_AWS_S3_REGION | AWS S3 region |
| LAGO_AWS_S3_BUCKET | AWS S3 bucket |

AWS S3 Compatible Endpoints

You have to set these variables to use AWS S3 Compatible Endpoints.

| Name | Description |
|---|---|
| LAGO_USE_AWS_S3 | Set to "true" if you want to use an AWS S3-compatible endpoint |
| LAGO_AWS_S3_ENDPOINT | AWS S3-compatible endpoint |
| LAGO_AWS_S3_ACCESS_KEY_ID | AWS S3 credentials access key ID |
| LAGO_AWS_S3_SECRET_ACCESS_KEY | AWS S3 credentials secret access key |
| LAGO_AWS_S3_BUCKET | AWS S3 bucket |
| LAGO_AWS_S3_REGION | Not used, but required by the AWS SDK |
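
For an S3-compatible store such as MinIO, the configuration could look like this (the endpoint, keys and bucket are placeholders; note the dummy region):

```shell
export LAGO_USE_AWS_S3="true"
export LAGO_AWS_S3_ENDPOINT="https://minio.internal.example.com"
export LAGO_AWS_S3_ACCESS_KEY_ID="minio-access-key"
export LAGO_AWS_S3_SECRET_ACCESS_KEY="minio-secret-key"
export LAGO_AWS_S3_BUCKET="lago"
export LAGO_AWS_S3_REGION="us-east-1"  # not used, but required by the AWS SDK
```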

Google Cloud Storage

You have to set these variables to use Google Cloud Storage.

| Name | Description |
|---|---|
| LAGO_USE_GCS | Set to "true" if you want to use Google Cloud Storage |
| LAGO_GCS_PROJECT | GCS project name |
| LAGO_GCS_BUCKET | GCS bucket name |

In the docker-compose.yml file, you must uncomment the following lines and mount the correct GCS credentials JSON key file:

api:
  volumes:
    - gcs_keyfile.json:/app/gcs_keyfile.json

api-worker:
  volumes:
    - gcs_keyfile.json:/app/gcs_keyfile.json

SMTP Configuration

To use the email feature, you need to configure the following environment variables.

| Name | Description |
|---|---|
| LAGO_FROM_EMAIL | Required to send emails (e.g., noreply@getlago.com) |
| LAGO_SMTP_ADDRESS | Address of the SMTP server |
| LAGO_SMTP_PORT | Port of the SMTP server |
| LAGO_SMTP_USERNAME | Username for the SMTP server |
| LAGO_SMTP_PASSWORD | Password for the SMTP server |
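
For example, with a hypothetical provider (all values are placeholders; port 587 assumes STARTTLS):

```shell
export LAGO_FROM_EMAIL="noreply@example.com"
export LAGO_SMTP_ADDRESS="smtp.example.com"
export LAGO_SMTP_PORT="587"
export LAGO_SMTP_USERNAME="smtp-user"
export LAGO_SMTP_PASSWORD="smtp-password"
```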