Airbyte is an open-source data integration platform used for ETL pipelines. With this integration, you can connect Lago billing data to any data warehouse.
You can push Lago billing data to destinations such as Snowflake, BigQuery, Redshift, S3 buckets, or Azure. The full list of data destinations enabled by Airbyte is available in their destinations documentation.
With Airbyte’s native integration of Lago, you can push the following billing data to warehouses:
At present, this connector only supports full refresh syncs, meaning that each time you use it, it syncs all available records from scratch. Use it cautiously if your Lago instance holds a large number of records.
Find the full documentation of Airbyte’s native Lago integration.
First, retrieve your Lago private API key. Then, in Airbyte:
Lago data source in Airbyte
You can select any of the data destinations available in Airbyte. It can be a warehouse (BigQuery, Redshift, Snowflake…) or a file storage tool (S3, for instance). The full list of available destinations is in Airbyte's documentation.
Destination in Airbyte
In the following example, we connect Lago billing data to a Snowflake data warehouse; you can select another destination if needed.
This action will populate Lago billing data into a warehouse (Snowflake in our example).
Lago data in Snowflake
Once the data has been loaded into your destination (a Snowflake warehouse in our example), you can easily query your billing data. Here is a query calculating your monthly revenue with Lago:
Query in Snowflake
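As an illustration, a monthly revenue query might look like the sketch below. It assumes Airbyte synced Lago's invoices stream into a table named `INVOICES` with `ISSUING_DATE`, `CURRENCY`, `STATUS`, and `TOTAL_AMOUNT_CENTS` columns — the actual table and column names depend on your connector version and sync settings, so adjust accordingly.

```sql
-- Hypothetical sketch: table and column names depend on your Airbyte sync.
SELECT
    DATE_TRUNC('month', ISSUING_DATE) AS billing_month,
    CURRENCY,
    SUM(TOTAL_AMOUNT_CENTS) / 100.0   AS monthly_revenue
FROM INVOICES
WHERE STATUS = 'finalized'            -- exclude draft invoices
GROUP BY 1, 2
ORDER BY 1;
```

Amounts are stored in cents in Lago, hence the division by 100; grouping by currency avoids summing amounts across different currencies.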