We are unlocking the future of analytics work with our open platform, which provides data quality, data observability, and a semantic layer.
In the final chapter of our Paradime v3.0 release, we are announcing the availability of data quality, data observability, and a Looker-like semantic layer through re_data, Elementary, and Lightdash within the Paradime platform. We are combining the power of open source with the flexibility of the cloud so analytics work can happen uninterrupted.
We believe tooling and technology should be decoupled. Customers should have the flexibility to choose the technology they want or need without getting locked in. This flexibility is central to the popularity and success of the modern data stack. And as an operating system for analytics, it's not our place to decide which vendor or technology the end customer ultimately uses. We need to remain unopinionated. That's our fundamental responsibility.
To use dbt™*, you should not be locked into dbt Cloud™*, just like you don't need a "Python Cloud" to use Python 😆
We are ultimately providing our end users with everything they need to get their daily work done. We want to change how organizations use data to develop products, run daily operations and plan for the future without getting slowed down.
To help them in that mission, we need to unblock our end users and their pipelines.
So today we are announcing the availability of the following packages within Paradime:
These are some of the fantastic OSS technologies with Cloud versions that will be readily available to our end users. With these additions, Paradime users will get data quality, data observability, and a semantic layer out of the box, with zero infrastructure or tooling to maintain.
Some of our customers are already using re_data, Lightdash, and Elementary and it's unbelievable to see how excited they are about these capabilities coming to the Paradime platform.
Availability of OSS applications will unlock novel use cases and access patterns that we haven't even thought of yet, and we believe it will transform the way we work.
All the above libraries and their CLIs will be available in our cloud-based terminal in the Code IDE. End users will be able to run CLI commands without any additional effort. Where a CLI needs a login / password (e.g. Lightdash), you will still need to sign up on the respective platform and obtain those credentials.
Our users now have superpowers. They can schedule Bolt jobs and run non-dbt commands as part of their production runs. This functionality was previously only possible with Airflow and other orchestrators. Now you can run them natively in Bolt by specifying the commands in the order you want them to run.
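As an illustrative sketch, a Bolt schedule mixing dbt and non-dbt commands might look like the following. Note this is a hypothetical example: the file name, key names, and exact schema are assumptions, so check the Paradime documentation for the actual format; the `edr` and `lightdash` commands assume you have configured Elementary and Lightdash respectively.

```yaml
# Hypothetical Bolt schedule definition -- field names are illustrative,
# not the official schema.
schedules:
  - name: nightly_production_run
    schedule: "0 2 * * *"    # cron expression: every day at 02:00
    commands:                # executed in the order listed
      - dbt run
      - dbt test
      - edr report           # Elementary data observability report
      - lightdash deploy     # refresh Lightdash with the latest dbt project
```

The point is that any CLI command, not just dbt, can be slotted into the ordered `commands` list, which is what previously required a full orchestrator like Airflow.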
Once runs are complete, all your artifacts will be available in our / your S3 bucket for downstream consumption, wherever you need them. That means no data loss, no walled garden, and everything you need at your fingertips.
In the last four days, we have launched:
We are extremely thankful to the re_data, Elementary, and Lightdash teams for making it their mission to build such high-quality applications that add a non-linear value jump to our end users' workflows.
The future of analytics work is open and our mission is to provide that open platform where analytics work can happen uninterrupted.
It's time to connect the dots 😉.