About the Quantemplate API
Overview and use cases
What can I do with the Quantemplate API?
The Quantemplate API makes it even easier to flow data through Quantemplate by automating:
- Data egress
- Reference data ingress
- Pipeline runs
- Event notifications
📤 Data Egress
Automatically check for updates to datasets and pipeline outputs in Quantemplate, then extract the data.
Use cases
- Automatically export to a data warehouse. When Quantemplate informs you that a dataset update is available, extract the data and bring it into your data warehouse.
- Connect to a third party analytics tool, such as Power BI
- Export to a shared drive
Examples and documentation
- Azure Integration video walkthrough and code example
- Download a Dataset as CSV example
- Download Dataset API documentation
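For illustration, here is a minimal sketch of pulling a dataset as CSV over HTTP in Python. The host comes from the endpoint notes at the end of this page; the path shape, identifiers and `Accept` header are assumptions, so check them against the Download Dataset API documentation before use.

```python
import requests

# Host taken from the endpoint notes below; path and IDs are hypothetical placeholders.
API_BASE = "https://fabric.prod.quantemplate.com/external"
ORG_ID = "your-org-id"          # hypothetical identifier
DATASET_ID = "your-dataset-id"  # hypothetical identifier

def download_dataset_csv(token: str, out_path: str = "dataset.csv") -> None:
    """Fetch a dataset as CSV using a bearer token and write it to disk."""
    url = f"{API_BASE}/v1/organisations/{ORG_ID}/datasets/{DATASET_ID}"  # assumed route shape
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {token}", "Accept": "text/csv"},
    )
    resp.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(resp.content)
```

The pattern is the point here: authenticate with a bearer token (see the authentication endpoint below), request the dataset, and stream the CSV into whatever downstream store you use.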
📨 Data Ingress
Automatically update a dataset in your Quantemplate Data Repo.
Use cases
- Ensure a third party reference dataset, such as Capital IQ or Dun & Bradstreet company names, is always up to date. Use their API to watch for the latest version of the company name data, then use the Quantemplate API to bring the updated dataset into your Data Repo.
- Retrieve results from a third party address cleanser
- Connect to a lookup table hosted on a shared drive
Examples and documentation
- Upload New Dataset Data to Data Repo code example
- Upload Dataset API documentation
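As a rough sketch, uploading data to an existing Data Repo dataset follows the same bearer-token pattern, posting a CSV body to the ingress host. The exact route, identifiers and content type are assumptions; the Upload Dataset API documentation is authoritative.

```python
import requests

API_BASE = "https://fabric.prod.quantemplate.com/external"
ORG_ID = "your-org-id"          # hypothetical identifier
DATASET_ID = "your-dataset-id"  # hypothetical identifier

def upload_dataset_csv(token: str, csv_path: str) -> None:
    """Push a local CSV file into an existing dataset in the Data Repo."""
    url = f"{API_BASE}/v1/organisations/{ORG_ID}/datasets/{DATASET_ID}"  # assumed route shape
    with open(csv_path, "rb") as f:
        resp = requests.post(
            url,
            headers={"Authorization": f"Bearer {token}", "Content-Type": "text/csv"},
            data=f,  # stream the file body
        )
    resp.raise_for_status()
```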
▶️ Auto Run Pipeline
Automatically run a pre-configured data processing pipeline when a dataset is updated.
Use cases
- Straight-through processing. When a dataset in the Data Repo is updated, run a pipeline to process that data, then use the Egress API to automatically download the results or extract them to a downstream system.
Examples and documentation
- End-to-end Pipeline Processing code example
- [Execute Pipeline](https://quantemplate.readme.io/reference#execute-pipeline) API documentation
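Triggering a run is a single authenticated request. The sketch below assumes a POST against the main API host with hypothetical organisation and pipeline identifiers and an assumed route shape; see the Execute Pipeline reference for the real parameters.

```python
import requests

API_BASE = "https://api.prod.quantemplate.com"  # final-release host; see endpoint notes below
ORG_ID = "your-org-id"            # hypothetical identifier
PIPELINE_ID = "your-pipeline-id"  # hypothetical identifier

def execute_pipeline(token: str) -> None:
    """Trigger a run of a pre-configured pipeline."""
    url = f"{API_BASE}/v1/organisations/{ORG_ID}/pipelines/{PIPELINE_ID}/executions"  # assumed route shape
    resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
```

Combined with a dataset upload and a download of the outputs, this is the backbone of the straight-through processing use case above.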
🔔 Notifications
Published when a pipeline run completes, or when data is uploaded to a dataset or exported to a dataset from a pipeline.
Use cases
- Combine with the Download API to orchestrate extraction of data to a downstream system
- Use a service such as Zapier to publish notifications to Slack or Teams about pipeline runs and dataset updates, or send updates via email
Examples and documentation
- Receive Pipeline Notification code example
- Event notifications API documentation
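If notifications are delivered to your own service as HTTP POSTs with a JSON body (the Receive Pipeline Notification example and the event notifications documentation describe the actual mechanism and schema), a receiver can be as small as the Flask sketch below. The route name and payload handling are placeholders, not the documented format.

```python
from flask import Flask, request

app = Flask(__name__)

# Minimal notification receiver; the payload fields are placeholders,
# not the documented event schema.
@app.route("/qt-notifications", methods=["POST"])
def handle_notification():
    event = request.get_json(force=True)
    # e.g. inspect the event and kick off a download of the pipeline output
    print("Received Quantemplate event:", event)
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```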
API endpoints compatibility
In the final release, all API functionality will be served from the https://api.prod.quantemplate.com endpoint.
Although the API reference already uses this endpoint, some existing functionality is currently available on different routes:
- https://fabric.prod.quantemplate.com/external - data ingress & egress
- https://accounts.prod.quantemplate.com/auth/realms/qt/protocol/openid-connect/token - authentication
For more information, refer to the examples section.
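Since the authentication route above is a standard OpenID Connect token endpoint, obtaining a bearer token for the sketches on this page can look like the following. The client credentials grant and the credential names are assumptions; use whatever grant and credentials the authentication documentation specifies for your account.

```python
import requests

TOKEN_URL = "https://accounts.prod.quantemplate.com/auth/realms/qt/protocol/openid-connect/token"

# Hypothetical credentials; obtain real client credentials from Quantemplate.
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"

def get_access_token() -> str:
    """Request an OAuth2 access token (client credentials grant assumed)."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```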