Product features

Powers data teams

Try Pro

[Code editor widget: retriever.py]

    from dense_retriever import DenseRetriever


    class CustomRetriever(DenseRetriever):
        def retrieve(self, query, documents):
            # Custom retrieval logic
            custom_results = super().retrieve(query, documents)
            return custom_results


Streamlined Developer Experience

Maximize team productivity with an intuitive, developer-friendly interface.

Interactive code editor with visual feedback for immediately previewing execution results

Mix and match dbt models and custom Python, SQL, or R code blocks within the same pipeline

Data integrations with sources and destinations from 100+ third-party services
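A pipeline that mixes dbt models with custom code blocks treats each block as a function receiving the previous block's output. The sketch below is illustrative only: the `transformer` decorator is a local stand-in defined here, not Mage's actual import, and the data shape is invented for the example.

```python
def transformer(fn):
    # Local stand-in for a pipeline block decorator (assumption, not
    # Mage's real API): it simply marks the function as a block.
    fn.is_block = True
    return fn

@transformer
def clean_orders(rows):
    # Drop rows missing an order id and normalize amounts to floats,
    # the kind of Python step that can sit next to dbt or SQL blocks.
    return [
        {**r, "amount": float(r["amount"])}
        for r in rows
        if r.get("order_id")
    ]

raw = [
    {"order_id": "A1", "amount": "19.99"},
    {"order_id": None, "amount": "5.00"},  # filtered out
]
print(clean_orders(raw))  # [{'order_id': 'A1', 'amount': 19.99}]
```

Downstream blocks (a dbt model, an exporter) would then consume `clean_orders`'s output as their input.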

[Workspace overview widget: 40 templates, 8 workflows, 100 files, 6 pipelines, 40 teams, import]

Efficient Development Workflow

Improve code quality with standard workflows & reusable components.

Use 100+ pre-written code blocks or create customizable templates to reduce development time and bugs

Built-in testing and data validation framework for ensuring data quality in production pipelines

Automatically deploy new code changes and data pipelines to different environments

[Trigger list widget: Daily (Schedule, Running), Weekly (Schedule, Fail), On-demand (API, 60ms), Catch-up (Backfill, 10min)]

Powerful Data Orchestration

Manage and run complex data operations across multiple projects, regions, and timelines.

Trigger pipelines to run on a schedule, in response to an event, from an API request, or to finish at a specific time

Share and reuse a single trigger across different pipelines

Backfill data with dynamically generated configurations and variables at runtime using custom code blocks
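One of the trigger options above is an API request carrying runtime variables. A minimal sketch of constructing such a request follows; the host, schedule id, token, and body keys are placeholders and assumptions, so check the Mage API documentation for the real endpoint shape.

```python
import json
from urllib.request import Request

# Hypothetical per-trigger endpoint (placeholder values, not a real URL).
TRIGGER_URL = (
    "https://example-workspace.mage.ai"
    "/api/pipeline_schedules/1/pipeline_runs/TOKEN"
)

def build_trigger_request(runtime_vars):
    # Assumed payload shape: runtime variables nested under the run object.
    body = json.dumps({"pipeline_run": {"variables": runtime_vars}}).encode()
    return Request(
        TRIGGER_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_trigger_request({"execution_date": "2024-01-01"})
print(req.method, req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen`) would start a pipeline run with those variables; the same pattern covers event-driven triggers fired by upstream systems.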

[Error alert widget: Details / Error / Stacktrace, today at 12:00:00.123: "Exception thrown when attempting to run .__execute_with_retry at 0x7f3fcdb63010>, attempt 1 of 1", with Investigate and Dismiss actions]

Reliable Monitoring & Insights

Increase uptime & reliability with comprehensive observability tools.

Custom events, metrics, and alert notification rules

Manage cross-pipeline dependencies and execution flow across every pipeline within a project

Data catalog, metadata management, and data lineage

[Role matrix widget: permissions per team]

                Engineering   Design    Marketing
    Builder     EDITOR        VIEWER    -
    Comments    EDITOR        EDITOR    -
    Triggers    ADMIN         -         VIEWER
    Monitoring  EDITOR        VIEWER    -
    Dashboard   VIEWER        VIEWER    VIEWER
    Alerts      EDITOR        -         -

Enterprise-Grade Security

Protect sensitive data and safeguard secrets with fine-grained controls.

Granular data retention policies and a built-in secret manager with third-party integrations

User audit trail, developer account management, role-based access control, and single sign-on (SSO)

VPN, SSL certificate database authentication, dedicated static IPs, and regional deployment for data processing operations
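The role matrix shown earlier (resources per team) can be modeled as a simple lookup with ranked roles. This is an illustrative sketch of role-based access checks, not Mage's actual implementation; the teams, resources, and role names come from the matrix, while the ranking and enforcement logic are assumptions.

```python
# Resource -> team -> role, transcribed from the role matrix above.
ACCESS = {
    "Builder":    {"Engineering": "EDITOR", "Design": "VIEWER", "Marketing": None},
    "Comments":   {"Engineering": "EDITOR", "Design": "EDITOR", "Marketing": None},
    "Triggers":   {"Engineering": "ADMIN",  "Design": None,     "Marketing": "VIEWER"},
    "Monitoring": {"Engineering": "EDITOR", "Design": "VIEWER", "Marketing": None},
    "Dashboard":  {"Engineering": "VIEWER", "Design": "VIEWER", "Marketing": "VIEWER"},
    "Alerts":     {"Engineering": "EDITOR", "Design": None,     "Marketing": None},
}

# Assumed role hierarchy: ADMIN can do everything EDITOR can, and so on.
RANK = {"VIEWER": 1, "EDITOR": 2, "ADMIN": 3}
NEEDED = {"view": 1, "edit": 2, "admin": 3}

def can(team, resource, action):
    # Look up the team's role on the resource; "-" in the matrix maps to None.
    role = ACCESS.get(resource, {}).get(team)
    if role is None:
        return False
    return RANK[role] >= NEEDED[action]

print(can("Design", "Dashboard", "view"))   # True  (Design is VIEWER on Dashboard)
print(can("Marketing", "Alerts", "view"))   # False (Marketing has no role on Alerts)
```

The ranked-role design means granting EDITOR implicitly grants viewing, which keeps the matrix sparse.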

High-Performance Architecture

Run highly concurrent pipelines with autoscaling for maximum performance and reduced costs.

Run Spark pipelines, monitor execution metrics, and manage compute resources all from within a specialized user interface

Execute 100,000+ dynamically created block runs concurrently and process 1,000+ gigabytes (GB) of data without running out of memory

Automatically and intelligently scale data pipelines, both vertically and horizontally, using predictive analytics and machine learning

Adaptable & Extensible Design

Adapt to evolving complexity with customizable frameworks and extensive Mage APIs.

Run pipelines and configure runtime variables using no-code user interface elements, such as dropdown menus and autocomplete inputs

Deploy high-performance, low-latency API endpoints for executing blocks and returning output data, such as inference endpoints

High-throughput API endpoints for integrating Mage Pro with any third-party or in-house services

For teams. Fully managed platform for integrating and transforming data.

Try Pro