Data Factory

Hybrid data integration service that simplifies ETL at scale


Pricing Details

Azure Data Factory: Data Pipeline Pricing

Pricing for Data Pipeline is calculated based on:

  • Pipeline orchestration and execution
  • Data flow execution and debugging
  • Number of Data Factory operations, such as pipeline creation and pipeline monitoring

Data Factory Pipeline Orchestration and Execution

Pipelines are control flows of discrete steps referred to as activities. You pay for data pipeline orchestration by activity run and activity execution by integration runtime hours. The integration runtime, which is serverless in Azure and self-hosted in hybrid scenarios, provides the compute resources used to execute the activities in a pipeline.

For example, the Azure Data Factory copy activity can move data across various data stores in a secure, reliable, performant, and scalable way. As data volume or throughput needs grow, the integration runtime can scale out to meet those needs.

*The following prices are tax-inclusive.
TYPE | PRICE | DESCRIPTION
Orchestration: Azure integration runtime | ¥10.176 per 1,000 runs | Activity, trigger, and debug runs
Orchestration: Self-hosted integration runtime | ¥15.264 per 1,000 runs | Activity, trigger, and debug runs
Execution: Azure integration runtime | Data movement activities: ¥2.544/DIU-hour *; Pipeline activities: ¥0.0509/hour **; External activities: ¥0.00254/hour | Cost to execute an Azure Data Factory activity on the Azure integration runtime
Execution: Self-hosted integration runtime | Data movement activities: ¥1.018/hour *; Pipeline activities: ¥0.0204/hour **; External activities: ¥0.00102/hour | Cost to execute an Azure Data Factory activity on a self-hosted integration runtime
* Use of the copy activity to egress data out of an Azure datacenter will incur additional network bandwidth charges, which will show up as a separate outbound data transfer line item on your bill. Learn more about outbound data transfer pricing.

** Pipeline activities execute on the integration runtime. Pipeline activities include lookup, get metadata, and schema operations during authoring (test connection, browse folder list and table list, preview data). External pipeline activities are managed on the integration runtime but execute on linked services. External activities include Databricks, stored procedure, HDInsight activities, and others.
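
As a rough illustration of how these meters combine, the sketch below estimates one month of pipeline orchestration and execution on the Azure integration runtime using the tax-inclusive rates above. The workload figures (activity runs, DIU-hours, activity hours) are hypothetical placeholders; a real bill would also include Data Factory operations, data flow, and any outbound data transfer charges.

```python
# Rough monthly estimate for pipeline orchestration and execution on the
# Azure integration runtime, using the tax-inclusive rates listed above.
ORCHESTRATION_PER_1000_RUNS = 10.176  # ¥ per 1,000 activity, trigger, and debug runs
DATA_MOVEMENT_PER_DIU_HOUR = 2.544    # ¥ per DIU-hour (e.g., copy activity)
PIPELINE_ACTIVITY_PER_HOUR = 0.0509   # ¥ per hour (lookup, get metadata, ...)
EXTERNAL_ACTIVITY_PER_HOUR = 0.00254  # ¥ per hour (Databricks, stored procedure, ...)

# Hypothetical workload for one month.
activity_runs = 20_000
data_movement_diu_hours = 50
pipeline_activity_hours = 10
external_activity_hours = 100

orchestration = activity_runs / 1_000 * ORCHESTRATION_PER_1000_RUNS
execution = (data_movement_diu_hours * DATA_MOVEMENT_PER_DIU_HOUR
             + pipeline_activity_hours * PIPELINE_ACTIVITY_PER_HOUR
             + external_activity_hours * EXTERNAL_ACTIVITY_PER_HOUR)

print(f"Orchestration: ¥{orchestration:.2f}")
print(f"Execution:     ¥{execution:.2f}")
print(f"Total:         ¥{orchestration + execution:.2f}")
```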

Data Factory Operations

*The following prices are tax-inclusive.
TYPE | PRICE | EXAMPLES
Read/Write * | ¥5.088 per 50,000 modified/referenced entities | Read/write of entities in Azure Data Factory *
Monitoring | ¥2.544 per 50,000 run records retrieved | Monitoring of pipeline, activity, trigger, and debug runs **
* Read/write operations for Azure Data Factory entities include create, read, update, and delete. Entities include datasets, linked services, pipelines, integration runtime, and triggers.

** Monitoring operations include get and list for pipeline, activity, trigger, and debug runs.
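
As a hypothetical illustration, a month in which 150,000 entities are created, read, updated, or deleted and 200,000 run records are retrieved would be charged roughly as sketched below. The volumes are invented and chosen as exact multiples of 50,000 so that rounding does not affect the result.

```python
READ_WRITE_PER_50K = 5.088   # ¥ per 50,000 modified/referenced entities (tax-inclusive)
MONITORING_PER_50K = 2.544   # ¥ per 50,000 run records retrieved (tax-inclusive)

entities_modified_or_referenced = 150_000  # hypothetical read/write volume
run_records_retrieved = 200_000            # hypothetical monitoring volume

read_write_cost = entities_modified_or_referenced / 50_000 * READ_WRITE_PER_50K
monitoring_cost = run_records_retrieved / 50_000 * MONITORING_PER_50K

print(f"Read/Write: ¥{read_write_cost:.3f}")   # 3 x ¥5.088 = ¥15.264
print(f"Monitoring: ¥{monitoring_cost:.3f}")   # 4 x ¥2.544 = ¥10.176
```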

Inactive pipelines

A pipeline is considered inactive if it has no associated trigger and no runs within the month. An inactive pipeline is charged at ¥8.141 per month.

FAQ

  • What are Azure Data Factory read/write operations?

    Read/write operations include create, read, update, and delete Azure Data Factory entities. Entities include datasets, linked services, pipelines, integration runtime, and triggers.

  • What are Azure Data Factory monitoring operations?

    Monitoring operations include get and list for pipeline, activity, trigger, and debug runs.

  • What is an activity? What is a run?

    An activity is a step within a pipeline. The execution of each activity is called a run.

  • What is an integration runtime?

    An integration runtime is the compute infrastructure used by Azure Data Factory to provide the following data integration capabilities across different network environments:

    • Data movement: Transfer of data between data stores in public and private (on-premises or virtual private) networks, with support for built-in connectors, format conversion, column mapping, and performant and scalable data transfer.
    • Activity dispatch: Dispatching and monitoring of transformation activities running on a variety of compute services, such as Azure HDInsight, Azure Machine Learning, Azure SQL Database, SQL Server, and others.
    • SQL Server Integration Services package execution: Native execution of SQL Server Integration Service packages in a managed Azure compute environment.
  • What is a trigger, and what is a trigger run?

    A trigger is a unit of processing that determines when a pipeline execution needs to be initiated. A trigger run is the execution of a trigger, which may produce an activity run if the conditions are satisfied.

  • What is a debug run?

    A debug run is a test run that a user can perform during iterative development to ensure the steps in the pipeline are working as intended before changes are published to the data factory.

  • What is an inactive pipeline, and when is it charged?

    An inactive pipeline is one that’s not associated with a trigger and that has zero runs within a month. A charge is incurred after one month of zero runs.

  • How am I charged for pipeline execution?

    Pipeline execution activities (data movement, pipeline, and external activities, on either the Azure integration runtime or a self-hosted integration runtime) are billed at the hourly rates shown above. Pipeline execution charges are prorated by the minute and rounded up.

    For example: If you run an operation that takes 2 minutes and 20 seconds, you will be billed for 3 minutes.
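
    A minimal sketch of that proration rule, applied to the 2-minute-20-second example above with the Azure integration runtime pipeline-activity rate from the table:

    ```python
    import math

    def billed_cost(duration_seconds: float, hourly_rate: float) -> float:
        """Prorate by the minute, rounding the duration up to whole minutes."""
        billed_minutes = math.ceil(duration_seconds / 60)  # 2 min 20 s -> 3 minutes
        return billed_minutes / 60 * hourly_rate

    # 2 minutes 20 seconds at ¥0.0509/hour is billed as 3 minutes.
    print(billed_cost(2 * 60 + 20, 0.0509))  # ¥0.0509 * 3/60 ≈ ¥0.0025
    ```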

  • Where can I find pricing examples?

    Find scenario-based pricing examples on the Azure Data Factory Documentation page.

Support & SLA

If you have any questions or need help, please visit Azure Support and choose self-service help or another way to contact us for support.

To learn more about the details of our Service Level Agreement, please visit the Service Level Agreements page.