What is a data pipeline vs ETL?

A data pipeline is the set of tools and processes used to move data from one or more sources to a target (e.g., a database, data warehouse, data lake, cloud-based system, or application). An ETL pipeline is one type of data pipeline. "ETL" stands for Extract, Transform, and Load, and combines those three database functions into one tool that moves data from one database to another.

Extract is the process of reading data from a source database. Transform applies rules, lookup tables, or combinations with other data to convert the extracted data into the required format. Load places the transformed data into the target system or repository.
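The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not a production pipeline: the table names, columns, and the cents-to-dollars rule are all hypothetical, and in-memory SQLite stands in for the real source and target databases.

```python
import sqlite3

# Source and target stand-ins (in-memory SQLite for illustration).
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# Seed the source with sample data (hypothetical "orders" table).
source.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 1250), (2, 3499), (3, 99)])

# Extract: read rows from the source database.
rows = source.execute("SELECT id, amount_cents FROM orders").fetchall()

# Transform: apply a rule -- here, convert cents to dollars.
transformed = [(order_id, cents / 100) for order_id, cents in rows]

# Load: place the transformed rows into the target repository.
target.execute("CREATE TABLE orders_clean (id INTEGER, amount_dollars REAL)")
target.executemany("INSERT INTO orders_clean VALUES (?, ?)", transformed)
target.commit()

print(target.execute("SELECT * FROM orders_clean").fetchall())
# -> [(1, 12.5), (2, 34.99), (3, 0.99)]
```

Real ETL tools add scheduling, error handling, and connectors for many source and target systems, but the extract-transform-load shape is the same.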

Nicholas Murphy
Sales Engineer
