What are Data Pipelines?
- Set of processes to move, transform, or analyze data
- Typical steps (a minimal code sketch follows this list):
  - ETL: Extract data from various data sources, then Transform it into a meaningful schema, and finally Load it into a target data sink (e.g., a data warehouse)
  - ELT: Extract data from various data sources, then Load it into a target data sink (e.g., a data lake), and finally Transform it into a meaningful schema when needed
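To make the two orderings concrete, here is a minimal sketch in Python. It assumes an in-memory SQLite database as a stand-in for the warehouse/lake and a hard-coded list of dicts as the source; the function and table names (`extract`, `transform`, `load`, `orders_warehouse`, `orders_lake`) are illustrative assumptions, not any specific framework's API.

```python
import sqlite3

def extract():
    # Assumption: a hard-coded list of dicts stands in for reading an API or file.
    return [
        {"order_id": "1", "amount": "19.99", "country": "de"},
        {"order_id": "2", "amount": "5.50", "country": "us"},
    ]

def transform(rows):
    # Cast types and normalize values into a meaningful schema.
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "country": r["country"].upper()}
        for r in rows
    ]

def load(conn, table, rows):
    # Write rows into a target sink (here a SQLite table standing in for a
    # warehouse or lake).
    cols = list(rows[0].keys())
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} ({', '.join(cols)})")
    conn.executemany(
        f"INSERT INTO {table} ({', '.join(cols)}) "
        f"VALUES ({', '.join('?' for _ in cols)})",
        [tuple(r.values()) for r in rows],
    )

conn = sqlite3.connect(":memory:")

# ETL: transform first, then load the cleaned rows into the "warehouse" table.
load(conn, "orders_warehouse", transform(extract()))

# ELT: load the raw rows into the "lake" table first, transform later when needed.
load(conn, "orders_lake", extract())
raw = conn.execute("SELECT order_id, amount, country FROM orders_lake")
transformed_later = transform(
    [dict(zip(["order_id", "amount", "country"], row)) for row in raw]
)
print(transformed_later)
```

The only difference between the two branches is where `transform` runs: before the load (ETL, so the sink only ever holds clean data) or after it (ELT, so the sink keeps the raw data and the schema is applied on demand).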