Data Engineer - manage projects and support data management, data governance, reporting, and business intelligence
Experience
Minimum of 3 years of professional experience in data engineering, with a strong focus on designing, developing, and optimizing scalable data pipelines, ETL/ELT workflows, and data integration solutions using modern cloud technologies.
Knowledge, Skills, and Abilities
Comprehensive understanding of data pipeline and modern data stack architectures, with hands-on experience in cloud-based platforms such as AWS, Azure, or GCP, and data platforms such as Snowflake, Databricks, or Redshift.
Technologies:
Data extraction: SQL, Python, API integration, Change Data Capture (CDC)
Database systems: PostgreSQL, MySQL, SQL Server
Data storage repositories: SFTP, AWS S3
Job scheduling and orchestration: Apache Airflow, AWS Step Functions (see the orchestration sketch after this list)
ETL/ELT tools and workflows: dbt, PySpark, AWS Glue, AWS Lambda, Slowly Changing Dimensions (see the SCD sketch after this list)
CI/CD and infrastructure automation: Bitbucket, Git, Jenkins, AWS CloudFormation, Terraform, Flyway
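As an illustration of the orchestration work referenced above, the following is a minimal sketch of a daily extract-transform-load pipeline, assuming Apache Airflow 2.x; the DAG name, task names, and callables are hypothetical placeholders rather than anything specified in this posting.

```python
# Minimal sketch of a daily extract -> transform -> load DAG, assuming Airflow 2.x.
# dag_id, task names, and the callables are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull new rows from a source system (e.g. via SQL or an API call)."""
    ...


def transform():
    """Clean and reshape the extracted data (e.g. with pandas or PySpark)."""
    ...


def load():
    """Write the transformed data to the warehouse (e.g. Snowflake or Redshift)."""
    ...


with DAG(
    dag_id="daily_orders_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # Run the three stages in sequence once per day.
    extract_task >> transform_task >> load_task
```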
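For the Slowly Changing Dimensions item, a Type 2 dimension keeps history by closing out the current version of a row and inserting a new versioned row whenever a tracked attribute changes. Below is a minimal pure-Python sketch of that logic over in-memory dictionaries; the field names (customer_id, address, valid_from, valid_to, is_current) are hypothetical, and a production version would typically run as SQL, dbt, or PySpark against the warehouse instead.

```python
# Sketch of SCD Type 2 handling over in-memory rows; all field names are hypothetical.
from datetime import date
from typing import Dict, List


def apply_scd2(dimension: List[Dict], incoming: Dict, business_key: str,
               tracked_cols: List[str], as_of: date) -> None:
    """Apply one incoming record to a Type 2 slowly changing dimension in place.

    Each dimension row carries 'valid_from', 'valid_to', and 'is_current' fields.
    """
    current = next(
        (row for row in dimension
         if row[business_key] == incoming[business_key] and row["is_current"]),
        None,
    )

    # No change in any tracked attribute: keep the current version as-is.
    if current and all(current[c] == incoming[c] for c in tracked_cols):
        return

    # Close out the existing version, if there is one.
    if current:
        current["valid_to"] = as_of
        current["is_current"] = False

    # Insert the incoming record as the new current version.
    new_row = dict(incoming)
    new_row.update({"valid_from": as_of, "valid_to": None, "is_current": True})
    dimension.append(new_row)


# Example: a customer changes address, producing a second versioned row.
customers = [{"customer_id": 1, "address": "12 Oak St",
              "valid_from": date(2023, 1, 1), "valid_to": None, "is_current": True}]
apply_scd2(customers, {"customer_id": 1, "address": "99 Elm Ave"},
           "customer_id", ["address"], date(2024, 6, 1))
```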