Senior DataStage ETL
Location: Tempe, Maricopa County, Arizona, 85285, USA
Company: Goldenpick Technologies LLC
Position type: Full Time
Listed on: 2025-12-20
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description
Responsibilities
- Design, develop, and maintain robust ETL pipelines using MS Azure Data Factory, MS Azure Databricks, and other Azure services.
- Build and manage data integration interfaces between internal systems and external platforms such as Coupa and SAP.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and deliver scalable solutions.
- Optimize data workflows for performance, reliability, and cost-efficiency.
- Implement data quality checks, error handling, and logging mechanisms.
- Participate in code reviews, architecture discussions, unit testing, and performance tuning.
- Ensure compliance with data governance, security, and privacy standards.
- Mentor junior developers and contribute to best practices in data engineering.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- 5 years of experience in ETL/ELT development, with at least 2 years focused on Azure Data Factory.
- Strong proficiency in SQL, Postgres, and data transformation logic.
- Experience with Azure Blob Storage, Azure Functions, and Databricks.
- Hands‑on experience integrating with Coupa and SAP platforms.
- Familiarity with CI/CD pipelines and version control tools (e.g., Git, Azure DevOps).
- Experience with Unix/Linux shell scripting for automation and process orchestration.
- Expertise in data extraction, transformation, and loading from/to various data sources (e.g., relational databases, flat files, XML, JSON).
- Solid understanding of data warehousing concepts, dimensional modeling, and big data processing.
- Excellent problem‑solving, communication, and documentation skills.
- Experience with IBM DataStage development.
- Experience with Power BI, Snowflake, or Apache Spark.
- Knowledge of data lake architecture, data mesh, or data fabric concepts.
- Microsoft Azure certifications (e.g., DP-203: Data Engineering on Microsoft Azure).
- Experience with REST/SOAP APIs and middleware platforms for enterprise integration.
Position Requirements
10+ years of work experience