
Senior DataStage ETL

Job in Tempe, Maricopa County, Arizona, 85285, USA
Listing for: Goldenpick Technologies
Full Time position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing
Job Description & How to Apply Below

Responsibilities

  • Design, develop and maintain robust ETL pipelines using MS Azure Data Factory, MS Azure Databricks and other Azure services.
  • Build and manage data integration interfaces between internal systems and external platforms such as Coupa and SAP.
  • Collaborate with data architects, analysts and business stakeholders to understand data requirements and deliver scalable solutions.
  • Optimize data workflows for performance, reliability and cost‑efficiency.
  • Implement data quality checks, error handling and logging mechanisms.
  • Participate in code reviews, architecture discussions, unit testing and performance tuning.
  • Ensure compliance with data governance, security and privacy standards.
  • Mentor junior developers and contribute to best practices in data engineering.
Mandatory Skills Description
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 5 years of experience in ETL/ELT development with at least 2 years focused on Azure Data Factory.
  • Strong proficiency in SQL, PostgreSQL and data transformation logic.
  • Experience with Azure Blob Storage, Azure Functions and Databricks.
  • Hands-on experience integrating with Coupa and SAP platforms.
  • Familiarity with CI/CD pipelines and version control tools (e.g. Git, Azure DevOps).
  • Experience with Unix/Linux shell scripting for automation and process orchestration.
  • Expertise in extracting, transforming and loading data to and from various sources (relational databases, flat files, XML, JSON, etc.).
  • Solid understanding of data warehousing concepts, dimensional modeling and big data processing.
  • Excellent problem‑solving, communication and documentation skills.
  • Experience with IBM DataStage development.
  • Experience with Power BI, Snowflake or Apache Spark.
  • Knowledge of data lake architecture, data mesh or data fabric concepts.
  • Microsoft Azure certifications (e.g. DP-203: Data Engineering on Microsoft Azure).
  • Experience with REST/SOAP APIs and middleware platforms for enterprise integration.
Key Skills

SQL, Pentaho, PL/SQL, Microsoft SQL Server, SSIS, Informatica, Shell Scripting, T‑SQL, Teradata, Data Modeling, Data Warehouse, Oracle

Job Details

Employment Type:

Full Time

Experience:

years

Vacancy: 1

Position Requirements
10+ years of work experience