
Data Engineer Data Warehouse

Remote / Online - Candidates ideally in
New York, New York County, New York, 10261, USA
Listing for: DS Technologies Inc
Remote/Work from Home position
Listed on 2026-01-01
Job specializations:
  • IT/Tech
    Data Engineer, Data Warehousing
Job Description & How to Apply Below
Location: New York

About Us:

We provide innovative, transformative IT services and solutions. We are passionate about helping our clients achieve their goals and exceed their expectations, and we strive to deliver the best possible experience for both clients and employees. Committed to continuous improvement and innovation, we are always looking for ways to strengthen our services and solutions, and we believe in working collaboratively with our clients and employees to achieve success.

DS Technologies Inc is looking to fill a Data Engineer – Data Warehouse role for one of our premier clients.

Job Title:

Data Engineer – Data Warehouse

Location:

New York, NY (Hybrid – 3 days onsite at Amex Tower)

Duration:

Long-term contract, W2 only

Work Arrangement:

Hybrid: 3 days onsite at the American Express Tower, New York, NY.

Job Description:

We are seeking an experienced Data Engineer with strong expertise in ETL development, data warehousing, and large-scale data processing. The ideal candidate will have a solid understanding of data modeling, SQL, and Python-based ETL pipeline development. This role involves building and maintaining data pipelines that enable analytics, governance, and reporting across various enterprise systems.

Key Responsibilities:
  • Design, develop, and deploy ETL pipelines from scratch using Python.
  • Write efficient, reusable, and modular code adhering to OOP principles.
  • Design and optimize data models including schemas, entity relationships, and transformations.
  • Develop and analyze SQL queries across multiple RDBMS systems (SQL Server, DB2, Oracle).
  • Work with data warehouses such as BigQuery and Databricks Delta Lakehouse for ingestion, cleansing, governance, and reporting.
  • Create, manage, and monitor Airflow DAGs for scheduling and workflow automation.
  • Collaborate with cross-functional teams to ensure data consistency, accuracy, and accessibility.
  • Troubleshoot data issues and optimize ETL performance.
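As a rough illustration of the first two responsibilities (a Python ETL pipeline written with reusable, OOP-style code), the sketch below shows a minimal extract–transform–load flow. The table name, schema, and cleansing rules are hypothetical examples, not details of the client's environment; a production pipeline would read from real sources and target a warehouse rather than in-memory SQLite.

```python
import sqlite3


class CustomerEtl:
    """Minimal ETL sketch: extract raw rows, transform them, load into SQLite.

    The 'customers' table and the cleansing rules are illustrative only.
    """

    def __init__(self, conn: sqlite3.Connection):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS customers "
            "(id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
        )

    def extract(self, raw_rows):
        # In a real pipeline this would read from files, APIs, or source RDBMS.
        yield from raw_rows

    def transform(self, row):
        # Cleansing step: trim whitespace, lowercase emails, drop rows with no email.
        name = row["name"].strip()
        email = row.get("email", "").strip().lower()
        if not email:
            return None
        return (row["id"], name, email)

    def load(self, rows):
        # Idempotent load so reruns do not duplicate rows.
        with self.conn:
            self.conn.executemany(
                "INSERT OR REPLACE INTO customers (id, name, email) VALUES (?, ?, ?)",
                rows,
            )

    def run(self, raw_rows):
        cleaned = (self.transform(r) for r in self.extract(raw_rows))
        self.load(r for r in cleaned if r is not None)


conn = sqlite3.connect(":memory:")
etl = CustomerEtl(conn)
etl.run([
    {"id": 1, "name": "  Ada Lovelace ", "email": "ADA@Example.com"},
    {"id": 2, "name": "No Email", "email": ""},
])
print(conn.execute("SELECT id, name, email FROM customers").fetchall())
```

Splitting extract, transform, and load into separate methods keeps each step independently testable and reusable, which is the kind of modular design the role calls for; in practice such a job would typically be wired into an Airflow DAG for scheduling.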
Required Skills & Experience:
  • 6+ years of experience in Data Engineering, ETL Development, and Data Warehousing.
  • Strong programming skills in Python with proven experience in ETL pipeline design.
  • Expertise in SQL, RDBMS (MS SQL Server, DB2, Oracle).
  • Hands-on experience with Airflow for orchestrating workflows.
  • Solid understanding of OOP concepts and data modeling principles.
  • Experience with BigQuery, Databricks Delta Lakehouse, or similar data warehouse technologies.
  • Familiarity with Spark is a strong plus.
Nice to Have:
  • Experience with IBM Apptio platform and cost management data workflows.
  • Knowledge of Node.js for integration scripting or data API handling.

