Senior Data Engineer
Listed on 2025-12-19
IT/Tech
Data Engineer, Data Analyst
Ampcus Inc. is a certified global provider of a broad range of technology and business consulting services. We are in search of a highly motivated candidate to join our talented team.
Job Title: Senior Data Engineer
Location(s): White Plains, NY
We are seeking a Senior Data Engineer to join our Data Services team. The ideal candidate will be an individual contributor responsible for developing ETL workflows and API integrations, supporting business-as-usual (BAU) processes, and collaborating with cross-functional teams to ensure smooth data integration, quality, and governance. This role requires expertise in Azure Databricks, Azure Data Factory (ADF), Pentaho, Globalscape FTP, API development, SQL stored procedures, and complex queries.
ETL & Data Integration:
- Design, develop, and optimize ETL pipelines using Azure Databricks, ADF, and Pentaho to support enterprise data workflows.
- Implement and maintain data movement, transformation, and integration across multiple systems.
- Ensure seamless data exchange between cloud, on-prem, and hybrid environments.
- Work with Globalscape FTP for secure file transfers and automation.
API Development and Integration:
- Develop, consume, and integrate RESTful and SOAP APIs to facilitate data exchange.
- Work with API gateways and authentication methods such as OAuth, JWT, certificates, and API keys.
- Implement and optimize API-based data extractions and real-time data integrations.
Data Quality & Governance:
- Implement data validation, cleansing, and enrichment techniques.
- Develop and execute data reconciliation processes to ensure accuracy and completeness.
- Adhere to data governance policies and security compliance standards.
BAU Support & Performance Optimization:
- Troubleshoot and resolve ETL failures, data load issues, and performance bottlenecks.
- Optimize SQL stored procedures and complex queries for better performance.
- Support ongoing enhancements and provide operational support for existing data pipelines.
Collaboration & Documentation:
- Work closely with Data Analysts, Business Analysts, and stakeholders to understand data needs.
- Document ETL processes, data mappings, and workflows for maintainability and knowledge sharing.
- Provide guidance and best practices to ensure scalability and efficiency of data solutions.
Required Skills & Experience:
- 7+ years of experience in ETL development, data integration, and SQL scripting.
- Strong expertise in Azure Databricks, Azure Data Factory (ADF), and Pentaho.
- Experience handling secure file transfers using Globalscape FTP.
- Hands-on experience in developing and consuming APIs (REST/SOAP).
- Experience working with API security protocols (OAuth, JWT, API keys, etc.).
- Proficiency in SQL, stored procedures, performance tuning, and query optimization.
- Understanding of data modeling, data warehousing, and data governance best practices.
- Hands-on experience with cloud-based data platforms (Azure/AWS) is a plus.
- Strong problem-solving and troubleshooting skills, with the ability to work independently.
- Excellent communication skills and ability to work in a fast-paced environment.