Freelance Data Engineer (AWS/PySpark)
Location: Union, Union County, New Jersey, 07083, USA
Company: Darwin Partners
Contract position, listed on 2025-12-24
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description
We are looking for a Data Engineer to join the data platform team of a major global commodity trading company.
You will work in the same team as the Lead Data Engineer already in place and focus mainly on enhancements to the data platform.
The environment is strongly cloud-oriented (AWS), with Python and PySpark at the core of all data pipelines and processing. You will help keep mission‑critical trading and risk data reliable, performant, and available to users across the business.
Key Responsibilities:
- Monitor and operate existing data pipelines and jobs (Python / PySpark / AWS).
- Perform bug fixes, small evolutions, and refactoring on existing codebases.
- Work closely with the Lead Data Engineer and business stakeholders to understand issues and prioritize fixes.
- Maintain documentation for pipelines, workflows, and operational procedures.
- Contribute to the development of new data ingestion and transformation pipelines in Python / PySpark on AWS.
- Implement new features and small data products on top of the existing data platform.
- Participate in code reviews and help improve engineering best practices (testing, CI/CD, observability).
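As a rough illustration of the ingest-and-transform work described above, here is a minimal plain-Python sketch of an extract/transform step. This is a toy stand-in, not the company's actual code: the real pipelines run on PySpark and AWS (e.g. reading from S3 with spark.read), and all data, field names, and helpers below are hypothetical.

```python
import csv
import io
from collections import defaultdict

# Hypothetical raw extract, standing in for trade files landed in S3.
RAW_CSV = """trade_id,commodity,notional_usd
T1,copper,1000000
T2,crude_oil,2500000
T3,copper,not_a_number
T4,crude_oil,500000
"""

def extract(raw: str) -> list[dict]:
    """Parse CSV rows (in PySpark this step would be spark.read.csv)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> dict[str, float]:
    """Drop rows with unparseable notionals, then total notional per commodity."""
    totals: dict[str, float] = defaultdict(float)
    for row in rows:
        try:
            totals[row["commodity"]] += float(row["notional_usd"])
        except ValueError:
            # A production pipeline would route bad records to a quarantine
            # location and raise a monitoring alert rather than drop silently.
            continue
    return dict(totals)

if __name__ == "__main__":
    print(transform(extract(RAW_CSV)))
```

In the PySpark equivalent, the validation and aggregation would typically be expressed as DataFrame filters and a groupBy, with the job scheduled and monitored by an orchestrator.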
Requirements:
- Several years of experience as a Data Engineer (or a similar role).
- Strong hands‑on skills in Python for data processing (ETL/ELT, APIs, automation).
- Solid experience with PySpark in a production context (batch and/or streaming).
- Professional experience on AWS (data‑oriented services such as S3, Glue, EMR, Lambda, ECS, Redshift, etc.).
- Good knowledge of SQL and relational databases.
- Experience operating data pipelines in production: monitoring, alerting, troubleshooting, optimisation.
- Comfortable working in remote, international teams, communicating in English (written and spoken).
- Experience in trading, financial markets, or commodities environments.
- Knowledge of data orchestration tools (e.g. Airflow, Step Functions) and CI/CD practices.
- Familiarity with microservices / APIs (FastAPI or similar) and containerization (Docker).
- Exposure to data governance, security, and role‑based access control in cloud environments.
- Strong sense of ownership and reliability on production systems.
- Pragmatic and solution‑oriented, able to work on both small fixes and incremental improvements.
- Good communication skills with both technical and non‑technical stakeholders.
- Team player, comfortable pairing with a more senior engineer and following existing architecture and standards.
- Environment: Data platform for trading & risk use cases.
- Contract type: Long‑term assignment / consulting.
- Mid‑Senior level
- Full‑time
- Information Technology
- IT Services and IT Consulting