Data Engineer
Listed on 2026-02-18
Software Development
Job Description
Our client, a player in the telecom industry, is looking to bring on a Data Engineer for a 12-month contract. The Data Engineer builds and optimizes scalable data pipelines using AWS, Databricks, and Snowflake. This role focuses on transforming complex data into trusted, analytics-ready datasets while ensuring performance, reliability, and cost efficiency. The role also collaborates with business teams to deliver actionable insights, so strong written and verbal communication is required.
We are a company committed to creating diverse and inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity/affirmative action employer that believes everyone matters. Qualified candidates will receive consideration for employment regardless of their race, color, ethnicity, religion, sex (including pregnancy), sexual orientation, gender identity and expression, marital status, national origin, ancestry, genetic factors, age, disability, protected veteran status, military or uniformed service member status, or any other status or characteristic protected by applicable laws, regulations, and ordinances.
If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to
To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy:
Skills and Requirements
- 4+ years developing cloud solutions as a Data Engineer
- Bachelor’s Degree in Computer Science, Computer Engineering, or a related field
- Clear and concise verbal and written communication; ability to communicate technical concepts to non-technical people
- Proven experience developing solutions with:
- AWS (Redshift, S3, Step Functions, EventBridge, CloudWatch)
- Databricks (Spark, Delta Lake, Apache Iceberg, Unity Catalog)
- Snowflake
- Strong proficiency in SQL, including writing and optimizing large-scale analytics queries
- Strong proficiency in Python for custom data processing
- Familiarity with CI/CD, version control (Git), and automated testing for data pipelines
- Familiarity with Infrastructure as Code (Terraform, or similar)
- Solid understanding of data warehousing and dimensional modeling
- Strong focus on code quality, with the ability to design and execute thorough tests and to write detailed, comprehensive testing documentation
- Ability to conduct effective code reviews
- Hands‑on experience integrating AI tools/solutions into data workflows to improve efficiency, automation, or developer productivity