Data Engineer
Job in Westlake, Cuyahoga County, Ohio, 44145, USA
Listed on 2026-02-01
Listing for: Compunnel, Inc.
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing
Job Description & How to Apply Below
The Data Engineer will design, develop, and maintain scalable, resilient, and cost-efficient data products across cloud and on-premises environments.
This role requires expertise in modern data platforms, strong programming skills, and the ability to build reliable data pipelines that support enterprise-wide data integration and analytics initiatives.
The engineer will collaborate across teams to support a platform modernization journey and deliver high-value operational data solutions.
Key Responsibilities
- Build, enhance, and maintain scalable ETL/ELT data pipelines and modern data products.
- Work with relational, NoSQL, and graph databases, including Oracle, PostgreSQL, DynamoDB, Elasticsearch, Neptune, and Neo4j.
- Develop batch processing workflows using tools such as AWS EventBridge, Step Functions, Lambda, S3, EC2, ECS/EKS, or similar technologies.
- Orchestrate workflows and schedules using Control‑M, Airflow, Argo, cron, or equivalent tools.
- Write, optimize, and debug complex SQL queries, PL/SQL procedures, and data transformation logic.
- Develop data engineering solutions using programming and scripting languages such as Python, Java, and Unix shell scripting.
- Build and operate scalable data solutions on cloud platforms, preferably AWS and Snowflake.
- Utilize messaging and streaming platforms such as Kafka, Kinesis, SNS, and SQS.
- Implement CI/CD pipelines and DevOps practices using tools such as Maven, Jenkins, AWS CloudFormation templates, uDeploy, Stash, and Ansible.
- Manage testing, deployment, and release processes across environments.
- Diagnose and resolve issues during development, testing, and production.
- Collaborate effectively with global, distributed teams in an Agile environment.
- Communicate clearly and effectively through verbal and written channels.
Qualifications
- Bachelor’s or Master’s degree in Computer Science, Computer Engineering, or a related technology field.
- Strong ability to learn and implement new technologies in a fast‑paced environment.
- Experience designing and building scalable, resilient, and cost‑effective data engineering solutions (preferably on AWS and Snowflake).
- Hands‑on experience with relational, NoSQL, and graph databases.
- Strong understanding of data modeling and data integration patterns.
- Experience developing batch processes and data pipelines.
- Proficiency in SQL, PL/SQL, and debugging complex data logic.
- Experience with Unix scripting, Python, and/or Java.
- Experience with data orchestration tools and job schedulers.
- Knowledge of cloud services such as Lambda, S3, ECS/EKS, EventBridge, and Step Functions.
- Knowledge of messaging platforms including Kafka, Kinesis, SNS, and SQS.
- Experience using DevOps and CI/CD tools.
- Ability to troubleshoot development, testing, and production issues.
- Strong communication skills and ability to work in distributed Agile teams.
- AWS Associate, Professional, or Specialty certification.
- Experience with Snowflake data engineering workloads.
- Prior experience in large‑scale, enterprise‑wide data integration environments.