Senior Data Engineer; AWS & Confluent Data/AI Projects | Remote
Remote / Online - Candidates ideally in Singapore
Listed on 2025-12-06
Listing for: TASQ Staffing Solutions
Full Time, Remote/Work from Home position
Job specializations:
- IT/Tech: Data Engineer, Big Data, Cloud Computing, Data Science Manager
Job Description & How to Apply Below
About the job Senior Data Engineer (AWS & Confluent Data/AI Projects) | Remote
Work Set-up: Remote
Schedule: 10am-6pm SGT
Responsibilities:
- Architect and Design Data Solutions: Lead the design and architecture of scalable, secure, and efficient data pipelines for both batch and real-time data processing on AWS. This includes data ingestion, transformation, storage, and consumption layers.
- Confluent Kafka Expertise: Design, implement, and optimize highly performant and reliable data streaming solutions using Confluent Platform (Kafka, ksqlDB, Kafka Connect, Schema Registry). Ensure efficient data flow for real-time analytics and AI applications.
- AWS Cloud Native Development: Develop and deploy data solutions leveraging a wide range of AWS services, including but not limited to:
  - Data Storage: S3 (Data Lake), RDS, DynamoDB, Redshift, Lake Formation
  - Data Processing: Glue, EMR (Spark), Lambda, Kinesis, MSK (for Kafka integration)
  - Orchestration: AWS Step Functions, Airflow (on EC2 or MWAA)
  - Analytics & ML: Athena, QuickSight, SageMaker (for MLOps integration)
Qualifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
- 3 to 5 years of experience in data engineering, with a significant focus on cloud-based solutions.
- Extensive hands‑on experience with Confluent Platform/Apache Kafka for building real‑time data streaming applications.
- Proficiency in programming languages such as Python, PySpark, Scala, or Java.
- Expertise in SQL and experience with various database systems (relational and NoSQL).
- Solid understanding of data warehousing, data lakes, and data modeling concepts (star schema, snowflake schema, etc.).
- Experience with CI/CD pipelines and DevOps practices (Git, Terraform, Jenkins, Azure DevOps, or similar).
- AWS Certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect - Associate/Professional).
- Experience with other streaming technologies (e.g., Flink).
- Knowledge of containerization technologies (Docker, Kubernetes).
- Familiarity with Data Mesh or Data Fabric concepts.
- Experience with data visualization tools (e.g., Tableau, Power BI, QuickSight).
- Understanding of MLOps principles and tools.
Position Requirements: 10+ years work experience