Senior Data Engineer (AWS & Confluent Data/AI Projects) | Remote

Remote / Online - Candidates ideally in Singapore
Listing for: TASQ Staffing Solutions
Full Time, Remote/Work from Home position
Listed on 2025-12-06
Job specializations:
  • IT/Tech
    Data Engineer, Big Data, Cloud Computing, Data Science Manager
Salary/Wage Range or Industry Benchmark: SGD 60,000 - 80,000 per year
Job Description
Position: Senior Data Engineer (AWS & Confluent Data/AI Projects) | Remote

Work Set-up: Remote

Schedule: 10am-6pm SGT

Responsibilities:
  • Architect and Design Data Solutions:
    Lead the design and architecture of scalable, secure, and efficient data pipelines for both batch and real-time data processing on AWS. This includes data ingestion, transformation, storage, and consumption layers.
  • Confluent Kafka Expertise:
    Design, implement, and optimize highly performant and reliable data streaming solutions using Confluent Platform (Kafka, ksqlDB, Kafka Connect, Schema Registry). Ensure efficient data flow for real-time analytics and AI applications (a minimal producer sketch follows this list).
  • AWS Cloud Native Development:
    Develop and deploy data solutions leveraging a wide range of AWS services (see the boto3 sketch after this list), including but not limited to:
    • Data Storage: S3 (Data Lake), RDS, DynamoDB, Redshift, Lake Formation.
    • Data Processing: Glue, EMR (Spark), Lambda, Kinesis, MSK (for Kafka integration).
    • Orchestration: AWS Step Functions, Airflow (on EC2 or MWAA).
    • Analytics & ML: Athena, QuickSight, SageMaker (for MLOps integration).
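
A minimal sketch of the kind of real-time streaming work described above, using the confluent-kafka Python client. The broker address, topic name, and event fields are illustrative assumptions, not details from this posting:

    # Produce one JSON event to a Kafka topic (confluent-kafka client).
    # Broker address, topic name, and payload are hypothetical placeholders.
    import json
    from confluent_kafka import Producer

    producer = Producer({"bootstrap.servers": "localhost:9092"})  # assumed local broker

    def delivery_report(err, msg):
        # Runs once per message: confirms delivery or surfaces the error.
        if err is not None:
            print(f"Delivery failed: {err}")
        else:
            print(f"Delivered to {msg.topic()} [partition {msg.partition()}]")

    event = {"order_id": 123, "amount": 49.90, "currency": "SGD"}  # hypothetical event
    producer.produce("orders", value=json.dumps(event).encode("utf-8"),
                     callback=delivery_report)
    producer.flush()  # block until all queued messages are delivered

In a production pipeline such a producer would typically register an Avro or JSON schema with Schema Registry rather than sending raw JSON.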
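
On the AWS side, a minimal boto3 sketch of a raw-to-processed handoff: land a file in the S3 data lake, then start a Glue ETL job. The bucket, object key, local filename, and job name are hypothetical:

    # Land a raw file in the S3 data lake and kick off a Glue ETL job (boto3).
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(
        "orders_2025-12-06.json",             # local file (hypothetical)
        "example-data-lake",                  # assumed bucket name
        "raw/orders/orders_2025-12-06.json",  # assumed data-lake key layout
    )

    glue = boto3.client("glue")
    run = glue.start_job_run(JobName="transform-orders")  # assumed Glue job name
    print("Started Glue run:", run["JobRunId"])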
Required Skills and Qualifications:
  • Bachelor's or Master's degree in Computer Science, Software Engineering, or a related quantitative field.
  • 3 to 5 years of experience in data engineering, with a significant focus on cloud-based solutions.
  • Extensive hands‑on experience with Confluent Platform/Apache Kafka for building real‑time data streaming applications.
  • Proficiency in programming languages such as Python (with PySpark), Scala, or Java.
  • Expertise in SQL and experience with various database systems (relational and NoSQL).
  • Solid understanding of data warehousing, data lakes, and data modeling concepts (star schema, snowflake schema, etc.); a short star-schema sketch follows this list.
  • Experience with CI/CD pipelines and DevOps practices (Git, Terraform, Jenkins, Azure DevOps, or similar).
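
To make the star-schema point concrete, a minimal PySpark sketch that joins a fact table to two dimension tables on surrogate keys and aggregates. The S3 paths, table names, and columns are assumptions for illustration:

    # Star-schema consumption in PySpark: fact table joined to two dimensions.
    # Paths and column names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

    fact_sales   = spark.read.parquet("s3://example-lake/fact_sales/")
    dim_customer = spark.read.parquet("s3://example-lake/dim_customer/")
    dim_date     = spark.read.parquet("s3://example-lake/dim_date/")

    daily_revenue = (
        fact_sales
        .join(dim_customer, "customer_key")  # surrogate key into the customer dimension
        .join(dim_date, "date_key")          # surrogate key into the date dimension
        .groupBy("calendar_date", "customer_segment")
        .sum("sale_amount")
    )
    daily_revenue.show()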
Preferred Qualifications (Nice to Have):
  • AWS Certifications (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect - Associate/Professional).
  • Experience with other streaming technologies (e.g., Flink).
  • Knowledge of containerization technologies (Docker, Kubernetes).
  • Familiarity with Data Mesh or Data Fabric concepts.
  • Experience with data visualization tools (e.g., Tableau, Power BI, QuickSight).
  • Understanding of MLOps principles and tools.