Cloud Data Engineer
Job in Snowflake, Navajo County, Arizona, 85937, USA
Listed on 2025-12-02
Listing for: Effulgent INC
Full-Time position
Job specializations:
- IT/Tech: Data Engineer, Cloud Computing, Big Data, AWS
Job Description & How to Apply Below
Salary: $90,000 - $96,000
Job Type: Full-Time
Location: Texas
Job Posted: 2025-12-02
A Cloud Data Engineer is responsible for designing, building, and maintaining cloud-based data infrastructure and pipelines. They work with cloud services to ensure efficient data storage, processing, and integration, supporting analytics and business intelligence needs.
Key Responsibilities:
- Design and implement scalable data architectures in cloud environments (AWS, Azure, GCP).
- Build and manage cloud-based data warehouses (Snowflake, Redshift, BigQuery).
- Develop and optimize ETL/ELT pipelines for data ingestion, transformation, and processing.
- Automate workflows using tools like Apache Airflow, AWS Glue, or Azure Data Factory.
- Work with big data technologies (Apache Spark, Hadoop, Kafka) for batch and streaming data.
- Implement real-time data processing solutions for analytics and reporting.
- Ensure data security, compliance (GDPR, HIPAA), and best practices in cloud environments.
- Implement role-based access control and data encryption strategies.
- Work closely with data scientists, analysts, and software engineers to optimize data workflows.
- Monitor and optimize cloud data storage and computing costs.
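As an illustration of the ETL/ELT work described above, here is a minimal sketch of a batch pipeline in Python. It uses the standard-library sqlite3 module as a stand-in for a cloud warehouse such as Redshift or Snowflake; the table name, function names, and sample data are hypothetical.

```python
import sqlite3

def extract(rows):
    """Extract: in practice this step might read from S3 or an API;
    here it simply returns an in-memory list of raw records."""
    return rows

def transform(rows):
    """Transform: normalize customer names and drop records with
    missing amounts before loading."""
    return [
        (name.strip().lower(), amount)
        for name, amount in rows
        if amount is not None
    ]

def load(conn, rows):
    """Load: write the cleaned rows into a (hypothetical) warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

# Run the pipeline end to end against an in-memory database.
raw = [("  Alice ", 120.0), ("BOB", None), ("Carol", 75.5)]
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
print(total)  # (2, 195.5) -- the record with a missing amount was dropped
```

In a production setting the same extract/transform/load steps would typically be wrapped as tasks in an orchestrator such as Apache Airflow, AWS Glue, or Azure Data Factory rather than called directly.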
Education: Bachelor’s or Master’s degree in Computer Science, Data Engineering, IT, or related fields.
Technical Skills:
- Proficiency in SQL and NoSQL databases (PostgreSQL, MongoDB, DynamoDB).
- Strong programming skills in Python, Java, or Scala.
- Experience with cloud services:
- AWS: S3, Redshift, Glue, Lambda, EMR, Kinesis
- Azure: Data Factory, Synapse, Cosmos DB, Blob Storage
- GCP: BigQuery, Dataflow, Pub/Sub, Cloud Storage
- Knowledge of Infrastructure as Code (Terraform, CloudFormation).
- Experience with CI/CD pipelines for data deployment.
- Certifications (AWS Certified Data Analytics, Google Professional Data Engineer, etc.).
- Experience with Kubernetes and Docker for containerized data applications.
- Familiarity with MLOps and AI/ML model deployment in cloud environments.
- A minimum of 5 years of experience is required.
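To illustrate the batch and streaming skills listed above, here is a toy sketch of per-key streaming aggregation in plain Python. In production this kind of computation would run on Kafka, Spark Structured Streaming, or Kinesis; the event shapes and names here are hypothetical.

```python
from collections import defaultdict

def process_stream(events):
    """Consume an event stream and maintain a running count per event
    type, the same shape of computation a Kafka consumer or Spark
    Structured Streaming job performs at scale."""
    counts = defaultdict(int)
    for event in events:
        counts[event["type"]] += 1
        # Emit a snapshot after each event, analogous to a micro-batch.
        yield dict(counts)

stream = [
    {"type": "click"},
    {"type": "view"},
    {"type": "click"},
]
snapshots = list(process_stream(stream))
print(snapshots[-1])  # {'click': 2, 'view': 1}
```

The generator-based design keeps state local and emits incremental results, which mirrors how real streaming frameworks expose windowed or continuous aggregations.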