Role: AWS Data Engineer
Job Location: Chennai, Pune
Experience Requirement: 5+ years
Required Technical Skills:
Strong knowledge of AWS Glue, AWS Redshift, SQL, and ETL. Good knowledge of and experience with PySpark for building complex transformation logic.
AWS Data Engineer
Primary: SQL, ETL, DWH; Secondary: AWS Glue, Airflow
Must-Have
Good knowledge of SQL and ETL
A minimum of 3+ years' experience with Python core concepts and with implementing data pipeline frameworks using PySpark and AWS
Work well independently as well as within a team
Good knowledge of working with various AWS services, including S3, Glue, DMS, and Redshift
Proactive and organized, with excellent analytical and problem-solving skills
Flexible and willing to learn; a can-do attitude is key
Strong verbal and written communication skills
Good-to-Have
Good knowledge of SQL and ETL, understanding of Python core concepts, and experience implementing data pipeline frameworks using PySpark and AWS
Good knowledge of working with various AWS services, including S3, Glue, DMS, and Redshift
Responsibilities:
AWS Data Engineer
PySpark / Python / SQL / ETL
A minimum of 3+ years' experience with Python core concepts and with implementing data pipeline frameworks using PySpark and AWS
Good knowledge of SQL and ETL, and of working with various AWS services including S3, Glue, DMS, and Redshift