Senior Consultant - AWS Consultant/Data Engineer
Location: Hyderabad/Indore
Duration: Full time
Hinduja Global Solutions Ltd. (HGSL) is an Indian listed company and part of the Hinduja Group, serving Fortune 500 companies for the last 40 years. It is a Digital Technology BPM company specialising in Consumer Engagement Services in the fields of technology, telecom, banking, retail operations, data analytics, and digital technology services. The parent company holds entities in the USA, Philippines, UK, Canada, South Africa, Colombia and Jamaica through subsidiaries and step-subsidiaries.
With more than 18,000 employees spread across 9 countries, our mission is to make our clients more competitive by providing exceptional experiences. Powered by a people-first philosophy and experience serving over 1100 of the world’s leading brands, HGS is the perfect place to build your future!
Position Summary:
We are looking for an AWS Consultant/Data Engineer to join our cloud practice.
We believe that a Data Engineer enables data-driven decision making by collecting, transforming, and publishing data. A Data Engineer should be able to design, build, operationalize, secure, and monitor data processing systems with a particular emphasis on security and compliance; scalability and efficiency; reliability and fidelity; and flexibility and portability. A Data Engineer should also be able to leverage, deploy, and continuously train pre-existing machine learning models.
As an AWS Senior Consultant, you are primarily responsible for advising on, designing, and implementing the right cloud solution for each customer's requirements and constraints.
Key Responsibilities:
Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue, Snowflake.
Design and build production data pipelines from ingestion to consumption within a big-data architecture, using programming languages such as Java, Python, or Scala.
Design and implement data engineering, ingestion, and curation functions on the AWS cloud using AWS-native services or custom programming.
Perform detailed assessments of current-state data platforms and create an appropriate transition path to the AWS cloud.
Design, implement, and support an analytical data infrastructure providing ad-hoc access to large datasets and computing power.
Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies.
Create and support real-time data pipelines built on AWS technologies including Glue, Redshift/Spectrum, Kinesis, EMR, and Athena.
Work closely with team members to drive real-time and batch model implementations for monitoring and alerting of risk systems.
Collaborate with other tech teams to implement advanced analytics algorithms that exploit our rich datasets for statistical analysis, prediction, clustering and machine learning.
Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Education:
Bachelor's degree in a relevant field.
Experience & Requirements:
5+ years’ experience working with AWS components such as S3, EC2, Redshift, RDS, EMR, and NoSQL databases
Experience building and maintaining cloud-native applications
Experience using DevOps tools in a cloud environment, such as Ansible, Artifactory, Docker, GitHub, Jenkins, Kubernetes, Maven, or SonarQube
Expertise in data modeling, data warehousing, and building ETL pipelines
Knowledge of one or more of the programming languages most used in today’s cloud computing (e.g., SQL and XML for data, R and Clojure for math, Haskell and Erlang for functional programming, Python and Go for procedural programming)
Experience in troubleshooting distributed systems.
Working experience integrating Zoho CRM APIs with AWS Glue
Proficiency in script development and scripting languages
Able to recommend process and architecture improvements
Excellent programming skills with SQL and Python
The ability and skill to train other people in procedural and technical topics
Strong communication, analytical, and collaboration skills
AWS Professional Data Engineer certification preferred.
Nice to have: SAP data background, GenAI skills, and experience with AI/LLMs on data lakes.