Technology Lead - CDH Admin
Job in Hartford, Hartford County, Connecticut, 06112, USA
Listed on 2026-01-12
Listing for:
TecTammina
Full Time position
Job specializations:
- IT/Tech: Cloud Computing, Data Engineer
Job Description
Responsibilities
- Experience administering large Hadoop clusters (either Cloudera or Hortonworks), including deploying the cluster, adding and removing nodes, keeping track of jobs, monitoring critical parts of the cluster, configuring NameNode high availability, and scheduling, configuring, and taking backups.
- General operational expertise, including strong troubleshooting skills and an understanding of system capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
- Responsible for implementation and ongoing administration of Hadoop infrastructure.
- Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and to expand existing environments.
- Working with data delivery teams to set up new Hadoop users, including setting up Linux users, setting up Kerberos principals, and testing HDFS, Hive, Pig, and MapReduce access for the new users.
- Cluster maintenance, as well as creation and removal of nodes, using tools such as Ganglia, Nagios, Cloudera Manager Enterprise, and Dell OpenManage.
- Monitoring and performance tuning of Hadoop clusters and Hadoop MapReduce routines.
- Screening Hadoop cluster job performance and capacity planning.
- Monitoring Hadoop cluster connectivity and security.
- Managing and reviewing Hadoop log files.
- File system management and monitoring.
- HDFS support and maintenance, including disk space management.
- Diligently teaming with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
- Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades when required.
- Point of contact for vendor escalations.
- Software installation and configuration, including patches and upgrades.
- Strong analytical skills.
- Experience in, and desire to work in, a global delivery environment.
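As a rough illustration of the user-onboarding duty listed above (not part of the listing itself): the Linux-account, Kerberos-principal, and HDFS-access steps might be sketched as follows. The username jdoe and realm EXAMPLE.COM are hypothetical placeholders, and the script defaults to a dry run that only prints the commands it would execute on a Kerberized cluster.

```shell
#!/bin/sh
# Hypothetical sketch: onboarding a new Hadoop user on a Kerberized cluster.
# "jdoe" and "EXAMPLE.COM" are made-up placeholders.
# DRY_RUN defaults to 1 so the script only prints the commands; set
# DRY_RUN=0 (and run as root / the hdfs superuser) to actually execute them.
NEW_USER="${NEW_USER:-jdoe}"
REALM="${REALM:-EXAMPLE.COM}"
DRY_RUN="${DRY_RUN:-1}"

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "+ $*"   # dry run: print the command instead of running it
    else
        "$@"
    fi
}

# 1. Create the Linux account on the gateway/edge node.
run useradd -m "$NEW_USER"

# 2. Create a Kerberos principal for the user.
run kadmin.local -q "addprinc -randkey ${NEW_USER}@${REALM}"

# 3. Create the user's HDFS home directory and hand over ownership.
run hdfs dfs -mkdir -p "/user/${NEW_USER}"
run hdfs dfs -chown "${NEW_USER}:${NEW_USER}" "/user/${NEW_USER}"

# 4. Smoke-test HDFS access as the new user.
run sudo -u "$NEW_USER" hdfs dfs -ls "/user/${NEW_USER}"
```

Testing Hive, Pig, and MapReduce access would follow the same pattern, e.g. running a trivial Hive query or a sample MapReduce job as the new user.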
Bachelor’s degree or foreign equivalent from an accredited institution required. Three years of progressive experience in the specialty will also be considered in lieu of every year of education.
At least 4 years of experience with Information Technology.
At least 4 years of experience in the software development life cycle, with primary experience in DW/BI and related tools, and an understanding of the Hadoop framework, Hive, Pig, and NoSQL.
At least 4 years of experience in Project life cycle activities on DW/BI development and maintenance projects.
Additional Information
Job Status: Full Time
Contact:
Please include the job title and location in the subject line.