
Hadoop Administrator

Job in Bridgewater, Plymouth County, Massachusetts, 02324, USA
Listing for: IndSoft, Inc.
Full Time position
Listed on 2026-01-12
Job specializations:
  • IT/Tech
    Cloud Computing, Systems Administrator
Salary/Wage Range or Industry Benchmark: 60,000 - 80,000 USD per year
Job Description & How to Apply Below

Founded in 1998 and led by a visionary with a strong technical background, Ind Soft is one of the fastest-growing consulting services companies and is headquartered in Chicago. We have international delivery centers in the USA and India. Our motto, "We put the IT in your PROF-IT," is more than just a tagline; it captures the true spirit of Ind Soft: delivering business value and creating a sustainable competitive advantage.

Job Description

Hi,

Please find the job description below.

If you are interested in this position, please contact me at  Ext 304.

Position: Hadoop Admin

Location: Bridgewater, NJ

Duration: 6+ months

The client is seeking a Hadoop Administrator with good technical, organizational, and communication skills to manage a large-scale Hadoop cluster environment and handle all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning, and ongoing monitoring. This position is located at the Bridgewater, NJ site.

As part of the Global Infrastructure Services organization, the Data Management Platforms and Services team is looking for a Hadoop Administrator with good knowledge of the Hadoop ecosystem to participate in building a new Hadoop service offering.

Responsibilities
  • Participate in the design, build, and deployment of a new Hadoop platform and service covering all Big Data projects, with a common service request process, monitoring definition, backup and recovery strategy, and security.
  • Assist in defining an operating model and establishing support best practices for the Hadoop cluster.
  • Participate in POCs with applications and projects interested in Big Data Advanced Analytics, enabling and testing solutions and software as needed.
  • Work closely with the Regional Service Delivery operations teams to perform training (KT sessions with L2 and L3), support, and optimization of the Hadoop cluster infrastructure.
  • Support development and production deployments as required.
  • Manage and monitor Hadoop cluster and platform infrastructure.
  • Automate cluster node provisioning and repetitive operational tasks (an illustrative sketch follows this list).
  • Manage the Hadoop stack support runbook in collaboration with the Regional Service Delivery operations teams.
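
To give a flavor of the "repetitive operational tasks" this role would automate, the sketch below shows a minimal routine HDFS health check. It is illustrative only: it assumes a standard Hadoop/CDH edge or cluster node with the hdfs command-line tool on the PATH, and its checks and output handling are placeholders rather than the client's actual runbook.

    #!/usr/bin/env python3
    """Hedged sketch of a routine HDFS health check.

    Assumes a standard Hadoop/CDH node where the hdfs command-line tool is
    on the PATH and the calling user can query the NameNode; the checks
    below are illustrative, not the client's actual runbook.
    """
    import shutil
    import subprocess
    import sys


    def run(cmd):
        """Run a command and return its stdout, raising if it exits non-zero."""
        return subprocess.run(cmd, capture_output=True, text=True, check=True).stdout


    def main():
        if shutil.which("hdfs") is None:
            sys.exit("hdfs CLI not found; run this on a cluster or edge node")

        # "hdfs dfsadmin -report" summarises capacity, DFS usage and the
        # live/dead DataNode counts.
        report = run(["hdfs", "dfsadmin", "-report"])
        for line in report.splitlines():
            stripped = line.strip()
            # Exact wording varies by Hadoop version; recent releases print
            # a line like "Dead datanodes (2):".
            if stripped.lower().startswith("dead datanodes"):
                count = "".join(ch for ch in stripped if ch.isdigit())
                if count and int(count) > 0:
                    print("WARNING:", stripped)

        # "hdfs dfsadmin -safemode get" shows whether the NameNode is still
        # in safe mode, e.g. after a restart.
        print(run(["hdfs", "dfsadmin", "-safemode", "get"]).strip())


    if __name__ == "__main__":
        main()

In practice a job like this would be scheduled (cron, Oozie, or the cluster's monitoring stack) and would raise alerts rather than print to stdout.
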
Qualifications
  • 3-4 years of experience with large-scale Hadoop environment builds and support, including design, capacity planning, cluster setup, performance tuning, and monitoring.
  • Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, HBase, ZooKeeper, Pig, Sqoop, Oozie, and Hive.
  • 2-3 years of experience with Cloudera Distribution of Hadoop (CDH).
  • Experience with disaster recovery and business continuity practices in the Hadoop stack (see the sketch after this list).
  • A deep understanding of Hadoop design principles, cluster connectivity, security and the factors that affect distributed system performance.
  • Cloudera Certification CCA 500 is preferred.
  • Experience with Impala and Spark.
  • Experience maintaining, troubleshooting, and setting up large clusters.
  • Thorough knowledge of current hardware systems commonly used in production environments.
  • Ability to proactively identify, troubleshoot and resolve production systems issues.
  • Technical documentation skills for supported applications and operational tools.
  • Strong knowledge of Hadoop monitoring tools.
  • Experience with Oracle, SQL, and Teradata database platforms is a plus.
  • Experience with the MongoDB platform is a plus.
  • Expertise in UNIX architecture, UNIX tools, and UNIX shell scripting.
  • Experience with RHEL 5 or 6 Linux.
  • Experience with Amazon Cloud (AWS) is a plus (not mandatory).
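
For context on the disaster-recovery point above, inter-cluster replication in a Hadoop stack is commonly scripted around the standard DistCp tool. The snippet below is only a hedged sketch: the NameNode addresses and paths are invented placeholders, and a real procedure would add logging, retries, and bandwidth throttling appropriate to the environment.

    #!/usr/bin/env python3
    """Hedged sketch of a DistCp step for a disaster-recovery drill.

    The cluster addresses and paths are invented placeholders; a real job
    would read them from configuration and route output to the cluster's
    monitoring and alerting tooling.
    """
    import subprocess

    # Hypothetical production and DR NameNodes (placeholders only).
    SOURCE = "hdfs://prod-nn:8020/data/warehouse"
    TARGET = "hdfs://dr-nn:8020/backups/warehouse"

    # "hadoop distcp -update" copies only files that differ from the target;
    # "-p" preserves file attributes such as permissions and ownership.
    subprocess.run(["hadoop", "distcp", "-update", "-p", SOURCE, TARGET], check=True)

Scheduling, verification, and reporting would normally sit on top of a step like this, whether via cron/Oozie or the distribution's own backup tooling.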

Ind Soft

Ext 304

Additional Information

Required Skills:

HDFS, MapReduce, HBase, ZooKeeper, Pig, Sqoop, Oozie, and Hive.
