Hadoop Administrator

  • Location: Charlotte, NC 28202
  • Salary: $70.32 - $77.67 / hour
  • Job Type: Contract

Posted 13 days ago

Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!

Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name but also one that truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Position: Sr. Hadoop Administrator
Location: Charlotte, NC; Des Moines, IA; Minneapolis, MN; or Dallas, TX
Term: 9 months

Day-to-Day Responsibilities:

We are seeking a Senior Hadoop Administrator to manage Hadoop/MapR on cloud or on-premises Linux instances, including configuration, capacity planning, expansion, performance tuning, and monitoring.
  • Work with the data engineering team to support the development and deployment of Spark and Hadoop jobs.
  • Work with end users to troubleshoot and resolve incidents with data accessibility.
  • Contribute to the architecture design of the cluster to support growing demands and requirements.
  • Contribute to the planning and implementation of software and hardware upgrades, and execute disaster recovery procedures for Hadoop platforms if needed.
  • Recommend and implement standards and best practices for cluster administration using Kafka, HBase, Spark, Hive, etc.
  • Research and recommend automated approaches to cluster administration. Manage YARN configuration in a multi-tenant environment using the YARN capacity scheduler (see the configuration sketch after this list).
  • Act as the senior person on the team in providing Hadoop administration services.
  • Develop new documentation, departmental technical procedures, and user guides.
  • Ensure quality, security, and compliance requirements are met for the supported area, and oversee the creation, updating, and testing of the business continuity plan.
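
As a rough illustration of the multi-tenant YARN capacity scheduler work referenced above, a minimal capacity-scheduler.xml sketch might look like the following; the queue names (etl, adhoc) and capacity percentages are hypothetical and not part of this role's actual environment:

    <configuration>
      <!-- Two hypothetical tenant queues under root, splitting cluster capacity 70/30. -->
      <property>
        <name>yarn.scheduler.capacity.root.queues</name>
        <value>etl,adhoc</value>
      </property>
      <!-- Guaranteed share for the etl queue, with elasticity up to 90% when idle capacity exists. -->
      <property>
        <name>yarn.scheduler.capacity.root.etl.capacity</name>
        <value>70</value>
      </property>
      <property>
        <name>yarn.scheduler.capacity.root.etl.maximum-capacity</name>
        <value>90</value>
      </property>
      <!-- Guaranteed share for the adhoc queue, capped at 50%. -->
      <property>
        <name>yarn.scheduler.capacity.root.adhoc.capacity</name>
        <value>30</value>
      </property>
      <property>
        <name>yarn.scheduler.capacity.root.adhoc.maximum-capacity</name>
        <value>50</value>
      </property>
    </configuration>

Queue changes of this kind are typically applied at runtime with yarn rmadmin -refreshQueues rather than a full ResourceManager restart.
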
Required Qualifications: Position requires a Bachelor's degree in Information Systems Technologies, Electrical Engineering, or a related technical field, plus five (5) years of experience in the job offered or in a related position involving software engineering.
Specific skills required:
  • Experience with Big Data, Hadoop, MapR/Ezmeral Data Fabric, Hortonworks Data Platform, and YARN;
  • Experience working in Linux environments;
  • Experience with Hortonworks or Ezmeral Data Fabric cluster administration (not development);
  • Knowledge of industry standard Incident, Problem, Release, and Change Management processes;
  • Experience troubleshooting issues related to Hadoop ecosystem components like Hive, Spark, and HBase;
  • Experience installing and maintaining Python libraries;
  • Experience with environment and infrastructure integration; and
  • Capacity management experience.