Hadoop Developer

  • Location: Charlotte, 28202
  • Salary: $70.32 - $77.67 / hour
  • Job Type: Contract

Posted 4 months ago

Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!

Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Position: Data Engineer
Location: Charlotte, North Carolina, 28202
Term: 24 months

Day-to-Day Responsibilities:

  • Lead complex technology initiatives including those that are companywide with broad impact.
  • Act as a key participant in developing standards and companywide best practices for engineering complex, large-scale technology solutions across engineering disciplines.
  • Design, code, test, debug, and document for projects and programs.
  • Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, the enterprise technology environment, and technical challenges that require in-depth evaluation of multiple factors, including intangible or unprecedented technical considerations.
  • Make decisions when developing standards and companywide best practices for engineering and technology solutions, requiring an understanding of industry best practices and new technologies; influence and lead the technology team to meet deliverables and drive new initiatives.
  • Collaborate and consult with key technical experts, senior technology team, and external industry groups to resolve complex technical issues and achieve goals.
  • Lead projects and teams, or serve as a peer mentor.


Is this a good fit? (Requirements):

  • 1+ year of Agile experience.
  • 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.
  • 5+ years of experience with ETL tools such as Talend or PySpark, including components (database, file, API, etc.), context management, deployment, performance tuning, and administration (setting up projects, using the admin console to configure jobs, etc.), to help build end-to-end ETL pipelines from client sources, through a staging data lake, to the solution DB/server.
  • 3+ years of strong working knowledge of databases, both object-oriented and relational. Strong experience with SQL, PL/SQL, and other DB query languages and tools.
  • 3+ years of hands-on experience with the Hadoop ecosystem and big data technologies such as HDFS, object stores, Hortonworks, Hive, Kafka, and Spark.
  • Good knowledge of Linux, along with ETL development, deployment, and optimization experience using standard big data tools.
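
For candidates gauging fit, the "end-to-end ETL" requirement above can be pictured as a simple extract/transform/load flow. The sketch below is purely illustrative and uses only the Python standard library (a real pipeline here would use Talend or PySpark against a data lake, as the posting describes); all table and column names are hypothetical.

```python
# Illustrative ETL sketch: extract from a "client source" (in-memory CSV),
# transform (clean and type the rows), load into a "solution DB" (SQLite).
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: read raw rows from a client source (here, an in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: trim and normalize names, cast amounts to floats."""
    return [(r["id"], r["name"].strip().title(), float(r["amount"]))
            for r in rows]

def load(rows, conn):
    """Load: write the transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (id TEXT, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)
    conn.commit()

raw = "id,name,amount\n1, alice ,10.5\n2, bob ,20.0\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 30.5
```

The same three-stage shape carries over to the big data stack named in the requirements: extraction from Kafka or HDFS, transformation in Spark, and loading into Hive or a relational target.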



Desired Qualifications:

  • Autosys knowledge
  • Experience delivering complex, enterprise-wide information technology solutions, e.g., with Google Cloud Platform tools.
  • Excellent verbal, written, and interpersonal communication skills.
  • Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations.