Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!
Our client has been around for over 150 years and continues to innovate in today's digital age. If you want to work for a company that is not only a household name but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Position: Hadoop Developer
Location: Charlotte, North Carolina 28202
Term: 12 months
Responsibilities:
- Lead complex technology initiatives, including companywide efforts with broad impact.
- Act as a key participant in developing standards and companywide best practices for engineering complex, large-scale technology solutions across technology engineering disciplines.
- Design, code, test, debug, and document solutions for projects and programs.
- Review and analyze complex, large-scale technology solutions that serve tactical and strategic business objectives within the enterprise technology environment, including technical challenges that require in-depth evaluation of multiple factors, such as intangible or unprecedented technical considerations.
- Make decisions in developing standards and companywide best practices for engineering and technology solutions, applying an understanding of industry best practices and new technologies, and influence and lead the technology team to meet deliverables and drive new initiatives.
- Collaborate and consult with key technical experts, the senior technology team, and external industry groups to resolve complex technical issues and achieve goals.
- Lead projects and teams, or serve as a peer mentor.
Is this a good fit? (Requirements):
- 5+ years of ETL (Extract, Transform, Load) programming experience.
- 3+ years of reporting experience, analytics experience, or a combination of both.
- 4+ years of Hadoop development/programming experience.
- 5+ years of operational risk, credit risk, or compliance domain experience.
- 5+ years of experience delivering ETL, data warehouse, and data analytics capabilities on big-data architectures such as Hadoop.
- 6+ years of Java or Python experience.
- 5+ years of Agile experience.
- 5+ years of design and development experience with columnar databases using Parquet or ORC file formats on Hadoop.
- 5+ years of Apache Spark design and development experience using Scala, Java, or Python, including DataFrames and Resilient Distributed Datasets (RDDs).
- 2+ years of experience integrating with RESTful APIs.
- 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, or education.