Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!
Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Position: Hadoop Developer
Location: Charlotte, NC, 28202
Term: 12 months
Day-to-Day Responsibilities:
- The Risk & Finance Core Services is a horizontal function within Risk and Finance CIO organization and is responsible for delivering data consistently across Risk & Finance.
- This Technology Manager will report to the Risk & Finance Core Services CTO and is responsible for sourcing Product, Transactional, and Master/Reference data from the Approved Provisioning Points to the Risk & Finance Foundational Platform, and for enabling consumption of that data as data services across Risk & Finance.
- This technology leadership role requires strong big-data technology expertise and domain expertise in Risk & Finance data.
- This technology manager will lead a team of technology leads, designers, developers, analysts and testers and will have end-to-end responsibility for technology delivery.
- This manager will also be part of the transformation of existing disparate data solutions and capabilities into an integrated, flexible, and scalable data platform across Risk and Finance.
- Design and build data services that deliver Strategic Enterprise Risk Management data.
- Design high-performing data models on big-data architecture as data services.
- Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and object storage architecture (a minimal sketch follows this list).
- Design and build data services on container-based architectures such as Kubernetes and Docker.
- Partner with Enterprise data teams such as Data Management & Insights and Enterprise Data Environment (Data Lake) to identify the best place to source the data.
- Work with business analysts, development teams and project managers for requirements and business rules.
- Collaborate with source system and approved provisioning point (APP) teams, Architects, Data Analysts and Modelers to build scalable and performant data solutions.
- Work effectively in a hybrid environment where legacy ETL and Data Warehouse applications and new big-data applications co-exist.
- Work with Infrastructure Engineers and System Administrators as appropriate in designing the big-data infrastructure.
- Work with DBAs in Enterprise Database Management group to troubleshoot problems and optimize performance
- Support ongoing data management efforts for Development, QA and Production environments
- Utilize a thorough understanding of available technology, tools, and existing designs.
- Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage.
- Act as an expert technical resource to programming staff in the program development, testing, and implementation process.
- Lead complex technology initiatives including those that are companywide with broad impact.
- Act as a key participant in developing standards and companywide best practices for engineering complex, large-scale technology solutions across technology engineering disciplines.
- Design, code, test, debug, and document for projects and programs.
- Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors.
- Make decisions in developing standards and companywide best practices for engineering and technology solutions, drawing on an understanding of industry best practices and new technologies; influence and lead the technology team to meet deliverables and drive new initiatives.
- Collaborate and consult with key technical experts, senior technology team, and external industry groups to resolve complex technical issues and achieve goals.
- Lead projects and teams, or serve as a peer mentor.
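For context on the pipeline work described above, here is a minimal PySpark sketch of the pattern: read raw transactional data, apply a transformation, and land the result as partitioned Parquet for downstream data services. The table name, columns, and output path are hypothetical illustrations, not details of the client's environment.

```python
# Minimal sketch only: reads a hypothetical Hive staging table,
# derives columns, and writes partitioned Parquet. All identifiers
# (table, columns, path) are assumptions for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("risk-finance-pipeline-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Source raw transactional records (hypothetical table name).
transactions = spark.table("staging.transactions_raw")

# Example transformations: convert amounts to USD and derive a
# business-date column to partition by.
curated = (
    transactions
    .withColumn("amount_usd", F.col("amount") * F.col("fx_rate"))
    .withColumn("business_date", F.to_date("event_ts"))
)

# Columnar Parquet output, partitioned so downstream consumers
# (reports, data services) can prune partitions efficiently.
(
    curated.write
    .mode("overwrite")
    .partitionBy("business_date")
    .parquet("hdfs:///data/curated/transactions")
)
```

The same DataFrame could just as well be written as ORC where that is the platform standard; both are the columnar formats named in the requirements below.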
Is this a good fit? (Requirements):
- 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, or education.
- Core skills: Hadoop development, Spark, Hive, and data integration
- 5+ years of application development and implementation experience
- 5+ years of experience delivering complex enterprise-wide information technology solutions
- 5+ years of ETL (Extract, Transform, Load) Programming experience
- 3+ years of reporting experience, analytics experience or a combination of both
- 4+ years of Hadoop development/programming experience
- 5+ years of operational risk, credit risk, or compliance domain experience
- 5+ years of experience delivering ETL, data warehouse and data analytics capabilities on big-data architecture such as Hadoop
- 6+ years of Java or Python experience
- 5+ years of Agile experience
- 5+ years of design and development experience with columnar databases using Parquet or ORC file formats on Hadoop
- 5+ years of Apache Spark design and development experience using Scala, Java, or Python, with DataFrames and Resilient Distributed Datasets (RDDs)
- 2+ years of experience integrating with RESTful APIs (a brief sketch follows at the end of this section)
- Excellent verbal, written, and interpersonal communication skills
- Experience designing and developing data analytics solutions using object data stores such as S3
- Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Spark and Apache Sqoop
- Ability to work effectively in virtual environment where key team members and partners are in various time zones and locations
- Ability to interact effectively and confidently with senior management
- Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
- Knowledge and understanding of DevOps principles
- A BS/BA degree or higher in information technology (Master's degree required)
- Good to have:
- Code branching
- GitHub
- Candidates who have worked out well typically came from an ETL background and then spent a few years doing data integration in Hadoop, but that is not a required background.
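Since RESTful API integration is called out in the requirements, here is a similarly hedged sketch of one common pattern: fetch reference data from a hypothetical internal endpoint and register it as a Spark view for joins against pipeline data. The URL, payload shape, and view name are assumptions for illustration only.

```python
# Sketch of REST-to-Spark integration. The endpoint, payload shape,
# and view name are hypothetical; a real integration would add auth,
# paging, and schema enforcement.
import requests
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.appName("rest-integration-sketch").getOrCreate()

# Fetch reference data from a hypothetical approved provisioning point.
resp = requests.get(
    "https://example.internal/api/v1/reference/counterparties",
    timeout=30,
)
resp.raise_for_status()
records = resp.json()  # expected shape: a list of flat JSON objects

# Turn the payload into a DataFrame and expose it to SQL consumers.
ref_df = spark.createDataFrame([Row(**r) for r in records])
ref_df.createOrReplaceTempView("counterparty_reference")

spark.sql("SELECT COUNT(*) AS n FROM counterparty_reference").show()
```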