Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!
Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Position: Senior Specialty Software Engineer
Location: CHARLOTTE, North Carolina, 28202
Term: 12 months
Responsibilities:
- Design and build data services that deliver Strategic Enterprise Risk Management data.
- Design high-performing data models, delivered as data services, on big-data architecture.
- Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and object storage architecture.
- Design and build data services on container-based architectures such as Kubernetes and Docker.
- Partner with enterprise data teams such as Data Management & Insights and the Enterprise Data Environment (Data Lake) to identify the best place to source the data.
- Work with business analysts, development teams and project managers for requirements and business rules.
- Collaborate with source system and approved provisioning point (APP) teams, Architects, Data Analysts and Modelers to build scalable and performant data solutions.
- Effectively work in a hybrid environment where legacy ETL and Data Warehouse applications and new big-data applications co-exist.
- Work with Infrastructure Engineers and System Administrators as appropriate in designing the big-data infrastructure.
- Work with DBAs in Enterprise Database Management group to troubleshoot problems and optimize performance.
- Support ongoing data management efforts for Development, QA and Production environments.
- Utilize a thorough understanding of available technology, tools, and existing designs.
- Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage.
- Act as an expert technical resource to programming staff in the program development, testing, and implementation process.
- Lead or participate in complex initiatives on selected domains.
- Assure quality, security and compliance for supported systems and applications.
- Serve as a technical resource in finding software solutions.
- Review and evaluate user needs and determine requirements.
- Provide technical support, advice, and consultation on issues relating to supported applications.
- Create test data and conduct interface and unit tests.
- Design, code, test, debug and document programs using Agile development practices.
- Ensure that compliance and risk management requirements for the supported area are met, and work with other stakeholders to implement key risk initiatives.
- Conduct research to resolve process-related problems, and recommend solutions and process improvements.
- Assist other individuals in advanced software development.
- Collaborate and consult with peers, colleagues and managers to resolve issues and achieve goals.
Is this a good fit? (Requirements):
- 5+ years of application development and implementation experience.
- 5+ years of experience delivering complex, enterprise-wide information technology solutions.
- 5+ years of ETL (Extract, Transform, Load) programming experience.
- 3+ years of reporting experience, analytics experience or a combination of both.
- 4+ years of Hadoop development/programming experience.
- 5+ years of operational risk or credit risk or compliance domain experience.
- 5+ years of experience delivering ETL, data warehouse and data analytics capabilities on big-data architecture such as Hadoop.
- 6+ years of Java or Python experience.
- 5+ years of Agile experience.
- 5+ years of design and development experience with columnar storage using Parquet or ORC file formats on Hadoop.
- 5+ years of Apache Spark design and development experience using Scala, Java, or Python, including DataFrames and Resilient Distributed Datasets (RDDs).
- 2+ years of experience integrating with RESTful APIs.
- Excellent verbal, written, and interpersonal communication skills.
- Experience designing and developing data analytics solutions using object data stores such as S3.
- Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Spark and Apache Sqoop.
- Ability to work effectively in virtual environment where key team members and partners are in various time zones and locations.
- Ability to interact effectively and confidently with senior management.
- Knowledge and understanding of project management methodologies used in waterfall or Agile development projects.
- Knowledge and understanding of DevOps principles.
- A BS/BA degree or higher in information technology; a Master's degree is required.
- 4+ years of Specialty Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.