Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!
Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name but also one that truly cares about meeting customers' financial needs and helping people succeed financially, apply today.
Position: Big Data System Designer
Location: Charlotte, North Carolina
Term: 12 months
Responsibilities:
- Design high-performing data models on big-data architecture, delivered as data services.
- Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, and Amazon S3-based object storage.
- Work with business analysts, development teams, and project managers to gather requirements and business rules.
- Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts, and modelers to build scalable, performant data solutions.
- Work effectively in a hybrid environment where legacy ETL and data warehouse applications coexist with new big-data applications.
- Work with infrastructure engineers and system administrators, as appropriate, when designing the big-data infrastructure.
- Support ongoing data management efforts across the development, QA, and production environments.
- Leverage knowledge of industry trends to build best-in-class technology that provides a competitive advantage.
Requirements:
- 10+ years of application development and implementation experience.
- 10+ years of experience delivering complex enterprise-wide information technology solutions.
- Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm.
- 5+ years of experience delivering ETL, data warehouse, and data analytics capabilities on big-data architectures such as Hadoop.
- 5+ years of Java or Python experience.
- 10+ years of ETL (Extract, Transform, Load) Programming experience.
- 10+ years of reporting experience, analytics experience, or a combination of both.
- 5+ years of Hadoop experience.
- 5+ years of operational risk, conduct risk, or compliance domain experience.
- Excellent verbal, written, and interpersonal communication skills.
- Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations.
- Knowledge and understanding of project management methodologies used in waterfall or Agile development projects.
- Knowledge and understanding of DevOps principles.
- Ability to interact effectively and confidently with senior management.
Additional responsibilities:
- Lead complex initiatives in selected domains.
- Ensure systems are monitored to increase operational efficiency and managed to mitigate risk.
- Define opportunities to maximize resource utilization and improve processes while reducing cost.
- Lead the design, development, testing, and implementation of applications and system components, tools and utilities, models, simulations, and analytics that manage complex business functions using sophisticated technologies.
- Resolve coding, testing and escalated platform issues of a technically challenging nature.
- Lead the team to ensure compliance and risk management requirements for the supported area are met, and work with other stakeholders to implement key risk initiatives.
- Mentor less experienced software engineers.
- Collaborate with and influence professionals at all levels, including managers.
- Lead the team to achieve its objectives.
- Partner effectively with production support and platform engineering teams.
Is this a good fit? (Requirements):
- 5+ years of Specialty Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.