Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!
Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Position: Senior Software Engineer
Location: Charlotte, NC; Minneapolis, MN; St. Louis, MO
Term: 12 months
- Act as an expert in designing, building, and operationalizing large-scale enterprise data applications using one or more GCP (Google Cloud Platform) data and analytics services in combination with third-party tools – Apache Spark, Hive, Dataproc, Dataflow, Apache Beam/Cloud Composer, Bigtable, BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Functions & GitHub
- Act in the highest-level technical role as an individual contributor and/or team lead for the most complex computer applications and/or application initiatives, utilizing a thorough understanding of available technology, tools, and existing designs
- Work on the most complex problems, where analysis of situations or data requires evaluation of intangible variance factors; plan, perform, and act as the escalation point for the most complex platform designs, coding, and testing; lead the most complex modeling, simulation, and analysis efforts
- Act as an expert technical resource to programming staff in the program development, testing, and implementation process
- Design and build data services on container-based architectures such as Kubernetes and Docker
- Partner with enterprise data teams such as Data Management & Insights and the Enterprise Data Environment (data lake) to identify the best place to source the data
- Work with business analysts, development teams, and project managers to gather requirements and business rules
- Collaborate with source-system teams, architects, data analysts, and data modelers to build scalable and performant data solutions
- Effectively work in a hybrid environment where legacy ETL and Data Warehouse applications and new big-data applications co-exist
- Work with Infrastructure Engineers and System Administrators as appropriate
- Work with DBAs in Enterprise Database Management group to troubleshoot problems and optimize performance
- Support ongoing data management efforts for Development, QA and Production environments
- Leverage knowledge of industry trends to build best-in-class technology that provides a competitive advantage
- Lead moderately complex initiatives within Technology and contribute to large-scale data processing framework initiatives related to enterprise strategy deliverables
- Build and maintain optimized and highly available data pipelines that facilitate deeper analysis and reporting
- Review and analyze moderately complex business, operational or technical challenges that require an in-depth evaluation of variable factors
- Oversee the data integration work, including developing a data model, maintaining a data warehouse and analytics environment, and writing scripts for data integration and analysis
- Resolve moderately complex issues and lead teams to meet data engineering deliverables while leveraging solid understanding of data information policies, procedures, and compliance requirements
- Collaborate and consult with colleagues and managers to resolve data engineering issues and achieve strategic goals
- Lead moderately complex initiatives and deliverables within technical domain environments
- Contribute to large-scale planning of strategies
- Design, code, test, debug, and document projects and programs associated with the technology domain, including upgrades and deployments
- Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
- Resolve moderately complex issues and lead a team to meet existing or potential new client needs while leveraging a solid understanding of the function's policies, procedures, and compliance requirements
- Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
- Lead projects, act as an escalation point, and provide guidance and direction to less experienced staff
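The pipeline responsibilities above follow a familiar extract-transform-load pattern. As a purely illustrative sketch (not part of the posting), using only the Python standard library as a stand-in for the GCP services named here:

```python
import csv
import io
from collections import defaultdict

# Illustrative stand-in for a pipeline stage: in a real GCP deployment the
# source might be Cloud Storage and the sink a BigQuery table; here both
# are in-memory, and the data below is invented for the example.
RAW = """account,amount
A-100,250.00
A-100,125.50
B-200,980.25
"""

def extract(text):
    """Parse CSV text into dict records (stand-in for reading a source)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast amounts to integer cents to avoid float rounding in aggregates."""
    return [{"account": r["account"], "cents": round(float(r["amount"]) * 100)}
            for r in rows]

def load(rows):
    """Aggregate per account (stand-in for writing to a warehouse table)."""
    totals = defaultdict(int)
    for r in rows:
        totals[r["account"]] += r["cents"]
    return dict(totals)

totals = load(transform(extract(RAW)))
print(totals)  # {'A-100': 37550, 'B-200': 98025}
```

In a production pipeline each stage would typically be a Beam transform or Dataflow step rather than a plain function, but the shape of the work is the same.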
Is this a good fit? (Requirements):
- 8+ years of Data Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, or education
- A BS/BA degree or higher
- 8+ years of experience in Informatica ETL, Oracle and Unix
- 5+ years' experience migrating legacy data platforms to the cloud and modern architectures
- 5+ years' experience with Google Cloud Platform (GCP), BigQuery, and building GCP-based data lakes
- 5+ years of advanced SQL, PL/SQL, and Python skills
- 5+ years' experience with one or more of the following: Pub/Sub, Cloud Functions, Dataflow, Dataproc (Hadoop, Spark, and Hive), Cloud Datastore, and BigQuery
- 5+ years' expertise in traditional Oracle and NoSQL databases
- 3+ years of Agile experience using Scrum/Kanban
- 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, or education
- 4+ years of AutoSys experience
- 4+ years of experience with RESTful or SOAP web services
- Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations
- Knowledge and understanding of DevOps principles
- Experience designing and developing data analytics solutions using object data stores such as S3
- Experience in infrastructure, software configuration, and environment management
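For context on the "advanced SQL" requirement: the analytical work typically involves window functions over partitioned data. A minimal, hedged illustration using Python's built-in sqlite3 (the table and column names are invented for the example; BigQuery supports the same OVER/PARTITION BY syntax):

```python
import sqlite3

# Hypothetical transactions table; names and values are purely illustrative.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE txn (account TEXT, ts INTEGER, amount REAL)")
con.executemany("INSERT INTO txn VALUES (?, ?, ?)",
                [("A", 1, 10.0), ("A", 2, 5.0), ("B", 1, 7.0)])

# Window function: running total per account, ordered by timestamp.
rows = con.execute("""
    SELECT account, ts,
           SUM(amount) OVER (PARTITION BY account ORDER BY ts) AS running
    FROM txn
    ORDER BY account, ts
""").fetchall()
print(rows)  # [('A', 1, 10.0), ('A', 2, 15.0), ('B', 1, 7.0)]
```

The same query pattern scales from a local SQLite scratchpad to a warehouse table; only the connection and dialect details change.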