
Senior Data Engineer

  • Location: San Francisco, 94111
  • Job Type: Contract

Posted about 1 year ago

San Francisco, CA (Bay Area) – Hybrid Remote

Target Start Date: September 2022

What you’ll do as a Senior Data Engineer:

The Senior Data Engineer works with product managers and business/data analysts to implement data-driven solutions that solve complex business analytics problems. The role supports the Enterprise Data and Client Insights department in building centralized data assets for banking operations, the sales organization, and cross-functional analytics and reporting. The individual will work with data SMEs, business SMEs, product managers, and technical teams to develop and deliver data-driven solutions for users who rely on the enriched information for analytics, quantitative modeling, and internal and external reporting.

Primary Responsibilities:

  • Build and deliver data-driven products and solutions for Loan Originations and Client Interactions that serve as the centralized single source of truth for analytics, modeling, and reporting.
  • Develop data models and structures to support client reporting and banker facing applications.
  • Develop a Data Quality and Reconciliation framework.
  • Integrate Total Plus and FIS-IBS data into a single repository to facilitate consumption of data by downstream applications such as Deposit Services, Lending Services, Digital Banking, BSA/AML etc.
  • Design semantic layer to support static reports and dynamic dashboards.
  • Participate in the design and integration of Real Time Transactions data.
  • Design and develop data pipelines using SnowSQL and Python. Familiarity with core Python packages such as Pandas, NumPy, and Matplotlib, as well as TensorFlow and PyTorch, is expected.
  • Be well versed in the end-to-end API development life cycle; experience deploying APIs on OpenShift is desirable, as is experience building APIs with a Python framework.
  • Multidisciplinary work supporting data pipelines, data warehouses and reporting services.
  • Design and develop data movements using Snowflake capabilities such as SnowSQL, Tasks, Streams, Time Travel, Data Sharing, and stored procedures.
  • Follow data standards, resolve data issues, complete unit testing and system documentation for ETL processes.
  • Collaborate with IT operations and testing organizations to ensure timely project releases, sustainable database environments, and smooth code migrations to production.
  • Collaborate with business analysts, subject matter experts, and other team members to determine data extraction and transformation requirements.
  • Use Big Data technologies such as Kafka, Snowflake and related technologies to store, curate, process and publish datasets for consumption by downstream business users and applications.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability etc.
  • Create solutions utilizing industry standard dimensional models and data architecture to support business users that include data scientists, analysts and non-technical functional users.
  • Respond quickly to bug fixes and enhancement requests and be able to take directions and complete tasks on-time with minimal supervision.
  • Follow coding best practices: unit testing, design/code reviews, documentation, etc.
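As an illustration of the data quality and reconciliation work described above, here is a minimal pandas sketch of comparing extracts from two source systems (for example, Total Plus and FIS-IBS). All table and column names are invented for illustration; a production framework would run against the Snowflake repository itself.

```python
# Hypothetical reconciliation sketch: find records that exist in only one
# of two source-system extracts. Column names ("acct_id", "balance_*")
# are invented for this example.
import pandas as pd

def reconcile(source_a: pd.DataFrame, source_b: pd.DataFrame, key: str) -> pd.DataFrame:
    """Outer-join two extracts on `key` and return rows missing from either side."""
    merged = source_a.merge(
        source_b, on=key, how="outer", suffixes=("_a", "_b"), indicator=True
    )
    # The `_merge` indicator column is 'both', 'left_only', or 'right_only'.
    return merged[merged["_merge"] != "both"]

# Toy extracts from the two systems
a = pd.DataFrame({"acct_id": [1, 2, 3], "balance_a": [100, 200, 300]})
b = pd.DataFrame({"acct_id": [2, 3, 4], "balance_b": [200, 300, 400]})

breaks = reconcile(a, b, "acct_id")
print(breaks["acct_id"].tolist())  # accounts present in only one system: [1, 4]
```

The same outer-join-and-flag pattern extends naturally to value-level checks (e.g., comparing balances where the key matches) before loading reconciled data downstream.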

You could be a great fit if you have:

  • Requires 7+ years of progressive experience in developing and supporting data and reporting projects.
  • Requires 5+ years of in-depth experience using ETL tools like DataStage, Informatica, and SnowSQL, and BI tools including QlikView, Power BI, Tableau, etc.
  • Requires 2+ years of experience using Snowflake capabilities such as SnowSQL, Tasks, Streams, Time Travel, Data Sharing, and stored procedures.
  • Requires 2+ years of experience in Python, Anaconda and API development.
  • Expertise in operational data stores and real time data integration is preferred.
  • Must possess excellent analytical and communication skills and the ability to clearly articulate data pipeline solutions.
  • Ability to multi-task and work on ad hoc requests, such as point analytics or reports, to support product managers and business teams.
  • Ability to think outside the box to identify solutions that further the initiative and/or improve the effectiveness of data usage for analytics and reporting.


Candidates will be responsible for following the client's COVID-19 protocols. Please refer to your MATRIX representative for specifics.

About Us

At MATRIX, we expertly match talented professionals with job opportunities to elevate careers. Since 1983, we have placed thousands of professionals at innovative clients across every industry ranging from small startups to Fortune 50 companies. It’s why we’re a top 15 U.S. IT staffing firm and why our consultants rate us well above the industry average. People come to us for a job, and stay with us because of our top-notch consultant care.

For hourly W2 contract roles, MATRIX offers a highly competitive benefit package including Medical, Dental, Vision, Life, Disability, HSA, and 401(k) with pre and post-tax options. Please see for more information. For direct hire placement with our clients, benefits will be offered in accordance with that particular client’s offerings. This may include PTO, Medical, Dental, Vision, 401K and other pre and post-tax options.

Motion Recruitment Partners is an Equal Opportunity Employer, including Veterans/Disability/Women. All applicants must be currently authorized to work on a full-time basis in the country for which they are applying, and no sponsorship is currently available. Accommodation will be provided in all parts of the hiring process as required under Motion Recruitment Employment Accommodation policy. Applicants need to make their needs known in advance.