Big Data Engineer

  • Location: Philadelphia, 19130
  • Salary: $92.89 - $102.59 / hour
  • Job Type: Contract

Posted 9 months ago

Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!

Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name but one that truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Position: Lead Specialty Software Engineer
Location: Philadelphia, Pennsylvania, 19130
Term: 12 months

Day-to-Day Responsibilities:

  • Lead complex initiatives on selected domains.
  • Ensure systems are monitored to increase operational efficiency and managed to mitigate risk.
  • Define opportunities to maximize resource utilization and improve processes while reducing cost.
  • Lead, design, develop, test and implement applications and system components, tools and utilities, models, simulation, and analytics to manage complex business functions using sophisticated technologies.
  • Resolve coding, testing and escalated platform issues of a technically challenging nature.
  • Lead team to ensure compliance and risk management requirements for supported area are met and work with other stakeholders to implement key risk initiatives.
  • Mentor less experienced software engineers.
  • Collaborate and influence all levels of professionals including managers.
  • Lead team to achieve objectives.
  • Partner with production support and platform engineering teams effectively.


Is this a good fit? (Requirements):

  • Skills:
    • Data Pipeline Product development – hands-on experience designing and building.
    • Google Cloud Big Data Specialty – hands-on experience.
    • Hadoop Services Development – hands-on experience.
    • Python and Spark Programming – expert level.
    • Airflow Custom Operator Development – experience.
    • Debezium Open Source CDC – experience is a plus.
    • Kafka development.
    • CI/CD, DevOps, QA Automation.
    • Strong Communicator and Leader.
    • 10+ years of information architecture and data modeling experience.
    • 10+ years of BI/DW and data modeling experience.
    • Knowledge of Data Warehousing, ETL and BI architectures, concepts and frameworks.
    • Deep knowledge of the Kimball, Inmon, and Data Vault approaches to data warehouse modeling and their core differences.
    • Experience with data warehouse/data mart/NoSQL modeling principles and methods, including conceptual, logical, and physical data models.
    • Deep experience with time-series, dimensional, column-oriented, event sourced, and semantic data modeling patterns.
  • Additional Skills:
    • Optimizing physical models based on use case needs and data access patterns, considering performance, volumes, and complexity of queries.
    • Capable of facilitating data discovery sessions involving business subject matter experts.
    • Experience in translating/mapping relational data models into Data Schemas.
    • Experience with modeling tools (such as ERWin and ER/Studio).
    • Ability to create and maintain conceptual/business, logical and physical data models.
    • Ability to translate a logical model into a physical model, adding the physical objects needed to create the database (e.g., indexes).
    • Experience maintaining a model repository, along with enforcing logical model versioning, tied to physical schema evolution.
    • Ability to synchronize models to ensure that database structures match models.
    • Familiar with metadata ingestion and translation.
    • Excellent presentation, communication, and organizational skills.
    • Strong interpersonal skills and ability to work as part of a team.
    • Familiar with Data Governance Process Management.
    • Familiarity with Domain-Driven Design-based data product and Data Mesh architecture concepts for developing highly interoperable data products.
    • Cyber Security domain knowledge.
    • Data modeling in GCP/Azure cloud environments is a bonus.
  • 5+ years of Specialty Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.