Senior Data Architect

  • Location: Phoenix, AZ 85004
  • Salary: $119,000
  • Job Type: Permanent

Posted 16 days ago

Terrific REMOTE opportunity with a FULL suite of benefits

Job Title: Senior Data Architect
Location: Remote

Salary: $119,000 - $129,000 (DOE) + annual bonus and great benefits package
Type: Permanent

POSITION OVERVIEW

You will be a project leader on a global, fast-growing Data Architecture and Modeling team pursuing a vision of analytics-driven mining. Your expertise in data warehouse architecture, data modeling, and ETL will empower our organization to maintain a robust, trusted Enterprise Data Warehouse. You will collaborate closely with subject matter experts, data engineers, business intelligence analysts, data scientists, and software engineers to develop advanced, highly automated data products.

  • Agile Project Work: Serve as a project leader in cross-functional, geographically distributed agile delivery teams that continuously innovate analytic solutions.
    • Ensure the agile team delivers on time by defining clear goals, identifying appropriate development patterns, maintaining an execution plan, and actively driving implementation through the scrum process
    • Develop data requirements through data modeling techniques and structured working sessions. Express data requirements as third normal form (3NF) logical data models by reviewing source system documentation, reviewing system features, and holding workshop sessions.
    • Create physical database design for the Snowflake data warehouse and other database technologies in partnership with other technical resources
    • Develop documentation of Data Lineage and Data Dictionaries to create a broad awareness of the enterprise data model and its applications
    • Develop real-time/bulk data pipelines from a variety of sources (streaming data, APIs, data warehouse, messages, images, video, etc)
    • Partner with key business SMEs to build and manage the workgroup database view library by building relevant data shapes in SQL
    • Apply best practices within DataOps (Version Control, P.R. Based Development, Schema Change Control, CI/CD, Deployment Automation, Test Automation, Shift left on Security, Loosely Coupled Architectures, Monitoring, Proactive Notifications)
  • Problem Solving/Project Management: Provide thought leadership in problem solving to enrich possible solutions by constructively challenging paradigms and actively soliciting other opinions. Actively participate in R&D initiatives
  • Architecture: Ensure the project team utilizes modern cloud technologies, follows established design patterns, and employs DevOps/DataOps best practices to produce enterprise-quality production Python and SQL code with minimal errors. Identify code optimization opportunities during code review sessions, direct their implementation, and proactively pull in external experts as needed.
  • Self-Development: Flexibly seek out new work or training opportunities to broaden experience. Independently research the latest technologies and openly discuss applications within the department. Actively coach and mentor junior team members.
  • Perform other duties as requested.

QUALIFICATIONS

Minimum Requirements:

  • Bachelor’s degree in engineering, computer science, analytical field (Statistics, Mathematics, etc.) or related discipline and five (5) years of relevant work experience
    OR
  • Master’s or Ph.D. in engineering, computer science, analytical field (Statistics, Mathematics, etc.) or related discipline and three (3) years of relevant work experience
  • Knowledge of data modeling using IDEF1X or similar methodologies
  • Knowledge of data modeling tools such as ER Studio or Erwin
  • Proficient practitioner of SQL development
  • Experience leading joint design sessions and working in groups
  • Strong verbal and written communication skills in English

Preferred Qualifications:

  • Proficient practitioner of Python development
  • Working knowledge of Software Engineering and Object-Oriented Programming principles
  • Working knowledge of Parallel Processing Environments such as Snowflake or Spark SQL
  • Working knowledge of problem solving/root cause analysis on Production workloads
  • Working knowledge of Agile, Scrum, and Kanban
  • Working knowledge of enterprise scheduling and workflow orchestration using tools such as Airflow, Prefect, Dagster, or similar tooling
  • Working knowledge of CI/CD and automation tools such as Jenkins or Azure DevOps

Criteria/Conditions:

  • Our client promotes a drug/alcohol-free work environment through the use of mandatory pre-employment drug testing and ongoing random drug testing as allowed by applicable state laws