Senior Data Architect (Remote)
The Senior Data Architect is responsible for overseeing the implementation of technical solutions for the ingestion, transformation, processing, and storage of data in large-volume/low-latency pipelines. This role requires an enterprise mindset to build out robust, high-performance technology solutions.
The Senior Data Architect position combines strong hands-on technical leadership with the management of matrixed resources to achieve business outcomes. This person should be as excited about developing code as they are about developing the skills and competencies of the team at large.
Duties and Responsibilities
- Using a variety of programming languages and tools to develop, test, and maintain data pipelines within the Platform Reference Architecture
- Working directly with management, product teams and practice personnel to understand their platform data requirements
- Maintaining a positive work atmosphere by behaving and communicating in a manner that encourages productive interactions with customers, co-workers and supervisors
- Developing and engaging team members by creating a motivating work environment that recognizes and rewards strong performance and holds team members accountable
- Fostering an innovative, inclusive and diverse team environment, promoting positive team culture, encouraging collaboration and self-organization while delivering high quality solutions
- Collaborating on an Agile team to design, develop, test, implement and support highly scalable data solutions
- Collaborating with product teams and clients to deliver robust cloud-based data solutions that drive tax decisions and provide powerful experiences
- Analyzing user feedback and activity and iterating to improve the services and user experience
- Securing data in alignment with internal information and data security policies, best practices and client requirements
- Creating and implementing robust cloud-based data solutions that scale effectively and provide powerful experiences for both internal teams and clients
- Performing unit tests and conducting reviews with other team members to make sure solutions and code are rigorously designed, elegantly coded and effectively tuned for performance
- Staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities and mentoring other members of the engineering community
Education and Experience
- Bachelor’s and/or master’s degree in a related field
- 10+ years of experience developing data technologies
- 10+ years of experience deploying ETL solutions in production environments
- 10+ years of experience developing cloud-based data services, preferably in AWS or Azure
- 10+ years of experience developing, and overseeing the development of, solutions in Python, Scala, Java, .NET, or similar
- 10+ years of experience in database/query tuning
- 10+ years of experience in mixed Windows/Linux environments
Additional Required Skills and Experience
- Proven track record of exceeding goals and consistently making sound decisions through a combination of analysis, experience, and judgment
- Fluency in one or more databases, preferably relational; NoSQL experience is a plus
- Fluency with distributed data platforms
- Knowledge of at least one AI/ML pipeline technology or platform
- Experience deploying, monitoring, and maintaining data pipelines in production environments
- Ability to design data-processing models that implement the intended business model
- Ability to develop diagrams representing key data entities and their relationships
- Ability to generate a list of components needed to build the designed system
- Ability to communicate clearly, simply, and effectively
- Commitment to diversity, accountability, transparency, and ethics
- Computer skills: intermediate knowledge of Microsoft Project, Word, Excel, Access, PowerPoint, Outlook, and internet navigation and research