- Translate business requirements into technology solutions.
- Integrate the data pipelines with various upstream and downstream applications.
- Identify database structural requirements by evaluating client operations, applications, and programming.
- Design and build end-to-end data pipelines using AWS, Spark, and the Snowflake data warehouse.
- Develop Spark applications using Python (PySpark).
- Aid in gathering requirements, conducting business analysis, and writing technical design specifications.
- Understand conceptual, logical, and physical data models; define logical views and physical data security structures.
- Tune ETL performance through optimization, code quality, and standards.
- Define data frameworks and the standards around them.
- Perform problem determination (root cause analysis) and document the source of problems and their resolution. Recognize and analyze trends in errors to identify and implement long-term solutions.
- Adhere to development standards, best practices, and guidelines to ensure compatibility, scalability, and integration with other data platforms.
- Lead a development team following best practices: provide guidance and training, and conduct work reviews.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Communicate effectively with business users and customers.
- Review designs and code created by others and provide constructive feedback.
- Work with product owners to identify areas requiring decisions or clarity to define project scope, objectives, and requirements.
- Identify and assess risks due to changes in scope (business decisions/changes), resources, or timeline.
- Align architecture diagrams and process flows to project plan to identify and remediate gaps.
- Maintain open communication with all team members involved.
- 3-5+ years of Python development experience.
- In-depth cloud experience (AWS preferred).
- Extensive experience building cloud solutions using Spark.
- In-depth knowledge of data warehousing (data acquisition, data management, and data consumption) and cloud data platform ETL design on assigned projects.
- Data engineering technical acumen.
- Strong communication skills.
- Business intelligence experience.
- Cloud technology experience with AWS, Spark, and Snowflake.