
ETL Consultant

  • Location: Charlotte, NC 28202
  • Salary: $53.20/hour
  • Job Type: Contract

Posted about 1 month ago

Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!

Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name but one that truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Position: ETL Consultant
Location: Charlotte, North Carolina
Term: 12 months

Day-to-Day Responsibilities:

  • Design high-performing data models as data services on big-data architecture.
  • Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, and Amazon S3-based object storage (see the sketch after this list).
  • Work with business analysts, development teams, and project managers to gather requirements and business rules.
  • Collaborate with source system and approved provisioning point (APP) teams, Architects, Data Analysts and Modelers to build scalable and performant data solutions.
  • Effectively work in a hybrid environment where legacy ETL and Data Warehouse applications and new big-data applications co-exist.
  • Work with Infrastructure Engineers and System Administrators as appropriate in designing the big-data infrastructure.
  • Support ongoing data management efforts for Development, QA and Production environments.
  • Leverage knowledge of industry trends to build best-in-class technology that provides a competitive advantage.
  • Lead moderately complex initiatives and deliverables within technical domain environments.
  • Contribute to large-scale strategic planning.
  • Design, code, test, debug, and document projects and programs associated with the technology domain, including upgrades and deployments.
  • Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures.
  • Resolve moderately complex issues and lead a team to meet existing and potential new clients' needs, leveraging a solid understanding of the function, policies, procedures, and compliance requirements.
  • Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals.
  • Lead projects, act as an escalation point, and provide guidance and direction to less experienced staff.
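
To give candidates a concrete feel for the pipeline work described above, here is a minimal PySpark batch sketch: read raw data from S3, apply a couple of business rules, and write a partitioned output. It assumes a Spark environment with S3 (s3a) access already configured; the bucket names, paths, and column names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Build or reuse a Spark session for the job.
spark = SparkSession.builder.appName("etl-pipeline-sketch").getOrCreate()

# Ingest raw transaction files landed in S3-based object storage.
raw = spark.read.parquet("s3a://raw-bucket/transactions/")

# Example business rules: derive a date column, flag high-value rows,
# and de-duplicate on the transaction key.
curated = (
    raw.withColumn("txn_date", F.to_date("txn_timestamp"))
       .withColumn("is_high_value", F.col("amount") > 10000)
       .dropDuplicates(["txn_id"])
)

# Partition the curated output by date so downstream queries can prune files.
(curated.write
        .mode("overwrite")
        .partitionBy("txn_date")
        .parquet("s3a://curated-bucket/transactions/"))

Partitioning by a date column is a common design choice on Hadoop/S3 architectures because it lets query engines skip irrelevant files entirely.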


Is this a good fit? (Requirements):

  • 7+ years of application development and implementation experience.
  • 7+ years of experience delivering complex enterprise-wide information technology solutions.
  • Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm (see the streaming sketch after this list).
  • 5+ years of experience delivering ETL, data warehouse, and data analytics capabilities on big-data architectures such as Hadoop.
  • 5+ years of Java or Python experience.
  • 7+ years of ETL (Extract, Transform, Load) programming experience.
  • 7+ years of reporting experience, analytics experience, or a combination of both.
  • 5+ years of Hadoop experience.
  • 5+ years of operational risk, conduct risk, or compliance domain experience.
  • Excellent verbal, written, and interpersonal communication skills.
  • Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations.
  • Knowledge and understanding of project management methodologies used in waterfall or Agile development projects.
  • Knowledge and understanding of DevOps principles.
  • Ability to interact effectively and confidently with senior management.
  • 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, or education.
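
For the real-time ingestion tools listed above, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and lands micro-batches in object storage. The broker address, topic name, schema, and paths are hypothetical; it assumes Spark is launched with the Kafka connector package available.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("streaming-ingest-sketch").getOrCreate()

# Expected shape of each JSON event on the topic (hypothetical schema).
schema = StructType([
    StructField("txn_id", StringType()),
    StructField("amount", DoubleType()),
])

# Subscribe to the Kafka topic; Kafka delivers each payload as raw bytes.
events = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker:9092")
               .option("subscribe", "transactions")
               .load())

# Decode and parse the JSON payload into typed columns.
parsed = events.select(
    F.from_json(F.col("value").cast("string"), schema).alias("e")
).select("e.*")

# Append each micro-batch to S3; the checkpoint lets the stream
# recover its position after a restart.
query = (parsed.writeStream
               .format("parquet")
               .option("path", "s3a://raw-bucket/transactions/")
               .option("checkpointLocation", "s3a://raw-bucket/_checkpoints/txn/")
               .outputMode("append")
               .start())
query.awaitTermination()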