
ETL Developer

  • Location: Charlotte, North Carolina, 28202
  • Salary: $52.79/hour
  • Job Type: Contract


Terrific Long-Term Contract Opportunity with a FULL suite of benefits!

As one of the largest financial institutions in the world, our client has been around for over 150 years and continues to innovate in today's digital age. If you want to work for a company that is not only a household name but one that truly cares about meeting customers' financial needs and helping people succeed financially, apply today.

Position: ETL Developer
Location: Charlotte, NC
Term: 12 months

Day-to-Day Responsibilities:
In this technical role, you will:

  • Design high-performing data models on a big-data architecture, delivered as data services.
  • Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, and Amazon S3-based object storage.
  • Partner with enterprise data teams to identify the best sources for data.
  • Work with business analysts, development teams, and project managers to gather requirements and business rules.
  • Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts, and modelers to build scalable, performant data solutions.
  • Work effectively in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications.
  • Work with infrastructure engineers and system administrators, as appropriate, in designing the big-data infrastructure.
  • Work with DBAs in the Enterprise Database Management group to troubleshoot problems and optimize performance.
  • Support ongoing data management efforts across the Development, QA, and Production environments.
  • Apply a thorough understanding of available technology, tools, and existing designs.
  • Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage.
  • Act as an expert technical resource to programming staff throughout program development, testing, and implementation.

Is this a good fit? (Requirements):

  • 10+ years of application development and implementation experience
  • 10+ years of experience delivering complex, enterprise-wide information technology solutions
  • 10+ years of ETL (Extract, Transform, Load) programming experience
  • 10+ years of reporting experience, analytics experience, or a combination of both
  • 5+ years of Hadoop experience
  • 5+ years of operational risk, conduct risk, or compliance domain experience
  • 5+ years of experience delivering ETL, data warehouse, and data analytics capabilities on big-data architectures such as Hadoop
  • 5+ years of Java or Python experience
  • Excellent verbal, written, and interpersonal communication skills
  • Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations
  • Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
  • Knowledge and understanding of DevOps principles
  • Ability to interact effectively and confidently with senior management
  • Experience designing and developing data analytics solutions using object data stores such as S3
  • Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm