Big Data Developer

  • Location: Irving, Texas, 75063
  • Salary: 66.51
  • Job Type: Contract

Posted 29 days ago

Terrific Long-Term Contract Opportunity with a FULL suite of benefits!

As one of the largest financial institutions in the world, our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Position: Big Data Developer
Location: Irving, TX
Term: 12 months

Day-to-Day Responsibilities:

  • Design high-performing data models on big-data architecture as data services.
  • Design and build high-performing, scalable data pipeline platforms using Hadoop, Apache Spark, and Amazon S3-based object storage architecture.
  • Partner with enterprise data teams, such as Data Management & Insights and the Enterprise Data Environment (Data Lake), to identify the best place to source data.
  • Work with business analysts, development teams, and project managers on requirements and business rules.
  • Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts, and modelers to build scalable, performant data solutions.
  • Work effectively in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications.
  • Work with infrastructure engineers and system administrators as appropriate in designing the big-data infrastructure.
  • Work with DBAs in the Enterprise Database Management group to troubleshoot problems and optimize performance.
  • Support ongoing data management efforts for Development, QA, and Production environments.
  • Utilize a thorough understanding of available technology, tools, and existing designs.
  • Leverage knowledge of industry trends to build best-in-class technology that provides competitive advantage.
  • Act as an expert technical resource to programming staff in the program development, testing, and implementation process.
  • Lead, design, develop, test, and implement applications and system components, tools and utilities, models, simulations, and analytics to manage complex business functions using sophisticated technologies.
  • Resolve coding, testing, and escalated platform issues of a technically challenging nature.
  • Define opportunities across IT to maximize resource utilization and improve processes while reducing cost.
  • Ensure that systems are monitored to increase operational efficiency and managed to mitigate risk.
  • Mentor and train other members of the team.
  • Partner effectively with Management, Dev, QA, production support, and platform engineering teams.
  • Focus on building relevant capabilities in the organization to keep pace with demand and industry best practices.
  • Manage vendor/contractor partnerships to improve efficiency and effectiveness.
  • Design, code, test, debug, and document programs using Agile development practices.
  • Operate in restricted or niche domains such as Capital Markets, Quants, Artificial Intelligence, and Machine Learning.
  • Lead implementation of complex projects/initiatives in the above domains.
  • Lead the team to ensure compliance and risk management requirements for the supported area are met, and work with other stakeholders to implement key risk initiatives.
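The pipeline work described above follows the classic extract → transform → load pattern. As a rough illustration only, here is a minimal, dependency-free Python sketch of that shape; every function and field name is hypothetical, and a production version would use Spark DataFrames and S3-backed storage rather than in-memory lists:

```python
# Minimal ETL sketch. All names (extract, transform, load, "account",
# "balance", "high_value") are illustrative, not the client's schema.

def extract():
    # Stand-in for reading raw records from a source system
    # (e.g. a database export or an S3 object).
    return [
        {"account": "A1", "balance": "100.50"},
        {"account": "A2", "balance": "250.00"},
    ]

def transform(records):
    # Normalize types and derive a simple flag, the way a
    # business rule from the analysts might be applied.
    out = []
    for r in records:
        bal = float(r["balance"])
        out.append({"account": r["account"],
                    "balance": bal,
                    "high_value": bal > 200})
    return out

def load(records, sink):
    # Stand-in for writing to a warehouse table or object store.
    sink.extend(records)
    return len(records)

sink = []
loaded = load(transform(extract()), sink)
```

The same three-stage structure scales up directly: in a Spark-based pipeline each stage becomes a distributed read, a set of DataFrame transformations, and a partitioned write.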


Is this a good fit? (Requirements):

  • 10+ years of software engineering experience
  • 7+ years of experience in one or a combination of the following: securities, quantitative trading, artificial intelligence, or machine learning
  • 10+ years of application development and implementation experience
  • 10+ years of experience delivering complex, enterprise-wide information technology solutions
  • 10+ years of ETL (Extract, Transform, Load) programming experience
  • 10+ years of reporting experience, analytics experience, or a combination of both
  • 5+ years of Hadoop experience
  • 5+ years of operational risk, conduct risk, or compliance domain experience
  • 5+ years of experience delivering ETL, data warehouse, and data analytics capabilities on big-data architectures such as Hadoop
  • 5+ years of Java or Python experience
  • Excellent verbal, written, and interpersonal communication skills
  • Ability to work effectively in a virtual environment where key team members and partners are in various time zones and locations
  • Knowledge and understanding of project management methodologies used in waterfall or Agile development projects
  • Knowledge and understanding of DevOps principles
  • Ability to interact effectively and confidently with senior management
  • Experience designing and developing data analytics solutions using object data stores such as S3
  • Experience with Hadoop ecosystem tools for real-time and batch data ingestion, processing, and provisioning, such as Apache Flume, Apache Kafka, Apache Sqoop, Apache Flink, Apache Spark, or Apache Storm
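The ingestion tools named in the last requirement span real-time and batch processing; engines like Spark Streaming bridge the two with micro-batching, draining a stream in fixed-size chunks. A stdlib-only sketch of that idea, with a purely hypothetical in-memory queue standing in for a broker such as Kafka:

```python
from collections import deque

# Micro-batch sketch: drain an in-memory queue in fixed-size batches and
# process each batch, as streaming engines do at scale. The queue, batch
# size, and sum() "processing" step are illustrative only.

def micro_batches(queue, batch_size):
    while queue:
        # Take up to batch_size events per iteration.
        yield [queue.popleft() for _ in range(min(batch_size, len(queue)))]

events = deque(range(7))   # pretend these arrived from a message broker
processed = [sum(batch) for batch in micro_batches(events, 3)]
# batches drained: [0, 1, 2], [3, 4, 5], [6]
```

The design choice micro-batching makes is to trade a small amount of latency for batch-style throughput and simpler fault recovery, which is why it coexists with true record-at-a-time engines like Flink and Storm.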


Desired Qualifications:

  • Experience building Big Data Hadoop systems
  • Experience building optimized big-data pipelines for sourcing data from various databases
  • Experience building frameworks using Java, Scala, or Python
  • Experience working with Apache Spark and Apache Kafka
  • Master's degree or higher in computer science or finance
  • An industry-standard technology certification
  • Basic knowledge of industry regulations related to building