Hadoop Developer

  • Location: Addison, Dallas, Texas, 75001
  • Salary: 74.43
  • Job Type: Contract

Posted 22 days ago

Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefits package!

Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Position: Financial Systems Engineer
Location: Addison, Texas, 75001
Term: 10 months

Day-to-Day Responsibilities:

  • Leads, designs, develops, tests, and implements applications and system components, tools and utilities, models, simulations, and analytics to manage complex business functions using sophisticated technologies.
  • Resolves coding, testing and escalated platform issues of a technically challenging nature.
  • Responsible for defining opportunities across IT to maximize resource utilization and improve processes while reducing cost.
  • Ensures that systems are monitored to increase operational efficiency and managed to mitigate risk.
  • Mentors and trains other members of the team.
  • Partners effectively with management, development, QA, production support, and platform engineering teams.
  • Focuses on building relevant capabilities in the organization to keep pace with demand and best practices in the industry.
  • Manages vendor/contractor partnerships to improve efficiency and effectiveness.
  • Designs, codes, tests, debugs and documents programs using Agile development practices.
  • Operates in restricted or niche domains such as Capital Markets, Quants, Artificial Intelligence, and Machine Learning; leads implementation of complex projects and initiatives in these domains.
  • Understands and leads the team to ensure compliance and risk management requirements for the supported area are met, and works with other stakeholders to implement key risk initiatives.
  • Mentors lower-level team members.


Is this a good fit? (Requirements):

  • Spark internals
  • Hadoop expertise
  • Good architectural knowledge of deployment patterns on VMs, clusters, Kubernetes, etc.
  • Proficiency in GCP
  • 10+ years of software engineering experience
  • 7+ years of experience in one or a combination of the following: securities, quantitative trading, artificial intelligence, or machine learning
  • Master's degree or higher in computer science or finance
  • An industry-standard technology certification
  • Basic knowledge of industry regulations related to building technological solutions