
Sr. Kafka Developer

  • Location: San Ramon, California, 94583
  • Job Type: Permanent

Posted 11 months ago

  • Job Ref: 183703

Sr. Kafka Developer, $180k
Permanent Position
San Ramon, California


The Kafka Developer will design and develop scalable, reliable real-time stream processing solutions using the Hortonworks DataFlow (HDF) product suite (NiFi/Kafka/Spark). The candidate will work directly with business partners to translate complex functional and technical requirements for streaming data ingestion solutions into detailed design and implementation plans.


Qualifications / Experience Requirements

  • Minimum of 15 years of software industry and integration solutions development experience
  • Minimum of 6 years of experience in the development and support of stream processing solutions on Hadoop technologies
  • Expert-level knowledge of Kafka and related technologies (Hive, Hadoop, Spark, Storm, NiFi, ZooKeeper, Ambari, Ranger)
  • Strong knowledge of SOAP, REST, and JMS architecture concepts and the publish/subscribe (pub/sub) pattern
  • Strong knowledge of web services security
  • Solid knowledge of Java SE, Java EE, XML, XML Schema (XSD), XSLT/XPath, and JSON technologies
  • Experience with CA API Gateway and/or IBM WebSphere MQ and/or Software AG WebMethods Integration Server is preferred.
  • Experience building and deploying microservices with containers (Docker) and orchestration platforms (Kubernetes)
  • Experience documenting and communicating integration designs aligned to business requirements
  • Experience designing queries and working with standard relational databases such as Oracle and MS SQL Server, and NoSQL stores such as Cassandra and Hive
  • Experience with DevOps tools such as Jenkins
  • Experience with source control (Git or SVN), bug tracking (Jira), and automated build tools (Jenkins)
  • Experience in the financial services industry preferred
  • Strong sense of ownership and passion for developing simple solutions for complex problems
  • Education: Bachelor’s Degree in Computer Science, Computer Science Engineering, or related field
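As a point of reference for the pub/sub pattern listed above, here is a minimal in-memory sketch in Java. The `PubSubBroker` class and the topic/message names are illustrative only, not tied to Kafka or any other product; a real broker adds persistence, partitioning, and delivery guarantees.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// Minimal in-memory publish/subscribe broker: publishers send messages to named
// topics, and every subscriber registered on a topic receives each message.
class PubSubBroker {
    private final Map<String, List<Consumer<String>>> subscribers = new ConcurrentHashMap<>();

    // Register a handler for all future messages on the given topic.
    void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new CopyOnWriteArrayList<>()).add(handler);
    }

    // Deliver a message to every subscriber of the topic (no-op if none).
    void publish(String topic, String message) {
        subscribers.getOrDefault(topic, List.of()).forEach(h -> h.accept(message));
    }
}

public class PubSubDemo {
    public static void main(String[] args) {
        PubSubBroker broker = new PubSubBroker();
        // Two independent subscribers on the same topic both receive the event.
        broker.subscribe("orders", msg -> System.out.println("billing got: " + msg));
        broker.subscribe("orders", msg -> System.out.println("shipping got: " + msg));
        broker.publish("orders", "order-42");
    }
}
```

The key property of the pattern, which Kafka generalizes with consumer groups and offsets, is that publishers and subscribers are decoupled: neither side knows about the other, only about the topic.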

In this important role you will:

  • Provide expertise and hands-on experience working with Kafka brokers and Kafka connectors
  • Create topics, set up cluster redundancy, deploy monitoring tools and alerts, and apply best practices
  • Work closely with EA and cross-functional technical resources to devise and recommend solutions based on the understood requirements
  • Work closely with Platform Engineering team to analyze complex distributed production deployments, and make recommendations to optimize performance
  • Provide input for capacity planning and sizing of streaming environment (Kafka/Nifi)
  • Implement application development lifecycle management using industry standard frameworks
  • Write and produce technical documentation, knowledgebase articles
  • Work with Production Support to troubleshoot service stability and message topic or delivery issues; perform data-related benchmarking, performance analysis, and tuning
  • Perform design & code reviews
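By way of illustration for the topic-creation and redundancy responsibilities above, creating a replicated topic is typically done with Kafka's standard `kafka-topics.sh` tool. This is a sketch only: it assumes a running Kafka cluster, and the broker address, topic name, and partition/replica counts are illustrative (older Kafka versions bundled with HDF address the cluster via `--zookeeper` instead of `--bootstrap-server`).

```shell
# Create a topic with 6 partitions, each replicated across 3 brokers
# (requires a running cluster; names and counts here are examples).
kafka-topics.sh --create \
  --bootstrap-server localhost:9092 \
  --topic orders-events \
  --partitions 6 \
  --replication-factor 3

# Inspect the partition leaders, replicas, and in-sync replica (ISR) sets.
kafka-topics.sh --describe \
  --bootstrap-server localhost:9092 \
  --topic orders-events
```

A replication factor of 3 is a common production choice: the cluster tolerates the loss of up to two brokers holding a partition's replicas before that partition becomes unavailable.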