Location: Richardson, TX; Chicago, IL
Job Type: Permanent
This position is responsible for collaborating with the Solutions Engineering, Infrastructure Operations, and Infrastructure Service Management teams in the design and build of infrastructure solutions/blueprints for the area of responsibility; participating in the design and build of repeatable patterns (build-kits) to improve deployment times for non-production and production environments; and transitioning knowledge to Infrastructure Operations.
Required Job Qualifications:
- Bachelor's Degree and 5 years of Information Technology or relevant experience.
- Hands-on experience working with Hadoop distribution platforms such as Hortonworks, Cloudera, and others.
- Willingness to provide Hadoop on-call support when needed.
- Expertise in implementing and troubleshooting Hive, Spark, Pig, Storm, Kafka, NiFi, Elasticsearch, Solr, and HBase applications.
- Working knowledge of Ruby or Python and common DevOps tools such as Git and GitHub.
- Working experience installing, configuring, upgrading, and supporting Apache Kafka or Confluent Kafka for batch/streaming use cases in an enterprise cloud (Azure/AWS) environment.
- Experience developing or customizing messaging-related monitoring tools, utilities, alerts, and Kafka connectors.
- Experience setting up and configuring multi-broker Kafka clusters in a cloud fabric to meet high-availability requirements.
- Ability to install new Kafka clusters and troubleshoot Kafka-related issues in production environments within given SLAs.
- Capacity planning and implementation of new/upgraded hardware and software releases, as well as storage infrastructure.
- Working experience installing, configuring, upgrading, and supporting enterprise Databricks in an Azure/AWS cloud environment.
- Good understanding of Spark internals.
- Ability to analyze, performance-tune, and troubleshoot issues related to Databricks Spark.
- Experience integrating Databricks with Control-M/Zena, Delta Lake, ADLS Gen2/S3, and other related tools.
- Experience with a scripting language to automate infrastructure deployment tasks.
- Ability to simplify and standardize complex concepts and processes.
- Understanding of business priorities (e.g., vision), trends (e.g., industry knowledge), and markets (e.g., existing/planned).
- Strong oral and written communication skills.
- Ability to prioritize and make trade-off decisions.
- Ability to drive cross-functional execution.
- Adaptability and ability to introduce/manage change.
- Teamwork and collaboration.
- Organized and detail-oriented.
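As context for the multi-broker, high-availability Kafka requirements listed above, a minimal `server.properties` sketch for one broker might look like the following. All hostnames, paths, and values are illustrative assumptions, not details from this posting:

```properties
# Illustrative Kafka broker settings for a 3-broker HA cluster (values are assumptions)
broker.id=1                                   # must be unique per broker
listeners=PLAINTEXT://broker1.internal:9092   # hypothetical internal hostname
log.dirs=/var/lib/kafka/data
zookeeper.connect=zk1:2181,zk2:2181,zk3:2181  # ZooKeeper quorum
default.replication.factor=3                  # replicate each partition across brokers
min.insync.replicas=2                         # durable writes survive one broker loss
unclean.leader.election.enable=false          # prefer consistency over availability
```

With `replication.factor=3` and `min.insync.replicas=2`, producers using `acks=all` keep writing through the loss of a single broker, which is the kind of high-availability configuration the role describes.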
Preferred Job Qualifications:
- Background in Hadoop application infrastructure engineering and development methodology.
- Experience with Ambari, Hortonworks, and HDInsight.
- Experience with Cloudera Technology (Azure/AWS).
- Experience with streaming technologies (e.g., Kafka).
- Experience with Kerberos, TLS encryption, SAML, LDAP.
- Knowledge of cloud (Azure/AWS) big data solutions using EMR, HDInsight, Kinesis, Azure Event Hubs, etc.