Thursday, 10 May 2018

Job requirement for Hadoop Developer

Job Title         : Hadoop Developer
Work Location     : Jersey City, NJ
Duration          : 6+ months contract

Professional Experience

·         Solid working knowledge of next-generation data and analytics capabilities built on Spark/Scala, Kafka, and the Hadoop ecosystem.

·         Understanding of design patterns, application architectures, analytics workloads, and performance drivers for Spark/Scala jobs.

·         Ability to tune Spark/Scala solutions to improve performance and end-user experience.

·         Familiarity with data warehousing, ETL, BI, visualization tools, and machine learning.

·         Excellent problem-solving and hands-on engineering skills, strong communication skills, and the ability to lead a team and resolve its issues.

·         Interact with the client to understand requirements and provide status reports.

Preferred Professional Experience

·         Knowledge of or experience with AWS EMR.

·         Experience in application development and deployment is a plus.

·         Cloudera/Hortonworks/Spark certification is a plus (not mandatory).

·         Core Java knowledge is a plus.

·         Experience with Maven, Git, and sbt.

Technical Skills Required

·         Any combination of the technical skills below:

·         Big Data: HDFS, Spark, Scala

·         Processing & Streaming: Spark 2.x, Kafka

·         NoSQL: Couchbase

·         RDBMS: PostgreSQL

·         Languages & platforms: Java, Scala, Perl/Python/PHP; Linux, Apache


