Wednesday, March 24, 2021

KAFKA DEVELOPER @ Charlotte, North Carolina

NO OPT & CPT VISAS

Hello,
My name is Matt Adams and I am a Staffing Specialist at Veda Info Inc. I am reaching out to you about an exciting job opportunity with one of our clients.
 
Please find the job description (JD) below and, if interested, kindly reply with the following details:
 
Work authorization:
Expected pay rate (or employer details):
 
Job Title: KAFKA DEVELOPER
Work Location: Charlotte, North Carolina
Client: Infosys Technologies Limited
Contract duration: 6 months, with possible extension
 
Does this position require visa-independent candidates only? No
 
Job Details:
Must Have Skills:
·        Kafka Development
·        Hadoop Development
·        Python Scripting
·        Scala Development
·        Spark Development
 
Detailed Job Description:
Required Qualifications:
·        Candidate must be located within commuting distance of Charlotte, NC or be willing to relocate to the area. This position may require travel in the US and Canada.
·        Bachelor’s degree or foreign equivalent; work experience will be considered in lieu of a degree
·        4+ years of experience with Information Technology
·        3+ years of Kafka programming, with working knowledge of Big Data technologies (Hadoop, HBase, Hive, Scala, Spark, Python, etc.)
·        Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets using Kafka. Experience in Core Java and object-oriented programming (OOP).
·        Experience with Apache/Confluent Kafka components (Connect, Schema Registry, KSQL, Control Center, brokers, Kafka Streams)
·        Provide expertise and hands-on experience with Kafka connectors (MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors) and with Connect internals: tasks, workers, converters, and transforms.
·        Create stubs for producers, consumers, and consumer groups to help onboard applications written in different languages/platforms.
·        Leverage Hadoop ecosystem knowledge to design and develop capabilities that deliver solutions using Spark, Scala, Python, Hive, Kafka, and other Hadoop-ecosystem tools.
·        Build processes supporting data transformation, data structures, metadata, dependency and workload management.
·        A successful history of manipulating, processing, and extracting value from large, disconnected datasets. Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
·        Solid experience with end-to-end implementation of DW/BI projects, especially data warehouse and data mart development
·        Strong knowledge and hands-on experience in Unix shell scripting
·        Knowledge of and experience with the full software development life cycle (SDLC)
·        Experience supporting and working with cross-functional teams in a dynamic environment.
·        Able to independently debug issues and assist the production support team as needed
·        Experience with Lean / Agile development methodologies.
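For illustration only (this sketch is not part of the client's requirements): the producer/consumer-group semantics referenced in the bullets above can be shown with a toy in-memory stand-in for a Kafka topic. All names here are hypothetical, and a real stub would use a Kafka client library (e.g., confluent-kafka) against an actual broker.

```python
from collections import defaultdict

class Topic:
    """Toy in-memory stand-in for a Kafka topic: an append-only log
    plus one committed offset per consumer group (hypothetical sketch)."""

    def __init__(self):
        self.log = []                     # the "partition" log
        self.offsets = defaultdict(int)   # group id -> next offset to read

    def produce(self, value):
        """Append a record to the log, as a producer would."""
        self.log.append(value)

    def consume(self, group):
        """Return the next unread record for this group, or None.
        Each group tracks its own offset, so groups read independently."""
        pos = self.offsets[group]
        if pos >= len(self.log):
            return None
        self.offsets[group] += 1
        return self.log[pos]

topic = Topic()
topic.produce("order-1")
topic.produce("order-2")

# Two independent consumer groups each see the full log once.
print(topic.consume("billing"))    # order-1
print(topic.consume("billing"))    # order-2
print(topic.consume("analytics"))  # order-1
```

The point the sketch makes is the one the role implies: producers append, while each consumer group advances its own offset, so adding a new group (a new onboarded application) replays the topic without affecting existing consumers.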
 
Interview Process (Is face to face required?): No
Minimum years of experience: 10+ Years
Certifications Needed: No
 

If this job/position is not suitable for you, please refer a candidate and earn a referral bonus. The Vedainfo Referral Program is one of the best in the industry! Contact us for more details.
 
Thanks,
Matt Adams
Technical Recruiter
VEDAINFO INC.
Office: 310-929-1190 EXT: 156
Direct: +1 310-929-1190




