Friday, April 3, 2020

Jaya-BigData Developer | Python @ Hillsboro, OR

 
Hello, 
 
Please find the JD below and kindly respond with your work authorization and expected rate:
 
 
Job Title* Technology Lead | Analytics - Packages | Python - Big Data

Work Location* Hillsboro, OR - 97124



Contract duration (in months) * 6

Target Start Date* 13 Apr 2020

Does this position require Visa independent candidates only? No

Job Details:

Must Have Skills (Top 3 technical skills only) *
1. Python
2. Apache NiFi
3. Kafka

Detailed Job Description:
The Senior Platform Engineer will work as part of an Agile scrum team that analyzes, plans, designs, develops, tests, debugs, optimizes, improves, documents, and deploys complex, scalable, highly available, distributed software platforms. A successful candidate will have a deep understanding of data flow design patterns, cloud environments, infrastructure-as-code, container-based techniques, and managing large-scale deployments.

Qualifications
* 5+ years' experience in developing systems software using common languages like Java, JavaScript, Golang or Python.
* Experience developing dataflow orchestration pipelines, streaming enrichment, metadata management, and data transformation using Apache NiFi. Additional experience with Kafka or Airflow is a plus.
* 5+ years building microservices/API-based, large-scale, full-stack production systems.
* Deep industry experience with cloud engineering problems around clustering, service design, scalability, resiliency, distributed backend systems, etc.
* 4+ years of experience with multi-cloud (AWS, Azure) and/or multi-region active-active environments, with a background in deploying and managing Kubernetes at scale.
* DevOps experience with a good understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifactory, Maven, etc.).
* Experience with various data stores (SQL, NoSQL, caches, etc.), such as Oracle, MySQL, or PostgreSQL.
* Experience with observability and with tools such as Elasticsearch, Prometheus, Grafana, or other open-source equivalents.
* Strong knowledge of distributed data processing and orchestration frameworks, and of distributed architectures such as Lambda and Kappa.
* Excellent understanding of Agile and SDLC processes spanning requirements, defect tracking, source control, build automation, test automation and release management.
* Ability to collaborate and partner with high-performing diverse teams and individuals throughout the firm to accomplish common goals by developing meaningful relationships.
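For context on the dataflow skills listed above, the "streaming enrichment" and "data transformation" work typically reduces to small per-record steps like the following minimal Python sketch. The event shape, field names, and `enrich_event` function are illustrative assumptions, not part of this posting:

```python
# Hypothetical sketch of a streaming enrichment step, similar in spirit to
# what a NiFi processor or Kafka consumer might apply per record.
from datetime import datetime, timezone


def enrich_event(event: dict, source: str) -> dict:
    """Attach provenance metadata to a raw event without mutating the input."""
    enriched = dict(event)  # copy so the caller's record is untouched
    enriched["_source"] = source  # illustrative metadata field
    enriched["_ingested_at"] = datetime.now(timezone.utc).isoformat()
    return enriched


if __name__ == "__main__":
    raw = {"user_id": 42, "action": "click"}
    print(enrich_event(raw, source="web"))
```

In a real pipeline this logic would run inside an orchestration framework (NiFi, Kafka Streams, Airflow tasks) rather than as a standalone script.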

Minimum years of experience*: 5+

Certifications Needed: No

Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
1. 5+ years building microservices/API-based, large-scale, full-stack production systems.
2. Deep industry experience with cloud engineering problems around clustering, service design, scalability, resiliency, distributed backend systems, etc.
3. 4+ years of experience with multi-cloud (AWS, Azure) and/or multi-region active-active environments, with a background in deploying and managing Kubernetes at scale.

Interview Process (Is face to face required?): No

Any additional information you would like to share about the project specs/ nature of work:
5+ years building microservices/API-based, large-scale, full-stack production systems. Deep industry experience with cloud engineering problems around clustering, service design, scalability, resiliency, distributed backend systems, etc. 4+ years of experience with multi-cloud (AWS, Azure) and/or multi-region active-active environments, with a background in deploying and managing Kubernetes at scale. DevOps experience with a good understanding of continuous delivery and deployment patterns and tools (Jenkins, Artifact
 

Jaya
Technical Recruiter
Vedainfo Inc
Office: 310-929-2578, Ext. 129
Direct: 310-929-2578
 
W: Vedainfo.com

Vedainfo Inc, Hawthorne Blvd, Suite B-01, Torrance, CA 90505
