Wednesday 23 December 2020

ETL Data warehouse Lead | AWS @ Englewood, Colorado

Hi,

Please find the job description below and let me know if you have any consultants available:

Client: Infosys
Job Title: ETL Data warehouse Lead | AWS
Location: Englewood, Colorado
Position type: Contract
Contract duration: 7+ months, extension possible
Minimum years of experience required: 10+ years

Overview:
We need someone who knows data warehousing and has strong hands-on knowledge of AWS services; some Java experience would also be helpful.
We are seeking a seasoned ETL & Data Lead Engineer with a passion for hands-on design/development and collaboration with business partners.
The ETL Lead must have deep experience in Enterprise Data Warehouse and Data Mart design, development, and end-to-end execution of data solutions.
Prior experience in developing design patterns for Big Data and Cloud solutions is a big plus.
This role will work closely with the manager and other data leads in the areas of data integration, architecture/structure, business intelligence, analytics, and governance to execute business strategy.
This role will establish standards and tools across the data analytics and reporting platform, based on industry best practices.

Responsibilities:
• Collaborate with product managers, data scientists and analysts to prepare complex data sets that can be used to solve difficult problems
• Administer, maintain, and improve data infrastructure and the data processing pipeline, including ETL jobs, event processing, and job monitoring & alerting.
• Help define, implement, and reinforce data engineering best practices and processes
• Design and implement a highly scalable ELK (Elasticsearch, Logstash, and Kibana) stack in AWS
• Build Java-based applications (APIs) and RESTful web services using a microservices architecture.
• Work across all phases of the software development lifecycle in a cross-functional, agile development team setting
• Partner with engineers, data scientists, architects, and managers to define and refine our data architecture and technology choices.
• Design and develop programs that help provision complex enterprise data for analytics, reporting, and data science.
• Identify and recommend appropriate continuous improvement opportunities.
Qualifications:
• Experience developing back-end data warehouse technology solutions
• 5+ years of data warehouse experience with Oracle, PostgreSQL, etc.
• Demonstrated strength in complex SQL queries, data modeling, ETL development, and data warehousing
• Extensive experience working with AWS, with a strong understanding of S3, Snowflake, Athena, Lambda, EC2, etc.
• Experience in maintaining data warehouse systems and working on large-scale data transformation using Hadoop or other Big Data technologies
• Experience mentoring other Data Engineers
• 3+ years of hands-on experience with Java-based applications.
• 2+ years of experience working with Spring Boot, Spring REST APIs, Kafka, or other messaging technologies
• Experience in end-to-end low-level design, development, administration, and delivery of ELK/Java solutions.

Technology Stack:
Java 8, RESTful APIs, Microservices, Spring Framework, AWS (S3, Lambda, ECS, Fargate, etc.), Snowflake, Logstash, Kafka, Elasticsearch, Databricks, Git, IntelliJ, CI/CD, Jenkins, Docker, Rancher

Thanks & Regards
Abhishek Chellumala
VEDAINFO INC
Office: 310-929-1616 EXT 113
Direct: +1 310-589-4470
E-Mail: Abhishek@vedainfo.com

