Sr Data Engineer
Remote
12+ Years
Skills: ETL, DataStage, Python, CI/CD
About the Data Engineer Role
As a Data Engineer with an ETL/ELT background, you will design and develop reusable data ingestion processes from a variety of sources and build data pipelines and reporting processes for the Snowflake cloud data warehouse platform. The ideal candidate is proficient in writing SQL that can be embedded into dbt (Data Build Tool) for data transformation, and is able to understand legacy ETL components developed in DataStage, SQL scripts, and shell scripts. Familiarity with AWS and the Snowflake CDW platform is a big plus. The ideal candidate should also be willing to learn open-source technologies on the fly as the job demands.
Responsibilities
• Design, develop, and implement ETL/ELT processes for the Snowflake cloud data warehouse platform
• Understand legacy DataStage ETL components and rewrite them as SQL and dbt (Data Build Tool) components
• Work closely with other data engineering teams to ensure alignment on methodologies and best practices
• Leverage common components and CI/CD pipelines
Requirements
• Bachelor's or Master's degree in Computer Science.
• 10+ years of hands-on software engineering experience.
• 2 years of strong ELT experience using dbt (Data Build Tool), loading into a cloud data warehouse such as Snowflake or Redshift.
• 3+ years of strong ETL experience with DataStage, Informatica, Ab Initio, Talend, or similar tools is a plus.
• Experience designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architectures as well as high-scale or distributed RDBMSs.
• Strong database fundamentals including SQL, performance, and schema design.
• Proficiency in SQL coding
• Ability to read and write Python scripts is a plus.
• Experience with the AWS platform is a big plus.
• Experience with Git.
• Experience building CI/CD pipelines is a plus.
• Ability to work in a fast-paced agile development environment.
Murthy Chavali
murthy@vedainfo.com
310-589-4458