Thursday, December 19, 2019

AZURE ARCHITECT | AZURE DATABRICKS - CINCINNATI, OH

This requirement is only for H1B / GC / US Citizens / GC EAD / H4 EAD / L2 EAD - NO OPT EAD
 
Job Title        Azure Architect | Azure Databricks
Location        Cincinnati, Ohio
Duration       6+ Months
 
Job Description      
Job Title*: Azure Architect | Azure Databricks
Work Location & Reporting Address*: Cincinnati, OH 45202
Does this position require visa-independent candidates only? No
Minimum years of experience*: 8+ years
Interview Process (is face-to-face required?): Yes (Skype/WebEx interview)
Certifications Needed: No
Rate*: C2C or W2, all-inclusive
 
Job Details:
Must Have Skills (Top 3 technical skills only)*
1. Azure
2. Azure Databricks
 
Detailed Job Description:
* Ownership of the overall POS/CDS landscape architecture
* In-depth understanding of various CDS processes, sources, and downstream systems
* Define standards for various ETL processes within the CDS landscape
* Provide best-in-class architecture solutions for various business requirements within the CDS landscape
* Review landscape changes
* Responsible for application optimizations (performance, stability, etc.)
* Define the future landscape
* Mentor team members in logical and technical solution design
 
Technical Expertise:
* Extensive experience in Azure PaaS services and security implementation.
* Knowledge of BI processes involving RDBMS is a must
* Proficient in:
o Linux
o Oracle 11g/12c
o The Azure stack (Databricks, ADLS, ADF, SQL DB, HDI (Spark))
o Hands-on PySpark, Python, AngularJS, and Java (MVC architecture)
* Working knowledge of Power BI, Airflow, and other peripheral Azure offerings.
* Experienced in SQL and databases (PostgreSQL, SQL Server)
* Git/Azure DevOps/Agile
 
Top 3 responsibilities you would expect the Subcon to shoulder and execute*:
1. Responsible for the overall solution to implement the CDL Data Quality Engine and the new user interface.
2. Individual implementation of the CDL Data Quality Engine and user interface components.
3. API service
 
The Data Quality Engine will have the following generic modules, created or reused from Turbine:
* API service
* Assessment module
* ML data anomaly
* Master data validation
* Validity module
* SOT validation
* User interface for dashboard and scorecard
* Notification service
* DevOps pipeline for PySpark, database, and Airflow components
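
For illustration only (not part of the client's specification): a minimal PySpark sketch of what one validity-check module in such a data quality engine might look like. All table paths, column names, and rules below are hypothetical.

# Minimal sketch of a validity-check module for a data quality engine.
# All paths, column names, and rules are hypothetical illustrations.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-validity-check").getOrCreate()

# Hypothetical POS transactions table landed in ADLS (e.g., via ADF).
df = spark.read.parquet("/mnt/adls/pos/transactions")

# Validity rules expressed as named boolean column expressions.
rules = {
    "non_null_store_id": F.col("store_id").isNotNull(),
    "positive_amount": F.col("amount") > 0,
    "valid_currency": F.col("currency").isin("USD", "CAD"),
}

# Count failures per rule; nulls count as failures via the otherwise() branch.
metrics = df.agg(
    *[F.sum(F.when(expr, 0).otherwise(1)).alias(name) for name, expr in rules.items()]
)
metrics.show()

A dashboard/scorecard UI or a notification service could then consume these per-rule failure counts.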
 
Thanks,
Tabitha Monisha Rayi
 
Office: 310-818-4424 Ext:157
E-Mail: tabitha@us.vedainfo.com
www.vedainfo.com
 
Certified Women-Owned Minority Business Enterprise (WMBE)
3868 Carson Street, Suite 204, Torrance, CA 90503 | Offices: USA, India, Australia, UK and New Zealand

