Hi,

This is Sravan from Veda Info. I have the below requirement from one of our esteemed clients. Please respond to me if you are comfortable with it or if you have a consultant matching the requirement below.

Title: Big Data Architect
Location: Stamford, CT
Duration: 6 Months
Minimum years of experience: 5+ years

Must Have Skills
* Azure cloud knowledge
* Spark, Scala, Hive, HDFS, and Sqoop
* Azure HDInsight

Job Details:
* At least 7+ years of experience in the software development life cycle, with primary experience in DW/BI, big data, and related tools
* Strong knowledge of DW architecture, ETL frameworks, and designing solutions using big data technologies
* Proficiency in modern programming languages and tools: Python, XML, Informatica Big Data Management, Hortonworks, Hadoop, Spark, Scala, Hive, Oozie, and Sqoop; responsible for design and development of integration solutions with Hadoop/HDFS, data warehouses, and analytics solutions
* Hands-on experience with HDInsight and Spark clusters

Nice to Have Skills
* Azure HDInsight

Detailed Job Description
Strong knowledge of DW architecture, ETL frameworks, designing solutions using big data technologies, best practices, and troubleshooting. Proficiency in modern programming languages and tools: Python, XML, Informatica Big Data Management, Hortonworks, Hadoop, Spark, Scala, Hive, Oozie, and Sqoop. Responsible for design and development of integration solutions with Hadoop/HDFS, data warehouses, and analytics solutions.

Thanks & Regards,
Sravan Kumar
Office: 310-929-1616 Extn: 154
Direct: 310-929-1147