Role: Spark Scala Developer | Experience: 4-8 Years | Skills Required: Spark, Scala | Location: Mumbai
Desired Competencies (Technical/Behavioral)
Must-Have
1. Experience in designing and developing Big Data solutions using Hadoop ecosystem components such as HDFS, Spark, Hive, the Parquet file format, YARN, MapReduce, and Sqoop
2. Experience in writing and optimizing complex Hive and SQL queries to process large data volumes, including UDFs, tables, joins, and views
3. Experience in debugging Spark code
4. Working knowledge of basic UNIX commands and shell scripting
5. Experience with Autosys and Gradle
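As an illustration of the kind of Spark/Hive work described above, here is a minimal Scala sketch. It assumes a Spark runtime with Hive support; the database, table, column, and path names are hypothetical, chosen only to show the pattern of reading a Hive table, applying a UDF, and writing Parquet output.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{udf, col}

object SampleHiveJob {
  def main(args: Array[String]): Unit = {
    // Assumes a cluster where Hive is configured for Spark
    val spark = SparkSession.builder()
      .appName("SampleHiveJob")
      .enableHiveSupport()
      .getOrCreate()

    // A simple UDF: uppercase a string column (null-safe)
    val toUpper = udf((s: String) => if (s == null) null else s.toUpperCase)

    // Read from a hypothetical Hive table, apply the UDF, filter, write Parquet
    val df = spark.table("sales_db.transactions")
      .withColumn("region_uc", toUpper(col("region")))
      .filter(col("amount") > 0)

    df.write.mode("overwrite").parquet("/data/output/transactions_clean")

    spark.stop()
  }
}
```

In practice a job like this would be built with Gradle and scheduled via Autosys, matching the tooling listed above.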
Good-to-Have
1. Good analytical and debugging skills
2. Ability to coordinate with SMEs and stakeholders, manage timelines and escalations, and provide on-time status updates
3. Ability to write clear and precise documentation/specifications
4. Ability to work in an agile environment
5. Ability to create documentation for all developed mappings
Desired Candidate Profile
Qualifications: Bachelor of Engineering