Experience (Must have):
a) Scala: Minimum 2 years of experience
b) Spark: Minimum 2 years of experience
c) Hadoop: Minimum 2 years of experience (security, Spark on YARN, architectural knowledge)
d) HBase: Minimum 2 years of experience
e) Hive: Minimum 2 years of experience
f) RDBMS (MySQL / Postgres / MariaDB): Minimum 2 years of experience
g) CI/CD: Minimum 1 year of experience

Experience (Good to have):
a) Kafka
b) Spark Streaming
c) Apache Phoenix
d) Caching layer (Memcached / Redis)
e) Spark ML
f) FP (Scala cats / scalaz)

Qualifications
Bachelor's degree in IT, Computer Science, Software Engineering, Business Analytics, or equivalent, with at least 2 years of experience in big data systems such as Hadoop, as well as cloud-based solutions.