- Experience in code optimization to fine-tune applications.
- Experience/knowledge with ScalaTest or JUnit
- Experience with big data technologies such as Spark, Hadoop, Hive, Impala, Sqoop, Oozie, and Airflow
- Knowledge of Spark Core & SQL is mandatory; experience with other modules (Streaming, ML) is a plus
- Familiar with development tools such as Jenkins, Jira, Git, etc.
- Experience using Kafka for publish-subscribe messaging as a distributed commit log; familiar with its speed, scalability, and durability
- Knowledge of the Kerberos security authentication protocol
- Experience in data warehousing and ETL processes.
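As an illustration of the Spark Core & SQL skills listed above, a minimal sketch in Scala (the session, data, and column names here are hypothetical; a real job would read from Hive or HDFS rather than an inline sequence):

```scala
import org.apache.spark.sql.SparkSession

object SalesSummary {
  def main(args: Array[String]): Unit = {
    // Local session for illustration; a production job would use cluster configs.
    val spark = SparkSession.builder()
      .appName("SalesSummary")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical in-memory dataset standing in for a warehouse table.
    val sales = Seq(("US", 100.0), ("US", 50.0), ("EU", 75.0))
      .toDF("region", "amount")

    // Register a temp view and aggregate with Spark SQL.
    sales.createOrReplaceTempView("sales")
    spark.sql("SELECT region, SUM(amount) AS total FROM sales GROUP BY region")
      .show()

    spark.stop()
  }
}
```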
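The Kafka publish-subscribe experience mentioned above can be sketched as a minimal producer in Scala using the Java client API (the broker address and topic name are assumptions for illustration):

```scala
import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerRecord}

object EventPublisher {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    // Assumed local broker; real deployments list multiple bootstrap servers.
    props.put("bootstrap.servers", "localhost:9092")
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer")
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer")

    val producer = new KafkaProducer[String, String](props)
    // "events" is a hypothetical topic; the record is appended to the
    // partition's commit log and fanned out to all subscribed consumer groups.
    producer.send(new ProducerRecord[String, String]("events", "key1", "hello"))
    producer.close()
  }
}
```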