Note : Only immediate joiners
Role : PySpark
Experience : 8 years
Location : Hyderabad, Chennai
Interview mode : Walk-in drive

Requirements :

Mandatory skills (8 years of experience in data engineering, with 5 years on PySpark/NoSQL, is mandatory) :
1. Strong in PySpark.
2. Hands-on experience with the MWAA (Airflow) / AWS EMR (Hadoop, Hive) framework.
3. Hands-on, working knowledge of Python.
4. Knowledge of AWS services such as EMR, S3, Lambda, Step Functions, and Aurora (RDS).
5. Good knowledge of RDBMS and any SQL.
6. Should work as an individual contributor.

Good to have :
1. Experience converting large data sets from RDBMS to NoSQL.
2. Experience building data lakes and configuring Delta tables.
3. Good experience with compute and cost optimization.
4. Ability to understand the environment and use case, and readiness to build holistic frameworks.

Soft skills :
1. Good communication skills for interacting with IT stakeholders and the business.
2. Understanding of pain points through to delivery.
(ref:hirist.tech)