Job title: Big Data Architect
Work Location: Denver, CO (onsite)
Minimum years of experience required: 14+ years
Must Have Skills:
• Big Data and Hadoop
• ETL and PySpark
Detailed Job Description:
• Good knowledge of Data Warehousing (ETL/BI) testing and Big Data testing.
• 10+ years of experience in big data development, including 6+ years on AWS.
• Good understanding of AWS and Azure cloud solutions.
• Excellent understanding of Hadoop architecture and its components, such as HDFS, YARN, Hive, Sqoop, Pig, and Apache Spark SQL.
• Experience with Azure Databricks using PySpark.
• Hands-on experience writing complex SQL and HQL scripts; an illustrative sketch follows.
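The following is a minimal sketch of the kind of PySpark/Spark SQL (HQL-style) ETL work this role involves. All table and column names (sales_raw, sales_curated, order_id, order_ts, region, amount) are hypothetical placeholders, not part of the job description.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets Spark SQL run HQL-style queries against Hive tables.
spark = (
    SparkSession.builder
    .appName("sales-etl")
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: read a raw Hive table with a Spark SQL (HQL-style) query.
raw = spark.sql("SELECT * FROM sales_raw WHERE order_ts IS NOT NULL")

# Transform: type casting, deduplication, and aggregation.
curated = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
       .groupBy("region", "order_date")
       .agg(F.sum("amount").alias("daily_revenue"))
)

# Load: write the curated result back as a partitioned table
# (Delta by default on Azure Databricks).
(curated.write
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("sales_curated"))
```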