Job Title: Azure Data Engineer
About the Company: Insight Global's client
Type: Contract Position
Compensation: 30-38 lakhs per year
Location: Remote (Bangalore, India preferred)
Interview Process: Immediate Interviews Available! Priority scheduling for candidates who:
Submit resume promptly
Are available for immediate interviews
Connect via LinkedIn with resume and expected CTC
Requirements:
Ability to work full-time hours on a contract basis
Paid hourly; 8-hour working days
Strong communication skills
Independent contractor role - no Provident Fund (PF), no benefits
Must start ASAP - no notice period
Must Haves
7+ years' experience with database engineering - building out and deploying pipelines, ideally working with financial data.
2+ years’ experience working with Azure applications (Azure Data Factory, Azure Batch, Azure SQL Server, Azure Data Warehouse, Azure Databricks etc.) and building out Azure pipelines.
2+ years' experience working with Azure Databricks.
1+ year of experience working in an Agile environment.
Experience creating PowerBI reports.
2+ years' experience with Snowflake.
2+ years' experience with Python, PySpark & SQL.
2+ years' experience with infrastructure administration.
Working knowledge of CI/CD.
Working knowledge of building data integrity checks as part of delivery of applications.
Pluses
Retail and/or e-commerce background; experience working for a multi-channel retailer.
Day to Day
A large North American retail company is seeking an Azure Data Engineer in Bangalore, India. You will join our client's FP&A team, focusing on their Finance Data Hub (FDH), which hosts all of the client's sales, inventory, P&L, and other financial data. The FDH ingests data from Oracle RMS, Oracle EPM, and Oracle EBS, and the Data Engineer will integrate data from these three systems into the FDH. Responsibilities include:
Owning data pipelines that gather data from multiple sources and consolidate it for different use cases.
Leading the development and maintenance of data science and analytics processes, procedures, and policies.
Leading the identification, design, and implementation of integration, modelling, and orchestration of complex data.
Acting as the subject matter expert for data engineering needs.
Demonstrating an understanding of modern data platforms, including data lakes and data warehouses, with good knowledge of the underlying architecture, preferably in Snowflake.