Experience Required: 5 years
Roles & Responsibilities
Data Engineering
Data pipeline development and on-demand data pulls using COSMOS / SCOPE
Data automation and processing using Lens
Data pipeline automation using Azure Data Factory (ADF)
Data Engineering support
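As an illustration of the ADF automation work above, a minimal pipeline definition with a single Copy activity might look like the sketch below. All names (pipeline, activity, and dataset references) are hypothetical placeholders, not part of any existing pipeline:

```json
{
  "name": "CopySourceToSynapse",
  "properties": {
    "activities": [
      {
        "name": "CopyDailyExtract",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "SynapseDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "SqlDWSink" }
        }
      }
    ]
  }
}
```

In practice such definitions are deployed and scheduled through ADF triggers rather than hand-edited, but the JSON shape is what pipeline automation scripts generate and maintain.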
Data Preparation, Exploration and Sustenance
1. Investigate and understand business questions
2. Explore data, identify the right sources, and perform data extraction and transformation
3. Perform data validation
Experimentation (Control Tower and Scorecards):
Understand how Control Tower works
Update metrics
Update scorecard layouts
Run scorecard validation
Run exploratory analysis
Skills needed for this:
Python (only for investigations)
XML
Git
PySpark (only for investigations)
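To illustrate the "Python for investigations" skill above, here is a minimal sketch of quick exploratory profiling of an extract, using only the standard library. The sample data and metric names are invented for illustration; a real investigation would pull from COSMOS / SCOPE or Synapse:

```python
import csv
import io
from statistics import mean

# Hypothetical sample extract; in practice this would come from a
# COSMOS / SCOPE or Synapse data pull.
raw = """metric,value
page_views,120
page_views,95
clicks,30
clicks,45
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Group values by metric name and summarize -- the kind of quick
# profiling that precedes a scorecard update.
grouped = {}
for row in rows:
    grouped.setdefault(row["metric"], []).append(float(row["value"]))

profile = {name: {"count": len(vals), "mean": mean(vals)}
           for name, vals in grouped.items()}
print(profile)
```

For larger extracts the same grouping-and-summary pattern would move to PySpark, which the skills list covers for the same investigative purpose.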
4. Implement and provide support for data pipelines and reports
5. Implement monitoring on all critical data pipelines
6. Support migration to Azure Lens Orchestrator from Xflow
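Item 5 above calls for monitoring on critical data pipelines. A minimal sketch of the kind of freshness and volume check involved is below; the run records and thresholds are illustrative, and a real check would query ADF or Lens run history rather than an in-memory list:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical pipeline-run metadata; all names are illustrative.
runs = [
    {"pipeline": "daily_ingest", "rows": 10_000,
     "finished": datetime.now(timezone.utc) - timedelta(hours=2)},
    {"pipeline": "scorecard_refresh", "rows": 0,
     "finished": datetime.now(timezone.utc) - timedelta(hours=30)},
]

def check_run(run, min_rows=1, max_age=timedelta(hours=24)):
    """Flag a run that produced no rows or finished too long ago."""
    issues = []
    if run["rows"] < min_rows:
        issues.append("empty output")
    if datetime.now(timezone.utc) - run["finished"] > max_age:
        issues.append("stale")
    return issues

# Collect only the pipelines that have at least one issue.
alerts = {r["pipeline"]: check_run(r) for r in runs if check_run(r)}
print(alerts)
```

In production the `alerts` dictionary would feed an alerting channel; the point of the sketch is the per-run rule structure, which extends naturally to schema or null-rate checks.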
Self Service/Datawarehouse
1. Support ADF / Lens pipelines that ingest into Synapse on a regular basis (performance, reprocessing and monitoring)
2. Support visualization/reporting on top of the data via Cube, Power BI Premium (based on cost/size needs), Excel or Spark
Rhythm of Business
1. Maintenance and Sustenance of existing Data Pipelines
2. Workflow Monitoring and Fixes
3. Maintenance and monitoring of existing data