URFI ABDUL REHMAN
Project Lead: Data Analysis, Development, Engineering and Analytics Reporting
E-Mail: *************@*****.***
Location: Clearwater, FL, USA
OBJECTIVE
To obtain a challenging position in Data Engineering, Analytics, and IT with a progressive organization that will make full use of my technical skills and support continued professional growth.
SYNOPSIS
Energetic, results-oriented professional with 12+ years of experience in database development, data engineering, data mining, business intelligence, reporting, and analytics across the Insurance & Banking, Health Care, and Advanced Metering Infrastructure domains. Sound working knowledge of database management, building custom logic to business requirements, and operational reporting through advanced tools. Deep understanding of technology with a focus on delivering business solutions. Persuasive communicator with strong relationship-management skills and the ability to relate to people at any level of business and management. Highly ethical, trustworthy, and discreet.
EDUCATION
•Certifications:
•Professional Cloud Architect (GCP)
•Associate Cloud Engineer (GCP)
•Azure Data Fundamentals
•Professional Cloud Data Engineer (GCP)
•Master of Technology (Computational Mathematics) with specialization in Machine Learning from Jamia Millia Islamia (2019-2021)
•Master of Business Administration with specialization in Finance and IB from Jamia Millia Islamia (2016-2018)
•Bachelor of Technology with specialization in Electronics and Communication from Punjab Technical University (2004-2008)
TECHNICAL EXPERTISE
Database Development: SQL Server, Azure SQL, Oracle 12c, MySQL, PostgreSQL
ETL Development: SSIS, ADF, PySpark, Data Flow, Data Fusion
Data Warehouse: Azure Synapse, BigQuery, Databricks, Snowflake
Reporting: SSRS, Power BI
WORK EXPERIENCE
PERSISTENT (Jan 2023 – Present)
Project: Frank Crum
Domain: Insurance Billing
Team “Analytics and Development”
Technologies: Azure MSSQL 2017/2019, SSIS, ADF, Azure Synapse, Azure Databricks, BigQuery, Snowflake
Joined as Project Lead
Key Responsibilities:
•Design and implement complex ETL pipelines using Azure Data Factory and GCP Dataflow.
•Design and develop SQL databases, including creating tables, views, indexes, and schemas based on business requirements.
•Write, optimize, and troubleshoot complex SQL queries for data extraction, manipulation, and reporting for SQL Server, Snowflake, and BigQuery.
•Develop stored procedures, triggers, functions, and scripts to automate repetitive tasks and ensure efficient data processing.
•Design and implement ETL (Extract, Transform, Load) processes to bring data from various sources into the database.
•Architect end-to-end data solutions on Azure, including designing data models, storage solutions, and scalable pipelines.
•Use Azure services like Azure Data Lake, Azure Synapse Analytics, Azure SQL Database, and Cosmos DB to support data warehousing and analytics.
•Design and develop ETL/ELT processes using Azure Data Factory (ADF) to extract, transform, and load data from various sources.
•Implement data integration from on-premises and cloud-based sources to Azure for unified data storage.
•Manage data storage using Azure Blob Storage, Data Lake Storage, and Azure SQL for structured and unstructured data.
•Set up and manage partitioning, indexing, and clustering for optimal data retrieval and storage efficiency.
•Use tools like Databricks, Azure Synapse Analytics for big data processing and data transformations.
•Administer and optimize relational and non-relational databases, including SQL Server on Azure and managed PostgreSQL, with ongoing performance tuning.
•Enable data teams by preparing data for analysis and reporting with Power BI, Azure Analysis Services, and Synapse Analytics.
•Collaborate with data analysts and scientists to provide necessary data for BI, predictive analytics, and machine learning models.
•Implement CI/CD pipelines with Azure DevOps to automate deployment of data pipelines and infrastructure.
•Automate data workflows and processes to improve data pipeline reliability, speed, and error handling.
•Write efficient PySpark code to process large datasets, leveraging Spark’s distributed computing capabilities.
•Perform data analysis with PySpark to derive insights and support decision-making within the organization.
•Migrate on-premises databases and applications to GCP, loading legacy data into BigQuery.
•Develop complex SQL queries, scripts, stored procedures, functions, and triggers in T-SQL, PL/SQL, Snowflake SQL, and BigQuery SQL.
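The extract-transform-load pattern described above can be sketched in miniature. This illustrative example uses Python's built-in SQLite in place of the production SQL Server/Snowflake/BigQuery sources; all table and column names are hypothetical.

```python
import sqlite3

# Hypothetical source with raw, untyped invoice data.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_invoices (id INTEGER, client TEXT, amount TEXT)")
src.executemany("INSERT INTO raw_invoices VALUES (?, ?, ?)",
                [(1, "acme", "100.50"), (2, "acme", "bad"), (3, "globex", "75.25")])

# Hypothetical typed target table.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, client TEXT, amount REAL)")

def etl(source, target):
    """Extract rows, transform (cast amount, drop bad rows), load into target."""
    clean = []
    for rid, client, amount in source.execute(
            "SELECT id, client, amount FROM raw_invoices"):
        try:
            clean.append((rid, client.upper(), float(amount)))
        except ValueError:
            continue  # skip rows that fail type conversion
    target.executemany("INSERT INTO invoices VALUES (?, ?, ?)", clean)
    target.commit()
    return len(clean)

loaded = etl(src, tgt)
```

In production the extract and load steps would be ADF or Dataflow activities rather than in-process connections; the shape of the transform stays the same.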
MEDIAAGILITY (Oct 2020 – Dec 2022)
Project: L&T, HUDSON INSURANCE GROUP
Domain: Banking, Financial Services, and Insurance
Team “Analytics and Development”
Technologies: MSSQL, SSIS, ADF, Power BI, BigQuery, Cloud Storage, Cloud SQL, Dataflow
Joined as Consultant
Key Responsibilities:
•Design and create pipelines to move and transform data from various sources (on-premises, cloud) to a destination (data lake, data warehouse).
•Design, build, and manage scalable ETL/ELT pipelines to move and transform data using GCP tools such as Dataflow (Apache Beam).
•Create data pipelines to support real-time analytics and business reporting.
•Set up and manage data storage solutions using BigQuery, Cloud Storage, and Cloud SQL to ensure efficient data storage and access.
•Optimize BigQuery data structures, including partitioning and clustering, to support high-performance analytics and reduce query costs.
•Schema Design and Optimization: Design database schemas to ensure efficient data storage, indexing, and querying using SQL Server, BigQuery and Azure SQL.
•Query Optimization: Write and optimize SQL queries for faster processing and to reduce resource consumption.
•Migrate on-premises databases and applications to GCP, using Cloud SQL or the BigQuery Data Transfer Service to bring legacy data sources into the cloud environment.
•Integrate various external and internal data sources, including APIs and third-party platforms, into GCP data warehouses or lakes.
•Build and execute scalable data processing pipelines to handle massive datasets using Azure Databricks.
•Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to ensure data solutions meet organizational needs.
•Design and manage data warehouses for scalable storage, allowing large datasets to be stored for analytics.
•Automate deployment and orchestration of data workflows using Cloud Composer (Apache Airflow)
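The Cloud Composer (Apache Airflow) orchestration mentioned above comes down to running tasks in dependency order. A stdlib Python sketch of that DAG scheduling follows; the task names are hypothetical, not from an actual pipeline.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG in Cloud Composer orders work.
dag = {
    "extract_gcs": [],
    "load_bigquery": ["extract_gcs"],
    "transform_model": ["load_bigquery"],
    "refresh_report": ["transform_model", "load_bigquery"],
}

def run_order(graph):
    """Return a valid execution order respecting every dependency."""
    return list(TopologicalSorter(graph).static_order())

order = run_order(dag)
```

Airflow adds scheduling, retries, and parallelism on top, but the topological ordering is the core contract a DAG expresses.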
LANDIS+GYR (Nov 2016 – Oct 2020)
Domain: AMI (Advanced Metering Infrastructure)
Team BIAR “Business Intelligence and Analytics Reporting”
Technologies: Oracle 12c, MSSQL, Pentaho, MySQL, PostgreSQL, Power BI, SSIS
Joined as Senior Data Analyst
Key Responsibilities:
•Design database schemas (tables, indexes, constraints) that meet application requirements for scalability and efficiency. Structured data for analytics and machine learning applications.
•Develop logical and physical data models based on business requirements.
•Write and optimize stored procedures, functions, and triggers to automate and manage data processing.
•Create views, indexes, and materialized views in Oracle to improve data retrieval efficiency and support reporting needs.
•Write complex SQL queries for data retrieval, reporting, and data manipulation.
•Optimize SQL queries, indexes, and database structures to enhance performance, ensuring efficient data access and minimizing server load.
•Develop processes for importing, exporting, and transforming data between SQL Server, Oracle, and other databases or file formats.
•Create ETL workflows using tools like SQL Server Integration Services (SSIS) for SQL Server
•Write transformation logic for preparing data for analytical purposes or ensuring it meets target schema requirements.
•Support the DBA team with database backup, recovery, and restoration processes.
•Regularly manage index rebuilding and reorganization to maintain database health.
•Maintain comprehensive documentation for all stored procedures, functions, and ETL processes.
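The index and query-optimization work described above can be demonstrated with Python's built-in SQLite standing in for Oracle/MSSQL: without an index the engine scans the whole table, and with one it seeks directly to matching rows. The meter-reading table and index names are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE meter_readings (meter_id INTEGER, read_at TEXT, kwh REAL)")
conn.executemany("INSERT INTO meter_readings VALUES (?, ?, ?)",
                 [(m, f"2020-01-{d:02d}", 1.5)
                  for m in range(50) for d in range(1, 29)])

# Without an index, a lookup by meter_id scans every row.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM meter_readings WHERE meter_id = 7").fetchall()

# With an index, the same query becomes an index search.
conn.execute("CREATE INDEX idx_meter ON meter_readings (meter_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM meter_readings WHERE meter_id = 7").fetchall()
```

Inspecting the query plan before and after is the same discipline as reading an Oracle or SQL Server execution plan when tuning indexes.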
NIIT TECHNOLOGIES (June 2015 – Oct 2016)
Domain: Wealth Management
Team “Software Development”, Business Intelligence
Technologies: MSSQL, SSIS, SSRS
Joined as Senior Engineer
Key Responsibilities:
•Write and optimize stored procedures, functions, and triggers to automate and manage data processing.
•Develop SSIS packages using Data Flow Tasks, Control Flow components, and related features.
•Build and maintain SSIS packages with SQL Server Data Tools (SSDT).
•Design and develop reports using SSRS, including tabular, matrix, and chart reports.
•Deploy reports to the SSRS report server and manage report subscriptions.
•Write complex SQL queries for data retrieval, reporting, and data manipulation.
•Optimize SQL queries, indexes, and database structures to enhance performance, ensuring efficient data access and minimizing server load.
•Lead team meetings, define and assign tasks to team members, and manage tasks according to resource availability.
•Understand the application’s data model and data flow.
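An SSRS matrix report, mentioned above, pivots detail rows into a row-group × column-group grid. A minimal Python sketch of that aggregation follows; the wealth-management figures are hypothetical sample data, not client data.

```python
from collections import defaultdict

# Hypothetical dataset rows: (client, quarter, fee), as a report query might return.
rows = [
    ("Alice", "Q1", 120.0), ("Alice", "Q2", 80.0),
    ("Bob",   "Q1", 200.0), ("Bob",   "Q2", 150.0),
    ("Alice", "Q1", 30.0),
]

def matrix_report(rows):
    """Pivot rows into {row_group: {column_group: aggregate}} like an SSRS matrix."""
    grid = defaultdict(lambda: defaultdict(float))
    for client, quarter, fee in rows:
        grid[client][quarter] += fee
    return {client: dict(quarters) for client, quarters in grid.items()}

report = matrix_report(rows)
```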
ACCRETIVE HEALTH SERVICES (Nov 2013 – April 2015)
Domain: Health Care
Team “Software Development”/Dev_ETL
Technologies: MSSQL, SSIS, SSRS
Joined as Engineer
Key Responsibilities:
•Gather ETL issues and analyze them by replicating the failures in the development environment.
•Develop code and perform reviews to advance application upgrades, extensions, and other development.
•Work on enhancement stories, assessing the impact on the pre-existing environment or developing new enhancements.
•Develop unit test cases and perform unit and system testing of assigned components; add stories to the sprint, create tasks, and fill in acceptance criteria per the product owner.
•Develop T-SQL stored procedures, functions, and views, changing logic as business requirements evolve.
•Develop ETL SSIS packages and identify and test issues arising in existing packages.
•Support operations on both front-end and back-end issues.
•Triage data discrepancies between the database and the application.
•Create new reports in SSRS per operations requests.
•Identify and resolve database issues.
•Configure SSRS report subscriptions per client or organizational business needs.
•Identify missing or incorrect data in the database and other data-related issues.
•Investigate client report issues and resolve them through SSRS.
•Develop new data-fetching logic, create client-side Excel reports, and troubleshoot data-loading issues across multiple databases.
•Support all ongoing applications within the organization.
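Triaging data discrepancies between a database and an application, as described above, amounts to a keyed comparison of the two record sets. A small illustrative Python sketch follows; the claim IDs and amounts are hypothetical.

```python
def triage_discrepancies(db_rows, app_rows):
    """Compare keyed records from database and application; report mismatches."""
    return {
        "missing_in_app": sorted(db_rows.keys() - app_rows.keys()),
        "missing_in_db": sorted(app_rows.keys() - db_rows.keys()),
        "changed": sorted(k for k in db_rows.keys() & app_rows.keys()
                          if db_rows[k] != app_rows[k]),
    }

# Hypothetical claim records keyed by claim ID, with amounts as values.
db_rows = {101: 250.0, 102: 75.0, 103: 40.0}
app_rows = {101: 250.0, 103: 45.0, 104: 10.0}

result = triage_discrepancies(db_rows, app_rows)
```

In practice the two inputs would be query results from the database and the application's API or UI export; the comparison logic is the same.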
DAR AL AKIFA Est., Riyadh, Saudi Arabia (Aug 2012 – Sep 2013)
Domain: Health Care
Team “Development”
Technologies: MSSQL, SSIS, SSRS
Joined as Engineer
Key Responsibilities:
•Install, administer, and maintain SQL Server instances.
•Test and enhance legacy ETL packages per client requirements.
•Perform database testing for specific issues or deployments, and regression testing after each deployment.
•Create reports using various tables, views, and T-SQL.
•Create SQL Server reports, handle sub-reports, and define queries for drill-down reports using SSRS 2005/2008.
•Develop automated database applications.
•Plan and implement new modules to ensure the company’s databases are scalable.
•Identify and resolve database issues.
•Install and configure the necessary components to ensure the database is accessible.
•Diagnose and resolve database access and performance issues.
•Generate reports using SQL Server Reporting Services (SSRS).
•Build ETL data flows using SSIS, creating mappings and workflows to extract data from SQL Server, flat-file sources, and legacy systems and load it into various business entities.
•Contribute across database development, business intelligence, and software development.
•Monitor and tune performance at the physical server level.
•Set up test, dev, staging, and production environments.
ACCRETIVE HEALTH Pvt. Ltd., Noida (Feb 2010 – Jul 2012)
Domain: Health Care
Team “ETL Operations”
Technologies: MSSQL, SSIS, SSRS
Joined as IT Analyst
Key Responsibilities:
•Worked on multiple projects in parallel, including file management and user management.
•Performed extensive work in SSIS, MS SQL Server, and SQL programming.
•Extracted client reports from SQL Server databases.
•Applied DQL and T-SQL statements on SQL Server.
•Identified and resolved database issues.
•Identified missing or incorrect data in the database.
•Investigated client file issues such as missing, quarantined, and empty files on client-side servers.
•Resolved day-to-day site issues through SQL Server 2005/2008, working on multiple client databases across different sites via a ticketing system (Advent tool).
•Configured file actions for FTP servers in the file-management tool (FLEX).
•Diagnosed and resolved database access and performance issues.
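The client-file checks above (missing, quarantined, and empty files) can be sketched as a simple directory scan. This Python example is illustrative only: the file names are hypothetical, and the real FLEX tool would act on FTP drop folders rather than a local temp directory.

```python
import os
import tempfile

def classify_files(directory, expected):
    """Flag expected client files that are missing or empty in a drop folder."""
    report = {"missing": [], "empty": [], "ok": []}
    for name in expected:
        path = os.path.join(directory, name)
        if not os.path.exists(path):
            report["missing"].append(name)
        elif os.path.getsize(path) == 0:
            report["empty"].append(name)
        else:
            report["ok"].append(name)
    return report

# Hypothetical client drop folder: one good file, one empty, one never delivered.
with tempfile.TemporaryDirectory() as drop:
    with open(os.path.join(drop, "claims.csv"), "w") as f:
        f.write("id,amount\n1,10\n")
    open(os.path.join(drop, "eligibility.csv"), "w").close()  # zero-byte file
    report = classify_files(drop, ["claims.csv", "eligibility.csv", "remits.csv"])
```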
Languages: English, Hindi, Punjabi, Urdu
DATE: (Urfi Rehman)