
Power BI / SQL Server

Location:
Hyderabad, Telangana, India
Posted:
February 06, 2024

Resume:

Vijaya Kumar N

Ph: 732-***-****

ad3evs@r.postjobfree.com

BI Developer

Over 10 years of IT experience on the Microsoft BI/cloud suite and other cloud technologies. Involved in all phases of the software development life cycle, from requirements gathering, design, development, and implementation to testing and support, using SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), and SQL Server Analysis Services on SQL Server 2008/12/16/19 and Oracle. Experienced with Power BI Desktop (Power Query, Power Pivot, Power View), Power BI Service, Power BI Report Builder, Power BI Report Server, DAX Studio, and Power Apps. Good knowledge of Azure SQL, ADF (Azure Data Factory), Azure Data Lake, Azure DevOps, Snowflake, and AWS S3. Good knowledge of Python.

oExperience in gathering technical requirements and proposing solutions.

oStrong knowledge and hands-on experience with SSIS ETL, SSRS & SQL Server.

oHands-on experience designing SSIS packages & SSRS dashboard reports and validating the results per business requirements.

oGood understanding of all phases of the software development life cycle (SDLC): project analysis, requirements, design documents, development, unit testing, user acceptance testing, maintenance, and support.

oImported/migrated Excel, flat file, and XML sources with SSIS and authenticated SSRS reports.

oExperience in troubleshooting and optimizing SSIS packages.

oFamiliar with writing queries in SQL and T-SQL.

oStrong knowledge of designing and creating dimensions and cubes with snowflake and star schemas using SSAS.

oInvolved in configuring Azure platform for data pipelines, ADF, Azure Blob Storage and Data Lakes and building workflows to automate data flow using ADF.

oExtensively worked on performance tuning & code optimization.

oExpertise in source migration and server migration activities.

oWrote PySpark code in Azure Databricks to read Parquet files from Azure Data Lake, load the transformed data into Azure SQL Database, and onboard it into Azure Analysis Services.

oBulk loaded data from the external stage (S3) and the internal stage into Snowflake using the COPY command (a sketch follows this list).

oUsed COPY, LIST, PUT and GET commands to validate the Snowflake internal staging files.

oHands-on experience creating tables, views, and stored procedures in Snowflake (sketched below, after this list).

oDeveloped and implemented a data quality process that reduced data errors by 50% and improved overall data accuracy by 30% in Snowflake.

oKnowledge of AWS services such as Redshift.

oExperience in Windows Batch scripting to automate the Jobs and schedules.

oBackground in installing and upgrading Tableau Server and tuning server performance.

oDesigned SSIS packages using For Loop Container, Foreach Loop Container, Sequence Container, Execute SQL Task, File System Task, Data Flow Task, Send Mail Task, and data transformations such as Derived Column, Data Conversion, Conditional Split, Aggregate, Merge Join, Lookup, Row Count, Union, Union All, Script Task & Sort.

oExperienced in creating Integration Services catalog jobs & SQL Server Agent jobs using XML configurations & environment variable configurations (see the SQL Agent sketch after this list).

oExperienced in writing SQL Queries for SSRS reports/Dashboards.

oDesigned reports per client requirements as table, matrix, sub-report, drill-down, and drill-through reports in SSRS.

oExperience in optimizing SSRS dashboards that are fed from SSIS packages.

oExperience in integration of various data sources like SQL Server, Oracle, Flat files, XML files, Raw Files

oExperience importing data from various sources such as SQL Server, SSAS tabular models, Excel, and Azure SQL into Power Query, and implementing data cleansing.

oExperienced in creating Power BI reports using Power BI Desktop; performed DAX queries using Power BI tools.

oStrong development skills with Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.

oIdentifying key performance indicators (KPI) to help companies adjust and fulfill objectives.

oGood experience creating Power BI visualization reports and dashboards with interactive visuals using various source data (Oracle, SQL Server, Excel, and CSV files).

oCreated action filters, parameters and calculated sets for preparing dashboards and worksheets in Power BI.

oCreated different visualizations in the report using custom visualizations like Bar Charts, Pie Charts, Line Chart, Column Chart, Stacked Column Chart, Slicers.

oGood Experience in creating Bookmarks, Grouping, Binning, Interactions

oInvolved in creating Power BI reports and migrating reports from Excel to Power BI Desktop.

oVisualize data, author reports, and schedule automated refresh of reports.

oUsed DAX aggregate, date, string & filter functions to create business calculations based on requirements; experienced with YTD, QTD, and MTD operations in DAX and resolving related issues.

oExperience with different filter levels such as visual-, page-, and report-level filters, sync slicers, and drill-down & drill-through filters.

oGood Experience in Power BI Report Builder, Power BI Report Server and DAX Studio

oGood experience with paginated reports using Power BI Report Builder, importing them into Power BI Desktop, and publishing to Report Server and the Power BI Service.

oCreating workspaces, apps and applying Row Level Security (RLS) on top of datasets.

oGood experience creating paginated reports using Power BI Report Builder.

oGood Knowledge in creating Canvas Apps using Power Apps Studio.

oExperience connecting different source connectors in Power Apps Dataverse to load data into the Power Apps model.

oBuilding the pipelines to copy the data from source to destination in Azure Data Factory

oMoving CSV files from Azure blob to on-prem server

oMonitored produced and consumed datasets in ADF; scheduled jobs in flows and ADF pipelines; migrated data from different sources to destinations with ADF; and created pipelines with the Data Factory GUI.

oScheduled pipelines, monitored data movement from sources to destinations, and transformed data in Azure Data Factory using ADF transformations.

oDeployed applications to the Azure cloud: App Service, VMs, and Azure Functions.

oExperienced in creating batch files with Windows shell scripting to run ETL through Control-M jobs, and in creating SQL Server Agent jobs in the SSMS environment.

oGood Knowledge in TFS (Team Foundation Server), SVN and GitHub.

oGood Knowledge in Azure DevOps (CI/CD).

oWorked in implementation life cycles of Agile and waterfall methodologies.

oJob scheduling and Automation using SQL Agent, Control M, ROBOT Scheduler.

oExperience with the JIRA tool for project management: picking up sprint tasks and updating task status with technical details.

oPrepared Design documents, Test case documents & KT sessions.

oGood hands-on experience with the ServiceNow tool, creating CRs and bringing them to closure per the release cycle.

oCreative problem-solving and quick, independent learning skills in implementing data warehousing projects.

oWilling to learn modern technologies.
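
The Snowflake bulk loads from S3 called out above can be sketched roughly as follows. This is a minimal, hedged example: the file format, stage, and table names (csv_fmt, s3_claims_stage, STG_CLAIMS) and the bucket path are illustrative assumptions, not names from an actual project.

    -- Illustrative only: define a file format and an external S3 stage, then bulk load.
    CREATE OR REPLACE FILE FORMAT csv_fmt
      TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

    CREATE OR REPLACE STAGE s3_claims_stage
      URL = 's3://example-bucket/claims/'                 -- hypothetical bucket
      CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
      FILE_FORMAT = csv_fmt;

    LIST @s3_claims_stage;                                -- confirm the files are visible in the stage

    COPY INTO STG_CLAIMS
      FROM @s3_claims_stage
      PATTERN = '.*claims.*[.]csv'
      ON_ERROR = 'CONTINUE';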
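
Similarly, a minimal sketch of the Snowflake tables, views, and stored procedures mentioned above; every object and column name here is a placeholder chosen for illustration.

    -- Dimension table, view, and a Snowflake Scripting procedure (all names are hypothetical).
    CREATE OR REPLACE TABLE DIM_POLICY (
        POLICY_KEY   NUMBER AUTOINCREMENT PRIMARY KEY,
        POLICY_NO    VARCHAR(30) NOT NULL,
        PRODUCT_LINE VARCHAR(50),
        EFFECTIVE_DT DATE
    );

    CREATE OR REPLACE VIEW VW_ACTIVE_POLICIES AS
    SELECT POLICY_KEY, POLICY_NO, PRODUCT_LINE
    FROM DIM_POLICY
    WHERE EFFECTIVE_DT <= CURRENT_DATE();

    CREATE OR REPLACE PROCEDURE SP_LOAD_DIM_POLICY()
    RETURNS STRING
    LANGUAGE SQL
    AS
    $$
    BEGIN
        -- Upsert the dimension from a staging table assumed to exist (STG_POLICY).
        MERGE INTO DIM_POLICY d
        USING STG_POLICY s ON d.POLICY_NO = s.POLICY_NO
        WHEN MATCHED THEN UPDATE SET d.PRODUCT_LINE = s.PRODUCT_LINE, d.EFFECTIVE_DT = s.EFFECTIVE_DT
        WHEN NOT MATCHED THEN INSERT (POLICY_NO, PRODUCT_LINE, EFFECTIVE_DT)
                              VALUES (s.POLICY_NO, s.PRODUCT_LINE, s.EFFECTIVE_DT);
        RETURN 'DIM_POLICY refreshed';
    END;
    $$;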
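
And a rough T-SQL sketch of wiring an SSIS package into a SQL Server Agent job, as referenced above; the job name, package path, and schedule are assumptions made up for this example, and the SSISDB step command would be adjusted to the real folder, project, and environment reference.

    USE msdb;
    GO
    -- Create a job, add an SSIS step, attach a daily schedule, and target the local server.
    EXEC dbo.sp_add_job @job_name = N'Load_Staging_Daily';

    EXEC dbo.sp_add_jobstep
         @job_name  = N'Load_Staging_Daily',
         @step_name = N'Run SSIS package',
         @subsystem = N'SSIS',
         @command   = N'/ISSERVER "\SSISDB\Finance\LoadStaging\Load_Staging.dtsx" /SERVER "." /ENVREFERENCE 1';

    EXEC dbo.sp_add_jobschedule
         @job_name          = N'Load_Staging_Daily',
         @name              = N'Daily 6 AM',
         @freq_type         = 4,        -- daily
         @freq_interval     = 1,
         @active_start_time = 060000;   -- 06:00:00

    EXEC dbo.sp_add_jobserver @job_name = N'Load_Staging_Daily';
    GO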

Professional Qualification:

Master of Computer Applications (MCA) from JNT University, Anantapur.

Technical Skills:

Operating Systems: Windows XP/7/8/10

Database Tools: SQL Server 2008/12/16/19, Azure Data Studio, Azure SQL DB, Oracle 18c/12c/11g

Reporting Tools: SQL Server Reporting Services (SSRS), Power BI, Power Apps

ETL Tools: SQL Server Integration Services (SSIS), ADF (Azure Data Factory)

Data Warehousing: OLAP, OLTP, star & snowflake schemas, facts and dimensions, Snowflake

Cloud Technologies: ADF (Azure Data Factory), Databricks, Blob Storage, Azure App Service, VMs, Azure Functions, AWS S3

Programming Languages: SQL, Oracle, Python, Windows shell scripting, Unix

Other Tools: SQL Server Profiler, Visual Studio 2010/12/2015/2017/2019, BIDS Helper & SSDT, Control-M, ROBOT Scheduler

Project Management Tools: Jira, ServiceNow

CHUBB Insurance, Philadelphia, PA. Nov 2021 - Till Date

BI Lead Developer

Description:

PCW Daily Catcher:

Sapiens is the new reinsurance platform for Chubb. The proposal is to take the reinsurance capabilities away from Genius, the current system that performs RI calculations in EMEA before feeding the data into PCW. The business requirement is to devise a solution that can stage the gross policy, claims, and commissions information from Genius and aggregate the reinsurance details from Sapiens before feeding the details to PCW along with any unearned premium (UPR) calculations.

The functional and non-functional requirements cover the PCW Daily Catcher (PDC) system, which interacts with the systems below, each for a different purpose.

Genius - Gross policy, claims & commissions, and journal data, plus RI information for the items where Genius still performs RI.

SIAC - Generate the PCW record format based on the Sapiens output.

UPR - Calculate the unearned premium for all records via the stand-alone UPR module.

PCW - Post a final feed to PCW with all the Gross + RI information weekly/monthly.

TTP: Payments, Cash Receipts, Vouchers to feed target system and perform Daily Transactions Reconciliation and End of the Day Reconciliation.

Responsibilities:

oInvolved in design and conversion of business requirements to Conceptual Data Model, Relational Data Model, Dimensional Data Model, logical Data Model and Physical Data Model for subject areas.

oDesigning the SSIS packages as per the business requirements using mapping documents.

oCreated ETL packages to fetch data from different sources such as OLE DB and flat files and load it into the staging OLAP database.

oDesigned and developed ETL pipelines that read data from various sources, such as reinsurance and insurance systems, and loaded it into Azure SQL Data Warehouse.

oCreated fact & dimension tables and generated the relationships between them (see the T-SQL sketch after this list).

oWorked on data modeling to create the required tables and their relationships.

oHands on experience in creating tables, views, and stored procedures in Snowflake.

oLoaded data into Snowflake tables from the internal stage using SnowSQL (sketched after this list).

oPerformed data quality issue analysis using Snow SQL by building analytical warehouses on Snowflake.

oDevelop and maintain ETL processes to move data from source systems to Snowflake.

oImported and exported data between Snowflake internal storage and the external stage (AWS S3).

oUsed Foreach Loop containers, Sequence containers, Script Tasks, and Data Flow Task transformations to load data from the sources to the staging DB, and used File System Tasks to archive the source files with the current date.

oDeployed the packages from SSIS solution to integration services catalog in SSMS.

oCreated Windows Batch files using Shell scripting to run the packages through Control M and automate the jobs.

oCreated action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.

oMaintained security and data integrity of the database.

oIngested Parquet files from multiple sources using ADF and loaded the data into ADLS (Azure Data Lake Storage) using PySpark in Databricks.

oDesigned and developed various objects such as external tables (HDFS format on Azure cloud storage), relational tables, and views.

oHandled performance requirements for databases in OLTP and OLAP models.

oCreated Power BI dashboards with interactive visuals using various source data (Oracle, SQL Server, XML, and web services).

oExperience in integration of various data sources like SQL Server, Oracle, Flat files, XML files, Raw Files

oExperienced in creating Power BI reports and Power Pivot models; performed DAX queries using Power BI tools.

oIdentifying key performance indicators (KPI) to help companies adjust and fulfill objectives.

oGood experience creating Power BI visualization reports and dashboards with interactive visuals using various source data (Oracle, SQL Server, and Excel).

oCreated action filters, parameters and calculated sets for preparing dashboards and worksheets in Power BI.

oInvolved in creating Power BI reports, migrating reports from Excel to Power BI Desktop, and identifying bugs in the Power BI Desktop software.

oStrong development skills with Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.

oVisualize data, author reports, and schedule automated refresh of reports.

oCreated Bookmarks, Grouping in the Visualizations.

oWorked on DAX query to create calculated columns and calculated measures based on Requirements.

oCreated app workspaces, published apps, and set up the Power BI gateway.

oPublished Power BI reports to the required organizations and made Power BI dashboards available in web clients.

oWorked on generating various dashboards in Power BI using various data sources like Snowflake, Oracle, SQL server, Excel, MS Access in Power Query.

oInvolved in creating data visualizations like Cards, Clustered Column Chart, Line Chart, Slicer, Gauge Visualization etc. with Power BI.

oCreated Paginated reports using Power BI report builder and published to Power bi report manager.

oConfigured Paginated reports with Data source connections, dataset folders in PBI Report server

oWith a Power BI Pro license published paginated reports to Power BI workspaces.

oCreated canvas apps with Power Apps so the business could view policy and claims details in a web browser and on mobile.

oUsed Dataverse to connect to the SQL DB and load data into the canvas app.

oCreated Gateway to connect SQL DB from Power Apps and provided access to users and clients.

oExperience managing Big Data platform deployed in Azure Cloud

oUtilized the Power BI gateway to keep dashboards and reports up to date with on-premises data sources.

oSharing of Power BI reports and managing workspaces in Power BI service.

oWorked on creating RLS (row-level security) in Power BI.

oExperience with the JIRA tool for project management: picking up sprint tasks and maintaining task status and technical details for each story.

oGood hands-on experience with the ServiceNow tool, creating CRs and bringing them to closure per the release cycle.

oCoordinated testing: writing and executing test cases, procedures, and scripts, and creating test scenarios.

oExtensively used Agile methodology as the Organization Standard to implement the Data Models.

oWorking Experience and knowledge on data warehouse concepts and dimensional data modeling.
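
A minimal T-SQL sketch of the fact and dimension tables and their relationships referenced above; this star-schema fragment and all table/column names are illustrative, not the project's actual model.

    -- Illustrative star-schema fragment with foreign keys from the fact to its dimensions.
    CREATE TABLE dbo.DimPolicy (
        PolicyKey      INT IDENTITY(1,1) PRIMARY KEY,
        PolicyNo       VARCHAR(30) NOT NULL,
        LineOfBusiness VARCHAR(50)
    );

    CREATE TABLE dbo.DimDate (
        DateKey      INT PRIMARY KEY,   -- yyyymmdd surrogate
        CalendarDate DATE NOT NULL
    );

    CREATE TABLE dbo.FactPremium (
        PremiumKey   BIGINT IDENTITY(1,1) PRIMARY KEY,
        PolicyKey    INT NOT NULL,
        DateKey      INT NOT NULL,
        GrossPremium DECIMAL(18,2),
        CONSTRAINT FK_FactPremium_DimPolicy FOREIGN KEY (PolicyKey) REFERENCES dbo.DimPolicy (PolicyKey),
        CONSTRAINT FK_FactPremium_DimDate   FOREIGN KEY (DateKey)   REFERENCES dbo.DimDate (DateKey)
    );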
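
And a rough SnowSQL sequence for the internal-stage load into Snowflake mentioned above; the named stage, file path, and target table are assumptions for illustration (PUT runs from the SnowSQL client, not a worksheet).

    -- Create a named internal stage, upload files from the client machine, verify, then load.
    CREATE OR REPLACE STAGE recon_stage FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    PUT file://C:\exports\gross_policy_*.csv @recon_stage AUTO_COMPRESS = TRUE;

    LIST @recon_stage;    -- confirm the uploaded files before loading

    COPY INTO GROSS_POLICY
      FROM @recon_stage
      ON_ERROR = 'ABORT_STATEMENT'
      PURGE = TRUE;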

Environment: SSIS, Power BI Desktop, Power BI Service, Power Apps, SQL, ADF, Azure SQL, Oracle, Snowflake

Client: Ernst & Young (E&Y), Chennai, India. Jun 2021 - Nov 2021

BI Lead Developer

Description:

FR Reporting:

The GFT-Financial Reporting system consists of several components. The back-end system consists of Extract, Transform and Load (ETL) processes that retrieve data from various source systems and then combine, convert, and classify the data according to pre-defined business rules. Once the transformations are complete, the data is loaded into Online Analytical Processing cubes. The front end of the application consists of a SharePoint-based user interface to run reports, an application for user management referred to as the User Access Management (UAM) tool, and an application for managing configuration data referred to as the Hierarchy Management Tool (HMT). The cubes also serve as a source, with MDX queries written from SSMS to load data into the SSRS reports.

Responsibilities:

oCreated ETL, Report design documents & involved in requirements gathering.

oCreated End to End flow architecture design and Technical Specification Documents.

oInvolved in Designing the Database Objects like Tables, Functions and Stored Procedures.

oGathered technical requirements and proposed solutions.

oDesigning the SSIS packages as per the business requirements using mapping documents.

oCreated ETL packages to fetch data from different sources such as OLE DB and flat files and load it into the staging OLAP database.

oCreated ETL pipelines in Python and PySpark to load data into Hive tables under Databricks

oGathering requirements from business users for the development of Tableau visualization solutions.

oCreated Azure Data Factory pipelines for copying data from Azure Blob storage to SQL Server; created ADF pipelines / SSIS packages and performance-tuned existing packages/pipelines.

oCreated fact & dimension tables and generated the relationships between them.

oWorked on data modeling to create the required tables and their relationships.

oProposed that users adopt Power BI in place of Analysis for Office to run reports connecting to Snowflake.

oUsed Foreach Loop containers, Sequence containers, Script Tasks, and Data Flow Tasks to load data from the sources to the staging DB, and used File System Tasks to archive the source files with the current date.

oDeployed the packages from SSIS solution to integration services catalog in SSMS.

oActively worked on Azure, Azure SQL Database, Azure SQL Data Warehouse, ADF v2, Blob Storage, and PolyBase, and used SSIS in the ADF environment for scripting, calling APIs, and similar tasks.

oCreated environment variables to run the packages through Control M process.

oCreated Windows Batch files to run the packages through Control M.

oHandled errors & data discrepancies using event handlers such as OnError, OnPostExecute, and OnWarning at the control-flow level, and created project-level parameters for connection managers.

oExperience with creating reports and Dashboards using Power BI.

oUsing DAX for data transformation in Power BI Desktop.

oCreated parameters in paginated reports to pass at execution time for tablix reports.

oCreated sales-performance tablix reports with labels and functions using Power BI Report Builder.

oCreated invoice paginated reports that print with all data and all pages after the reports are executed.

oPublished Paginated reports to Power BI Report manager with Webservice URL and configured the data source connections and Dataset mappings.

oExpert in Using DMVs, Performance dash board, Mirroring, database snapshots and tracking Performance Counters.

oUsed complex data transformations and manipulations on business use cases/ requirements with Data flows, Databricks

oPrimarily involved in Data Migration using SQL, SQL Azure, Azure storage, and Azure Data Factory, SSIS, PowerShell

oHands on experience in SQL/Snow SQL of Oracle/SQL Server/Snowflake.

oRebuilt / monitored indexes at regular intervals for better performance (see the T-SQL sketch after this list).

oWorked on Custom Visualizations like Chiclet Slicer, Waterfall Chart and Pulse Chart. Good experience in creating Line, Table, Matrix, Bar, Pie chart, Doughnut charts, funnel, clustered columns for visualization.

oDesigned backup & recovery, including point-in-time recovery.

oProviding Security by giving access permission to users at various levels.

oPublishing Reports to Power BI Service to create Dashboards or Visualizations.
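
A minimal T-SQL sketch of the periodic index maintenance mentioned above; the table name and the rebuild-versus-reorganize thresholds are illustrative assumptions, not project settings.

    -- Check fragmentation, then rebuild (heavy fragmentation) or reorganize (light fragmentation).
    SELECT i.name AS index_name, ps.avg_fragmentation_in_percent
    FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.FactExpense'), NULL, NULL, 'LIMITED') AS ps
    JOIN sys.indexes AS i
      ON i.object_id = ps.object_id AND i.index_id = ps.index_id;

    ALTER INDEX ALL ON dbo.FactExpense REBUILD;       -- e.g. when fragmentation > 30%
    -- ALTER INDEX ALL ON dbo.FactExpense REORGANIZE; -- e.g. when fragmentation is 5-30%

    UPDATE STATISTICS dbo.FactExpense;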

Environment: SSIS, SSAS, Azure, Power BI Desktop, Power BI Service, Microsoft SQL Server 2012/16/19, SQL, T-SQL

Client: Ernst & Young (E&Y), Chennai, India. Apr 2016 - May 2021

BI Lead Developer

Description:

Mercury Recon Expense:

The Mercury Recon Expense project is a reconciliation system that integrates the expense data from various systems and provides a report that clients can use for analytics. The expense data from different SAP systems (BW, ECC, EAT, EDW) is integrated under a common reconciliation database in a common format. These data are then used to extract the reconciliation report. A batch control program coordinates all the expense data, from the expense data load into the reconciliation database through report extraction.

Mercury Recon Expense project utilized the Microsoft Business Intelligence platform that used the SQL server Management services (2016) for maintaining the reconciliation database, integration services for reconciliation of the systems and reporting services for reporting. The batch controller is the scheduler program that periodically loads the expense data twice a day into the warehouse systems. The data that is loaded in the morning shall be termed as AM PASS data and the set that is loaded in the evening shall be termed as EU PASS. These data are sequentially loaded into various expense data systems. The sequential load shall begin with SAP ECC then proceed with SAP BW, EDW and EAT in the exact order.

Mercury Recon ECC EAT Comparison:

The Mercury Recon ECC EAT Comparison project is a reconciliation system that integrates the recon expense data from various systems and provides a report that clients can use for analytics. The expense data from the SAP BW and EAT systems is integrated under a common reconciliation database in a common format. These data are then used to extract a comparison report showing the differences between the systems. A batch control program coordinates all the comparison data, from the expense data load into the reconciliation database through report extraction.

Mercury HR Reconciliation:

The Mercury HR Reconciliation project utilized the Microsoft Business Intelligence platform: SQL Server Management Services (2016) for maintaining the reconciliation database, Integration Services for reconciliation of the systems, and Reporting Services for reporting. The batch controller is the scheduler program that periodically loads the HR expense data once a week into the warehouse systems. These data from the systems are brought together by the ETL process and loaded into one database, hereby called the Staging Database / Reconciliation database. This is a standalone OLAP database that converts the data across the systems into a common format.

Mercury MR Reconciliation:

Mercury MR Reconciliation project utilized the Microsoft Business Intelligence platform that used the SQL server Management services (2016) for maintaining the reconciliation database, integration services for reconciliation of the systems and reporting services for reporting.

These data from the systems are brought together by the ETL process and loaded into one database, hereby called the Staging Database / Reconciliation database. This is a standalone OLAP database that converts the data across the systems into a common format.

CPM Booking Failure Report:

The CPM booking failure report utilizes the Microsoft Business Intelligence platform: SQL Server Management Services (2019) for maintaining the reconciliation database, Integration Services for differentiating the EDW and Retain (APAC & EMEIA) region data from the Oracle systems, and Reporting Services for reporting.

These data from the systems are brought together by the ETL process and loaded into one database, hereby called the Staging Database / Reconciliation database. This is a standalone OLAP database that converts the data across the systems into a common format.

Once the data is loaded into the Mercury databases, we compare the data from the two source systems and display the error/issue engagement records in the SSRS report RDL. We take the Retain SKEYS as driving keys and join all the engagements with the EDW data. If an engagement does not match EDW, we display its status as a feature record; engagements that match in both systems are pulled into the SSRS reports.

The Mercury Recon applications were deployed from SQL Server 2012/16 to SQL Server 2019 as part of an upgrade.

Responsibilities:

oCreated ETL, Report design documents & involved in requirements gathering.

oCreated End to End flow architecture design and Technical Specification Documents.

oInvolved in designing database objects such as tables, views, functions, and stored procedures.

oGathered technical requirements and proposed solutions.

oDesigning the SSIS packages as per the business requirements using mapping documents.

oCreated ETL packages to fetch data from different sources such as OLE DB and flat files and load it into the staging OLAP database.

oCreated fact & dimension tables and generated the relationships between them.

oWorked on data modeling to create the required tables and their relationships.

oExperience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables.

oCreated Snowpipe for continuous data loading (see the Snowpipe sketch after this list).

oRedesigned the views in snowflake to improve the performance.

oValidated data from SQL Server to Snowflake to ensure an apples-to-apples match (a sample comparison query follows this list).

oDefined virtual warehouse sizing in Snowflake for different types of workloads.

oImplemented a one-time migration of multi-state-level data from SQL Server to Snowflake using SnowSQL.

oAutomated the Jobs using Windows shell scripting.

oUsed Foreach Loop containers, Sequence containers, Script Tasks, and Data Flow Tasks to load data from the sources to the staging DB, and used File System Tasks to archive the source files with the current date.

oDeployed the packages from SSIS solution to integration services catalog in SSMS.

oCreated environment variables to run the packages through Control M process.

oCreated Windows Batch files to run the packages through Control M.

oHandled errors & data discrepancies using event handlers such as OnError, OnPostExecute, and OnWarning at the control-flow level, and created project-level parameters for connection managers.

oCreated stored procedures to serve the reports through SSRS (a sketch follows this list).

oCreated sub-reports, cascading reports, and drill-through & drill-down reports for client reporting.

oDeployed the SSRS reports to the Report Server and granted permissions for clients & users to generate the reports.

oExperience with creating reports and Dashboards using Power BI.

oUsing DAX for data transformation in Power BI Desktop.

oCreated tablix paginated reports using Power BI Report Builder and published them to the Power BI report server.

oPublished Paginated reports to Power BI Report manager with Webservice URL and configured the data source connections and Dataset mappings.

oExpert in Using DMVs, Performance dash board, Mirroring, database snapshots and tracking Performance Counters.

oRebuilt / monitored indexes at regular intervals for better performance.

oWorked on Custom Visualizations like Chiclet Slicer, Waterfall Chart and Pulse Chart. Good experience in creating Line, Table, Matrix, Bar, Pie chart, Doughnut charts, funnel, clustered columns for visualization.

oDesigned backup & recovery, including point-in-time recovery.

oProviding Security by giving access permission to users at various levels.

oPublishing Reports to Power BI Service to create Dashboards or Visualizations.
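
A rough Snowflake sketch of the Snowpipe continuous load mentioned above; the storage integration, stage, pipe, and table names are placeholders, and AUTO_INGEST assumes S3 event notifications are already wired up.

    -- Continuous ingestion from an external stage via Snowpipe (all names are hypothetical).
    CREATE OR REPLACE STAGE recon_s3_stage
      URL = 's3://example-bucket/recon/'
      STORAGE_INTEGRATION = s3_int                 -- assumed to exist
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    CREATE OR REPLACE PIPE recon_expense_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO RECON_EXPENSE
      FROM @recon_s3_stage
      ON_ERROR = 'CONTINUE';

    -- Review what the pipe has loaded in the last day.
    SELECT *
    FROM TABLE(INFORMATION_SCHEMA.COPY_HISTORY(
         TABLE_NAME => 'RECON_EXPENSE',
         START_TIME => DATEADD('hour', -24, CURRENT_TIMESTAMP())));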
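
A minimal sketch of the kind of reconciliation query used to validate SQL Server data against Snowflake, as mentioned above: the same aggregate query is run on both sides and the results are compared. The table and column names are assumptions for illustration.

    -- Run on SQL Server and on Snowflake, then compare row counts and totals side by side.
    SELECT COUNT(*)          AS row_cnt,
           SUM(GROSS_AMOUNT) AS total_gross,
           MIN(POSTING_DATE) AS min_posting_date,
           MAX(POSTING_DATE) AS max_posting_date
    FROM   RECON_EXPENSE
    WHERE  POSTING_DATE >= '2021-01-01';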
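
And a small T-SQL sketch of a parameterized stored procedure backing an SSRS report, as referenced above; the procedure, table, and column names are illustrative, with the report passing @StartDate and @EndDate as parameters.

    -- Illustrative report procedure; SSRS maps its report parameters to @StartDate/@EndDate.
    CREATE OR ALTER PROCEDURE dbo.usp_ReconExpenseReport
        @StartDate DATE,
        @EndDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT  SourceSystem,
                EngagementId,
                PostingDate,
                GrossAmount,
                ReconStatus
        FROM    dbo.ReconExpense
        WHERE   PostingDate BETWEEN @StartDate AND @EndDate
        ORDER BY SourceSystem, PostingDate;
    END;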

Environment: SSIS, SSRS, Power BI Desktop, Power BI Service, Microsoft SQL Server 2012/16/19, SQL, T-SQL, Oracle, Snowflake.

Client: IHG (Ian H. Graham) Insurance, Chennai, India. July 2014 - Mar 2016

BI Developer

Description:

IHG (Ian H. Graham) is an insurance company offering products such as property manager, accounts, healthcare, doctors, lawyers, and hospitality insurance. The main aim of this project was to migrate the last 5 years of US and Spain data from AS400 & Oracle DB into a SQL Server 2012 DB using SSIS and T-SQL. The domain is property manager insurance: a policy is quoted and submitted to a third-party insurance company under AON. Once submitted, it is converted to a submission policy, and if a cancellation occurs it is converted to a canceled policy. When the policy renewal date approaches, a message is sent to the customer by default for policy renewal. Each person can hold different policies, and policy numbers are unique. If the customer closes the policy, its status is converted to closed.

Data is extracted from the AS400 into a local DB (Dm_Landing); from there, views and stored procedures populate the landing tables. The landing tables are used to create the source scripts based on the client requirements, and data is loaded into the Dm_Target DB using various SSIS components. Once loaded, the data is tested against the defined scenarios; if it meets the requirements, it is then loaded into the Goldmine (AU mine) DB. A rough sketch of such a landing view and load procedure follows.
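
A minimal T-SQL sketch of the landing-view plus load-procedure pattern described above. The Dm_Landing and Dm_Target database names come from the description; the table, view, column, and procedure names are illustrative assumptions (plain CREATE is used since the target platform is SQL Server 2012).

    USE Dm_Landing;
    GO
    -- View that standardizes the raw AS400 extract (raw column names are hypothetical).
    CREATE VIEW dbo.vw_PolicyLanding
    AS
    SELECT  LTRIM(RTRIM(POLNBR))          AS PolicyNo,
            TRY_CONVERT(DATE, EFFDT, 112) AS EffectiveDate,
            UPPER(LTRIM(RTRIM(POLSTS)))   AS PolicyStatus
    FROM    dbo.AS400_Policy_Raw;
    GO

    USE Dm_Target;
    GO
    -- Procedure that loads cleansed rows into the target, typically called from an SSIS Execute SQL Task.
    CREATE PROCEDURE dbo.usp_LoadPolicy
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.Policy (PolicyNo, EffectiveDate, PolicyStatus)
        SELECT  l.PolicyNo, l.EffectiveDate, l.PolicyStatus
        FROM    Dm_Landing.dbo.vw_PolicyLanding AS l
        WHERE   NOT EXISTS (SELECT 1 FROM dbo.Policy AS t WHERE t.PolicyNo = l.PolicyNo);
    END;
    GO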

Responsibilities:

oCreated design documents & involved in requirements gathering.

oInvolved in Designing the Database Objects like Tables, Views, and Stored Procedures.

oDesigned the SSIS packages per the requirements, pulling data from different source systems such as SQL Server, flat files, and Excel using the SSDT tool.

oWorked with Control Flow Items like Data Flow Task, Sequence container, Execute Sql task, Execute Package Task, File system task, Script task, Send Mail Task, For Each Loop Container.

oWorked with Data Flow Items like OLE DB Source, OLE DB Destination, EXCEL Source and EXCEL Destination, Flat file source & Destination and Implemented package configurations in packages for better reusability, System Variables, User defined Variables and Expressions. Created SQL jobs to automate the packages. Involved in testing the query logic, Packages and Excel Issues.

oConnect, import, shape, and transform data for business intelligence (BI)

oCreated SSRS reports for Reconciliations records from ETL Error Log table.

oExtensively worked on SSRS dashboards, including tabular reports, drill-down reports, sub-reports, tiles, and tables, with auto-selected filters per the requirements.

oInvolved in complete end to end


