
SQL Server Business Intelligence

Location:
Overland Park, KS
Salary:
100000
Posted:
November 22, 2023

Contact this candidate

Resume:

Sirisha Pondugula

Email: ad1cyj@r.postjobfree.com | Mobile: 704-***-****

SUMMARY:

* ***** ** ********** ** SQL Server, ETL/SSIS, and Reporting/SSRS development; expertise in SQL and in the analysis of Online Transactional Processing (OLTP), data warehouse (OLAP), and Business Intelligence (BI) applications.

• Developed and implemented CDC (Change Data Capture) for ETL (SSIS) needs; collaborated with a team of developers to design, develop, and implement BI solutions.

• Worked with stakeholders to understand business requirements and developed ETL packages. Experience in developing data integration and data migration projects using SSIS.

• Experience in creating stored procedures, functions, CTEs, cursors, and views using SQL Server.

• Experience in extracting data from heterogeneous data sources such as Excel, SQL, and CSV files, and loading it into SQL Server using SSIS.

• Experience in SQL query optimization and performance tuning as needed. Experience in SSIS package development, configuration, and deployment.

• Experience in scheduling ETL jobs using SQL Server Agent.

• Experience in SSIS/ SSRS Packages Unit Testing and Smoke testing in Production.

• Experience in developing chart, drill-down, parameterized, matrix, and ad-hoc reports using SSRS. Experience in creating an error-handling framework for SSIS packages. Worked with GCS to help understand the business in various domains.

• Experience in migrating packages from development/test to staging and from staging to production. Created, debugged, and maintained SQL/T-SQL code, stored procedures, and jobs.

• Proficient in all phases of the Software Development Lifecycle (SDLC).

• Expertise in maintaining data quality, data organization, metadata, and data profiling.

• Primarily involved in data migration using SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.

• Experience in User Requirement Gathering, User Requirement Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis, and Reporting Analysis.

• Extensive experience working with Business Intelligence data visualization tools with specialization in Tableau (Tableau 6.x/7.x/8.x Desktop, Server, Mobile, and Public), MicroStrategy, Business Objects, and SSRS.

• Experience with development, testing, debugging, implementation, documentation, and production support.

• Generated DDL scripts from the physical data model using forward engineering. Experience in customer interaction and in collecting and handling customer requirements.

• Experience in Python development and scientific programming, using NumPy and pandas for data manipulation.

• In a data warehouse environment, designed the staging area based on OLTP concepts, and cleansed and profiled data before loading it into data marts.

• Conversant with normalization/denormalization techniques in relational and dimensional environments, and have performed normalization up to 3NF (Third Normal Form).

• Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) and report development with Power BI.

• Expertise in Creating and developing Power BI Dashboards.
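As a rough illustration of the stored-procedure and CTE work listed above, the following sketch runs a common table expression from Python. SQLite stands in for SQL Server here, and the table and column names are invented for the example.

```python
import sqlite3

# Hypothetical orders table; SQLite stands in for SQL Server.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('A', 100), ('A', 250), ('B', 75);
""")

# The CTE aggregates per customer; the outer query filters on the aggregate.
totals = conn.execute("""
    WITH customer_totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total FROM customer_totals WHERE total > 100
""").fetchall()

print(totals)  # [('A', 350.0)]
```

The same WITH-clause pattern carries over to T-SQL, where CTEs are commonly used inside stored procedures and views.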

TECHNICAL SKILLS:

ETL Tools: SSIS
Reporting Tools: SSRS

Languages: SQL, SAS CLINICAL, SAS BI, VISUAL ANALYTICS, OLAP CUBE, HTML, PERL

Databases: MS SQL Server 2005/2008/2014, MS Access, Teradata, DB2, PL/SQL

Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX

PROFESSIONAL EXPERIENCE:

Skillo Dec 2022 - Present

SQL Developer/Data Analyst

Responsibilities:

Created and maintained SSIS ETL packages to extract data from different systems and load a variety of large-volume data into the data warehouse from sources such as flat files and Excel files.

Extracted data from the internal data warehouse system into SSRS. Tracked trouble-ticket requests made by end users.

Developed and implemented CDC (Change Data Capture) for ETL (SSIS) needs. Created ad-hoc reports.

Experience in design and development of Tableau visualization solutions.

Worked on generating various dashboards on Tableau Server using sources such as SAP HANA and SQL Server.

Modified stored procedures and wrote basic code and scripts to be utilized for reports within the organization.

Analysed data models, data relationships, and dependencies to optimize performance.

Designed and developed the stored procedures, queries, and views necessary to support SSRS reports. Performed functional and performance QA testing.

Designed new reports and wrote technical documentation; gathered requirements, analysed data, and built SSRS reports and dashboards, including Investor, Customer, Accounting Statement, and Delinquent reports, using SSRS and SQL Server 2008/2013.

Scheduled and managed daily/weekly/monthly sales and operational reports based on the business requirements.

Involved in gathering Report Development Specifications (RDS) and SOWs for creating reports.

Developed parameterized reports which were used for making current and future business decisions.

Developed Stored Procedures to generate various Drill-through reports, Parameterized reports, Tabular reports, and Matrix reports using SSRS.

Created reports that projected business goals for the future.

Used table variables and Common Table Expressions (CTEs).

Worked closely with the offshore team to find solutions to arising problems. Identified the stored procedures and parameters used for the reports. Involved in job scheduling and alerts.
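The matrix-style reports described above can be sketched roughly as a pivot: regions as rows, quarters as columns. Pandas stands in for SSRS here, and the sales figures and column names are invented for illustration.

```python
import pandas as pd

# Hypothetical sales data; in the real reports this would come from
# a stored procedure feeding SSRS.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [100, 120, 90, 110],
})

# A matrix report: one row per region, one column per quarter.
matrix = sales.pivot_table(index="region", columns="quarter",
                           values="revenue", aggfunc="sum")
print(matrix.loc["East", "Q2"])  # 120
```

In SSRS the equivalent layout is a matrix (tablix) with row and column groups; the pivot above shows the same aggregation in miniature.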

Environment: Win7, SSIS, MS SQL Server 2014, Access, OBIEE.

Accenture, India Dec 2021 – July 2022

Data Analyst / Python Developer

Responsibilities:

Leveraged Python for data analysis, scripting, and automation.

Collaborated with cross-functional teams to design solutions for a global financial institution.

Worked on data integration and processing using Hadoop and Spark.

Experience working on the Image and Capture project, which extracts information from client documents and loads it into SQL tables.

Managed datasets using pandas DataFrames and ran database queries from Python using a Python SQL connector package to retrieve information.

Used SQL on the back end to create stored procedures utilized in SSRS and SSIS.

Collaborating with a team of developers, designed, developed, and implemented a BI solution for Sales, Product, and Customer KPIs.

This ETL process uses many different Control Flow components (e.g., Foreach Loop, Execute SQL Task, and File System Task) and Data Flow transformations (e.g., Data Source, Data Conversion, Derived Column, Conditional Split, Union All, and Import Column).

This project relied heavily on variable logic, which makes it dynamic; variables were set at the package and data-flow levels.

Experience in creating stored procedures, functions, views, CTEs, and triggers according to development needs.

The extraction, transformation, comparison, and loading of data from DB2 into the data warehouse is done using Teradata client utilities such as FastLoad and MultiLoad; the processes are coded in Teradata SQL.

These processes are scheduled to run daily, weekly, or monthly. Teradata SQL and the client utilities played a significant role in migrating the data to the data warehouse and achieving the expected gains.

Worked on defining patterns, trends, and relationships between different data sources. Interacted with business users on a regular basis to consolidate and analyse requirements.

Identified, formulated, and documented detailed business rules and use cases based on requirements analysis, Involved in multiple projects with different business units.

Identified the Entities and relationships between the Entities to develop a logical model and later translated the model into physical model.

Worked on creating Source to Target mapping document and transformation rules for the designed data models.

Designed fact and dimension tables and created conceptual, physical, and logical data models using the Erwin tool.

Used Normalization methods up to 3NF and De-normalization techniques for effective performance in OLTP systems.

Integrated CRM systems and implemented advanced security measures.
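The DataFrame-plus-SQL-connector workflow mentioned above can be sketched as follows; sqlite3 stands in here for the actual connector, and the customers table is hypothetical.

```python
import sqlite3
import pandas as pd

# Hypothetical source table; in practice this would be a SQL Server
# database reached through a Python SQL connector.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT, active INTEGER);
    INSERT INTO customers VALUES (1, 'Acme', 1), (2, 'Globex', 0);
""")

# Pull query results straight into a DataFrame for further analysis.
df = pd.read_sql_query("SELECT * FROM customers WHERE active = 1", conn)
print(len(df))            # 1
print(df.loc[0, "name"])  # Acme
```

Once the result set is a DataFrame, the usual pandas operations (merge, groupby, pivot) apply without further round-trips to the database.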

Environment: SQL Server, Crystal Reports, Windows 7, MS Office, Business Object

Tata Consultancy Services July 2019 – Dec 2021

Data Analyst/Python Developer

Responsibilities:

Managed planning applications and utilized Python (Pandas, Matplotlib, NumPy) for data analysis and forecasting.

Improved statistical forecast demand tools with Python and SQL.

Created and maintained job workflows using scripting languages.

Worked on data migration projects using Oracle and SQL.

Utilized data analysis and visualization tools (Power BI, Tableau).

Leveraged AWS services for data processing and automation.

Involved in requirement gathering and clinical data analysis and Interacted with Business users to understand the reporting requirements, analysing BI needs for user community.

Involved in logical and physical designs and transforming logical models into physical implementations.

Normalized the data up to 3rd Normal form.

Created Entity/Relationship Diagrams, grouped and created the tables, validated the data, identified PKs for lookup tables.

Worked on creating data models for different disease types and disorders, pertaining to Epidemiology, Endocrinology, and Nervous disorders.

Involved in modelling (Star Schema methodologies) in building and designing the logical data model into Dimensional Models.

Wrote scripts in Python to automate testing jobs.

Utilized Erwin’s forward/reverse engineering tools and target database schema conversion process. Designed the data marts in dimensional data modelling using star and snowflake schemas.

Redefined attributes and relationships in the reverse engineered model and cleansed unwanted tables/columns as part of data analysis responsibilities.

Developed ETL routines using SSIS packages, planned an effective package development process, and designed the control flow within the packages.

Led the data migration task for moving data from one data centre to another as part of the London DC project; developed SQL and PL/SQL code and shell scripts.

Designed data flows that extract, transform, and load data by optimizing SSIS performance.

Worked with slowly changing dimensions (SCDs), implementing custom SCD transformations. Involved in loading data from source tables into operational data store tables using transformation and cleansing logic.

Primarily involved in data migration using SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell.

Integrated data from various sources such as MS SQL Server, DB2, and Teradata, using Informatica to perform extraction, transformation, and loading (ETL) processes.

Developed mappings, sessions, and batches for relational and flat-file sources and targets. Loaded data from sources into the target database using Workflow Manager.

Worked on improving the statistical forecast demand tool to provide accurate plan forecasts to the business, using Python, SQL, and ksh scripts.

Worked on building jobs for the planning workflows using Perl, SQL, and shell scripting.

Worked on the London data centre migration project, moving databases from Phoenix to the London DC.

Proposed many improvements in planning activities, which were accepted by clients as dollar-saving measures.

Published knowledge articles on supply-chain processing and application procedures that are useful for clients.

Upon understanding the client's requirements, developed and implemented them successfully in production, working individually.

Reviewed and Corrected Testing Reports.

Documented design, development, and testing progress for assigned projects.

Exceptional Analytical and debugging skills.

Coordinated with DBAs on tasks such as creating/rebuilding indexes, importing and exporting dump files, and analysing tables.

Demonstrated ability to performance tune queries and reports for maximum performance.

Worked on stored procedures for processing business logic in the database.

Utilized Power BI Gateway to keep the dashboards and reports up to date.

Configured Power BI Gateway for SSAS live connections.

Performed query tuning to improve performance, along with index maintenance. Worked on the reporting requirements for the data warehouse.

Created support documentation and worked closely with production support and testing team.
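The custom SCD handling mentioned above can be sketched as a Type 2 update: when an attribute changes, the current dimension row is closed out and a new current row is inserted. Pandas stands in for the SSIS transformation, and the column names and dates are hypothetical.

```python
import pandas as pd

# A one-row customer dimension; valid_to = None marks the current version.
dim = pd.DataFrame({
    "cust_id":    [1],
    "city":       ["Austin"],
    "valid_from": ["2023-01-01"],
    "valid_to":   [None],
})

def apply_scd2(dim, cust_id, new_city, change_date):
    """Type 2 SCD: close the current row and append a new current row."""
    current = (dim["cust_id"] == cust_id) & dim["valid_to"].isna()
    if dim.loc[current, "city"].iloc[0] != new_city:
        dim.loc[current, "valid_to"] = change_date   # close old version
        new_row = pd.DataFrame({"cust_id": [cust_id], "city": [new_city],
                                "valid_from": [change_date], "valid_to": [None]})
        dim = pd.concat([dim, new_row], ignore_index=True)
    return dim

dim = apply_scd2(dim, 1, "Dallas", "2023-06-01")
print(len(dim))  # 2 rows: one closed-out version, one current
```

In SSIS the same effect is achieved with the Slowly Changing Dimension wizard or a custom lookup-plus-conditional-split pattern; this sketch just shows the history-preserving logic.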

Environment: SSRS, SSIS, Business Objects, Azure, SQL Server 2008, MS SQL, Python, PL/SQL, SQL, Teradata, Control-M, ITIL, Data Analysis, VBA, Macros, Power BI.

Tech Mahindra, India August 2018 – June 2019

Data Engineer Intern

Responsibilities:

Assisted in the development of data ingestion pipelines, performing data extraction, transformation, and loading (ETL) tasks.

Collaborated with the data engineering team to clean, validate, and transform data for quality assurance.

Supported database administrators in tasks related to database schema management and query optimization.

Contributed to the maintenance of data warehouses, ensuring data accessibility for analysis.

Developed and executed scripts in Python, SQL, or Shell for automating data-related processes.

Assisted in the design and implementation of data models to support reporting and analytical needs.

Collaborated on performance optimization of data pipelines, databases, and queries.

Maintained documentation for data pipelines, ETL processes, and data dictionaries to facilitate knowledge sharing.
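The extract-transform-load tasks above can be sketched minimally as follows; the source rows and target schema are invented for the example, with SQLite standing in for the warehouse.

```python
import sqlite3

# Extract: raw rows as they might arrive from a source feed (hypothetical).
source_rows = [(" alice ", "42"), ("BOB", "17"), (None, "5")]

# Transform: trim and normalize names, cast ages, drop rows with no name.
cleaned = [(name.strip().title(), int(age))
           for name, age in source_rows if name]

# Load: insert the cleaned rows into the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)", cleaned)

loaded = conn.execute("SELECT name, age FROM people ORDER BY age").fetchall()
print(loaded)  # [('Bob', 17), ('Alice', 42)]
```

Real pipelines add logging, batching, and error handling around each stage, but the extract/transform/load split stays the same.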
