SQL Server Data Analyst

Location:
Glen Allen, VA
Posted:
March 11, 2024


Swapna Kasireddy

Data Modeler/Data Analyst/ETL Test Analyst

ad39fx@r.postjobfree.com

Ph:804-***-****

www.linkedin.com/in/sriswapna59

Summary:

●Around 6 years of experience in the analysis, design, development, testing, interfacing, and implementation of applications using Snowflake, Oracle, SQL Server, and DB2.

●More than 4 years of experience in the Financial and Banking domain.

●Experience with business processes in the banking, credit card, telecommunications, and call center management industries.

●Developed ad hoc queries and reports using Snowflake, Oracle, SQL, PL/SQL, SAS, and UNIX to fulfill data requests from business, operations, and financial analysts.

●Automated Snowflake scripts in UNIX.

●Proficient in importing/exporting large amounts of data between text files and Snowflake (see the loading sketch after this list).

●Worked on SQL Assistant and on SQL Server 2005/2008 database tuning, triggers, stored procedures, and data migration.

●Experience in writing and implementing packages, triggers, stored procedures, functions, views, and indexes at the database and form levels using PL/SQL in Oracle and T-SQL in SQL Server (a brief T-SQL sketch follows this list).

●Highly experienced in performance tuning and optimization to increase script efficiency.

●Ability to understand and write SQL and Stored Procedures in Snowflake and SQL Server.

●Worked on a team building flowcharts from existing templates and handling the data manipulation process for flowcharts in Fractal Campaign.

●Performed data verifications and validations to confirm that generated data is appropriate, consistent, and meets requirements.

●Worked with the Campaign Manager during requirements gathering, business intent creation, and campaign execution.

●Applied knowledge of data, supplier systems, and markets, together with programming and statistical methods, to proactively manage and evaluate data for quality (accuracy and completeness) and on-time delivery.

●Performed analysis and data manipulation to resolve queries, identify opportunities, and implement and improve processes to ensure quality.

●Worked on developing and executing campaigns using the Fractal Campaign tool.

●Experience includes development and integration of databases.

●Highly proficient in database design and implementation of OLAP (data warehousing) with MS Analysis Services and DTS (ETL), with a good understanding of star and snowflake schema design methods.

●Knowledgeable in gathering business requirements and translating them into reporting needs.

●Experience in cloud development and architecture on Amazon AWS and Amazon Redshift, with basic knowledge of Azure.

●Solid hands-on experience with data model repositories and documentation in metadata portals such as Erwin, ER Studio, and PowerDesigner.

●Experience with Teradata utilities such as FastExport, FastLoad, and MultiLoad to export and load data to/from different source systems, including flat files.

●Working experience with data migration testing to the Azure cloud.

●Experience in data cleansing, data transformation, and data mapping specifications from source to target databases.

●Extensive experience working with XML, schema design, and XML data.

●Good understanding of the Ralph Kimball (dimensional) and Bill Inmon (relational) modeling methodologies.

●Proficient in data mart design using dimensional data modeling: identifying facts and dimensions and designing star and snowflake schemas (see the DDL sketch after this list).

●Involved in multiple phases of analytics using R, Python, and Jupyter notebooks.

●Excellent knowledge of creating databases, tables, stored procedures, DDL/DML triggers, views, functions, and indexes using T-SQL.

●Extensive experience with T-SQL in constructing triggers and tables and implementing stored procedures, functions, views, user profiles, data dictionaries, and data integrity.

●Proficiency in Normalization to 3NF and Denormalization techniques for optimum performance of database environments like OLTP and OLAP systems.

●Well versed in system analysis, Relational and Dimensional Modeling, Database design and implementing RDBMS specific features.

●Proficiency in extracting data and creating SAS datasets from various sources like Oracle database, Access database, and flat files using Import techniques.

●Experience in dealing with different data sources ranging from flat files, Oracle, Sybase, and SQL server.

●Strong experience using MS Excel and MS Access to load and analyze data based on business needs.

●Conducted data analysis, mapping, transformation, and data modeling, applying data warehouse concepts.

●Proficient in data governance, data quality, metadata management, master data management.

●Hands on experience in working with Tableau Desktop, Tableau Server and Tableau Reader in various versions.

●Expertise in performing User Acceptance Testing (UAT) and conducting end user training sessions.
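
A minimal Snowflake sketch of the text-file loading and unloading described above; the stage, table, and file names are assumptions for illustration, not actual project objects:

    -- Load a pipe-delimited text file from a named stage into a table,
    -- then unload query results back to the stage; names are illustrative.
    CREATE OR REPLACE FILE FORMAT txt_format
      TYPE = 'CSV' FIELD_DELIMITER = '|' SKIP_HEADER = 1;

    COPY INTO customer_accounts                    -- hypothetical target table
      FROM @my_stage/accounts.txt                  -- hypothetical stage and file
      FILE_FORMAT = (FORMAT_NAME = 'txt_format')
      ON_ERROR = 'ABORT_STATEMENT';                -- fail fast on bad rows

    COPY INTO @my_stage/export/accounts            -- unload ("vice versa")
      FROM (SELECT * FROM customer_accounts)
      FILE_FORMAT = (FORMAT_NAME = 'txt_format')
      OVERWRITE = TRUE;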
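
For the SQL Server side of the stored procedure, trigger, and view work noted above, a brief T-SQL sketch; all object names are invented for the example:

    -- A view, a stored procedure, and an audit trigger in T-SQL.
    CREATE VIEW dbo.vw_active_accounts AS
      SELECT account_id, balance FROM dbo.accounts WHERE status = 'A';
    GO

    CREATE PROCEDURE dbo.usp_adjust_balance
      @account_id INT, @amount DECIMAL(12,2)
    AS
    BEGIN
      SET NOCOUNT ON;
      UPDATE dbo.accounts SET balance = balance + @amount
      WHERE account_id = @account_id;
    END;
    GO

    -- Record every update for auditing (accounts_audit is hypothetical).
    CREATE TRIGGER dbo.trg_accounts_audit ON dbo.accounts AFTER UPDATE
    AS
      INSERT INTO dbo.accounts_audit (account_id, changed_at)
      SELECT account_id, SYSDATETIME() FROM inserted;
    GO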
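
The fact and dimension identification mentioned above typically produces DDL along these lines; a generic star-schema sketch with invented names:

    -- One additive fact table keyed to two dimension tables.
    CREATE TABLE dim_customer (
      customer_key INT PRIMARY KEY,    -- surrogate key
      customer_id  VARCHAR(20),        -- natural key from the source system
      segment      VARCHAR(30)
    );

    CREATE TABLE dim_date (
      date_key     INT PRIMARY KEY,    -- e.g. 20240311
      calendar_dt  DATE,
      fiscal_month VARCHAR(7)
    );

    CREATE TABLE fact_transactions (
      customer_key INT REFERENCES dim_customer (customer_key),
      date_key     INT REFERENCES dim_date (date_key),
      txn_amount   DECIMAL(12,2)       -- additive measure
    );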

Technical Skills:

Databases: Databricks, Oracle, SQL Server, MS Access, Azure

Database tools: PySpark, Snowflake SQL, Teradata SQL Assistant, SQL Navigator, SAS, Toad, SQL Workbench, Brio

Reporting tools: Crystal Reports, MicroStrategy, Business Objects, Tableau, QlikView

Languages: PySpark SQL, PL/SQL, Snowflake SQL, R, Redshift, Python, UNIX shell scripting

Windows Applications: MS Word, MS Excel, MS PowerPoint

AWS: S3, Redshift, EC2

Tools: SAS, Base SAS, SAS/SQL, SAS/MACRO, SAS/ODS, SAS/ACCESS, Erwin, Pivot tables, Python, R, Snowflake schema, Star schema

Professional Experience:

Capital One, Richmond, VA

Data Analyst, Nov 2019 – Present

Responsibilities:

●Strong experience in data analysis, data migration, data cleansing, transformation, integration, and data import/export using ETL tools such as Ab Initio and Informatica PowerCenter; experience in testing and writing SQL and PL/SQL statements, including stored procedures, functions, triggers, and packages.

●Experience in data analysis, data modeling, data mining, dashboard development, and testing.

●Experience in developing, testing, and deploying Tableau reporting solutions using Tableau Server.

●Worked extensively with dimensional modeling, data migration, data transformation, metadata, data dictionaries, data loading, and performance tuning.

●Strong experience with and knowledge of Amazon Web Services: AWS Redshift, AWS S3, and AWS EMR.

●Experience working on different databases/data warehouses, including Teradata, Oracle, Apache, AWS Redshift, and SQL Server.

●Database software development using Oracle SQL & PL/SQL, Oracle Forms & Reports, Unix Shell Scripting and Java.

●Experience in programming languages such as Python and UNIX shell programming.

●Worked on MS SQL Server and Access databases; developed T-SQL and SQL stored procedures, cursors, triggers, views, indexes, and constraints, and used the Query Optimizer.

●Strong experience building software in Oracle SQL and PL/SQL, utilizing database objects such as stored procedures, functions, type objects, cursors, ref cursors, views, materialized views, and PL/SQL collections.

●Experience importing various types of external data files into SAS datasets/SAS libraries and creating SAS datasets using SAS/INFILE, SAS/IMPORT, and SAS/SQL.

●Performed performance tuning, SQL query enhancements, and code enhancements to achieve performance targets using explain plans (see the tuning sketch after this list).

●Developed SQL commands using the SQL Assistant (Queryman) front-end tool to match the business requirements for reports such as credit path and ACBS.

●Translating the user requirements into technical requirements and mapping the data entities to table columns in the database.

●Involved in query optimization and performance tuning.

●Expertise in creating, editing, and deploying reports in QlikView and Tableau, as well as SQL Server Integration Services packages.

●Performed numerous data pulling requests using SQL for analysis.

●Wrote hundreds of DDL scripts to create tables and views in the company Data Warehouse.

●Designed and developed weekly and monthly reports for the marketing and financial departments using Snowflake SQL (an illustrative query follows this list).
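
The explain-plan tuning workflow referenced in this list can be illustrated in Oracle SQL; the query, tables, and index are assumptions, not the actual workload:

    -- Capture the execution plan, read it, then try a candidate index.
    EXPLAIN PLAN FOR
      SELECT c.customer_id, SUM(t.txn_amount)
      FROM   transactions t
      JOIN   customers c ON c.customer_id = t.customer_id
      WHERE  t.txn_date >= DATE '2024-01-01'
      GROUP  BY c.customer_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);   -- inspect join order and scans

    CREATE INDEX ix_txn_date ON transactions (txn_date);   -- candidate fix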
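
An illustrative shape for the weekly and monthly Snowflake SQL reports mentioned above; the table and columns are invented for the example:

    -- Twelve months of activity rolled up by month and campaign.
    SELECT DATE_TRUNC('month', txn_date) AS report_month,
           campaign_id,
           COUNT(DISTINCT customer_id)   AS customers,
           SUM(txn_amount)               AS total_amount
    FROM   marketing_transactions
    WHERE  txn_date >= DATEADD('month', -12, CURRENT_DATE)
    GROUP  BY 1, 2
    ORDER  BY 1, 2;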

Environment: Snowflake, Databricks

UHG, NC

Jr. Data Analyst, Dec 2016 – Apr 2017

Responsibilities:

●Tested Slowly Changing Dimension (SCD) Type 2 logic (see the validation sketch after this list).

●Performed defect tracking and root cause analysis for issues.

●Responsible for testing and reviewing ETL mapping and transformation specifications based on business requirements from various technical and business teams.

●Tested ETL jobs scheduled for file transfers from operational data stores to designated file systems/directories.

●Tested various reusable ETL transformations that facilitate daily, weekly, and monthly data loads.

●Identified performance bottlenecks at sources, targets, mappings, and sessions, and applied the required fixes.

●Managed software defect information and interacted with business users to set the severity and priority of issues.

●Participated in weekly project status meetings and updated testing progress.

●Worked on preparing test cases for Ab Initio graphs.

●Developed test cases for several reports featuring multiple prompt selections, drill-through reports, list reports, cross-tab reports, and chart reports using Cognos 10.

●Performed testing for reports and prepared testing documents.

●Verified requirements coverage by conducting walkthrough meetings of Test plan and scenarios with business analysts, project manager and test supervisors.

●Extensive experience using ETL and reporting tools along with test management tools such as Azure DevOps.

●Created the test plan, test strategy, critical scenarios, test scripts, and testing schedule.

●Worked with technical designers and architects to understand the requirements for a test environment setup.

●Prepared test data by modifying the sample data in the source systems, to cover all the requirements and scenarios.

●Used predefined UNIX scripts for file transfer and file manipulation as part of Test Data preparation.

●Manually executed and validated functional and migration tests for ETL and BI processes.

●Used Toad and SQL*Plus for testing the execution of ETL processes against business rules.

●Interacted with Developers and management to identify and resolve technical issues.

●Conducted GUI, Functional, Front end & back end testing and reviewed pages for content problems.

●Also worked on Integration of all the processes in UAT/PROD.

●Optimized/tuned several complex SQL queries for better performance and efficiency.

●Ran SQL queries to perform database validation according to the business logic (see the source-to-target comparison after this list).

●Developed Integration and System test cases using Quality Center.

●Tracked and reported defects into QC and notified management with details.

●Wrote and executed test cases and documented defects in Quality Center.

●Solved day to day problems of the team arising due to functionality and validation issues.

●Analyzed the root cause of defects and documented the findings.
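
The SCD Type 2 testing noted at the top of this list usually includes checks like the following; a hedged sketch with illustrative table and column names:

    -- No business key should have overlapping effective-date ranges.
    SELECT a.member_id
    FROM   dim_member a
    JOIN   dim_member b
      ON   a.member_id = b.member_id
     AND   a.row_eff_dt <> b.row_eff_dt    -- distinct versions of the key
     AND   a.row_eff_dt <  b.row_exp_dt
     AND   b.row_eff_dt <  a.row_exp_dt;   -- date ranges overlap

    -- At most one row per key may be flagged current.
    SELECT member_id
    FROM   dim_member
    WHERE  current_flag = 'Y'
    GROUP  BY member_id
    HAVING COUNT(*) > 1;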
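
For the database validation queries mentioned above, a typical source-to-target comparison looks like this; the table and column names are assumptions:

    -- Row counts must match, and the MINUS of mapped columns must be empty.
    SELECT (SELECT COUNT(*) FROM src_claims) AS src_rows,
           (SELECT COUNT(*) FROM tgt_claims) AS tgt_rows
    FROM   dual;

    SELECT claim_id, claim_amount FROM src_claims
    MINUS
    SELECT claim_id, claim_amount FROM tgt_claims;   -- rows lost or altered by ETL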

Environment: Quality Center, Informatica PowerCenter, Erwin, MS Excel, Oracle, SQL, PL/SQL, Teradata utilities, TOAD, Autosys Scheduler, UNIX, Windows, Azure cloud

Capital One, VA

Automation QA Analyst/ETL Testing, May 2014 – Jan 2016

Roles and Responsibilities:

●Followed Agile methodology from the start of each sprint to determine whether the stories in VersionOne and their acceptance criteria were well defined and satisfied customer requirements. Executed test cases manually as well as through automated functional and regression testing.

●Worked on a web application test framework using Ruby and Cucumber to automate functional and regression test scripts.

●Ensure that all the test cases are updated in the HP Quality Center along with the Master Test Plan.

●Performed defect management in HP Quality Center and prepared test summary reports and product defect metrics. Worked as defect coordinator.

●Attended daily standup meetings and provided updates on defects. Involved in bi-weekly sprint planning meetings and demos with the business.

●Involved in gathering Functional Requirements, Business Requirements and Design Documents.

●Involved in testing process improvements, including the use of Selenium add-ons and creating reusable tests.

●Performed Functionality, Negative, Security, GUI, Integration, System and Database testing.

●Edited the automatically generated scripts to customize testing.

●Communicated defects using HP ALM 11 with proper Severity and Priority.

●Provided a subset of functional test cases to the UAT team.

●Performed Front End Functionality & GUI testing.

●Performed back-end SQL testing, with hands-on development skills in SQL (triggers, procedures, functions, and packages) and SQL*Loader experience.

●Met with internal teams and applicable vendors to understand requirements, conduct testing sessions, and troubleshoot issues.

●Used HP ALM 11.0 for bug tracking and reporting; followed up with the development team to verify bug fixes and update bug status. Involved in User Acceptance Testing.

●Analyzed data by querying various database sources (Oracle, SQL Server, MySQL) via SQL.

●Created DDL scripts using ER Studio and source-to-target mappings (S2T, for ETL) to bring data from multiple sources into the warehouse.

Education & Certification:

Master of Computer Science and Engineering (MCA), JNTU, India.

AWS Certified Developer - Associate (CDA).


