
Data Analyst

Location: Granada Hills, CA

Posted: March 30, 2020


Sandeep Rangineni

615-***-****

adcjsg@r.postjobfree.com

9+ years of strong experience in Information Technology, with extensive manual and automated software testing of client/server and web-based applications, products, Big Data solutions (HDFS, Hadoop), ETL, and data warehousing projects. Served as Test Lead, Test Architect, and Test Manager, with demonstrated expertise in managing testing projects over the years. In-depth knowledge of the Software Development Life Cycle (SDLC) and of Waterfall, V-model, and Agile development methodologies. Strong ability to set up QA teams; proficient in identifying, analyzing, and solving critical project problems to increase customer satisfaction.

PROFESSIONAL SUMMARY:

Extensive experience in creating and monitoring test plans, test scenarios, test cases, test reports, status reports and other documentation for both Manual and Automated testing and execution.

Proficient testing experience in all phases and stages of the Software Testing Life Cycle and Software Development Life Cycle (SDLC) using Agile and Waterfall, with good working knowledge of testing methodologies, task allocation, resource management, and scheduling.

Involved in end-to-end, timely execution of assigned data projects, including interacting with business teams and coordinating with developers and testers throughout the SDLC phases for coding, testing, and deployment.

Worked with Business Analysts and functional team members to review and translate business requirements into ETL technical specifications.

Proficient in Test Automation using UFT (Formerly QTP) and Selenium Tools.

Strong Experience in Automating Web Application Testing using Selenium WebDriver.

Extensively tested Salesforce Reports and Dashboards (Salesforce.com).

Extensively tested Salesforce Sales, Marketing and Service Cloud Applications.

Extensively used ETL methodology to support data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica.

Solid Back End Testing experience by writing SQL queries and executing shell scripts.

Involved in exhaustive performance tuning, monitoring, and optimization of PL/SQL blocks in both development and test environments, for Oracle and SQL Server databases.

Excellent SQL (DDL/DML) and PL/SQL programming skills, using complex stored procedures, functions, views, indexes, cursors, and triggers with performance tuning to run data integrity tests (a representative check is sketched at the end of this summary).

Experience in Data Analysis, Data Validation, Data Verification, Data Cleansing, Data Completeness and identifying data mismatch.

Strong experience in test data management, defect management and test environment management.

Well-rounded working experience on Databases like Teradata, Hive (Hadoop), Oracle, MYSQL.

Hands on experience in Bigdata Testing using Hive, SQL, UNIX and HDFS Workflow.

Experience on UNIX commands and Shell Scripting.

Performed smoke, system, system integration, user acceptance, database, and regression testing.

Facilitated daily status calls with the offshore development team for tracking project progress. Responsible for creating timeline estimations and presenting weekly status reports to the business stakeholders.

Extensive experience using Test Driven Platform (TDP), JIRA, Quality Center (QC), Application Lifecycle Management (ALM), and Remedy for test management and defect tracking, and Quick Test Professional (QTP) for regression testing.

Expertise in Data Warehouse, ETL Verification, BI reports.

Excellent analytical, problem solving, and communication skills, with ability to interact with individuals at all levels.
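
The data integrity testing described above typically reduces to source-to-target reconciliation in SQL. The sketch below is a minimal illustration only; the staging and warehouse table names (stg_customer, dw_customer) and their columns are hypothetical and not taken from any specific project.

    -- Hypothetical staging and warehouse tables, used for illustration only.
    -- 1. Row-count reconciliation between source and target.
    SELECT (SELECT COUNT(*) FROM stg_customer) AS src_count,
           (SELECT COUNT(*) FROM dw_customer)  AS tgt_count
    FROM   dual;

    -- 2. Records present in the source but missing or different in the target.
    SELECT customer_id, first_name, last_name, status
    FROM   stg_customer
    MINUS
    SELECT customer_id, first_name, last_name, status
    FROM   dw_customer;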

TECHNICAL SKILLS:

Salesforce.com: Sales Cloud, Service Cloud, Custom Cloud, Apex Data Loader

Cloud Computing: Force.com, Apex, Visualforce, Force.com apps

Databases: Oracle 12c/11g/10g/9i/8i, MySQL 5.7, MS SQL Server 2005/2008/2012/2014

Scripting: Unix shell scripting, JavaScript, VBScript

Tools: Toad, PL/SQL Developer, Visual Studio 2008/2012, MS SQL Server 2008/2012/2014, SSIS, SSRS, PowerShell, Visio, Eclipse (Java), JIRA, ALM

ETL: Informatica PowerCenter 8.6, Informatica Cloud

Scheduling Tools: Autosys

Version Control: SVN, Bitbucket, Git

EDUCATION:

Master of Science in Engineering Management, International Technological University, USA, 2015

Master of Science in Information Technology, University of Northern Virginia, USA, 2012

Master of Business Administration, Jawaharlal Nehru Technological University, Hyderabad, 2009

CERTIFICATIONS:

Salesforce Certified Administrator.

WORK EXPERIENCE:

Client: Cydcor LLC, Agoura Hills, California January 2018 – Present

Role: Salesforce Administrator

PROJECT:

Migrating campaigns from Merlin to Salesforce and providing production support.

Responsibilities:

Work with Data Integrity and Duplicate Management to help clean and dedupe lead, contact, and account data (a representative duplicate check is sketched at the end of this section).

Perform data integrity (rules and merging records) functions establishing proper ownership and record type maintenance in accordance with sales territories.

Involved in Regression Testing using Selenium.

Execution of Selenium Test cases and Reporting defects.

Designed Test cases Using Selenium Webdriver.

Extensively validated the API integration between Salesforce and the STI system using MuleSoft.

Participate in user requirement sessions and document user requirements to address changing business needs; review the design approach with the Product Manager.

Performs system administration functions such as user management (profiles and roles), field and validation rule configuration, record types, picklists, page layout management, mobile setup, data management (uploads), email templates, folder management, and public groups, as well as other configuration items.

Administers overall setup, configuration and maintenance of the Salesforce.com platform for the various divisions.

Work with custom workflow, notifications, approval processes, and Lightning Process Builder.

Using Informatica and SQL, integrated the lead ingestion process for different campaigns by identifying the core functionality and the data required by the sales reps working in Salesforce.

Modified SQL queries and stored procedures to enhance the existing Informatica ETL process that handles lead ingestion for the AT&T and CenturyLink clients, supporting increased ingestion volumes into Salesforce.

Interact with developers, end business users and various members of my team to discuss and to resolve defects and their priorities.

Manage user roles, profiles, user licenses, and permission sets according to the latest business requirements and user needs.

Generate dashboards and reports for business users to keep track of their daily progress based on the roles and profiles defined in the organization.

Validated API integration between Mulesoft and Salesforce.

Worked on visual force permissions and static resources.

Worked on Custom objects and Page Layouts.

Worked on product commissions, templates, and price books.

Extensively worked on Salesforce Classic and Salesforce Lightning.

Extensively used JIRA for user stories execution.

Extensively used the data loader and workbench to load, update or delete the records from Salesforce.

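The dedupe work above was driven by Salesforce Duplicate Management rules and the Data Loader; the query below is only a generic SQL sketch of the same idea, run against a hypothetical flat extract of lead data (the lead_staging table and its columns are assumptions for illustration).

    -- lead_staging is a hypothetical export of Salesforce lead records; names are illustrative only.
    -- Find email addresses that appear on more than one lead record.
    SELECT LOWER(email)      AS email_key,
           COUNT(*)          AS dup_count,
           MIN(created_date) AS oldest_record
    FROM   lead_staging
    WHERE  email IS NOT NULL
    GROUP  BY LOWER(email)
    HAVING COUNT(*) > 1
    ORDER  BY dup_count DESC;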

Environment: SQL, PL/SQL, Oracle11g, MS-Word, TOAD, Xactly, Jira, Unix Shell scripting, Informatica, Salesforce, MS-Excel, Edit Plus, UNIX, Eclipse, Java, Tableau.

Client: DIRECTV (now part of AT&T), El Segundo, California August 2016 to December 2017

Role: ETL Tester

PROJECT:

The enterprise management system handles the distribution of commissions and payments to DIRECTV distributors and dealers. Its purpose is to calculate compensation for Retail, Non-Retail (Commercial, MDU, SMATV), Installer, Distributor, and Telco (Rev Share) partners.

Served as lead for the Enterprise Payment System (EPS), TrueComp, ACTS, and PARIS applications. Performed system integration, end-to-end, and regression testing in non-production environments, and profiled the applications to identify the root cause of bottlenecks or performance issues and provide recommendations to the project teams.

Responsibilities:

Managed and led a QA team of 6 members, delivering on average 2-3 projects per monthly release.

Main tasks include requirement analysis, test strategizing, test cycle planning, estimation, test plan review and test execution review.

Participated in high-level reviews of Functional Requirements Specification (FRS), Business Functional Specification (BFS), High Level Design (HLD), and Detail Level Design (DLD) documents to gather requirements used in designing high-level test scenarios and test cases.

Experience with resource planning, scope assessment, testing framework, communication plan, reporting metrics template, defect reports, change management.

Worked through all stages of the SDLC for this project and designed and executed functional, integration, regression, system (end-to-end), UI, browser compatibility, and backend (database) testing.

Ensured the content and structure of all Testing documents/artifacts are documented and maintained in HP Quality Center.

Organized status meetings with project management and sent status reports (daily, weekly) to the client.

Organized Defect Management through Triage Calls involving Business, development and IT Teams using Quality Center.

Extensively performed regression testing in SFDC test sandbox for various releases for:

•Accounts

•Contacts

•Profiles, Roles, Users

•Cases

Ensured timely delivery of all the testing deliverables by coordinating with the Offshore and Nearshore teams.

Developed the detailed ETL Test Plan.

Tested Complex ETL Mappings and Sessions based on business user requirements to load data from various source systems like flat files, Mainframe DB2 tables and XML files to target tables.

Performed Validation of ETL processes from various source systems to Target database.

Tested mappings, reusable objects, transformations, and mapplets using Informatica PowerCenter.

Used Informatica Power Center for extraction, loading and transformation (ETL) of data in the data warehouse.

Represent the QA team to bring cohesion between Project Managers, Business Analysts, Technical Delivery Managers, Tech Leads, Developers and testers.

Identified the areas of Process Improvements which resulted in Cost Savings and increased the testing efficiency. Documented and implemented the best practices.

Executed SQL queries using SQL developer to find useful test data to verify various functionalities of the application and done the back-end verification.

Tested several Informatica ETL mappings, ran them on UNIX for loading, and checked the log files.

Tested PL/SQL Packages to transform/load the staging data into schema using business logic.

Performed data analysis and data profiling using SQL and Informatica Data Explorer on various sources systems including Oracle and Teradata.

Written several complex SQL queries for data verification and data quality checks.

Ran all ETL processes and verified that data from the source files was properly loaded into the target tables.

Extracted data from various sources like Oracle, flat files and SQL Server.

Analyzed, designed and developed test cases for Big Data analytics platform for processing data using Hadoop and Hive.

Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.

Imported and exported data into HDFS and Hive.

Extracted the data from different databases into HDFS using Sqoop.

Developed SQL queries/scripts and macros to validate the completeness, integrity and accuracy of data.

Experienced in managing and reviewing Hadoop Log files.

Validated both managed and external tables in Hive and verified optimized performance with an understanding of partitioning and bucketing concepts (a representative table layout and check are sketched at the end of this section).

Created and ran Sqoop jobs with full/incremental loads to populate Hive external tables and to export HDFS data to Oracle.

Developed UNIX scripts and Autosys jobs to load extract files into staging tables using SQL*Loader (sqlldr).

Coordinated various tasks with team members and monitored the QA process to ensure software quality. Provided weekly progress updates and status reports to QA management and other management on a periodic basis.

Acted as in-depth technical liaison to developers for communicating and reproducing problems.
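
The Hive validation called out above (partitioning, bucketing, external tables loaded by Sqoop) can be illustrated with a minimal Hive SQL sketch. The table name, columns, HDFS path, and bucket count below are hypothetical and only stand in for the kind of structures that were checked.

    -- Hypothetical external table; the path, columns, and partition/bucket choices are illustrative only.
    CREATE EXTERNAL TABLE IF NOT EXISTS payments_ext (
        payment_id BIGINT,
        dealer_id  STRING,
        amount     DECIMAL(12,2)
    )
    PARTITIONED BY (load_date STRING)
    CLUSTERED BY (dealer_id) INTO 16 BUCKETS
    STORED AS ORC
    LOCATION '/data/eps/payments';

    -- Register newly landed HDFS partitions, then reconcile per-partition row counts
    -- against the counts reported by the source system.
    MSCK REPAIR TABLE payments_ext;
    SELECT load_date, COUNT(*) AS row_cnt
    FROM   payments_ext
    GROUP  BY load_date;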

Environment: SQL, PL/SQL, Oracle 9i/10g/11g, Oracle SQL Developer, Oracle Enterprise Manager 12c, Callidus Sales Performance Management system, Hadoop 2.0, CDH 5.4, Beeline, Hue, Hive, Sqoop, Impala, Data Torrent, Data Router, Autosys, Unix shell scripting, Test Driven Platform (TDP), Scala, AIX servers, Putty, ALM, HTML, XML, CA Workload Center, Actuate Reports, Media Mastero, PowerCenter 8.1, MS-Excel, Edit Plus, WS-FTP and SVN.

Client: DIRECTV, El Segundo, CA August 2013 to July 2016

Role: ETL Tester

PROJECT:

Worked as a consultant for DIRECTV (now a subsidiary of AT&T), a direct broadcast satellite service provider to households in the United States, Latin America, and the Caribbean. It offers services to subscribers through set-top boxes and mobile applications for viewing content anywhere, any time.

As a System Analyst, worked end to end on different applications and databases in a pre-production environment, profiling applications to identify the root cause of bottlenecks and functional or performance issues and to provide recommendations to the project teams.

Responsibilities:

Analyzed business requirements and module-specific functionalities to identify test requirements and formulate an effective master test plan.

Went through the functional specifications and implemented their technical aspects in system and integration testing (SIT).

Developed Test cases, Test scripts from the data mapping documents, functional Specification documents and mapped the test cases with the requirements for generating the Requirement Traceability Matrix (RTM).

Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.

Design and Development of QA documentation like data matrix and Test scenarios from business and functional requirements.

Worked through all stages of the SDLC for this project and designed and executed functional, integration, regression, system (end-to-end), UI, browser compatibility, and backend (database) testing.

Experience in ETL Data Warehousing, database testing using Data Stage for Workflow process.

Leading the team with technical and functional issues as part of KTLO support.

Worked on PL/SQL Stored Procedures, Functions, Packages and Triggers to implement business rules into the application.

Designed detailed test plans, test case and executed test scripts and diagnose problems in Informatica mappings based on Business Requirements.

Extensively tested the following associations (a representative association check is sketched at the end of this section):

•Association between Sales Rep – Care Role

•Association between Sales Rep – Lead and campaign management

•Association between Territory – DMA Addresses and shopping cart orders

Involved in Gathering and Analyzing Business Requirements, High Level Designing and Detail Technical Design of ETL application.

Led the offshore team and gave exclusive training on the application for testing and issue resolution.

Created different database links and validated data across databases.

Extensively tested for:

•Accounts

•Contacts

•Leads

•Opportunities

•Cases

Analyzed bug trends and performed root cause analysis on the bugs.

Worked as test data analyst in order to provide test data for projects.

Deliver new and complex high-quality solutions to clients in response to varying business requirements.

Responsible for review of Functional Requirement Specification and System Design Specification documents for testing.

Prepared Test Scenarios by creating Mock data based on the different test cases.

Responsible for effective communication between the project team and the customer.

Provide day to day direction to the project team and regular project status updates to the Management.

Extensively used PL/SQL Procedures and Functions to transform data before the loading process.

Worked with the UTL_FILE utility to transfer data from the Oracle application to external systems.

Involved in quality assurance of data, automation of processes, programming for data manipulation, validation of programming, and presentation of results.

Created SQL reports, data extraction and data loading scripts for different databases and schemas.

Scheduled shell script jobs, created job instances, and verified logs according to client requirements using the Autosys server.

Used the Informatica PowerCenter 8 tool to load strip files into Oracle databases.

Used the profiling capabilities of the Informatica Data Quality 9.1 tool to profile various sources, generate scorecards, and create and validate rules, and provided data to business analysts for creating the rules.

Involved in maintaining the test environments, with activities such as requesting data loads, database backups, server restarts, and deployments, and troubleshooting issues.

Knowledge of the Callidus Sales Performance Management system, which applies various rules and conditions to calculate compensation amounts for dealers.

Worked with the UNIX shell scripts and ran them in debug mode to identify any issues in the scripts.

Built and deployed rules in TrueComp 7.x.

Involved in extensive production support activities for KTLO.

Used the Application Lifecycle Management (ALM) tool for test management to open defects, map test cases to requirements, and obtain status reports.

Tested Business Objects Reports according to the business requirements.

Developed and executed scripts to test the Performance of the Oracle Database using Jmeter under heavy load.

Worked with the performance team to run load, stress, endurance, scalability, spike, and capacity tests against the database.

Tested several complex reports generated by Business Objects, including dashboards, summary reports, master-detail reports, and Actuate reports.

Prioritized the test cases based on the projected metrics and project deadlines.

Worked with Developers on Defects until the test case is passed.

Verified test results (viewing the log files using the vi editor, querying the database using SQL) and documented the defects.

Involved in testing the UI screens and reports used by the internal business team.
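
The association testing noted earlier in this section (for example, sales rep to care role) comes down to orphan-record checks between related tables. The sketch below is illustrative only; sales_rep and rep_care_role are hypothetical stand-ins for the actual association tables.

    -- Hypothetical tables standing in for the real association tables.
    -- Flag sales reps that have no active care-role association.
    SELECT r.rep_id, r.rep_name
    FROM   sales_rep r
    WHERE  NOT EXISTS (
             SELECT 1
             FROM   rep_care_role cr
             WHERE  cr.rep_id = r.rep_id
             AND    cr.status = 'ACTIVE'
           );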

Environment: SQL, PL/SQL, Oracle 9i/10g/11g, Oracle SQL Developer, PVCS, MS-Word, Callidus Sales Performance Management system, Unix shell scripting, AIX servers, Putty, ALM, HTML, XML, CA Workload Center, Media Mastero, Informatica Data Quality 9.1, Actuate Reports, PowerCenter 8.1, MS-Excel, Edit Plus, WS-FTP, UNIX, JMeter 3.0, Oracle Enterprise Manager 11g and Delphix Environment.

Client: Wells Fargo, Charlotte, NC March 2013 to July 2013

Role: PL/SQL Data Analyst

Responsibilities:

Analyzed the Requirements from the client and developed Test cases based on functional requirements, general requirements and system specifications.

Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.

Performed Smoke Testing, GUI Testing, Functional Testing, Backend Testing, System Integration Testing, Sanity Testing, and User Acceptance Testing (UAT).

Prepared Test Cases and Test Plans for the mappings developed through the ETL tool from the requirements.

Went through the functional specifications and implemented their technical aspects in system and integration testing (SIT).

Developed Test cases, Test scripts from the data mapping documents, functional Specification documents and mapped the test cases with the requirements for generating the Requirement Traceability Matrix (RTM).

Coordinated software development teams and QA teams on issues and solved the problems.

Provided application support to other programmers and developers in handling multiple projects.

Worked with project teams to investigate complex issues, identify and implement solutions to re-occurring problems.

Established and documented ETL QA standards, procedures and QA methodologies.

Led the offshore team and gave exclusive training on application functionality and interface-related issues.

Shared / spread knowledge throughout the Support Team and relevant areas, from development through to support.

Extensively tested several ETL mappings developed using DataStage.

Provided 24x7 application support for all applications, resolving critical issues on time.

Used Putty to connect to Linux from windows.

Worked with ETL Source System and Target System (ETL Mapping Document) for writing test cases and test scripts.

Worked in all areas of incident management and handled critical issues with efficiency.

Maintained Quality Center for defect tracking and hosted meetings with the developers to make sure all the defects are included in the release.

Involved in defect management meetings on a daily basis to prioritize the work and provide solutions.

Provided support to other teams in resolving and troubleshooting problems.

Developed SQL queries (Sub queries and Join conditions), PL/SQL programming to extract data from large database.

Tested the PL/SQL procedures, functions, packages according to the requirements.

Verified test results (viewing the log files using the vi editor, querying the database using SQL) and documented the defects.

Involved in testing the UI screens and reports used by the internal business team.

Involved in the creation and testing of database objects like tables, views, stored procedures, functions, packages, DB triggers, Indexes, and db links.

Coordinated with multiple teams, all levels of personnel and subject matter experts.

Led and performed the maintenance and repair of applications in accordance with technical and functional specifications.

Tested data migration to ensure that integrity of data was not compromised.

Wrote complex SQL queries for extracting data from multiple tables and multiple databases.

Extensively used Oracle database to test the Data Validity and Integrity for Data Updates, Deletes & Inserts.

Provided the management with weekly QA documents like test metrics, reports, and schedules.

Environment: Oracle 10g, SQL, PL/SQL, Flat files, HP Quality Center, Web Services, SQL Developer, Putty, Informatica Data Quality 9.1, WinSCP, Cisco Systems VPN Client, TOAD, WAS, WMB, Windows and UNIX.

Client: Assurant Solutions, Miami, FL July 2012 to February 2013

Role: PL/SQL Data Analyst

Responsibilities:

Reviewed and analyzed functional requirement specifications, workflow documents, and Use Cases.

Performed System Testing, Integration Testing, Functional and Regression Testing.

Created Data Driven Tests to validate test scenario with different sets of data using parameterization.

Closely worked with the Business Analyst, Developers to resolve the requirement issues, deployment issues, change management etc., during the QA testing and actively participated in Review meetings and walkthroughs.

Prepared test data to verify different types of scenarios.

Created Manual Testing strategy and performed Database Testing using SQL Queries by retrieving data from Oracle Database

Analyzed Manual Test Cases with QA Team for most important & high priority Requirements for conversion to Automated Test Scripts using Quality Center.

Extensively used HP Quality Center as defect tracking system to log, close and generate reports and tracked through to resolution.

Extensively involved in the Backend / Data base testing using SQL queries.

Interacted with developers in resolving the defects found in the application during testing.

Working closely with team members to ensure status and schedules are communicated

Participate in peer reviews of functional specification, application previews, and test plans/test cases.

Create and execute SQL queries to fetch data from Oracle database to validate Data.

Interacted with business analysts and developers to resolve the technical issues so as to meet the client's requirement for a better-quality software product.

Tested the PL/SQL Stored Procedures, Functions, Packages and Triggers to implement business rules into the application.

Responsible for Study, Analysis, Design, Development/Customization and Testing/debugging of the reports.

Used optimizer hints and tuned SQL queries and stored procedures to improve the performance of the reports (a representative hinted query is sketched at the end of this section).

Involved in Functional, system integration, Regression and smoke testing.

Involved in and provided support for User Acceptance Testing (UAT).

Worked on supply chain management and functionality of the project.

Involved in quality assurance of data, automation of processes, programming for data manipulation, validation of programming, and presentation of results.

Created SQL reports, data extraction and data loading scripts for different databases and schemas.

Loaded data from flat files into database tables using SQL*Loader.

Expertise in the UNIX environment and shell scripting; excellent knowledge of batch jobs using UNIX Korn shell, AWK, and shell scripts (bash, ksh).

Optimizing database objects and streamlining applications.

Analysis of customer reported issues, which could be a code issue, data fix issue, performance issue, setup issue, functional issue or an RDBMS issue

Tested the Oracle forms and reports using Forms 6i/10g and Reports 6i/10g.
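
The hint-based tuning mentioned above can be illustrated with a minimal Oracle SQL sketch. The table, index, and column names (claim_detail, claim_status_idx) are hypothetical and not drawn from the actual reports.

    -- Hypothetical table and index; the hint asks the optimizer to use the named index
    -- instead of a full table scan for this report query.
    SELECT /*+ INDEX(cd claim_status_idx) */
           cd.claim_id,
           cd.policy_id,
           cd.claim_amount
    FROM   claim_detail cd
    WHERE  cd.claim_status = 'OPEN'
    AND    cd.report_date >= TRUNC(SYSDATE) - 30;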

Environment: Oracle 10g/11g, SQL, PL/SQL, Reports (6i/10g), Forms (6i/10g), Oracle Enterprise Manager 11g, HP Quality Center 10, UNIX, Web Services, SQL Developer, Putty, Jira, Test Rally, WS-FTP.

Suchi InfoTech Group, INDIA May 2007 to December 2009

Oracle Developer

Participated in the Requirements and functional review meetings of project.

Prepared documentation for requirements, design, installation, unit testing, and system integration.

Tested the database objects like Procedures, Functions, Packages, Triggers, Indexes and Views in Test environment using PL/SQL, Toad and SQL*Plus.

Involved in preparing the Test Plan and Test Cases in HP Quality Center 11.0.

Involved in Database Testing by writing SQL queries, testing triggers and PL/SQL procedures.

Participated in QA team meetings and Bug tracking meetings.

Help the users in performing User Acceptance Test.

Developed and tested the Unix Shell scripts to automate repetitive database processes.

Developed SQL queries to fetch complex data from different tables in remote databases using joins, database links, and bulk collects (a representative cross-database query is sketched at the end of this section).

Worked on ETL tools to extract and load data from SQL Server, and flat files to Oracle database.

Handled system utilization issues, performance statistics, capacity planning, integrity monitoring, maintenance, reorganization, and recovery of databases.

Experienced in analyzing the issue by checking the log files in the UNIX environment.

Generated detailed test status reports and graphical charts for the applications under test.

Responsible for monitoring the web/app servers, DB servers, Java servers, etc.

Monitored the Available Bytes graphs to get an indication of any memory leak present in the system.
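
A database-link query of the kind described above might look like the minimal sketch below; the link name (remote_erp) and the table and column names are assumptions for illustration only.

    -- orders@remote_erp is a hypothetical table reached over a database link named remote_erp.
    SELECT o.order_id,
           o.order_date,
           c.customer_name
    FROM   orders@remote_erp o
    JOIN   customers c
      ON   c.customer_id = o.customer_id
    WHERE  o.order_date >= TO_DATE('01-JAN-2009', 'DD-MON-YYYY');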

Environment: Oracle 9i, SQL Server 2005, SQL, PL/SQL, Toad, MS Access, SharePoint Server, Web Services, Unix, Bugzilla.

TRAININGS ATTENDED:

Year   Title                              Location       Organised by

2013   Enterprise Payment System          El Segundo     DIRECTV
2013   Application Lifecycle Management   El Segundo     DIRECTV
2014   TrueComp                           El Segundo     DIRECTV
2017   Technology Development Platform    El Segundo     DIRECTV
2018   Salesforce Training                Agoura Hills   Cydcor
2018   JIRA                               Agoura Hills   Cydcor


