
Data SQL

Location: Secunderabad, Telangana, India
Posted: February 08, 2017


Ghanta

acyqdb@r.postjobfree.com

732-***-****

PROFESSIONAL SUMMARY:

* ***** ** ******** ******* of experience in source-to-target mapping, gathering business requirements from business users, creating source-to-target mapping documents, creating process flows, and preparing data flow diagrams (DFDs).

Worked in various domains, including Banking, Financial Services, Healthcare, and Telecommunications.

Well versed in data migration, data conversion, and data extraction/transformation/loading (ETL) using DTS and PL/SQL scripts.

Experience in designing and developing data warehouse applications using ETL and business intelligence tools such as Informatica PowerCenter, DataStage, mainframe SAS, SSAS, SSIS, OLTP, OLAP, and Business Objects.

Experience with data extraction, transformation, and loading (ETL) using tools such as Data Transformation Services (DTS), SSIS, Bulk Insert, and BCP.

Experience with relational databases such as Oracle 10g/9i/8i, DB2 ESE, MS SQL Server 2005/2008, DTS, Sybase, Teradata V2R5, and MS Access, and with formats such as flat files, CSV files, COBOL files, and XML files.

Extensive success in translating business requirements and user expectations into detailed specifications employing Unified Modeling Language (UML) in an SOA environment.

Worked with heterogeneous relational databases such as Teradata, Oracle, MS Access, and SQL Server.

Experience working with third-party tools such as WinSQL, TOAD, and SQL Developer.

Experience with other utilities, tools, and scripts, including Korn shell scripting, SQL*Plus, SQL*Loader, the Export and Import utilities, TOAD 9.1, and Visio 10.0.

Experience in complete Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and Bug Life Cycle (BLC).

Good command of modeling using CASE tools such as ERwin, Oracle Designer, PowerDesigner, and ER/Studio.

Involved in preparing the Traceability Matrix based on high-level/detailed requirements and test cases.

Experienced in handling concurrent projects and delivering expected results within given timelines.

Excellent communication, documentation, and presentation skills with clear understanding of business process flow.

Excellent understanding of data warehousing concepts: star and snowflake schemas, SCD Type 1/Type 2, normalization/denormalization, and dimension and fact tables.

Expertise in a broad range of technologies, including business process tools such as Microsoft Project, MS Excel, MS Access, and MS Visio; technical assessment tools; MicroStrategy; and data warehouse data modeling and design.

TECHNOLOGY EXPERTISE:

Database Systems: Oracle 11g/10g/9i/8i, SQL Server 2008/2005/2000, IBM DB2, Teradata 12, Sybase ASE 12, MS Access 2007.

Data Modeling Tools: dimensional data modeling, star schema modeling, snowflake modeling, fact and dimension tables, conceptual/logical/physical data modeling; ERwin r7.2/3/4, Visio, Sybase PowerDesigner.

Reporting: Crystal Reports and SQL Server 2000/2005 Reporting Services

ETL Tools: Informatica 9.0/8.1/7.x/6.x/5.x, DataStage, SSIS, SSAS

Database Development: T-SQL and PL/SQL

Office Applications: MS Office (Word, Excel, Visio, PowerPoint, Project)

Web Development: HTML, XML, FrontPage, Visual Basic 6.0

Operating Systems: Windows 7/Vista/XP/2003/2000, UNIX (AIX 5.2/4.3), Sun Solaris 7

Programming Languages: C#, Visual Basic .NET, C++, VB 6.0

PROFESSIONAL EXPERIENCE:

Sempra Energy Utility, Los Angeles, California

Sr. Data Analyst April 2015 - Present

Responsibilities:

Performed batch processing of data; designed the SQL scripts, control files, and batch files for data loading.

Performed data accuracy, data analysis, and data quality checks before and after loading the data.

Coordinated with the Business Analyst team on requirement gathering and the Allocation Process Methodology; designed the filters for processing the data.

Designed and developed database objects (tables, materialized views, stored procedures, indexes) and the SQL statements for executing the Allocation Methodology, creating the OP table and CSV/text files for the business.

Processed large Excel source data feeds for Global Function Allocations and loaded the CSV files into the Oracle database with the SQL*Loader utility, as sketched below.
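
As an illustration of that loading step, the control-file sketch below shows the minimal shape such a SQL*Loader load takes; the file, table, and column names (allocations.csv, stg_allocation) are hypothetical stand-ins, not the actual production objects.

    -- load_allocations.ctl: minimal SQL*Loader control file (illustrative names)
    LOAD DATA
    INFILE 'allocations.csv'              -- CSV feed exported from Excel
    BADFILE 'allocations.bad'             -- rows rejected during the load
    APPEND INTO TABLE stg_allocation      -- hypothetical staging table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      cost_center,
      alloc_period   DATE "YYYY-MM-DD",
      alloc_amount   DECIMAL EXTERNAL
    )

A file like this would be invoked with something such as sqlldr control=load_allocations.ctl log=allocations.log.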

Performed code inspections and moved the code into the production release.

Documented all related activities in Quality Center and coordinated with the QA team.

Developed best practices and standards for data governance processes.

Performed data filtering and dissemination activities; troubleshot database activities, diagnosed bugs, and logged them in the version control tool.

Performed source data analysis, captured metadata, and reviewed results with the business; corrected data anomalies per business recommendations.

Created source-to-target mappings for multiple sources from SQL Server to Oracle, which were used by ETL developers.

Performed the physical database design; normalized tables and worked with denormalized tables to load the data into the fact tables of the data warehouse.

Developed SQL joins, queries, views, test tables, and scripts in the development environment, and tuned SQL.

Used SQL*Loader to load data from external systems and developed PL/SQL programs to move the data from staging tables into base tables.

Extensively wrote SQL queries (subqueries, correlated subqueries, and join conditions) for data accuracy, data analysis, and data extraction needs; a sample correlated subquery follows.
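
A small sketch of the kind of correlated subquery used for such accuracy checks; the stg_allocation/base_allocation tables and their columns are illustrative assumptions:

    -- Flag staging rows whose amount disagrees with the base table
    SELECT s.alloc_id, s.alloc_amount
    FROM   stg_allocation s
    WHERE  s.alloc_amount <> (SELECT b.alloc_amount
                              FROM   base_allocation b
                              WHERE  b.alloc_id = s.alloc_id);

The subquery is re-evaluated per outer row, which is what makes it correlated; any row returned is a data-accuracy discrepancy.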

Designed star schema with dimensional modeling, created fact tables and dimensional tables.

Involved in data analysis, data discrepancy reduction in the source and target schemas.

Designed and developed the star-schema data model and fact tables to load the data into the data warehouse.

Implemented one-to-many and many-to-many entity relationships in the data modeling of the data warehouse.

Developed the E-R diagrams for the logical database model and created the physical data model with the ERwin data modeler (a DDL sketch of this shape follows).
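
The DDL below sketches the shape of such a star schema; the fact and dimension names (fact_usage, dim_date, dim_customer) are illustrative, not the actual warehouse objects.

    -- Illustrative star schema: one fact table keyed to two dimensions
    CREATE TABLE dim_date (
      date_key     NUMBER PRIMARY KEY,
      cal_date     DATE NOT NULL
    );

    CREATE TABLE dim_customer (
      customer_key NUMBER PRIMARY KEY,
      customer_nm  VARCHAR2(100)
    );

    CREATE TABLE fact_usage (
      date_key     NUMBER REFERENCES dim_date (date_key),
      customer_key NUMBER REFERENCES dim_customer (customer_key),
      usage_amt    NUMBER(12,2)   -- additive measure
    );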

Analyzed the tables once the imports were done.

Edited database SQL files with the vi editor; used UNIX commands and facilities such as cron for job scheduling, file execution, background processes, and grep.

Wrote Perl scripts to extract data from the Oracle database, check database connections, apply regular expressions, and so on.

Developed the VBA integration between Excel feeds and the SQL database, including SQL extraction and transformation of Excel data into the SQL database.

Involved in writing Oracle PL/SQL procedures, functions, and Korn shell scripts that were used for staging, transformation, and loading of the data into base tables.

Involved in debugging and troubleshooting of database objects.

Involved in data loading and data migration: used SQL*Loader to load data from Excel files into staging tables and data cubes, and developed PL/SQL procedures to load data from the staging tables into the base tables of the data warehouse (see the PL/SQL sketch below).
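
A minimal PL/SQL sketch of such a staging-to-base load, again assuming hypothetical stg_allocation/base_allocation tables rather than the real ones:

    -- Illustrative staging-to-base load; skips rows already present in the base table
    CREATE OR REPLACE PROCEDURE load_base_allocation AS
    BEGIN
      INSERT INTO base_allocation (alloc_id, cost_center, alloc_amount)
      SELECT s.alloc_id, s.cost_center, s.alloc_amount
      FROM   stg_allocation s
      WHERE  NOT EXISTS (SELECT 1
                         FROM   base_allocation b
                         WHERE  b.alloc_id = s.alloc_id);
      COMMIT;
    END load_base_allocation;
    /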

Used the TOAD and SQL Developer tools to develop programs and execute queries.

Worked on SQL query performance issues; used indexing logic to obtain good performance (example below).
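
For example, an index supporting a frequently filtered column, verified through the execution plan (object names are illustrative):

    -- Index to support frequent lookups by cost_center
    CREATE INDEX ix_base_alloc_cc ON base_allocation (cost_center);

    -- Confirm the optimizer picks it up
    EXPLAIN PLAN FOR
      SELECT * FROM base_allocation WHERE cost_center = 'CC-100';
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);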

Involved in monitoring database performance in the Enterprise Manager console.

Created the XML control files to upload the data into the data warehousing system.

Environment: Struts Framework 1.1, JSP 1.2, Servlet 2.3, XML, Oracle 11g, Data Warehouse, OLAP, OLTP, SQL Navigator, SQL Developer, Erwin 4.0, MS Excel 2000, MS Office 2000, Microsoft Windows XP Professional.

MT&T Bank, Buffalo, NY

Sr. Data Analyst Sep 2014 - Mar 2015

Responsibilities:

Designed and created test cases based on the business requirements.

Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.

Involved in extensive data validation using SQL queries and back-end testing (a sample reconciliation check follows).
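
A typical back-end check of this kind reconciles row counts between source and target; the src_txn/tgt_txn tables below are hypothetical:

    -- Reconcile source and target row counts per load date
    SELECT s.load_dt, s.src_rows, t.tgt_rows
    FROM  (SELECT load_dt, COUNT(*) AS src_rows FROM src_txn GROUP BY load_dt) s
    JOIN  (SELECT load_dt, COUNT(*) AS tgt_rows FROM tgt_txn GROUP BY load_dt) t
      ON  s.load_dt = t.load_dt
    WHERE s.src_rows <> t.tgt_rows;   -- any row returned is a load discrepancy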

Used SQL to query the database in a UNIX environment.

Developed separate test cases for the ETL process and reporting.

Worked with the design and development teams to implement the requirements.

Developed test scripts and executed them manually to verify the expected results.

Designed and developed ETL processes using the Informatica ETL tool for dimension and fact file creation.

Involved in manual and automated testing using QTP and Quality Center.

Conducted black-box testing (functional, regression, and data-driven) and white-box testing (unit and integration, covering positive and negative scenarios).

Tracked, reviewed, and analyzed defects, and compared results, using Quality Center.

Participated in MR/CR review meetings to resolve the issues.

Defined the scope for System and Integration Testing.

Prepared and submitted summarized audit reports and took corrective actions.

Involved in uploading master and transactional data from flat files, preparing test cases, and subsystem testing.

Documented and published test results; troubleshot and escalated issues.

Prepared various test documents for the ETL process in Quality Center.

Involved in test scheduling and milestone planning with their dependencies.

Performed functional testing of the email notifications for ETL job failures, aborts, and data issues.

Identified, assessed, and communicated potential risks associated with the testing scope, product quality, and schedule.

Created and executed test cases for ETL jobs that upload master data to the repository.

Responsible for understanding enhancements and newly developed features and training others on them.

Conducted load testing and provided input into capacity planning efforts.

Supported the client in assessing how many virtual user licenses would be needed for performance testing, specifically load testing using LoadRunner.

Created and executed test scripts, cases, and scenarios to determine optimal system performance according to specifications.

Environment: Windows XP, Informatica PowerCenter 6.1/7.1, QTP 9.2, Test Director 7.x, LoadRunner 7.0, Oracle 10g, UNIX AIX 5.2, Perl, Shell Scripting

Nationwide Insurance, Columbus, OH

Data Analyst Feb 2013 - Aug 2014

Responsibilities:

Involved in interacting with end users (clients) to gather business requirements.

Participated in analyzing and modeling the requirements for the logical and physical design of the database using star schema and normalization techniques.

Involved in data mining, profiling, and data modeling.

Developed complex mappings to load data from the source system (Oracle) and flat files.

Built reusable mapplets using Informatica Designer.

Used Informatica Power Center Workflow manager to create sessions and workflows to run the logic embedded in the mappings.

Developed and modified mappings for extraction, staging, slowly changing dimensions (Type 1, Type 2, and Type 3), facts, and summary tables, incorporating the changes needed to address the performance issues stated in the re-architecture specifications.

Developed mappings for slowly changing dimensions of Type 1 and Type 2, facts, and summary tables using a wide range of transformations (see the SQL sketch below).
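
As a SQL-level sketch of the Type 2 logic such mappings implement (the Informatica transformations themselves are graphical), against a hypothetical dim_customer dimension:

    -- Illustrative SCD Type 2: expire the changed current rows, then insert new versions
    UPDATE dim_customer d
    SET    d.current_flag = 'N',
           d.end_dt       = SYSDATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.customer_nm <> d.customer_nm);  -- attribute changed

    INSERT INTO dim_customer (customer_id, customer_nm, start_dt, end_dt, current_flag)
    SELECT s.customer_id, s.customer_nm, SYSDATE, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');

Running the expiry step first means the insert picks up both brand-new customers and new versions of changed ones.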

Worked on Informatica PowerCenter 7.1.1: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, mapplets, and reusable transformations.

Wrote PL/SQL stored procedures and functions for the Stored Procedure transformation in Informatica.

Defined and represented entities, attributes, and joins between the entities.

Implemented PL/SQL scripts in accordance with the necessary Business rules and procedures.

Extensively developed PL/SQL Procedures, Functions, Triggers, and Packages.

Wrote UNIX shell scripts to automate loading files into the database, scheduled via crontab.

Developed batch files to automate and schedule tasks.

Provided support for the development, pre-production, and production databases.

Environment: Agile, Oracle 10g (SQL, PL/SQL), SQL*Loader, Forms 10g, TOAD 7.4, HTML/DHTML, Pro*C, JavaScript, Windows NT/2000, Crystal Reports 10, Informatica PowerCenter 7.1.3, Erwin 4.0, UNIX.

Cigna Healthcare, Colorado Springs, CO

Data Analyst April 2012 - Jan 2013

Responsibilities:

Analyzed business requirements, system requirements, and data mapping requirement specifications; responsible for documenting functional and supplementary requirements in Quality Center 9.0.

Involved in developing detailed test plans, test cases, and test scripts using Quality Center for functional and regression testing.

Handled external interactions with NSCC, DTCC, NFS, and the various fund companies with which MMLISI is associated.

Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.

Tested the reports using Business Objects functionality such as queries, slice and dice, drill down, cross tab, master detail, and formulas.

Involved in Teradata SQL development, unit testing, and performance tuning.

Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.

Tested the ETL Informatica mappings and other ETL processes (data warehouse testing).

Tested several stored procedures.

Validated several Business Objects reports. Reviewed and tested business requests for data and data usage.

Tested the ETL process both before and after the data validation process.

Tested the messages published by the ETL tool and the data loaded into various databases.

Responsible for data mapping testing by writing complex SQL queries using WinSQL (sample query below).
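
A sketch of the kind of mapping-validation query involved, using the MINUS set operator (EXCEPT on some platforms) over hypothetical src_account/tgt_account tables:

    -- Rows present in the source but missing or altered in the target
    SELECT account_id, account_nm, balance FROM src_account
    MINUS
    SELECT account_id, account_nm, balance FROM tgt_account;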

Experience in creating UNIX scripts for file transfer and file manipulation.

Validated the data passed to downstream systems.

Worked with Data Extraction, Transformation, and Loading (ETL).

Involved in testing data mapping and conversion in a server-based data warehouse.

Involved in testing the UI applications.

Involved in Security testing for different LDAP roles.

Tested whether the reports developed in Business Objects met company standards.

Used Quality Center to track and report system defects.

Involved in testing the XML files and verified that the data was parsed and loaded into the staging tables.

Environment: Informatica 8.1/7.1, Informix, DB2, Java, Business Objects, SQL, SQL Server 2000/2005, Teradata V2R6 (MLOAD, FLOAD, FASTEXPORT, BTEQ), Teradata SQL Assistant 7.0, TOAD, XML, XSLT, IBM AIX 5.3, UNIX, Shell Scripting, WinSQL, UltraEdit, Rumba UNIX Display, Quality Center 8.2.

Exilant Technologies, India

Data Analyst June 2008 - Mar 2012

Responsibilities:

Analyzed results, documented the process, and prepared the data mapping document; analyzed incoming feeds, conducted data profiling, and mapped the relevant fields to the target repository, incorporating business validation rules to integrate the data.

Identified use cases from the requirements and wrote use case specifications.

Created business process workflow diagrams (Activity diagrams)

Linked business processes to organizational objectives, performed critical path analysis, and identified opportunities for business process improvement.

Created mapping documents, ETL technical specifications, and various documents related to data migration.

Wrote complex SQL queries to test the Informatica ETL process.

Developed PL/SQL programs, stored procedures for data loading and data validations.

Involved in testing the Argus Safety system 5.0/5.1.

Developed Oracle and Teradata queries to replace current data warehouse reports.

Involved in testing Cognos custom reports and out-of-the-box reports using Argus Insight 5.0/5.1.

Developed complex SQL queries to validate the data in the Cognos custom reports against the Safety database.

Coordinated with the offshore team in the testing process of the SDA (Signal Detection and Alerts) & CBS (Corporate Business Support) Cognos custom reports.

Performed end-to-end ETL testing of the custom tables that were developed for the Cognos custom reports.

Coordinated the testing process for the out-of-the-box reports (Argus Insight reports).

Coordinated the testing process for various periodic reports, such as CTPR, NDA, IND, and PSUR.

Developed complex SQL queries to validate the data in periodic reports as part of aggregate testing (example below).
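
A sketch of an aggregate check of this kind, comparing a report-level total against the underlying detail; the case_event and periodic_report_summary tables (assumed to hold one summary row per product) are hypothetical:

    -- Compare a periodic report's case count against the detail table
    SELECT d.product_cd,
           COUNT(*)              AS detail_cases,
           MAX(r.reported_cases) AS report_cases
    FROM   case_event d
    JOIN   periodic_report_summary r ON r.product_cd = d.product_cd
    WHERE  d.event_dt BETWEEN DATE '2011-01-01' AND DATE '2011-06-30'
    GROUP  BY d.product_cd
    HAVING COUNT(*) <> MAX(r.reported_cases);  -- mismatched aggregates surface here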

Involved in regression testing of Argus Safety (periodic) and Insight reports (Cognos custom and out-of-the-box) to verify that the hotfix defects had been fixed in the system (Hotfix 1 and Hotfix 2 testing).

Assisted in the configuration of products, licenses, and studies in Argus Safety.

Tested various BizTalk interfaces developed for the automation of product, license, and study configurations.

Familiar with EDC, eCRF, E2B, and other data exchanges.

Validated the systems per GxP and SOX compliance requirements; provided support for post-production issues, fixing and testing them.

Environment: Oracle 11g, PL/SQL, SQL*Loader, SQL Reports, Teradata, DB2, Cognos 8 BI, MS Office, MS Visio, MS Word, MS PowerPoint, MS Excel, ERwin, EDM Teams, EDM Quality, Argus Safety, Argus Insight.


