
Data Developer

Location:
Dublin, OH
Posted:
April 13, 2017


Resume:

Sowjanya Talacheru

Sr. Informatica Developer

aczsn1@r.postjobfree.com

614-***-****

Professional Summary:

* ***** ** ************ ********** data integration and data migration solutions using the ETL tool Informatica.

Worked on all phases of the data warehouse development life cycle, from requirements gathering through development, testing, implementation, and support.

Experience with data warehousing tools: Informatica 9.6/9.5/9.1/8.6 as the ETL tool and Cognos ReportNet 1.1 MR3 as the reporting tool.

Strong knowledge of data warehouse tools, including Informatica PowerCenter 9.5/9.1/8.6.

Designed and implemented global data warehouse programs, including data design, ETL, and maintenance. Strong knowledge of data warehousing concepts, including star schemas, dimension tables, and fact tables.

Proficient in the fundamentals and finer details of UNIX programming.

Proficient in extracting data from sources such as Oracle using ETL tools, transforming it according to business logic, and loading it into the target warehouse.

Experience in writing and optimizing SQL queries in SQL Server.

Experience in writing queries for data verification and analysis.

Formulated and executed architectural standards for Guidewire solutions.

Designed and developed Guidewire applications and relevant components.

Provided technical advisory and reporting services for Guidewire application versions.

Analyzed functional requirements and evaluated Guidewire technical designs.

Experience in data quality and data profiling.

Solid understanding of databases, data warehousing, OLTP, OLAP, data modeling, and business intelligence tools such as Cognos.

Strong analytical and bug-fixing skills; received multiple appreciations from different clients.

Productive team member with a positive attitude; self-motivated quick learner, willing to adapt to new challenges and new technologies.

Technical Proficiency:

ETL Tools : Informatica 9.6/9.5/9.1/8.6

Languages : SQL, PL/SQL

RDBMS : Oracle 9i/10g/11g, SQL Server

Product Name : Guidewire PolicyCenter, ClaimCenter

Education:

Bachelor of Electronics and Instrumentation from JNTU Hyderabad.

Professional Experience:

Company : ERNST & YOUNG LLP, 2012-2016

Client : GE, Louisville, KY

Duration : September 2015 – March 2016

Environment : Informatica 9.1, Teradata, Unix, Cognos

Role : Senior ETL Developer

Description:

The project implemented a Comprehensive Capital Analysis and Review (CCAR) FR solution to support generation of the CCAR FR schedules. The solution extracted data for multiple schedules, applied FRB-mandated edit checks to the transformed CCAR data via validation rules, loaded edit-check failures into schedule-specific exception tables, and loaded the final (Fed-ready) CCAR schedule output into the CCAR Data Mart.

Results Delivered:

Created ETL objects for multiple schedules to load data into the CCAR data mart.

Involved in analysis and documentation of mapping specs as per business rules.

Created mappings using a wide range of transformations (Source Qualifier, Expression, Filter, Aggregator, Joiner, Lookup, Router, etc.) to load data into the DWH according to business logic.

Developed mappings with transformations and mapplets conforming to the business rules.

Created complex mappings and performed performance tuning on them.

Created user-defined functions.

Developed sessions using Informatica Workflow Manager.

Processed session data using threads and various tasks (Session, Command, etc.), managed database connections, and scheduled workflows.

Used the Informatica Debugger to troubleshoot valid mappings.

Worked on Exception Handling.

Identified bottlenecks affecting the ETL process and performed tuning and optimization of sources, targets, and mappings.

Expertise in creating databases, users, tables, triggers, views, stored procedures, functions, packages, joins, and hash indexes in Teradata.

Extensively worked with Teradata utilities such as FastExport, FastLoad, and MultiLoad to export and load data to and from different source systems, including flat files.

Hands-on experience with query tools such as TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.

Performed data management projects and fulfilled ad-hoc requests to user specifications using tools such as Excel and Teradata.
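The edit-check pattern described in this project (routing validation failures into schedule-specific exception tables while loading clean rows into the mart) can be sketched as follows. This is an illustrative example only, using SQLite and hypothetical table names and a hypothetical validation rule, not the actual CCAR schedules or FRB edit checks.

```python
import sqlite3

# Illustrative edit-check pattern: rows failing a validation rule go to a
# schedule-specific exception table; passing rows load into the data mart.
# All table/column names and the rule itself are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_schedule_hc  (acct_id TEXT, balance REAL);
CREATE TABLE ccar_mart_hc     (acct_id TEXT, balance REAL);
CREATE TABLE exc_schedule_hc  (acct_id TEXT, balance REAL, reason TEXT);
INSERT INTO stg_schedule_hc VALUES ('A1', 100.0), ('A2', -5.0), ('A3', NULL);
""")

# Hypothetical edit check: balance must be present and non-negative.
cur.execute("""
INSERT INTO exc_schedule_hc
SELECT acct_id, balance, 'balance missing or negative'
FROM stg_schedule_hc
WHERE balance IS NULL OR balance < 0
""")
cur.execute("""
INSERT INTO ccar_mart_hc
SELECT acct_id, balance
FROM stg_schedule_hc
WHERE balance IS NOT NULL AND balance >= 0
""")
conn.commit()

passed = cur.execute("SELECT COUNT(*) FROM ccar_mart_hc").fetchone()[0]
failed = cur.execute("SELECT COUNT(*) FROM exc_schedule_hc").fetchone()[0]
print(passed, failed)  # prints "1 2": one clean row, two exception rows
```

In the actual solution this routing would be done inside Informatica (e.g., with a Router transformation sending failures to the exception target), with one exception table per schedule.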

Project : Nationwide Claim Center

Client : Nationwide, Dublin, OH

Environment : Informatica 9.6, Oracle 11.6, Guidewire ClaimCenter 7, Cognos

Duration : September 2014 – October 2015

Role : Senior ETL developer

Description:

Nationwide Insurance is implementing Guidewire ClaimCenter as its new claims-handling system to meet its business needs for insurance claims processing and related functions. The project builds Guidewire's Claims Administration solution by migrating business data from various legacy and downstream systems. The conversion process extracts data from the legacy systems, cleanses and consolidates it, and loads it into an intermediate schema, with record counts and amounts reconciled against the legacy data sources.

Results Delivered:

Designed ETL objects and a data conversion strategy to load data into ClaimCenter; analyzed source data and gathered requirements from the business users.

Worked with Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.

Used most of the core transformations, including Source Qualifier, Aggregator, Lookup, Router, Filter, Sequence Generator, Expression, and Joiner.

Involved in performance tuning at source, mapping and target level.

Involved in unit testing to check the data consistency.

Used Source Analyzer to import metadata from Oracle DB.

Developed financial reconciliation scripts, routines for database consistency checks, and bulk validations for legacy systems in ClaimCenter.

Created shared folders, local and global shortcuts to reuse metadata.

Handled deliverables for both personal and commercial lines of business.

Involved in System Integration Testing (SIT) and User Acceptance Testing (UAT).

Performed extensive data validation by writing complex SQL queries; involved in back-end testing and resolving data quality issues.
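The count-and-amount reconciliation mentioned in this project's description can be sketched with a minimal example. This is a hypothetical illustration using SQLite; the real reconciliation ran against the legacy sources and the intermediate conversion schema, and the table names here are invented.

```python
import sqlite3

# Illustrative reconciliation: compare row counts and amount totals between
# a legacy extract and the intermediate (conversion) schema.
# Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE legacy_claims (claim_no TEXT, amount REAL);
CREATE TABLE int_claims    (claim_no TEXT, amount REAL);
INSERT INTO legacy_claims VALUES ('C1', 250.0), ('C2', 75.5), ('C3', 10.0);
INSERT INTO int_claims    VALUES ('C1', 250.0), ('C2', 75.5);
""")

count_diff, amount_diff = cur.execute("""
SELECT (SELECT COUNT(*)    FROM legacy_claims) - (SELECT COUNT(*)    FROM int_claims),
       (SELECT SUM(amount) FROM legacy_claims) - (SELECT SUM(amount) FROM int_claims)
""").fetchone()
print(count_diff, amount_diff)  # prints "1 10.0": one row and 10.0 not yet converted
```

A nonzero difference in either figure flags records dropped or altered during conversion, which is exactly what the back-end validation queries were checking for.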

Project : Nationwide PolicyCenter

Client : Nationwide, Columbus, OH

Environment : Informatica 9.6, Oracle 11.6, Guidewire PolicyCenter 7, Notepad++

Duration : May 2012 – October 2014

Role : Senior ETL Developer

Description:

The project aimed to build a Guidewire Policy Administration solution for businesses by migrating business data from various legacy systems. Conversion APIs written in Gosu pick up data from the GW XML staging schema, process the XML, and persist it in the PolicyCenter application database, with ad-hoc reconciliation.

Results Delivered:

Designed the ETL and data conversion strategy to load data into PolicyCenter.

Used Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings.

Created complex Aggregator, Lookup, Filter, and Router transformation rules to generate consolidated data identified by dimension, using the Informatica PowerCenter ETL tool.

Imported the XSDs of the Policy Period, Account, and Policy Transaction models into the Informatica environment.

Created Informatica mappings for the Policy Period, Account, and Policy Transaction models in Designer, and corresponding sessions in Workflow Manager, to generate XML files for each model in the GW XML staging schema.

Received legacy data from TCS (stored in the intermediate schema), which was verified and validated in an iterative process.

Generated Submission, Account, and Policy Change XML files from the data loaded into the intermediate schema.

Identified bottlenecks affecting the ETL process and performed tuning and optimization of sources, targets, and mappings.

Worked on exception handling and on issues with migration from development to testing.

Performed data analysis and data profiling using SQL on various source systems.

Extensive knowledge and experience in producing tables, reports, graphs, and listings, and in handling large databases to perform data manipulation.

Used SQL to query databases for various validation and mapping activities.
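The XML-generation step described in this project (producing Submission-style files from rows in the intermediate schema) can be sketched with a minimal example. This is a hedged illustration: the element names, columns, and tables below are hypothetical and do not reflect the actual Guidewire staging-schema XSDs.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Illustrative sketch: read rows from an intermediate schema table and emit
# a Submission-style XML document. Names are hypothetical, not the real XSD.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE int_policy (policy_no TEXT, effective_date TEXT);
INSERT INTO int_policy VALUES ('P1001', '2013-01-01'), ('P1002', '2013-02-15');
""")

root = ET.Element("Submissions")
for policy_no, eff in cur.execute("SELECT policy_no, effective_date FROM int_policy"):
    sub = ET.SubElement(root, "Submission")
    ET.SubElement(sub, "PolicyNumber").text = policy_no
    ET.SubElement(sub, "EffectiveDate").text = eff

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

In the real project this mapping from relational rows to the XSD-defined XML structure was done by Informatica mappings writing to the GW XML staging schema, from which the Gosu conversion APIs consumed the files.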

Company : IT Minds Technologies, 2009 - 2011

Client : Bank of America, Chicago, IL

Environment : Informatica 8.6, Oracle 10g, Windows XP Professional, Unix

Duration : June 2009 – December 2011

Role : ETL developer

Description:

The Critical Data Mart project implemented data marts to load source-system data, building a data warehouse solution that integrates business data from various source systems. The solution uses a dimensional model to provide a multidimensional view and support analytical and ad-hoc reporting; the source data arrives as daily feeds from the source systems.

Results Delivered:

Created data marts for different modules to load converted legacy data.

Established a strong understanding of the existing business model and customer requirements.

Developed various mappings and Transformations using Informatica Designer.

Used Informatica Designer tools to design source definitions, target definitions, and transformations to build mappings.

Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and Mapping Designer to map sources to targets.

Designed, coded, tested, and documented the application based on the project's functional and business requirements.

Created complex Aggregator, Lookup, Filter, and Router transformation rules to generate consolidated data identified by dimension, using the Informatica PowerCenter ETL tool.

Knowledge of and work experience with RDBMS concepts: views, triggers, stored procedures, indexes, and constraints.

Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.


