Post Job Free

Location: Carrollton, TX
Posted: October 06, 2015

Name: Mr. Ramaiah Chejerla    Email: acrzn0@r.postjobfree.com

+1-469-***-****

Summary

Around 8 years of IT experience in Database and Data Warehousing technologies, mainly Informatica and Teradata, on both Windows and UNIX platforms. Worked on projects involving Data Integration and Data Warehouse loads in the Retail and Insurance domains. Extensive experience contributing to the design process for a very large Retail Data Warehouse project.

Technical Skills

Platform

Windows 2003, Windows XP, UNIX

Languages

C, SQL, PL/SQL

RDBMS

Teradata, Oracle 10g, SQL Server, DB2

Tools & Utilities

Toad, SQL Developer, Teradata Utilities

ETL tools

Informatica 8.x.1/9.5.1, DataStage 8.1; working knowledge of MSBI tools.

BI tool

Cognos 7.0/8.0, OBIEE 10g

Modeling Tools

Erwin 7.x

Other skills

Design

Implemented logical data models using Erwin.

Requirement Gathering

Collected project-specific requirements with the internal team and customers.

Certifications

Teradata

Teradata 12 Certified Technical Specialist

Education:

Bachelor of Engineering (B.E) in ECE.

Project- 5:

Title – AR Batch posting

Role – Senior Developer

Duration – Feb 2013 – present

Team Size: 3

Technologies – Informatica 9.5.1, B2B, DB2, UNIX, DTE 9.5.1, Java/J2EE.

The batch posting application compares the remittance information contained in the 342-byte file against the billed claims loaded in the contract AR, using programming logic created by HMS developers. Any data that does not match falls into an 'unmatched' category, which analysts use to post payments manually.
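The match/unmatch logic described above can be sketched in SQL. Table and column names here are hypothetical illustrations, not the actual HMS schema:

```sql
-- Sketch of the remittance-vs-claims match (hypothetical tables/columns).
-- Remittance lines that find a billed claim in the contract AR are
-- 'matched'; the rest fall into the 'unmatched' bucket that analysts
-- work through to post payments manually.
SELECT r.remit_id,
       r.claim_number,
       r.paid_amount,
       CASE WHEN c.claim_number IS NULL
            THEN 'UNMATCHED'
            ELSE 'MATCHED'
       END AS match_status
FROM   remittance_835 r
LEFT JOIN contract_ar_claims c
       ON  c.claim_number  = r.claim_number
       AND c.billed_amount = r.billed_amount;
```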

Architecture Details – The project was implemented in an n-tier architecture using MVC and Singleton design patterns with Java/J2EE technology; components were used to develop the business objects.

Responsibility:

Analyzed different source file formats such as 835, 277 and 999.

Created parsers to read source files in different formats.

Integrated B2B with PowerCenter through the Informatica UDT transformation.

Transformed and mapped the data, and loaded it into targets using Informatica according to specifications.

Project- 4:

Title – Integrimatch

Role – Senior Developer

Duration – Mar 2011 – Jan 2013

Team Size: 10

Technologies – Informatica 9.5.1, DB2, UNIX, Java/J2EE, Spring Framework, JSP, JMS, WebLogic 9.x.

The client is a leading health care system offering cost-containment solutions for federal and state governments and commercial insurers. Focused exclusively on the healthcare industry, the client ensures that healthcare claims are paid correctly and by the responsible party, and that those enrolled to receive program benefits meet the qualifying criteria.

Various projects work together to attain the client's business objectives.

Architecture Details – IntegriMatch is one such project, dealing primarily with asset verification of the concerned party. IntegriMatch is an electronic, paperless system through which disclosed and undisclosed assets such as liquid assets, real estate and personal property for the Elderly, Blind and Disabled (EBD) population are identified, reviewed and reported. It includes creating cases for the parties, with a workflow that is created and tracked until case closure. It also generates reports and sends notification mails to the concerned parties. Data porting and data integrity are handled through the ETL tool.

Responsibility:

Analyzed targets, transformed and mapped the data, and loaded it into targets using Informatica according to specifications.

Designed project-related documents such as unit test cases and batch logs for each process.

Designed, developed and tested mappings and workflows using Informatica.

Optimized the performance of Mappings, workflows, and sessions by identifying and eliminating bottlenecks.

Created mapping parameters and used them in mappings.

Extensively used transformations like expression, filter, router, joiner, aggregator, lookup and update strategy transformations to model various standardized business processes.

Involved in unit testing and integration testing.

Main Contribution:

Understood the business requirements and implemented them in mappings.

Analyzed existing mappings for enhancements.

Performed change requests.

Project- 3:

Title – Global BI Data mart

Role – Senior Developer

Duration – June 2010 – Feb 2011

Team Size: 15

Technologies: Teradata 13.0, SQL Server 2008, DataStage 8.1, OBIEE 10g, MSBI tools.

The client is a leading Fortune 500 company in the USA. The project was to conceptualize and design a centralized Global BI Data Mart to serve the reporting and business intelligence needs of international markets.

Architecture Details – Align the functionality of the disparate existing reporting systems onto a single common platform and create a centralized data repository to serve future business intelligence and reporting needs, enhancing decision-making capability and thereby improving time to market.

Responsibility:

Understood the requirements and implemented them in the design process.

Implemented new database designs to provide better results, considering parameters such as indexes and skew factors on the database side.

Captured the queries fired on the database from the OBIEE side through DBQL and analyzed them for fine-tuning.

Worked on Teradata utilities to export and import data from various file systems and load it into the target database.

Implemented various iteration methods on the database design to provide quick responses on the OBIEE reporting side.

Configured metadata objects (subject areas, tables, columns and PPIs) and various servers on the database side.

Worked on retail subject areas such as Sales, Inventory, Allowances, Store Returns and Suppliers.

Created various test sets, allocated space on the Teradata side for testing purposes, and provided user privileges for different users on both the database and OBIEE sides.

Executed various query explain plans and compared them with existing database designs.

Analyzed the various MSBI ETL packages involved in the Global Data Mart design model.

Analyzed all scheduled DataStage jobs; DataStage parallel jobs were executed to load data into the Global Data Mart.

Analyzed various database design models and implemented them through the Erwin tool.

Created technical documents on tool features.

Performed data validation of the data in reports and dashboards.
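The DBQL capture step above can be illustrated with a query of this shape. Exact DBC column availability varies by Teradata release and logging options, so treat this as a sketch:

```sql
-- Sketch: pull recent high-cost queries from DBQL for tuning analysis.
-- Assumes query logging is enabled (BEGIN QUERY LOGGING) and that the
-- DBC.DBQLogTbl columns below exist in this Teradata release.
SELECT UserName,
       StartTime,
       AMPCPUTime,
       TotalIOCount,
       QueryText
FROM   DBC.DBQLogTbl
WHERE  StartTime >= CURRENT_TIMESTAMP - INTERVAL '1' DAY
ORDER  BY AMPCPUTime DESC;
```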

Key Challenges:

Understanding the existing database design and business requirements, implementing PPIs, designing the global data model, and developing new approaches to deliver good performance and accurate results on the reporting side.

Main Contribution:

Understood the business requirements and the database and ETL designs, and designed a new global business data model.

Analyzed various ETL tools for implementing the Global Data Mart.

Identified bottlenecks on the Teradata database side and tuned them.

Worked with Teradata utilities such as BTEQ, FastLoad, MultiLoad and TPump.

Documented all phases of the implementation.

Project- 2:

Title – Report Performance Tuning

Role –Software Developer

Duration – Jun 2009 – May 2010

Team Size: 11

Technologies: Teradata 12.0, OBIEE 10g.

The client is a leading Fortune 500 company in the USA. The project was to improve the performance of existing BI reports, covering design, development and testing.

Architecture Details – Identified performance bottlenecks at the report or database level by analyzing the database and reports of the existing system. Implemented primary and secondary indexes, PPIs and join indexes, and modified cache settings, aggregate tables and logging levels, which improved the performance of the BI reports.

Responsibility:

Built star schemas based on the functionality.

Implemented JIs (join indexes) and AJIs (aggregate join indexes) where appropriate, resulting in better execution plans, faster response times and savings in CPU and I/O.

Identified and redesigned PIs (primary indexes) to reduce hot-AMP operations.

Collected skew factors for the tables involved in the design to evaluate PI efficiency.

Created value-ordered indexes on range-data tables for better access.

Collected statistics on the newly designed tables' data.
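The index and statistics work above follows patterns like the following. Table and column names are illustrative only:

```sql
-- Sketch of the tuning patterns described above (hypothetical schema).

-- Aggregate join index (AJI) so summary reports avoid rescanning detail rows.
CREATE JOIN INDEX sales_ji AS
  SELECT store_id, sale_date, SUM(sale_amt) AS total_amt
  FROM   daily_sales
  GROUP  BY store_id, sale_date
PRIMARY INDEX (store_id);

-- Value-ordered NUSI for efficient range access on date predicates.
CREATE INDEX (sale_date) ORDER BY VALUES (sale_date) ON daily_sales;

-- Refresh optimizer statistics on the redesigned table.
COLLECT STATISTICS ON daily_sales COLUMN (store_id);
COLLECT STATISTICS ON daily_sales COLUMN (sale_date);
```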

Key Challenges:

Understanding the existing database design and business requirements, identifying bottlenecks on the report and database sides, implementing PPIs, and developing new approaches to deliver good performance and accurate results on the reporting side.

Main Contribution:

Understood the existing database design and verified skew factors for high-volume tables.

Identified bottlenecks on the Teradata database side and tuned them.

Worked with Teradata utilities such as BTEQ, FastLoad, MultiLoad and TPump.

Documented all phases of the implementation.
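A typical BTEQ batch script from the utility work mentioned above looks like this. Logon details and object names are placeholders:

```sql
-- Sketch of a BTEQ batch script (placeholder logon and object names).
-- BTEQ dot commands control the session; the statements between them
-- are standard Teradata SQL.
.LOGON tdpid/username,password

-- Stage today's extract into the target table.
INSERT INTO target_db.sales_stage
SELECT * FROM source_db.sales_extract
WHERE  load_date = CURRENT_DATE;

-- Abort the batch with a non-zero return code on any SQL error.
.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0
```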

Key achievements:

Better transaction response time – 70% improvement in response time.

60–70% improvement in CPU and I/O utilization.

Quick turnaround time for Business Intelligence reports.

Increased end-user productivity and efficiency in decision making.

Project -1:

Title – Cognos Upgrade

Role: Software Developer

Duration: Dec-2007 to May-2009

Team Size: 6

Technologies: Cognos 7.0, Cognos 8.0

The customer is a large TPA and service provider with business in the US, Asia and Australia. The client's business concerns general insurance and also includes other lines (medical, vehicle) covering claims and payments.

Architecture Details:

Migrate Cognos version 7 reports to Cognos version 8. The Business Intelligence Cognos query migration process involves moving complex reports from the older version to the latest version, enabling users to access regularly used complex reports easily in the latest Cognos version.

Responsibility:

Recreated reports from Cognos version 7 in Cognos 8 using the Query Analysis tool.

Key Challenges:

Delivering the project on time; designing reports.

Main Contribution:

Analyzed the queries in Cognos version 7.

Developed and rewrote the reports in Cognos version 8.

Maintained project related documents.

Coordinated with onsite and offshore team to achieve the target in time.
