Data Developer

Location:
Mumbai, MH, India
Posted:
June 28, 2016

PROFILE

*+ years of total IT experience in Data Warehousing and ETL technologies using Informatica and Teradata, spanning analysis, design, development, implementation and deployment of business systems.

Extensive experience in ETL and Data Integration for developing ETL mappings and scripts using Informatica Power Center 9.5.1/8.6.1 using Designer, Repository Manager, Workflow Manager & Workflow Monitor.

Extensive experience in Teradata 14/13/12/V2R6/V2R5/V2R3 (Teradata Administrator, Teradata SQL Assistant, BTEQ, FastLoad, MultiLoad, TPump, TPT (Teradata Parallel Transporter) and FastExport).

Expertise in implementing complex Business rules by creating robust Mappings, Mapplets, shortcuts and reusable transformations.

Incorporated various data sources like Oracle, DB2, XML and Flat files into the staging area.

Experience in Data Warehouse development working with Extraction/Transformation/Loading using Informatica PowerCenter/Power Mart with Oracle, Teradata and Heterogeneous Sources.

Knowledge of full life cycle development for building a Data Warehouse.

Extensively used Teradata application utilities such as BTEQ, MultiLoad, FastLoad, TPump, TPT and FastExport.

Extensively worked in UNIX and Informatica environments to invoke Teradata utilities and perform file handling.
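
Invoking Teradata utilities from UNIX typically means generating a BTEQ script with a here-document and checking its return code from a wrapper. The sketch below illustrates that pattern only; the table names, logon credentials and paths are invented for illustration and are not from any actual project.

```shell
#!/bin/sh
# Sketch: a UNIX wrapper that generates and runs a BTEQ load script.
# All object names and credentials below are hypothetical placeholders.

SCRIPT_DIR=/tmp/etl_demo
mkdir -p "$SCRIPT_DIR"

# Generate the BTEQ script with a quoted here-document (no shell expansion).
cat > "$SCRIPT_DIR/load_dim_claim.btq" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.SET ERROROUT STDOUT;

INSERT INTO edw.dim_claim (claim_id, claim_status, load_dt)
SELECT claim_id, claim_status, CURRENT_DATE
FROM   stg.stg_claims;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Invoke BTEQ only if the client utility is installed on this host.
if command -v bteq >/dev/null 2>&1; then
    bteq < "$SCRIPT_DIR/load_dim_claim.btq"
    echo "BTEQ finished with return code $?"
else
    echo "bteq not found; generated script at $SCRIPT_DIR/load_dim_claim.btq"
fi
```

A scheduler (Maestro/ESP) would call this wrapper and treat a non-zero exit from BTEQ as a job abend.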

Working experience in an Agile/Scrum environment.

Good exposure to Insurance and Banking Domains.

Good knowledge of Dimensional Data Modeling, Star Schema, Snow-Flake schema, FACT and Dimensions Tables.

Worked on the remediation (performance tuning) team, improving the performance of user queries and production SQL.

Extensively worked with the Maestro and ESP scheduling tools: creating and composing job scripts, defining dependencies on other jobs, and developing calendars for scheduling.

Extensive experience supporting production flows and resolving issues within stringent timelines.

Excellent interpersonal communication and documentation skills; experienced coordinating with project managers, business analysts, architects, DBAs and developers.

Ability to work independently as well as in a fun-filled and challenging team environment.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.5.1

Teradata Utilities: BTEQ, FastLoad, MultiLoad, FastExport, TPT, SQL Assistant

Databases: Teradata, Oracle, DB2, Netezza

Schedulers: Maestro (Job Scheduling Console), ESP

Source systems: Legacy systems, Teradata, XML and Flat files

Programming: C, C++, SQL, UNIX Shell Scripting

Operating systems: Windows 95/98/2000/NT/XP/7, UNIX, LINUX

EDUCATION

Master of Science from Osmania University, India.

Bachelor of Science (Computers) from Kakatiya University, India.

Certifications: Teradata 12 Certified Professional

ACHIEVEMENTS

Awarded several Bravo awards by the Nationwide client for quick learning, providing many value-adds and delivering optimal solutions within stringent timelines.

Received the Best Performer award for producing zero defects in Dec 2011.

Received an IBM BRAVO award in Feb 2014.

PROFESSIONAL EXPERIENCE

Nationwide Insurance / IBM, Columbus, OH Dec’ 14 - Present

Lead ETL Developer

Project Description: (DW Claims)

Nationwide Insurance is one of the leading companies in the nation for auto, life and home insurance, with more than 10 million policyholders. I work in the DW Claims area, bringing Nationwide and Allied channel business data from different source systems and loading it into the data warehouse on a daily/weekly/monthly basis.

Responsibilities:

Working as a Sr. ETL Developer on multiple projects implemented using Informatica PowerCenter 9.5, Teradata, Oracle and MicroStrategy as the core technologies.

Designed and coded the application components in an Agile/Scrum environment.

Developed Informatica mappings and workflows to extract data from multiple databases, flat files and mainframe files and load it into the Teradata warehouse.

Integrated various data sources such as Teradata, Oracle, DB2, XML and flat files.

Developed Teradata jobs such as MultiLoad, FastLoad, FastExport and BTEQ programs to cleanse, transform and load data into the Teradata warehouse.

Developed BTEQ scripts to load data from the staging area into the final Dim/Fact tables.

Created technical specification documents like system design and detail design documents for the development of ETL mappings to load data into various tables in Data Marts.

Extensively worked with the ESP and Maestro scheduling tools: creating and uploading job script files, defining dependencies on other jobs, and developing calendars for scheduling.

Involved in the analysis and optimization of long running jobs.

Performed Unit, Integration and system testing and provided UAT support to business partners.

Solved many User incidents/queries.

Participated in code reviews and ensured that all solutions were aligned to pre-defined architectural specifications.

Created shell scripts to FTP files between servers for outbound/inbound transfers.
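
An outbound transfer script of this kind usually builds a batch file and runs the transfer non-interactively. The sketch below uses sftp in batch mode; the host, user, directories and file names are made-up placeholders, not details from the resume.

```shell
#!/bin/sh
# Sketch: push an outbound extract to a partner server with sftp batch mode.
# DEST_HOST, etl_user and all paths are hypothetical placeholders.

OUT_DIR=/tmp/etl_demo/outbound
mkdir -p "$OUT_DIR"
touch "$OUT_DIR/claims_extract_20160628.dat"   # stand-in for a real extract file

# Build an sftp batch file so the transfer runs without prompts.
BATCH="$OUT_DIR/push_claims.sftp"
cat > "$BATCH" <<EOF
cd /inbound/claims
put $OUT_DIR/claims_extract_20160628.dat
bye
EOF

# Run the transfer only when a destination host is configured.
if [ -n "${DEST_HOST:-}" ]; then
    sftp -b "$BATCH" "etl_user@$DEST_HOST"
else
    echo "DEST_HOST not set; batch file prepared at $BATCH"
fi
```

In production the wrapper would also archive the sent file and alert on a non-zero sftp exit code.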

Responsible for providing guidelines to the production support team to resolve complex production abends.

Identified and verified the impact of changes in downstream/upstream applications.

Provide development estimates and planning. Created technical design documents to map sources and targets as per business requirements.

Worked on claims reprocessing activity to fix issues in the claims system caused by bad source data.

Responsible for data fixes such as rebuilding dimension and daily/monthly fact tables in production when source data issues occurred.

Automated several manual processes in our system, such as claims reprocessing.

Coordinated with the India-based team in an onsite-offshore model.

Environment: Informatica Power Center 9.5.1, Teradata 14, Oracle 11g, UNIX, Microstrategy, ESP.

Nationwide Insurance / IBM, Columbus, OH Oct’ 12 – Nov’ 14

Sr. ETL Developer

Project Description: (Fraud detection in Insurance)

The Galaxy Data Warehouse (GDW) supports several Enterprise Validation Technology (EVT) initiatives. The EVT data is used to validate information given by prospective customers against public records to identify any incongruence in the information the customer provided. This project sends EVT and SVT (THI Specialty Validation Technology) historical (one-time) and daily incremental data from the EVAT database to Detica, so that the SIU can use it in a fraud detection solution that simplifies the complexity of insurance fraud monitoring.

Responsibilities:

Worked as a Sr. ETL Developer on multiple projects implemented using Informatica PowerCenter 9.1.0/8.6.1, Oracle 11g/10g and MicroStrategy as the core technologies.

Involved in analysis, design, development, integration, performance and user acceptance testing and Production implementation of the projects.

Implemented Informatica best practices as part of the projects and prepared handy documents for the same.

Developed ETL design for error detection and handling by performing necessary validations.

Developed ETL mappings to extract, stage, cleanse, transform, integrate and load data from data mart to Enterprise Data Warehouse.

Extensively worked on file processing, file validations and error handling.

Created mappings using the transformations such as the Normalizer, Sorter, Aggregator, Union, Filter, Lookup, Joiner, Update strategy and Expression, to transform & load the data.

Involved in creating Maestro Jobs and Schedules to execute ETL Workflows.

Good knowledge of creating UNIX shell scripts to automate ETL processes; used Maestro to schedule ETL workflows.

Developed complex BTEQ SQL scripts for loading/unloading data by applying transformations as per business rules.

Extensively used external loaders such as FastLoad and MultiLoad, which enhanced job performance.
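
A FastLoad job is driven by a control script that defines the input layout and target table. The sketch below generates such a script from shell and invokes the loader only if it is installed; the table, database and file names are illustrative assumptions, not taken from the projects above.

```shell
#!/bin/sh
# Sketch: generate and run a Teradata FastLoad control script.
# tdprod, stg.stg_claims and the sample data are hypothetical placeholders.

WORK=/tmp/etl_demo/fastload
mkdir -p "$WORK"
printf '1001|OPEN\n1002|CLOSED\n' > "$WORK/claims.dat"   # sample pipe-delimited input

cat > "$WORK/load_claims.fl" <<'EOF'
LOGON tdprod/etl_user,etl_password;
DATABASE stg;

BEGIN LOADING stg_claims
      ERRORFILES stg_claims_err1, stg_claims_err2
      CHECKPOINT 100000;

SET RECORD VARTEXT "|";
DEFINE claim_id     (VARCHAR(18)),
       claim_status (VARCHAR(10))
FILE = /tmp/etl_demo/fastload/claims.dat;

INSERT INTO stg_claims (claim_id, claim_status)
VALUES (:claim_id, :claim_status);

END LOADING;
LOGOFF;
EOF

if command -v fastload >/dev/null 2>&1; then
    fastload < "$WORK/load_claims.fl"
else
    echo "fastload not found; control script at $WORK/load_claims.fl"
fi
```

FastLoad requires an empty target table, which is why it suits staging loads like this; ongoing incremental loads would use MultiLoad or TPT instead.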

Worked effectively with collect statistics, joins, indexes, subqueries, etc.

Implemented ETL best practices, improving maintainability, performance and data quality.

Involved in migrating Informatica code across environments as part of releases.

Involved in performance tuning of Informatica mappings and sessions by eliminating bottlenecks.

Troubleshot complex ETL mappings using session logs and the Informatica Debugger.

Involved in design reviews, code reviews and test plan reviews with the project teams as appropriate throughout the project lifecycle.

Created Technical designs, Test Case documents and Data Validation Scripts.

Environment: Informatica PowerCenter 9.1/8.6.1, Teradata 13, Oracle 11g, UNIX, Microstrategy, Maestro.

Nationwide Insurance, Columbus, OH May’ 10 – Sep’ 12

Sr. ETL Developer

Project Description: (EProduct)

The database holds a large volume of historical data, so the data warehouse plays a major role in letting the various business units view data at the lowest level of detail and make decisions that bring more revenue to the company through new policies. The objective of the overall project is to integrate the reporting needs for metrics such as claim reserves information, claim losses, claim expenses, salvage recoveries, claim/coverage/claimant counts and Average Net Claim Payment (ANCP) trend information at the policy and package levels, and to include quote data in the reporting functionality. This project aims at building accurate data structures and delivering the necessary reporting capabilities.

Responsibilities:

Developed code per the requirements document using Teradata utilities such as FastLoad, MultiLoad, FastExport and BTEQ.

Developed Informatica mappings, reusable transformations and procedures to move data from multiple sources such as DB2, Oracle and mainframe legacy systems into staging and the data warehouse.

Wrote shell scripts in a UNIX environment to cleanse source files and prepare them for the load process, and scripts to start Informatica sessions and workflows.

Performed the History and Incremental ETL process to populate the Data Warehouse.

Extensively involved in performance tuning of Informatica ETL mappings, such as increasing cache sizes and overriding the generated SQL.

Developed mappings for slowly changing dimensions to load history data.
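
The core of a Type 2 slowly-changing-dimension load is expiring the current row when a tracked attribute changes and inserting the new version. The BTEQ sketch below shows that two-step logic in Teradata SQL; the dimension, staging table and column names are invented for illustration.

```shell
#!/bin/sh
# Sketch: SCD Type 2 logic as a generated BTEQ script.
# edw.dim_policy, stg.stg_policy and all columns are hypothetical.

WORK=/tmp/etl_demo/scd2
mkdir -p "$WORK"

cat > "$WORK/scd2_dim_policy.btq" <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Step 1: expire current rows whose tracked attribute changed in staging. */
UPDATE d
FROM edw.dim_policy AS d, stg.stg_policy AS s
SET eff_end_dt = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE d.policy_id = s.policy_id
  AND d.current_flag = 'Y'
  AND d.policy_status <> s.policy_status;

/* Step 2: insert a new current version for every policy that now has
   no open row (both brand-new and just-expired policies qualify). */
INSERT INTO edw.dim_policy
      (policy_id, policy_status, eff_start_dt, eff_end_dt, current_flag)
SELECT s.policy_id, s.policy_status, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg.stg_policy AS s
LEFT JOIN edw.dim_policy AS d
       ON d.policy_id = s.policy_id AND d.current_flag = 'Y'
WHERE  d.policy_id IS NULL;

.LOGOFF;
.QUIT 0;
EOF

echo "SCD2 BTEQ script written to $WORK/scd2_dim_policy.btq"
```

The same expire-then-insert pattern maps directly onto an Informatica mapping with a Lookup on the current dimension row and an Update Strategy transformation.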

Performed unit and regression testing for the application and documented test cases. Responsible for performance tuning at the source, target, mapping and session levels.

Upgraded Informatica mappings from version 7.1.3 to 8.6 across environments and updated the scripts accordingly.

Involved in QA, unit and integration testing of the application.

Responsible for production support of EProduct flows.

Sent daily status reports to the client on the production flows.

Resolved production abends within the given SLA window.

Ensured daily/monthly SLAs were met.

Provided offshore and onshore support whenever required.

Environment: Informatica PowerCenter 8.x/7.x, Teradata, Oracle 10g/9i, DB2, mainframe.

Kaza Medical Group, DE Jun’ 08 – April’ 10

ETL Developer

Project Description: (Physician Sales)

Physician Sales was created specifically with physicians in mind. It supports the administrative, billing and business processes of physician practices, providing a cost-effective alternative to in-house systems. With predictable monthly fees, minimal capital expense and proactive customer service, the Physician EDW significantly improves practice information flow and productivity. Data from various source systems is fed into the EDW (Enterprise Data Warehouse). Extraction, transformation, comparison and loading of the data into the data warehouse is done using Informatica PowerCenter and Teradata client utilities such as FastLoad, MultiLoad and BTEQ scripts.

Responsibilities:

Designed Informatica mappings, sessions and workflows to implement business rules during data extraction, transformation and loading.

Built complex logic in Informatica mappings using transformations such as Filter, Router, Expression, Lookup (static and dynamic), Aggregator, External Stored Procedure and Update Strategy.

Used variables and parameters in mappings, sessions, worklets and workflows.

Extensively used external loaders such as FastLoad and MultiLoad, which enhanced job performance.

Wrote UNIX scripts for file watching, creating file lists and moving files after loads, and called these scripts from Informatica pre-session and post-session commands.
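
A file-watch used as a pre-session command is usually just a polling loop with a timeout, exiting non-zero so the session fails if the trigger file never arrives. The sketch below illustrates the idea; the path, timeout and trigger-file convention are assumptions for demonstration (the demo even creates the file itself so the loop succeeds immediately).

```shell
#!/bin/sh
# Sketch: polling file-watch for an Informatica pre-session command.
# WATCH_FILE path and timings are hypothetical placeholders.

WATCH_FILE=/tmp/etl_demo/inbound/claims_trigger.done
TIMEOUT=10      # seconds to wait before giving up
INTERVAL=1      # polling interval in seconds

mkdir -p "$(dirname "$WATCH_FILE")"
touch "$WATCH_FILE"    # demo only: in production the upstream job creates this

elapsed=0
while [ ! -f "$WATCH_FILE" ]; do
    if [ "$elapsed" -ge "$TIMEOUT" ]; then
        echo "Timed out waiting for $WATCH_FILE" >&2
        exit 1    # non-zero exit makes the Informatica session fail
    fi
    sleep "$INTERVAL"
    elapsed=$((elapsed + INTERVAL))
done

echo "Found $WATCH_FILE; session can proceed"
```

Schedulers like Maestro offer native file dependencies, but a script like this keeps the check portable across environments.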

Developed code, prepared unit and integration test plans, and performed code reviews, testing, debugging and deployment of Informatica jobs into the production environment.

Deployed Informatica jobs into testing, validation and production environments.

Prepared deployment documents, user guides, coding standards and other technical documents.

Environment: Informatica PowerCenter, Teradata, UNIX, Oracle, SQL, DB2.

Commonwealth Bank of Australia Jan’ 07 – May’ 08

ETL Developer

Project Description: (Corporate Active Data warehouse)

The Commonwealth Bank Group is Australia's largest domestic banking and financial services organization, offering a diverse range of products and services to its customers. The project involves building a centralized enterprise-wide data warehouse that will be used to analyze business performance.

Responsibilities:

Responsible for understanding the business requirements.

Responsible for Designing, Development and Unit testing.

Created, updated and maintained ETL technical documentation.

Developed code per the requirements document using FastLoad, MultiLoad, FastExport and BTEQ SQL.

Coordinated with the onshore team to clarify requirements.

Designed and developed ETL mappings for source system data extraction, transformation, staging, movement and aggregation.

Involved in writing FastLoad, MultiLoad scripts to load the data in to Teradata tables.

Developed Informatica Mappings to load the data from source to Teradata.

Frequently used joins, subqueries and set operators when writing SQL queries.

Environment: Informatica Power Center, Teradata, UNIX, Oracle, SQL, DB2.


