Pavani Manne Muddu
******.******@*****.***
SUMMARY
Over 7 years of experience in software development with relevant exposure to Extraction, Transformation and Loading (ETL), loading data marts, and implementing BI concepts using various ETL tools, with a specialization in DataStage.
Strong design and development experience in DataStage (11.5/9.1/8.7/8.1/8.0.1/7.x), Oracle 11g/10g/9i, PL/SQL, DB2, SQL Server 2005/2008, shell scripts, AutoSys, Crontab utility, TWS (Tivoli Workload Scheduler), Control-M and FTP.
Good experience in building Fact and Dimension tables and star/snowflake schema modeling, implementing Slowly Changing Dimensions (SCD) and Change Data Capture (CDC), ER modeling for OLTP, dimensional modeling for OLAP, and reverse engineering using Erwin.
Comprehensive knowledge of metadata for data dependencies; experienced in extracting source data from XML files, sequential files, flat files and zip files, transforming the data and loading it into the DWH.
Built customized routines and UNIX shell scripts to implement business logic; scheduled jobs using AutoSys, the TWS scheduler and crontab, and used SQL Server DTS packages to integrate data from heterogeneous sources into the staging area.
Extensive working experience in the Banking, Insurance and Manufacturing domains.
In-depth knowledge of troubleshooting performance bottlenecks, performance tuning, and data migration for large-scale applications; involved in migrating ETL projects and jobs.
Exposure to full life cycle development projects, with system life cycle experience under Lean, Agile and iterative development models. Expertise in requirement gathering, estimation, modeling, analysis and design, development, testing, configuration management, performance tuning, user acceptance, data conversion, end-user training, production support and documentation.
Mentored the complete onsite/offshore team on the technical front and owned project delivery in terms of quality, industry standards, optimal solutions and technology, and on-time implementation. Active contributor to various project proposals, estimations and technical screenings.
Strong decision-making skills, extremely well organized, good interpersonal, communication and presentation skills.
Education
Bachelor of Engineering, JNTUCEH, India
Master in Computer Engineering, Sunnyvale, CA
Technical Skills
ETL Tools: DataStage 11.5/9.1/8.x/7.x (Designer, Director, Manager, Administrator), Parallel Extender, Quality Stage, Server Jobs, IBM Cognos BI and Ab Initio
Data Modeling: Erwin, Star Schema Modeling, Snowflake Modeling, Fact & Dimension tables
RDBMS: Oracle 11g/10g/9i, MS Access, SQL Server 2005/2008, DB2, Teradata
Languages: C, SQL, PL /SQL
OS: Windows NT/2000/2003/XP, UNIX, LINUX
Scheduling tools: AutoSys, Crontab Utility, TWS (Tivoli Workload Scheduler), Control-M, DataStage Director
CRM: Siebel Tools 8.1.1/7.8.2, Enterprise Integration Manager (EIM)
Others: HTML, SQL*Plus, MS Visio, UNIX Shell Scripting, TOAD, SQL Developer, SQuirreL SQL
Trainings: Informatica Power Center, Business Objects XI
EXPERIENCE
Fifth Third Bank, Cincinnati, OH Nov’15 – Present
Sr. ETL Developer
Fifth Third Bank views its suppliers as contributors to bottom-line growth and as vital partners in its ability to assure high-quality products and services to clients and customers. Fifth Third procures a wide variety of goods, services and equipment, and its suppliers have the opportunity to supply these items to the organization.
Worked on multiple assignments: CMS, Liquidity, SwapDealer Reporting, LCR, LMR, Fixed Income, Adaptive Upgrade, Efx, ISH.
Environment:
IBM InfoSphere 11.5/9.1/8.1 (Designer, Director, Quality Stage), Sequential files, AQT, UNIX, Windows, MS Visio, Oracle 11g, SQL Server 2005, DB2 UDB, XML, FTP, Sterling, SiteScope, Remedy, TWS Scheduler, SAP Business Objects XI
Core scrum team member, actively involved in sprint ceremonies such as scoping, PBR, backlog grooming and retrospectives.
Involved in data mart architecture, technical design and development for the respective projects.
Developing source-to-target data mappings capturing business rules and logic, and ensuring data standards are applied correctly and consistently to incoming data, database fields and value translations.
Translating business requirements into ETL processes using DataStage & SQL development.
Conducting data analysis of a large number of source systems to generate data quality metrics.
Developed DataStage custom routines to handle data validation & data scrubbing.
Created scripts to create new tables, views, queries, for new enhancements in the application using AQT. Developed PL/SQL procedures, triggers, indexes to implement business logic using AQT.
Experience in Database Application Development, Query Optimization, Performance Tuning and DBA solutions and implementation experience in complete system Development Life Cycle.
Analyzed data and developed jobs to incorporate changing business rules and perform quality checks on data received from external vendors such as BlackRock, SunGard (Kiodex), Findur and Bloomberg.
Developed ETL standards & best practices; resolved ETL and database performance issues.
Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Performance Tuning and System Testing.
Created task plans/change controls and provided support during Production Releases.
Involved in production scheduling to set up jobs in order and provided 24x7 production support.
Incorporated various data sources such as Oracle, MS SQL Server, DB2, XML and flat files into the staging area.
Migrated projects from 8.1 to 9.1, set up 9.1 environments, and compiled and tested jobs in 9.1. Tuned the DataStage repository and jobs for optimum performance.
Built a new universe per user requirements by identifying the required tables from the Capital Markets ODS environment and defining the universe connections. Also created and supported WebI reports and scheduled them to generate PDF files for end users.
Defined the process to create source packages and deploy them to different servers (QA, UAT and Production).
Led and mentored other team members on ETL design, development and best practices, and performed complex tasks.
Set up SiteScope alerts and TWS scheduling, and created Remedy/incident tickets.
American Modern Insurance Group, Amelia, OH Mar’15 – Oct'15
ETL Consultant
American Modern is a widely recognized, national leader in the specialty insurance business. Experienced in the manufactured housing insurance sector, the company delivers specialized products and services for residential property, the recreational market and pet health insurance. American Modern joined Munich Re as a key part of the world reinsurance leader's North American insurance operations. Worked on various projects including Lexis-Nexis Carrier Contribution, Prism Reporting and URS.
Environment:
IBM InfoSphere 8.7 (Designer, Director, Quality Stage), Sequential files, SQuirrel, UNIX, Windows, MS Visio, Oracle 11g, DB2, XML, Control-M Scheduling, sFTP
Translating business requirements into ETL processes using DataStage & SQL development.
Conducting data analysis of a large number of source systems to assess data quality.
Developed DataStage custom routines to handle data validation & data scrubbing.
Extensively used Transformer, Sort, Merge, Aggregator, Funnel, Filter, Copy, Peek, Modify, Compare, Oracle/DB2/ODBC Enterprise, Aggregator, Change Capture, Difference and Remove Duplicates stages.
Analyzed data and developed jobs to incorporate changing business rules and perform quality checks on data received from external sources such as Huon, PC and IC Staging.
Developed ETL standards & best practices; resolved ETL and database performance issues.
Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Performance Tuning and System Testing.
Created task plans/change controls and provided support during Production Releases.
Involved in production scheduling to set up jobs in order and provided 24x7 production support.
Incorporated various data sources such as Oracle, MS SQL Server, DB2, XML and flat files into the staging area.
Involved in process setup for migrating jobs from development to Test/PROD using manifest files as per Munich Re standards. Tuned the DataStage repository and jobs for optimum performance.
Created Control-M Scheduling jobs, scheduled and e-mailed status of ETL jobs to operations team daily.
Fifth Third Bank, Cincinnati, OH Jan’13 – Feb'15
Sr. ETL Developer
Wells Fargo/Best IT, Phoenix, AZ Dec’11 – Dec'12
ETL Developer/Analyst
Wells Fargo Home Mortgage uses a vast, complicated architecture of systems in various lifecycle states to support its operation as one of the top five mortgage lenders in the world, with billions of dollars flowing through its systems each day. One sub-system, ODS, acts as a reference data processing repository for several downstream applications and various types of reporting. Because portions of the architecture straddle a transition from mainframe to web-based platforms, ODS provides a key tie between old and new systems.
Environment:
Oracle 11g, DataStage 7.5/8.7, Ab Initio, IBM Cognos BI, Sequential files, TOAD 8, UNIX, Windows, MS Visio, Sybase PowerBuilder 15, Tortoise SVN 1.7.1
Responsibilities:
Worked with business and end users to understand and gather requirements.
Converted business requirements into functional specification.
Designed and developed ETL solutions using Ab Initio graphs and plans as solutions to business problems for different application modules.
Designed and developed various business reports using IBM Cognos BI tool.
Coordinated and facilitated discussions between BSAs and the technical team for gap analysis, and removed roadblocks where necessary.
Actively involved in design sessions and meetings with Team leader, Group Members and Technical Manager regarding Technical solution.
Coordinated and performed release testing in SIT and UAT environments by testing various scenarios and reporting the outcomes.
Debugged, analyzed and fixed the issues found while testing.
Primarily responsible for monitoring and running scheduler chains, and for debugging and resolving any failures to ensure successful completion.
Reverse engineered DataStage jobs/Portals by extracting the logic and code to develop functional documentation.
Implemented the "Add checkpoints so sequence is restartable" and "Automatically handle activities that fail" options in job sequences.
Responsible for integrating data from OLTP to enterprise-wide data warehouse to store reliable data for downstream support system using DataStage.
Created new mappings and updated old mappings according to changes in business logic. Played a major role in understanding the business requirements and in designing and loading the data into the data warehouse.
Responsible for Gap analysis and translating business requirements into systems design.
Responsible for tracking and updating JIRAs and checking the code into Tortoise SVN.
Identified issues and bottleneck within the code and fixed them for better performing application.
Extensively worked on performance tuning and optimization of SQL statements using explain plan, SQL trace and tkprof.
Analyzed, designed and developed stored procedures, functions, packages, materialized views and triggers using PL/SQL.
Actively involved in performance tuning brainstorming and resolution.
Worked with Data Architects to come up with the database design.
Parker Hannifin Corp, Cleveland, OH Feb’11 – Dec’11
ETL/DataStage Developer
Parker Hannifin is the world's leading diversified manufacturer of motion and control technologies and systems, providing precision-engineered solutions for a wide variety of commercial, mobile, industrial and aerospace markets. The project maintained customer Quotes and Agreement data, moving it from the Order Management and Order to Sales mainframe systems to DB2 for a GUI web-based application built on Siebel.
Environment:
IBM InfoSphere 8.1/8.7 (Designer, Director, Quality Stage), DB2 UDB, SQL Server 2008, Sequential files, TOAD 8, UNIX, Siebel 7.8, Windows, XML, Sharepoint
Responsibilities:
Requirement analysis/documentation, developing functional and technical specifications, DW and ETL design, developing detailed mapping specifications, DFDs and scheduling charts.
Built SQL queries for DB2 UDB to fetch data from the database per the business requirements, and used these SQL queries in DataStage for further processing.
Designed and developed all subject modules using DataStage engine. Applied validation rules, constraints using DataStage variables and transformations.
Designed Parallel jobs using various stages like Transformer, Join, Merge, Lookup, CDC, Remove Duplicates, Filter, Modify, Aggregator and Funnel.
Converted complex job designs into different job segments and executed them through a job sequencer for better performance and easy maintenance.
Developed and modified IFB files to load data from staging area into Interface tables and into Siebel base tables using EIM
Responsible for importing LOV, Position, Employee, Contacts, Accounts and Opportunity Data from legacy systems using EIM
Created parameter sets to group DataStage job parameters and store default values in files to make sequence jobs and shared containers faster and easier to build.
Created source to target mapping and job design documents from staging area to Data Warehouse.
Developed job sequence with restart ability, check points and implemented proper failure actions.
Generated completion messages, Status reports using capabilities of job control sequence and UNIX. Responsible for UNIT, System and Integration testing. Participated in UAT (User Acceptance Testing).
Created Wait For File activity jobs, interface notifications and DataStage sequences to load the data into the target and send success/failure reports to end users via email, within a 3-week window after technical proof-of-concept initiatives.
Used DataStage Director to run and monitor jobs, and automated job control using batch logic to execute and schedule various DataStage jobs.
Worked with DataStage Director to set up production job cycles for daily, weekly and monthly loads with proper dependencies.
Maintained the support matrix and updated Remedy tickets for analysis purposes.