
PowerCenter Data Modeling

Location:
Lewis Center, OH
Posted:
January 30, 2025



SRIDHAR VULIGONDA Ph: 614-***-****

Email: *******.*********@*****.***

EXPERIENCE SUMMARY

Over 16 years of IT experience in the analysis, design, development, implementation, and testing of data warehouse applications using data modeling, data extraction, data transformation, data loading, and data analysis techniques.

Has worked in the Manufacturing, Telecom, Retail, and Banking domains.

Gained Informatica ETL Architect experience while working at OSU: installed and maintained Informatica servers and clients, and created users and assigned roles through Informatica Administrator.

Documented the Informatica standards and procedures for the sustainment and development teams.

Worked on Informatica PowerCenter 9.1 and 9.5, and on the newer cloud offering, Informatica Intelligent Cloud Services (IICS).

Expertise in identifying the root causes and bottlenecks of slow-running jobs, applying performance-tuning techniques, and proposing better job designs.

Experience in designing, compiling, testing, debugging, scheduling, and running Informatica mappings through the command-line interface (CLI).

Exposure to big data concepts and tools such as Hadoop, Pig, Hive, and Sqoop.

TECHNICAL EXPERTISE

Data Warehousing

Informatica PowerMart 6.1, Informatica PowerCenter 9.1/9.5, IICS, IBM WebSphere Information Server 8.0/8.5, WebSphere DataStage 7.5.x/11.7 (Designer, Director, Administrator, Manager), Information Analyzer, MetaStage 6.0, Parallel Extender 6.0 (PX), SQL*Plus

Data Modeling

Star Schema Modeling, Snowflake Modeling, Fact and Dimension design, Physical and Logical Data Modeling, Erwin 4.1/3.5.2/3.x

Languages

SQL, PL/SQL, UNIX shell scripting, VB.NET, and ABAP SQL programming

Operating Systems

UNIX (Solaris 2.8, AIX), Windows NT/95/98/2000/XP, and MS-DOS

Databases

Oracle 10g/9i/8i/8.0/7.3, PL/SQL, DB2 UDB, Sybase SQL Server 11.0, MS SQL Server 6.5/7.0, Netezza, Teradata

Reporting Tools / Other Tools

Ticketing tools (Remedy, SMART), Oracle ODI (Oracle Data Integrator), Oracle Reports 6i, Crystal Reports, Cognos, Business Objects, Visual Basic 6.0/5.0/4.0/3.0, MS Access 97/2000, Excel, PowerPoint, UNIX, SQL*Plus, TOAD, Oracle SQL Developer, Oxygen, Jira

Cloud Technologies

Workday and AWS

PROFESSIONAL SUMMARY

1) University of Wisconsin-Madison (Remote) Oct 2022 – Present

Sr ETL Developer

The University of Wisconsin-Madison has initiated the Administrative Transformation Program (ATP) to migrate its legacy PeopleSoft data into Workday, improving performance and reducing maintenance costs. My role in the project is to convert and transform the PeopleSoft data by writing SQL queries and loading the ODS and staging tables through IICS.

Roles and Responsibilities

Developing ETL jobs in IICS, with PeopleSoft and Oracle as sources and Workday as the target.

Designing the error-logging mechanism and reprocessing the final extract files into Workday.

Discussing and coordinating with different teams within UW to gain a better understanding of the extraction and mapping requirements.

The tools used in this project are IICS, SQL Developer, Jira, and Smartsheet.

Migrating IICS mappings to AWS. Wrote PySpark scripts to fetch data from the on-premises Oracle database and store the datasets in an S3 bucket as Parquet files; these files are then loaded into the EIB Excel formats by running a Python script. In parallel, we are migrating Oracle SQL scripts to the Athena database (see the sketch at the end of this list).

Designed and developed incremental loads based on the last-updated timestamp, dynamically updating that timestamp in the parameter file (also shown in the sketch below).

Added looping logic to extract the data incrementally, using the Start Loop, End Loop, Assignment, Jump, and Decision task activities in the IICS taskflow.

Developed Workday custom reports to retrieve reference data, using indexed reports for faster data retrieval from Workday.

Developed IICS mappings to load data into Snowflake, maintaining the data warehouse for ancillary systems.

Developed mappings for data reconciliation and created audit tables to validate data completeness and accuracy.
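
A minimal sketch of the incremental Oracle-to-S3 extract pattern described above, assuming a Spark environment with the Oracle JDBC driver and S3 access configured. All connection details, table and column names, and the watermark location are hypothetical placeholders, not the project's actual values.

    import json
    import boto3
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("oracle_to_s3_extract").getOrCreate()
    s3 = boto3.client("s3")

    BUCKET = "example-atp-bucket"             # hypothetical bucket
    WATERMARK_KEY = "params/watermark.json"   # plays the role of the parameter file

    # Read the last-updated watermark persisted by the previous run.
    wm = json.loads(s3.get_object(Bucket=BUCKET, Key=WATERMARK_KEY)["Body"].read())
    last_ts = wm["last_updated"]              # e.g. "2024-01-01 00:00:00"

    # Pull only rows changed since the watermark (the incremental load).
    query = (f"(SELECT * FROM ps_job WHERE last_upd_dttm > "
             f"TO_TIMESTAMP('{last_ts}', 'YYYY-MM-DD HH24:MI:SS')) t")
    df = (spark.read.format("jdbc")
          .option("url", "jdbc:oracle:thin:@//onprem-host:1521/ORCLPDB")  # hypothetical
          .option("dbtable", query)
          .option("user", "etl_user")
          .option("password", "***")
          .option("driver", "oracle.jdbc.OracleDriver")
          .load())

    # Land the dataset in S3 as Parquet for the downstream EIB conversion step.
    df.write.mode("overwrite").parquet(f"s3a://{BUCKET}/extracts/ps_job/")

    # Advance the watermark so the next run picks up where this one left off.
    new_ts = df.agg(F.max("last_upd_dttm")).first()[0]
    if new_ts is not None:
        s3.put_object(Bucket=BUCKET, Key=WATERMARK_KEY,
                      Body=json.dumps({"last_updated": str(new_ts)}))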

Environment: IICS, AWS, Workday Release 42.1.0 (cloud ERP platform), Jira (task assignments and project tracking)

2) The Ohio State University, Columbus, OH Nov 2016 – Sep 2022

Sr ETL Developer

The Enterprise Project is a business transformation that will improve the Ohio State experience and advance operational excellence through Workday and other modern, effective services and technology.

It includes optimizing finance, HR, payroll, student services and IT business processes; replacing core administrative systems (currently PeopleSoft) with Workday; and improving decision-making through a single source of trusted administrative data.

I was involved in this project from the beginning. The Ohio State University gave me a great opportunity to build the proof of concept (POC) for this project, and the project now has a team of 9 ETL developers working across various modules.

As of Jan 2021, the HR, Finance, and Supply Chain conversions had gone live on Workday; I subsequently worked on the Student conversion.

Roles and Responsibilities

Reviewed and analyzed the mapping documents developed by the design team.

Developed Informatica PowerCenter 9.5 ETL mappings to generate highly complex hierarchical XML files, which were then loaded into Workday through iLoads.

Created and loaded the EIBs into Workday.

Created custom reports to validate the data and to retrieve reference data used as lookups in subsequent jobs.

Developed ETL mappings using Expression, Router, Normalizer, Unconnected Lookup, Joiner, Source Qualifier, and Target transformations.

Generated XSD files from the sample XML for configuring the hierarchical XML transformation in Informatica PowerCenter 9.5.

Created sessions, mapplets, and workflows.

Helped the team clear technical roadblocks during the development and release phases.

Developed Workday custom reports to retrieve reference data, using indexed reports for faster data retrieval from Workday.

Worked on AWS Glue 2.0/3.0 for the University's ETL tool evaluation. Wrote Python/PySpark scripts to connect to the on-premises Oracle database, generate comma-delimited target files, and load them into an AWS S3 bucket and a Snowflake database (see the sketch at the end of this list).

Trained in the Microsoft Power BI and Tableau reporting tools.

Created custom reports for Suppliers, Customers and Location Hierarchies using Power BI.

Worked with the Oracle ODI (Oracle Data Integrator) tool to extract data from some of the ancillary systems.
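
A minimal AWS Glue job sketch matching the evaluation described above: read an on-premises Oracle table over JDBC and land comma-delimited files in S3, from which a Snowflake load can proceed. The connection URL, credentials, table, and bucket are hypothetical placeholders.

    import sys
    from pyspark.context import SparkContext
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Source: on-premises Oracle table reached over JDBC (hypothetical endpoint).
    dyf = glue_context.create_dynamic_frame.from_options(
        connection_type="oracle",
        connection_options={
            "url": "jdbc:oracle:thin://@onprem-host:1521/ORCLPDB",
            "user": "etl_user",
            "password": "***",
            "dbtable": "suppliers",
        },
    )

    # Target: comma-delimited files in S3, ready for a downstream Snowflake COPY.
    glue_context.write_dynamic_frame.from_options(
        frame=dyf,
        connection_type="s3",
        connection_options={"path": "s3://example-osu-bucket/extracts/suppliers/"},
        format="csv",
        format_options={"separator": ","},
    )

    job.commit()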

Environment: Informatica PowerCenter 9.5, Workday Release 31.0 (cloud ERP platform), Oxygen (XML editor), Jira (task assignments and project tracking)

3) Inxite Health Systems, Dublin, OH Jan 2016 – Nov 2016

Sr ETL Developer

Inxite Health entered into an agreement with the Ohio Governor's Office of Health Transformation for the Coding and Quality Measures project, primarily to build a framework as part of the Patient Centered Medical Home (PCMH) program design. Within the scope of quality measures, data warehouse developers were required to develop a measurement framework to support up to 40 selected health care quality measures for the provider performance scorecards. A series of SQL queries was developed to provide accurate data for each measure in the Inxite Health Systems healthcare analytics data warehouse environment.

Roles and Responsibilities

Analyzed the measures to identify all sources and confirm data availability.

Reviewed functional design documents for completeness and accuracy.

Performed data profiling, including assessing the availability of data sources.

Developed the source-to-target mappings for measures.

Developed ETL jobs using Informatica PowerCenter 9.1, with Source Qualifier, Expression, Stored Procedure, Aggregator, Sorter, Router, and Unconnected Lookup transformations.

Reviewed the detailed technical specifications for the proposed solutions and coordinated with the business for sign-off.

Reviewed, understood, and prioritized story cards in an Agile development environment.

Developed and executed validation scripts.

Promoted code to the production environment using the Harvest version control tool.

Environment: Microsoft SQL Server 2012, Informatica PowerCenter 9.1, UNIX

4) OAKS BI Upgrade, Columbus, OH Feb 2015 – Dec 2015

Sr Datastage Consultant

The Ohio Administrative Knowledge System (OAKS) is a BI project covering subject areas such as HCM, Finance, and IT Spend. The existing HP platform was upgraded to the Exa platform for better performance. As part of the upgrade from DataStage 7.5 to DataStage 8.5 Server Edition, many jobs were modified in the newer version to be compatible with the Exa platform. The source for the BI is PeopleSoft EPM and the target is Oracle, from which the final reports are delivered to the employees of the State of Ohio.

Roles and Responsibilities

Migrated existing DataStage 7.5 jobs to DataStage 8.5 Server Edition, including modifications to some jobs to improve performance and ensure version compatibility.

Developed DataStage jobs using Transformer, Sequential File, and Hashed File stages, routines, and shared containers.

Involved in DataStage Administrator activities such as adding users and assigning roles, restarting the DS Engine, and unlocking jobs.

Designed and developed jobs for Destructive Loads.

Validated the data in Cognos reports.

Environment: IBM InfoSphere Information Server DataStage 8.5, UNIX, Oracle 11g, SQL, and UC4 scheduling.

5) Agent Identity, Erie Insurance Group, Erie, PA Feb 2014 – Jan 2015

Sr ETL Designer

Agent Identity is an application that extracts agent information from MDM (Master Data Management), uniquely identifies agent groups, accounts, agents, and sub-agents, and redefines the hierarchy between these entities. The existing system had a set of complex stored procedures that extracted the data from MDM and populated change tables, which in turn were used by Tivoli's Active Directory for reporting purposes.

Roles and Responsibilities

Migrated existing stored procedures into DataStage jobs, adding automated email alerts and error logging as additional value.

Developed DataStage jobs using the Transformer, ODBC, Aggregator, Sort, Unstructured Data, Sequential File, and Remove Duplicates stages.

Developed a data model to capture the changes for the incremental load.

Wrote JCL scripts to schedule the DataStage jobs in the Indesca scheduling tool.

Environment: IBM InfoSphere Information Server DataStage 8.5, UNIX AIX, Oracle 11g, PL/SQL

6) Global Warranty Management, General Motors, Detroit Jan 2011 – Feb 2014

DataStage Technical Lead/Architect

Global Warranty Management (GWM) is a centralized application where dealers can submit their vehicle warranty claims, which are processed according to pre-defined business processes in SAP and DataStage. The GWM application maintains the vehicle, part price, and warranty claim information submitted by various regions around the world. ETL is the phase that extracts data from SAP and various other source systems, processes it, and sends it to various downstream systems and to Cognos reporting.

Roles and Responsibilities

Prepared the weekly ETL rejections report and presented it to the GM business.

Presented the production changes for upcoming weekend deployments to the GM CAB manager and obtained approval.

Understood the business requirements and contributed to the requirements documentation by participating in stakeholder meetings.

As a red team member, reviewed the code developed by the innovation team to ensure it met the standards, and identified dependencies and impacts before code moved to production.

Involved in the migration of DataStage servers from Toronto to the Warren Tech Center.

Installed and configured the DataStage 8.5 server and clients at the new data center.

Created users and assigned roles through the Web Console and DS Administrator.

Resolved the issue with OR logging and RT logging in DataStage 8.1.

Interacted with the data modelers and business analysts to understand the data model, and prepared the architecture documents and functional flow diagrams per the business requirements.

Prepared the functional design and technical design documents based on the business requirements.

Coded the DataStage parallel jobs using DataStage Designer and prepared the unit test documents to test the code.

Fully involved in performance tuning of the DataStage jobs, using techniques such as tuning the environment variable file, designing and editing configuration files, and increasing read/write speeds when fetching from or loading to files and databases.

Designed and developed stored procedures and called them from DataStage as needed.

Implemented a generic error-logging technique for each DataStage job, logging the rejection information in the CSF database along with the run number, rows processed, rows rejected, and process start and end times (a sketch of this pattern follows this list).

Reviewed code to maintain proper logic implementation and quality standards.

Imported and exported DataStage jobs.

Experience in Parallel Extender (DS-PX), developing parallel jobs using various stages including Lookup, Aggregator, Join, Transformer, Sort, Merge, Filter, Compare, Funnel, Remove Duplicates, Change Data Capture, Peek, Copy, Row Generator, Column Generator, and the ABAP stage (R/3 plug-in).

Wrote ABAP extraction queries, tuned query performance in SAP, and migrated the data into BAPI stages as targets.

Developed DataStage jobs using DataStage Designer, with in-depth experience in stages including Hash Lookup, MetaStage, ProfileStage, Inter-Process, Link Collector, and Link Partitioner.

Used DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Resolved critical, high-severity production issues.

Developed cubes and hierarchies in Cognos reporting.
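
A sketch of the generic error-logging pattern described above, translated into Python with a DB-API connection for illustration (the original was implemented inside DataStage). The audit table and column names are hypothetical.

    def log_job_audit(conn, job_name, run_number, rows_processed,
                      rows_rejected, started_at, ended_at):
        # One audit row per job run: run number, row counts, start/end times.
        conn.cursor().execute(
            """INSERT INTO csf_etl_audit
               (job_name, run_number, rows_processed, rows_rejected,
                start_time, end_time)
               VALUES (?, ?, ?, ?, ?, ?)""",
            (job_name, run_number, rows_processed, rows_rejected,
             started_at, ended_at),
        )
        conn.commit()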

Environment: IBM DataStage 7.5.3 (Designer, Director, Manager, and Administrator), Sun Solaris 5.10, Tivoli Scheduler, DB2, Oracle, SAP 5.0, DataStage Version Control, ABAP SQL, and Cognos

7) CIM Redesign, AT&T, Dallas, TX Apr 2010 – Dec 2010

IBM DataStage Developer/Application Designer

Customer Information Management (CIM) is an application that provides information about the customers of AT&T. AT&T provides various services to its customers, such as wireless, wireline, digital TV, and broadband, and each service has its own systems to maintain billing and customer information. AT&T took the initiative to integrate these systems' data into a centralized enterprise corporate data warehouse called eCDW. This system, named CIM (Customer Information Management), was implemented and is being used by AT&T.

Roles and Responsibilities

Used DataStage 8.5 as the ETL tool to extract data from source systems such as flat files (sequential files), file sets, and data sets into the target system.

Responsible for DataStage administration and production issues.

Performed coding, unit testing, performance testing, subsystem testing, and UAT.

Prepared unit test cases and unit test plans.

Reviewed code to maintain proper logic implementation and quality standards.

Experience in Parallel Extender (DS-PX), developing parallel jobs using various stages including Lookup, Aggregator, Join, Transformer, Sort, Merge, Filter, Funnel, Remove Duplicates, Change Data Capture, Copy, Row Generator, and Column Generator.

Developed DataStage jobs using DataStage Designer, with in-depth experience in stages including Hash Lookup, Inter-Process, Link Collector, and Link Partitioner.

Used DataStage Designer to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Created QualityStage jobs using Information Server 8.5 and performed parsing, analysis, and organization of data patterns for data integrity.

Used QualityStage to perform investigation, standardization, matching, and survivorship on the data.

Worked with the production support team to resolve ETL issues and perform tuning.

Developed Technical Design Documents/Design Specification Documents, Mapping Specification Documents, ETL Component Specification Documents and ETL Derivation specifications.

Environment: IBM Information Server DataStage 8.5 (Designer, Director, Administrator), QualityStage 8.5, UNIX AIX, DB2, Teradata, PL/SQL

8) Shared Services (CX-US), Chrysler Group LLC, Detroit Jul 2008 – Apr 2010

ETL Lead

This project consisted of converting complex stored procedures into DataStage 7.5.1. The Over/Under stored procedures identify, for each plant, whether it is over-producing or under-producing vehicles relative to the production plan. The existing stored procedures had no error-handling mechanism and were very difficult to debug whenever an error occurred. The new DataStage conversion resolved the error-handling issues and is easy to maintain.

Roles and Responsibilities

Understood the requirements and ETL design, guided the team on technical and functional details, and was involved in business discussions.

Migrated the existing stored procedures into DataStage Jobs and performed gap analysis for all the Jobs.

Conducted unit and system tests for the Jobs and documented the testing process.

Provided on-the-job DataStage training to the development team, bringing the team up to speed to meet customer expectations.

Conducted WebEx sessions with the client and demonstrated the DataStage jobs and the test results of the project.

Prepared estimates for BI/DW opportunities for other projects.

Contributed to internal BI competency development.

Created a universe in Business Objects for developing the reports.

Environment: IBM WebSphere DataStage 7.5, DB2, PL/SQL, and UNIX.

9) Shared Services (CX-US), Chrysler Group LLC, Detroit Jun 2006 – Jul 2008

ETL Developer

TCS set up an offshore team to help Chrysler support its data warehouse and business intelligence initiatives. Chrysler maintains an EDW (Enterprise Data Warehouse), for which we extracted data from various source systems and developed the dimensions and facts used for FLEET project reporting.

Roles and Responsibilities

As a team member, involved in project development, implementation, and maintenance using DataStage 7.5.1.

Loaded data from DB2 extract files into the EDW and from the EDW into the respective data mart tables.

Worked extensively on the DataStage Designer, Manager, Director, and Administrator tools.

Used the DB2 API, Funnel, IPC, Merge, Pivot, Transformer, Aggregator, and Shared Container stages in developing the ETLs.

Created sequences and implemented error logging for rejected rows and business rule failures, including writing DataStage routines to handle the rejected rows.

Developed UNIX scripts for bulk loading DB2 tables using the DB2 utilities (IMPORT and LOAD); a sketch of this pattern follows this list.

Conducted unit and system tests for the Jobs and documented the testing process.
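
A minimal sketch of the bulk-load pattern described above, written here as a small Python wrapper around the DB2 command-line utilities rather than the original UNIX shell scripts. The database alias, file paths, and table names are hypothetical placeholders.

    import subprocess

    DB = "EDWDB"  # hypothetical database alias
    LOADS = [
        ("/data/extracts/fleet_dim.del", "edw.fleet_dim"),
        ("/data/extracts/fleet_fact.del", "edw.fleet_fact"),
    ]

    def run_db2(sql: str) -> None:
        # The DB2 CLP keeps its connection alive across db2 calls in one shell.
        script = f'db2 "CONNECT TO {DB}" && db2 "{sql}" && db2 "CONNECT RESET"'
        subprocess.run(["bash", "-c", script], check=True)

    for path, table in LOADS:
        # LOAD is the fast, minimally logged path; the logged alternative is
        # IMPORT FROM <file> OF DEL INSERT INTO <table>.
        run_db2(f"LOAD FROM {path} OF DEL REPLACE INTO {table}")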

Environment: IBM WebSphere DataStage 7.5, Oracle, DB2 UDB, PL/SQL, and UNIX.

10) iShare Performance Reporting Wave III, Whataburger Inc., Texas Mar 2005 – Jun 2006

DataStage Developer

iShare is an initiative launched by Whataburger Inc. to implement a new Enterprise Resource Planning (ERP) application from JD Edwards, streamlining current processes and better equipping the company to handle its planned future expansion.

Roles and Responsibilities

Loaded data from JD Edwards into staging and cleansed it using the appropriate business rules.

Worked extensively on the DataStage Designer, Manager, Director, and Administrator tools.

Used the ODBC, DRS, Hashed File, IPC, Merge, Pivot, Transformer, Aggregator, and Shared Container stages in developing the ETLs.

Created sequences and implemented error logging for rejected rows and business rule failures.

Environment: DataStage 7.5, SQL Server, MicroStrategy.

11) iShare Performance Reporting Wave II, Whataburger Inc., Texas Mar 2004 – Mar 2005

Informatica PowerMart 6.1 Developer

iShare is an initiative launched by Whataburger Inc. to implement a new Enterprise Resource Planning (ERP) application from JD Edwards, streamlining current processes and better equipping the company to handle its planned future expansion.

Roles and Responsibilities

Loaded data from JD Edwards into staging and cleansed it using the appropriate business rules.

Worked extensively on the DataStage Designer, Manager, Director, and Administrator tools.

Used the ODBC, DRS, Hashed File, IPC, Merge, Pivot, Transformer, Aggregator, and Shared Container stages in developing the DataStage ETLs.

Created sequences and implemented error logging for rejected rows and business rule failures.

Created mappings, sessions, and workflows.

Used Source Qualifier, Aggregator, Lookup, and Normalizer transformations to develop the Informatica ETLs.

Environment: Informatica PowerMart 6.1, DataStage 7.5, SQL Server, TOAD, MicroStrategy.

CERTIFICATIONS / TRAININGS

Informatica Certified Cloud Lakehouse Data Management (Foundation Level)

IBM Certified Solution Developer - InfoSphere DataStage v8.5

Conducted corporate training for the Sony and Baxter accounts.

Familiar with AWS concepts, with hands-on experience creating IAM users, IAM roles, ELBs, EC2 instances, and VPCs.

EDUCATIONAL QUALIFICATION

Master's degree in Computer Applications (M.C.A.) from Kakatiya University.

Bachelor's degree in Science (B.Sc.) from Osmania University.


