Summary:
. Over * years of strong experience in Information Technology, with
expertise in Analysis, Design, Development and Maintenance of Data
Warehouse applications.
. 5+ years of strong Data Warehousing experience using Informatica
6.x/7.x, Cognos Impromptu & PowerPlay 5.x/6.x/7.x, Oracle 8i/9i/10g, SQL
Server 2000/7.0, DB2, MS Access 7.0/97/2000, OLTP, ROLAP, OLAP and
Oracle SQL*Loader.
. Proficient in using Informatica PowerCenter and PowerMart for building a
Data Warehouse. Well versed with the end-to-end practices involved in
Data Warehouse Development.
. Knowledge of Data Warehousing, Informatica, Extraction, Transformation
and Loading (ETL), Data Modeling, Data Integration, Data Merging, Data
Cleansing, Data Scrubbing and Data Aggregation.
. Strong experience in Data Design/Analysis, Business Analysis, User
Requirement Gathering, User Requirement Analysis, Gap Analysis, Data
Cleansing, Data Transformations, Data Relationships, Source Systems
Analysis and Reporting Analysis.
. Worked in the role of data analyst in mapping and scrubbing sensitive
data within the application.
. Created UNIX shell scripts to schedule data cleansing scripts and
loading processes.
. Experience in dealing with data from heterogeneous sources using
Informatica PowerCenter.
. Involved in Full Life Cycle Development of building a Data Warehouse.
. Extensive expertise in implementing complex business rules by creating
reusable transformations and robust mappings/mapplets.
. Implemented performance tuning techniques on Targets, Sources, Mappings
and Sessions.
. Strong expertise in SQL performance tuning.
. Strong experience in Data Analysis, Integration, Cleansing and
Reconciliation.
. 4+ years of experience in Oracle PL/SQL Programming.
. 3+ years of programming experience using UNIX shell scripting and Perl
on HP-UX, Windows XP/2000/98/95, Windows NT 4.0, Maestro, IBM AIX 4.3
and Sun Solaris 2.7/2.8/5.9.
. Exposure to various functional areas of Data Warehousing, including
Cable, Retail & Finance.
SKILLS
ETL Tools          Informatica 5.x/6.x/7.x/8.x
Databases          Oracle 10g/9.x/8.x, DB2, SQL Server 2000/7.0, MS
                   Access 7.0/97/2000, Teradata
Oracle Tools       PL/SQL, SQL*Loader, TOAD, SQL Impact, SQL Navigator
Data Modeling      Erwin, Microsoft Visio
Reporting Tools    Cognos Impromptu, Cognos PowerPlay, Business Objects
Languages          C, C++, Java, Shell & Perl programming, Visual Basic,
                   XML, HTML
Version Control    WinCVS, ClearCase
Tools
Scheduling Tools   TIDAL, AutoSys, Control-M
Project            Primavera TeamPlay 3.0.9.1, Microsoft Project
Management
Testing Tools      Test Director 8.x/7.x
Operating Systems  MVS/ESA, IBM AIX, HP-UX, Sun Solaris, Linux,
                   Windows 2000
Hardware           IBM 3090, S390, RS 6000, SP2, HP 9000, Compaq
                   ProLiant 3850, PCs
EDUCATION
. Bachelor's in Electronics & Communications Engineering, Jawaharlal
Nehru Technological University, Hyderabad, India.
. Professional Training in Informatica PowerCenter & PowerMart, Cognos
PowerPlay, Cognos Impromptu Administration, Cognos Query & Cognos
Upfront.
. Oracle 9i SQL Certified
EXPERIENCE
Application Developer/Support Analyst, Fannie Mae, Herndon, VA
04/2010 - Present
Fannie Mae is a government-sponsored enterprise (GSE) chartered by Congress
with a mission to provide liquidity and stability to the U.S. housing and
mortgage markets. Fannie Mae operates in the U.S. secondary mortgage
market. Fannie Mae works with mortgage bankers, brokers, and other primary
mortgage market partners to help ensure they have funds to lend to home
buyers at affordable rates. Among the many ongoing projects at this
company, such as LASER, BMR I/II and PFP I/II, I was part of a project
named Service Investor and Reporting (SIR).
Responsibilities:
. Worked in the role of production support specialist and PL/SQL
developer, involving bug fixes.
. Involved in business analysis and technical design sessions with business
and technical staff to develop Entity Relationship/data models,
requirements document, and ETL specifications.
. Developed SQL and PL/SQL packages to address various ad-hoc
requirements.
. Monitored Production, UAT and Test environment jobs, which included
complex PL/SQL packages, procedures, functions and cursors.
. Triaged production support incidents, replicating code issues in
different environments to identify solutions.
. Extensively used Remedy for monitoring and recording production support
incidents.
. Extensively used ClearQuest to identify errors and perform code fixes.
. Prepared utPLSQL packages and Unit Test documents for unit testing of
deployment packages.
. Used ClearCase for version control and to prepare build packages for
deployments.
. Participated in design and code reviews.
. Prepared High Level Design, Detail Design and Business Requirements
documents.
. Involved in batch processing using BULK COLLECT in PL/SQL.
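The bulk batch-processing pattern mentioned above can be sketched as
follows. This is a minimal PL/SQL illustration; the table and column
names (stg_transactions, dw_transactions, id, amount) and the batch size
are hypothetical, not taken from the project.

```sql
-- Hypothetical sketch: move rows from a staging table to a target table
-- in batches, limiting memory use with BULK COLLECT ... LIMIT and
-- issuing a single bulk DML statement per batch with FORALL.
DECLARE
  CURSOR c_src IS
    SELECT id, amount FROM stg_transactions;      -- hypothetical source
  TYPE t_ids  IS TABLE OF stg_transactions.id%TYPE;
  TYPE t_amts IS TABLE OF stg_transactions.amount%TYPE;
  l_ids  t_ids;
  l_amts t_amts;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_ids, l_amts LIMIT 1000;  -- one batch
    EXIT WHEN l_ids.COUNT = 0;
    FORALL i IN 1 .. l_ids.COUNT                  -- bulk-bind the insert
      INSERT INTO dw_transactions (id, amount)
      VALUES (l_ids(i), l_amts(i));
    COMMIT;                                       -- commit per batch
  END LOOP;
  CLOSE c_src;
END;
/
```

Fetching with a LIMIT keeps PGA memory bounded on large tables, while
FORALL avoids the row-by-row context switching of an ordinary cursor
loop.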
Environment: Oracle 10g, PL/SQL, BMC Remedy, ClearCase UCM, ClearQuest,
TOAD, UNIX, DOORS, utPLSQL, Windows XP Professional, PL/SQL Developer,
SQL*Plus, AutoSys Scheduler.
PL/SQL and ETL Developer, PepsiCo Inc., Plano, TX
03/2007 - 03/2010
Gondola Revolution 2 (GR2) is projected to provide customers a better
way of arranging Frito-Lay products and other vendor products on the
gondola with the help of a Planogram. A Planogram is a diagram of
fixtures and products that illustrates how and where retail products
should be displayed, usually on a store shelf, in order to increase
customer purchases.
Responsibilities:
. Worked in the role of data analyst in mapping and scrubbing sensitive
data within the application.
. Involved in business analysis and technical design sessions with business
and technical staff to develop Entity Relationship/data models,
requirements document, and ETL specifications.
. Involved in data analysis and in preparing High Level and Low Level
ETL designs.
. Developed various mappings for extracting the data from different source
systems using Informatica PowerCenter 8.1.0 - Server, Workflow Manager,
Repository Manager and Designer.
. Developed mappings between multiple sources, such as flat files, DB2
and Oracle, and multiple targets.
. Used Expression, Joiner, Router, Lookup, Update Strategy, Source
Qualifier and Aggregator transformations.
. Created Informatica mappings which included data explosion to incorporate
critical business functionality to load data.
. Implemented batch processing in PL/SQL, extensively using BULK COLLECT
and bulk fetch.
. Created Mapplets for reusable business rules.
. Created Korn shell (ksh) scripts to load data from flat files into DB2
tables, schedule jobs for ETL loads, run data cleansing routines, etc.
. Involved in data cleansing operations such as conversions, migrations,
de-duping and scrubbing.
. Worked on data reconciliation using Informatica mappings to verify
correctness of loading process.
. Worked on designing Control-M scheduler for the GR2 processes.
. Performed production (baseline) work for the GR1 project, which
involved loading business-provided data into the DB2 database and
starting the application every week; solved many critical issues
affecting the daily runs.
. Worked on performance tuning of mappings, sessions, targets and
sources.
. Involved in creating and running Sessions & Workflows using Informatica
Workflow Manager and monitoring using Workflow Monitor.
. Wrote test plans and implemented Unit testing.
. Worked with Business Objects developers to create list and cross-tab
reports.
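The load-verification reconciliation described above can be sketched as
a simple comparison query. This is an illustrative sketch only; the
table and column names (stg_sales, dw_sales, amount) are hypothetical.

```sql
-- Hypothetical reconciliation: compare row counts and amount totals
-- between the source staging table and the loaded target table, and
-- flag any discrepancy introduced by the load process.
SELECT s.cnt   AS src_rows,
       t.cnt   AS tgt_rows,
       s.total AS src_total,
       t.total AS tgt_total,
       CASE WHEN s.cnt = t.cnt AND s.total = t.total
            THEN 'MATCH' ELSE 'MISMATCH' END AS status
FROM (SELECT COUNT(*) AS cnt, SUM(amount) AS total FROM stg_sales) s,
     (SELECT COUNT(*) AS cnt, SUM(amount) AS total FROM dw_sales)  t;
```

Comparing both counts and control totals catches dropped rows as well as
value-level corruption that a count alone would miss.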
Environment: Informatica PowerCenter 7.1.4/8.1.0, Workflow Manager,
Workflow Monitor, DETAIL/PowerConnect for Mainframe, Source Analyzer,
Warehouse Designer, Transformation Developer, Mapping Designer, Mapplet
Designer, Oracle 10g, Control-M Scheduler, DB2, TOAD, PL/SQL, Windows
XP, HP-UX, IBM AIX, Erwin
Software Developer, Time Warner Cable, Charlotte, NC
11/2006 - 02/2007
Time Warner Cable is a cable company, similar to Comcast, whose main
business objective is to create value for the customer so that the
marketing team can send out offers. Among the many ongoing projects at
this company, such as CVC, ODSWEB, BMS, AAD, ISA and Digital Phone, I
was involved in the CVC program.
Responsibilities:
. Automated Shell scripts to pull and load data from operational
resources into the Data Staging Area and Data Warehouse for
business intelligence reporting.
. Produced scripts to cleanse source data, ETL (Extract, Transform &
Load) data according to business rules, and built reusable mappings.
. Developed views, functions, procedures, triggers and packages using
PL/SQL & SQL to transform data from the source staging area to the
target staging area.
. Generated server side PL/SQL scripts for data manipulation and
validation and created various snapshots and materialized views for
remote instances.
. Developed SQL, PLSQL, SQL*Plus programs required to retrieve data from
the Data repository using cursors and exception handling.
. Generated DDL scripts for creation of new database objects like
tables, views, sequences, functions, synonyms, indexes, triggers,
packages, stored procedures.
. Created and used Table Partitions to further improve the performance
while using tables containing large data.
. Created SSIS Packages to import and export data from Excel, Xml and
Text files to SQL Server.
. Transformed data from various data sources using OLE DB connection by
creating various SSIS packages.
. Fine-tuned procedures for maximum efficiency in various schemas across
databases.
. Designed SSIS jobs for data delivery to internal and external users.
. Developed SQL Queries to fetch complex data from different tables in
remote databases using joins, database links and kept logs.
. Modified UNIX shell scripts for automating batch jobs.
. Analyzed business requirements with product management for new release
cycles.
. Wrote integrity checks to clean erroneous data using PL/SQL procedures
and shell scripts.
. Documented unit test cases and executed the same.
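The table-partitioning technique mentioned above can be sketched with a
small piece of DDL. The table, column and partition names are
hypothetical, chosen only to illustrate range partitioning by date.

```sql
-- Hypothetical range-partitioned fact table: queries filtered on
-- sale_date, and maintenance jobs, touch only the relevant partition
-- instead of scanning the full table (partition pruning).
CREATE TABLE sales_fact (
  sale_id   NUMBER,
  sale_date DATE,
  amount    NUMBER(12,2)
)
PARTITION BY RANGE (sale_date) (
  PARTITION p_2006_q4 VALUES LESS THAN (DATE '2007-01-01'),
  PARTITION p_2007_q1 VALUES LESS THAN (DATE '2007-04-01'),
  PARTITION p_max     VALUES LESS THAN (MAXVALUE)
);
```

A catch-all MAXVALUE partition keeps late-arriving rows from failing the
insert; old partitions can later be dropped or truncated independently.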
Environment: Oracle 9i/8i, Windows NT/2000/XP, XML, SQL, MS-SQL Server
2005, SQL*Plus, PL/SQL, SQL*Loader, SSIS, Export/Import, PVCS (Version
Control).
PL/SQL Developer, Global Technologies, India 09/2004 - 08/2006
This System was developed for a Bank in India. The Sales and Purchase Order
system has Master Maintenance module, Sales and stock management module,
Purchase Module and Report Generation module. Master Maintenance Module
stores and maintains Product details, Vendor details and Customer details.
Sales and Stock Management Module records daily transaction details such
as product sales receipts, acceptance of finished products, and stock
updates.
Responsibilities:
. System study, business analysis and prepared the system design document
and system prototypes.
. Designed and developed user friendly screens using Oracle Forms 4.5 for
all modules.
. Coded application screens, including creation of libraries, program
units and triggers to enhance screen functionality, customizing them
based on presentation logic, validating data, and populating non-input
and non-base-table items.
. Designed User Interface Screens and involved in coding and testing of the
applications.
. Extensively involved in creating database triggers, functions and
procedures, designing forms, and writing triggers for form events.
. Generated numerous reports using Oracle Reports 2.5 based on the
specifications of the required output. Reports generated include Sales
Invoices, Sales Performance, Receipts, Stock Position etc.
. Prepared the test plan and test cases document.
Environment: Oracle 8i/9i, Developer 2000 (Forms 4.5 & Reports 2.5), SQL
PLUS, PL/SQL, Windows NT.