Swapna J
************@*****.***
PROFESSIONAL SUMMARY:
. 7+ years of IT experience in the design, analysis, development,
  maintenance and implementation of Data Warehousing systems as an ETL
  Developer.
. Experience in Data Warehouse requirements gathering, analysis, design,
conversion of business requirements into high level functional
specifications
. Wrote technical specifications, test plans and test data to support
  ETL processes.
. Strong ETL skills using Informatica on complex, high-volume Data
  Warehousing projects in both Windows and UNIX environments.
. Extensively worked on Informatica power center tools - Designer,
Workflow Manager, Workflow Monitor and Repository Manager.
. Extensively worked on Informatica Designer components - Source
  Analyzer, Warehouse Designer, Transformation Developer, Mapplet
  Designer and Mapping Designer.
. Experienced in Installation, Configuration of Informatica Power Center
Client & Server.
. Experienced in designing and developing complex mappings using varied
  transformation logic such as unconnected and connected Lookup, Router,
  Filter, Expression, Aggregator, Joiner, Update Strategy and
  Normalizer.
. Strong experience in creating Stored Procedures, Indexes, Functions
using PL/SQL.
. Involved in Informatica Power Center upgrade from 7.1/6.x to 8.5.1.
. Extensively used Informatica Repository Manager and Workflow Monitor.
. Experienced in Performance Tuning including identifying and
eliminating bottlenecks of sources, mappings, sessions and targets.
. Experienced in the integration of various data sources like Oracle,
DB2, XML files, Flat Files, SQL Server.
. Excellent skills in retrieving the data by writing simple/complex SQL
Queries.
. Experience in debugging mappings. Identified bugs in existing mappings
by analyzing the data flow and evaluating transformations.
. Expertise in creating mappings, mapplets and reusable transformations.
. Good knowledge in Scheduling Tools like Tivoli, Tidal and Autosys.
. Experienced in Version Control by using CVS, VSS and Clear Case.
. Strong technical background with excellent analytical and
  communication skills, creativity, leadership qualities, team spirit
  and, above all, a positive attitude toward shouldering any
  responsibility or challenge.
TECHNICAL SKILLS
ETL Tools Informatica Power Center 6.x/7.x/8.1/8.6.1/9.1.
OLAP/DSS Tools Business Objects XI r2/6.5.1
Databases Oracle 8i/9i/10g/11g, DB2, MS SQL Server
2005/2000.
Others TOAD, PL/SQL Developer, SQL Navigator,
Test Director, WinRunner
Database Skills Stored Procedures, Database Triggers and packages
Data Modeling Tools Physical and Logical Data Modeling using ERWIN
3.5/4.5/7.x
Languages C, C++, Java, UNIX shell scripts, XML, HTML
Operating Systems Windows NT/ 2000/2003, LINUX, UNIX
EDUCATION QUALIFICATIONS
. Master of Computer Applications (MCA), Nagarjuna University, India.
PROFESSIONAL EXPERIENCE
Kohl's Department Stores, Menomonee Falls, WI
Dec 2012 - Present
Role: ETL Developer
Store Dashboard Data Integration
Description: The objective of Kohl's EDW - Store Dashboard Data
Integration project is to integrate data for Loss Prevention, Payroll,
Inventory Shortage and Hiring information into the dashboard reports
built on Business Objects at the store, district and region levels.
Store Dashboard Data Integration ETLs the related data from source flat
files and a transactional database into the EDW and from there into the
data mart. The ultimate goal of the project is to auto-populate
calculated values into the reports, avoiding manual calculations for
business analysis.
Responsibilities:
. Performed analysis and requirements gathering by coordinating with
  business users.
. Prepared high-level and low-level design specifications.
. Prepared source-to-target mapping documents for ease of ETL
  implementation.
. Prepared specifications for the new data model by coordinating with
  the Data Architect and DBA.
. Created and reviewed Informatica mappings, sessions and workflows.
. Created test data and test scripts, performed testing and reviewed
  test results.
. Extensively worked on creating and reviewing SQL scripts and UNIX
  shell scripts.
. Tuned embedded SQL in Informatica mappings.
. Extensively tuned Informatica mappings, sessions by following best
practices to avoid unnecessary caching of records by the lookups.
. Created and used native SQL for updates where Informatica took more
  time on huge tables.
. Purged old data using Post-SQL in Informatica mappings.
. Involved in development of Informatica interfaces.
. Involved in periodic team Meetings to get regular updates from the
team.
. Involved in pulling data from XML files, flat Files, SQL Server into
Data warehouse and then Data Mart.
. Participated in logical and physical design and star-schema modeling
  for the data marts.
. Wrote complex SQL scripts to avoid Informatica Joiner, Union and
  Lookup transformations, improving performance given the heavy data
  volume.
. Provided requirements to the Business Objects development team to
create reports.
. Involved in working with the Business Objects team to create required
Universe measures to auto populate values at the report level.
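The purge-and-update pattern above can be sketched in SQL. This is a
minimal illustration only; the schema and column names (edw.store_metrics_f,
stg.payroll_s, load_dt and so on) are hypothetical, not Kohl's actual model.

```sql
-- Hypothetical Post-SQL purge: drop rows older than a 36-month
-- retention window after the session finishes loading the target.
DELETE FROM edw.store_metrics_f
 WHERE load_dt < ADD_MONTHS(TRUNC(SYSDATE), -36);

-- Hypothetical set-based native update, replacing a slow row-by-row
-- Informatica update strategy on a large table.
UPDATE edw.store_metrics_f f
   SET f.payroll_amt = (SELECT s.payroll_amt
                          FROM stg.payroll_s s
                         WHERE s.store_id    = f.store_id
                           AND s.week_end_dt = f.week_end_dt)
 WHERE EXISTS (SELECT 1
                 FROM stg.payroll_s s
                WHERE s.store_id    = f.store_id
                  AND s.week_end_dt = f.week_end_dt);

COMMIT;
```

A single set-based UPDATE like this lets Oracle do the work in one pass,
which is typically far faster than updating millions of rows one at a
time through a mapping.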
Environment: Informatica Power Center 9.1, Windows XP/2003, UNIX, XML,
Oracle 10g, IBM Mainframes, SQL server 2005, VISIO, SQL and PL/SQL.
Snap-On Business Solutions, Richfield, OH
Oct 2011 - Nov 2012
Role: Informatica ETL Developer
GM Cornerstone Data Integration
Description: The primary objective is to migrate the current GM EPC4 to
the Cornerstone platform in a phased release. The go-to-market strategy
also includes the end-of-life plan for the current GM EPC. The migration
plan for Cornerstone carefully considers the end user, to minimize
disruption, avoid retraining impact, and let users begin using the new
GM EPC immediately upon performing a normal monthly update.
Approach:
Deliver a solution that will enable GM to be the number one parts catalog
provider across all OEMs in all regions of the world. This rating is
ultimately determined by the end users (dealers) of the parts catalog. The
primary goal is to sell more parts starting with improving the end user
experience.
Responsibilities:
. Performed analysis and requirements gathering by coordinating with
  business users.
. Prepared high-level and low-level design specifications.
. Prepared source-to-target mapping documents for ease of ETL
  implementation.
. Prepared specifications for the new data model by coordinating with
  the Data Architect and DBA.
. Created and reviewed Informatica mappings, sessions and workflows.
. Created test data and test scripts, performed testing and reviewed
  test results.
. Extensively worked on creating and reviewing SQL scripts and UNIX
  shell scripts.
. Tuned embedded SQL in Informatica mappings.
. Extensively tuned Informatica mappings, sessions by following best
practices to avoid unnecessary caching of records by the lookups.
. Created and used native SQL for updates where Informatica took more
  time on huge tables.
. Purged old data using Post-SQL in Informatica mappings.
. Involved in development of Informatica interfaces.
. Involved in periodic team Meetings to get regular updates from the
team.
. Involved in pulling data from XML files, flat Files, SQL Server into
Data warehouse and then Data Mart.
. Participated in logical and physical design and star-schema modeling
  for the data marts.
. Wrote complex SQL scripts to avoid Informatica Joiner, Union and
  Lookup transformations, improving performance given the heavy data
  volume.
Environment: Informatica Power Center 8.6.0, Windows 7 Pro, UNIX, XML,
Oracle, VISIO, SQL and PL/SQL, JIRA, AccuRev, Oracle SQL Developer.
Wachovia National Bank, NJ
Aug 2010- Sep 2011
Role: ETL/Informatica Developer
Financial Reporting Data warehouse
Wachovia National Bank is one of the leading banks in the world, with
its base in the US and services spreading across the globe. Wachovia
Bank is also one of the nation's leading mortgage companies. The FRW
(Financial Reporting Warehouse) is a strategic initiative intended to
re-invent the way Wachovia National Bank designs, implements and
distributes up-to-date analysis and information on consumer and
commercial loans to management for quick and effective business
decision-making.
The Data Warehouse was populated on a daily basis from underlying OLTP
applications using Power Center. The data comes through various data
marts and from the outsourced vendor, Metavante System. Using
Informatica Power Center, the bank carries out the extraction,
transformation and loading into the data warehouse.
Responsibilities:
. Interacted with Business Analysts to clearly understand business
  requirements.
. As an active member of warehouse design team, assisted in
creating fact and dimension tables as per requirement.
. Design of the Data Warehousing tables was carried out in STAR
schema methodology.
. Extensively used Informatica Power Center to create data maps for
extracting the data from various Relational systems and loading
into Oracle Staging tables.
. Performed Data cleansing using external tools like Name Parser
and DataFlux.
. Extensively used Informatica client tools Source Analyzer, Warehouse
designer, Mapping designer, Mapplet Designer, Transformation
Developer.
. Implemented various integrity constraints for data integrity
like Referential Integrity, using Primary key and Foreign
keys relationships.
. Developed numerous Complex Informatica Mappings, Mapplets and reusable
Transformations.
. Designed and created complex source to target mapping using various
transformations inclusive of but not limited to Aggregator, Joiner,
Filter, Source Qualifier, Expression and Router Transformations.
. Used the different workflow tasks (Session, Assignment, Command,
  Decision, Email, Event-Raise, Event-Wait, Control).
. Optimized Query Performance, Session Performance and Reliability.
. Configured the mappings to handle the updates to preserve the existing
records using Update Strategy Transformation (Slowly Changing
Dimensions SCD Type-2).
. Implemented stored procedures, functions, views, triggers, packages
in PL/SQL.
. Used Source Pre-Load, Source Post-Load, Target Pre-Load, Target Post-
Load functionality of Stored Procedure Transformation.
. Performed Performance tuning of targets, sources, mappings and
session.
. Involved in Partitions of Pipelines to improve performance.
. Created UNIX shell scripts and cron jobs for batch processing;
  excellent experience using the Tivoli job scheduler.
. Used Test Director to log the defects and coordinated with Test
team for a timely resolution.
. Provided Production Support at the end of every release.
. Generated OLAP reports using Business Objects from Oracle target
database.
. Created Multi-Dimensional analysis using Slice/Dice and Drill
methods to organize the data along a combination of "Dimension" and
"Hierarchies".
. Documented Technical specifications, business requirements and
functional specifications for the Informatica Extraction,
Transformation and Loading (ETL) mappings.
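The SCD Type-2 handling above can be sketched in plain SQL (inside the
mapping, the Update Strategy transformation implements the same logic).
The customer_dim/stg_customer tables and their columns are illustrative
assumptions, not the bank's actual model.

```sql
-- Step 1: expire the current dimension row when a tracked attribute
-- changed in the staged source data.
UPDATE customer_dim d
   SET d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
 WHERE d.current_flg = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.segment <> d.segment));

-- Step 2: insert a new "current" version for changed rows (just
-- expired above) and for brand-new customers.
INSERT INTO customer_dim
  (customer_key, customer_id, address, segment,
   eff_start_dt, eff_end_dt, current_flg)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.segment,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM customer_dim d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flg = 'Y');
```

The effective-date pair plus current-row flag is what lets reports see
history as of any date while joins against the fact table stay simple.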
Environment: Informatica Power Center 8.1/7.1, Oracle 10g/9i, UNIX, SQL
SERVER, PL/SQL, SQL*Loader, Business Objects, Windows NT/2000.
Cisco System Inc, San Jose, CA
Oct 2009- July 2010
Role: ETL/Informatica Developer
Capital IT Remarketing
Description: Cisco (CSCO) is the leading supplier of networking
equipment and network management for the Internet. Products include
routers, hubs, Ethernet switches and software offerings used to create
Internet solutions that make networks possible, providing easy access to
information anywhere, at any time. Cisco Capital IT is a group within
Cisco Systems, Inc. for loans and leasing, involved in remarketing goods
that come off lease. These goods are sent to the respective refurbishing
centers (repair sites) for repair; various theaters located in the USA,
APJ and EMEA refurbish the products and then remarket them through
different applications.
Responsibilities:
. Coordinated the meetings with Architects and Business analysts for
requirement gathering, business analysis to understand the business
requirement and to prepare Technical Specification documents (TSD) to
code ETL Mappings for new requirement changes.
. Worked with source-system owners on day-to-day ETL progress
  monitoring, data warehouse target schema design (star schema) and
  maintenance.
. Worked with Team Lead and Data Modeler to design data model using
Erwin 7.1
. Extensively worked on multiple sources like Oracle 10g, DB2, MS SQL
server, XML and delimited flat files to staging database and from
staging to the target XML and Oracle Data Warehouse database.
. Responsible for developing, supporting and maintaining the ETL
  (Extract, Transform and Load) processes using Informatica Power Center
  8.6.1/7.1.4.
. Created Informatica mappings with PL/SQL procedures/functions to
  build business rules for loading data, using transformations such as
  Source Qualifier, Router, Aggregator, connected and unconnected
  Lookup, Union, Filter and Sequence Generator.
. Worked on requirements gathering for the B2B XML feeds (Customer and
  Product).
. Developed XML feeds using the XML Parser transformation for
  large-scale data transmission.
. Developed functional and Technical design documents for implementing
Informatica web services.
. Debugged invalid mappings using breakpoints; tested stored procedures
  and functions, Informatica sessions, batches and the target data.
. Used SQL tools like TOAD 9.6.1 to run SQL queries and validate the
data in warehouse.
. Used Unix Shell Scripts for automation of ETL Batch Jobs.
. Involved in bulk data loading; used a J2EE web service to FTP the XML
  feeds (Customer and Product) to B2B.
. Performed various operations like scheduling, publishing and
retrieving reports from corporate documents using the business objects
reporting.
. Conducted interviews and organized sessions with Business and SMEs
located at various theaters across the globe to obtain domain level
information and conduct analysis.
. Prepared test script documentation for load testing and was involved
  in ping tests conducted on servers from different theaters across the
  globe.
. Participated in weekly end-user meetings to discuss data quality and
  performance issues, ways to improve data accuracy, and new
  requirements.
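Warehouse validation with a SQL tool like TOAD usually reduces to
reconciliation queries of the following shape; the table names here are
illustrative, not Cisco's actual schema.

```sql
-- Hypothetical row-count reconciliation between staging and the
-- warehouse for today's load; a non-zero diff flags a load problem.
SELECT s.cnt AS stage_rows,
       t.cnt AS dw_rows,
       s.cnt - t.cnt AS diff
  FROM (SELECT COUNT(*) AS cnt
          FROM stg.orders_s
         WHERE load_dt = TRUNC(SYSDATE)) s,
       (SELECT COUNT(*) AS cnt
          FROM dw.orders_f
         WHERE load_dt = TRUNC(SYSDATE)) t;
```

The same pattern extends to sums of key measures (amounts, quantities)
when row counts alone are not enough to prove the load is complete.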
Environment: Informatica Power Center 7.1.4/8.6.1, Oracle 10g/9i, Linux,
XML, SQL SERVER, PL/SQL, TOAD 9.6.1, Unix, Business Objects XI, Shell
Scripts, SQL Navigator.
Sedgwick CMS, Memphis, TN
Oct 2008 - Sept 2009
Role: ETL, SQL Developer
Health care/ Medicare
Description: Sedgwick CMS is a claims management company, providing
innovative claims management solutions for insurance products to major
employers. Sedgwick provides services such as claims administration,
managed care, program management and other related services. Health Care
is a suite of web-based tools that provide online access to the claims
information, data management and reporting features of the Sedgwick
JURIS claims information system. Using the Health Care suite, clients
can report new claims, view/track claims information in IMS and generate
customized reports online.
Responsibilities:
. Involved in requirement analysis, ETL design and development for
extracting data from the source systems like Oracle, flat files, XML
files and loading into datamart.
. Converted functional specifications into technical specifications
(design of mapping documents).
. Designed logical and physical ER data models using the Erwin tool.
. Developed complex mappings to load data from multiple source systems
like Oracle 10g, Teradata, DB2, flat files and XML files to datamart
in Oracle database.
. Developed complex mapping logic using various transformations like
Expression, Lookups (Connected and Unconnected), Joiner, Filter,
Sorter, Router, Update strategy and Sequence generator.
. Wrote complex SQL scripts to avoid Informatica Joiner, Union and
  Lookup transformations, improving performance given the heavy data
  volume.
. Loaded data from flat files to temporary tables in Oracle database
using SQL*Loader.
. Used global temporary tables for processing one-shot bulk DML
  operations on huge data sets coming from the GUI.
. Extensive use of SQL queries, table joins and SQL functions, and of
  PL/SQL with cursors, REF cursors and loops.
. Fine-tuned SQL queries and PL/SQL blocks for maximum efficiency and
  fast response using Oracle hints, explain plans and trace sessions
  with cost- and rule-based optimization.
. Used Korn shell scripts to automate Oracle script execution.
. Extensively used ETL job scheduling tools like Tidal.
. Coordinated with DBA's in resolving the database issues that lead to
production job failures.
. Involved extensively in Unit testing, integration testing, system
testing and UAT.
. Participated in weekly end-user meetings to discuss data quality and
  performance issues, ways to improve data accuracy, and new
  requirements.
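A global temporary table of the kind used above might look like the
following sketch; the claims tables and columns are illustrative
assumptions. Rows in an Oracle GTT are private to the session, and ON
COMMIT DELETE ROWS clears them automatically, which suits one-shot bulk
DML from the GUI.

```sql
-- Session-scoped scratch table: each session sees only its own rows.
CREATE GLOBAL TEMPORARY TABLE claim_work_gtt (
    claim_id    NUMBER        NOT NULL,
    status_cd   VARCHAR2(10),
    amount      NUMBER(12,2)
) ON COMMIT DELETE ROWS;

-- Typical use: stage one batch, apply it set-based, then commit
-- (which also empties the GTT).
INSERT INTO claim_work_gtt (claim_id, status_cd, amount)
SELECT claim_id, status_cd, amount
  FROM stg_claims
 WHERE batch_id = :batch_id;

UPDATE claims c
   SET c.amount = (SELECT g.amount
                     FROM claim_work_gtt g
                    WHERE g.claim_id = c.claim_id)
 WHERE EXISTS (SELECT 1
                 FROM claim_work_gtt g
                WHERE g.claim_id = c.claim_id);
COMMIT;
```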
Environment: Oracle 10g/9i, PL/SQL, SQL* Loader, TOAD 9.5, Sybase,
Informatica Power Center 8.1/7.1.4, Reports 9i, Crystal reports, ERWIN 7.x,
MS Visio, UNIX
Pieco Infotech, India
Feb 2007 - Sep 2008
Role: ETL/Database Developer
Birla Sun Life Insurance
Description: The Enterprise Data Warehouse (EDW) serves a full range of
managed healthcare, life and disability insurance, and retirement
savings products and services. The system gives full information on the
benefits and plans offered by the company, educates people about
diseases, prevention, ICD codes and the HIPAA act, and lets users enroll
and avail themselves of the benefits offered. This project includes
developing a Data Warehouse from different data feeds and other
operational data sources.
Built a central database where data comes from different sources like
Oracle, SQL Server and flat files. Actively involved as an analyst in
preparing design documents; interacted with the data modelers to
understand the data model and design the ETL logic. Reports were
generated using Business Objects.
Responsibilities:
. Extracted source definitions from various databases like Oracle, DB2
  and SQL Server into the Informatica Power Center repository.
. Worked on dimensional modeling to design and develop star schemas
  using Erwin, identifying fact and dimension tables.
. Developed various transformations like Source Qualifier, Sorter
transformation, Joiner transformation, Update Strategy, Lookup
transformation, Expressions and Sequence Generator for loading the
data into target table.
. Worked with different Operation Data Sources such as Oracle, SQL
Server and Legacy Systems, Excel, Flat files.
. Used Informatica to extract data into Data Warehouse.
. Identified and tracked the slowly changing dimensions, heterogeneous
Sources and determined the hierarchies in dimensions.
. Used Server Manager for session management, database connection
  management and scheduling of jobs run in the batch process.
. Created dimensional-model star schemas using the Kimball methodology.
. Developed a number of complex Informatica mappings, mapplets and
  reusable transformations for different Health Plan systems to
  facilitate daily, monthly and yearly loading of data.
. Fixed invalid mappings; tested stored procedures and functions;
  performed unit and integration testing of Informatica sessions,
  batches and the target data.
. Wrote documentation describing program development, logic, coding,
  testing, changes and corrections.
. Arrived at the dimension model of the OLAP data marts in Erwin.
. Recovered failed sessions.
. Optimized the mappings by changing the logic and reduced running time.
. Finished tasks within the allocated time for every release,
  consistently on time and on target.
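A star schema of the kind described above reduces to surrogate-keyed
dimension tables referenced by a central fact table. The health-plan
names below are an illustrative sketch, not Birla Sun Life's actual
model.

```sql
-- Illustrative dimension: one surrogate-keyed row per member.
CREATE TABLE member_dim (
    member_key   NUMBER        PRIMARY KEY,
    member_id    VARCHAR2(20)  NOT NULL,
    plan_cd      VARCHAR2(10),
    state_cd     VARCHAR2(2)
);

-- Conformed date dimension shared by all facts.
CREATE TABLE date_dim (
    date_key     NUMBER        PRIMARY KEY,
    cal_dt       DATE          NOT NULL
);

-- Central fact table: foreign keys into each dimension plus additive
-- measures, supporting daily, monthly and yearly loads and roll-ups.
CREATE TABLE claim_fact (
    member_key   NUMBER NOT NULL REFERENCES member_dim (member_key),
    date_key     NUMBER NOT NULL REFERENCES date_dim (date_key),
    claim_amt    NUMBER(12,2),
    claim_cnt    NUMBER
);
```

Keeping measures additive in the fact table is what lets Business
Objects aggregate cleanly across any combination of dimensions.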
Environment: Informatica Power Center 7.1.4, Oracle 8i, SQL, PL/SQL
Procedures, SUN OS, Business Objects 5.1, UNIX Shell Scripting.