Kalpana Chamarti
**********@*****.***
Summary: Informatica Developer
. 7 years of experience in data warehouse projects involving UNIX,
  Informatica, Cognos, and Oracle PL/SQL programming, covering both design
  and implementation.
. 9 years of experience in Pro*C/C/C++/Unix/Nmake/Sablime at progressive
levels of responsibility in software design, development, testing,
software configuration management.
. Strong experience in design and implementation of ETL using Informatica
PowerCenter.
. Very good understanding and design knowledge of data warehouse modeling,
  including Star and Snowflake schemas.
. Excellent interpersonal skills; a good team player with a drive to take
  on new challenges in a fast-paced environment and a commitment to excel.
. Experienced in all phases of Software Development Life Cycle (SDLC).
. Extensive experience in Oracle PL/SQL programming, SQL*Plus, Database
Triggers, Packages, Cursors, Stored Procedures, Functions and management
of Schema objects. Proficiency in Data Definition and Data Manipulation
languages.
. Experience in using UNIX platform with shell scripting.
. Experience at all levels of data warehousing: design, development,
  testing, production, and maintenance. Experience with complete data
  warehousing applications and client-server business applications.
. Strong knowledge of Data Warehousing, Data modeling, end-to-end business
intelligence solution.
. Created Informatica mappings to load data using transformations such as
  Source Qualifier, Sorter, Aggregator, Expression, Joiner, connected and
  unconnected Lookup, Filter, Sequence Generator, Router, and Update
  Strategy.
. Extensive knowledge of Lookup caches and transformations.
. Expertise in building complex business rules by creating complex
mappings/ Mapplets, shortcuts, reusable transformations.
. Involved in creating folders and code migration.
. Extensive experience in supporting Informatica applications.
. Extensive experience in error handling and problem fixing in Informatica.
. Proficient in using Informatica workflow manager, workflow monitor.
. Worked with Performance tuning and identifying the bottlenecks.
. Worked with different sources such as SQL Server, XML, and flat files.
. Clear and thorough understanding of business process and workflow.
. Excellent quantitative and analytical skills.
. Good Team Player committed to customer satisfaction and quality.
Key Skills:
. Involved in all aspects of the BI process - gathering requirements from
  the end users, designing standard contracts between interfaces, designing
  the technical blueprints for various data mappings, devising data load
  strategies that incorporate data cleansing wherever possible, developing
  and implementing the actual solutions, and performing unit testing,
  integration testing, regression testing (as needed), and UAT in
  development and QA environments.
. As a liaison between the business and the IT team, was involved in
  identifying core requirements and applied business domain knowledge to
  the design, testing, and implementation of ETL processes and final
  reports.
. Experienced in working with the DBA team in designing data marts and
enhancing existing Data repositories
. Experienced in using all the PowerCenter transformations in the designer,
and used the debugger extensively for troubleshooting mappings
. Designed sessions, event wait/raise, assignment, e-mail, and command
tasks in the Workflow Manager
. Involved in performance tuning of Informatica sessions by identifying
  bottlenecks at various places, applying proper techniques (reviewing the
  data/index cache sizes and buffer sizes, optimizing source/lookup
  queries), and comparing results against production throughput.
. Extensively used SQL*Loader, Stored Procedures, Functions, Packages,
  Triggers, Joins, Cursors, aliases, Sequences, etc., while working with
  Oracle.
. Experienced in coding complex database queries to produce ad hoc reports
  for business users.
. Strong experience writing UNIX/LINUX shell scripts to handle data and
  scheduling anomalies.
. Debugged various production issues and provided production support.
. Experienced in preparing appropriate documentation and providing
  knowledge base support documents.
Technical Skills:
ETL Tools: Informatica 9.1 and prior
OLAP tools: Cognos (Impromptu/ Power play/ Report Net), OBIEE
RDBMS: Oracle 11g & prior, DB2 UDB, SQL Server 2000/2005
Languages: SQL, PL/SQL, Shell Programming, C, Pro*C, C++, Java
OS: HP-UX, LINUX, Sun Solaris, Windows servers
Tools: Erwin, MS-Word, MS-Excel, MS-PowerPoint, Nmake,
Sablime
Education:
M S Computer Science, NJ
B. Tech Electronics, India
Visa Status: U.S. Citizen
Work Experience:
NYC Dept. of Finance, New York City                      March 14 - Present
ETL Developer
New York City's Department of Finance (DOF) transformative initiative,
known as the Citywide Payments & Receivables Repository (CPRR) Program,
makes the entire payment process easier for any individual or entity
making a payment to the City and optimizes the City's management of the
full life cycle of accounts receivable, from origination through
settlement. This project requires the integration of the agency system
with the CPRR so that the agency can send receivables to the CPRR and
receive payments through the CPRR against those receivables, as well as
for retail transactions. Finally, CPRR provides an interface to the City's
Financial Management System (FMS) in order to report the revenue collected
by CPRR on behalf of the agency.
My Role
. Member of an ETL team building a new data mart and loading it for
  reporting.
. Built complex Informatica processes to generate CSV output files,
  designed UNIX scripts, and FTP'd the files to the destination servers.
. Designed many other data marts.
Responsibilities:
. Extracted data from Oracle, XML and flat files.
. Understood the organization's business requirements.
. Worked with business analyst to gather the business requirements and
involved in weekly team meetings.
. Worked with Logical data model /Physical Data model (Star Schema)
. Developed complex mappings using Informatica PowerCenter Designer to
  transform data from source systems such as Oracle, XML, and flat files
  and load it into the Oracle target database.
. Worked on Slowly Changing dimensions.
. Good experience in performance tuning.
. Extensively used various types of transformations such as Expressions,
Aggregator, Joiner, Filter, Update strategy, lookup (connected and
unconnected) to load the data.
Environment: Informatica Power Center 9.1, Oracle 11g, PLSQL, Erwin 3.5,
UNIX, LINUX, Cognos 10, Project Place.
AT&T, NJ July 13 - Dec 13
ETL Developer
System XR is a business intelligence solution designed as a reporting
solution for System X clients. It takes data from various sources such as
orders, call volume, and switch data and loads data marts. System XR also
provides reporting for wireline, U-verse, and DirecTV orders.
My Role
. Involved in developing the conceptual, logical and physical data model
and used star schema in designing the data mart
. Translated the Business rules into ETL Processes that load the repository
for Metadata management.
. Performed thorough analysis of data and worked on enhancements and
troubleshooting processes.
. Provided production support of the nightly batch.
. Designed processes to load data for System XR.
Responsibilities:
. Installed and configured Informatica Power Center and Informatica Client
tools.
. Worked on performance tuning on both Informatica and Oracle.
. Monitored the workflows, reported errors, analyzed the session logs to
determine the slowing factors for the processes.
. Worked on migration from Informatica 8.1.1 to version 9.5.
. Worked on shell scripting to aid with the Informatica workflows.
. Extracted the data from Oracle, Flat files into data warehouse.
. Worked with different transformations such as Lookup and Filter, using
  the Source Analyzer.
. Created sessions and batches and scheduled them.
. Created users, folders, and user groups.
. Created various mappings using designer that include source qualifier,
Expression, Aggregator, and Lookup Transformations.
Environment: Informatica Power Center 9.5, MicroStrategy, Oracle 11g, Toad,
UNIX, PL/SQL, SQL*Plus, SQL*Loader, Erwin 3.5
BMW North America, NJ Oct 11 - July 13
BI Resource
Project: KPI - Key Performance Indicators
The BMW dealerships are rated on their performance across various
measures. The idea of this project is to show their growth over the last
12-13 months YTD and project their best approaches for better business.
There is a constant need to update the calculations for these reports to
keep up with the ever-changing market. These monthly/yearly reports were
generated using Informatica.
My Role
. Part of an ETL team building a new Leads data mart and loading it to
  extract KPI reports.
. Built complex Informatica processes to generate CSV output files,
  designed JCLs/UNIX scripts to handle job scheduling, and FTP'd the files
  to the destination servers for 4 KPIs.
. Designed many other data marts and KPIs.
Responsibilities:
. Extracted data from DB2, SQL Server, XML, and flat files.
. Understood the organization's business requirements.
. Worked with business analyst to gather the business requirements and
involved in weekly team meetings.
. Worked with Logical data model /Physical Data model (Star Schema)
. Developed complex mappings using Informatica PowerCenter Designer to
  transform data from various source systems such as flat files, DB2, and
  SQL Server and load it into the Oracle target database.
. Worked on Slowly Changing dimensions.
. Good experience in performance tuning.
. Extensively used various types of transformations such as Expressions,
Aggregator, Joiner, Filter, Update strategy, lookup (connected and
unconnected) to load the data.
Environment: Informatica Power Center 8.6, Oracle 10g, DB2, SQL Server
2000/2005, SQL, PLSQL, Erwin 3.5, UNIX, LINUX, Cognos 10, Project Place,
MS VISIO
Dow Jones, NJ                                            Jan 10 - Aug 11
ETL developer
Project: CIDW - Customer Intelligence Data Warehouse
The CIDW warehouse is the solution for analysis and integration of the
Print Edition and Online Wall Street Journal (WSJ.COM) customer data. It
contains information related to subscribers, non-subscribers, and
prospects.
CIDW extracts data from various source databases. The project is concerned
with extraction of such data from various sources and loading them in a
consistent and integrated database using Informatica as the ETL tool.
Marketing and Advertising departments use the data in CIDW for campaigns,
behavioral targeting and other marketing programs such as consumer
retention and market events. CIDW also serves as the source from which
several systems extract valuable data in order to perform various analysis
functions and reporting related to demographics and help target marketing
and promotion campaigns (e.g., direct mailings, e-mail).
My Role
. Involved in developing the conceptual, logical and physical data model
and used star schema in designing the data mart
. Translated the Business rules into ETL Processes that load the repository
for Metadata management.
. Performed thorough analysis of data and worked on enhancements and
troubleshooting processes
. Provided production support of the nightly batch.
. Designed processes to load data for MYWSJ.COM, a personalized consumer
  application for WSJ.COM.
Responsibilities:
. Installed and configured Informatica Power Center and Informatica Client
tools.
. Extracted the data from SQL Server and flat files into the data
  warehouse.
. Worked with different transformations such as Lookup and Filter, using
  the Source Analyzer.
. Created sessions and batches and scheduled them.
. Created users, folders, and user groups.
. Created various mappings using designer that include source qualifier,
Expression, Aggregator, and Lookup Transformations.
Environment: Informatica Power Center 8.6, Cognos (Impromptu/ PowerPlay/
ReportNet), Oracle 8i/9i, UNIX, PL/SQL, SQL*Plus, SQL*Loader, Erwin 3.5
Alticor, MI                                              Jun 08 - Dec 09
Informatica Developer
Alticor is a global company offering products and manufacturing and
logistics services in more than 80 countries and territories worldwide.
Through its subsidiaries and affiliates, Alticor owns or manages
manufacturing and distribution facilities throughout the world. The
application we worked on was AMD; AMD is the second largest chip producer
in the world. We developed the data mart to process and store all
employees' information in a single database, performing the necessary
transformations of the data into the Oracle data mart.
Responsibilities:
. Collaborated with business analysts and DBA for requirement gathering,
business analysis and designing of data marts.
. Worked on dimensional modeling to design and develop star schema,
identifying fact and dimension tables for providing a unified view to
ensure consistent decision making.
. Worked with heterogeneous sources from various channels such as flat
  files and SQL Server.
. Worked on Workflow and Session Level Grid.
. Configured and Installed Informatica Power Center
. Worked with Informatica tools: Source Analyzer, Warehouse Designer,
  Mapping Designer, Workflow Manager, Workflow Monitor, and Repository
  Manager.
. Extensively used transformations like Aggregator, Expression, Router,
Filter, Lookup, Sequence Generator, and Update Strategy.
. Analyzed session log files when a session failed to resolve errors in
  the mapping or session configuration.
. Performed Unit testing and verified the data.
. Modified existing reports, fixing issues related to performance and data
  quality during production support.
. Worked on Performance Testing, Unit Testing.
Environment: Informatica 8.6, OBIEE 10g, Flat files, XML, Oracle 9i/10g,
Windows NT.
Best Buy, Minneapolis                                    May 06 - May 08
Informatica Developer
Best Buy Co., Inc. is North America's number one specialty retailer of
consumer electronics, personal computers, entertainment software and
appliances. Successfully developed and maintained ETL maps to extract,
transform and load data from various data sources into the enterprise data
warehouse called PMS (Product Management System). The data warehouse
contained sales data, purchase data, valued customer information, and
employee information. This helps in making decisions on new product
improvements, analyzing existing products, and improving customer service.
Responsibilities:
. Installed and configured Informatica Power Center and Informatica Client
tools.
. Extracted the data from SQL Server, DB2, and flat files into the data
  warehouse.
. Worked with different transformations such as Lookup and Filter, using
  the Source Analyzer.
. Created sessions and batches and scheduled them.
. Created various mappings using designer that include source qualifier,
Expression, Aggregator, and Lookup Transformations.
Environment: Informatica Power Center, SQL Server, Oracle, Windows NT,
Flat files.
LMS System, VA Apr 05 - Apr 06
PL/SQL Application Developer
Project: Learning Management System
LMS, the Learning Management System, provides a base for students to
connect to various courses from the course providers. The system is being
developed for the National Education Foundation, one of the leading
providers of distance learning in the US. The current LMS system, based on
C++/Windows, is being replaced with a .NET framework, an Oracle database,
and a multi-tier architecture. Students log in to the system and can take
over 2,000 courses online. The system is currently scaled to handle 5,000
simultaneous users. The administrators can assemble courseware, add new
courses, remove courses, add students, allocate courses to students, etc.
The system also provides grade-generating tools and reports. Used SQL to
define and manipulate database tables.
. Wrote database PL/SQL triggers and procedures to provide backend security
and data consistency.
. Gathered requirements on the project.
. Performed technical and functional reviews.
. Designed and developed database tables, procedures, functions, packages,
triggers to meet business requirements.
. Developed PL/SQL scripts to change server data across all components in
  the system.
. Supported installation and configuration of Oracle 10g/9i.
. Performed code reviews on the procedures and triggers.
. Developed stored procedures and complex packages.
Environment: Oracle 9i, UNIX, PL/SQL, SQL*Plus, SQL*Loader, Erwin 3.5
AT&T Bell Labs, AT&T, Somerset, NJ                       Jun 93 - Aug 02
Member of Technical Staff - 1
Involved in the development and maintenance of the Global Transaction
Network Support System (GSS). Used Pro*C/C++/Shell for application
development. GSS supports AT&T's worldwide 800 Classic and Prime services.
It is a client/server system that provides provisioning and maintenance
capabilities for the advanced 800 service work centers, downloads
customers' routing plans into the network, and gives external customers
update capabilities to make real-time changes to their 800 services and
routing plans. Performed configuration management and load building using
Nmake.
Responsibilities:
. Followed application development life cycle procedure in Pro*C to
complete various feature requirements including designing, documenting,
developing, reviewing, testing and installing the changes.
. Created common library functions using C and Embedded SQL to be used
across different applications.
. Developed and maintained Scheduler tool that generates reports for the
GSS periodically. The report generator tool generated various reports
every night through Cron jobs that produced a specific formatted report
for a particular account based on database validations and feature
requirements of the system.
. Participated in the design and implementation of customer user interfaces
for AT&T Work center to monitor the transaction data of the systems on
SUN workstation and PC.
. Analyzed and worked on physical design, code implementation and test
cases to provide the management team with the performance improvement of
the GSS system.
. Wrote design documents and library functions for the application.
. Used SQL*Plus and PL/SQL of Oracle to retrieve and manipulate the system
data on the database.
. Scheduled and conducted design and code reviews.
. Provided technical support to integration and system test teams.
. Implemented functions for GSS to construct, maintain, and load all
processing logic on the AT&T Network for real-time routing, retrieval and
billing.
. Was in charge of tracking version changes for the whole project using
  Sablime and building loads for Integration Testing, System Testing, and
  Production.
Environment: HP-UX 10.20, Oracle 7.3.4, Pro*C, C, C++, Shell Programming,
PL/SQL