SATISH KAMATH, Lead ETL Developer
Phone: 404-***-**** (C)
770-***-**** (H)
Email: *************@*****.***
Summary:
Over 19 years of professional experience in the Software Development Life
Cycle (SDLC), Data warehousing and the implementation of various Business
applications, including over 9 years of Informatica ETL Development. Wide
experience in various capacities across industries such as Sales, Hire
Purchase, Manufacturing, Finance, Shipboard Management, Airline, Telecom,
Credit Cards and sensitive data.
. Extensive working knowledge of Data warehouse ETL activities using
  Informatica 8.1/7.1/6.2.
. Working knowledge of Informatica Administration: installation,
  troubleshooting, maintaining user/group privileges and migrating
  objects from Dev to QA and QA to Production using Repository Manager
  export/import.
. Proficient in Oracle 10g/9i/8i/7.x/6.x, OLAP/OLTP, Data Marts,
  Data warehousing and Oracle Database Administration.
. Proficient in Data warehouse ETL activities using SQL, PL/SQL, Pro*C,
  SQL*Loader, C, data structures using C, Unix scripting, Python
  scripting and Perl scripting.
. Proficient with Star schema and Snowflake schema Data warehouse models
  using Ralph Kimball/W. H. Inmon methodologies.
. Working knowledge of Oracle backup and recovery, imports and exports,
  performance tuning, creation of Oracle objects, analyzing objects and
  monitoring and estimating tablespace for Oracle tables and indexes.
. Work authorization: US citizen.
Technical Environment
Hardware:
. Sun SPARC Ultra Enterprise server, HP 9000 Series UNIX server,
  SGI 3200 UNIX server and IBM RISC System/6000 server.
Database:
. Oracle 10g/9i/8i/7.x/6.0/5.0
. MS Access
. Teradata
. SQL Server
Data warehouse and Web Development Tools:
. Informatica PowerCenter 8.1/7.1/6.2
. Python, Java, HTML, Servlets and JSP.
Client/Server Development Tools, Programming Languages and
Design Tools:
. UNIX, Sun Solaris, AIX, Windows Vista/XP/98/95, MS-Windows/MS-DOS
. Developer 2000 (Oracle Forms 4.5 & Oracle Reports 2.5),
  SQL*Plus, PL/SQL, SQL*Forms 3.0, Pro*C, SQL*Loader and Oracle
  database triggers and packages.
. C, Pascal, data structures and algorithms using C.
. Unix shell scripts
. Python scripts
. Linux scripts
. Perl scripts
. SQL Navigator, Microsoft Query, TOAD & Oracle Procedure Builder.
. Visual SourceSafe (VSS), PVCS (Polytron Version Control System)
  and DTS (Defect Tracking System)
. ERwin
. Microsoft Visio
Education
. Post Graduate Diploma in Software Technology (PGDST), National
  Center for Software Technology (NCST), Bombay, India.
. Bachelor of Science (BSc), Kirti College, Bombay University,
  Bombay, India.
. Certified in the Informatica PowerCenter 6.0 course from
  Informatica.
Extracurricular activities
. Member of the Toastmasters Alpharetta club, improving my
  leadership skills.
. Teaching the Hindi language to kids at Balvihar.
Work Experience
Client: CDC, Atlanta, GA
(Feb '09 till Present)
Environment: Informatica PowerCenter 8.5, Unix, Oracle 10g, ERwin
This BioSense Data warehousing project is designed to build a CDC Data
Warehouse that keeps track of epidemics across the nation. The project
implemented both star and OLTP schemas, and database sizes across projects
varied from 7 to 25 terabytes.
The following tasks have been accomplished in this project:
. Working as an ETL Lead/Informatica Admin implementing ETL processes to
  design, develop, test, install, migrate and troubleshoot various
  BioSense processes that collect epidemic details across the nation.
. Managing two other resources on this project.
. Helping other Informatica resources troubleshoot any problems they
  might have.
. Assigning work to other resources and making sure the project is
  delivered on time.
. Installed Informatica 8.5 on the staging/LAB box.
. Troubleshooting errors during installation, production runs and data
  issues.
. Design modifications of the BioSense Data model using ERwin.
. Loading XML messages received from hospitals across the nation into
  the database using JBoss JMS queues and Informatica.
. Added new maps, such as Message_Observation, to the BioSense process.
. Fixing bugs in different mappings to provide more accurate data for
  the next BioSense release.
. Used Informatica's built-in version control system for source code
  management.
. Used source-side, target-side and full pushdown optimization to
  improve session performance.
. Involved in Informatica Admin work such as installation, migrating
  workflows and mappings from Dev to the staging/LAB box and Production
  using Repository Manager, and managing user and group privileges on
  Informatica folders.
. Performance tuned various maps and SQL statements for better
  throughput of data received from hospitals across the nation.
. Developed Korn shell scripts to move and archive source/target files.
. Developed PL/SQL procedures for mass deletes/updates.
. Have some exposure to OBIEE.
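For illustration only, the move/archive step behind the Korn shell scripts
above could be sketched in Python (the function, file pattern and directory
names are hypothetical, not project code):

```python
import shutil
import time
from pathlib import Path

def archive_processed_files(src_dir: str, archive_root: str,
                            pattern: str = "*.dat") -> list:
    """Move processed source/target files into a dated archive folder."""
    stamp = time.strftime("%Y%m%d")
    archive_dir = Path(archive_root) / stamp
    archive_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(Path(src_dir).glob(pattern)):
        dest = archive_dir / f.name
        shutil.move(str(f), str(dest))  # same effect as `mv` in a shell script
        moved.append(dest.name)
    return moved
```

A dated archive directory keeps reruns from overwriting files archived on
earlier days.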
Client: Cox Communications, Alpharetta, GA
(Nov '08 till Dec '08)
Environment: Informatica PowerCenter 8.1, Unix, Oracle 10g, ERwin
This Data warehousing project was designed to build a Cox Data Warehouse
that keeps track of promotions and channel lead report information. The
project implemented both star and OLTP schemas, and database sizes across
projects varied from 5 to 10 terabytes.
The following tasks were accomplished in this project:
. Worked as an ETL Lead implementing the ETL process for the C2O site
  error fix.
. Designed, developed and unit tested Sales Promotions dimension maps.
. Designed a complex star schema architecture for the Channel Lead
  Report from the Channel Lead Report OLTP schema.
Client: ChoicePoint, Alpharetta, GA
(Jun '05 till Oct '08)
Environment: Informatica PowerCenter 8.1/7.1/ Oracle 10g/9i, Unix,
Teradata, ERwin
This Data warehousing project was designed to build a ChoicePoint Data
Warehouse to load source data received from various courts in the nation.
The data received from the courts includes Criminal data (CROSS),
Bankruptcy data (National Data Retrieval) and Foreclosure data (Real
Property). All three projects implemented star schema data models, and
database sizes across projects varied from 500 GB to a couple of terabytes.
The following tasks were accomplished in this project:
. Worked as an ETL Lead implementing the ETL process for CROSS mappings
  for source data received from courts in Brazoria and El Paso (Texas),
  Denver (Colorado), Kentucky and Nebraska using Informatica PowerCenter
  8.1.
. Managed an offshore team of four developers on this project.
. Delegated tasks to the offshore team and made sure they were completed
  on time.
. Performed source data analysis before development, interacting with
  Business Analysts.
. Created Technical Specification and Source-to-Target mapping
  documents.
. Used various map/session performance tuning techniques, including
  parallel pipeline session partitioning.
. Designed and developed the ETL architecture for the Real Property
  database in 3NF.
. Performed historical/incremental loads on the Real Property database.
. Developed reusable transformations, mapplets and worklets to reduce
  development effort wherever possible.
. Used FirstLogic name and address parsers in many maps.
. Developed and unit tested reusable transformations for name and
  address parsing.
. Developed CROSS error handling logic and a generic reprocess map to
  load manually rectified error data for any court.
. Led a team of four other developers and helped them with technical
  issues.
. Developed slowly changing dimension mappings.
. Developed a complex LANDFLIP report for the FBI listing property
  owners who flipped their properties to minimize taxes.
. Developed a Death Master report to inform the Government of deceased
  people in the country and help prevent SSN fraud.
. Developed Unix scripts to automate the load process and restart from
  failure points, Unix workflow scripts to be called from the Control-M
  scheduler, and scripts to move and archive source/target files.
. Developed various transformations in the mappings, including Source
  Qualifier, Expression, Union, Filter, Sorter, Aggregator, Joiner,
  Lookup, Router, Sequence Generator, Transaction Control, Update
  Strategy, Stored Procedure, External Procedure and Normalizer
  transformations.
. Unit tested and peer tested the mappings and workflows and fixed the
  bugs.
. Designed and developed Bankruptcy customer outputs (PADR/Express
  Financial) and was responsible for providing monthly output files
  promptly in the required output formats.
. Encrypted and securely FTPed sensitive public data to both internal
  and external customers.
. Used the $$PushdownConfig mapping parameter to perform source-side,
  target-side or full pushdown optimization at different times,
  depending on the database workload.
. Debugged maps using the Debugger and transformations' verbose data.
. Developed command tasks, email tasks, decision tasks and timer tasks
  in Workflow Manager.
. Created Unit Testing, QA Migration and Production Turnover documents.
. Involved in Informatica Admin work such as migrating workflows and
  mappings from Dev to QA and QA to Production using Repository Manager
  and managing user and group privileges on Informatica folders.
. Followed development standards such as ChoicePoint's object naming
  conventions, shortcuts to sources and targets, source and target
  anchors, and brief documentation for each transformation I developed.
. Involved in production support for nightly loads.
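The restart-from-failure behavior of the load scripts above can be sketched
as a checkpoint loop (the function, step names and checkpoint-file format
are hypothetical, not ChoicePoint code):

```python
import json
from pathlib import Path

def run_with_checkpoint(steps, checkpoint_path):
    """Run named load steps in order, skipping any already recorded as done.

    `steps` is a list of (name, callable) pairs; the checkpoint file keeps
    the names of completed steps so a rerun resumes at the failure point.
    """
    cp = Path(checkpoint_path)
    done = set(json.loads(cp.read_text())) if cp.exists() else set()
    executed = []
    for name, fn in steps:
        if name in done:
            continue  # completed on a previous run; skip on restart
        fn()
        done.add(name)
        cp.write_text(json.dumps(sorted(done)))  # persist after each step
        executed.append(name)
    return executed
```

If a step raises, the checkpoint already records every step that finished,
so the next invocation re-executes only the failed step and those after it.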
Client: CompuCredit, Atlanta, GA
(Oct '03 till Jun '05)
Environment: Informatica PowerCenter/PowerMart 7.1, Oracle 8i/9i, ERwin 3.5
This Data warehousing project was designed to build a CompuCredit Data
Store (CCDS) to load raw data files from different credit card portfolios.
The data is further used to provide a Decision Support System that lets
bankers take appropriate action on their customers' accounts. This was a
Star schema model with the main Accounts fact table surrounded by about 20
history tables. The size of this database was around 200 GB.
The following tasks were accomplished in this project:
. Worked as a Lead ETL Developer implementing the ETL process for the
  CCDS migration (also known as TrueNorth) using Informatica
  PowerCenter/PowerMart 7.1.
. Designed, developed and unit tested type 1 and type 2 mappings to load
  the data into the CCDS star schema.
. Developed and unit tested the main Accounts fact table of this data
  warehouse.
. Developed an Accounts persistent cache mapping to improve performance
  across all other type 2 mappings.
Client: BellSouth, Atlanta, GA
(Nov '01 till Sep '03)
Environment: Informatica PowerCenter 6.0/ Oracle 8i
This Data warehousing project was designed to collect revenue details
across dimensions such as time, product, Customer Service Representative
and nine southeastern states. The size of this database was around 100 GB
across about 25 tables.
The following tasks were accomplished in this project:
. Worked as an ETL Developer implementing the ETL process for a new
  Multi-Channel Data Mart (MCDM) using Informatica PowerCenter 6.0.
. Migrated the existing PL/SQL code and Pro*C programs to new
  Informatica mappings.
. Performance tuned the mappings to cut the job from 6 hours to 2.5
  hours, finishing well ahead of the 7:00 AM deadline.
. Developed PL/SQL stored procedures to integrate the CHARGEBACK logic
  into the daily loads. The chargeback procedure offsets outgoing
  revenue from the reps who took the outgoing orders and charges the
  reps who initially took the incoming orders.
. Developed a PL/SQL stored package, Ranking, which gives credit for an
  order to the correct vendor.
Client: GE Aircraft Engines, Cincinnati, OH.
(March '99 to Oct '01)
Cognos / OLAP / Data Warehousing
Environment: Oracle 8i, Cognos Impromptu, Cognos PowerPlay, Cognos
Transformer
The OLAP project provides various reports to top-level management as a
"Decision Support System", and to individual Managers, covering
transaction volume and revenue details by Shops, Engine Models, Customers
and Contracts. Led an offshore development team of seven people to meet
the deliverables on time. Worked with various plants to get the required
data for MTW from the respective source systems.
The following are the tasks accomplished during this project:
. Enhancements to Data models using ERwin.
. ETL was carried out through SQL, PL/SQL procedures, Pro*C, Unix
  scripting and SQL*Loader for various OSB (Order Shopping Billing)
  shops.
. Developed programs to populate data in Data Marts using PL/SQL.
. Built Impromptu queries and PowerPlay Transformer cubes based on the
  Data Mart data.
. Developed SQL and PL/SQL programs to keep the Master and Lookup tables
  up to date on a daily basis, and automated this process using the
  Appworx scheduler.
. Developed a PL/SQL stored package to log the status of each program in
  the nightly loads, giving full details of records processed, updated,
  inserted and deleted, successful completion and error messages, with
  start and end timings.
. Mapped the data between Source system tables and Staging Area tables,
  and between Staging Area tables and Central Data Warehouse tables.
. Developed SQL*Loader control files to load data from data files into
  the Staging Area (SA).
. Performed database administration activities such as creating
  tablespaces, users, tables, indexes, views and synonyms, managing and
  monitoring tablespace, taking exports and imports, granting and
  revoking user permissions, and troubleshooting.
. Developed PL/SQL programs to massage and scrub the data in the Staging
  Area with the help of Lookup tables and Master tables, and finally
  transferred the data from the Staging Area to the Central Data
  Warehouse.
. Created new catalogs, IQDs, Impromptu Reports, IWR, PowerPlay
  Transformer cubes and PowerPlay Reports.
. Created drill-down reports providing details down to the Part level.
. Created Users, Roles and Profiles, and granted and revoked user
  privileges.
. Involved in application performance tuning using Explain Plan and
  TKPROF, creating indexes and ordering WHERE-clause predicates
  correctly.
. Created daily backup and recovery scripts for hot and cold backups and
  exports/imports, and scheduled the backup scripts.
. Designed and developed Web reports using Java, Java Servlets,
  JavaServer Pages (JSP) and HTML.
. Involved in documenting my jobs at both summary and detail levels.
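The nightly-load status logging described above was built as a PL/SQL
package; a rough Python sketch of the same idea (hypothetical field names,
with a list standing in for the log table) might look like:

```python
import time

class LoadStatusLog:
    """Record per-program load status: row counts, outcome, and timings.

    Each entry mirrors one row the logging package would insert into a
    status table.
    """
    def __init__(self):
        self.entries = []

    def run(self, program, fn):
        entry = {"program": program, "processed": 0, "inserted": 0,
                 "updated": 0, "deleted": 0, "status": "RUNNING",
                 "start": time.time(), "end": None, "error": None}
        self.entries.append(entry)
        try:
            counts = fn()                     # fn returns its row counts
            entry.update(counts)
            entry["status"] = "SUCCESS"
        except Exception as exc:              # record the error, then re-raise
            entry["status"] = "ERROR"
            entry["error"] = str(exc)
            raise
        finally:
            entry["end"] = time.time()        # end time even on failure
        return entry
```

Capturing counts and timings per program makes it easy to spot which step
of a nightly load slowed down or failed.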
Client: Carnival Cruise Lines, Miami, Florida
(April '98 to February '99)
Implemented "Shipboard Management Systems", "IN-CABIN CALLING"
(billing for calls made from ship cabins) and "SEA-VISION" (billing for
movies rented and food and drinks ordered by guests from their cabins).
The following are the tasks accomplished during this project:
. Developed a Pro*C program, "IN-CABIN CALLING", for billing calls
  made from ship cabins.
. Developed a Pro*C program, "SEA-VISION", for billing movies rented
  and food and drinks ordered by guests from their cabins.
Client: Technology Management & Analysis Corporation
(December '97 to March '98)
Implemented the design, development and maintenance of the NLC
(National League of Cities) data conversion.
The following are the tasks accomplished during this project:
. Loaded data for NLC using SQL*Loader and Pro*C programs.
. Created views, indexes, procedures, functions, packages and database
  triggers, and wrote UNIX shell scripts.
Client: Tata Engineering and Locomotive Company (TELCO)
(August '96 to November '97)
Implemented the design and development of the "Sales and Hire Purchase
Accounting System". All development followed the ISO 9001 software
development methodology.
The following are the tasks accomplished during this project:
. Designed and developed Oracle Forms 4.5 and Reports 2.5 (Developer
  2000) modules.
. Developed C++ programs for the object-oriented design and programming
  module.
Client: Tata Engineering and Locomotive Company (TELCO)
(October '90 to July '96)
All development followed the ISO 9001 software development methodology.
. Implemented the design, development and maintenance of the "Shares
  and Debentures Accounting System". Was responsible for the development
  of the warrant processing, buyback of debentures, splits of shares and
  debentures, and subdivision processing modules.
. Designed and developed Oracle Forms 3.0/2.3.