Resume
Susmita Banerjee
E-mail: nupako@r.postjobfree.com    Contact No: 412-***-****
Summary
1. About 6 years of experience in the design and implementation of data warehousing solutions.
2. Expertise and hands-on experience with IBM DataStage and BODI.
3. Involved in the Extraction, Transformation, and Loading (ETL) of data from various sources into data warehouses and data marts.
4. Created and designed data models by applying data modeling techniques per business requirements, using Erwin.
5. Knowledge of and hands-on experience with data warehouse internal architecture.
6. Exposure to the Business Objects reporting tool.
7. Coordinated and managed projects to successful completion.
8. Good strengths in analysis, design, and documentation.
9. Managed a 4-to-5-member team as module lead.
10. Good team player with strong interpersonal, decision-making, and organizational skills; resourceful and committed to quality.
Work Experience
1. Worked as Senior Software Engineer at Target Technologies Pvt Ltd from April 2008 to April 2011.
2. Served as onsite coordinator for 3 months at Target Technologies (Minneapolis).
3. Worked as Developer at Aditi Technologies Pvt Ltd, Bangalore, from July 2007 to April 2008.
4. Worked as ETL Developer at Inest Infotech, Bangalore, from June 2005 to July 2007.
Education
1. Bachelor of Engineering (B.E.) in Computer Science and Engineering, MSRIT, Bangalore, affiliated to Visveswaraiah Technological University, Belgaum, Karnataka, India, 2005.
Technical Proficiency
Data warehousing Tools & Packages
ETL : IBM DataStage 7.5.1, Data Integrator Designer
OLAP : Business Objects
Methodologies : Star Schema, Snowflake Schema.
Data Modeling Tool : Erwin 7.1
Basic Skills:
Operating System : IBM AIX, Windows NT/XP/2000
Programming Languages : SQL/PLSQL
RDBMS : DB2, Oracle 10g
Career Profile:
For Target Corporation Pvt Ltd (April 2008 – April 2011)
Target Corporation is an American retailing company, founded in Minneapolis, and known as the second-largest discount retailer in the United States. Target identified the need for an Enterprise DW strategy, with a business intelligence and reporting architecture and a future roadmap, as part of its enterprise architecture implementation to support the information needs appropriate for the goals and anticipated growth of the company.
Project 3:
Title : Presentation Minimum
Project Type : Development and Implementation
Technologies used : Unix, DB2, IBM Data Stage 7.5, Erwin 7.1
Role : Senior Software Developer acting as Module lead
Description
Presentation Minimum data does not currently reside in the Target Enterprise Data Warehouse (EDW). The entire scope of this project is directed at bringing Presentation Minimum data from the Presentation Minimum system into the EDW. Presentation Minimum data is used in calculating the Weekly Instocks %, along with On-Hand Inventory and Tracked/Not Tracked Item data. The focus of this project is the Presentation Minimum data needed to calculate the Weekly Instocks %, as well as additional data identified by the business to provide advanced analytics and drilldown capabilities. The primary goal is not simply to give the merchandising business the capability to calculate the Weekly Instocks % out of EBI, but to empower Merchandising with the analytical capabilities needed to view Presentation Minimum data and the Instocks % through a new lens.
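The Weekly Instocks % described above can be sketched as a simple calculation. This is an illustrative example only, assuming a common definition (the share of tracked items whose on-hand inventory meets or exceeds the presentation minimum); the field names and the rule itself are assumptions, not Target's actual logic:

```python
# Hedged sketch of a Weekly Instocks % calculation.
# Assumption: an item is "in stock" when its on-hand quantity meets or
# exceeds its presentation minimum; only tracked items are counted.
# All field names are illustrative, not the actual warehouse schema.

def weekly_instock_pct(items):
    """items: list of dicts with keys 'on_hand', 'pres_min', 'tracked'."""
    tracked = [i for i in items if i["tracked"]]
    if not tracked:
        return 0.0
    in_stock = sum(1 for i in tracked if i["on_hand"] >= i["pres_min"])
    return 100.0 * in_stock / len(tracked)

items = [
    {"on_hand": 12, "pres_min": 10, "tracked": True},   # in stock
    {"on_hand": 3,  "pres_min": 10, "tracked": True},   # out of stock
    {"on_hand": 0,  "pres_min": 5,  "tracked": False},  # not tracked
]
print(weekly_instock_pct(items))  # → 50.0
```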
Role & Responsibilities:
1. Worked with the business team to gather and analyze requirements
2. Provided the design approach for the requirements
3. Developed all logical and physical data models and defined all metadata
4. Modified the data model in Erwin according to changing requirements
5. Coordinated with the onsite and offshore teams in transitioning the design
6. Reviewed the high-level and low-level designs with the architect and DBA teams
7. Coordinated all systems integration, quality assurance, and user acceptance testing to verify that the solution met all business and technical requirements
8. Interacted directly with business users and handled ad hoc issues on the implemented modules
9. Maintained the design standards and checklists of the project
Project 2:
Title : Sales and Inventory Relief Foundation and Mart
Project Type : Development and Implementation
Technologies used : Unix, DB2, IBM Data Stage 7.5, Erwin 7.1
Role : Senior Software Developer acting as Module lead
Description
The Sales and Inventory Relief Foundation and Mart project is part of the Merchant Analytics program, which provides the capability to capture Target stores' sales data and inventory details related to transactions and registers, built within the EBI framework as part of the enterprise requirements.
The project will deliver a foundational data warehouse within EBI that can be used not only by the Merchant Analytics program but also by other initiatives and groups interested in Sales and Inventory Relief data. The scope of this project is to capture all valid transactions that occur at store registers. As part of the Merchant Analytics program, the first two initiatives are building the Food Merchant Analytics Mart and the Unsaleables Mart. Food Merchant Analytics (FMA) is a project to build out foundation layers and the mart for key subject areas, enabling the merchants with analytics capabilities and helping them understand the movement of grocery products throughout the supply chain. FMA is aimed at removing dependencies on grocery wholesalers and increasing profitability by allowing Target to work directly with vendors. The Unsaleables Mart will provide reporting on marked-out-of-stock information at stores and distribution centers, as well as valuable visibility into guest returns, defectives, rewraps, salvages, and vendor chargebacks, all of which contribute negatively to profitability. A significant number of metrics in the FMA/UNS mart will come from the Sales and Inventory Relief Foundation.
Role & Responsibilities:
1. Gathered and analyzed requirements
2. Provided the design approach
3. Developed all logical and physical data models and defined all metadata
4. Modified the data model in Erwin according to changing requirements
5. Created the jobs, scripts, and schedule information
6. Tested the extract, transfer, and load of data from the source systems into the staging area
7. Wrote unit test cases according to requirements and performed unit testing
8. Implemented the changes in the stage and production environments
9. Coordinated all systems integration, quality assurance, and user acceptance testing to verify that the solution met all business and technical requirements
10. Interacted directly with business users and handled ad hoc issues on the implemented modules
11. Maintained the design standards and checklists of the project
Project 1:
Title : ADW DQ/SCR
Project Type : Enhancement and Implementation
Technologies used : UNIX, DB2, IBM Data Stage 7.5, Erwin 5
Role : Software Developer
Description:
ADW: Analytical Data Warehouse storing GRM related information.
GRM: Guest (Customer) Relationship Management
ADW is a very large (65-terabyte) data warehouse that includes subject areas such as Store Sales, Web Sales, Guest Data Migration (GDM), Account Data Migration (ADM), Operational Data Store (ODS), Scoring Process, Account Guest Item (AGI), Campaigning, and Reference. The data is used for market analysis, one-to-one marketing, determining trips to stores, large baskets, cross-selling, and fraud detection.
DQ-SCR stands for Data Quality – System Change Request. DQ-SCR projects are groups of minor fixes or enhancements to the existing ADW. SCRs are created by users after identifying gaps between the existing system and business needs. Each SCR goes through a full SDLC life cycle. Development effort for an SCR is between 160 and 2,400 hours, and a DQ or SCR takes a minimum of 2 months to release.
Role & Responsibilities:
1. Gathered and analyzed requirements
2. Provided the design approach to resolve the issue
3. Created and modified the jobs, scripts, database-related changes, and schedule information
4. Wrote unit test cases according to requirements and performed unit testing
5. Implemented the changes in the stage and production environments
6. Handled transition to the support team
7. Maintained the design standards and checklists of the project
For Aditi Technologies Pvt Ltd (July 2007 – April 2008)
Aditi is a leading-edge solutions development partner for software product businesses and financial institutions, providing technology consulting and outsourced product development on emerging technologies.
Project 2:
Title : Drug Fair Data Mart
Client : Drug Fair
Technologies used : Data Integrator Designer, DB2, BO
Role : Developer
Description:
The Drug Fair Data Mart contains a large number of jobs and a huge amount of data. The daily and weekly activities have already been automated. An important part of the project is performance tuning and achieving results quickly. The main tasks in this project are monitoring of jobs, regular sanity testing of data or other output as mutually agreed with Drug Fair, implementation of change requests, troubleshooting in case of any job failure, designing ad hoc reports, fixing bugs, and maintenance of the intranet application, all to improve the performance of the data mart.
Role & Responsibilities:
1. Copied feed files from the staging server to the production server working directory and compressed them
2. Extracted the feed files from the source into the staging area
3. Applied transformations and validation to dimension data, then loaded the dimensions
4. Applied transformations and validation to fact data, then loaded the facts
5. Ran the Store Data Warehouse load (scheduled through the Data Integrator Job Server)
6. Scheduled the jobs and the backup process
7. Tested the jobs
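The steps above follow the standard staged pattern: extract to staging, then transform and validate dimensions before facts so that fact rows can be checked against loaded dimension keys. A minimal Python sketch of that pattern follows; the real work was done in Data Integrator, and all names and formats here are hypothetical illustrations:

```python
# Minimal sketch of a staged dimension-then-fact load.
# Assumption: feed rows are simple "key,value" CSV lines; the real
# Data Integrator jobs handled far richer formats and validations.

def extract(feed_rows):
    """Stage raw feed rows: strip whitespace and skip blank lines."""
    return [r.strip() for r in feed_rows if r.strip()]

def transform_dimension(staged):
    """Parse 'id,name' rows into a dimension lookup, validating keys."""
    dim = {}
    for row in staged:
        key, name = row.split(",", 1)
        if key.isdigit():                 # validation: numeric surrogate key
            dim[int(key)] = name
    return dim

def transform_fact(staged, dim):
    """Keep only fact rows whose dimension key exists (referential check)."""
    facts = []
    for row in staged:
        key, qty = row.split(",", 1)
        if int(key) in dim:
            facts.append({"dim_id": int(key), "qty": int(qty)})
    return facts

# Dimensions load first; facts are then validated against them.
dim = transform_dimension(extract(["1,Store A", "2,Store B", "x,bad"]))
facts = transform_fact(extract(["1,40", "3,99", "2,15"]), dim)
print(len(dim), len(facts))  # → 2 2
```

Loading dimensions before facts is what makes the referential check in step 4 possible.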
Project 1:
Title : Pharmacy Data Mart
Client : Duane Reade
Technologies used : Data Integrator Designer, DB2,BO
Role : Developer
Description:
The Duane Reade Data Mart contains a large number of jobs and a huge amount of data. An important part of the project is performance tuning and achieving results quickly. The main tasks in this project are monitoring of jobs, troubleshooting in case of any job failure, regular sanity testing of data or other output as mutually agreed with Duane Reade, implementation of change requests, designing ad hoc reports, fixing bugs, and maintenance of the intranet application, all to improve the performance of the data mart.
Role & Responsibilities:
1. Copied feed files from the staging server to the production server working directory and compressed them
2. Extracted the feed files from the source into the staging area
3. Applied transformations and validation to dimension data, then loaded the dimensions
4. Applied transformations and validation to fact data, then loaded the facts
5. Ran the Store Data Warehouse load (scheduled through the Data Integrator Job Server)
6. Scheduled the jobs and the backup process
7. Tested the jobs
For i-Nest Infotech (June 2005 – July 2007)
Pathmark - Enterprise Data Warehouse
Pathmark has identified the need for an Enterprise DW strategy, with a business intelligence and reporting architecture and a future roadmap, as part of its enterprise architecture implementation to support the information needs appropriate for the goals and anticipated growth of the company. The DW architecture is intended to establish a robust, comprehensive, enterprise-reporting environment (a high-level infrastructure need) that will effectively leverage the investment made in operational systems.
Project 2:
Title : General Sales Data mart
Client : Pathmark Stores, Inc
Technologies used : Informatica Power Center 7.x, Oracle 9.x, Win XP
Role : ETL Developer
Description:
This data mart gives Pathmark insight into sales measures such as dollar sales, margin, and budget for different product categories over time. It provides an overall view of the sales, products, employees, and geographical locations of the stores. In this data mart, customer transaction details arrive daily, employee details weekly, and location information monthly. The main purpose of the data mart is to monitor the sales of the different stores in different countries so that appropriate measures can be taken to increase store productivity.
Role & Responsibilities:
1. Involved in ETL programming
2. Used Oracle 9i as the target database; the source was a combination of flat files
3. Created data mappings to extract data from different source files, transform the data using filters, expressions, and lookups, and then load it into the Oracle data warehouse
4. Validated the mappings and generated and loaded the data
5. Extracted the data from the operational data source, transformed it using various transformations, and loaded it into the Oracle database
6. Created, ran, and scheduled workflows using the Workflow Manager, and documented the existing mappings
7. Performed unit testing for the mappings
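The filter, expression, and lookup transformations mentioned above can be sketched as a simple mapping chain. This is an illustrative Python analogue of an Informatica mapping, not the actual project code; all field names and the lookup source are hypothetical:

```python
# Illustrative analogue of a filter -> expression -> lookup mapping chain.
# Assumption: rows are dicts; the lookup table resolves a product code
# to a category. None of these names come from the actual project.

product_lookup = {"P1": "Grocery", "P2": "Pharmacy"}   # lookup source

def run_mapping(source_rows):
    out = []
    for row in source_rows:
        if row["sales"] <= 0:                              # Filter: drop zero-sales rows
            continue
        row = dict(row)
        row["margin"] = row["sales"] - row["cost"]         # Expression: derive margin
        row["category"] = product_lookup.get(               # Lookup: resolve category,
            row["product"], "Unknown")                      # defaulting when unmatched
        out.append(row)
    return out

rows = [
    {"product": "P1", "sales": 100, "cost": 60},
    {"product": "P9", "sales": 50,  "cost": 20},   # no lookup match
    {"product": "P2", "sales": 0,   "cost": 0},    # filtered out
]
result = run_mapping(rows)
print([r["category"] for r in result])  # → ['Grocery', 'Unknown']
```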
Project 1:
Title : Stores Operations Data mart
Client : Pathmark Stores, Inc.
Technologies Used : Informatica Power Center 6.x, BO, Oracle9i, Win XP
Role : ETL Developer
Description:
The store operations data mart allows Pathmark to reduce the SG&A costs it incurs in operating its retail stores. Minimizing the number of manual processes and increasing the productivity of store employees allows retailers to scale effectively and decreases SG&A as a percentage of sales. Store management efficiencies can indirectly improve revenue, as store employees are able to spend more time assisting consumers.
This data mart consolidates, standardizes, and provides an enterprise-wide view of whole-store information, so that multi-dimensional, consolidated, and drilled-down information can be generated from a single schema to support decision making at the store level.
Role & Responsibilities:
1. Involved in ETL programming
2. Used Oracle 9i as the target database; the source was a combination of flat files
3. Created data mappings to extract data from different source files, transform the data using filters, expressions, and lookups, and then load it into the Oracle data warehouse
4. Validated the mappings and generated and loaded the data
5. Extracted the data from the operational data source, transformed it using various transformations, and loaded it into the Oracle database
6. Created, ran, and scheduled workflows using the Workflow Manager, and documented the existing mappings
7. Performed unit testing for the mappings
Personal Details
Visa Type : L2-EAD
Sex : Female
Current Address : 35 Highland Road
Bethel Park PA 15102