Mounika Bandam
**** ********** ***, ***** ******, NJ 07047, +1-732-***-****, ***************@*****.***
PROFESSIONAL SUMMARY
•Strong work experience with large-scale Data Warehouse implementations using Informatica PowerCenter 9.x and Oracle on Windows platforms.
•In-depth knowledge and hands-on experience in building Data Warehouses, Data Marts, data integration and ETL processes.
•Extensive experience in extracting, transforming and loading (ETL) data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter tools (Repository Manager, Designer, Workflow Manager and Workflow Monitor).
•Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions, Worklets and Workflows using Informatica PowerCenter.
•Experience in integration of various data sources like Oracle EBS, PRO2, Flat Files and XML files into Data Warehouse.
•Experience in Dimensional Data Modeling, Slowly Changing Dimensions Type I, II & III, Star/Snowflake Schema, Fact & Dimension tables, OLTP & OLAP Systems.
•Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, Cursors, Triggers, Views and Indexes in a distributed environment.
•Expertise with different data load strategies and scenarios such as historical dimensions, surrogate keys, summary facts and incremental loads.
•Experience in performance tuning of Informatica mappings and sessions to improve performance on large-volume projects.
•Expert in business process engineering and SDLC/Agile methodologies, including analysis, design, development, testing and implementation of software applications.
•Expert in scheduling the tasks (workflows) through Oracle DAC (Data Warehouse Administration Console)
•Experience in resolving ongoing maintenance issues and bug fixes.
•Excellent analytical and problem-solving skills with a strong technical background and interpersonal skills.
EDUCATION
Stevens Institute of Technology, Hoboken, NJ
May 2018
Master of Science in Computer Science, GPA: 3.93/4
Courses: Data Mining & Knowledge Discovery, Database Management Systems, Web Programming, Natural Language Processing
Stevens Institute of Technology, Hoboken, NJ
December 2017
Graduate Certificate in Business Intelligence & Analytics, GPA: 4/4
Courses: Multivariate Data Analytics, Social Network Analysis, Data Warehouse/Business Intelligence
SKILLS
Programming Languages: Oracle PL/SQL, Python, R, C, C++, Java, JavaScript, HTML, XML
AWS Services: EC2, S3, RDS, Lambda, SQS
Data Science Packages: Pandas, NumPy, Scikit-learn, Matplotlib
Database Systems: Oracle 11.x, Oracle 12c, MySQL, MongoDB
Tools: Informatica PowerCenter 9.x, Splunk, RStudio, Git, Eclipse, OBIEE, DAC
Office Tools: MS Word, MS Excel, MS PowerPoint, MS Outlook
Operating Systems: Windows, macOS, Linux (basic)
EXPERIENCE
Stevens Institute of Technology, Hoboken, NJ
January 2018 – May 2018
Teaching Assistant, Data Mining & Knowledge Discovery
•Taught course material, clarified students' doubts and guided them on their projects
•Conducted exams and graded papers, quizzes and projects
•Interacted with 60+ students each week, demonstrating procedures and fielding questions
TEKsystems Global Services, Hyderabad, India June 2014 – May 2016
Software Engineer (ETL Developer), Client: Schlumberger, Houston, TX
•Achieved 90% efficiency by automating the manual monthly KPI process
•Designed the data warehouse using star and snowflake schema methodologies for the quality and procurement module
•Worked on Informatica object migration using Repository Manager
•Performed metadata validation, reconciliation and appropriate error handling in ETL process
•Built mappings using various transformations such as Filter, Aggregator, Connected/Unconnected Lookup, Expression, Joiner, Router, Update Strategy and Stored Procedure
•Improved load performance by tuning SQL queries and ETL mappings using parameter files and variables
•Created extensive documentation on the implementation design and daily loads
•Handled Informatica & DAC Administration tasks
•Monitored and maintained production loads and documented problems and solutions for running the workflows
•Worked extensively with the Informatica tools Repository Manager, Designer, Workflow Manager and Workflow Monitor
•Worked on various Oracle EBS modules like Procurement, Sales, Finance, MRP, Manufacturing
•Conducted training sessions for interns on Informatica 9.5.1 and DAC
•Wrote views, materialized views and stored procedures
•Worked hands-on with flat files and source systems such as Oracle EBS and PRO2
•Created data validation scripts to verify the data after each load
Environment: Informatica PowerCenter 9.x, Oracle 11g and 12c, Flat files, PRO2, DAC
TEKsystems Global Services, Hyderabad, India February 2014 – March 2014
Intern, ETL Developer
•Analyzed the Source System and involved in designing the ETL
•Developed ETL mappings to store data in a historical manner (SCD Type II) using data modelling
•Created a detailed report on maintaining the naming standards in Informatica
•Created a Business Area in OBIEE RPD to develop the reports
Environment: Informatica PowerCenter 9.x, Oracle 10g, OBIEE
ACADEMIC PROJECTS
Stevens Institute of Technology, Hoboken, NJ
Watch My Nutrition Spring 2018
•Developed a web-based app for users to track their daily nutrition intake
•Provided weekly, monthly and quarterly statistics on intake over time
Risk Prediction Model for Prudential Life Insurance Fall 2017
•Obtained Prudential Life Insurance data from Kaggle.com
•Handled large datasets for clustering and multi class prediction
•Trained Random Forest, Neural Networks, Logistic Regression and KNN Classifiers to accurately classify insurance risk in R
GEDCOM File Parsing Project Summer 2017
•Developed a Python program to discover errors and anomalies in GEDCOM genealogy files
•Applied Extreme Programming (XP) and Scrum practices while working on the project
Predicting the Future of a Bank Spring 2017
•Used algorithms such as KNN, Naive Bayes and Random Forest to analyze the data and understand the bank's customer churn rate
•Implemented logistic regression along with techniques such as binning and backward variable elimination to improve model accuracy in Python
Death in the US Spring 2017
•Collected the Death in the US dataset from Kaggle and cleaned the data using various techniques
•Developed a network between US states based on common manners of death
CERTIFICATES
•edX Python for Data Science, edX Programming with R for Data Science, Splunk Certified User 6.x
ACTIVITIES
•Volunteered at the Women’s LeadHERship Conference at Stevens Institute of Technology in October 2017
•Volunteered for an NGO activity with TEKsystems Global Services in February 2015
•Organized music and fun activities at TEK Fest 2015
•Organized technical and cultural activities during Bachelor’s degree
•Worked as CR (Class Representative) for three years during Bachelor’s degree
Available July 2018