Ramya Balasubramanian
E-mail: *******.*@*****.*** Phone: 302-***-****
Professional Summary
• Six years of experience in software development and in the design of ETL methodology supporting data transformation and processing in corporate-wide ETL solutions using Ab Initio.
• Professional experience in data warehouse design and implementation using Ab Initio and databases such as Teradata and Oracle.
• Working experience in dimensional modeling and in UNIX development using Ab Initio tools in data warehouse environments.
• Skilled in designing high-performance ETL and data warehousing solutions.
• Expertise in designing and developing graphs with varied transformations using the full range of Ab Initio components.
• Strong experience in developing shell scripts and SQL, with exposure to Teradata.
• Competent in understanding business applications, business data flows, and data relationships.
• Expertise in the full project life cycle: analysis, design, coding, and testing.
• Proficient in analyzing and translating business requirements into well-defined system design specifications.
• Polished leadership skills, with the ability to motivate teams to increase productivity.
• Excellent skills in understanding business needs and converting them into technical solutions.
• Experience serving as the client's main point of contact for needs definition, project status, and issue resolution.
• Self-motivated, quick learner, team player.
• Able to plan, prioritize, and track work effectively.
• Proficient in creating unit and system/integration test plans and test cases.
• Committed to the quality standards of both client and employer.
• Experienced in training and mentoring junior team members and transferring knowledge effectively.
• Able to work in a team or individually with minimal supervision.
• Good experience in the banking and life sciences domains.
Work Experience Summary
• Developed, tested, implemented, and supported applications; created, maintained, and supported data warehousing applications and processes.
• Used Ab Initio as the ETL tool to pull data from source systems and to cleanse, transform, and load it into databases such as Teradata and Oracle.
• Checked existing applications in and out of the EME to perform the necessary modifications.
• Built various sandboxes to create ad hoc and regular applications in Ab Initio.
• Created several complex graphs in Ab Initio to automate healthcare processes.
• Created common Ab Initio graphs that can be executed by passing parameters, reducing code redundancy.
• Created graphs that capture change data between the current and previous files; this data is used to build history in the full file. Developed a common graph that captures change data for 130 tables.
• Configured Teradata external loader connections such as MultiLoad and FastLoad when loading data into target tables in the Teradata database.
• Migrated Ab Initio code between environments by developing scripts around the air commands, as in the sketch after this list.
• Extensive experience with air commands and EME setup activities.
• Wrote and modified application-specific configuration scripts in UNIX to schedule the applications in each environment.
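A minimal sketch of the kind of air-command migration script described above, assuming a save-file promotion between two EME repositories. The repository paths, project path, and tag name are hypothetical, and exact air syntax varies by Co>Operating System version:

    #!/bin/ksh
    # Hypothetical sketch: promote Ab Initio objects from a development
    # EME to a production EME via a save file. All paths, the project
    # name, and the tag are placeholders.

    TAG="release_tag"                        # assumed release tag
    PROJECT="/Projects/app"                  # assumed EME project path
    SAVEFILE="/tmp/${TAG}.save"

    # Export the project's objects from the source EME into a save file.
    export AB_AIR_ROOT=//dev-eme/repo        # assumed source repository
    air object save "$SAVEFILE" "$PROJECT" || exit 1

    # Load the save file into the target EME.
    export AB_AIR_ROOT=//prod-eme/repo       # assumed target repository
    air object load "$SAVEFILE" || exit 1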
Skill Summary
• ETL Tool: Ab Initio
• Databases: Oracle, Teradata
• General Software & Programming: UNIX shell scripts, SQL, Excel, Word, PowerPoint, Visio
• Operating Systems: Windows 2000/98/XP/NT, UNIX
Experience
Period | Client | Employer | Designation | Location
Jan 2012 – Till Date | Chase | Cognizant Technology Solutions | Associate - Projects | Wilmington, DE
Dec 2009 – Aug 2011 | IMS | Cognizant Technology Solutions | Associate - Projects | Chennai, India
Feb 2007 – Dec 2007 | Citicards | Tata Consultancy Services | Assistant Systems Engineer | Chennai, India
Aug 2006 – Feb 2007 | CapitalOne | Wipro Technologies | Project Engineer | Chennai, India
May 2006 – July 2006 | CapitalOne | Wipro Technologies | System Analyst | Richmond, VA
Dec 2004 – Apr 2006 | CapitalOne | Wipro Technologies | Project Engineer | Chennai, India
Nov 2003 – Apr 2004 | DTDC | STG Technologies | Software Engineer Trainee | Chennai, India
Education
• Master of Computer Application, Madurai Kamaraj University, India.
• Bachelor of Science (Physics), M.S University, India.
Project Experience
Project : Reprice And Deprice Analytic Repository (RADAR)
Role : Application Developer
Client : Chase Bank
Duration : Jan’12 – Till Date
Location : Wilmington, DE
When the source system Core accepts and loads an action sequence, a case is created or updated for a particular account. A newly generated case results in the transmission of the entire case to RADAR; additionally, any change, status or otherwise, to any component of the case results in the transmission of the entire case to RADAR. The interface for these additions and changes is known as the Action Sequence Delta (ASD) interface. The ASD file contains detail record types 0-4. Record type 0 contains account information and is split out to load the ACCT table in RADAR. Record type 1 contains case information and loads the CASE table. Record type 2 contains case action information and loads the CASE ACTION table. Record type 3 (AM2K fields) contains case action variable information and loads the CASE ACTION VARIABLES table; variable information does not change once created on C3. Record type 4 contains account review details for a case and loads the ACCT_RVW table. The delta file is created daily for all ASD record types and applied to the master file, which in turn updates the tables.
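The record-type split described above can be illustrated with a small shell sketch. This is illustrative only: the file name and the assumption that the record type is the first character of each line are hypothetical, and the real layout is defined by the mapping document:

    #!/bin/ksh
    # Illustrative sketch: split a daily ASD file into one feed per
    # record type (0-4) before loading. The file name and record-type
    # position are assumptions.

    ASD_FILE="asd_daily.dat"

    awk '{
        rt = substr($0, 1, 1)                 # assumed record-type position
        if (rt >= "0" && rt <= "4")
            print > ("asd_type_" rt ".dat")   # e.g. asd_type_0.dat -> ACCT
    }' "$ASD_FILE"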
• Extensively interacted with the clients to design the graphs according to the technical specification.
• Developed Ab Initio graphs to validate the ASD file and create the delta and master files for all tables.
• Developed graphs with header/trailer validation and transformations according to the mapping document.
• Developed graphs to load the data into all ASD tables.
• Developed wrapper scripts for the validation process, the source watcher, and all load jobs, as in the sketch after this list.
• Developed DDL and queries to create tables, views, and grants for all tables in the Oracle database.
• Performed unit testing on the graphs and wrote SQL queries to verify that the data loaded into the target tables.
• Prepared test plans and test results covering header/trailer validation, data sanity checks, and parent-child relationships between the tables.
• Supported other MAC modules such as ACME, RGW, and ACHGT, and provided recovery steps for each failure.
• Worked with the production services team and resolved tickets promptly.
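A minimal sketch of a wrapper script of the kind described above, combining a graph run with a post-load SQL check. The deployed script path, log path, schema, table, and connect details are hypothetical:

    #!/bin/ksh
    # Hypothetical wrapper: run a deployed Ab Initio graph, then verify
    # the load with a row-count query. All names are placeholders.

    GRAPH=/apps/radar/run/load_acct.ksh          # assumed deployed graph
    LOG=/apps/radar/log/load_acct.$(date +%Y%m%d).log

    "$GRAPH" > "$LOG" 2>&1
    if [ $? -ne 0 ]; then
        echo "load_acct failed; see $LOG" >&2
        exit 1
    fi

    # Post-load sanity check: confirm rows arrived for today's cycle.
    echo 'SELECT COUNT(*) FROM acct WHERE load_dt = TRUNC(SYSDATE);' |
        sqlplus -s "$DB_USER/$DB_PASS@RADAR"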
Project : Global Reference Repository
Role : Associate - Projects
Client : IMS
Duration : Dec’09 – Aug’11
Location : Cognizant Technology Solutions, Chennai
The Global Reference Repository (GRR) is a pivotal system whose breadth and depth of data fulfils the role of a master data hub for all reference data requirements of business applications. Applying standardized business rules and transformations eliminates inconsistencies in the data. A history of changes to reference data is kept to support reporting of "as was" (time of transaction), "current", or any point-in-time views. GRR sits in the DW tier and is the master data repository for reference data. Reference data is data that IMS acquires and that is associated with transactions and aggregates; reference data types, items, and domains become the dimensions in the DW tier. GRR stores the data in both normalized (3NF) and de-normalized (dimensional) formats to cater to various transactional warehouse and reporting requirements.
Responsibilities:
• Prepared the mapping document for the 3NF model and validated it with the clients by conducting data validation sessions.
• Extensively interacted with the clients regarding the mapping, gathered the requirements, and created sample data that helped finalize the mapping.
• Created the design document for the 3NF model.
• Developed Ab Initio graphs to load data from the sources LPIN, GPIN, and DDMS into the GRR data warehouse.
• Developed graphs with transformations according to the mapping document.
• Performed extensive testing on the graphs and wrote SQL queries to verify that the data loaded into the dimension tables.
• Created a generic graph for the CDC (Change Data Capture) process that captures history data for 130 tables, as in the sketch after this list.
• Created the Party dimension from the 3NF data, which gives details about the party data.
• Provided warranty support for the 3NF module and supported the project whenever issues arose.
• Resolved tickets promptly and created appropriate fix notices for the issues raised.
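The idea behind the generic CDC graph above can be illustrated in plain shell. This sketch is hypothetical (the file names and the full-record comparison are assumptions); the actual process was an Ab Initio graph parameterized per table:

    #!/bin/ksh
    # Hypothetical CDC sketch: compare today's extract with yesterday's
    # to produce delta files. File names are placeholders.

    PREV=table_prev.dat       # yesterday's full file (assumed)
    CURR=table_curr.dat       # today's full file (assumed)

    sort "$PREV" > prev.sorted
    sort "$CURR" > curr.sorted

    # Records present today but not yesterday: inserts or new versions.
    comm -13 prev.sorted curr.sorted > delta_new.dat

    # Records present yesterday but not today: deletes or old versions.
    comm -23 prev.sorted curr.sorted > delta_old.dat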
Project : Call Management System (CMS)
Role : Assistant Systems Engineer
Client : CitiCards
Duration : Feb’07 – Dec’07
Location : Tata Consultancy Services, Chennai
CMS provides the information and management tools Citigroup needs to monitor and analyze the performance of its contact center operations. CMS is a database, administration, and reporting application designed for enterprises that receive a large volume of telephone calls and have complex contact center operations. CMS provides an administrative interface to the automatic call distribution (ACD) feature of the Avaya Communications Servers, enabling contact center managers to generate reports, administer ACD parameters, and monitor call activities to determine the most efficient service possible for their customers. A new data mart needed to be built with the NMIS SQL Server database as source and an Oracle database as target. Data in the NMIS server is loaded at half-hour intervals by pulling data from all CMS source databases located in different cities.
Daily data from NMIS is pulled to the Bimis server by Ab Initio and loaded into data warehouse tables after applying the transformations.
Responsibilities:
• Prepared the design documents based on the functional/requirements specification document.
• Extensively interacted with the coordinator and prepared various documents, including the detailed design for CMS, the Dev-to-PRD migration plan, and the integration test plan.
• Coded and unit tested Ab Initio graphs for various stages of the ETL.
• Developed Ab Initio graphs to load data from the source to the various staging levels, and from the staging tables to the respective dimension and fact tables of the EDW.
• Developed graphs with transformations according to the mapping document.
• Performed extensive testing on the graphs and wrote SQL queries to verify that the data loaded properly into the dimension and fact tables.
• Developed UNIX shell scripts as wrapper scripts to execute the graphs.
Project : CSA – DDE Migration
Role : Project Engineer
Client : CapitalOne
Duration : Aug’06 – Feb’07
Location : Wipro Technologies, Chennai
CSA (Central Staging Area) is the ETL architecture previously implemented in CapitalOne applications. DDE (Data Distribution Environment) is the new architecture that CapitalOne is implementing across all applications. The staging area in the DDE architecture is divided into an init area and a clean area; because the clean area is accessible to all other applications, reusability is greatly improved. The distribution area is divided into a load-ready area and an area for loading or sending data to third parties. CapitalOne is implementing the DDE architecture in all of its previously developed applications.
Responsibilities:
• Analyzed the existing CSA application and designed the application according to DDE standards.
• Created high-level and detailed designs, development estimates, project dashboards, and defect documents.
• Developed Ab Initio graphs to implement the same functionality in the DDE environment as in the old environment.
• Developed Ab Initio graphs using database, dataset, partition, departition, sort, and transform components.
• Extensively worked on data extraction, transformation, and loading from source to target systems using BTEQ, FastLoad, and MultiLoad, as in the sketch after this list.
• Unit tested the Ab Initio graphs and handled quality assurance and bug fixing.
• Reviewed the design and code of other applications to ensure the quality of deliverables.
• Took ownership of four applications and allocated tasks to team members.
• Prepared implementation, rollout, and rollback plans, and set up the switch variables and parameter sets.
• Led a development team of four members and assigned work to them.
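A minimal sketch of driving one of the Teradata utilities named above from a shell script. The TDPID, credentials, staging table, extract path, and column layout are all hypothetical; real scripts followed the project's standards:

    #!/bin/ksh
    # Hypothetical FastLoad run: generate a control script, then invoke
    # the utility to bulk-load a pipe-delimited extract into an empty
    # staging table. All names and credentials are placeholders.

    FLD=acct.fld
    {
        echo 'LOGON tdprod/etl_user,etl_pass;'
        echo 'SET RECORD VARTEXT "|";'
        echo 'DEFINE acct_id (VARCHAR(18)), bal_amt (VARCHAR(18))'
        echo '  FILE = /data/extract/acct.dat;'
        echo 'BEGIN LOADING stg.acct ERRORFILES stg.acct_e1, stg.acct_e2;'
        echo 'INSERT INTO stg.acct (acct_id, bal_amt)'
        echo '  VALUES (:acct_id, :bal_amt);'
        echo 'END LOADING;'
        echo 'LOGOFF;'
    } > "$FLD"

    fastload < "$FLD" || exit 1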
Project : System Analysis on CSA – DDE Migration
Role : System Analyst
Client : CapitalOne
Duration : May’06 - Jul’06
Location : CapitalOne, Richmond, USA
Analysis of all existing ETL applications developed in the CSA (Central Staging Area) environment, which CapitalOne wanted to move to the new DDE (Data Distribution Environment) architecture. Analyzed the dependencies between the applications and identified which applications could be moved without any impact.
Responsibilities:
• Communicated effectively with the client to collect the requirements.
• Conducted meetings with the application SMEs and collected dependency details of the applications.
• Prepared data flow diagrams and application details using Visio.
• Prioritized the applications according to their dependencies and created the different phases of the project.
• Coordinated with the offshore team and assigned application analysis tasks.
• Participated in the project estimation and prepared the report on the feasibility of the project.
• Analyzed these systems for the feasibility of using Ab Initio with the warehouse in Teradata; FastLoad, MultiLoad, BTEQ, and FastExport were the Teradata utilities analyzed for this project.
Project : Charger Finance
Role : Project Engineer
Client : CapitalOne
Duration : Dec’05 – Apr’06
Location : Wipro Technologies, Chennai
Charger implements an infrastructure solution for the Installment Loan business by outsourcing significant components of the loan origination and servicing value chain systems and business processes to one or more vendors, thus enabling the Installment Loan business to reposition itself strategically. Charger Finance designs and implements a solution that receives data from the analytical environment and maintains the financial data related to the installment loans.
Responsibilities:
• Produced high-level design documents for the Ab Initio involvement and integration.
• Developed Ab Initio graphs using various components according to the mapping document.
• Developed and used subgraphs to avoid redundant transformations and improve maintainability.
• Configured Teradata external loader connections such as MultiLoad and FastLoad when loading data into target tables in the Teradata database.
• Prepared SQL scripts to extract data from tables based on business conditions.
• Created the sandboxes in the development environment through the check-in/check-out process.
• Coordinated with the onsite technical leader.
• Created development estimates, project dashboards, and defect documents.
Project : Charger Origination
Role : Project Engineer
Client : CapitalOne
Duration : Aug’05 – Nov’05
Location : Wipro Technologies, Chennai
Charger Origination extracts analytical data from PNC, a third-party vendor for CapitalOne. The origination of installment loans is sent to CapitalOne from the vendor PNC; the installment loans are approved and validated in the origination phase, and the analytical data extracted from the third-party vendor is approved in Charger Origination. This project was developed using an agile methodology.
Responsibilities:
• Coordinated with the onsite technical lead and collected the business requirements.
• Developed Ab Initio graphs to load the data into the Teradata database.
• Developed Ab Initio graphs to implement some of the actions (business rules) required to process the data before loading it into Teradata.
• Configured Teradata external loader connections such as MultiLoad and FastLoad when loading data into target tables in the Teradata database.
• Unit tested the Ab Initio graphs and handled quality assurance and bug fixing.
• Worked within the agile methodology.
• Set up application monitoring and prepared project status reports.
• Participated in peer reviews of developed components to ensure the quality of deliverables.
Project : PCAN Migration
Role : Team Member
Client : CapitalOne
Duration : Jan’05 – Aug’05
Location : Wipro Technologies, Chennai
The PCAN migration project focuses on migrating the PCAN data warehouse and all 16 associated analytic applications from the current Oracle environment to the Teradata platform, using Ab Initio as the ETL tool. PCAN maintains the credit card details of Canadian customers, while CapitalOne maintains customers from other countries under the GFS stream. Access Cheque and Access Cheque Offer are the two applications that deal with cheques offered under credit card accounts; they maintain the account details, the cheques offered from the account, and the transaction details.
Responsibilities:
• Took sole responsibility for the two applications Access Cheque and Access Cheque Offer.
• Analyzed the existing application and prepared the business requirement specification.
• Produced high-level and detailed design documents for the processes involved in getting the desired results.
• Produced the mapping document with the transformations from the existing application.
• Developed reusable graphs for data extraction and scrubbing for various transformation processes.
• Developed Ab Initio graphs to manipulate the logs and load the data into the Oracle databases.
• Involved in loading data into Teradata from legacy systems using Ab Initio graphs.
• Developed UNIX shell scripts as wrapper scripts and executed the application per the schedule.
• Unit tested the Ab Initio graphs and handled quality assurance and bug fixing.
Project : Track Mail
Role : Developer
Client : DTDC
Duration : Nov’03 – Apr’04
Location : STG Technologies, Chennai
The Track Mail project automates the courier service and is used to track the status of each mail item or parcel transported through the service. Each mail item or parcel is assigned a unique, system-generated number, which is used to track it. The project was built on the .NET framework, connecting to an Oracle database.
Responsibilities:
• Designed and developed the web pages for the tracking system.
• Connected to the database from the front end and performed data validation.
• Tested the code and wrote unit test cases.