Prasanna Lakshmi R.
************@*****.*** +1-408-***-****
PROFESSIONAL SUMMARY:
Over 13 years of IT experience in analyzing, designing, developing, testing, implementing, and maintaining client/server business systems.
12 years of experience with the IBM DataStage Extraction, Transformation, and Loading (ETL) tool, including data warehouse analysis and design using IBM Information Server 8.5/8.1/8.0 (Designer, Director, and Administrator) and Ascential WebSphere DataStage 7.5/7.1/6.0.
Extensively worked on Datastage Parallel Extender and Server Edition.
Expertise in performing data migration from various legacy systems to target databases.
Developed parallel jobs using stages such as Join, Merge, Lookup, Surrogate Key, Slowly Changing Dimension (SCD), Funnel, Sort, Transformer, Copy, Remove Duplicates, Filter, Pivot, and Aggregator to group and summarize key performance indicators used in decision support systems.
Experience in UNIX environments and shell scripting; knowledge of PL/SQL, including stored procedures and functions at the database level.
Expertise in the Software Development Life Cycle (SDLC): system study, analysis, physical and logical design, resource planning, coding, and implementation of business applications.
Good experience writing UNIX shell scripts for data validation, data cleansing, etc.
Expertise in loading data from heterogeneous operational sources such as Oracle, Teradata, DB2, SQL Server, Mainframes, and MS Access into the ODS and data warehouse staging areas.
Extensive experience in development, debugging, troubleshooting, monitoring, and performance tuning using DataStage Designer, Director, and Manager.
Strong understanding of the principles of data warehousing using fact tables, dimension tables, and star/snowflake schema modeling.
Worked with scheduling tools such as CA-7 and AutoSys; good experience writing JIL scripts and designing AutoSys jobs through the Web GUI.
Skilled in writing technical specification documents, translating user requirements to technical specifications and creating and reviewing mapping documents.
Experienced in troubleshooting DataStage jobs and addressing production issues such as performance tuning, enhancements and data issues.
TECHNICAL SKILLS:
Development/Productivity Tools : IBM Information Server 8.0.1, DataStage 7.5x (Server and PX), Informatica (beginner), Apache Hive (beginner)
Data Warehousing Scheduling Tools : CA-7 Scheduler, Control-M, AutoSys
RDBMS : Oracle 8.1, SQL Server 6.5, DB2, basics of Teradata
Programming Languages : UNIX shell scripting, SQL, basics of PL/SQL
Domains : Retail, Sales, Finance, Securities, and Electronics
Tools : PuTTY, TOAD, Teradata SQL Assistant, HP QC, HP ALM, TeamForge, POWER, basic mainframe CA-7 commands to check JCL status
TRAININGS & CERTIFICATIONS:
Certified in Infosphere Datastage 8.0.
DataStage Server and PX
Quality Stage (Basics)
Hive Basics
EDUCATION:
Bachelor of Science (Computer Science)
PROFESSIONAL EXPERIENCE:
Kohl's, Milpitas, CA, USA - ETL Tech Lead - Feb 2017 – Till Date
Macy's, San Francisco, CA, USA - ETL Tech Lead - Jul 2016 – Feb 2017
Lam Research, CA, USA - ETL Tech Lead - Mar 2016 – Jul 2016
Macy's, San Francisco, CA, USA - ETL Developer / Tech Lead - Nov 2013 – Mar 2016
IBM, Bentonville, AR, USA - Senior Software Engineer - Sep 2009 – Nov 2013
Wipro Limited, Bangalore, India - Analyst Programmer - Oct 2006 – Sep 2009
Valtech India Pvt. Ltd, Bangalore, India - Associate Software Engineer - Jul 2004 – Sep 2006
PROJECTS:
Kohl's, USA - Feb 2017 – Till Date - ETL Tech Lead
1. Project Name: Account Management
Client: Kohl's (Milpitas, US)
Duration: May 2016 – Till Date
Role: ETL QA Lead
Account Management – This is a migration project. The current account management system runs on the ATG platform with an Oracle 11g database. User account creation and modification are written synchronously to the database, with no messaging system involved; however, synchronous and asynchronous web service calls are made to the MASH system to sync user profile data to the CMDM system and loyalty profile data to the Loyalty system.
With the migration to the KOS platform, a mechanism is needed to migrate the user-profile-related data to a MySQL database. Since this migration cannot be completed in a single attempt, the process starts with an initial full data migration followed by incremental migration of delta changes. After the data migration, users will not be fully routed to the KOS platform until ATG is decommissioned, which creates the need to maintain data integrity between the Oracle and MySQL data sources. To achieve this, we use the Apache Kafka publish-subscribe messaging system, integrated with ZooKeeper and a custom feed extractor.
Responsibilities:
Work closely with the business, solution architects, data modelers, data mappers, and business analysts to understand business requirements, providing expert knowledge and solutions on data warehousing and ensuring delivery of business needs in a timely, cost-effective manner.
Evaluate and migrate data between the ATG (Oracle) and MySQL databases, and test the data collection, data staging, data movement, analytics delivery, data quality, and archiving strategies based on the stories/requirements provided in JIRA.
Prepared test cases covering different functional and technical scenarios.
Prepared the test plan and test strategy.
Executed the test cases, updated the test results, and raised defects in JIRA for tracking.
Environment: Oracle, JIRA, Google Cloud, MySQL
2. Project Name: Search & Browse (SnB)
Client: Kohl's (Milpitas, US)
Duration: Feb 2016 – Till Date
Role: ETL QA Lead
Search & Browse – This is a migration project. Its purpose is to move the existing Kohl's platform catalog data (product, SKU, and UPC data) running on the Oracle ATG stack in the AT&T physical data center to an open-source platform running in the cloud.
The goal of this project is to migrate from the Oracle ATG stack (Oracle) to an open-source NoSQL database (MongoDB).
The re-platforming project is being done in parallel, without halting development of new functionality on the existing platform.
The goal of the re-platforming project is a seamless transition from the old platform to the new one.
Responsibilities:
Work closely with the business, solution architects, data modelers, data mappers, and business analysts to understand business requirements, providing expert knowledge and solutions on data warehousing and ensuring delivery of business needs in a timely, cost-effective manner.
Evaluate and migrate data between the ATG and MongoDB databases, and test the data collection, data staging, data movement, analytics delivery, data quality, and archiving strategies based on the stories/requirements provided in JIRA.
Prepared test cases covering different functional and technical scenarios.
Prepared the test plan and test strategy.
Executed the test cases, updated the test results, and raised defects in VersionOne for tracking.
We will extract all the production data (golden data), or only the active products, from ATG into CSV files using SQL scripts; these files will then be uploaded or copied to Google Cloud Storage (a hedged shell sketch of this extract-and-upload step follows the options below).
One-time-load batch job options:
Full catalog/products (the full data set from production: active, inactive, orphan, etc.).
Active products only, executed using CSVs produced by editing the SQL scripts and passing parameters.
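A minimal shell sketch of the extract-and-upload step, assuming a parameterized sqlplus extract script and a Google Cloud Storage bucket; the script name, connection string, and bucket below are hypothetical placeholders rather than the actual project objects:

#!/bin/sh
# Sketch: extract active products from ATG (Oracle) to CSV and copy to Google Cloud Storage.
# extract_active_products.sql, atg_user/ATGPROD, and gs://catalog-migration are illustrative names only.

EXTRACT_DIR=/data/extracts
OUT_FILE=$EXTRACT_DIR/active_products.csv

# Run the parameterized SQL script through sqlplus, spooling CSV output to $OUT_FILE.
sqlplus -s "atg_user/${ATG_PWD}@ATGPROD" @extract_active_products.sql "$OUT_FILE" ACTIVE_ONLY

# Fail fast if the extract produced no data.
[ -s "$OUT_FILE" ] || { echo "Extract failed or empty: $OUT_FILE" >&2; exit 1; }

# Upload the CSV to the bucket read by the one-time-load batch job.
gsutil cp "$OUT_FILE" gs://catalog-migration/one-time-load/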
Environment: Oracle, JIRA, Google Cloud, MongoDB
Macy's, USA - Jul 2016 – Feb 2017 - ETL Tech Lead
3. Project Name: Big Ticket
Client: Macy's (San Francisco, US)
Duration: Jul 2016 – Feb 2017
Role: ETL QA Lead
Big Ticket – Currently, customers can purchase Big Ticket items by visiting a Macy's store or Macys.com. For online purchases, the customer must contact the call center and the order is placed through the existing RDS system; for in-store purchases, team members enter the order directly into RDS.
This project supports the Big Ticket program and contains the functional requirements to acquire online Big Ticket (furniture) customer data into the Integrated Data Management (IDM) platform, where it is retained in the Customer Analytic Program System (CAPS).
Acquire new online Big Ticket customer data into IDM.
Assign the appropriate affiliate source and priority for new online Big Ticket customers with email.
Process online Big Ticket customers through Match Merge.
Export new online Big Ticket customers to Epiphany for marketing welcome and segmentation.
Export online Big Ticket transactions to Epiphany for marketing analytics.
Responsibilities:
Worked closely with the business, solution architects, data modelers, data mappers, and business analysts to understand business requirements, providing expert knowledge and solutions on data warehousing and ensuring delivery of business needs in a timely, cost-effective manner.
Evaluated and tested the data collection, data staging, data movement, analytics delivery, data quality, and archiving strategies based on the stories/requirements provided in VersionOne.
Tested the data developed and loaded for Big Ticket using DataStage, which loads data into the DB2 and Vertica databases. These jobs load dimensions and facts, including SCD Type 2 dimensional loads; the source data is extracted from MST/DB2 and then sent to the Vertica database for downstream reporting.
Identify opportunities to optimize the ETL environment and implement monitoring, quality, and validation processes to ensure data accuracy and integrity.
Extensive experience in testing data loaded through the IBM InfoSphere/WebSphere DataStage and Ascential DataStage ETL tools into files such as CSV and TAB files.
Strong understanding of the principles of Data Warehousing using fact tables, dimension tables and star/snowflake schema modeling.
Prepared test cases covering different functional and technical scenarios.
Prepared the test plan and test strategy.
Executed the test cases, updated the test results, and raised defects in VersionOne for tracking.
Environment: IBM Information Server 11.0.1, Vertica, DB2, VersionOne
Lam Research, USA - Mar 2016 – Jul 2016 - ETL Tech Lead
4. Project Name: OPT (Online Planning Tool)
Client: Lam Research (Fremont, US)
Duration: Mar 2016 – Jul 2016
Role: ETL Tech Lead
OPT – Online Planning Tool: The project creates an OPT dashboard to replace the existing OPT application and allows business users and IT to run near-real-time, ad hoc, standard, operational, and analytical reports from SAP HANA.
Responsibilities:
Serve as technical lead and work with business partners to define the product roadmap and requirements for the SAP data migration program.
Interact closely with key business stakeholders to understand user demands, and with product architects and the development team to understand technical application dependencies; help build finer, detailed product specifications, packaging features/functionality into phased product releases.
Communicate functional and non-functional requirements via user stories to dev teams.
Specify data mapping & data provisioning methodologies along with application interface detailing.
Handled migration of legacy applications from SQL Server to HANA models, enabling faster performance, better memory utilization, and easier access through simpler interface modeling going forward.
Evaluate and design data collection, data staging, data movement, analytics delivery, data quality and archiving strategies
Review the jobs for the OPT ETL applications created using the DataStage/BODS tools, which load data into CSV and TAB files. These files are created by extracting data from the Oracle database using PL/SQL and DataStage, and are then sent to downstream applications.
Identify opportunities to optimize the ETL environment, implement monitoring, quality and validation processes to ensure data accuracy and integrity.
Extensive ETL tool experience using IBM InfoSphere/WebSphere DataStage and Ascential DataStage.
Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.
Environment: IBM Information Server 8.0.1, BODS, SAP HANA, SQL Server, JIRA
Macy's, USA - Nov 2013 – Mar 2016 - ETL QE
5. Project Name: 3PO (Third Party Orders)
Client: Macy's (San Francisco, US)
Duration: Dec 2015 – Mar 2016
Role: ETL QE
3PO – The overall goal of this project is to maintain and streamline MCOM marketing channel expense actualization and allocation at the job level across channels, by tactic. This is achieved by integrating MCOM marketing campaign and vendor expense details, along with the associated merchant allocation, into MAS.
Responsibilities:
Efficiently tested marketing and finance related ETL stories that read data from various source tables and load it to files and tables.
Involved in unit testing of ETL jobs that calculate the facts below:
1. Demand Commission
2. Actual Commission
3. Space Allocation
4. Spend by Merchant
Strong understanding of the principles of Data Warehousing using fact tables, dimension tables and star/snowflake schema modeling.
Identify opportunities to optimize the ETL environment, implement monitoring, quality and validation processes to ensure data accuracy and integrity.
Worked extensively on testing the marketing and finance related ETL jobs developed using IBM InfoSphere DataStage.
Extensive experience in testing data loaded through the IBM InfoSphere/WebSphere DataStage and Ascential DataStage ETL tools into files such as CSV and TAB files.
Prepared test cases covering different functional and technical scenarios.
Prepared the test plan and test strategy.
Executed the test cases, updated the test results, and raised defects in VersionOne for tracking.
Environment: IBM InfoSphere DataStage 8.5, DB2, SQL Server, UNIX, VersionOne
Macy's, USA - Nov 2013 – Mar 2016 - ETL Developer / Tech Lead
6. Project Name: STELLA
Client: Macy's (San Francisco, US)
Duration: Nov 2013 – Dec 2015
Role: ETL DataStage Developer / Tech Lead
STELLA – Stella is a Product Information Management (PIM) system. All product information and product images on the Macys.com and Bloomingdales.com websites are managed in Stella. Stella manages product assets, including modules such as PRODUCT, ATTRIBUTES, PRICING, PROMOTIONS, and UPCs.
Responsibilities:
Worked closely with the business, solution architects, data modelers, data mappers, and business analysts to understand business requirements, providing expert knowledge and solutions on data warehousing and ensuring delivery of business needs in a timely, cost-effective manner.
Evaluate and design data collection, data staging, data movement, analytics delivery, data quality and archiving strategies
Developed the jobs for the Stella ETL applications using DataStage, which load data into CSV and TAB files. These files are created by extracting data from the Oracle database using PL/SQL and DataStage, and are then sent to downstream applications.
Involved in unit testing of ETL Datastage jobs.
Identify opportunities to optimize the ETL environment, implement monitoring, quality and validation processes to ensure data accuracy and integrity.
Extensive ETL tool experience using IBM Infosphere/Websphere DataStage, Ascential DataStage.
Worked on DataStage tools like DataStage Designer, DataStage Director.
Strong understanding of the principles of Data Warehousing using fact tables, dimension tables and star/snowflake schema modeling.
Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.
Developed parallel jobs using different processing stages like Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply and Filter.
Environment: IBM Information Server 8.0.1, Oracle, DB2.
IBM, USA - Sep 2009 – Nov 2013 - Tech Lead / DataStage Developer
7. Project Name: Global Replenishment System (GRS)
Client: Wal-Mart (Arkansas, US)
Duration: Jul 2012 – Nov 2013
Role: Tech Lead / DataStage Developer
Global Replenishment System (GRS) – GRS is a common global platform for replenishment planning and purchase order execution across all countries and formats. It is based on JDA forecasting algorithms (supply chain planning software) and performs Wal-Mart's replenishment planning, considering item, supplier, sales history, event, and other data received from various source systems through ETL-based batch interfaces. IBM delivers this data integration to ensure the functions required by the Wal-Mart business are met and implements the changes needed in the source systems, interfaces, and the target system's peripheral adapters.
Responsibilities:
Actively worked in all phases of the SDLC, from requirements and design (LLD) through development, unit testing, and integration testing of all DataStage applications.
Interacted with the client for analysis of organization data, ETL requirement gathering, and data consolidation.
Worked extensively on designing and developing DataStage jobs and sequencers in DataStage 8.5 parallel edition.
Worked in an onsite-offshore environment: assigned technical tasks, monitored the process flow, conducted status meetings, and made sure business needs were met.
Used different partitioning methods in parallel jobs (Auto, Round-Robin, Hash, and Entire) to improve performance and obtain accurate results according to the business requirements.
Took on additional responsibilities to lead the team and deliver good-quality code.
Worked on DataStage 8.0 PX job enhancements based on new requirements from the business.
Worked extensively on performance tuning of long-running DataStage jobs while monitoring the flow.
Used DataStage Designer to develop various jobs to extract, cleanse, transform, integrate, and load data into the data warehouse.
Scheduled the server jobs using DataStage Director, which are controlled by the DataStage engine, and used it to monitor jobs and review per-stage performance statistics.
Created job sequences and job schedules to automate the ETL process, extracting data from flat files, Oracle, and Teradata into the data warehouse (a hedged dsjob wrapper sketch follows this list).
Prepared UTP and UTC documents and was responsible for testing the jobs.
Reviewed peer objects and submitted the project status report to the manager on a daily basis.
Tested and debugged DataStage job issues.
Supported various phases of testing such as SIT, UAT, and PROD.
Provided RCA for specific requirements and provided SLA estimates for UI availability based on the status of the batch run.
Created and managed defects for the GRS project in the HP QC tool.
Analyzed functional and technical defects and provided the relevant DataStage code fix or appropriate solution within the given ETA.
Took a role in production monitoring, support, and transition of interfaces to the production support team.
Involved in understanding change requests and worked on enhancements, providing the right estimates.
Maintained different versions of code in POWER while deploying code fixes to phases such as SIT and UAT.
Raised CRQs in the Remedy 7 tool for deploying code to the production environment.
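For illustration, a scheduler wrapper around the DataStage dsjob command line can be sketched as below; the project and job names are hypothetical, and the exact dsjob options and status codes should be verified against the installed client:

#!/bin/sh
# Sketch: run a DataStage sequence from the scheduler via dsjob and report its status.
# GRS_PROJ and seq_load_item_master are illustrative names, not actual project objects.

DSHOME=${DSHOME:-/opt/IBM/InformationServer/Server/DSEngine}
PROJECT=GRS_PROJ
JOB=seq_load_item_master

# Run the job and wait; with -jobstatus the exit code reflects the finishing status.
"$DSHOME/bin/dsjob" -run -jobstatus -param ProcessDate="$1" "$PROJECT" "$JOB"
RC=$?

# Capture a short log summary for the scheduler log.
"$DSHOME/bin/dsjob" -logsum "$PROJECT" "$JOB"

# Treat "finished OK" / "finished with warnings" as success (verify codes on the installed release).
case "$RC" in
    1|2) echo "Job $JOB finished with status $RC" ;;
    *)   echo "Job $JOB failed with status $RC" >&2; exit 1 ;;
esac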
Environment: IBM Information Server 8.0.1, Oracle, DB2, Teradata, SQL server and CA7.
8. Project Name: SAMS AMS
Client: Wal-Mart (Arkansas, US)
Duration: Oct 2011 – Jul 2012
Role: DataStage Developer
Project Description:
Wal-Mart Stores, Inc. (NYSE: WMT), branded as Walmart since 2008, is an American public multinational corporation that runs a chain of large discount department stores and a chain of warehouse stores. Sam's Club is a division of Wal-Mart where retailers and shop owners buy in large quantities at significant discounts.
The SAMS AMS project implements the JDA BAM application to provide assortment functionality to SAMS business users. Data is extracted from a number of existing Wal-Mart databases, transformed, and loaded into the BAM database.
Responsibilities:
Extensively worked in all phases of the SDLC, from requirements and design (LLD) through development, unit testing, and integration testing of all DataStage applications.
Worked in an onsite-offshore environment: assigned technical tasks, monitored the process flow, conducted status meetings, and made sure business needs were met.
Took an active role in designing, testing, and supporting different types of complex DataStage jobs.
Interacted with the client for analysis of organization data, ETL requirement gathering, and data consolidation.
Took on additional responsibilities to lead the team and deliver good-quality code.
Worked on creating mapping documents from the business requirements and designed the ETL.
Created multiple jobs and sequences in DataStage 8.0 PX and was responsible for creating design documents.
Designed and developed DataStage jobs for different subject areas; prepared UTP and UTC documents and was responsible for testing the jobs.
Reviewing peer objects.
Support various phases of Testing like SIT, UAT and PROD
Ensured high-quality deliverables that meet requirements within budget and schedule.
Prepared low-level and high-level design documents with reference to the source-to-target mapping.
Developed DataStage design concepts, execution, testing, and deployment on the client server.
Modified existing jobs as required, using DataStage Designer to develop various jobs to extract, cleanse, transform, integrate, and load data into the data warehouse.
Used DataStage Director to schedule and run jobs, monitor scheduling, and validate job components.
Responsible for understanding change requests and working on enhancements, providing the right estimates.
Environment: IBM Information Server 8.0.1, Oracle, DB2, SQL Server, AIX server (UNIX)
9. Project Name: Install Base
Client: NetApp (Sunnyvale, US)
Duration: Sep 2009 – Oct 2011
Role: Senior ETL Specialist
Project Description:
NetApp's Install Base (IB) project is one of the largest and leading DW/BI projects in IBM in terms of complexity, challenge, number of resources, and technology. NetApp started a support center to support its customers globally. NetApp is a US-based product company and a world leader in unified storage solutions for today's data-intensive enterprises; it was also recognized by Fortune magazine as a "Best Place to Work For" in 2009.
Responsibilities:
Analyzed and understood the requirements and processes.
Provided technical suggestions to implement complex logic in ETL/SQL/procedures.
Troubleshot and rectified issues the team was facing, and provided the right contact when the team was not able to resolve them.
Designed various kinds of jobs in DataStage 8.0.
Coordinated between the offshore and onsite teams, meeting with the client whenever needed.
Sent status updates, allocated tasks to the offshore ETL team, and prioritized activities.
Clarified business rules as needed with the BA and the source system technical/business users.
Prepared the high-level design document (for ETL) with reference to the SOW.
Worked effectively on performance tuning of DataStage jobs.
Created sequences covering the workflow of jobs in the DataStage project.
Created UTCs (unit test cases).
Environment: IBM Information Server 8.0.1, Oracle
Wipro Limited, Bangalore, India - Oct 2006 – Sep 2009 - Analyst Programmer
10. Project Name: ETL Regulatory Feeds
Client: State Street (Boston, US)
Duration: Mar 2008 – Sep 2009
Role: Senior ETL DataStage Developer
Project Description:
State Street is one of the world's leading providers of financial services to institutional investors. Its broad and integrated range of services spans the entire investment spectrum, including research, investment management, trading services, and investment servicing.
Responsibilities:
Analyzed and understood the requirements and processes.
Involved in creating design documents such as the TAD (Technical Analysis Document).
Designed the jobs in DataStage 7.5.2 PX.
Developed mappings using stages such as Sequential File, Lookup, Aggregator, and Transformer.
Created sequences that include the jobs in the DataStage project.
Ran and monitored jobs using DataStage Director and checked the logs.
Involved in creating UNIX shell scripts for header and trailer validation and other functional validation of source/flat files (a hedged sketch of such a validation script follows this list).
Created UTCs (unit test cases) and DIT (development integration test cases).
Involved in creating Job Information Language (JIL) files in AutoSys for scheduling ETL jobs.
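A minimal sketch of such a header/trailer validation, assuming a pipe-delimited feed whose first record is a header carrying the business date and whose last record is a trailer carrying the detail record count; the layout and field positions are illustrative, not the actual feed formats:

#!/bin/sh
# Sketch: validate header and trailer of a delimited flat file.
# Assumed layout (illustrative): header "HDR|YYYYMMDD", detail rows, trailer "TRL|<detail count>".

FILE=$1
[ -s "$FILE" ] || { echo "Missing or empty file: $FILE" >&2; exit 1; }

HEADER=$(head -1 "$FILE")
TRAILER=$(tail -1 "$FILE")

# Header must start with HDR and carry an 8-digit business date.
echo "$HEADER" | grep -Eq '^HDR\|[0-9]{8}$' || { echo "Bad header: $HEADER" >&2; exit 1; }

# Trailer count must match the number of detail records (total lines minus header and trailer).
DETAIL_COUNT=$(( $(wc -l < "$FILE") - 2 ))
TRAILER_COUNT=$(echo "$TRAILER" | cut -d'|' -f2)
if [ "$DETAIL_COUNT" -ne "$TRAILER_COUNT" ]; then
    echo "Trailer count $TRAILER_COUNT does not match detail count $DETAIL_COUNT" >&2
    exit 1
fi
echo "Validation passed for $FILE"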
Environment: DataStage 7.5, Oracle
11. Project Name: National Commercial Bank (NCB)
Client: NCB (Jeddah, Saudi Arabia)
Duration: Dec 2007 – Feb 2008
Role: Senior ETL DataStage Developer
Project Description:
The project belongs to the National Commercial Bank (NCB), Jeddah, Saudi Arabia. The source systems, running on different DBMS platforms, transport data through several independent data streams to update the data on the target systems. To ensure consistency of data across the target systems, NCB proposed implementing an enterprise-wide Data Integration and Management (DIM) solution.
Responsibilities:
Analyzed and understood the requirements and processes.
Designed the jobs in DataStage 7.5.2 PX.
Developed mappings using stages such as Sequential File, Lookup, Aggregator, and Transformer, and prepared the UTCs (unit test cases).
Involved in the migration of server jobs to parallel jobs using DataStage.
Environment: Datastage PX, Oracle
12. Project Name: WI BI DWH
Client: Wipro Infotech (Bangalore, India)
Duration: Oct 2006 – Nov 2007
Role: Senior ETL DataStage Developer
Project Description:
This project builds a data warehouse for Wipro Infotech using the Ascential DataStage ETL tool. The processed data is loaded into the DB2 database and reported to users via Hyperion Reports and Hyperion Planning.
Project Responsibilities
Analyzed and understood the WIBI process.
Interacted with the source data owners (SAP team).
Wrote specifications and unit test conditions.
Designed jobs in DataStage 7.5.2 PX; developed mappings using built-in stages such as Sequential File, Lookup, Aggregator, and Transformer to load the data warehouse and data marts.
Created sequences that include the jobs in the DataStage project.
Designed and developed ETL jobs and performed performance tuning of DataStage jobs.
Environment: Datastage 7.5.2, DB2 and AIX
Valtech India Pvt. Ltd, Bangalore, India - Jul 2004 – Sep 2006 - Associate Software Engineer
13. Project Name: BI-PYTHON (Sales Analysis System)
Client: Louis Vuitton (Paris, France)
Duration: Aug 2004 – Sep 2006
Role: ETL Developer
Project Description:
Louis Vuitton is a world leader in manufacturing and selling designer wear. The items it manufactures include leather goods, shoes, jewelry, watches, belts, bracelets, and textiles. As a giant in the world of fashion and designer wear, it has stores around the globe.
Project Responsibilities
Analysis and design of ETL processes.
Created DataStage server jobs to load data from Oracle, ODBC, sequential files, flat files, etc.
Used DataStage Designer to design and develop jobs for extracting, cleansing, transforming, integrating, and loading data into the target.
Created job sequences and schedules for automation.
Used DataStage to transform the data across multiple stages, and prepared documentation.
Used DataStage Manager to import metadata from the repository and create new job categories.
Customized DataStage applications and managed DataStage jobs and job sequences.
Analyzed and created Control-M specifications from Test Director.
Created and executed drafts in Control-M.
Environment: DataStage 7.5, Oracle 9i, Control-M
PERSONAL DETAILS:
Visa : H1B, valid till Dec 2017
Marital Status : Married
Contact # : +1-408-***-****
Email : ********.*********@******.***