
Nikita Kora

***** ***** ****** ****, ****: +* 260-***-****

Houston, TX 77062 Email:

Work Authorization: F-1 Visa


Objective

Seeking an internship position during Summer and/or Fall 2018 in the field of Computer Science, through which I can hone my technical skills and apply the knowledge acquired at university and in prior work experience.


Education

University of Houston-Clear Lake, Houston, TX Aug 2017 – Present

Master of Science (Computer Science)

Basaveshwar Engineering College Aug 2009 – May 2013

Bachelor of Engineering (Computer Science)


Technical Skills

Development Tools: Informatica PowerCenter, Toad, Tecita, data modeling, VBA in Excel

Programming Languages: C, SQL, PL/SQL, UNIX shell scripting

Databases: Oracle, SQL Server

Reporting Tools: SAP BusinessObjects, MS Excel 2007

Operating Systems: Windows XP, Windows 7

Development Methodologies: Agile (Scrum) and Waterfall


Certifications

●ETL Informatica Level I domain certificate from TCS

●Oracle Level I and Level II domain certificates from TCS

●SAP ABAP certificate


Data Warehouse Training

●Extensively worked on data extraction, transformation, and loading from source to target using the Teradata FastLoad and MultiLoad utilities.

●Granted and revoked user access with Teradata GRANT and REVOKE statements, and inserted records using the FastLoad utility (see the sketch after this list).

●Created UNIX shell scripts.
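A minimal sketch of the two techniques above, using hypothetical database, table, and file names (the actual objects are not listed in this resume):

/* Teradata access control via GRANT and REVOKE */
GRANT SELECT, INSERT ON stg_db.customer_stg TO etl_user;
REVOKE INSERT ON stg_db.customer_stg FROM report_user;

/* FastLoad script: bulk-load a comma-delimited file into an empty staging table */
LOGON tdpid/etl_user,password;
BEGIN LOADING stg_db.customer_stg ERRORFILES stg_db.cust_err1, stg_db.cust_err2;
SET RECORD VARTEXT ",";
DEFINE cust_id (VARCHAR(10)), cust_name (VARCHAR(50))
FILE = customers.csv;
INSERT INTO stg_db.customer_stg (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;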


Professional Experience

Company: Tata Consultancy Services, India (4 years)

Project 1: Data Staging - The Data Staging team has 10 kinds of environments, based on client and non-client classification of data. Tables are created and mappings are developed in the team's respective folders based on requirements; views are created once the mappings are done. A job net is developed that includes job IDs for each mapping, workflows are created, and then testing is done. The Data Staging team serves as a source for many other teams in the project.
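A minimal sketch of the view layer described above, with hypothetical schema and table names:

-- Expose only active staging rows to downstream teams through a view
CREATE OR REPLACE VIEW stg_vw.customer_v AS
SELECT cust_id,
       cust_name,
       load_ts
FROM   stg_db.customer_stg
WHERE  active_flag = 'Y';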

Designation: ETL Informatica Developer Sept 2013 – Aug 2015

Technology: Informatica PowerCenter 8.6, Jira, Oracle

Responsibilities: Worked under both Waterfall and Agile (Scrum) methodologies.

●Analyzed client business requirements and documented functional requirements and use cases as part of the functional specifications document.

●Extensively worked on extracting, transforming, and loading data from source to target.

●Created SQL DDL and DML statements and stored procedures.

●Worked with Informatica Source Analyzer, Mapping Designer, Mapplet Designer, and transformations.

●Created mappings for the movement of data from Oracle, including SCD Type 1 and SCD Type 2 mappings (see the sketch after this list).

●Developed complex mappings in Informatica with transformations such as Lookup, Union, Router, Aggregator, Expression, Update Strategy, Sequence Generator, and Joiner, conforming to the business rules, and documented them.

●Knowledge of dimensional data modeling (star schema, snowflake schema, fact and dimension tables) and data warehousing concepts.

●Worked with different sources such as flat files and relational databases.

●Responsible for developing weekly status reports on project progress, problems the team encountered, and potential risks.
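A minimal SQL sketch of the SCD Type 2 pattern referenced in the list above; the table, column, and sequence names are hypothetical, and the actual implementation used Informatica Lookup and Update Strategy transformations rather than hand-written SQL:

-- Step 1: expire the current row for customers whose attributes changed
UPDATE dim_customer d
SET    d.end_date = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE s.cust_id = d.cust_id
               AND   s.cust_name <> d.cust_name);

-- Step 2: insert a fresh current version for new and changed customers
INSERT INTO dim_customer
  (cust_key, cust_id, cust_name, start_date, end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_name, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE d.cust_id = s.cust_id
                   AND   d.current_flag = 'Y');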

Project 2: Customer Data Networks - The project is to build a data mart from the existing data warehouse, with supporting buyer and catalogue information, to create a sales and service data mart for business intelligence and reporting purposes. The existing system, which allows customers to enter and modify sales/service data, is used as one of the major source systems. The client needed a data warehouse to maintain historical data in a central location for integration and to analyze the business across different locations and profit areas. Validations are created, and rejected records are loaded into the relevant tables and sent to the users for resubmission (see the sketch below).
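A minimal sketch of the reject-handling step described above, with hypothetical table and column names:

-- Route rows failing basic validation to a reject table for user resubmission
INSERT INTO sales_rejects (sale_id, cust_id, amount, reject_reason, load_ts)
SELECT s.sale_id, s.cust_id, s.amount,
       CASE WHEN s.amount <= 0 THEN 'NON_POSITIVE_AMOUNT'
            ELSE 'UNKNOWN_CUSTOMER' END,
       CURRENT_TIMESTAMP
FROM   stg_sales s
LEFT JOIN dim_customer c ON c.cust_id = s.cust_id
WHERE  s.amount <= 0 OR c.cust_id IS NULL;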

Designation: Data Analyst Aug 2015 – Aug 2017

Technology: Informatica PowerCenter 9, Toad, Jira, Teradata.

Responsibilities: Used the Informatica platform to extract, transform, and load data.

●Experience with data analysis, data mapping, and dimensional modeling in decision support systems (data marts) using star schemas.

●Analyzed existing SQL/ETL scripts and improved them to achieve better performance.

●Developed, tested, integrated, and deployed ETL routines.

●Able to work in a team-oriented environment using qualitative and quantitative skills to develop, implement, and integrate strategies.

●Created design documents such as data flow diagrams, process flow charts, and high-level and low-level design documents.

●Analyzed data issues and developed the ETL logic based on the requirements.

●Maintained the data dictionary and objects such as mappings.

●Imported and exported scenarios between development and test environments.

●Strong knowledge of data warehouse architecture and data warehousing concepts such as star schema, snowflake schema, data marts, and dimension and fact tables.

●Knowledge of the Teradata architecture.

●Good understanding of Teradata utilities such as FastLoad, MultiLoad, and BTEQ scripts (see the sketch after this list).

●Knowledge of spool space issues and long-running queries.
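A minimal BTEQ sketch of the kind of utility scripting listed above; the logon string and table name are hypothetical:

.LOGON tdpid/etl_user,password
-- Sanity-check row counts after the nightly load
SELECT COUNT(*) FROM edw.fact_sales;
-- Abort the batch job with a non-zero return code on any SQL error
.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0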


Achievements

●Outstanding Performance Award.

●Achieved 98–100% on-time performance in reporting deliverables.

●Received appreciation emails from clients.
