
Data Professional Experience

Location:
Frisco, TX
Salary:
80000
Posted:
March 25, 2018


JYOTSANA UTSAV

***** ***********, ******, *****, ***** Sponsorship: Not required

ac4xht@r.postjobfree.com https://www.linkedin.com/in/jyotsanautsav Mob: 469-***-****

TECHNICAL SKILLS

ETL Tools: Teradata (BTEQ, MLOAD, FLOAD, FastExport), Unix (awk, sed, grep), Informatica PowerCenter 9.1.0/9.1.5

Programming Languages: SQL (in-depth knowledge), UNIX shell scripting, R, Python

Reporting Tools: SAP BusinessObjects (R3), Desktop Intelligence, Web Intelligence

Databases: Teradata 14.10/13.00, Oracle, DB2

RDBMS Tools: Teradata

Testing Tools: Quality Center

Scheduling Tools: Control-M, Autosys

Operating Systems: Windows 7/XP/2000, Linux

EDUCATION

Bachelor of Engineering, Electronics & Telecommunications, Mody Institute of Tech & Sciences, India June 2011

PROFESSIONAL EXPERIENCE

Black Project, Personal Project Jan 2018 – Present

Working toward a solution that uses machine-learning regression to identify potential new buyers in the Dallas area, using Python and TensorFlow.

Currently in the initial data-gathering phase.

Programmer Analyst, Bank of America, Gurgaon, India Jan 2016 – Feb 2017

Drove a team of highly trained and motivated BofA associates supporting the bank-card information production system (Teradata, Oracle, DB2) with multiple batch cycles, acting as Level 1 support.

Ensured UNIX Teradata job flows performed seamlessly and promptly mitigated any offending processes using Teradata SQL and shell scripts.

Managed UNIX shell scripts scheduled through Autosys.

Employed Autosys to manage Linux, DataStage, Informatica, SAS, Cassandra, and Teradata job processes; performed root-cause analysis on failed jobs and resolved abnormalities quickly.

Strong hands-on experience with Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Queryman).

Experienced in creating complex mappings using various transformations and in developing Extraction, Transformation, and Loading (ETL) strategies.

Conducted database performance tuning through SQL statement tuning, index tuning, and data warehousing techniques on Oracle and DB2.

Migrated legacy databases using ETL tools.

Created shell scripts to automate data pulls from remote FTP servers and imported the data through SQL*Loader.
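The FTP-pull-and-load pattern above can be sketched as a small wrapper script. This is a minimal illustration, not the original script: the host, file names, table name, and connect string are assumptions, and the ftp and sqlldr invocations are stubbed or echoed so the sketch runs without an Oracle client.

```shell
#!/bin/sh
# Hypothetical sketch of an FTP-pull / SQL*Loader pipeline.
# All names here are illustrative assumptions, not from the original role.
set -eu

WORKDIR=${WORKDIR:-/tmp/ftp_pull_demo}
mkdir -p "$WORKDIR"
cd "$WORKDIR"

# 1) Pull the daily extract from the remote FTP server.
#    A real run would use something like:
#      ftp -n remote.example.com <<'FTP'
#      user etluser secret
#      get /outbound/customers.dat customers.dat
#      bye
#      FTP
printf '101,Alice,Frisco\n102,Bob,Dallas\n' > customers.dat   # stand-in for the pulled file

# 2) Generate the SQL*Loader control file describing the pulled data.
cat > customers.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ','
(cust_id, cust_name, city)
EOF

# 3) Load into the staging table (echoed; needs an Oracle client in practice).
echo "sqlldr userid=etl_user@orcl control=customers.ctl log=customers.log"
```

In production the printf stub would be replaced by the real ftp/sftp transfer and the final echo by the actual sqlldr call.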

Developed a new process for the team to check production failures of multiple UNIX shell scripts at any given point in time.

Senior Application Developer, IBM GBS, Noida, India July 2011 – Dec 2015

Five years of experience in data warehousing; worked on databases such as Teradata and developed loading strategies using Teradata utilities such as FastLoad, MultiLoad, FastExport, BTEQ, and TPT.

Proven track record in planning, building, and managing successful large-scale data warehouse and decision-support systems. Comfortable with both technical and functional applications of RDBMS, data mapping, data management, data transportation, and data staging.

Performed performance tuning of sources, targets, mappings, and SQL queries in Informatica transformations.

Created ETL source-to-target specification documents from the business requirements.

Developed mappings that perform extraction, transformation, and loading of source data into the Derived Masters schema, using PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to meet the business logic.

Extensive experience in end-to-end life-cycle implementation of a data warehouse.

Automated workflows and BTEQ scripts (Teradata) using Autosys as the scheduling tool.
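A BTEQ wrapper of the kind scheduled through Autosys might look like the sketch below. The logon string, table names, and job names are illustrative assumptions; the bteq call is left as a comment because it requires a Teradata client, and the JIL fragment shows how Autosys could be told to run the wrapper.

```shell
#!/bin/sh
# Hypothetical sketch of an Autosys-scheduled BTEQ wrapper; names are assumptions.
set -eu

RUNDIR=${RUNDIR:-/tmp/bteq_wrapper_demo}
mkdir -p "$RUNDIR"

# BTEQ script the wrapper drives (shipped alongside the wrapper in practice).
cat > "$RUNDIR/daily_load.bteq" <<'EOF'
.LOGON tdprod/etl_user,password;
.SET ERRORLEVEL 2 SEVERITY 8;
INSERT INTO dw.daily_sales
SELECT * FROM stg.daily_sales;
.QUIT;
EOF
# Production run:  bteq < "$RUNDIR/daily_load.bteq" > "$RUNDIR/daily_load.out" 2>&1

# Illustrative Autosys JIL definition that would schedule this wrapper.
cat > "$RUNDIR/daily_load.jil" <<'EOF'
insert_job: dw_daily_load   job_type: CMD
command: /opt/etl/bin/daily_load_wrapper.sh
machine: etl_host
condition: s(stg_daily_extract)
EOF

echo "staged: $(ls "$RUNDIR" | tr '\n' ' ')"
```

The condition line illustrates the usual dependency style, running the load only after the upstream extract job succeeds.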

Well versed in writing UNIX Korn shell scripts.

Researched sources and identified the business components necessary for analysis.

Interacted with different system groups for systems analysis.

Created proper primary indexes (PIs) taking into consideration both the planned access paths and the even distribution of data across all available AMPs in Teradata.
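The PI trade-off above, even hashing across AMPs versus the planned access path, can be illustrated with a small DDL sketch. It is staged to a file by a stub script so it stands alone; the table and column names are hypothetical.

```shell
#!/bin/sh
# Stages illustrative Teradata DDL showing a primary-index choice.
# Table and column names are assumptions, not taken from the project.
cat > /tmp/pi_demo.sql <<'EOF'
-- cust_id is highly selective (hashes evenly across all AMPs) and is
-- also the column most queries join and filter on (single-AMP access),
-- so it serves both distribution and access-path goals.
CREATE TABLE dw.customer_txn
( cust_id INTEGER NOT NULL,
  txn_dt  DATE    NOT NULL,
  txn_amt DECIMAL(18,2)
)
PRIMARY INDEX (cust_id);
EOF
echo "DDL written to /tmp/pi_demo.sql"
```

A low-cardinality column (e.g. a state code) would hash to few AMPs and skew the distribution, which is exactly what this choice avoids.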

Analyzed business and system requirements, including impact analysis of existing systems, and created detailed requirements in consultation with business users and technical architects.

Created design documents for the requirements.

Familiar with creating secondary indexes and join indexes in Teradata.

Provided performance tuning along with physical and logical database design support in projects for Teradata systems.

Implemented the slowly changing dimensions (SCD) methodology to keep track of historical data.
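A Type 2 SCD load of the kind referenced above typically closes out the old row and inserts a new current one. The sketch below (staged to a file by a stub script) uses hypothetical table and column names; it illustrates the general technique, not the project's actual code.

```shell
#!/bin/sh
# Stages an illustrative Type 2 slowly-changing-dimension load.
# All table and column names are assumptions.
cat > /tmp/scd2_demo.sql <<'EOF'
-- Step 1: close out current rows whose tracked attribute changed in staging.
UPDATE dw.customer_dim
SET eff_end_dt = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE current_flag = 'Y'
  AND cust_id IN (SELECT s.cust_id
                  FROM stg.customer s
                  JOIN dw.customer_dim d
                    ON d.cust_id = s.cust_id
                   AND d.current_flag = 'Y'
                  WHERE d.city <> s.city);

-- Step 2: insert a new current version for changed (and brand-new) customers,
-- with an open-ended effective date range.
INSERT INTO dw.customer_dim
  (cust_id, city, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.city, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg.customer s
WHERE NOT EXISTS (SELECT 1 FROM dw.customer_dim d
                  WHERE d.cust_id = s.cust_id
                    AND d.current_flag = 'Y');
EOF
echo "SCD2 SQL written to /tmp/scd2_demo.sql"
```

After step 1, changed customers no longer have a current row, so step 2 picks up exactly the changed and newly arrived keys while unchanged rows keep their history intact.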

Successfully implemented projects using Teradata utilities: FLOAD, FASTEXPORT, MLOAD, and TPT.

Worked on projects to integrate various data sources: Sybase, Teradata, and flat files.

Analyzed the existing warehouse and made the necessary changes and modifications.

Experience in data modeling and building data models.

Loaded data into the Teradata database using load utilities (FastExport, FastLoad, MultiLoad, and TPump).

Resolved various defects in the set of wrapper scripts that executed Teradata BTEQ, MLOAD, and FLOAD utilities and UNIX shell scripts.

Played a major role in setting up the different regions (Testing, Integration, Performance, Break-fix, and Production) with the wrapper scripts, and set up the folder structure for Teradata scripts to run in the UNIX environment.

Interpreted BRDs, data-requirement documents, and mapping documents.

Developed MLOAD, BTEQ, FASTLOAD, and MULTILOAD scripts to load and extract data from load-ready files to the Teradata warehouse and vice versa.

Worked with Teradata Queryman to validate warehouse data for sanity checks.

Developed unit test plans and was involved in system testing.

Worked on one of IBM's biggest data warehousing projects, Vodafone Spain, and developed code per the design to load interface data into the data warehouse tables.

Prepared CRQ implementation documents to migrate the code from SIT to PROD.

Prepared unit test cases and reviewed project documents.

Analyzed the source and target databases to understand the business, thereby suggesting new areas of improvement.

Developed a new UNIX wrapper script for file processing that reduced loading time from hours to minutes, saving up to 50% of resources.

ACHIEVEMENTS & CERTIFICATIONS

Awarded the annual 'Eminence and Excellence Award (ORION)' by the Director, Communication Service Sector, IBM, for delivering great service to the customer, Nov 2012.

Received the Deep Skill Adder Award, IBM, four times in a row for possessing the skills required to provide strategic solutions to clients and delivering outstanding results as client requirements changed.

Received the Bronze Award, Bank of America, for delivering a solution to the customer on time (card information had failed to load into the corresponding tables, reflecting incorrect data in the customer's account), Aug 2016.

Certified Lean Change Agent, Aug 2014.

Certified Teradata 13 Professional, Oct 2013.

Certification in progress: "Python Programming Essentials", Jan 2018 – present.

Certification in progress: "Machine Learning A-Z: Hands-On R in Data Science", Feb 2018 – present.


