Helen Treasa Sebastian
Professional Summary:
Overall experience of 5 years with DXC Technology (formerly Hewlett Packard Enterprise) as an ETL Developer in the Telecom IT domain. Directly involved with users in analyzing and providing business solutions. Strong believer in teamwork; builds cordial relations with people inside and outside the organization, innovates solutions for difficult programming problems, and is a quick learner and problem analyst.
Previous organization details: DXC Technology, Bangalore, India
DXC Technology is a multinational technology corporation headquartered in Palo Alto, California, United States. It provides hardware, software and services to consumers, small- and medium-sized businesses and large enterprises, including customers in the government, health and education sectors.

Software Skills:
Platforms               : Windows, Unix
Relational Databases    : Teradata, Oracle SQL, PL/SQL
ETL                     : IBM InfoSphere DataStage, Informatica PowerCenter
Scheduling Tools        : Control-M, DataStage Scheduler
Reporting Technologies  : Tableau, Power BI
Scripting               : Macros, VB Scripting
Incident Management     : BMC Remedy
Hadoop Ecosystem        : Sqoop, Hive
Programming Language(s) : Python, C#, .NET
Testing                 : HP ALM
Contact details:
adctl6@r.postjobfree.com
Strengths:
• Work experience in Production environment
• Involved in developing reports for clients
• Strong analytical skills
• Data management skills
• Strong organizational skills
• Excellent communication skills
• Team management capability
• Time management capability
• Ability to perform under pressure
Work Experience
Organization : DXC Technology (previously HPE)          August 2013 – November 2018
Client       : Telstra, Australia

Project : Enterprise Data Warehouse RCRM Express        January 2017 – November 2018
Role    : Data Engineer
Responsibilities:
Created UNIX shell scripts to run DataStage workflows and control the ETL flow.
Experienced with Teradata BTEQ scripts, FastLoad and TPT utilities.
Solid experience writing SQL and stored procedures; proficient in SQL query performance tuning.
Loaded data daily into history-maintained Express tables.
Developed ETL jobs to load data from flat files to target tables.
Created mapping documents for source-to-target (ETL) loads.
Took ownership of data accuracy.
Tuned jobs in production by analyzing query performance via Teradata Viewpoint and explain plans.
Provided intra-day transactional data on demand.
Automatically archived history data older than the retention period.
Minimized impact on existing User, Batch, DSF and ETL-Access Layer components.
Created FTP jobs in UNIX to pull/push files between systems.
Set up file-watcher jobs to trigger the batch process on arrival of a feed.
Scheduled jobs based on criticality and resource consumption.
Modeled and designed database objects in SQL Server.
Built scripts to move/purge files from the system based on business criteria.
Performed performance testing prior to production release.
Created automation tools (Excel-based VBA macros) to generate DDLs from the production environment and add columns, indexes and statistics based on project requirements.
Supported and monitored jobs in production to ensure the batch stream met the SLA.
Worked extensively on developing ETL programs supporting data extraction, transformation and loading using DataStage and Informatica.
Experienced in transferring data between the Hadoop ecosystem and structured storage in an RDBMS such as Exadata, Oracle or Teradata using Sqoop, according to client requirements.
Created tables in HiveQL to meet analytical and reporting requirements.

Project : Enterprise Data Warehouse Deliver Feeds to Downstream    December 2015 – December 2017
Role    : ETL Test Engineer
Responsibilities:
Developed test plans, test cases and test scripts for SIT, and supported UAT testing.
Used DataStage as the ETL tool for developing the data warehouse.
Managed and conducted System testing, Integration testing and Functional testing.
Used the Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into data warehouse database.
Worked extensively with DataStage (Designer, Director and Manager).
Used the DataStage Director and its run-time engine to schedule jobs, validate and debug components, and monitor the resulting executables.
Tracked defects using the ALM tool and generated defect summary reports.

Project : BIDW Non-Discretionary Program    August 2013 – November 2015
Role    : Product Support Engineer
Responsibilities:
EDW is a strategic reporting and analytical platform for systems managing customers, products and billing.
Debugged failures and fixed data quality issues.
Changed ETL jobs for new requirements.
Optimized inefficient ETL jobs/queries.
Developed SQL queries for ad-hoc tasks/user requests.
Change management: performed impact analysis and review.
Developed file transfer scripts to send files downstream on a scheduled basis.
Built a daily Control-M job to perform the daily delta load at EOD.
Ran jobs/workflows for the ETL process and prepared SQL queries to verify the data.
Verified the ETL data in the target database.
Supported and monitored jobs in production to ensure the batch stream met the SLA.
Created incidents for job failures based on criticality.

Education:
Master of Computer Science and Application
Christ University, Bangalore, India
Bachelor of Computer Science
Jyoti Nivas College, Bangalore, India
Volunteer Work:
04/01/2019 - current
Hold a volunteer position with a non-profit organization called "Salette Malayali Community", handling the community's data management and communication.
Help the community with all communications that must be sent out, ensuring everyone is aware of all events well in advance. Managed the online website to announce regular events.

Reference:
Will be provided on request.