Email: *****************@*******.***
Contact: +1-757-***-****
Sandeep Sathyan
Sr ETL Consultant (INFORMATICA/TERADATA)
Senior ETL Consultant with 9 years of IT experience, mainly in the Healthcare and Retail domains. In-depth knowledge of the various phases of IT projects, including analysis and planning, gathering business requirements and building user stories, development, testing, data modeling, data governance, and implementation, under both Agile and Waterfall methodologies.
PROFESSIONAL SUMMARY
Certified SAFe 4 (Scaled Agile Framework) Practitioner.
Expertise in Data Warehousing, Data Migration, Data Integration, Data Analysis, and Data Conversions using the ETL tool Informatica PowerCenter.
Expertise in report generation from business requirements using BTEQ scripts and SAP BO tools.
Expert in development using the Informatica client tools: Source Analyzer, Mapping Designer, Mapplet Designer, Repository Manager, and Workflow Manager.
Experience in design and development of complex mappings, Reusable Objects (Mapplets, Lookups, Transformations, Tasks and Sessions), transformations, extraction and loads.
Experience in loading data, troubleshooting, debugging mappings, and performance tuning of components (sources, targets, mappings, and sessions), fine-tuning transformations to make them more efficient.
Worked with heterogeneous source systems ranging from Databases like Teradata 16, Oracle, SQL server, XML, DB2 and flat files.
Expert-level development skills with file transfer utilities such as FTP and SFTP.
Strong Knowledge of Data Warehouse Architecture and Designing Star Schema, Snowflake Schema, FACT and Dimensional Tables, Physical and Logical Data Modeling.
Proficient in Unix scripting & Shell programs.
Hands-on experience handling huge volumes of data and tuning performance.
In-depth understanding of universe design, development, alteration, and maintenance using the Information Design Tool and Universe Design Tool (IDT & UDT).
Worked extensively with Web intelligence functions, formulas, and variables.
Used functions such as filters, conditions, breaks, sorting, drill down/slice and dice, input controls, master/detail, cross tabs, charts, and alerts.
Used JIRA dashboards extensively to track issues and communicate with teams.
Successfully worked in global delivery models with offshore and onsite teams.
Understanding of the Apache Kafka messaging service.
Worked on Airflow to create ELT flows.
Knowledge of container services using Docker.
Technical Skills:
ETL Tools
Informatica PowerCenter 9.x/10.2, Airflow
Database
Teradata 16, Oracle 9i/10g/11g, MS SQL Server 11, DB2
Scripting Language
SQL, UNIX shell scripting, BTEQ, FastLoad & MultiLoad
Data Modeling
CA Erwin
Reporting tool
SAP BusinessObjects XI R3
Scheduling Tools
Control-M (CTM), WLM
Messaging Tool
Kafka
Others
JIRA, SNOW, HPSM, MS Office suite including MS Visio, APM
Client: Anthem Period: Nov 2015 - Present
Position: Senior ETL Developer Location: Norfolk, Virginia
Responsibilities:
Worked closely with Business analysts and Data architects to understand and analyze the user requirements.
Involved in building the ETL architecture for Data Migration.
Worked on Data integration between Mainframe systems and Teradata using INFORMATICA PowerCenter.
Developed complex mappings and implemented SCD Type II and Type III logic to load data from various sources.
Implemented data manipulations using various transformations such as Aggregator, Filter, Normalizer, Sequence Generator, and Expression.
Wrote BTEQ scripts to transform data.
Used the Teradata utilities FastLoad, MultiLoad, and BTEQ to load data.
Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
Developed Mappings, Sessions, Workflows and Shell Scripts to extract, validate, and transform data according to the business rules.
Extensively used the change data capture (CDC) concept in Informatica to capture in-stream data changes.
Worked on dimension and fact tables; developed mappings and loaded data into the relational database.
Developed logical and physical data models that capture current/future state data elements.
Generated UNIX scripts for automatic daily load processes.
Generated completion messages and status reports using workflow manager.
Involved in preparing ETL specifications and unit test plans for the mappings.
Extensively worked in the performance tuning of the ETL process.
Successfully migrated the mappings to QA and production environment.
Migrated the existing system to load and send data through the Kafka messaging service.
Worked on container-based application development using Docker.
Created a POC to load data using Airflow.
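The SCD Type II loads above can be sketched in plain Python. This is a hypothetical illustration of the technique, not the actual Informatica mapping (which used Lookup and Update Strategy transformations); the field names are invented for the example.

```python
from datetime import date

# Minimal SCD Type II sketch: each dimension row carries effective
# dates and a current-row flag; a changed attribute closes out the
# old row and opens a new one. (Illustrative only; the production
# logic lived in Informatica mappings.)

def apply_scd2(dimension, incoming, today):
    """dimension: list of dicts with keys id, attrs, start, end, current."""
    current = {r["id"]: r for r in dimension if r["current"]}
    for rec in incoming:
        row = current.get(rec["id"])
        if row is None:                      # brand-new key: insert
            dimension.append({"id": rec["id"], "attrs": rec["attrs"],
                              "start": today, "end": None, "current": True})
        elif row["attrs"] != rec["attrs"]:   # changed: expire old, insert new
            row["end"], row["current"] = today, False
            dimension.append({"id": rec["id"], "attrs": rec["attrs"],
                              "start": today, "end": None, "current": True})
        # unchanged records are left alone
    return dimension
```

Running two loads where the second changes an attribute leaves two rows for that key: the expired original and a new current row.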
Tools: Informatica PowerCenter 10.2, Teradata v16, MS SQL Server, DB2/IMS, UNIX, JIRA, Business Objects, Control-M (CTM)
Client: Med Pro Period: May 2015 - Oct 2015
Position: ETL Developer Location: Offshore (India)
Responsibilities:
Analyzed functional specifications provided by the architect and created technical specifications document.
Ensured architectural adherence through blueprint and design reviews.
Designed and implemented the complex project with high quality.
Developed logical and physical data models, including DB2 look-alike physical model structures that capture incremental and history data.
Performed data manipulations using various Informatica Transformations like Joiner, Sorter, Router, Filter, Expression, Aggregator, Union, Transaction Control, Update Strategy & Look Up.
Extensively used the change data capture (CDC) concept in Informatica to capture changes in the data warehouse.
Worked on dimension as well as fact tables; developed mappings and loaded data into the relational database.
Provided support for bug fixes and reported data issues.
Performed unit testing to validate that data is mapped correctly, providing a qualitative check that the overall data flows through and lands correctly in the target tables.
Applied different optimization techniques to enhance performance at various levels, including SQL tuning and source-, target-, and transformation-level optimization.
Extended support for SIT & UAT
As part of release management, participated in migrating components from the development environment through to PROD.
Built logical and physical data models using Erwin.
Created WLM/CTM jobs to automate runs and designed the required recovery and ad-hoc components.
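The change-data-capture work above can be illustrated with a small, self-contained Python sketch. This is hypothetical (the project used Informatica's CDC features, not hand-rolled code): row hashes from the previous load are compared against the new extract to classify inserts, updates, and unchanged rows.

```python
import hashlib

# Hash-comparison CDC sketch (illustrative only). Each source row is
# hashed over its sorted columns; comparing against the hashes saved
# from the previous load classifies each row.

def row_hash(row):
    """Stable hash of a row's values, keyed by sorted column name."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode()).hexdigest()

def classify_changes(previous_hashes, rows, key="id"):
    """previous_hashes: {key_value: hash} captured on the last load."""
    inserts, updates, unchanged = [], [], []
    for row in rows:
        old = previous_hashes.get(row[key])
        if old is None:
            inserts.append(row)              # key never seen before
        elif old != row_hash(row):
            updates.append(row)              # key seen, content changed
        else:
            unchanged.append(row)            # identical to last load
    return inserts, updates, unchanged
```

The same comparison can be pushed down into SQL (hash columns in staging tables) when volumes are large.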
Tools: Informatica PowerCenter 9.6, Oracle 10g, UNIX, SQL Developer, Erwin, WLM/CTM
Client: H.E.B Period: Jan 2012 - Apr 2015
Position: ETL Developer Location: Offshore (India)
Responsibilities:
Worked closely with the business team to finalize the validation rules that needed to be applied outside the Informatica ETLs.
Participated in meetings with business analysts to understand the user requirements and assisted in the design of BRS.
Created technical specification document for each workflow in exhaustive detail listing all technical and business validation rules, look-up tables and error messages.
Reviewed the design with project tech lead and test team.
Designed mappings using Informatica PowerCenter Designer.
Created mappings with transformations, sessions, and workflows.
Created various transformations such as Expression, Lookup, Joiner, Router, Filter, Aggregator and Sequence Generators.
Created and used reusable transformations when required.
Extensively used mapping parameters, mapping variables and parameter files.
Configured and ran the Debugger from within the Mapping Designer to troubleshoot predefined mappings.
Logged the errors during execution in Error tables.
Created backup tables to store the history of all the transactions performed on the given data.
Worked with Informatica administration and integration teams to migrate code to higher environments without any glitches
Created training guides and run books for production support team
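The mapping parameters and parameter files mentioned above follow an INI-like layout: a section header per workflow/session, then `$$NAME=value` lines. A minimal reader can be sketched in Python; the folder, workflow, and parameter names below are generic illustrations, not an actual project file.

```python
# Minimal reader for an Informatica-style parameter file (illustrative;
# section headers name a folder/workflow/session, lines below them are
# $$NAME=value parameter assignments).

def parse_param_file(text):
    """Return {section: {param: value}} from INI-like parameter text."""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                          # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]              # e.g. [Folder.WF:wf.ST:s]
            params.setdefault(section, {})
        elif "=" in line and section is not None:
            name, _, value = line.partition("=")
            params[section][name.strip()] = value.strip()
    return params
```

Keeping values like source directories and load dates in such files lets the same mapping run unchanged across environments.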
Tools: Informatica PowerCenter 8.6, Oracle 8i, DB2, Erwin, UNIX shell scripting.
Education:
Master of Computer Applications, IGNOU, India