SUSIL KUMAR SAHU
*****.****@*****.***
Professional Summary
Over 15 years of IT experience in BI/DW, Teradata, Big Data, Talend, Automic, Informatica, Ab-Initio, Oracle, SQL Server, BO and Tableau.
Experience in analyzing, designing and implementing data warehouse (OLAP) systems using Teradata, Talend, Automic, Informatica, and other data warehouse tools.
Worked as Lead, Teradata Developer, DW Architect, Business Analyst, ETL Developer, ETL Tester and Data Analyst on data warehouses and data marts in the Retail, eCommerce, Banking, Finance, Risk, Credit Card, Healthcare and Energy & Utilities domains.
Strong working experience with customers on requirement gathering, analysis and design of tables/views in Teradata, and business user communication.
Designed data mapping documents, ETL architecture documents and specifications.
Extensive experience in designing, developing, documenting, testing and providing production support for ETL jobs and mappings using Teradata, Automic, Informatica and Talend to populate tables in data warehouses and data marts.
Experience working in multiple environments such as Production, Development and Testing.
Expertise in writing YAML scripts, and stored procedures and macros in Teradata.
Extensively worked with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad and TPump, along with YAML scripts, to export and load data to/from different source systems including flat files (a minimal macro/BTEQ sketch follows this summary).
Expertise in writing large/complex queries using SQL.
Experienced in database programming for data warehouse schemas; proficient in dimensional modeling (star schema and snowflake schema).
Expertise in all phases of the Software Development Life Cycle (SDLC) using methodologies such as Agile.
Exposure to Hadoop, Hive, Spark, HBase, Sqoop, MongoDB.
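The Teradata macro and BTEQ bullets above refer to a pattern like the minimal sketch below; the database, table and file names are hypothetical placeholders rather than objects from any client engagement.

/* Hypothetical Teradata macro: rebuild one day of a reporting aggregate. */
CREATE MACRO rpt_db.refresh_daily_sales (load_dt DATE) AS (
  DELETE FROM rpt_db.daily_sales WHERE sale_dt = :load_dt;
  INSERT INTO rpt_db.daily_sales (sale_dt, store_id, sales_amt)
  SELECT sale_dt, store_id, SUM(sale_amt)
  FROM   stg_db.sales_txn
  WHERE  sale_dt = :load_dt
  GROUP  BY sale_dt, store_id;
);

/* Hypothetical BTEQ wrapper: log on, run the macro, fail the job on error. */
.LOGON tdprod/etl_user;
EXEC rpt_db.refresh_daily_sales (DATE '2020-01-01');
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;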
Educational Qualification
M.C.A - Sambalpur University, India - 2005
P.G.D.C.A - Berhampur University, India – 1998
Technical Certification
Scrum Master Certified.
Technical Summary
ETL Tools: Automic, Informatica, Talend, Ab-Initio, IRIS
Big Data: Hadoop, Hive, Spark, HBase, Sqoop, MongoDB, Data Lake.
Languages: PL/SQL, VB, Python, R
Operating systems: Windows 7, Windows 2000/2003/NT/XP/Vista, MS-DOS, Unix
Databases: Teradata, Oracle (SQL), SQL Server
Tools: SQL Assistant, Toad, BOXI, Tableau, GitHub, IRIS, HP ALM, JIRA, Autosys
Microsoft Tools: MS Office Suite (Word, Excel, PowerPoint), Adobe Acrobat Reader.
Methodologies: Software Development Life Cycle (SDLC), Agile Methodology.
Professional Experience
Client: Walmart, CA May 2018 to date
Role: Senior Data Engineer
Project Name: Sam's Club, Global Dynamic Analytical Platform
Environment: Teradata 16, Big Data, Hive, Spark, Data Lake, Tableau, Automic, GitHub, Unix.
Walmart is a global leader in the retail industry with a formidable presence in online business. The Sam's Club eCommerce business handles both online and store data, storing order, return, refund, cancellation and ship-tracking information, and supports business users with reporting and analytics through Tableau and other analytical tools.
Roles and Responsibilities:
• Extensively worked on Teradata utilities BTEQ, FastLoad and MultiLoad to load data.
• Executed ETL loads and analytical scripts using YAML scripts and Hive queries (a sample Hive query appears after this list).
• Performed coding, performance improvement, and identification and fixing of bugs.
• As part of the Global Analytics team, provided various solutions and fixes to the business.
• Good exposure to and working experience with Tableau.
• Good understanding of order, return, refund and ship-tracking data.
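As a minimal illustration of the analytical Hive queries referenced above, the sketch below computes a 30-day return rate by order date; the table and column names (dl_orders, dl_returns, and so on) are hypothetical placeholders, not actual Data Lake objects.

-- Hypothetical Hive query: daily return rate over the last 30 days.
SELECT o.order_dt,
       COUNT(DISTINCT o.order_id) AS orders,
       COUNT(DISTINCT r.order_id) AS returned_orders,
       COUNT(DISTINCT r.order_id) / COUNT(DISTINCT o.order_id) AS return_rate
FROM   dl_orders o
LEFT   JOIN dl_returns r ON r.order_id = o.order_id
WHERE  o.order_dt >= date_sub(current_date, 30)
GROUP  BY o.order_dt
ORDER  BY o.order_dt;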
Client: Microsoft, WA Jun 2017 to Apr 2018
Role: Tech Lead / Developer
Project Name: Microsoft Dynamics 365 for Customer Insights
Environment: Azure, HBase, SQL Server 2016, Visual Studio 2017, Git, PowerShell
Customer Insights combines big data with customer analytics to understand customers' needs and identify opportunities to better target marketing resources.
Roles and Responsibilities:
• Extensively worked on stored procedures.
• Built KPI logic in Dynamics 365 for Customer Insights (a simplified stored-procedure sketch follows this list).
• Performed coding, performance improvement, identification and fixing of bugs, and configuration of continuous integration.
• Implemented continuous integration using Jenkins.
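As a simplified illustration of the KPI logic mentioned above, the SQL Server 2016 (T-SQL) sketch below computes a repeat-purchase rate; the procedure, table and column names are hypothetical and do not come from the actual Customer Insights implementation.

-- Hypothetical T-SQL procedure: share of customers with more than one order.
CREATE PROCEDURE dbo.usp_get_repeat_purchase_rate
    @from_date DATE,
    @to_date   DATE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT CAST(SUM(CASE WHEN order_cnt > 1 THEN 1 ELSE 0 END) AS DECIMAL(9, 4))
           / NULLIF(COUNT(*), 0) AS repeat_purchase_rate
    FROM (
        SELECT customer_id, COUNT(*) AS order_cnt
        FROM   dbo.fact_orders
        WHERE  order_date BETWEEN @from_date AND @to_date
        GROUP  BY customer_id
    ) AS per_customer;
END;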
Client: Oklahoma Gas & Energy, OK Sep 2015 to Apr 2017
Role: Lead Teradata / ETL Architect
Project Name: TLM
Environment: Teradata 14.10/15.0, Informatica, Talend, BIBO, Unix
The scope of this project is to develop and provide insight into the risk associated with distribution transformer load management (the "TLM Project"). The software developed in this project utilizes interval meter data to generate an on-demand report that ranks distribution transformers by the loading profile of each transformer as a percentage of transformer rating, as a proxy for risk. Using this data, the business can better understand transformer risk, replace transformers at risk, and keep optimally performing transformers in service, helping achieve a higher degree of system reliability, minimize outages, and potentially reduce OPEX and CAPEX.
Roles and Responsibilities:
•Worked as team lead and architect for the project.
•Wrote requirement design documents, prepared database design and mapping documents, and reviewed ETL design documents.
•Prepared unit test cases and unit test plans; performed performance testing, subsystem testing and UAT.
•Extensively worked on stored procedure, macro and view creation in Teradata for data loads and reporting views.
•Used Teradata utilities BTEQ, FastLoad and MultiLoad to load data.
•Worked on query tuning and performance improvement.
•Created ETL jobs using Informatica and Talend to load data into stage and target tables.
•Played a key role in defining the coincident peak logic for the transformers in the TLM project, which returns a more accurate value for each transformer (a simplified SQL sketch of this logic follows this list).
•Migrated ETL code and Teradata objects from Dev and QA to Prod environments.
•Responsible for end-to-end delivery of the project and change requests.
•Coordinated with internal and external customers as necessary on deliverables and issues.
•Responsible for guiding the offshore development team.
•Responsible for estimation, planning and execution with specific focus on requirement analysis.
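The coincident peak of a transformer is its load at the interval when the overall system load peaks, rather than its own maximum. A simplified Teradata SQL sketch of that idea is below; the database, table and column names are hypothetical placeholders, not actual TLM objects.

-- Hypothetical sketch: transformer load at the system peak interval,
-- expressed as a percentage of the transformer rating (coincident peak).
WITH system_peak AS (
  SELECT interval_ts
  FROM   tlm_db.interval_load
  GROUP  BY interval_ts
  QUALIFY ROW_NUMBER() OVER (ORDER BY SUM(load_kva) DESC) = 1
)
SELECT l.transformer_id,
       l.load_kva,
       100.0 * l.load_kva / t.rated_kva AS pct_of_rating
FROM   tlm_db.interval_load  l
JOIN   system_peak           p ON p.interval_ts    = l.interval_ts
JOIN   tlm_db.transformer    t ON t.transformer_id = l.transformer_id
ORDER  BY pct_of_rating DESC;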
Client: GE, WI May 2015 to Aug 2015
Role: Technical Lead/Teradata Architect
Project Name: GE Healthcare
Environment: Teradata 13/14.10, BOXI, Tableau
GEHC provides transformational medical technologies and services that are shaping a new age of patient care. GEHC's expertise spans medical imaging and information technologies, medical diagnostics, patient monitoring systems, performance improvement, drug discovery, and biopharmaceutical manufacturing technologies, helping clinicians around the world re-imagine new ways to predict, diagnose, inform and treat disease so that patients can live their lives to the fullest.
Roles and Responsibilities:
•Extensively worked on stored procedure, macro and view creation in Teradata for data loads and reporting views.
•Unit Testing, Integration Testing and Performance Testing of Stored procedures.
•Interacting with Business Analysts for the Table-To-Table mapping of the business flows.
•Helping the team with report query building and performance tuning (a sample tuning sketch follows this list).
•Analyzing the mappings and optimizing them.
•Define and maintain naming standards, attributes and data types.
•Creating Technical Design Documentation.
•Prepare technical specification documents for ETL and database.
•Creating Unit Test Plan Scenarios.
•Managing Team, Planning and scheduling activities.
•Business Development by identifying opportunities and proposing solutions.
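As an illustration of the report-query tuning mentioned above, one common Teradata approach is sketched below: refresh optimizer statistics on the join and filter columns, consider a secondary index, and compare EXPLAIN plans before and after. All object names here are hypothetical placeholders.

-- Hypothetical Teradata tuning steps for a slow report query.
COLLECT STATISTICS COLUMN (patient_id), COLUMN (exam_dt) ON rpt_db.fact_exam;

CREATE INDEX (exam_dt) ON rpt_db.fact_exam;   -- secondary index for date-range filters

EXPLAIN
SELECT exam_dt, COUNT(*)
FROM   rpt_db.fact_exam
WHERE  exam_dt BETWEEN DATE '2014-01-01' AND DATE '2014-12-31'
GROUP  BY exam_dt;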
Client: GE, CT Aug 2010 to Mar 2014
Role: Technical Lead/Teradata Architect
Project Name: GE Commercial Finance EDW & Decoupling
Environment: Teradata 13/14.10, Informatica, Talend, IRIS Tool, SQL Assistant, BO, Unix
GE COMFIN designed the product to provide loans to various businesses or parties and keeps track of the transactions done with each party or business. There are three EDW subject areas: Party, Agreement and Asset; the definitions and relationships of these subject areas are central to the business model. Data from all the businesses is collected to help GE Capital Risk headquarters analyze trends and provide risk strategy for new business.
Roles and Responsibilities:
•Worked as a Tech Lead, team member and architect for the above projects.
•Wrote requirement design documents, prepared database design and mapping documents, and reviewed ETL design documents.
•Prepared unit test cases and unit test plans; performed performance testing, subsystem testing and UAT.
•Extensively worked on stored procedure, macro and view creation in Teradata for data loads and reporting views (a sample view sketch follows this list).
•Used Teradata utilities BTEQ, FastLoad and MultiLoad to load data.
•Performed unit testing, integration testing and performance testing of Informatica, Talend and IRIS tool jobs and stored procedures.
•Created ETL jobs using Informatica, Talend and the IRIS tool to load data into stage and target tables.
•Worked on many performance improvements at both the query and ETL job level.
•Migration of ETL code and Teradata objects from Dev, QA to Prod environments.
•Responsible for end to end delivery of project and change requests.
•Coordinated with internal and external customers as necessary on deliverables and issues.
•Responsible for guiding the offshore development team.
•Responsible for estimation, planning and execution with specific focus on requirement analysis.
•Serve as a focal point to communicate and resolve interface and integration issues with other teams.
•Provide gap analysis on existing Data-warehouses in supporting new business areas.
•Managing Team, Planning and scheduling activities.
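As an illustration of the Teradata view work mentioned above, the sketch below joins the Party and Agreement subject areas into a simple semantic view; the database, table and column names are hypothetical placeholders, not actual GE EDW objects.

-- Hypothetical Teradata view over the Party and Agreement subject areas.
REPLACE VIEW sem_db.v_party_agreement AS
SELECT p.party_id,
       p.party_nm,
       a.agreement_id,
       a.agreement_start_dt,
       a.outstanding_amt
FROM   edw_db.party     p
JOIN   edw_db.agreement a
  ON   a.party_id = p.party_id
WHERE  a.agreement_status_cd = 'ACTIVE';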
Client: GE Money, INDIA/USA Dec 2005 to Jul 2010
Role: Technical Lead/ Module Lead/ETL Developer
Project Name: GE Money, GECF - Americas (Production Support & Maintenance)
Environment: Oracle 10g, 8i/9i, Ab-Initio (ETL), BO, Toad, Unix
GE Money works closely with the Credit Bureaus to develop Skip and Early Delinquency strategies and provide Risk Strategy support for new products. The Customer modeling analytical platform (CMAP) and Retail Sales Finance (RSF) Data Warehouse collects data from various sources to facilitate various marketing campaigns, credit score modeling, risk analysis and other decision sciences. While the marketing team uses the data to define campaigns and promotion strategies for the existing and new products to the end-customers, the Risk team uses the data to understand which demographic areas are indicating increased fraudulent transactions and thereby come up with Risk Models to mitigate the same.
Roles and Responsibilities:
•Worked on all phases of data warehouse development lifecycle, from gathering requirements to testing, implementation, and support.
•Extracted data from flat files, transformed it in accordance with the business logic specified by the client, and loaded it into the staging tables (a plain-SQL illustration of this staging pattern follows this list).
•Used the Ab-Initio tool to develop jobs for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
•Involved in performance tuning for several ETL jobs to reduce runtime in daily batch cycle.
•Worked on the query performance tuning for different jobs.
•Used several partitioning techniques in various stages to gain best performance on parallel jobs.
•Involved in migration and implementation of ETL code from Development to Test and Production environments.
•Prepared project estimate documentation for all new development.
•Created low-level design documentation of mapping/ETL specs for business functionality.
•Involved in documentation for design review, code review and production implementation.
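The actual flat-file loads above were built as Ab-Initio graphs; purely as a plain-SQL illustration of the staging pattern (with hypothetical directory, table and column names), an Oracle external table over a delimited feed file can be queried and loaded like this:

-- Hypothetical Oracle external table over a pipe-delimited feed file.
CREATE TABLE stg_customer_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  open_date     DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stg_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    (customer_id, customer_name,
     open_date CHAR(10) DATE_FORMAT DATE MASK "YYYY-MM-DD")
  )
  LOCATION ('customer_feed.dat')
);

-- Apply simple cleansing rules while loading the staging table.
INSERT INTO stg_customer (customer_id, customer_name, open_date)
SELECT customer_id, UPPER(TRIM(customer_name)), open_date
FROM   stg_customer_ext;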
Clients: Bond Technology, Home Electronix, DFID, ASD, Orisys, Pyoonidhi
Duration: Aug 2001 to Oct 2005
Role: Team Lead and Developer
Project Name: CEHR, Home Electronix, WORLP, Appliancestore, Orinet Café, IBSS
Environment: ASP.NET, VB 6, SQL Server 2000, Crystal Reports 10, HTML
Roles and Responsibilities:
•Design and development of web pages.
• Writing Technical Specifications.
•Creating user controls and writing reusable code.
•Preparing HLDD for new modules.
•Unit testing and debugging.
• Created Timebar component.
• Created DLLs.