Senthil Kumar Alwarsamy
Professional Summary:
** ***** ** ** ********** with extensive Enterprise Data Architecture experience across all phases of the SDLC, Agile Scrum methodology (Certified Scrum Master), Agile Lean Kanban methodology (certified), and AWS (AWS Certified Solutions Architect – Associate).
Strong expertise in Data Modeling (certified), Data Profiling, Data Analysis, Data Migration using ETL tools, and Data Architecture solutions. Strong hands-on experience with data governance processes and data regulations such as GDPR.
Designed and managed conceptual, logical, and physical data models and data dictionaries, and generated DDLs for deploying Oracle databases. Proficient in implementing data models for large-scale projects.
Strong expertise in data profiling techniques such as column profiling, address profiling, dependency profiling, redundancy profiling, and uniqueness profiling.
Experience in migrating Oracle databases to AWS RDS and designing data integration architecture, with hands-on experience in ERwin r9.6, DB2, Oracle 12c, SAP BusinessObjects Data Services, Informatica PowerCenter, SAP Information Steward, IDMS, and Global IDs.
Strong RDBMS fundamentals and experience writing complex SQL queries for high-volume report generation in investment banking applications using Oracle 11g and PL/SQL, along with data regulations and compliance processes.
Developed enterprise data lineage metadata management from scratch using a graph database model in Neo4j, and defined the business process to keep this data in sync with project releases.
Hands-on experience in migrating RDBMS data to NoSQL databases: Oracle 10g (XML documents stored as BLOBs) to MongoDB, and Oracle 9i databases to Neo4j.
Trained in Big Data technologies such as Hadoop 2.0 and MapReduce, and in NoSQL technologies such as MongoDB, Neo4j, RDF, and SPARQL.
Knowledge of transforming legacy source files, raw input extracts, and various database stores into a centralized AWS data lake.
Strong expertise in analyzing business requirements, interpreting UML representations such as use case, sequence, and component diagrams, translating them into data requirements, and developing conceptual, logical, and physical data models.
Experience in defining complex validation and data cleansing logic in the SAP Information Steward tool.
Experience in creating and managing metadata about source files and data processing.
Experience in Query Performance Tuning, PL/SQL procedure creation, Source System Analysis (System and Data), Data Mapping.
Strong hands-on experience in defining data integration strategy, with fluency in AWS cloud features.
Education
Degree and Date
Institute
Major and Specialization
Master of Science (Software Engineering, 5-year integrated) – 2005
Coimbatore Institute of Technology, Coimbatore, Tamil Nadu, India
Software Engineering
Employment Summary with Duration and Designation:
Name of the Employer
From
To
Duration
Designation
Cognizant Technology Solutions
07/01/2015
Till date
2 years 1 month
Architect - Projects
Cognizant Technology Solutions
07/03/2011
06/30/2015
4 years
Senior Associate - Projects
Cognizant Technology Solutions
08/08/2005
07/02/2011
5 years 11 months
Associate
Intel Technologies, Bangalore, India
10/15/2004
04/30/2005
6 months
Technology Intern
Malar Electronics, Chennai, India
05/12/2003
11/30/2003
6 months
Technology Intern
OVERALL TECHNICAL SUMMARY:
Software Methodologies
Waterfall model, Agile SCRUM, Lean KANBAN
Data Profiler Tools
SAP Information Steward, Global Ids
Languages/Frameworks Known
C, SAP BusinessObjects Data Services (SAP BODS), PL/SQL, XML, SQL, Easytrieve, Big Data, Core Java, Python, Mainframe, Unix
Databases
Oracle, DB2, Integrated Database Management System (IDMS), AWS RDS, AWS DynamoDB
Database Tools
ERwin 8.2.08, ERwin 9.6, SQL Developer 3.2, Stored Procedure Builder
Utilities
MS Office, DFSORT, VSAM
Application Servers
Apache, Tomcat
Version Control and Config tools
Visual SourceSafe, Endevor, SharePoint
Domain Knowledge
Information Management
CERTIFICATIONS:
AWS Certified Solutions Architect – Associate
Scrum Alliance Certified Scrum Master
Team Kanban Practitioner
Cognizant Certified Data Modeling – L1
Cognizant Certified IDMS Database Developer
IBM DB2 UDB 700
IBM DB2 UDB 703
Cognizant Certified DB2 UDB V8.1 Developer
IBM System p Administrator
Cognizant Certified SAS Developer
A Snapshot of Significant Projects
Project Name
Client Name
Role
GDR-DDR-GLR Re-platforming
Dun and Bradstreet, USA
Data Architect
Intelligence Engine – Rules Automation, Globalization
Dun and Bradstreet, USA
Data Architect, Data Modeler
Linkage Discovery Engine
Dun and Bradstreet, USA
Data Modeler
Data Supply Chain Inventory Management (DSC)
Dun and Bradstreet, USA
Data Modeler
Maximize Customer Value (MAXCV) by Data Supply Chain – R1, R2
Dun and Bradstreet, USA
Data Analyst, Data Modeler
Global Input Handler
Dun and Bradstreet, USA
Data Analyst, Data Modeler
Source Profile Data Management
Dun and Bradstreet, UK
Data Modeler, Development Lead
Process control and Operational Efficiency Data Management
Dun and Bradstreet, UK
Data Modeler
Small Business Risk Insight Data Management
Dun and Bradstreet, USA
Mainframe Developer / Data Analyst
Migration of Auto-Route to Common Data Transfer Services
Depository Trust and Clearing Corporation, USA
Technical Lead
Meta Information Storage Strategic Program
BNY Mellon, USA
Technical Lead
European Database Platform Re-engineering
Dun and Bradstreet, USA
Data Analyst, Data Modeler
Global Monitoring Solution Re-Platforming
Dun and Bradstreet, UK
Data Analyst, Data Modeler
Portugal Migration for Europe Office System
Dun and Bradstreet, UK
Mainframe Developer / Data Analyst
E-PORTFOLIO creation for Norway, Sweden, Denmark, Finland (NORDICS)
Dun and Bradstreet, UK
Mainframe Developer / Data Analyst
Project Details
Data Architect - Cognizant Technology Solutions.
Project
GDR-DDR-GLR Re-platforming,
Data Supply Chain Re-Engineering,
DACoE GSRL Transformation Service
Client
Dun and Bradstreet, USA
Period
Jan 2016 – Present
Description
This project migrates the GDR-DDR-GLR systems from HP-UX to the Linux platform with upgraded Java (1.4 to 1.7), MQ (5 to 8), and Oracle (9i to 12c) versions. The GDR data flow architecture was migrated from MQ transactional processing to batch file processing, and the Oracle databases were migrated from a single-node to a two-node architecture. As part of this project, a proof of concept (PoC) was implemented successfully to confirm that the GDR system works with the upgraded software and hardware architecture.
Role
Data Architect
Contribution
Worked with the Solution Architect, business, and DBA teams to consolidate the database hardware configuration requirements.
Documented the data integration strategy.
Conducted a feasibility study on migrating the databases to AWS.
Analyzed a data lake implementation storing GSRLs as S3 objects.
Established a hybrid database environment (on-premises and AWS); migrated databases using the AWS Database Migration Service.
Reverse-engineered existing databases and created data models.
Analyzed AWR reports and OEM metrics for database performance and created indexes for performance improvement (an illustrative SQL sketch follows this list).
Unit tested the Java modules, Unix shell scripts, and PL/SQL procedures on the upgraded platform.
Proposed the design for the MQ Loader application and worked with the Solution Architect team for review and sign-off.
GDR system performance improved by more than 300% after the re-platforming, and GLR system performance improved by more than 200%.
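A minimal illustrative sketch of the index tuning mentioned above, assuming hypothetical table and column names (these are not taken from the project):

-- Hypothetical example: an AWR report showed a frequent lookup scanning a large table,
-- so a composite index is added on the filter columns and the plan is re-checked.
CREATE INDEX idx_trade_event_duns_dt
    ON trade_event (duns_number, event_date)
    ONLINE;

EXPLAIN PLAN FOR
SELECT event_id, event_status
  FROM trade_event
 WHERE duns_number = :duns
   AND event_date >= TRUNC(SYSDATE) - 7;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);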
Team Size
6
Data Architect, Data Modeler - Cognizant Technology Solutions.
Project
Intelligence Engine – Globalization, Rules Automation
Client
Dun and Bradstreet, USA
Period
June 2015 – Dec 2015
Description
This project extends the Intelligence Engine (IE) system to new markets such as Mexico and India. IE rule processing was manually coded in PL/SQL programs, so even minor rule changes required code changes. The Data Architecture team proposed a new rules database to store rule configurations and automate the rules, so that rule configuration changes can be made in the database at any time and the system's rules modified without any code change.
Role
Data Architect, Data Modeler
Contribution
Worked with the business team to understand the system's functional requirements and interpret the data requirements.
Delivered mapping specifications with accurate details for the development team.
Delivered the data model for rule automation.
Reviewed and signed off changes to the existing logical and physical data models for deployment.
Reviewed and created multiple data mapping specifications for the ETL team working on the Oracle database.
Used ERwin to create logical and physical data models for the Integration and Data Mart layers.
Reviewed the data dictionaries created for the logical and physical data models.
Mapped data between business requirements and source-system data.
Identified gaps in the model by mapping existing system models to the new model's attributes.
Analyzed legacy source data volumetrics and performance.
Team Size
12
Data Architect, Data Modeler - Cognizant Technology Solutions.
Project
Linkage Discovery Engine
Client
Dun and Bradstreet, USA
Period
Dec 2013 – Jun 2015
Description
This project builds a system that automates the Discovery, Curation, and Synthesis processes involved in accurately identifying business-to-business relationships. The system interacts with multiple systems to access business data and orchestrate the linkage creation process. Linkage data is truly global and is represented as family trees. Linkage data discovery is complex, as related businesses need to be linked, verified, and aggregated.
Role
Data Architect, Data Modeler
Contribution
Worked with the business team to understand the system's functional requirements, interpreted the UML diagrams shared by the System Architects, and identified the data requirements.
Created a data management strategy based on the data requirements and system architecture.
Delivered mapping specifications with accurate details for the development team.
Reviewed and signed off changes to the existing logical and physical data models for deployment.
Reviewed and created multiple data mapping specifications for the ETL team working on the Oracle database.
Used ERwin to create logical and physical data models for the Integration and Data Mart layers.
Reviewed the data dictionaries created for the logical and physical data models.
Defined the XML control messages for Message Queue communication across systems.
Mapped data between business requirements and source-system data.
Identified gaps in the model by mapping existing system models to the new model's attributes.
Analyzed legacy source data volumetrics and performance.
Team Size
18
Sr. Data Analyst, Data Modeler - Cognizant Technology Solutions.
Project
Data Supply Chain (DSC) Inventory Management
Client
Dun and Bradstreet, USA
Period
Oct 2012 – Dec 2013
Description
Dun & Bradstreet (D&B) is an information services provider whose customers can obtain all types of information about a company. Data Supply Chain is a multiyear program initiated by D&B to modernize its processes for collecting data from raw data providers, processing the data within D&B data repositories, and delivering the data in D&B products. This project identifies new raw data sources based on data quality and implements business rules to cleanse and transform data per D&B standards. It involved analysis of legacy data stores in IDMS and DB2, with the objective of replacing those legacy data stores with an Oracle database. The processed data is exposed to the services layer and delivered in D&B business report products.
Role
Sr. Data Analyst, Data Modeler
Contribution
Managed a 5-member offshore Data Analyst (DA) team and ensured all DA deliverables met the client's data standards.
Performed in-depth data analysis using the data profiling tool SAP Information Steward to assess the worthiness of raw data sources.
Applied different data profiling techniques and assessed data volumetrics (an illustrative SQL sketch follows this list).
Reviewed UML structural and behavioral diagrams from system architects and defined the scope of data management.
Worked with the Solution Architecture team to corroborate data requirements and non-functional system requirements based on data volumetrics.
Created a data management strategy based on the data requirements and system architecture.
Reviewed DDLs created by data analysts to ensure Data Analysis and Data Architecture standards were followed.
Used ERwin to create logical and physical data models for the Integration and Data Mart layers.
Defined the file layouts and XML control messages for Message Queue communication across systems.
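A minimal illustrative sketch of the kind of column, uniqueness, and redundancy checks those profiling tasks cover, expressed in plain SQL (the staging table and column names are hypothetical; the actual profiling was done in SAP Information Steward):

-- Column and uniqueness profiling: completeness, cardinality, and length statistics.
SELECT COUNT(*)                      AS total_rows,
       COUNT(business_name)          AS non_null_names,
       COUNT(DISTINCT business_name) AS distinct_names,
       ROUND(100 * (COUNT(*) - COUNT(business_name)) / COUNT(*), 2) AS pct_null,
       MIN(LENGTH(business_name))    AS min_len,
       MAX(LENGTH(business_name))    AS max_len
  FROM stg_raw_business;

-- Redundancy profiling: candidate duplicates on the natural key.
SELECT duns_number, COUNT(*) AS occurrences
  FROM stg_raw_business
 GROUP BY duns_number
HAVING COUNT(*) > 1;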
Team Size
12
Sr. Data Analyst, Data Modeler - Cognizant Technology Solutions.
Project
Maximize Customer Value by Data Supply Chain Seeding (MaxCV DSC Seeding)
Client
Dun and Bradstreet, USA
Period
May 2012 – Sep 2012
Description
The MaxCV DSC Seeding project transforms the Data Supply Chain from the legacy data store to new and enhanced data stores. Seeding populates the new topical repositories with existing legacy data, providing the new Data Supply Chain with the initial data needed for further training, testing, and migration. All legacy data carries information source codes to denote that it is legacy data; these legacy information source codes are the last source available to provide data for the best-view / Product Ready data mart.
Role
Sr. Data Analyst, Data Modeler
Contribution
Worked with the legacy system CoE team to collect data requirements and non-functional system requirements based on data volumetrics.
Created a data management strategy based on data profile analysis, data requirements, and system architecture.
Used ERwin to create conceptual, logical, and physical data models for the staging tables.
Applied different data profiling techniques and assessed data volumetrics and data types.
Managed a 3-member Data Analysis / Data Architecture (DA) team and ensured all DA deliverables met the client's data standards.
Worked in Agile methodology and defined a strategy to deliver DA artifacts for each iteration release.
Team Size
20+
Data Analyst, Data Modeler - Cognizant Technology Solutions.
Project
Global Input Handler
Client
Dun and Bradstreet, USA
Period
Jul 2011 – May 2012
Description
This project builds a system called the Global Input Handler (GIH), which enables D&B to process all data sources (online or batch) through a common, flexible technology infrastructure, providing a single source of truth to its customers. It processes all inbound data collected by D&B and loads the inventory data models. The primary output of the GIH is pre-processed, standardized data, available for consumption by the subsequent topical repositories.
The approach for this solution is to construct an environment where the business owner can define a source profile, select the applicable business rules, and specify the steps to follow (workflow) to process a source record group. Input handling consists of four logical components:
a. Source On-boarding
b. Pre-processing
c. Identify Record
d. Routing
Role
Data Analyst, Data Modeler
Contribution
Managed a 5-member Data Analysis / Data Architecture team and implemented the inventory data models.
Designed flexible, parameter-driven configuration tables to accommodate changes in business rules and thresholds (an illustrative sketch follows this list).
Delivered file layout specifications, mapping specifications, and XML schemas.
Created metadata for the source files and profiled data for all sources.
Defined workflows and data flows for some of the critical loading components coded in SAP BODS.
Defined data validation rules in SAP Information Steward.
Identified and documented the data mapping and complex transformation logic for the ETL team.
Built an integrated rules repository where rules can be easily created, tested, and debugged.
Defined interfaces to other components and applications (Match, Cleanse and Standardization, staging databases).
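A minimal illustrative sketch of a parameter-driven rule configuration table, assuming hypothetical table, column, and value names (the actual GIH model was considerably richer): rules and thresholds live as rows, so a change becomes an UPDATE rather than a code release.

-- Hypothetical rule configuration table.
CREATE TABLE gih_rule_config (
    rule_id          NUMBER         PRIMARY KEY,
    source_profile   VARCHAR2(50)   NOT NULL,    -- source profile the rule applies to
    rule_name        VARCHAR2(100)  NOT NULL,
    rule_expression  VARCHAR2(4000),              -- validation / routing predicate
    threshold_value  NUMBER,                      -- e.g., minimum match confidence
    active_flag      CHAR(1)        DEFAULT 'Y' CHECK (active_flag IN ('Y', 'N')),
    effective_date   DATE           DEFAULT SYSDATE
);

-- The business owner raises a threshold with no code change.
UPDATE gih_rule_config
   SET threshold_value = 0.85
 WHERE source_profile = 'MX_BATCH'
   AND rule_name = 'NAME_MATCH_CONFIDENCE';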
Team Size
10+
Data Modeler - Cognizant Technology Solutions.
Project
Source Profile Data Management
Client
Dun and Bradstreet, USA
Period
Nov 2009 – Jun 2010
Description
This project creates a centralized database to manage metadata about the source files delivered to the Input Handler. D&B receives hundreds of different source files across markets and topics. The Input Handler for each market processes these source files and loads the topic data mart. With a centralized database, the source files can be classified and processed more efficiently.
Role
Data Modeler
Contribution
Analyzed the source files' metadata, the existing data models used by Input Handlers, file layouts, and system interfaces.
Created conceptual, logical, and physical data models for source profile data management.
Created mapping specifications from the user interface fields to the newly created table columns.
Delivered the logical data flow for data entered in the UI.
Delivered data dictionaries for the logical and physical data models.
Team Size
4
Data Modeler - Cognizant Technology Solutions.
Project
Process control and Operational Efficiency Data Management
Client
Dun and Bradstreet, USA
Period
Jan 2009 – Oct 2009
Description
This project enhances the performance of the file input handlers by designing a centralized database for managing the input handler process. The database controls record processing based on the relative priority of the records, batches, or profile instances. It also records critical audit information (e.g., Start Process Timestamp, End Process Timestamp, Transaction Status, Transaction Result) for each loading process from the staging repository to the respective topical repository through Oracle Data Integrator (ODI).
Role
Data Modeler
Contribution
Analyzed the process details in each input handler, the existing data models used by Input Handlers, file layouts, and system interfaces.
Created conceptual, logical, and physical data models for the process-control database (a simplified sketch follows this list).
Created mapping specifications for the new table columns that input handlers must log while processing a source file.
Delivered data dictionaries for the logical and physical data models.
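A simplified, hypothetical sketch of a process-control audit table along the lines described above (column names are illustrative; the delivered model was more detailed):

CREATE TABLE load_process_audit (
    process_run_id     NUMBER        PRIMARY KEY,
    profile_instance   VARCHAR2(50)  NOT NULL,   -- source profile / batch being processed
    relative_priority  NUMBER(3)     DEFAULT 5,  -- drives the order in which records are picked up
    start_process_ts   TIMESTAMP,
    end_process_ts     TIMESTAMP,
    transaction_status VARCHAR2(20),             -- e.g., QUEUED, RUNNING, COMPLETE, FAILED
    transaction_result VARCHAR2(200)             -- e.g., row counts or an error summary
);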
Team Size
2
Developer / Data Analyst, Cognizant Technology Solutions
Project
Small Business Risk Insight Data Management (SBRI)
Client
Dun and Bradstreet, USA
Period
Jun 2008 – Dec 2008
Description
This project takes a data feed from the mainframe system and maintains it in an Oracle data mart. Small Business Risk Insight (SBRI) is a mainframe system that calculates risk scores for businesses. The scope of the project is to identify the elements that need to be maintained in the ODS and to create the database for data management.
Role
Developer / Data Analyst
Contribution
Analyzed the data usage of elements in the mainframe DB2 database, the existing data models used by SBRI, file layouts, and system interfaces.
Monitored the production system and implemented mainframe solutions for the production issues.
Team Size
5
Technical Lead, Cognizant Technology Solutions.
Project
Migration of Auto-Route to Common Data Transfer Services
Client
Depository Trust and Clearing Corporation, USA
Period
Jan 2008 – May 2008
Description
The goal of this project is to migrate the AutoRoute legacy system by incorporating all of its functionality into the CDTS (Common Data Transfer Services) platform, as stated in the 2008 Stakeholder/Financial Corporate Goals. The AutoRoute system was insourced from SIAC and was designed to provide a standardized way of distributing files and reports from NSCC, NYSE, AMEX, MBSCC, GSCC, and SIAC applications. The existing file processing, monitoring, setup, and maintenance functionality is being replaced by CDTS functionality. All AutoRoute metadata is converted to Subscription, Profile, Destination, and Node data, and the source for all reports generated for AutoRoute is being modified.
Role
Technical Lead
Contribution
Requirements analysis
Design document creation
Data migration strategy creation
Weekly status report creation
Coordination with QA testing teams
2-member team management
Team Size
3
Technical Lead, Cognizant Technology Solutions.
Project
Meta Information Storage Strategic Program
Client
Depository Trust and Clearing Corporation, USA
Period
Jul 2007 – Jan 2008
Description
This project involves maintenance of the CICS online transactions used to maintain the Tax DB2 tables. The project follows the onsite-offshore model, meaning enhancements requested by clients are worked on from offshore as well as onsite. The project includes analysis of, and solution proposals for, service requests raised by business users. A service request is essentially a change or enhancement driven by the business or another application.
Role
Technical Lead
Contribution
Service request analysis
Design document creation
Weekly status report creation
Coordination with QA testing teams
3-member team management
Team Size
4
Data Modeler - Cognizant Technology Solutions.
Project
European Database Platform Re-engineering
Client
Dun and Bradstreet, UK
Period
Feb 2007 – Jun 2007
Description
The European Database Platform Re-engineering (EDPR) program migrates systems from the mainframe environment to an ODS. The scope of the project is to migrate the IDMS and DB2 databases on the mainframe to Oracle technology on the Unix platform. The Public Notices (PNDB) and European Statements (ESDB) databases use IDMS technology and contain event and financial information, while the Granular Database (GRDB), in DB2, contains all the in-depth financial information about European companies. The objective is to move the daily bulk input file processing from the mainframe to ETL in the data warehouse without impacting the other systems on the mainframe. The data in the mainframe databases is unloaded, transformed per the proposed database model, and loaded into the Oracle ODS.
Role
Data Modeler
Contribution
Created conceptual, logical, and physical data models.
Analyzed the data usage of elements in the mainframe DB2 database, the existing data models, file layouts, and system interfaces.
Created a rules document to consolidate the validation and data transformation rules.
Created mapping specifications from the legacy IDMS and DB2 source elements to the target ODS columns.
Delivered data dictionaries for the logical and physical data models.
Team Size
6
Data Modeler - Cognizant Technology Solutions.
Project
Global Monitoring Solution Re-Platforming
Client
Dun and Bradstreet, UK
Period
Dec 2006 – Feb 2007
Description
Dun & Bradstreet's (D&B's) customers can monitor a business by registering in DBAI (the online portal for customers). The Global Monitoring System (GMS) is a database system (DB2) in which the registrations to monitor businesses are maintained. The database contains details about the customers, the businesses, and the registrations. The system has COBOL-DB2 stored procedures and mainframe batch applications; the stored procedures add and update registration, customer, and business details. To improve performance, the DB2 database was replaced with Oracle using a different data model, and the mainframe batch applications were modified to accommodate the database change.
Role
Data Modeler
Contribution
Created a rules document consolidating the validation and data transformation rules after analyzing the mainframe DB2-COBOL stored procedures.
Created mapping specifications from the legacy DB2 source elements to the target ODS columns.
Created conceptual, logical, and physical data models.
Analyzed the data usage of elements in the mainframe DB2 database, file layouts, and system interfaces.
Delivered data dictionaries for the logical and physical data models.
Team Size
8
Developer / Data Analyst, Cognizant Technology Solutions
Project
Portugal Transition
Client
Dun and Bradstreet, UK
Period
Feb 2006 – Oct 2006
Description
This project migrates all Portugal business information from the EOS databases (IDMS) to the Oracle Data Store. The impacts of the data removal were identified on batch applications as well as online transactions. Batch applications dedicated to Portugal data processing were put into pending status, and applications that process Portugal data along with data for other EOS countries were modified to suit the business requirement. The impacted online transactions were identified, and the respective Informa teams (D&B's partner) were informed about their usage.
Position
Developer / Data Analyst
Contribution
Created mapping specifications to map the legacy source data elements to the newly defined Oracle Data Store columns.
Analyzed legacy mainframe data.
Wrote a C program to parse large, unstructured data chunks and convert the data into delimited, structured elements.
Uploaded the data into the warehouse through an automated process.
Automated the FTP flows using mainframe tools.
Team Size
3
Developer / Data Analyst, Cognizant Technology Solutions
Project
E-PORTFOLIO creation for Norway, Sweden, Denmark, Finland (NORDICS)
Client
Dun and Bradstreet, UK
Period
Sep 2005 – Jan 2006
Description
This project retrieves credit recommendation data from the E-PORTFOLIO (EPR) database and sends it with the existing European Office System (EOS) extract data feed to other systems. This helped D&B eliminate the manual work caused by data inconsistency in credit control.
Position
Developer / Data Analyst
Contribution
Performed data cleaning, extraction, exploration, and analysis.
Uploaded the data into the warehouse through an automated process.
Created PL/SQL procedures to automate some of the data analysis, such as name match and address match (an illustrative sketch follows this list).
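A minimal illustrative sketch of such a name-match procedure, assuming hypothetical table and column names (the actual procedures covered more cases, including address matching):

-- Flag candidate pairs whose normalised business names agree exactly;
-- fuzzier comparisons can be layered on top of this.
CREATE OR REPLACE PROCEDURE flag_name_matches AS
BEGIN
    UPDATE epr_eos_candidate c
       SET c.name_match_flag = 'Y'
     WHERE UPPER(TRIM(c.epr_business_name)) = UPPER(TRIM(c.eos_business_name));
    COMMIT;
END flag_name_matches;
/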
Team Size
4