MITHILA JOOTTU THIAGARAJAN
**** ********** **, **********, **
Data Engineer Lead
****.*****@*****.***
EXPERIENCE SUMMARY
Seasoned Data Engineer and Data Integration professional with demonstrated success in developing and implementing solutions for Fortune companies like Prime Therapeutics, Tricentis, Workday, Franklin Templeton, Wells Fargo, Deutsche Bank, Dean Foods, American Honda Motors, Charles Schwab, and UBS through leading IT companies. Possesses a diversified technical background with 14 years of experience in DWH, Data Lake, and ETL architectures.
Has extensive financial domain experience and has worked on SnapLogic (4 years), Informatica PC & Cloud (11 years), OCI (1.5 years), Python (2 years), Erwin (3 years), Tableau (2 years), AWS (2 years), PL/SQL and SQL (10 years), Teradata (1 year), PySpark (2 years), and Snowflake (1.6 years).
OBJECTIVE
Seeking a challenging career as Senior Data Engineer / Integrator
ACADEMIC CREDENTIALS
MCA: Master of Computer Applications from Madurai Kamaraj University - 2001 with Distinction
B.Sc.: Bachelor of Science in Computer Science from Madurai Kamaraj University - 1998 with Distinction
CERTIFICATION
●SnapLogic Architect Certification from SnapLogic
●Hands on Informatica Intelligent Cloud Services (IICS)
●Informatica PowerCenter developer certification
●Informatica Data Quality specialist certification
●Snowflake Cloud Data Warehouse: Hands-on SQL training
●Python for Data Analysis and Visualization
●Cloud Computing Solutions Professional
●Fundamentals of the Databricks Lakehouse accreditation
●AWS Cloud Practitioner Essentials, AWS Managed services console walkthrough – AWS training and certification
●dbt Fundamentals certification
SKILLS
SnapLogic, Informatica 10, Cognos 8, Erwin 9+, Data Modeling, SQL, PL/SQL, Autosys, SQL Server Agent, Teradata TPT, T-SQL, MDM Concepts, Python, Informatica TDM, AWS (Redshift, S3, API Gateway), TIBCO CIS, Snowflake, Postman, Unix Scripting, DWH Architecture, ETL Programming, DB2, Java, Oracle, XML Programming, Teradata BTEQ Scripting, Tableau, SCD, Informatica IDQ, Talend, Star Schema, SoapUI, SaaS, Agile Integration, OCI
STRENGTHS AND ACHIEVEMENTS
●Certified SnapLogic Architect from SnapLogic
●Expertise in building self-healing pipelines (a minimal retry sketch follows this list)
●Proficient in Web Services, API building, XML, and JSON data handling using SOAP and REST
●Experienced in working with structured and semi-structured data
●Expertise in developing and maintaining the end-to-end ETL lifecycle with Informatica PowerCenter components: mappings, mapplets, simple and complex transformations, sessions, worklets, and workflows for data loads
●Expertise in Data Integration projects using Informatica PC & Cloud, Snaplogic, TIBCO DV, and Mulesoft
●Result-oriented Senior Integration Data Lead who has worked through the full product life cycle to deliver many complex, high-performance enterprise solutions
●Comfortable in working with teams of all levels and in individual self-starter roles
●Experience with advanced ETL techniques including Staging, Reusability, Data validation, CDC, Batch processing, Error handling, incremental loading, and incremental aggregation
●Experienced in testing methodologies, Unit testing, Integration testing, System testing, and proven abilities in ETL testing
●Expertise in DWH Architecture and Data migration
●Experienced with Airflow, Autosys and Control-M scheduling tools
●Experienced in migrating applications from earlier ETL versions to Informatica 8.6/9.5
●Good experience in mentoring/leading onsite-offshore model project execution
●Provided Level 3 production support, analyzing production issues and resolving them within permissible SLAs
●Led development activities, including providing estimations, participating in database design meetings, data modeling, analyzing projects from an ETL perspective, designing, developing, supporting testing, and documenting
●Performed in-depth analysis of overall application performance in production and optimized it further
●Wrote efficient T-SQL queries, dynamic queries, sub-queries, and complex joins for generating complex stored procedures, triggers, user-defined functions, views, and cursors
●Skilled in error and event handling: precedence constraints, breakpoints, checkpoints, and logging
●Ensured all the business quality goals are achieved as stated in the company quality guidelines
●Strong knowledge of data modeling; supported data modelers in defining tables and relationships per business requirements
●Successfully implemented ETL and ELT strategies
●Experience in using Teradata components BTEQ, MultiLoad, FastLoad, and TPump
●Earned awards and client appreciation for troubleshooting and fixing ETL defects while playing a data analyst role across the entire project ecosystem
●Created Business Objects universes and fine-tuned them according to business requirements
●Executed gap analysis between ETL and BI systems, provided RCA for issues, and rectified them in a timely manner
●Engineered and reviewed ETL production design for performance optimization
●Resolved issues through troubleshooting and quick diagnosis
●Ensured data quality by designing and implementing adequate internal controls and monitoring processes
●Expertise in S3, Redshift, Google Cloud Platform, and AWS environments
●Built controls to identify individual ETL flows with performance issues
●Working knowledge of the Snowflake data platform and Azure Databricks
●Hands-on experience with dbt: implemented hooks for repetitive tasks and updated the model configuration section of dbt_project.yml
●Experience in the supply chain, sales, and customer support technical domains
●Expertise in the iPaaS tool Workato
●Hands-on experience with the OCI tool
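For illustration, a minimal sketch of the self-healing pipeline pattern referenced above: an idempotent step is retried with exponential backoff before the failure is escalated. All names here (run_with_retries, load_batch) are hypothetical and not taken from any project below.

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retries(step, max_attempts=3, base_delay_s=5):
    """Retry an idempotent pipeline step with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("step failed (attempt %d/%d): %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise  # escalate to L3 support after the final attempt
            time.sleep(base_delay_s * 2 ** (attempt - 1))

# Hypothetical usage: load_batch would be the real load step.
# run_with_retries(lambda: load_batch("2024-01-01"))
```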
PROJECTS EXECUTED
Project #1 : S&I Integration Support - Oct 23 – Present
Client : Prime Therapeutics
Company : Lance Soft, USA
Role : Data Architect IV
Skills : Python, Snaplogic, OCI, JIRA, Azure, AWS, JSON, XML, Salesforce, Snowflake, Postgres, Web services, RESTful API, Supply chain domain
●Supporting SureScripts applications in the pharmacy organization.
●SME for and manager of the S&I ePrescribing application
●Working on batch and real-time (RT) systems
●Working with Oracle, MySQL, and SQL Server databases on AWS
●Responsible for analyzing system integration requirements and designing an integrated solution to fulfill the business requirements, in collaboration with the solution architects
●Evaluated, developed, tested, and deployed high-quality system integrations using the SnapLogic platform
●Identify problem areas with business teams and enhance existing system integration business logic
●Monitor the performance of the integration platform to ensure efficiency and reliability
●Recommended and documented integration design decisions that meet the organization's needs
●Active participant in agile meetings such as daily stand-up meetings, weekly sprint planning, and tech review meetings and contribute to technical review meetings and integration updates
●Worked on multiple migration projects from other integration tools to SnapLogic, with many design improvements
●Integrated HTTP Listener/Requestor, databases, JMS, JIRA, Confluence, Salesforce, and Salesforce-managed subscriptions like Kimble and Bizible with the data warehouse
●Experience in developing secure integrations addressing all relevant layers within the integration
●Experienced in developing integrations within a hybrid infrastructure environment
●Experienced with Azure and AWS services (e.g., Storage Accounts, SQS, Data Factory, Synapse)
●Built Snowflake data integrations using Snaps for bulk load, upsert, and unload, in addition to standard CRUD (create, read, update, and delete) functionality
●Wrote PySpark scripts for encoding and data processing (see the sketch after this list)
●Experienced with Supply chain DevOps practices and working within an Agile/Scrum development team
●Excellent technical, analytical, and problem-solving skills
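A minimal PySpark sketch of the kind of encoding step described above; the S3 paths and column names are hypothetical placeholders, not details from the actual project.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("encode_members").getOrCreate()

# Read semi-structured landing data (path is a placeholder).
df = spark.read.json("s3://example-bucket/landing/members/")

# Encode a sensitive identifier with SHA-256 and normalize a text column.
encoded = (
    df.withColumn("member_id_hash", F.sha2(F.col("member_id").cast("string"), 256))
      .withColumn("state", F.upper(F.col("state")))
      .drop("member_id")  # drop the raw identifier after encoding
)

encoded.write.mode("overwrite").parquet("s3://example-bucket/processed/members/")
```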
Project #2 : Data Engineer Lead - Tricentis - Apr 22 – Dec 22
Client : Tricentis
Company : iMidia LLC, USA
Role : Data Integration Lead
Skills : Python, Snaplogic, OCI, JIRA, Azure, AWS, JSON, XML, Salesforce, Snowflake, Postgres, Web services, RESTful API, Supply chain domain
●Responsible for analyzing system integration requirements and designing an integrated solution to fulfill the business requirements, in collaboration with the solution architects
●Evaluated, developed, tested, and deployed high-quality system integrations using the SnapLogic platform
●Identify problem areas with business teams and enhance existing system integration business logic
●Monitor the performance of the integration platform to ensure efficiency and reliability
●Recommended and documented integration design decisions that meet the organization's needs
●Active participant in agile meetings such as daily stand-up meetings, weekly sprint planning, and tech review meetings and contribute to technical review meetings and integration updates
●Worked on multiple migration projects from other integration tools to SnapLogic, with many design improvements
●Integrated HTTP Listener/Requestor, databases, JMS, JIRA, Confluence, Salesforce, and Salesforce-managed subscriptions like Kimble and Bizible with the data warehouse
●Experience in developing secure integrations addressing all relevant layers within the integration
●Experienced in developing integrations within a hybrid infrastructure environment
●Experienced with Azure and AWS services (e.g., Storage Accounts, SQS, Data Factory, Synapse)
●Built Snowflake data integrations using Snaps for bulk load, upsert, and unload, in addition to standard CRUD (create, read, update, and delete) functionality
●Wrote PySpark scripts for encoding and data processing
●Experienced with Supply chain DevOps practices and working within an Agile/Scrum development team
●Excellent technical, analytical, and problem-solving skills
Project #3 : Salesforce Workday Integration - May 20 – Apr 22
Client : Workday
Company : Virtusa Corporation, USA
Role : Data Engineer
Skills : dbt, Snaplogic, RedShift, JIRA, AWS S3, Python, PySpark, Airflow, JSON, XML, Salesforce, Workday web services, RESTful API
●Employed sound design methodologies to design, enhance, and develop systems throughout the project life cycle in accordance with corporate and client functional requirements
●Worked in different subject areas like Sales, Pre-Sales, Customer Support, Services, Pricing, Finance, Marketo Inbound and Outbound, Data quality, and MDM projects
●Extensively worked as an Integration developer to integrate data from different sources (Salesforce, Pricing transformation, Workday dashboard) into target DWH (Redshift)
●Have worked on Workday to Reltio Integration. Created an end-to-end framework to extract the data from the Workday system and loaded it to the Reltio MDM environment using RESTful API
●Developed an ETL pipeline to extract data from the different sources and load it into the AWS S3 landing zone
●Trained functional teams on the integration and supported them in integrating multiple resources
●Developed an ETL pipeline to extract data from the AWS S3 landing zone and load it into the target AWS Redshift data warehouse (see the sketch after this list)
●Developed and tuned ETL processes in SnapLogic and Redshift SQL
●Created business views with complex logic for the dashboards
●Built ETL processes for data quality and governance dashboards
●Created stored procedures and wrote SQL per business user requirements
●Debugged errors and issues encountered during the SDLC life cycle
●Involved in the Production Support and Maintenance phase
●Provided replication solution defect resolution ideas and fixes to the Production Support team
●Experienced in developing data pipelines for Data Lake
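A minimal sketch of the S3-to-Redshift load step described above, assuming staged Parquet files and a psycopg2 connection; the bucket, table, IAM role, and connection details are placeholders, and the real pipelines were implemented in SnapLogic.

```python
import psycopg2  # assumes the standard psycopg2 driver

# Placeholder COPY statement: bulk-load staged files from the S3 landing zone.
COPY_SQL = """
    COPY analytics.sales_stage
    FROM 's3://example-bucket/landing/sales/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="etl_user", password="...",
)
try:
    with conn.cursor() as cur:
        cur.execute(COPY_SQL)
    conn.commit()  # COPY is transactional; commit on success
finally:
    conn.close()
```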
Project #4 : Salesforce Workday Integration - Nov 19 – Mar 20
Client : Workday
Company : Intelliswift services, USA
Role : Data Integrator
Skills : SnapLogic, Salesforce, Workday web services, JIRA, SOAPUI, AWS API Gateway, Python, JSON, XML, GCP, S3, and Redshift
●Implemented Integration between Salesforce with Workday Webservices API for Extended Learner details
●Worked as liaison between Salesforce and BT technology Integration team for this project to bring out functional requirements
●Expertise in using JSON, XML, and RESTful API data pipeline transformations
●Prototyped and demonstrated to the Workday Integration team(s) using Snaplogic, Postman, Swagger and Soap UI, JSON, and XML
●Created detailed integration documentation
●Trained functional teams on the integration and supported their integration work
●Experienced in developing data pipelines for Data Lake
Project #5 : Product Data Services - Aug 17 – Mar 18
Client : Franklin Templeton Investments
Company : Encore Software services, USA
Role : Senior Consultant
Skills : Informatica Power Center 10.1, TDV 7.6 (Composite Integration Studio), SnapLogic, Oracle 11g, SQL Toad, Windows, PL/SQL, SQL, DWH, Control-M, UNIX, JIRA, Data Virtualization, Erwin 9.6, Tableau, AWS, Python
●Developed the end-to-end ETL lifecycle with Informatica PowerCenter components: mappings, mapplets, simple and complex transformations, sessions, worklets, and workflows for data loads
●Performed Database Integration, New project integration, designing, developing, SIT support, and documentation using Informatica and TIBCO CIS Studio
●Created CIS views and tweaked the views for optimizing the performance of Data Integration
●Defined ETL design requirements for multiple projects at Franklin
●Production support for the PDS addressing data processing issues and production escalations across multiple platforms for complex and time-sensitive requirements
●Successfully provided PDS Purging, Resilience, and Recovery strategy
●Compared and presented various DWH cloud architectures on IaaS, SaaS, PaaS
●Provided verification and validation of the development during testing phases
●Involved in Incident Management for production issue fixes and change requests
●Developed and enhanced Data Quality metrics and Data Transformation from previous releases
●Provided enhancement plan and carried out code changes in various releases
●Experienced in setting up LDAP ports in domain configuration
●Used Python for Data Analysis and Reporting
●Tested the new releases on different environments and provided valuable answers to QA questions
●Used Tableau reporting for a one-time specific requirement in an agile POC
●Implemented multi-level drill-down options with report bursting to various users using Tableau
●Helped ETL leads and managers decide between a DWH approach and Tableau self-service reporting
●Built controls to identify individual ETL flows with performance issues
●Designed, developed, managed, and monitored pipelines to process FTO and e-commerce data using Snaplogic Elastic Integration iPaaS
●Collected data from heterogeneous sources like flat files, Excel sheets, archives, Oracle and SQL Server databases, and SOAP and REST services, and loaded it to Amazon Redshift and other SaaS targets
●Utilized Workday REST and SOAP APIs, CSV files extracted to an on-premises FTP server, and the SnapLogic Workday Snap to collect reports
●Managed Snaplogic servers, pipelines, and scheduled/triggered tasks
●Supported and developed new requirements for AWS S3/Redshift integration with the ETL tool Informatica PowerCenter 10
●Developed mappings for uploading data to and downloading data from S3 (see the sketch after this list)
●Facilitated the Informatica 9.6 to Informatica 10.1 version migration
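A minimal boto3 sketch of the S3 upload/download steps the mappings above wrap; bucket names and paths are placeholders, and the production flows ran through Informatica PowerCenter 10.

```python
import boto3  # assumes AWS credentials are configured in the environment

s3 = boto3.client("s3")

# Upload an extract produced by the ETL mapping.
s3.upload_file("/data/out/products.csv", "example-bucket",
               "landing/products/products.csv")

# Download a staged file for a downstream mapping to consume.
s3.download_file("example-bucket", "landing/products/products.csv",
                 "/data/in/products.csv")
```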
Project #6 : Profit View - Nov 16 – Aug 17
Client : Wells Fargo Bank
Company : Infosys Technologies LTD, Fremont, USA
Role : Technology Lead - DWH
Skills : Informatica Power Center 9.6, Informatica IDQ, Informatica TDM, Erwin 9.6, Oracle 11g, SQL Toad, PL/SQL, SQL, Windows, Autosys, Unix, HP ALM, JIRA
●Developed and maintained the end-to-end ETL lifecycle with Informatica PowerCenter components: mappings, mapplets, simple and complex transformations, sessions, worklets, and workflows for data loads
●Developed Informatica PC mapping for multiple version releases in wholesale technologies
●Performed Database migration, Incremental Data Model uploads, and coordinated team on it
●Participated in database design meetings and data modeling; analyzed the project from an ETL (Informatica) perspective; and handled design, development, test support, and documentation
●Verified Data model requests and implemented the same using Erwin and XML
●Created data subsets to reduce costs and accelerate development
●Created ETL test data using Informatica TDM and defined business rules to test the functionality of batch processing
●Involved in Incident Management for Production issue fixes and change request
●Enhanced Validation queries for specific operations and added to the workflow
●Developed and enhanced Data Quality metrics and Data Transformation from previous releases
●Gave reverse KT to the team and mentored new team members
●Provided extensive cleansing and profiling of legacy data and provided input to the current system
●Provided enhancement plans and carried out changes in subsequent releases
●Performed production support proactively, never missing SLAs
●Tested the new releases on different environments and provided valuable answers to QA questions
Project #7 : Deutsche Asset Management - Dec 13 – Apr 14
Client : Deutsche Bank
Company : Cognizant Technologies, India
Role : Associate
Skills : Informatica Power Center 9.6, Informatica IDQ, HP ALM, Oracle 11g, SQL Toad, PL/SQL, Teradata, Windows XP, Erwin, XML, Autosys, Tableau
●Developed and enhanced Informatica mappings on various stages and promoted them to Live
●Developed Informatica Power center batch architecture to extract, transform and load data from various sources like Oracle, Siebel, DB2, flat files, and XML files sent by third parties
●Designed Informatica ETL transformations, including XML Parser, Normalizer, SQL transformation, Stored Procedure, Update Strategy, and Transaction Control, and created complex mappings
●Developed mapping and workflows using Informatica best practices and considering the optimum performance of the jobs
●Developed and maintained ETL code for incremental loading of transactional data into the DeAM system
●Raised change requests and incidents, analyzed and coordinated resolution of code flaws in the development environment, and hot-fixed them in QA and production environments during the runs
●Expertise in using Informatica Data Quality (IDQ) for client data standardization
●Performance tuning of Informatica batches by optimizing Sources, Targets, Mappings, and Sessions and experience in using partitioning and parallel processing
●Implemented CDC logic that incrementally loads portfolio data (see the sketch after this list)
●Created ETL test data using Informatica TDM and defined business rules to test the functionality of batch processing
●Experienced with the Informatica Information Lifecycle Management tool, used to optimize TDM and safely purge old data
●Extensively used HP ALM to track defects, issues, and bugs reported by Businesses
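A minimal sketch of watermark-based CDC of the kind referenced above: only rows changed since the last successful run are pulled, and the high-water mark is carried forward. The table, columns, and helper names (src.portfolio, upsert_target) are hypothetical, not the project's actual schema.

```python
from datetime import datetime

def incremental_load(conn, watermark: datetime) -> datetime:
    """Load only portfolio rows changed since the last successful run."""
    with conn.cursor() as cur:
        cur.execute(
            "SELECT portfolio_id, payload, updated_at "
            "FROM src.portfolio WHERE updated_at > %s",
            (watermark,),
        )
        rows = cur.fetchall()

    for portfolio_id, payload, updated_at in rows:
        upsert_target(conn, portfolio_id, payload)  # hypothetical upsert helper
        watermark = max(watermark, updated_at)

    conn.commit()
    return watermark  # persist as the next run's starting point
```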
Project #8 : SIMPL MSA - Apr 13 – Nov 13
Client : American Honda Motors, CA, USA
Company : Accenture Services, India
Role : Software Engineering Sr. Analyst
Skills : Informatica Power Center 9.1, ASG Zena, IBM Mainframes, DB2, PL/SQL, IBM Query Tuning, FTP, Java, Javascript
●Analyzed three applications and the data warehouse and gave valuable inputs to the client
●Provided Impact analysis and estimated time needed to implement and test the changes done in the source system
●Created efficient SQL queries for reporting needs
●Provided prompt support for SIMPL MSA, catalyzing quicker closure and implementation of issues
●Analyzed and enhanced complex Informatica mappings using various transformations
●Examined the session logs, bad files, and error tables for troubleshooting mappings and sessions
●Carried out Performance tuning of the SQL queries and mapping to identify bottlenecks at target, source, mapping, and session levels
●Implemented Java and JavaScript code for web-based UI interfaces
●Carried out data validation with source files in EBCDIC and ASCII formats, XML files, and flat files
●Supported Unix Shell Script for various preload and post-load validation and activities
●Analyzed, Designed, and Implemented Purge logic for the SIMPL MSA application
●Updated the mapping sheets and data dictionary
●Mentored new team members
●Well-versed in using Kanban and Agile processes in projects
●Effectively supported the entire SIMPL ETL integration data marts, working as offshore lead coordinating the onsite team and offshore BI teams
Project #9 : Equinox MDM - Jul 07 – Feb 08
Client : Equinox, New York, USA
Company : Accenture Services, India
Role : Senior Systems Engineer
Skills : SSIS, Windows, Oracle, PL/SQL, SQL
●Analysis of the requirements provided by various business users
●Executed sessions, sequential and concurrent batches for the proper execution of mappings and then set up email delivery after execution
●Also developed Unix shell scripts for various pre-load and post-load validations and activities
●Wrote efficient T-SQL queries, dynamic queries, sub-queries, and complex joins for generating complex stored procedures, triggers, user-defined functions, views, and cursors
●Skilled in error and event handling: precedence constraints, breakpoints, checkpoints, and logging
●Supported the team in resolving SQL Reporting Services and T-SQL issues; proficient in creating and formatting report types such as crosstab, conditional, drill-down, Top N, summary, form, OLAP, and sub-reports
●Experience in using SSIS tools like Import and Export Wizard, Package Installation, and SSIS Package Designer
●Experience in importing/exporting data between different sources like Oracle/Access/Excel etc. using SSIS/DTS utility
●Experience in ETL processes involving migrations and in-sync processes between two databases
●Experience in Microsoft Visual C# in the script component of SSIS
●Transformed data from one server to other servers using tools like Bulk Copy Program (BCP), and SSIS
●Experience in creating configuration files to deploy the SSIS packages across all environments
●Expert in generating and writing parameterized queries, drill-through reports, and formatted SQL Server reports in SSRS 2005
Project #10 : Composite Report Net - Feb 07 – Jul 07
Client : Composite, San Mateo, USA
Company : Wipro Technologies, India
Role : Senior Systems Engineer
Skills : COGNOS ReportNet, Framework Manager, Data modeling, PL/SQL, SQL, Consulting, HTTPS, Java
●Optimized an integrated schema sourcing data from Siebel, Oracle, and SAP
●Experience in Cognos BI tools using Cognos Framework Manager, Report Studio, Query Studio, Analysis studio, Power Play Transformer, and Impromptu Web Reports (IWR)
●Experienced in Cognos 10, 8, and ReportNet with Report Studio, Framework Manager, Query Studio, Analysis Studio, Metric Studio, and Cognos Connection
●Strong experience in creating and publishing packages in Framework Manager
●Strong understanding of dimensional modeling of Star Schemas, and Snow-Flake Schemas Methodologies for building enterprise data warehouses
●Developing and peer-reviewing Cognos reports
●Prepared unit test plans and probed the adequacy of unit testing and coverage
●Effectively supported onsite Acceptance Testing
Project #11 : Logistics KPI - Jan 06 – Feb 07
Client : Dean Foods, CA, USA
Company : Wipro Technologies, India
Role : Associate
Skills : Informatica Power Center 8.6, Teradata BTEQ scripting, TPT, Cognos, System integration testing, PL/SQL, SQL, DWH
●Performed Unit Testing and Integration Testing. Supported Systems Testing Team
●Analyzed three applications and the data warehouse and gave valuable inputs to the client
●Provided Impact analysis and estimated time needed to implement and test the changes done in the source system
●Provided support and gathered knowledge with other Dean Foods teams to create reusable scripts
●Analyzed and enhanced BTEQ script and TPT utilities
●Examined the session logs, bad files, and error tables for troubleshooting mappings and sessions
●Carried out Performance tuning of the SQL queries and mapping to identify bottlenecks at target, source, mapping, and session levels
●Effectively coordinating the On-site team and offshore team in the entire QA Testing and Implementation cycle
●Involved in ETL development and performance tuning
●Supported BI teams in the development of the report using Cognos Software
●Successfully coordinated an offshore-onshore model project, working as the offshore lead
Project #12 : iSchwab – ETL Migration - Jul 03 – Jan 06
Client : Charles Schwab, USA
Company : Wipro Technologies, India
Role : Project Engineer
Skills : Informatica Power Center, Oracle 8i, IBM Mainframes, COBOL, SQL Loader, SQL Plus, Unix Shell scripts, vi, Banking & Finance, DWH
●Re-engineered and analyzed Unix shell scripts and PL/SQL code
●Design of Informatica mappings and Workflows
●Created complex and scalable Informatica mappings and workflows
●Provided value-added suggestions for future improvements to the client
●Performed Data validation with different sources in EBCDIC and ASCII forms. Used Normalizer and XML transformation
●Created Test Strategies and test scenarios and defined document standards to follow CMM-level coding
●Creation of End-to-End Unit Test Cases, Parallel test cases, and System test cases
●Created automated, reusable test scripts
●Gained team awards for significantly reducing execution time
●Received the ‘Feather in the Cap’ award from my project manager twice for this project
Project #13 : Dashboard - Jan 03 – Jul 03
Client : COE, Wipro Technologies, India
Company : Wipro Technologies, India
Role : Project Engineer
Skills : Informatica Power Center 7, Business Objects, Oracle 8.x, VB Script, PL/SQL, Erwin 6.0, HTML, HTTP
●Created reports for higher-level managers to see periodical performances of various domains like Financial, Customer, Internal Operations, and Learning & Growth perspectives
●Actively participated in each aspect of the project, primarily involved in data capturing through MS Excel using VB macros and PL/SQL
●Used Erwin tool to create data modeling
●Created dynamic reports using Business Objects; mapped flat files to source tables
●Loaded the Dimension and Fact tables using Informatica
●Designed high-level and low-level design documents adhering to the business standards
Project #14 : Transcoding for Residential Gateway - May 01 – Jan 02
Client : LG India
Company : Wipro Technologies, India
Role : Curriculum Project Trainee & Project Trainee
Skills : Java, XML, XSL, WML, cHTML, OSGI, Web services, SOAP, HTTPS
Date : 04/12/2000 - 05/15/2001
●Presented different displays of Residential Gateway data with common content management techniques for devices like PCs, mobile phones, i-mode phones, and tablets
●Analysis, Design, and Implementation of various devices with common content
●Analysis and finding the feasibility of the system
●Extensively used XML and Java for transcoding for the display part of the common content
●Used XSL, WML, cHTML, SOAP, and web services, tested on a device, and proved the research
●Gained expertise in Event Handling and Java syntax for web applications
●Received the Best Innovative Project award from the research department of Wipro Technologies
●Developed a POC and demonstrated it at the Tech Forum 2001