
Data Engineer

Location:
Pleasanton, California, United States
Posted:
June 17, 2019


Mithila Joottu Thiagarajan

ac9mry@r.postjobfree.com

818-***-****

EXPERIENCE SUMMARY

Dedicated and competent ETL and DV professional with demonstrated success in developing and implementing solutions for Fortune 500 companies such as Franklin Templeton, Wells Fargo, Deutsche Bank, Dean Foods, American Honda Motors, Charles Schwab and UBS, through leading IT companies. Diversified technical background with 10+ years of experience in ETL and data integration.

Have extensive financial-domain experience and have worked with Informatica (10 years), TIBCO DV (Composite) (1.5 years), Erwin (3 years), Tableau (1 year) and the BI tool Cognos (1 year).

OBJECTIVE

Seeking a challenging career as a Data Integration Analyst/Sr. Developer in ETL, and to advance my career as a Data Integration Consultant.

ACADEMIC CREDENTIALS

MCA: Master of Computer Applications, Madurai Kamaraj University, 2001 – 75%

B.Sc.: Bachelor of Science in Computers, Madurai Kamaraj University, 1998 – 76%, with Distinction

CERTIFICATION

BIG DATA 101 – Big Data University

Cloud Computing Solutions Professional – Intelisys Cloud Services University

NoSQL and DBaaS 101 – Big Data University

AWS Cloud Practitioner Essentials and AWS Managed Services console walkthrough – AWS Training and Certification.

SKILLS

Informatica 10

Cognos 8

Erwin 9.6

Data Modeling

PL/SQL

SQL

Autosys, ASG Zena, SQL Server Agent

Teradata TPT

T-SQL

MDM Concepts

Pivot and Non-Pivot tables

Informatica TDM

JavaScript

AWS

TIBCO CIS 7.6

Unix Scripting

DWH Architecture

ETL Programming

DB2

Java

Oracle

XML Programming

Teradata BTEQ Scripting

SSIS 7/8

Tableau

COBOL

Informatica IDQ

Talend

Star schema

Slowly changing dimension

STRENGTHS AND ACHIEVEMENTS

Expertise in developing/maintaining the end-to-end ETL lifecycle with Informatica PowerCenter components: mappings, mapplets, simple and complex transformations, sessions, worklets and workflows for data loads.

Expertise in Data Integration projects using TIBCO Composite 7.6.

Have exposure to other DV tools such as Informatica DIH and SnapLogic, and have successfully created a few POCs on each.

Experience developing projects with Informatica IDQ (Informatica Data Quality): profiling, scorecards, address validator, match, merge, parse and exception handling.

Experience working on OFSAA data modeling, DQ metrics and DT for OFSAA applications using Erwin data modeling.

Experience with advanced ETL techniques including Staging, Re-usability, Data validation, CDC, Batch processing, Error handling, incremental loading and incremental aggregation.
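
The CDC/incremental-loading pattern above is commonly driven by a persisted watermark: each run extracts only rows changed since the last successful load. A minimal Python sketch of that idea, with a hypothetical `updated_at` column (not taken from any actual project schema):

```python
from datetime import datetime

def incremental_extract(rows, last_watermark):
    """Keep only rows changed since the last successful load and
    compute the new watermark to persist for the next run."""
    changed = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2019, 6, 1)},
    {"id": 2, "updated_at": datetime(2019, 6, 10)},
]
# Only row 2 is newer than the watermark, so only it is re-loaded.
changed, wm = incremental_extract(rows, datetime(2019, 6, 5))
```

In Informatica this is typically done with a mapping variable or parameter file holding the watermark; the Python above only illustrates the logic.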

Good database skills in Oracle, DB2, SQL Server and Teradata.

Experienced in testing methodologies, Unit testing, Integration testing, System testing and proven abilities in ETL testing.

Created ETL testing scenario, test plan and test scheduling.

Experienced with Autosys and Control-M scheduling tools for Informatica.

Experience in migrating applications from previous ETL versions to Informatica 8.6/9.5.

Good experience in mentoring/leading onsite-offshore model project execution.

Well experienced in working with both Agile and Waterfall project development approaches.

Level 3 production support, including analyzing production issues and resolving them within permissible SLAs.

Lead Development activities include providing estimations, participating in database design meetings, data modeling, analyzing the project from ETL perspective, designing, developing, test support and documentation.

Proactively collaborate with the QA teams on the testing scenarios and provide inputs on the test cases.

In-depth analysis of overall application performance in production, followed by further analysis and optimization.

Implemented Type 1 and Type 2 slowly changing dimensions using Informatica and SSIS.
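
The Type 2 technique mentioned above expires the current dimension row and appends a new version whenever a tracked attribute changes (Type 1 would simply overwrite in place). A minimal Python illustration of the logic; the field names (`nk` for natural key, `attr`, effective dates) are hypothetical, not taken from any actual dimension design:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """Type 2 SCD: expire the current row and append a new version
    when a tracked attribute changes. dim_rows is mutated in place."""
    for inc in incoming:
        current = next((r for r in dim_rows
                        if r["nk"] == inc["nk"] and r["current"]), None)
        if current is None:
            # brand-new member: insert as the current version
            dim_rows.append({**inc, "eff_from": today,
                             "eff_to": None, "current": True})
        elif current["attr"] != inc["attr"]:
            # attribute changed: close out the old row, add a new one
            current["eff_to"] = today
            current["current"] = False
            dim_rows.append({**inc, "eff_from": today,
                             "eff_to": None, "current": True})
    return dim_rows

dim = [{"nk": "C1", "attr": "Gold", "eff_from": date(2018, 1, 1),
        "eff_to": None, "current": True}]
apply_scd2(dim, [{"nk": "C1", "attr": "Platinum"}], date(2019, 6, 1))
```

In Informatica this maps to a Lookup plus Update Strategy (or the SCD wizard); in SSIS, to the Slowly Changing Dimension transformation.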

Have written efficient T-SQL queries, dynamic queries, sub-queries and complex joins for complex stored procedures, triggers, user-defined functions, views and cursors.

Skilled in error and event handling: precedence constraints, breakpoints, checkpoints and logging.

Interacted with client SMEs and DBAs on various applications to understand the functional and technical aspects of each application.

Ensure all the business quality goals are achieved as stated in the company quality guidelines.

Strong knowledge of data modeling; supported data modelers in defining tables and relationships per the business requirements.

Successfully implemented both ETL and ELT strategies.

Hands-on experience with Talend transformations, components and pipelines.

Experience using the Teradata utilities BTEQ, MultiLoad, FastLoad and TPump.

My expertise in troubleshooting and fixing ETL defects, while playing a data analyst role for the entire project ecosystem, has earned awards and client applause.

Have created Business Objects universes and fine-tuned them according to the business requirements.

Executed gap analysis between ETL and BI systems, provided RCA for issues and rectified them in a timely manner.

Gaining expertise in the integration tool SnapLogic, which easily connects any combination of cloud, SaaS or on-premises applications and data sources.

Executed POCs with SnapLogic for scenarios using AWS S3:

Pipeline: Replicate a Database Schema in Redshift.

Script: Pivot Data.

Pipeline: REST Get for Customer Items.
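
The "Pivot Data" script POC above amounts to turning long-format (key/value) rows into one wide row per entity. A small Python sketch of that idea; the field names are hypothetical, and the actual POC was built in SnapLogic, not in this code:

```python
def pivot(rows, index_key, pivot_key, value_key):
    """Turn long-format rows into one wide row per index value."""
    out = {}
    for r in rows:
        wide_row = out.setdefault(r[index_key], {index_key: r[index_key]})
        wide_row[r[pivot_key]] = r[value_key]
    return list(out.values())

long_rows = [
    {"cust": "A", "metric": "sales", "val": 10},
    {"cust": "A", "metric": "returns", "val": 2},
    {"cust": "B", "metric": "sales", "val": 7},
]
# One wide row per customer, with one column per metric.
wide = pivot(long_rows, "cust", "metric", "val")
```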

PROJECTS EXECUTED

Project #1 : Product Data Services - Aug' 17 – Sep’ 18

Client : Franklin Templeton Investments

Company : Encore Software services, USA

Role : Senior Consultant

Skills : Informatica 9.6, TDV 7.6 (Composite Integration Studio), Oracle 11g, SQL Toad, Windows, Control-M, UNIX, JIRA, Data Virtualization, DWH, Erwin 9.6

Description :

Product Data Services (PDS) is a division of Franklin Templeton. PDS comprises fund-related data services that deliver consolidated, consistent and authoritative product data of high quality, which can be consumed and distributed across the enterprise and externally through API services.

Responsibilities:

Production support for PDS, addressing data processing issues and production escalations across multiple platforms for complex and time-sensitive requirements.

Developed the end-to-end ETL lifecycle with Informatica PowerCenter components: mappings, mapplets, simple and complex transformations, sessions, worklets and workflows for data loads.

Performed Database Integration, New project integration, designing, developing, SIT support and documentation using Informatica and TIBCO CIS Studio.

Created CIS views and tweaked the views for optimizing the performance of Data Integration.

Prepared ETL design requirements for multiple projects at Franklin Templeton.

Successfully provided a PDS purging strategy.

Provided verification and validation of the development during testing phases.

Involved in incident management for production issue fixes and change requests.

Developed and enhanced Data Quality metrics and Data Transformations from previous releases.

Provided an enhancement plan and carried out the changes in the subsequent release.

Carried out proactive production support, never missing SLAs.

Developed documents for the PDS purging strategy using Big Data technologies.

Experienced in setting up LDAP ports and domain configuration.

Tested the new releases on different environments and provided valuable answers to QA questions.

Mentored clients and team members on new application development and self-service best practices.

Coordinated offshore-onshore meetings around the clock while doing production support and excelled as a team member.

Project #2 : Profit View - Nov ‘16 – Aug ‘17

Client : Wells Fargo Bank

Company : Infosys Technologies LTD, Fremont, USA

Role : ETL Developer/Production support

Skills : Informatica IDQ, Informatica TDM, Erwin 9.6, Oracle 11g, SQL Toad, Windows, Autosys, Unix, HP ALM, JIRA

Description :

Profit View is a profitability management tool for Wells Fargo under Wholesale Technologies. Profit View replaces the bank's previous customer profitability tool for relationship groups; leveraging existing sources and infrastructure together with new sources enables common profitability and cross-sell calculations, and adds functionality such as hierarchies and deeper integration with other existing tools. This engagement is production support across several releases, delivered incrementally using Agile methodology.

Responsibilities:

Production support for the tool Profit view.

Developed/maintained the end-to-end ETL lifecycle with Informatica PowerCenter components: mappings, mapplets, simple and complex transformations, sessions, worklets and workflows for data loads.

Performed database migration and incremental data model uploads, coordinating the team on both.

Participated in database design meetings, data modeling, analyzing the project from an ETL (Informatica) perspective, designing, developing, test support and documentation.

Verified Data model requests and implemented the same using Erwin and XML.

Created data subsets to reduce costs and accelerate development.

Created ETL test data using Informatica TDM and defined business rules to test the functionality of batch processing.

Involved in incident management for production issue fixes and change requests.

Enhanced validation queries for specific operations and added them to the workflow.

Developed and enhanced Data Quality metrics and Data Transformations from previous releases.

Gave reverse KT to the team and mentored new team members.

Performed extensive cleansing and profiling of legacy data and provided it as input to the current system.

Provided an enhancement plan and carried out the changes in the subsequent release.

Carried out proactive production support, never missing SLAs.

Tested the new releases on different environments and provided valuable answers to QA questions.

Project #3 : Deutsche Asset Management - Dec’ 13 – Apr’ 16

Client : Deutsche Bank

Company : Cognizant Technologies, India

Role : Associate

Skills : Informatica IDQ, HP ALM, Oracle 11g, SQL Toad, Windows XP, Erwin, XML, Autosys, Tableau

Description : Deutsche Asset Management, the asset management division of Deutsche Bank, has an excellent team dedicated to serving its clients around the clock and across the globe, providing professional portfolio management services to investors. Portfolio management services include making research reports available to clients, providing buy/sell tips, total financial planning for clients, etc.

The ETL solution provides an independent, efficient, reliable and controlled static data collection, processing and maintenance service for all systems contractually maintained by DeAM, as well as an independent, efficient, reliable and controlled pricing function across all instrument types used by DeAM.

Responsibilities:

Developed and enhanced Informatica mappings on various stages and promoted to Live.

Developed Informatica Power center batch architecture to extract, transform and load data from various sources like Oracle, flat files and XML files sent by third parties.

Designed ETL Informatica transformations, including XML Parser, Normalizer, SQL transformation, Stored Procedure, Update Strategy and Transaction Control, and created complex mappings.

Developed mapping and workflows using Informatica best practices and considering optimum performance of the jobs.

Developed and maintained ETL code for incremental loading of transactional data into the DeAM system.

Raised change requests and incidents, analyzed and coordinated resolution of code flaws in the development environment, and hotfixed them in QA and production environments during the runs.

Expertise in using Informatica Data Quality (IDQ) for client data standardization.

Performance tuning of Informatica batches by optimizing Sources, Targets, Mappings and Sessions and experience of using partitioning and parallel processing.

Implemented CDC logic that incrementally loads portfolios.

Created ETL test data using Informatica TDM and defined business rules to test the functionality of batch processing.

Experienced with the Informatica Information Lifecycle Management tool, used to optimize TDM and safely purge old data.

Extensively used HP ALM to track defects, issues and bugs reported by Business.

Project #4 : SIMPL MSA - Apr ‘13 – Nov ‘13

Client : American Honda Motors, CA, USA

Company : Accenture Services, India

Role : Software Engineering Sr. Analyst

Skills : Informatica 9, ASG Zena, IBM Mainframes, DB2, IBM Query Tuning, FTP, Java, Java script

Description : SIMPL Market Sensing Analytics (MSA) is a Sales Analytics project that addresses and analyzes the profitability involved in giving incentives to dealers. My role on SIMPL MSA was to support and enhance mappings in the Auto DWH, OLTP and DM applications used to create reports in Oracle Endeca, Crystal Reports and Business Objects.

Responsibilities:

Analyzed 3 applications and the data warehouse and gave valuable inputs to the client.

Provided impact analysis and estimated the time needed to implement and test changes made in the source system.

Created efficient SQL queries for the reporting needs.

Provided prompt support for SIMPL MSA, catalyzing quicker closure and implementation of issues.

Analyzed and enhanced complex Informatica mappings using various transformations.

Examined session logs, bad files and error tables to troubleshoot mappings and sessions.

Carried out performance tuning of SQL queries and mappings to identify bottlenecks at the target, source, mapping and session levels.

Implemented code in Java and JavaScript for web-based UI interfaces.

Carried out data validation with source files in EBCDIC and ASCII formats, XML files and flat files.
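
Validating EBCDIC source files against ASCII counterparts, as above, usually means decoding both to text before comparing. A minimal Python sketch; cp037 is one common EBCDIC codepage and is an assumption here, not the codepage of the actual mainframe files:

```python
def records_match(ebcdic_bytes, ascii_bytes, codepage="cp037"):
    """Compare an EBCDIC-encoded record against an ASCII one by
    decoding both to text first (codepage is an assumed default)."""
    return ebcdic_bytes.decode(codepage) == ascii_bytes.decode("ascii")

# Round-trip check: the same text encoded in EBCDIC and ASCII matches.
ebcdic = "HONDA".encode("cp037")
assert records_match(ebcdic, b"HONDA")
```

In an Informatica flow the equivalent is configuring the source's code page; the Python above only illustrates the validation idea.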

Supported Unix shell scripts for various preload and post-load validations and activities.

Analyzed, Designed and Implemented Purge logic for the SIMPL MSA application.

Updated the mapping sheets and data dictionary.

Mentored new team members.

Well versed in using Kanban and Agile processes in projects

Effectively supported the entire SIMPL ETL data mart integration as offshore lead, coordinating the on-site team and offshore BI teams.

Project #5 : Equinox MDM - Jul ‘07 – Feb ‘08

Client : Equinox, New York, USA

Company : Accenture Services, India

Role : Senior Systems Engineer

Skills : SSIS, Windows, Oracle

Description : Equinox is a fitness club that engages members in fitness and wellbeing, delivering transformative results. Equinox has a growing portfolio of properties across the United States, as well as a location in London.

The primary objective of Equinox MDM project was to migrate the existing Data Warehouse load from Mainframe Source System to SQL Server.

Responsibilities:

Analyzed the requirements provided by various business users.

Executed sessions, sequential and concurrent batches for proper execution of mappings and then setup email delivery after execution.

Developed Unix Shell Script as well for various preload and post load validation and activities.

Wrote efficient T-SQL queries, dynamic queries, sub-queries and complex joins for complex stored procedures, triggers, user-defined functions, views and cursors.

Skilled in error and event handling: precedence constraints, breakpoints, checkpoints and logging.

Supported the team in resolving SQL Reporting Services and T-SQL issues; proficient in creating and formatting different report types such as cross-tab, conditional, drill-down, top-N, summary, form, OLAP and sub-reports.

Experience in using SSIS tools like Import and Export Wizard, Package Installation, and SSIS Package Designer.

Experience in importing/exporting data between different sources like Oracle/Access/Excel etc. using SSIS/DTS utility.

Experience in ETL processes involving migrations and in sync processes between two databases.

Experience with Microsoft Visual C# in the script component of SSIS.

Transferred data between servers using tools such as Bulk Copy Program (BCP) and SSIS.

Experience in creating configuration files to deploy the SSIS packages across all environments.

Expert in writing parameterized queries, drill-through reports and formatted SQL Server reports in SSRS 2005.

Project #6 : Composite ReportNet - Feb ‘07 – July ‘07

Client : Composite, San Mateo, USA

Company : Wipro Technologies, India

Role : Senior Systems Engineer

Skills : COGNOS ReportNet, Framework Manager, Data modeling, Consulting, HTTPS, Java

Description : Composite ReportNet is a Business Intelligence project. Enterprise Information Integration (EII) refers to the technology and business best practices for real-time aggregation of corporate data across multiple, potentially widely disparate, data sources. The Composite ReportNet project focuses on creating reports on comprehensive, reusable “views” of EII accounting models across various data systems.

Responsibilities:

Optimized an integrated schema sourcing data from Siebel, Oracle and SAP.

Experience with Cognos BI tools: Cognos Framework Manager, Report Studio, Query Studio, Analysis Studio, PowerPlay Transformer and Impromptu Web Reports (IWR).

Experienced in Cognos 10, 8, ReportNet with Report Studio, Framework Manager, Query Studio, Analysis Studio, Metric Studio and Cognos Connection.

Strong experience in creating and publishing packages in Framework Manager.

Strong understanding of dimensional modeling of Star Schemas, Snow-Flake Schemas Methodologies for building enterprise data warehouses.

Developing and peer reviewing Cognos reports.

Prepared unit test plans and probed the adequacy of unit testing and coverage.

Effectively supported onsite Acceptance Testing.

Project #7 : Logistics KPI - Jan' 06 – Feb' 07

Client : Dean Foods, CA, USA

Company : Cognizant Technology Solutions, India

Role : Associate

Skills : Informatica 9.6, Teradata BTEQ scripting, TPT, Tableau, System integration testing, DWH

Description : Logistics KPI is a project analyzing the optimum logistics involved in transferring Dean Foods' dairy products. My role involved delivering BTEQ scripts using TPT, run through Informatica 9, for various criteria, and carrying out QA testing by reviewing the SRS and creating test scenarios, test plans and test case designs. Worked in tandem with the development and testing teams.

Responsibilities:

Performed Unit Testing and Integration Testing. Supported Systems Testing Team.

Analyzed 3 applications and the data warehouse and gave valuable inputs to the client.

Provided impact analysis and estimated the time needed to implement and test changes made in the source system.

Provided support and shared knowledge with other Dean Foods teams to create reusable scripts.

Analyzed and enhanced BTEQ scripts and TPT utilities.

Examined session logs, bad files and error tables to troubleshoot mappings and sessions.

Carried out performance tuning of SQL queries and mappings to identify bottlenecks at the target, source, mapping and session levels.

Effectively coordinated the on-site and offshore teams through the entire QA testing and implementation cycle.

Involved in ETL test performance tuning.

Supported BI teams for the reports development using Tableau Software.

Successfully coordinated an offshore-onshore model project, working as offshore lead.

Project #8 : Integrated Centralized Environment - Jan ‘05 – Jan ‘06

Client : UBS, USA

Company : Wipro Technologies, India

Role : Project Engineer

Skills : Informatica, Oracle 7.x, SQL Finance, Financial Consultancy, DWH

Description : UBS is a world-leading wealth management company, global investment bank and premier global asset management business at every level. Integrated Centralized Environment (ICE) integrates the GPP, BEE and CAL servers of the earlier UBS system for Financial Services.

Responsibilities:

Analysis of the old design (mappings)

Suggesting feasible solutions and creating reusable components

Created Mapplets as reusable components

Developing mappings

Peer Review

24/7 Support after the release of the product

Project #9 : iSchwab – ETL Migration - Jul ‘03 – Jan ‘05

Client : Charles Schwab, USA

Company : Wipro Technologies, India

Role : Project Engineer

Skills : Informatica, Oracle 8i, IBM Mainframes, COBOL, SQL Loader, SQL Plus, Unix Shell scripts, vi, Banking & Finance, DWH

Description : Migrated the iSchwab UNIX/Oracle suite of programs, which read flat files generated by the CS04 mainframe sub-system of iSchwab, to an Informatica-layered iSchwab. Transformations and pre-processing previously carried out in Pro*C, PL/SQL, Oracle stored procedures and shell scripts were migrated to Informatica. The code was fine-tuned and reused in the Informatica environment, converting the previously existing batch programs with effective re-engineering wherever required.

Responsibilities:

Re-engineering and Analysis of Unix/shell scripts and PL/SQL code.

Design of Informatica mappings and Workflows.

Created complex and scalable Informatica mappings and workflows.

Provided value-added suggestions to the client for the future.

Performed data validation with different sources in EBCDIC and ASCII formats. Used Normalizer and XML transformations.

Created the test strategy and test scenarios, and defined document standards to follow CMM-level coding.

Creation of End-to-End Unit Test Cases, Parallel test cases and System test cases.

Created automated reusable test scripts.

Gained team awards for reducing execution time significantly.

Received ‘Feather in the Cap’ award from my project manager twice for this project.

Project #10 : Dash Board - Jan ‘03 – Jul ‘03

Client : COE, Wipro Technologies, India

Company : Wipro Technologies, India

Role : Project Engineer

Skills : Business Objects, Oracle 8.x, VB Script, PL/SQL, Informatica 7, HTML, Erwin 6.0, HTTP

Description : Balanced Scorecard is a weekly, monthly and quarterly dashboard report showing the status of e-enabling DWH group with respect to Financial, Customer, Internal Operations and Learning & Growth perspectives. These reports are primarily for the higher-level managers to see the status/performance of DWH practice.

Responsibilities:

Actively participated in every aspect of the project, primarily data capture through MS Excel using VB macros and PL/SQL.

Used the Erwin tool for data modeling.

Created dynamic reports using Business Objects; mapped flat files to source tables.

Loaded the Dimension and Fact tables using Informatica.

Designed high-level and low-level design documents adhering to business standards.

Project #11 : Transcoding for Residential Gateway - May ‘2001 – Jan’ 2002

Client : LG India

Company : Wipro Technologies, India

Role : Curriculum Project Trainee & Project Trainee

Skills : Java, XML, XSL, WML, cHTML, OSGI, Web services, SOAP, HTTPS

Date : CPT 04/12/2000 - 05/15/2001

Description : Different displays for interactive devices such as PCs, mobile phones, iMode phones and tablet devices for the Residential Gateway interact through common content management.

Responsibilities:

Analysis, Design and Implementation of various devices with common content.

Analyzed the feasibility of the system.

Extensively used XML and Java for transcoding for the display part of the common content.

Used XSL, WML, cHTML, SOAP and web services; tested on a device and proved the research.

Gained expertise in Event Handling and Java syntax for web applications.

Received the Best Innovative Project award from the research department of Wipro Technologies.

Developed a POC and demonstrated it at the Tech Forum – 2001.


