
Meenakshi Sundaram Chellappa

Lead Data Engineer

Email: adryxf@r.postjobfree.com

Cell: 484-***-**** / Home: 484-***-****

PROFESSIONAL SUMMARY

18+ years of IT experience in the design, development, administration and implementation of projects using object-oriented and multidimensional analysis. Involved in full life cycle development including design, ETL strategy and troubleshooting. Worked extensively in the investment banking and finance domains.

13+ years of data warehouse/data mart experience specializing in Informatica ETL development using Informatica PowerCenter and Oracle Warehouse Builder (OWB), including Master Data Management (MDM), Informatica Data Quality (IDQ), Big Data and data warehousing.

10+ years of experience as a Data Architect and Data Modeler using ER Studio, including writing migration scripts for table alterations.

1+ years of experience on the Informatica Intelligent Cloud Services (IICS) and AWS cloud platforms, creating and consuming files in AWS S3 buckets, maintaining AWS Availability Zones and EC2 instances, and doing a POC in Glue ETL.
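
For illustration, a minimal sketch of the S3 file movement described above, using Python and boto3; the bucket and key names are hypothetical placeholders, not the actual project's:

# Minimal sketch of moving ETL files to/from S3 with boto3.
# Bucket and key names are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

# Publish an extract produced by the ETL workflow.
s3.upload_file("daily_extract.csv", "etl-landing-bucket", "inbound/daily_extract.csv")

# Consume a file dropped by an upstream system.
s3.download_file("etl-landing-bucket", "outbound/positions.csv", "positions.csv")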

Worked as an ETL Developer and Oracle PL/SQL developer on a variety of Windows and UNIX platforms with Oracle 19c/12c/11g/10g/9i/8i.

Technical lead and project manager with expertise as an IT SME in data warehouse design, development and production support.

Completed AWS certification, Oracle Certified Associate (OCA) and Oracle Certified Professional (OCP) credentials.

Worked with clients in different domains like Banking, Investments, Finance and Telecom.

Experienced in the complete Software Development Life Cycle (SDLC) using methodologies such as Agile (Scrum and Kanban) and the Waterfall model.

Strong experience in data migration, data conversion and Oracle PL/SQL scripts; worked with Apache Hive connectivity and Apache Spark SQL for data validations, Hadoop and Big Data processing.

Expertise in stored procedures, functions, triggers and the test plans necessary to ensure successful execution of data loading processes and the Legacy Loader.

Perform all levels of work when required, in addition to high-level, customer-facing work throughout the project lifecycle.

Designed and implemented a Kafka producer application to publish near real-time data using the Apache Kafka Connect framework.

Used Change Data Capture (CDC) software, Slowly Changing Dimensions (SCD) and the Oracle GoldenGate (OGG) real-time data replication tool as the source for the Apache Kafka producer.

Used GoldenGate Kafka adapters to write data to Kafka clusters.

Worked with database objects such as Views (including Materialized Views), Stored procedures, Functions, Triggers, Sequence Generators and Partitions on Oracle.

In-depth knowledge of performance tuning of Oracle procedures; experienced in fine-tuning the database using SQL tuning, SQL Trace and Explain Plan.

Solid experience loading data using Export/Import and expdp/impdp utilities, with good knowledge of TKPROF.

Analyzed tables and indexes for the Cost Based Optimizer (CBO) to improve performance and work efficiently.

Worked with various Oracle developer tools such as Toad, PL/SQL Developer, SQL Tools, SQL Developer and Pro*C.

Proficient in using Informatica Workflow Manager, Workflow Monitor, Server Manager and pmcmd (the Informatica command-line utility) to create, schedule and control workflows, tasks and sessions.
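
As a sketch, pmcmd can be driven from a wrapper script like the one below; the service, domain, folder and workflow names are hypothetical:

# Hedged sketch: triggering an Informatica workflow via pmcmd from Python.
# Service, domain, folder and workflow names are assumed placeholders.
import subprocess

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC_DEV",        # integration service (assumed name)
    "-d", "Domain_Dev",          # Informatica domain (assumed name)
    "-u", "etl_user", "-p", "****",
    "-f", "DW_LOADS",            # repository folder (assumed name)
    "-wait",                     # block until the workflow completes
    "wf_load_customer_dim",
]
result = subprocess.run(cmd)
print("workflow succeeded" if result.returncode == 0 else "workflow failed")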

Involved in analysis, design, testing, deployment and loading of data from heterogeneous source systems to the warehouse according to end users' requirements.

Extensively used transformations such as Source Qualifier, Expression, Filter, Router, Aggregator, Rank, Joiner, Sequence Generator, Normalizer, Lookup and Update Strategy to implement complex business logic.

Strong understanding of DW principles using fact tables, dimension tables, star and snowflake schemas, data vault modeling, and the process-control and CDC tables that maintain load status and error-tank details for every workflow.
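
To make the dimension-maintenance pattern concrete, a minimal Slowly Changing Dimension Type 2 sketch run through python-oracledb; all table and column names are hypothetical:

# Hedged SCD Type 2 sketch with python-oracledb: expire changed rows,
# then insert fresh current versions. Names are hypothetical.
import oracledb

conn = oracledb.connect(user="etl", password="****", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# Step 1: close out current rows whose attributes changed in staging.
cur.execute("""
    UPDATE dim_customer d
       SET d.current_flag = 'N', d.effective_end = SYSDATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address)""")

# Step 2: insert a new current row for new and changed customers.
cur.execute("""
    INSERT INTO dim_customer
        (customer_id, address, current_flag, effective_start, effective_end)
    SELECT s.customer_id, s.address, 'Y', SYSDATE, NULL
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y')""")
conn.commit()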

Extensive experience in relational and dimensional data modeling, creating logical and physical database designs and ER diagrams with modeling tools such as ER Studio and ERwin.

Knowledge of Java/J2EE/AngularJS, Kafka, Kubernetes and Docker containers, with web services development and testing experience in SOAP UI.

Strong exposure to Crystal Reports XI/8.5/8.0 and Business Objects, developing formatted reports for banking, investments, telecom, finance, retail, insurance, costing, payroll and billing statements.

Self-learning the Snowflake data warehouse using a personal trial account.

TECHNICAL COMPETENCIES

RDBMS:

Oracle 19c/12c/11g/10g/9i/8i, Teradata, MS SQL Server 2005, MS Access, MySQL

Database Tools:

SQL*Loader, SQL*Plus, Toad, PL/SQL Developer, Oracle APEX 4.0, SQL Navigator, Oracle ADF Faces, PuTTY, ER Studio 9.5/11, ERwin 3.5.2

Reporting Tools:

Crystal Reports 8.5, BI Publisher, BOXI

Environment:

Windows XP/NT/2003, UNIX and Linux

Languages:

SQL, PL/SQL, Java, Python, EJB, PHP, VB.NET, VS 2005, C, C++, Pro*C, XML

ETL Tools:

Informatica PowerCenter 10.x/9.x/8.6/8.1, Oracle Warehouse Builder (OWB), ODI

Other Tools:

StarTeam, BMC (Magic Ticket), Control-M, SOAP UI, HP Quality Center, Rally, Jira, Confluence, FitNesse, Docker, Kubernetes container services, Jenkins, SourceTree, Git, Git Stash, WinSCP, VPN, ServiceNow

Cloud:

IICS, AWS and GCP

Educational Qualification:

●Master of Philosophy (M.Phil) in Computer Science, Periyar University (2009)

●Master of Computer Applications (M.C.A), Bharathidasan University, Trichy (2000 - 2003)

Certifications:

Completed the AWS Certified Cloud Practitioner exam; certificate link below for reference.

https://www.credly.com/badges/d42495b5-11f4-4f0b-84f6-99218f0c64ec/public_url

ORACLE Certified Professional – Oracle Certified PL/SQL, Forms and Reports, Build Internet Applications Developer (OCP)

ORACLE Certified Associate – Oracle Certified SQL Developer (OCA)

Informatica 10.x Developer Best Award at NIIT Technologies Ltd.

Attended training on IICS and GCP

PROFESSIONAL EXPERIENCE

Client: Charles Schwab & Co., Inc., TX (Oct. 2021 - Present)

Industry Practice: Investment Banking (Wealth Management)

Role: Lead Data Engineer (IICS Developer, AWS and GCP Cloud)

The Charles Schwab Corp. is a savings and loan holding company, which engages in the provision of wealth management, securities brokerage, banking, asset management, custody, Fund Account, Cashiering, Trade Buy/Sell, Equity, Mutual Fund and financial advisory services. It operates through the Investor Services and Advisor Services segments.

Responsibilities:

Gather requirements to estimate story points, project plans, workload and capacity; create designs; drive development, QA and UAT; follow up on production deployments; and manage team members.

Working on IICS and Informatica PowerCenter/PowerExchange ETL to build workflows that create and consume files in NAS locations and move files to AWS S3 buckets.

Working on the Teradata IDW (Integrated Data Warehouse) to store large data volumes and developing Data and Rep Technology (DaRT); working with Apache Hive connectivity and Apache Spark SQL for data validations, Hadoop and Big Data processing.
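
A minimal sketch of the kind of Spark SQL count validation described above; the database and table names are hypothetical:

# Hedged sketch: source-vs-target row-count validation with Spark SQL.
# Database and table names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("dw_count_validation")
         .enableHiveSupport()          # read Hive tables via the metastore
         .getOrCreate())

src = spark.sql("SELECT COUNT(*) AS cnt FROM stage_db.orders").first()["cnt"]
tgt = spark.sql("SELECT COUNT(*) AS cnt FROM dw_db.fact_orders").first()["cnt"]

print(f"source={src}, target={tgt}, match={src == tgt}")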

Creating new tables and modifying table structures in IDW and the SQL Server database for project development.

Creating new Control-M job schedules to trigger ETL workflows and batch processes.

Creating PL/SQL packages, stored procedures, functions, cursors and triggers to implement business logic, and developing various transaction handlers for the web service application.
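
As a sketch of such a transaction handler, a PL/SQL procedure created and invoked via python-oracledb; all object names here are hypothetical:

# Hedged sketch: a PL/SQL transaction handler created and called from
# python-oracledb. All table and procedure names are hypothetical.
import oracledb

conn = oracledb.connect(user="app", password="****", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

cur.execute("""
CREATE OR REPLACE PROCEDURE post_transaction (
    p_account_id IN NUMBER,
    p_amount     IN NUMBER
) AS
BEGIN
    INSERT INTO txn_ledger (account_id, amount, posted_at)
    VALUES (p_account_id, p_amount, SYSTIMESTAMP);

    UPDATE account_balance
       SET balance = balance + p_amount
     WHERE account_id = p_account_id;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;   -- undo the partial posting, then re-raise
        RAISE;
END;""")

cur.callproc("post_transaction", [1001, 250.00])
conn.commit()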

Working with the SQL Server database to manage User Maintained Table (UMT) structures and retrieve details.

Analyzing production issues and mitigating data impacts to identify the details; providing Root Cause Analysis (RCA), backlogging the fix as a user story, implementing and testing the code, and deploying to production once sign-off is complete.

Implementing Jenkins, Git and GitHub for version control, code builds, UAT testing and the CI/CD pipeline that deploys code to production.

Working on GCP (Google Cloud Platform) using Cloud Storage, Dataproc, Dataflow, BigQuery, Cloud Composer, Cloud Pub/Sub, GCS buckets and gsutil commands.

Expert in working with Cloud Pub/Sub to replicate data in real time from source systems to GCP BigQuery.
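
For illustration, publishing a change record to Cloud Pub/Sub with the google-cloud-pubsub client (a BigQuery subscription or a Dataflow job would then land it in BigQuery); the project and topic names are hypothetical:

# Hedged sketch: publishing a CDC-style record to Cloud Pub/Sub.
# Project and topic names are hypothetical.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "cdc-account-events")

record = {"account_id": 1001, "op": "UPDATE", "balance": 2500.00}
future = publisher.publish(topic_path, data=json.dumps(record).encode("utf-8"))
print("published message id:", future.result())  # blocks until acked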

Supporting Cloud Data Warehouse (CDW) GCP data migrations and Enterprise Data Warehouse (EDW) SDLC development.

Environment: IICS, AWS, GCP, Informatica 10x, Oracle 12c and 19c, Teradata, IDW, SQL Server, Jira, PuTTY, UNIX and Linux, NAS, WinSCP, Bamboo, Bitbucket, Git and GitHub, SourceTree, Git Stash, Confluence, Remedy, Control-M, Tableau

Client: SEI Investments, Oaks PA (Dec. 2010 – Oct. 2021)

Industry Practice: Investment Banking (Wealth Management)

Role: Tech Lead (ETL Developer & Oracle PL/SQL)

SEI is a leading global provider of comprehensive wealth management solutions to banks, trust and private wealth institutions, investment advisors, investment managers, and the SEI Wealth Network for families and individuals.

SWP is developed using Agile and Scrum methodology. It covers mutual fund processing for custody, including Charles Schwab, NSCC, DTCC and DST order and confirm processing; custodian accounts; fund accounts; funding nostro accounts; income and dividends; long-term and short-term capital gains; CSTOA; account maintenance; firm branch reports; global (multi-currency) net settlements; custody net settlements; fill pricing; order initiation; trades, transactions and activity services; currency movement; cash receipts; free movements; fixed income; DTCC Receiver Authorized Delivery (RAD); tax and fee services; contributions, distributions and withholding; currency sweep models; accounting; portfolio and firm details; account settlement and information services; equities; corporate and municipal bonds; government bonds; money market instruments; money market reform; NSCC UFA; currency sweep characteristics and loaders; asset services and securities processing; and the daily Net Asset Value (NAV) price file upload.

Vanguard (NSCC) – Maintained client account and Vanguard asset details for mutual fund processing, money market, money market reform, equity, bonds, corporate actions (CA), and account and portfolio maintenance.

Private Banking – Account and portfolio, trade and settlement transactions, corrections and reversals, reconciliations, and statement report maintenance.

Wells Fargo Bank – Mutual fund processing, money market, money market reform, equity, bonds, corporate actions (CA), and account and portfolio maintenance.

US Bank – Maintained client account, custody and fund account details to create mutual fund order processing and fixed income, and posted the transactions every day.

BNY Mellon NRS and Clear Sky Capital Inc. – Blue Sky report details.

ONESOURCE – Tax and Accounting (PAN), maintaining global tax services.

RR Donnelley (RRD) – Maintaining client account address details.

Envestnet – Maintaining client account and firm broker details.

Responsibilities:

Creating PL/SQL packages, stored procedures, functions, cursors and triggers to implement business logic, and developing various transaction handlers for the web service application.

Designed and developed ETL interfaces using PL/SQL stored procedures, UTL_FILE, external tables and the SQL*Loader Oracle utility for the staging area and target database.
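
A minimal external-table sketch for the staging pattern above, executed via python-oracledb; the directory object, file and table names are hypothetical:

# Hedged sketch: staging a flat file through an Oracle external table.
# Directory object, file and table names are hypothetical.
import oracledb

conn = oracledb.connect(user="etl", password="****", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

cur.execute("""
CREATE TABLE ext_orders (
    order_id  VARCHAR2(20),
    order_dt  VARCHAR2(10),
    amount    VARCHAR2(20)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY stage_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
    )
    LOCATION ('orders.csv')
)
REJECT LIMIT UNLIMITED""")

# Conform types while loading from the external table into staging.
cur.execute("""
    INSERT INTO stg_orders (order_id, order_dt, amount)
    SELECT TO_NUMBER(order_id), TO_DATE(order_dt, 'YYYY-MM-DD'),
           TO_NUMBER(amount)
      FROM ext_orders""")
conn.commit()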

Working in Informatica PowerCenter/PowerExchange ETL to create workflows; working in Master Data Management (MDM), Big Data and data warehousing.

Technical lead and project manager with expertise as an IT SME in data warehouse design, development and production support.

Pull data from one schema (EDB) to another (the data warehouse) using Informatica ETL transformation tools, with process-control and CDC tables maintaining load status and error-tank details for every workflow; tune ETL code and mapping designs for source SQL queries, lookup SQL and ETL transformations.

Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), data marts and Decision Support Systems (DSS) using multidimensional and dimensional modeling (star and snowflake schema) concepts.

Designed and implemented a Kafka producer application to publish near real-time data using the Apache Kafka Connect framework.

Used Change Data Capture (CDC) software and the Oracle GoldenGate (OGG) real-time data replication tool as the source for the Apache Kafka producer.

Used GoldenGate Kafka adapters to write data to Kafka clusters.
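
A minimal producer sketch in the spirit of the GoldenGate-to-Kafka pipeline above, using confluent-kafka; the broker address, topic and record layout are hypothetical:

# Hedged sketch: publishing CDC-style change records to Kafka with
# confluent-kafka. Broker, topic and record layout are hypothetical.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker1:9092"})

def on_delivery(err, msg):
    if err is not None:
        print(f"delivery failed: {err}")

# Shaped loosely like a change record replicated by OGG.
change = {"table": "ACCOUNTS", "op_type": "U",
          "after": {"account_id": 1001, "balance": 2500.00}}

producer.produce("cdc.accounts",
                 key=str(change["after"]["account_id"]),
                 value=json.dumps(change),
                 on_delivery=on_delivery)
producer.flush()  # block until all queued messages are delivered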

Full lifecycle exposure to Data Warehouse projects and Data marts with Star and Snowflake Schemas.

Expertise in OLAP/OLTP system study, developing database schemas such as star schema, snowflake schema, conformed dimensions and slowly changing dimensions used in relational, dimensional and multidimensional modeling.

Data Architect and Data Modeler using ER Studio, writing migration scripts for table alterations.

Created a test environment for data staging using SQL and PL/SQL scripts, and optimized critical queries to eliminate full table scans and reduce disk I/O.

Optimized SQL queries for better performance; tuned Oracle stored procedures and developed business logic in the database layer using PL/SQL; performance-tuned multiple SQL queries and PL/SQL blocks using Explain Plan, hints and SQL Trace.
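
As a sketch of the Explain Plan workflow, run via python-oracledb; the query, hint and table names are hypothetical:

# Hedged sketch: capturing an execution plan with EXPLAIN PLAN and
# DBMS_XPLAN.DISPLAY. Query and table names are hypothetical.
import oracledb

conn = oracledb.connect(user="etl", password="****", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

cur.execute("""
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(t txn_acct_idx) */ *
      FROM txn_ledger t
     WHERE t.account_id = 1001""")

cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY)")
for (line,) in cur:
    print(line)   # inspect for full table scans, join order and cost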

Performed technical tasks including estimating, analysis, technical requirements, design, construction, unit testing, integration testing and UAT testing, followed by production release code maintenance and support.

Involved in exhaustive performance tuning using DBMS_PROFILER and Explain Plan, monitoring and optimizing PL/SQL blocks and the batch framework process in both development and production Oracle environments.

Created and manipulated PL/SQL tables (index-by tables and nested tables) and varrays; created procedures, packages and appropriate REF cursors to handle BULK COLLECT and BULK bind operations.
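
A minimal BULK COLLECT / FORALL sketch, executed as an anonymous block via python-oracledb; the table and column names are hypothetical:

# Hedged sketch: BULK COLLECT into a nested-table type plus a FORALL
# bulk bind, run as an anonymous PL/SQL block. Names are hypothetical.
import oracledb

conn = oracledb.connect(user="etl", password="****", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

cur.execute("""
DECLARE
    TYPE t_ids IS TABLE OF stg_orders.order_id%TYPE;  -- nested table type
    l_ids t_ids;
BEGIN
    SELECT order_id BULK COLLECT INTO l_ids
      FROM stg_orders
     WHERE load_status = 'NEW';

    FORALL i IN 1 .. l_ids.COUNT
        UPDATE stg_orders
           SET load_status = 'PROCESSED'
         WHERE order_id = l_ids(i);

    COMMIT;
END;""")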

Coordinated ETL workflow scheduling with the Control-M development team, providing ETL workflow information in the job run-book so jobs trigger on time.

Developed design documents (HLD and LLD), unit test cases and project documentation (user guides, run book, standard report instructions, demos).

Analyzed Oracle objects and created partitions and compressed tables for very large production tables to reduce disk contention and improve performance.
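
For illustration, a range-partitioned, compressed table sketch; the table, column and partition names and intervals are hypothetical:

# Hedged sketch: a compressed, interval-partitioned fact table.
# Table, column and partition names are hypothetical.
import oracledb

conn = oracledb.connect(user="etl", password="****", dsn="dbhost/ORCLPDB1")
conn.cursor().execute("""
CREATE TABLE fact_txn (
    txn_id    NUMBER,
    txn_date  DATE,
    amount    NUMBER(18,2)
)
COMPRESS
PARTITION BY RANGE (txn_date)
INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))   -- auto-create monthly partitions
(
    PARTITION p_initial VALUES LESS THAN (DATE '2020-01-01')
)""")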

Generated DDL scripts for new database objects such as tables, views, sequences, synonyms, indexes and CDC tables, and granted privileges.

Wrote migration scripts satisfying new business requirements by converting data from its present form into a new form, performing all business validations as part of pre- and post-migration.
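
A minimal migration-script sketch in the add-backfill-constrain pattern; the table, column and default values are hypothetical:

# Hedged sketch: a table alteration with a data backfill, run as a
# migration script. Table/column names and values are hypothetical.
import oracledb

conn = oracledb.connect(user="etl", password="****", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

cur.execute("ALTER TABLE accounts ADD (status_code VARCHAR2(10))")

# Backfill existing rows (pre-migration validation would count NULLs first).
cur.execute("UPDATE accounts SET status_code = 'ACTIVE' WHERE status_code IS NULL")
conn.commit()

# Tighten the column once the backfill is verified (post-migration check).
cur.execute("ALTER TABLE accounts MODIFY (status_code DEFAULT 'ACTIVE' NOT NULL)")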

Using ServiceNow ticket requests for production issues and incident tracking: resolving problems, identifying issues, providing immediate production solutions and Root Cause Analysis (RCA), and creating, resolving or analyzing Incident (INC), Problem (PRB) and Request tickets assigned to different teams.

Creating FitNesse fixture automation tests for developed stored procedures.

Handling Oracle databases on UNIX/Linux servers via PuTTY; pulling files from NAS locations to local machines using WinSCP.

Works independently to develop, configure, code and test programs from specifications.

Develop ETL data workflows and code for new data integrations, maintain and enhance existing workflows, and create technical and coding mapping documents.

Created a new BDA adapter, Kafka configuration, partitioning and code to stage data in various tables, using RESTful API services and triggering a POJO application endpoint to generate flat-file reports.

Created new Angular UI/UX page configurations and wireframes to design UI pages.

Managing an onshore/offshore development model and creating tasks for team members.

Environment: Informatica 10x and 9x, Oracle 12c/11g, ER Studio 11/9.5, Java/J2EE, Angular, Kafka, Toad, SQL*Plus, PL/SQL Developer, SQL*Loader, Rally, Windows 10, DUDE, FitNesse, PuTTY, WinSCP, Bitbucket, SourceTree, Git Stash, Confluence, Kubernetes, Docker, JSON, Avro, SWIFT messages, Jenkins, Dynatrace, VPN, ServiceNow, UNIX and Linux, SOAP UI, StarTeam, HP Quality Center

Client: British Telecom Global Services – UK (Nov. 2009 – Dec. 2010)

Tech Mahindra, India

Project: BTGS – Seven Sisters Platform (ANDES – Agora)

Role: Technical Associate (Oracle PL/SQL Developer)

Description:

●The British Telecom Global Services (BTGS) Master Data Management (MDM) program is a cross-functional initiative aimed at implementing a harmonized product portfolio across countries and regions, supported by a standard set of systems and processes.

●This will improve BT Global's customer service and reduce cost through standardization and rationalization of systems and processes, so the program involves a number of system replacements.

●The BTGS Seven Sisters Platform (ANDES – Agora) central repository data migration project is an end-to-end (E2E) data processing effort; the Big Friendly Giant (BFG) Transient Reconciliation Database (BTRDB) component of the solution provides a GUI screen.

●Developed MPLS and iVPN networks and Reuters operations.

Roles & Responsibilities:

Worked on PL/SQL programming Packages, Stored Procedures, Functions, Cursors and Triggers.

Developed PL/SQL code for various work packages as well as change requests.

Responsible for preparing Low Level Design (LLD) and Unit Test Plan (UTP) documents, unit testing the code, preparing release notes, and deployment and support for product builds and database maintenance.

Worked in data warehousing with the OWB (ETL) tool, extracting data in .csv format and creating various reports for data analysis, reconciliation, data validation and quality checking, matching report data to database data (record counts), consolidation and DWH testing. Handled customer-wise and country-wise reports; pulled data from one schema to another. Developed Pro*C and SQL*Loader scripts (control files, log files and BAT files) and worked with external tables (flat files) for exports (transform) and imports (load).
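
As a sketch of the count-matching validation described above; the file, table and connection details are hypothetical:

# Hedged sketch: matching a .csv report's row count against the database.
# File, table and connection details are hypothetical.
import csv
import oracledb

with open("country_report.csv", newline="") as f:
    file_count = sum(1 for _ in csv.reader(f)) - 1   # exclude header row

conn = oracledb.connect(user="rpt", password="****", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()
cur.execute("SELECT COUNT(*) FROM consolidated_entities")
(db_count,) = cur.fetchone()

print(f"file={file_count}, db={db_count}, match={file_count == db_count}")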

Generated consolidated and discrepancy reports from Oracle Warehouse Builder (OWB) in .xls format, creating hyperlinks in the .xls spreadsheet reports sent to the client.

Created new objects, join tables and structures, and altered existing entities, attributes, materialized views, sequences and indexes, changing design descriptions per client requirements.

Expanded the Physical Data Model (PDM) for the OLTP application using ERwin.

Provided Oracle Application Server and Oracle APEX implementation and support.

Interacted with the business to resolve various code issues; prepared daily and weekly reports, process performance monitoring reports and consolidated entity lists.

Responsible for analysis, design, development, deployment and support of product builds and database maintenance, covering BT customers, suppliers, contracts, sites, products and inventory information.

Environment: Oracle 10g, Oracle Warehouse Builder (OWB) tool, SQL Developer, Toad, HP Quality Center, Oracle JDeveloper 10g, Oracle ADF Faces, BMC Remedy, PHP, Oracle APEX 4.0, ERwin 3.5.2, Java, J2EE, Pro*C, Windows and UNIX.

Sun Chemical Sales Analytics - India (Dec. 2004 – Sep. 2009)

Role: Software Engineer

Description:

Sun Chemical is a leading chemical company whose portfolio ranges from chemicals and plastics to performance products. A web-based portal was designed to cater to the needs of the sales and marketing team.

Roles & Responsibilities:

Worked on PL/SQL programming Packages, Stored Procedures, Functions, Cursors and Triggers.

Worked with the Python programming and scripting language to write and test code, debug programs and integrate applications with third-party web services.
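
A minimal sketch of the web-service integration pattern, using the requests library; the endpoint URL and parameters are hypothetical:

# Hedged sketch: consuming a third-party web service from Python.
# The endpoint URL and parameters are hypothetical.
import requests

resp = requests.get(
    "https://api.example.com/v1/prices",   # hypothetical endpoint
    params={"product": "INK-1001"},
    timeout=30,
)
resp.raise_for_status()        # fail fast on HTTP errors

for item in resp.json().get("prices", []):
    print(item)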

Worked in a UNIX (shell scripting) environment.

Worked with the Informatica (ETL) data warehousing tool, extracting reports from identified process flows and validating data against business requirements for data analysis, quality, matching and consolidation across various reports for data warehouse testing.

Gathered knowledge from the client side and reverse-transitioned it to the team.

Assisted the test team during functional testing and unit testing.

Worked on designing the logical model and Oracle ERP application.

Performed impact analysis on the change requests from clients and proposed solutions.

Environment: Oracle 9i/10g, Python, Toad, Crystal Reports 8.5, Informatica (ETL), Pro*C, Oracle ADF Faces, Windows and UNIX, SQL Developer, SQL Server, T-SQL, MS Access, VB.NET, ADO.NET, ASP.NET (VS 2005), Forms 6i & Reports 6i, Windows XP.


