
Sr ETL / Informatica Developer

Location:
Charlotte, NC, 28203
Salary:
$70/hr
Posted:
September 07, 2022

Resume:

ARCHANA M

Sr ETL / Informatica Developer

Email: adsg5p@r.postjobfree.com Contact: 469-***-****

Professional Experience:

Around 8 years of experience in Information Technology with a strong background in database development, data warehousing (OLAP), and ETL processes using Informatica Power Center 10.x/9.x: Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.

Experience in integrating various data sources with multiple relational databases such as DB2, Oracle, and SQL Server, and in integrating data from XML files and flat files (fixed-width and delimited).

Experience in writing stored procedures and functions.

Experience in SQL tuning using hints and materialized views.

Tuned complex SQL queries by examining their explain plans.
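
For illustration, a minimal Oracle-style sketch of this tuning workflow; the table, index, and view names are hypothetical:

-- Inspect the optimizer's plan for a slow query, forcing an index via a hint
EXPLAIN PLAN FOR
SELECT /*+ INDEX(o ord_cust_idx) */ o.order_id, c.cust_name
FROM orders o
JOIN customers c ON c.cust_id = o.cust_id
WHERE o.order_date >= DATE '2022-01-01';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Precompute an expensive aggregate as a materialized view
CREATE MATERIALIZED VIEW mv_daily_sales
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT order_date, SUM(amount) AS total_amount
FROM orders
GROUP BY order_date;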

Experience in UNIX shell programming.

Expertise in Teradata RDBMS using FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, and BTEQ utilities.
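
For illustration, a minimal BTEQ sketch of the kind of batch work these utilities involve; the logon placeholders, database, and file names are hypothetical:

.LOGON tdpid/username,password;

/* Validate the load before exporting */
SELECT COUNT(*) FROM sales_db.daily_sales;
.IF ERRORCODE <> 0 THEN .QUIT 1;

/* Export a summary to a flat file */
.EXPORT REPORT FILE = sales_by_region.out;
SELECT region, COUNT(*) FROM sales_db.daily_sales GROUP BY region;
.EXPORT RESET;

.LOGOFF;
.QUIT 0;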

Expertise in Dimensional Data Modeling using Star and Snowflake Schema. Designed data models using Erwin.

Experience in creating and managing user accounts, security, rights, disk space, and process monitoring in Solaris and Red Hat Linux.

Extensive experience with T-SQL in constructing triggers and tables and in implementing stored procedures, functions, views, user profiles, data dictionaries, and data integrity.

Excellent T-SQL development skills, including writing complex queries involving multiple tables and developing and maintaining stored procedures, triggers, and user-defined functions.
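
A minimal T-SQL sketch of these patterns; the schema, table, and column names are hypothetical:

-- Stored procedure encapsulating a multi-table business query
CREATE PROCEDURE dbo.usp_GetCustomerOrders
    @CustomerId INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT o.OrderId, o.OrderDate, d.ProductId, d.Quantity
    FROM dbo.Orders AS o
    JOIN dbo.OrderDetails AS d ON d.OrderId = o.OrderId
    WHERE o.CustomerId = @CustomerId;
END;
GO

-- Trigger writing an audit row whenever an order is updated
CREATE TRIGGER dbo.trg_Orders_Audit
ON dbo.Orders
AFTER UPDATE
AS
BEGIN
    INSERT INTO dbo.OrdersAudit (OrderId, ChangedAt)
    SELECT OrderId, SYSDATETIME()
    FROM inserted;
END;
GO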

Extensive knowledge of handling slowly changing dimensions (SCD) Types 1, 2, and 3.
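
As an example of the Type 2 pattern, a changed dimension row is expired and a new version inserted; a minimal sketch with hypothetical table and column names:

-- Expire the current version of the changed customer row
UPDATE dim_customer
SET end_date = CURRENT_DATE, current_flag = 'N'
WHERE customer_id = 101
  AND current_flag = 'Y';

-- Insert the new version with an open-ended effective range
INSERT INTO dim_customer
    (customer_id, customer_name, address, start_date, end_date, current_flag)
VALUES
    (101, 'J. Smith', '12 New Street', CURRENT_DATE, DATE '9999-12-31', 'Y');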

Experience in using Informatica to populate the data into Teradata DWH.

Experience working with fact tables, dimension tables, and summary tables.

Established and administered a process for receiving, documenting, tracking, investigating, and acting on complaints concerning the organization’s privacy policies and procedures, in coordination with similar functions and, when necessary, legal counsel.

Technical Skills:

ETL: Informatica 10.x/9.x

Data Governance: Informatica EIC/EDC 10.2 with Axon, AnalytixDS

Cloud Services: Amazon Web Services (AWS)

Databases: Oracle 12c/11g/10g/9i/8i/7.x, DB2, Teradata V13/V12/V2R5, SQL Server

Reporting Tools: Business Objects XI 3.1/R2, Web Intelligence, Crystal Reports 8.x/10.x

Database Modeling: Erwin 4.0, Rational Rose

Languages: SQL, PL/SQL

Web Tools: HTML, XML

OS: Windows NT/2000/2003/7, UNIX, Linux, AIX

Education: Bachelor of Technology in Computer Science and Engineering, India

Client: Bank of America, Charlotte, NC Apr 2021 – Present

Role: Sr ETL/ Informatica Developer

Description: The Bank of America Corporation is an American multinational investment bank and financial services holding company headquartered in Charlotte, North Carolina. The bank was founded in San Francisco and took its present form when NationsBank of Charlotte acquired it. Bank of America is one of the world's largest financial institutions, serving individuals, small- and middle-market businesses, and large corporations with a full range of banking, investing, asset management, and other financial and risk management products and services. The company serves approximately 56 million U.S. consumer and small business relationships. It is among the world's leading wealth management companies and is a global leader in corporate and investment banking and trading.

Responsibilities:

Extensively worked on performance tuning of Teradata SQL, ETL, and other processes to optimize performance.

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic within mappings.

Worked on requirement analysis, ETL design, and development for extracting data from mainframes.

Worked with Informatica Power Center 10.0.2 Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Configured Power Center to load data directly to Hive without Informatica BDM for less resource-intensive jobs.

Developed and maintained ETL (extract, transform, load) mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files, and loaded it into Oracle.

Used advanced T-SQL features to design and tune code that interfaces with the database and other applications efficiently, and created stored procedures for business logic in T-SQL.

Designed, developed, and tested stored procedures, views and complex queries for Reports distribution.

Developed database triggers and stored procedures using T-SQL cursors and tables.

Developed Informatica workflows and sessions associated with the mappings using workflow manager.

Used Power Exchange to read source data from mainframe systems, Power Center for ETL, and DB2 as the target.

Worked on and created files for Business Objects.

Continuously measured data privacy compliance with multifactor risk scoring and monitoring of data access and movement using Data Privacy Management.

Worked with ETL developers to create external batches executing mappings and mapplets via the Informatica workflow designer, integrating data from varied sources such as Oracle, DB2, flat files, SQL databases, and XQuery.

Involved in creating new table structures and modifying existing tables to fit the existing data model.

Responsible for normalizing COBOL files using the Normalizer transformation.

Responsible for testing, modifying, debugging, documenting, and implementing Informatica mappings.

Debugged through session logs and fixed issues, utilizing the database for efficient transformation of data.

Worked with pre-session and post-session UNIX scripts for automation of ETL jobs. Also involved in migration/conversion of ETL processes from development to production.

Extracted data from databases such as Oracle and from external source systems such as flat files using the ETL tool.

Extracted data from sources such as SQL Server, AWS, and fixed-width and delimited flat files; transformed the data according to the business requirements and then loaded it into the target.

Involved in implementing the Land Process of loading the customer Data Set into Informatica MDM from various source systems

Used DTS/SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts, and performed processing in XML.

Involved in debugging Informatica mappings and testing stored procedures and functions.

Developed mapplets, reusable transformations, source and target definitions, mappings using Informatica 10.0.2.

Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the business requirements.

Involved in performance tuning of mappings in Informatica.

Migrated ETL workflows, mappings, and sessions to the QA and production environments.

Good understanding of source to target data mapping and business rules associated with the ETL process.

Environment: Informatica 10.0.2, UNIX, flat files (delimited), Data Privacy Management, COBOL files, Teradata 12.0/13.0/14.0

Client: Oracle, India (Indus Infotech) Apr 2019 – Feb 2021

Role: Informatica Developer

Description: Oracle provides organizations around the world with computing infrastructure and software to help them innovate, unlock efficiencies, and become more effective. The work we do is not only transforming the world of business; it is helping defend governments and advance scientific and medical research. From nonprofits to companies of all sizes, millions of people use our tools to streamline supply chains, make HR more human, quickly pivot to a new financial plan, and connect data and people around the world. At work, we embrace diversity, encourage personal and professional growth, and celebrate a global team of passionate people developing innovative technologies that help people and companies tackle real-world problems head-on.

Responsibilities:

Provided strategic direction for the planning, development, and implementation of various projects in support of enterprise data governance office and data quality team.

Familiar with split-domain functionality for BDM and EDC, using the same Blaze engine on the cluster.

Ran checks for duplicate or redundant records and provided guidance on how to proceed with the backend ETL process.

Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.

Partnered with data stewards to provide summary results of data quality analysis, which will be used to make decisions regarding how to measure business rules and quality of the data.

Implemented data quality processes including translation, parsing, analysis, standardization, and enrichment at point of entry and in batch modes, and deployed mappings to run in scheduled, batch, or real-time environments.

Worked on the design and development of custom objects and rules and reference data tables, and created/imported/exported mappings.

Developed 'matching' plans and helped determine best matching algorithm, configure identity matching and analyze duplicates.

Worked on building human task workflows to implement data stewardship and exception processing.

Developed KPIs & KRIs for the data quality function.

Drove improvements to maximize the value of data quality (e.g., driving changes to gain access to required metadata and quantifying the cost of poor data quality).

Performed data quality activities, i.e., data quality rule creation, edit checks, identification of issues, root cause analysis, value case analysis, and remediation.

Augmented and executed the data quality plan to ensure achievable and measurable milestones were mapped out for delivery and effectively communicated.

Prepared detailed documents on all mappings, mapplets, and rules, and handed the documentation over to the customer.

Environment: Informatica Developer 10.1.1 HotFix1, Teradata 14.0, Oracle 11G, PL/SQL, Flat Files (XML/XSD, CSV, EXCEL), UNIX/LINUX, Shell Scripting, SharePoint

Client: Wipro/Voya Financials – (Indus Infotech), Chennai, India Jan 2017 – Mar 2019

Role: Informatica Developer

Description: Voya's Vision is to be America's Retirement Company. Voya helps Americans plan, invest and protect their savings to get ready to retire better. Voya is in the following businesses to achieve the vision: Retirement Services, Annuities, Investment Management, Employee Benefits, and Life benefits as part of insurance benefits. Voya Financial is an American financial, retirement, investment and insurance company based in New York City. Voya began as ING U.S., the United States operating subsidiary of ING Group, which was spun off in 2013 and established independent financial backing through an initial public offering.

Responsibilities:

Collaborated with Business Analysts for requirements gathering, business analysis and designing of the Enterprise Data warehouse.

Worked in requirement analysis, ETL design, and development for extracting data from heterogeneous source systems such as DB2, MS SQL Server, Oracle, flat files, and mainframes and loading it into staging and star schema tables.

Created/Modified Business requirement documentation.

Created ETL specifications using gap analysis.

Provided production support on a monthly rotation basis.

Worked on Master Data Management (MDM) concepts and methodologies and applied this knowledge in building MDM solutions.

Worked on data modeling, baselined the data model to accommodate all business-specific requirements, and configured Informatica MDM Hub Server, Lookup Adapter, Cleanse Server, Cleanse Adapter, and Resource Kit.

Experience in match and merge analysis, trust validations, and metadata management.

Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages, and query groups.

Created Mappings to get the data loaded into the Staging tables during the Stage Process.

Coordinated with Business Leads to understand Match & Merge and incorporated their requirements and ideas.

Created Informatica mappings, sessions, and workflows.

Worked on CISM tickets for Production issues for Trucks and Cars Data Marts.

Extracted mainframe data for local landscapes such as Credit Check and Application.

Created Unit Test cases, supported SIT and UAT.

Worked with Informatica Power Exchange and Informatica Cloud to integrate and load data into Oracle.

Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.

Expertise in dimensional data modeling using star and snowflake schemas; involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business process, dimensions, and measured facts.

Worked with the Teradata database, writing BTEQ queries and loading data with the MultiLoad, FastLoad, and FastExport utilities.

Created CISM ticket with HP to fix DB2 issues.

Modified MORIS (Trucks) data mart load with enhancements.

Created Defects in ALM software and assigned to developers.

Fine-tuned complex SQL by checking the explain plans of the queries.

Environment: Informatica Power Center 9.6.1, DB2 9.8, Erwin, PL/SQL, Red Hat Linux, Power Exchange 9.5.1, Copybook, Tivoli, ALM, Shell Scripting.

Client: Wipro /JPMC, - (Indus Infotech), Chennai, India. Nov 2014 – Dec 2016

Role: ETL Developer

Description: JPMC is building an integrated customer data warehouse that is high-performance, scalable, and extensible and that provides a 360-degree view of the customer, which is critical to support future growth. Currently JPMC is in the process of sunsetting the EDW and migrating the data marts to the ICDW Integration Layer, which is built in third normal form on IBM's reference data model, BDW.

Responsibilities:

Involved in system analysis and design and participated in user requirements gathering. Worked on data mining to analyze data from different perspectives and summarize it into useful information.

Interacted with users, analyzed client business processes, documented business requirements, performed design analysis, and developed design specifications.

Carried out development and deployment activities for both history and incremental code in Dev instances.

Analyzed client requirements and summarized the release scope based on the functionality of the project.

Wrote unit test cases for scenarios such as count checks, attribute checks, and performance, functional, and business logic checks.

Performed sanity checks on code migrated during deployment and then loaded the history and incremental data.

Mapped data between source systems and warehouses.

Developed and modified programs to meet customer requirements.

Prepared functional and technical documentation for data warehouses.

Quickly learned new skills and applied them to daily tasks, improving efficiency and productivity.

Involved in requirement analysis, ETL design, and development for extracting data from heterogeneous source systems such as MS SQL Server, Oracle, and flat files and loading it into staging and star schema tables.

Used SQL Assistant to query Teradata tables.

Experience in various stages of the System Development Life Cycle (SDLC) and its approaches, such as Waterfall, Spiral, and Agile.

Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.

Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.

Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).

Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to mark a changed row.
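
A minimal sketch of this checksum-based change detection, using SQL Server's CHECKSUM function as an example; the staging and dimension table names are hypothetical:

-- Detect changed rows by comparing column checksums,
-- since no flag or last-updated column is available
SELECT s.cust_id
FROM staging_customer AS s
JOIN dim_customer AS d
  ON d.cust_id = s.cust_id
WHERE CHECKSUM(s.name, s.address, s.phone)
   <> CHECKSUM(d.name, d.address, d.phone);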

Worked on reusable code known as tie-outs to maintain data consistency; it compares the source and target after ETL loading completes to validate that no data was lost during the ETL process.
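
For illustration, a tie-out of this kind can be as simple as comparing row counts and control totals between source and target after the load; the table names are hypothetical:

-- Compare row counts and an amount control total on both sides
SELECT 'source' AS side, COUNT(*) AS row_cnt, SUM(amount) AS total_amt
FROM src_transactions
UNION ALL
SELECT 'target', COUNT(*), SUM(amount)
FROM dw_transactions;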

Worked on data extraction, transformation, and loading using BTEQ, FastLoad, and MultiLoad.

Used the Teradata FastLoad/MultiLoad utilities to load data into tables.

Environment: Informatica Power Center 9.5 (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor)


