
Informatica Developer

Location: Stamford, CT

Salary: $75/hr

Posted: October 19, 2020


Resume:

Harshini

Cell: 609-***-**** / 585-***-**** | E-Mail: adg4ab@r.postjobfree.com

CAREER OBJECTIVE:

Seeking challenging assignments in the fields of Business Analysis, Data Analytics, Program Management, and Software Development. Aiming to achieve career growth through continuous learning and to prove an asset to the organization.

EDUCATION:

Undergraduate Degree: Bachelor of Engineering (B.E.), SRM University, India

Major: Computer Science and Engineering

TECHNICAL SKILLS:

Operating Systems: Unix, Linux, and Windows

Programming and Scripting: C, C++, Java, .Net, Perl scripting, shell scripting, VBA, PL/SQL, T-SQL

ETL Tools: Informatica PowerCenter 10.2.0/10.1.1/9.5/9.1/8.6/8.1/7.1, Informatica PowerExchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), SSIS, Salesforce, DataStage, etc.

Database Tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v10 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle/Sybase), Visio, Erwin

Scheduling Tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M

Conversion/Transformation Tools: Informatica Data Transformation, Oxygen XML Editor (ver. 9, ver. 10)

Software Development Methodology: Agile, Waterfall

Domain Expertise: Insurance/Finance

RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle 11g/10g/9i/8i, Teradata, IBM DB2 UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000

EXPERIENCE

Summary:

•Over 8 years of extensive experience using ETL methodologies to support data extraction, data migration, and data transformation, and to develop master data, using Informatica PowerCenter and Teradata.

•Created mappings in Mapping Designer to load data from various sources, using transformations like Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, and more.

•Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, dimensional data modelling, the Ralph Kimball approach, Star/Snowflake modelling, data marts, OLAP, fact and dimension tables, and physical and logical data modelling (a hedged SCD Type 2 sketch follows this summary).

•Involved in the analysis, design, development, testing, and implementation of business application systems for the Healthcare, Pharmaceutical, Financial, Telecom, and Manufacturing sectors.

•Strong understanding of OLAP and OLTP Concepts.

•Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.

•Experience in SQL, PL/SQL and UNIX shell scripting.

•Hands on experience working in LINUX, UNIX and Windows environments.

•Excellent verbal and written communication skills; proven highly effective in interfacing across business and technical groups.

•Good knowledge of data quality measurement using IDE.

•Extensive ETL experience using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.

•Strong experience in Dimensional Modelling using Star and Snowflake Schema, Identifying Facts and Dimensions, Physical and logical data modelling using Erwin and ER-Studio.

•Experience in designing and Developing complex Mappings using Informatica PowerCenter with Transformations such as Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, XML generator, XML parser, Stored Procedure, Sorter and Sequence Generator.

•Working experience using Informatica Workflow Manager to create sessions and batches, schedule workflows and worklets, build reusable tasks, and monitor sessions.

•Experienced in Performance tuning of Informatica and tuning the SQL queries.

•Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.

•Experience in handling initial/full and incremental loads.

•Expertise in scheduling workflows with the Windows scheduler, Unix, and scheduling tools like Control-M & Autosys.

•Designed, installed, and configured core Informatica components such as the Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, and IDD, along with data modelling.

•Experience in support and knowledge transfer to the production team.

•Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.

•Experienced in quality assurance and manual and automated testing procedures, with active involvement in database-, session-, and mapping-level performance tuning and debugging.
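
As referenced above, a minimal SQL sketch of the SCD Type 2 pattern, assuming hypothetical STG_CUSTOMER and DIM_CUSTOMER tables, tracked columns, and a DIM_CUSTOMER_SEQ sequence (Oracle-flavored syntax; none of these names come from an actual engagement):

    -- Step 1: expire the current dimension row when tracked attributes change
    -- (NULL-safe comparisons via NVL are omitted here for brevity).
    UPDATE dim_customer d
       SET d.eff_end_dt = TRUNC(SYSDATE) - 1,
           d.curr_flag  = 'N'
     WHERE d.curr_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert the new version as the current row. This covers both
    -- changed and brand-new keys, since changed keys no longer have a
    -- curr_flag = 'Y' row after step 1.
    INSERT INTO dim_customer
           (customer_key, customer_id, address, segment,
            eff_start_dt, eff_end_dt, curr_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.curr_flag = 'Y');

In a PowerCenter mapping the same logic is typically expressed with a Lookup on the dimension, an Expression to compare attributes, and an Update Strategy routing DD_UPDATE/DD_INSERT rows.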

Professional Experience:

1) Client: Nestle Water | Location: Stamford, CT

Sr. Informatica PowerCenter Developer | December 2019 – Present

Responsibilities:

•Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.

•Experienced in Parsing high-level design specs to simple ETL coding and mapping standards.

•Worked in an Agile methodology; participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter, peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.

•Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate records (see the sketch after this list).

•Extensively worked on UNIX shell scripts for server health-check monitoring such as repository backup, CPU/disk-space utilization, Informatica server monitoring, and UNIX file-system maintenance/clean-up, as well as scripts using Informatica command-line utilities.

•Extensively worked on CDC to capture data changes in the sources for delta loads. Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions.

•Developed workflows with worklets, event waits, assignments, conditional flows, and Email and Command tasks using Workflow Manager.

•Proficient in system study, data migration, data integration, data profiling, data cleansing/scrubbing, and data quality.


•Created ETL mappings and transformations in Informatica PowerCenter to move data from multiple sources into the target area, using complex transformations like Expression, Router, Lookup, Source Qualifier, XML Generator, XML Parser, Aggregator, Filter, and Joiner.

•Responsible for preparing logical and physical data models and documenting them.

•Performed ETL code reviews and Migration of ETL Objects across repositories.

•Developed ETLs for masking data made available to the offshore development team.

•Developed UNIX scripts for dynamic generation of parameter files and for FTP/SFTP transmission.

•Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production to meet the SLAs.

•Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data on different layers to capture data discrepancies/inaccuracies and ensure successful, accurate data loading.

•Scheduled Informatica workflows using OBIEE.

•Involved in implementing change data capture (CDC) and Type I, II, and III slowly changing dimensions.

•Developed functions and stored procedures to aid complex mappings.
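
As a hedged illustration of the date-chaining repair referenced above, here is the kind of statement such a BTEQ script might contain. The dim_policy table and its columns are hypothetical; Teradata's UPDATE ... FROM correlated-update form is used, with a MIN ... OVER window as the usual stand-in for LEAD. The statement resets each version's end date to the day before the next version of the same key begins:

    -- Re-chain eff_end_dt so each version ends the day before the next
    -- version starts; open-ended current rows get 9999-12-31.
    UPDATE d
    FROM dim_policy AS d,
         (SELECT policy_id, eff_start_dt,
                 COALESCE(MIN(eff_start_dt) OVER (PARTITION BY policy_id
                                                  ORDER BY eff_start_dt
                                                  ROWS BETWEEN 1 FOLLOWING
                                                           AND 1 FOLLOWING) - 1,
                          DATE '9999-12-31') AS new_end_dt
            FROM dim_policy) AS nxt
    SET eff_end_dt = nxt.new_end_dt
    WHERE d.policy_id    = nxt.policy_id
      AND d.eff_start_dt = nxt.eff_start_dt
      AND d.eff_end_dt  <> nxt.new_end_dt;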

Environment: Informatica PowerCenter 10.x/9.6, Oracle 11g, Teradata, PL/SQL, SQL Developer, TOAD, PuTTY, Unix

2) Client: Flagstar Bank | Location: Jackson, MI

Sr. Informatica PowerCenter Developer | March 2018 – September 2019

Responsibilities:

•Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.

•Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.

•Extracted data from an Oracle database, spreadsheets, and CSV files, staged it in a single place, and applied business logic to load it into the central Oracle database.

•Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.

•Migrated code between environments and maintained code backups.

•Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.

•Involved in the unit testing and integration testing of the developed workflows.

•Extensively worked with Korn shell scripts for parsing and moving files, and for re-creating parameter files in post-session command tasks.

•Imported source/target tables from the respective databases, created reusable transformations like Joiner, Router, Lookup, Filter, Expression, and Aggregator, and created new mappings using the Designer module of Informatica.

•Designed table structure in Netezza.

•Involved in database migrations from legacy systems (SQL Server) to Oracle and Netezza.

•Used Address Doctor to validate addresses, and performed exception handling, reporting, and system monitoring. Created rules as mapplets, Logical Data Objects (LDOs), and workflows; deployed the workflows as an application to run them and tuned the mappings for better performance.

•Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.

•Profiled data on Hadoop to understand the data and identify data quality issues.

•Imported and exported data between relational databases and the Hadoop Distributed File System using Sqoop.

•Developed shell scripts for running batch jobs and scheduling them.

•Wrote UNIX shell scripts to load data from flat files into the Netezza database (see the sketch after this list).

•Handled User Acceptance Testing and System Integration Testing, in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented a Unit Test Plan (UTP) for the code.

•Involved in production support.
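
As noted above, a sketch of the kind of statement those Netezza load scripts might wrap with nzsql, using a transient external table. The staging table name and file paths are hypothetical assumptions for the example:

    -- Load a pipe-delimited flat file into a staging table via a
    -- transient external table (column layout inferred from the target).
    INSERT INTO stg_loan_txn
    SELECT *
    FROM EXTERNAL '/data/inbound/loan_txn.dat'
    USING (
        DELIMITER '|'        -- field separator in the extract
        SKIPROWS 1           -- skip the header record
        MAXERRORS 10         -- tolerate a handful of bad rows
        LOGDIR '/data/logs'  -- where reject/log files are written
        REMOTESOURCE 'ODBC'  -- the file resides on the client host
    );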

Environment: Informatica PowerCenter 9.6, Informatica Big Data Edition 9.6.1, Oracle 11g, SQL Server, PL/SQL, Unix, WinSCP, Netezza, Hadoop, HDFS, Hive, Sqoop

3) Client: AvMed | Location: Miami, FL

Sr. Informatica ETL Developer | January 2017 – February 2018

Responsibilities:

•Developed complex mappings by efficiently using various transformations, mapplets, mapping parameters/variables, and mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL, and Web Service transformations.

•Demonstrated the ETL process design to the Business Analyst and Data Warehousing Architect.

•Assisted in building the ETL source-to-target specification documents.

•Effectively communicated with business users and stakeholders.

•Worked on SQL coding to override the generated SQL queries in Informatica.

•Involved in unit testing to validate data from different data sources.

•Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions. Worked with partitioned tables and automated the partition drop-and-create process in the Oracle database (see the sketch after this list).

•Performed data validation in the target tables using complex SQL to make sure all modules were integrated correctly.

•Performed data conversion/data migration using Informatica PowerCenter.

•Involved in performance tuning to improve the data migration process.

•Analyzed session log files to resolve mapping errors, identified bottlenecks, and tuned them for optimal performance.

•Created UNIX shell scripts for Informatica pre-/post-session operations.

•Automated the jobs using CA7 Scheduler.

•Documented and presented the production/support documents for the developed components when handing the application over to the production support team.

•Created the data model for the data marts.

•Used materialized views to create snapshots of the history of main tables and for reporting purposes.

•Coordinated with users for migrating code from Informatica 8.6 to Informatica 9.5.

•Contacted the Informatica tech support group regarding unknown problems.

•Provided on-call support during weekends.

•Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production to meet the SLAs.

•Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.

•Prepared SQL Queries to validate the data in both source and target databases.
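
A hedged PL/SQL sketch of the partition create/drop automation mentioned in this list: the claims_fact table, the p_YYYYMM partition naming, and the 36-month retention window are all hypothetical assumptions for the example.

    -- Rolling-window maintenance for a range-partitioned fact table:
    -- add next month's partition and drop the partition that ages out.
    DECLARE
        v_next DATE := ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 1);
    BEGIN
        -- Create the partition for next month (upper bound is exclusive).
        EXECUTE IMMEDIATE
            'ALTER TABLE claims_fact ADD PARTITION p_'
            || TO_CHAR(v_next, 'YYYYMM')
            || ' VALUES LESS THAN (DATE '''
            || TO_CHAR(ADD_MONTHS(v_next, 1), 'YYYY-MM-DD') || ''')';

        -- Drop the partition outside the retention window (assumes it
        -- exists; production code would check USER_TAB_PARTITIONS first).
        EXECUTE IMMEDIATE
            'ALTER TABLE claims_fact DROP PARTITION p_'
            || TO_CHAR(ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -36), 'YYYYMM');
    END;
    /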

Environment: Informatica 9.5/8.6, Oracle 11g, SQL Server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL*Loader, OBIEE, Unix, flat files, Teradata

4) Client: Nike | Location: Portland, OR

Informatica ETL Developer | November 2015 – December 2016

Responsibilities:

•Interacted with business users to understand the business requirements and prepared technical documents for implementing solutions per business needs.

•Responsible for the development, support, and maintenance of the ETL (Extract, Transform, Load) processes.

•Involved in logical and physical data modelling; analysed and designed the ETL processes.

•Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.

•Extensively used Informatica client tools to extract data from different sources like flat files and Oracle.

•Developed several complex Mappings, Mapplets and Reusable Transformations to facilitate one time, Daily, Monthly and Yearly Loading of Data.

•Created IDQ mappings with Key Generator, Labeler, Standardizer, Match, Consolidation, Sorter, and Address Validator transformations, etc.

•Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.

•Configured the mappings to handle the updates to preserve the existing records using Update Strategy Transformation.

•Used Informatica Workflow Manager for creating and running batches and sessions and scheduling them to run at specified times.

•Created users, tables, triggers, stored procedures, joins, and hash indexes in the Teradata database (see the sketch after this list).

•Performed performance tuning on Teradata tables.

•Used pushdown optimization techniques with Teradata databases.

•Involved in debugging, troubleshooting, testing, and documentation of the data warehouse.

•Created solution documents for the developed mappings.

•Involved in solving day-to-day problems, giving support to the users.

•Communicated issues and progress to project manager.
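
A small sketch of the kind of Teradata DDL work described in this list, under assumptions: the staging table, columns, and index choice are hypothetical. The primary index drives row distribution across AMPs, and collected statistics help the optimizer plan joins.

    -- MULTISET avoids the duplicate-row check on bulk loads; the NUPI on
    -- order_id is chosen for even distribution and join locality.
    CREATE MULTISET TABLE stg_web_orders
    (
        order_id    INTEGER NOT NULL,
        customer_id INTEGER,
        order_dt    DATE FORMAT 'YYYY-MM-DD',
        amount      DECIMAL(18,2)
    )
    PRIMARY INDEX (order_id);

    -- Collect statistics on join/filter columns for the optimizer.
    COLLECT STATISTICS ON stg_web_orders COLUMN (order_id);
    COLLECT STATISTICS ON stg_web_orders COLUMN (customer_id);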

Environment: Informatica PowerCenter 9.5.1, IDQ 9.5.1, Oracle 11g/10g, Teradata, SQL, PL/SQL, TOAD 9.6, Windows XP

5) Client: Verizon | Location: Irving, TX

Informatica ETL Developer | September 2012 – October 2015

Responsibilities:

•Performed logical and physical data modelling using Erwin for the data warehouse database in a star schema.

•Worked in an Agile methodology; participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter, peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.

•Worked with health-payer data such as customers, policies, policy transactions, and claims.

•Generated weekly and monthly status reports on the number of incidents handled by the support team.

•Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex ETL logic.

•Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor and Repository Manager.

•Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Informatica Mapping Designer to create complex mappings from business requirements.

•Created various transformations like Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the data warehouse/data marts, and monitored the daily and weekly loads.

•Designed and developed various complex SCD Type 1/Type 2 mappings in different layers and migrated the code from Dev to Test to Prod environments. Wrote techno-functional documentation along with different test cases to smooth project handover and maintain the SDLC.

•Experience using stored procedures, TOAD, Explain Plan, ref cursors, constraints, triggers, indexes (B-tree and bitmap), views, inline views, materialized views (see the sketch after this list), database links, and export/import utilities.

•Developed and maintained ETL (Extract, Transform, Load) mappings to extract data from multiple source systems like Oracle, SQL Server, and flat files, and loaded it into Oracle.

•Used different matching algorithms like Bigram, Edit Distance, Jaro Distance, Reverse Hamming, and Hamming Distance to determine the threshold values for identifying and eliminating duplicate datasets and for validating, profiling, and cleansing the data. Created/modified reference tables for valid data using the Analyst tool.

•Developed Informatica Workflows and sessions for mappings using Workflow Manager.

•Deployed the Informatica code and worked on code merge between two different development teams.

•Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.

•Created pre- and post-session UNIX scripts to merge flat files, create and delete temporary files, rename files to reflect the file-generation date, etc.
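
A hedged sketch of a reporting materialized view like those mentioned above; the policy_txn_fact table and its columns are hypothetical assumptions for the example. A complete refresh would run on demand from the post-load step.

    -- Monthly roll-up of policy transactions for reporting.
    CREATE MATERIALIZED VIEW mv_policy_txn_monthly
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT policy_id,
           TRUNC(txn_dt, 'MM') AS txn_month,
           COUNT(*)            AS txn_cnt,
           SUM(txn_amt)        AS total_txn_amt
    FROM   policy_txn_fact
    GROUP  BY policy_id, TRUNC(txn_dt, 'MM');

    -- Refreshed after the nightly load, e.g.:
    -- EXEC DBMS_MVIEW.REFRESH('MV_POLICY_TXN_MONTHLY', 'C');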

Environment: Informatica PowerCenter Designer 9.5/8.6, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, Unix (SunOS), PL/SQL, SQL Developer, Teradata


