
Informatica Developer

Location:
Fieldville, NJ, 08854
Salary:
$75/hr
Posted:
September 22, 2020


Yogesh Ravi

Informatica Developer

Email: adgbsz@r.postjobfree.com Cell No: 609-***-****

PROFESSIONAL SUMMARY

Around 8 years of experience in Information Technology with a strong background in database development, data warehousing (OLAP), and ETL processes using Informatica PowerCenter 9.x/8.x/7.x/6.x, PowerExchange, Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Metadata Manager, Mapplet Designer, Transformation Developer), Repository Manager, Repository Server, Workflow Manager, and Workflow Monitor.

Experience integrating various data sources with multiple relational databases such as DB2, Oracle, and SQL Server, and integrating data from XML files and flat files (fixed-width and delimited).

Experience in writing stored procedures and functions.

Experience in SQL tuning using hints and materialized views, as sketched below.
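
A minimal sketch of this kind of tuning, wrapped in a shell script that calls sqlplus; the connect string, tables, and index names are hypothetical:

#!/bin/sh
# Illustrative Oracle tuning snippets (all object names are made up).
sqlplus -s etl_user/etl_password@dwh << 'EOF'
-- Hint: ask the optimizer to use a specific index on ORDERS
SELECT /*+ INDEX(o orders_cust_idx) */ o.order_id, o.total
FROM   orders o
WHERE  o.customer_id = 4711;

-- Materialized view: precompute a daily aggregate so reports read
-- the summary instead of scanning the detail table
CREATE MATERIALIZED VIEW mv_daily_sales
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS SELECT order_date, SUM(total) AS daily_total
   FROM   orders
   GROUP  BY order_date;
EOF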

Integrated Axon with Informatica EIC/EDC, achieved data governance with Informatica Axon, and established lineage for DQ and impact assessment via Axon.

Led the team in driving IFRS business rule development in EDQ.

Worked on the PowerExchange bulk data movement process using the PowerExchange Change Data Capture (CDC) method.

Tuned complex SQL queries by examining their explain plans; a minimal example follows.
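
An illustration of the workflow, assuming Oracle; the query and connect string are placeholders:

#!/bin/sh
# Illustrative only: capture and display the optimizer plan for a query.
sqlplus -s etl_user/etl_password@dwh << 'EOF'
EXPLAIN PLAN FOR
SELECT c.cust_name, SUM(o.total)
FROM   customers c
JOIN   orders o ON o.cust_id = c.cust_id
GROUP  BY c.cust_name;

-- Read back the plan just written to the plan table
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EOF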

Knowledge of Master Data Management (MDM) concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.

Experience in UNIX shell programming.

Expertise in Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, Teradata SQL Assistant, and BTEQ utilities.
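
A minimal BTEQ import sketch, run from a shell wrapper; the logon string, file path, and table are hypothetical:

#!/bin/sh
# Illustrative BTEQ load of a pipe-delimited file into a staging table.
bteq << 'EOF'
.LOGON tdpid/etl_user,etl_password
.SET QUIET ON
.IMPORT VARTEXT '|' FILE = /data/in/customers.txt
.REPEAT *
USING (cust_id VARCHAR(10), cust_name VARCHAR(50))
INSERT INTO stg.customers (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF
.QUIT
EOF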

Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.

Expertise in dimensional data modeling using star and snowflake schemas; designed data models using Erwin.

Understanding of relational (ROLAP) and multidimensional (MOLAP) modeling, data warehousing concepts, star and snowflake schemas, database design methodologies, and metadata management.

Experience preparing functional specifications, low-level and high-level design specifications, and source-to-target mapping documents for ETL programs/business processes.

Experience working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, roles, privileges, tables, constraints, views, indexes, sequences, synonyms, dynamic SQL, and SQL*Loader in a distributed environment.

Experience creating and managing user accounts, security, rights, disk space, and process monitoring in Solaris and Red Hat Linux.

Extensive knowledge of handling slowly changing dimensions (SCD) types 1/2/3; a type 2 sketch follows.
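
A sketch of the type 2 pattern in plain SQL (in practice this is built as an Informatica mapping; the table, column, and sequence names are hypothetical):

#!/bin/sh
# Illustrative SCD type 2: expire the current row, then insert the new version.
sqlplus -s etl_user/etl_password@dwh << 'EOF'
-- Step 1: close out current rows whose tracked attribute changed
UPDATE dim_customer d
SET    d.eff_end_date = SYSDATE, d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND EXISTS (SELECT 1 FROM stg_customer s
            WHERE s.cust_id = d.cust_id
            AND   s.address <> d.address);

-- Step 2: insert a fresh current row for new and just-expired customers
INSERT INTO dim_customer
       (cust_key, cust_id, address, eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.address, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE d.cust_id = s.cust_id
                   AND   d.current_flag = 'Y');
EOF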

Experience using Informatica to populate data into a Teradata DWH.

Experience in understanding fact tables, dimension tables, and summary tables.

Experience with AWS cloud services such as EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront, and IAM for installing, configuring, and troubleshooting various Amazon images for server migration from physical hardware into the cloud.

Designed architectural diagrams for different applications before migrating them into the Amazon cloud to ensure flexible, cost-effective, reliable, scalable, high-performance, and secure deployments.

Good knowledge of Azure Data Factory and SQL Server 2014 Management Studio.

Migrated existing development from on-premises servers into Microsoft Azure, a cloud-based service.

Built servers using AWS: imported volumes, launched EC2 instances, and created security groups, Auto Scaling groups, load balancers, Route 53, SES, and SNS in the defined virtual private cloud.

Created alarms in the CloudWatch service for monitoring server performance, CPU utilization, disk usage, etc.; an illustrative alarm definition follows.
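
An illustrative alarm definition via the AWS CLI; the instance ID and SNS topic ARN are placeholders:

#!/bin/sh
# Alarm when average CPU stays above 80% for two consecutive 5-minute periods.
aws cloudwatch put-metric-alarm \
  --alarm-name "ec2-high-cpu" \
  --namespace AWS/EC2 \
  --metric-name CPUUtilization \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --statistic Average \
  --period 300 \
  --evaluation-periods 2 \
  --threshold 80 \
  --comparison-operator GreaterThanThreshold \
  --alarm-actions arn:aws:sns:us-east-1:123456789012:ops-alerts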

Wrote ad hoc data normalization jobs for new data ingested into Redshift.

Familiar with advanced Amazon Redshift and MPP database concepts.

Moved the company from a SQL Server database structure to an AWS Redshift data warehouse; responsible for ETL and data validation using SQL Server Integration Services.

Designed and built multi-terabyte, end-to-end data warehouse infrastructure from the ground up on Amazon Redshift, handling millions of records every day.

Used Amazon Kinesis as a platform for streaming data on AWS.

Worked with networking teams to configure AWS Direct Connect, establishing dedicated connections between data centers and the AWS cloud.

Designed and configured the AWS Simple Notification Service (SNS) and Simple Email Service (SES) architecture of the solution, working with the client.

Exposure to development, testing, debugging, implementation, user training & production support.

Experience with 24x7 production support.

Excellent analytical, written, and communication skills.

Technical Skills

ETL: Informatica 9.6.1/9.x/8.x/7.x, DataStage 7.5/6.0, SSIS

Data Governance: Informatica EIC/EDC 10.2 with Axon, AnalytixDS

Cloud Services: Amazon Web Services (AWS)

Databases: Oracle 12c/11g/10g/9i/8i/7.x, DB2, Teradata V13/V12/V2R5, SQL Server

Reporting Tools: Business Objects XI 3.1/R2, Web Intelligence, Crystal Reports 8.x/10.x

Database Modeling: Erwin 4.0, Rational Rose

Languages: SQL, COBOL, PL/SQL, Java, C

Web Tools: HTML, XML, JavaScript, Servlets, EJB

OS: Windows NT/2000/2003/7, UNIX, Linux, AIX

Education:

Bachelor of Technology in Computer Science and Engineering.

Role: Informatica Developer

Verizon, Piscataway, NJ Nov 2019 - Present

Description: People Tech Group is a leader in enterprise solutions, digital transformation, data intelligence, and modern operations. We work with and have helped medium to large enterprise-class customers, along with Fortune 500 clients, by utilizing our expertise in technology integration, development, and implementation to accelerate our clients' most important initiatives. Over the years, People Tech Group has broadened its services by acquiring the Ramp Group (our UX Design Studio), CodeSmart (our Government Division), and Fyrsoft (our Modern Data Center solution partner).

Responsibilities:

Involved in gathering and analyzing the requirements and preparing business rules.

Gathered information from the source system and business documents, and prepared the data conversion and migration technical design document.

Extensively worked on performance tuning of Teradata SQL, ETL, and other processes.

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, and Stored Procedure transformations to implement complex logic.

Worked on requirement analysis and ETL design and development for extracting data from the mainframes.

Worked with Informatica PowerCenter 10.0.2 Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Developed and maintained ETL (extract, transform, load) mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files, and load it into Oracle.

Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.

Used PowerExchange to read source data from mainframe systems, PowerCenter for ETL, and DB2 as the target.

Worked on and created files for Business Objects.

Excellent working knowledge of Informatica EIC/EDC, IDQ, IDE, the Informatica Analyst tool, Informatica Metadata Manager, and Informatica administration/server administration.

Involved in creating new table structures and modifying existing tables to fit the existing data model.

Responsible for normalizing COBOL files using the Normalizer transformation.

Responsible for testing, modifying, debugging, documenting and implementation of Informatica mappings.

Performed unit and integration testing and wrote test cases.

Debugged issues through session logs and fixed them, utilizing the database for efficient transformation of data.

Worked with pre-session and post-session UNIX scripts for automation of ETL jobs (a sketch follows); also involved in migration/conversion of ETL processes from development to production.
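
A hypothetical post-session script of this kind; the directory paths and bad-file name are illustrative:

#!/bin/sh
# Post-session step: archive the processed input file and fail the job
# if the session produced rejected rows.
SRC_DIR=/data/inbound
ARCH_DIR=/data/archive
BAD_FILE=/infa/badfiles/s_m_load_customers.bad

# Timestamp the archived file so reruns are traceable
mv "$SRC_DIR/customers.txt" "$ARCH_DIR/customers.$(date +%Y%m%d%H%M%S).txt"

# A non-empty bad file means rejects: exit non-zero so the workflow
# link condition can route to the failure branch
if [ -s "$BAD_FILE" ]; then
    echo "Rejected rows found in $BAD_FILE" >&2
    exit 1
fi
exit 0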

Extracted data from databases such as Oracle and from external source systems such as flat files using the ETL tool.

Extracted data from sources such as SQL Server, AWS, and fixed-width and delimited flat files; transformed the data according to business requirements and then loaded it into the targets.

Involved in debugging Informatica mappings and testing stored procedures and functions.

Performed unit testing on deliverables and documented the results.

Developed mapplets, reusable transformations, source and target definitions, and mappings using Informatica 10.0.2.

Generated SQL queries to check the consistency of the data in the tables and to update the tables per business requirements; an illustrative check follows.
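
An illustrative consistency check of this kind (the staging and fact table names are made up):

#!/bin/sh
# Compare staged vs. loaded row counts for today's load date.
sqlplus -s etl_user/etl_password@dwh << 'EOF'
SELECT s.cnt AS staged_rows,
       t.cnt AS loaded_rows,
       s.cnt - t.cnt AS difference
FROM  (SELECT COUNT(*) AS cnt FROM stg_orders
       WHERE load_date = TRUNC(SYSDATE)) s,
      (SELECT COUNT(*) AS cnt FROM fact_orders
       WHERE load_date = TRUNC(SYSDATE)) t;
EOF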

Involved in performance tuning of mappings in Informatica.

Migrated ETL workflows, mappings, and sessions to the QA and production environments.

Good understanding of source-to-target data mapping and the business rules associated with the ETL process.

Environment: Informatica 10.0.2, UNIX, flat files (delimited), COBOL files, Teradata 12.0/13.0/14.0

Role: Informatica Developer

Zoetis, Parsippany, NJ June 2019 - Oct 2019

Description: Zoetis is a global animal health company dedicated to supporting customers and their businesses in ever better ways. Building on more than 65 years of experience, we deliver quality medicines, vaccines and diagnostic products, complemented by genetic tests, biodevices and a range of services. We are working every day to better understand and address the real-world challenges faced by those who raise and care for animals in ways they find truly relevant.

Responsibilities:

Provided strategic direction for the planning, development and implementation of various projects in support of enterprise data governance office and data quality team.

Designed and developed script jobs in EDQ to profile data for different data fields on source schema tables for different business scenarios and report the statistics to the business team.

Designed and developed script jobs in EDQ to profile data on different data fields from source schema tables, along with the associated reference tables, for migration from source to target systems.

Developed reusable data quality mappings and mapplets, DQ rules, and data profiling using the Informatica Developer/Informatica Analyst tools. Expert-level experience with Informatica EIC and Axon data governance. Worked with Informatica Metadata Manager to obtain impact and lineage.

Owned the development of the 'Integrated Control Framework' (ICF) for data quality.

Extensively used Informatica Data Quality (IDQ) to profile the project source data, define or confirm the definition of the metadata, cleanse and accurately check the data.

Ran checks for duplicate or redundant records and provided information on how to proceed with the backend ETL process.

Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT; a FastLoad sketch follows.
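
A minimal FastLoad sketch, assuming an empty target staging table; the logon string, file, and table names are placeholders:

#!/bin/sh
# Illustrative FastLoad of a pipe-delimited file into an empty Teradata table.
fastload << 'EOF'
LOGON tdpid/etl_user,etl_password;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50))
FILE = /data/in/customers.txt;
BEGIN LOADING stg.customers
      ERRORFILES stg.customers_err1, stg.customers_err2;
INSERT INTO stg.customers (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF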

Partnered with data stewards to provide summary results of data quality analysis, which will be used to make decisions regarding how to measure business rules and quality of the data.

Implemented data quality processes, including translation, parsing, analysis, standardization, and enrichment, at point of entry and in batch mode; deployed mappings to run in scheduled, batch, or real-time environments.

Collaborated with various business and technical teams to gather requirements around data quality rules, proposed optimizations of these rules where applicable, and designed and developed the rules with IDQ.

Worked on the design and development of custom objects and rules and reference data tables, and created/imported/exported mappings.

Per business requirements, performed thorough data profiling with multiple usage patterns, root cause analysis, and data cleansing, and developed scorecards utilizing Informatica Data Quality (IDQ).

Developed matching plans, helped determine the best matching algorithm, configured identity matching, and analyzed duplicates.

Worked on building human task workflows to implement data stewardship and exception processing

Developed KPIs and KRIs for the data quality function.

Drove improvements to maximize the value of data quality (e.g., drove changes to gain access to required metadata to maximize the impact of data quality, and quantified the cost of poor data quality).

Performed data quality activities, i.e., data quality rule creation, edit checks, identification of issues, root cause analysis, value case analysis, and remediation.

Augmented and executed the data quality plan to ensure achievable and measurable milestones were mapped out for delivery and effectively communicated.

Prepared detailed documents on all mappings, mapplets, and rules, and handed the documentation over to the customer.

Environment: Informatica Developer 10.1.1 HotFix 1, Informatica EDC 10.2, Informatica Axon 6.1/6.2, Teradata 14.0, Oracle 11g, PL/SQL, flat files (XML/XSD, CSV, Excel), UNIX/Linux, shell scripting, Rational Rose/Jazz, SharePoint

Role: Informatica Developer

Blue Cross Blue Shield - Richardson, TX Aug 2018 - May 2019

Description: Blue Cross Blue Shield Association is a federation of 36 separate United States health insurance organizations and companies, providing health insurance in the United States to more than 106 million people.

Responsibilities:

Collaborated with Business Analysts for requirements gathering, business analysis and designing of the Enterprise Data warehouse.

Worked on requirement analysis and ETL design and development for extracting data from heterogeneous source systems such as DB2, MS SQL Server, Oracle, flat files, and mainframes, and loading it into staging and star schemas.

Created/Modified Business requirement documentation.

Created ETL specifications using gap analysis.

Provided production support on a monthly rotation basis.

Worked with Master Data Management concepts and methodologies and applied this knowledge in building MDM solutions.

Used Informatica Data Quality (IDQ) to profile the data and apply rules to the Membership and Provider subject areas in support of Master Data Management (MDM).

Created Informatica mappings, sessions, and workflows; worked end-to-end on the project with Informatica EIC/EDC and Axon.

Worked on CISM tickets for production issues in the Trucks and Cars data marts.

Extracted mainframe data for local landscapes such as Credit Check and Application.

Created Unit Test cases, supported SIT and UAT.

Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.

Used IDQ's standardized plans for address and name cleanups.

Applied the rules and profiled the source and target table's data using IDQ.

Worked with Informatica PowerExchange and Informatica Cloud to integrate and load data into an Oracle database.

Worked with Informatica Cloud to create source and target objects and develop source-to-target mappings.

Performed dimensional data modeling (star and snowflake schemas) for the data warehouse; used Erwin to design the business process, dimensions, and measured facts.

Extracted data from sources such as SQL Server and fixed-width and delimited flat files; transformed the data according to business requirements and loaded it into Oracle and Teradata databases.

Wrote and optimized SQL queries in SQL Server.

Performed data analysis and data profiling using SQL on various source systems, including SQL Server.

Designed and developed complex mappings to load the Historical and Weekly data from the Legacy Systems to Oracle database.

Worked with the Teradata database, writing BTEQ queries and loading data with the MultiLoad, FastLoad, and FastExport utilities.

Created CISM tickets with HP to fix DB2 issues.

Modified MORIS (Trucks) data mart load with enhancements.

Created Defects in ALM software and assigned to developers.

Fine-tuned complex SQL by checking the explain plans of the queries.

Improved ETL job run times from hours to minutes where pushdown optimization and partitioning were unavailable.

Created mappings to load data from ODS to DWH.

Used the DB2 LOAD utility to load large tables efficiently; an illustrative invocation follows.
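
An illustrative DB2 LOAD invocation; the database, file, and table names are placeholders:

#!/bin/sh
# Bulk-load a delimited file; LOAD writes pages directly rather than
# logging row-by-row inserts, which is why it is fast for large tables.
db2 CONNECT TO dwh
db2 "LOAD FROM /data/in/orders.del OF DEL
     SAVECOUNT 10000
     MESSAGES /tmp/orders_load.msg
     INSERT INTO stage.orders"
db2 CONNECT RESET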

Created session partitions to load data faster.

FTP'd files to the Informatica server using the session-level FTP connection property.

Used Informatica Power Exchange to read Mainframe files.

Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.

Environment: Informatica PowerCenter 9.6.1, Axon, Informatica Metadata Manager, DB2 9.8, Rapid SQL 8.2, IBM Data Studio, Erwin, PL/SQL, Red Hat Linux, PowerExchange 9.5.1, copybooks, Tivoli, ALM, shell scripting.

Role: Informatica Cloud Developer

TCS, Bangalore, India Jun 2015 - Mar 2017

Description: Citi works tirelessly to provide consumers, corporations, governments, and institutions with a broad range of financial services and products. We strive to create the best outcomes for our clients and customers with financial ingenuity that leads to solutions that are simple, creative, and responsible.

Responsibilities:

Provided recommendations and expertise on data integration and ETL methodologies.

Analyzed UltiPro interfaces and created design and technical specifications based on functional requirements and analysis.

Designed, developed, tested, and implemented ETL processes using Informatica Cloud.

Developed ETL/Master Data Management (MDM) processes using Informatica PowerCenter and PowerExchange 9.1.

Converted extraction logic from database technologies such as Oracle, SQL Server, DB2, and AWS.

Completed technical documentation to ensure the system was fully documented.

Cleansed data prior to loading it into the target system.

Converted specifications to programs and data mappings in an Informatica Cloud ETL environment.

Provided high-level and low-level architecture solutions/designs as needed.

Dug deep into complex T-SQL queries and stored procedures to identify items that could be converted to Informatica Cloud/AWS.

Developed reusable integration components, transformations, and logging processes.

Supported system and user acceptance testing activities, including issue resolution.

Environment: Informatica Cloud, T-SQL, stored procedures, AWS, FileZilla, Windows, batch (.bat) scripting, MDM.

Role: ETL Developer

Atom Technologies - Mumbai, India. Nov 2014-May 2015

Description: The Axis Mobile application enables users to enjoy convenient and secure mobile banking. It provides all basic web banking functionality on mobile, such as account summary, mini-statement, and transaction details for all kinds of accounts. It gives the user complete control over credit card accounts. The app also supports advanced banking features such as Auto-Pay, Schedule-Pay, mobile check deposit, card control, and instant money transfer facilities (NEFT, IMPS).

Responsibilities:

Involved in system analysis and design and participated in user requirements gathering. Worked on data mining to analyze data from different perspectives and summarize it into useful information.

Involved in requirement analysis, ETL design and development for extracting data from the heterogeneous source systems like MS SQL Server, Oracle, flat files and loading into Staging and Star Schema.

Used SQL Assistant to query Teradata tables.

Experience in various stages of the System Development Life Cycle (SDLC) and its approaches, such as Waterfall, Spiral, and Agile.

Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.

Involved in analyzing/building the Teradata EDW using Teradata ETL utilities and Informatica.

Worked with the Teradata MultiLoad and FastLoad utilities to load data from Oracle and SQL Server into Teradata.

Involved in massive data cleansing prior to data staging.

Configured workflows with Email tasks to send mail with the session log on session failure and for target failed rows.

Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).

Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to mark changed rows; a checksum-comparison sketch follows.
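
A sketch of the checksum comparison, shown here with SQL Server's CHECKSUM(); the tables and columns are hypothetical, and a hash such as HASHBYTES is stricter if checksum collisions are a concern:

#!/bin/sh
# Flag rows whose column checksum differs between staging and target,
# i.e. changed rows detected without any flag or date column.
sqlcmd -S dbserver -d staging -U etl_user -P etl_password << 'EOF'
SELECT s.cust_id
FROM   stg_customer s
JOIN   dim_customer d ON d.cust_id = s.cust_id
WHERE  CHECKSUM(s.cust_name, s.address, s.phone)
    <> CHECKSUM(d.cust_name, d.address, d.phone);
GO
EOF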

Worked on reusable code known as tie-outs to maintain data consistency; it compares source and target after the ETL load completes to validate that no data was lost during the ETL process.

Cleansed, standardized, and enriched customer information using Informatica MDM.

Worked on data extraction, transformation, and loading using BTEQ, FastLoad, and MultiLoad.

Used the Teradata FastLoad/MultiLoad utilities to load data into tables.

Worked extensively with UNIX shell scripts for job execution and automation; a wrapper sketch follows.
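
A hypothetical wrapper of this kind using Informatica's pmcmd; the service, domain, folder, workflow, and credentials are placeholders:

#!/bin/sh
# Start a workflow and wait for it, so the scheduler sees a real exit code.
INFA_SVC=intsvc_prod
INFA_DOMAIN=dom_prod
FOLDER=DWH_LOADS
WORKFLOW=wf_daily_customer_load

pmcmd startworkflow -sv "$INFA_SVC" -d "$INFA_DOMAIN" \
      -u etl_user -p etl_password \
      -f "$FOLDER" -wait "$WORKFLOW"
RC=$?
if [ $RC -ne 0 ]; then
    echo "$WORKFLOW failed with return code $RC" >&2
    exit $RC
fi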

Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.

Involved in unit testing and user acceptance testing to check whether data extracted from the different source systems was loading into the target according to user requirements.

Environment: Informatica PowerCenter 9.5 (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor), Oracle 11g, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012, SQL, PL/SQL, T-SQL, UNIX, Cognos, Tidal.

Role: Programmer Analyst Location: Hyderabad, India

Client: Bank of America Mar 2012-Oct 2014

Description: Wisdom Technologies Pvt Ltd. is a privately held firm based in Hyderabad, India. Our solutions and engineering services have resulted in the successful launch of several products and applications by our customers. Our journey began in 2005, and since inception our solutions have been deployed in numerous fields across the automotive, consumer, and security segments.

Responsibilities:

Involved with requirement gathering and analysis for the data marts focusing on data analysis, data quality, data mapping between staging tables and data warehouses/data marts.

Designed and developed processes to support data quality issues and the detection and resolution of error conditions.

Wrote SQL scripts to validate and correct inconsistent data in the staging database before loading data into databases.

Analyzed the session logs, bad files and error tables for troubleshooting mappings and sessions.

Wrote shell script utilities to detect error conditions in production loads and take corrective action, and wrote scripts to back up/restore repositories and log files.

Scheduled workflows using Autosys job plan.

Provided production support and involved with root cause analysis, bug fixing and promptly updating the business users on day-to-day production issues.

Environment: Informatica Power Center 8.6.1/7.x, Oracle 10g, Autosys, Erwin 4.5, CMS, TOAD 9.0, SQL, PL/SQL, UNIX, SQL Loader, MS SQL Server 2008.


