
Data Manager

Location:
San Francisco, CA
Posted:
January 22, 2018


Resume:

Gopala Krishna

Informatica Developer

*******.****@*****.***

+1-603-***-****

PROFESSIONAL SUMMARY:

●5+ years of IT experience in analysis, design, and development of various software applications in client-server environments, providing Business Intelligence solutions in Data Warehousing for Decision Support Systems and OLAP application development

●5+ years of experience as an ETL Analyst and ETL Developer in Data Warehouse / Data Mart environments using Informatica Power Center

●Involved in creating hierarchies, properties, validation/verification, imports & exports, and implementing security & automation of processes

●5+ years of experience in Oracle, SQL, PL/SQL and UNIX shell scripting

●Worked in multiple client-specific environments in the Financial, Telecommunications, Banking, and Insurance domains

●Extensively used ETL methodologies to support the data Extraction, Transformation and Loading (ETL) process in a corporate-wide ETL solution using Informatica Power Center

●Experience in using Data sources/targets such as Oracle 11g, SQL Server 2008/2005, Teradata, XML and Flat files

●Worked extensively on various Informatica Data Integration components - Repository Manager, Designer and Workflow Manager/Monitor

●Vast experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.

●Good understanding of Data warehouse concepts and principles, Kimball & Inmon approaches, Star & Snowflake schemas, Fact/Dimension tables, and Normalization/Denormalization

●Expertise in Data Analysis, Data Mapping, Data Modeling, Data Profiling and development of Databases for business applications and Data warehouse environments

●Worked with Business Managers, Analysts, and end users to correlate Business Logic and Specifications for ETL Development

●Extensively involved in creating Oracle PL/SQL Stored Procedures, Functions, Packages, Triggers, Cursors, and Indexes with Query optimizations as part of ETL Development process

●Proficiency in Data Warehousing techniques for Data Cleaning, Slowly Changing Dimension (SCD) handling, Surrogate Key assignment and Change Data Capture (CDC)

●Strong skills in data analysis, data requirement analysis and data mapping for ETL processes

●Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions

●Experienced in using UNIX commands and in creating new or modifying existing UNIX shell scripts for file processing, data validation, and job scheduling

●Good knowledge of Agile methodology and the Scrum process.

TECHNICAL SKILLS:

ETL: Informatica Power Center 9.x/10.1

Databases: Oracle, Teradata, SQL Server

Reporting Tools: SAP Business Objects, Tableau, Wave Analytics

Scheduling Tools: Autosys, Dollar Universe, Tidal

Programming: PL/SQL

PROFESSIONAL EXPERIENCE:

Salesforce, San Francisco, CA July 2017 – Present

Role: Informatica Developer

Responsibilities:

●Involved in various design and requirement discussions with the end users for designing ad-hoc reports giving a 360-degree view of an object.

●Worked closely with all the application owners to understand the functionality of their applications at a detailed level, along with the data models they use.

●Worked with upstream teams to get data in the form of Flat-Files.

●Created Technical Specifications document based on functional specifications.

●Developed complex ETL mappings that involve parallel processing of multiple instances based on parameters in a control table.

●Created complex Mappings using different Transformations like Filter, Router, Joiner, Connected & Unconnected Lookups, Sorter, Aggregator and Sequence Generator to pipeline data to Data Warehouse

●Developed workflows with multiple sessions and instances for same subject area.

●Created pre-session and post-session tasks for all the sessions to update the process table used to track the status of ETLs.

●Developed Dimension mappings that load the dimension from landing zone table involving Type 2 and Type 1 transformations.

●Involved in tuning Informatica ETL mappings by analyzing them thoroughly.

●Involved in identifying bottlenecks at various levels (database, mapping, session, and workflow) and came up with solutions to improve performance.

●Created table partitions for each country, operating company, and period combination, enabling faster retrieval of data.

●Used persistent lookup caches to cache lookups and reuse the cache files across multiple mappings to improve performance.

●Used pushdown optimization to run parts of the mappings in the database and decrease workflow run times.

●Used the Web Service Consumer transformation to look up addresses.

●Created a stored procedure, called from the ETL, to perform a partition swap from staging to the FDW to load fact data; the target tables are passed as parameters to the procedure (a sketch follows this responsibilities list).

●Created Oracle stored procedures to implement the ETL process-control logic.

●Created Oracle Stored Procedures for Segment Dimension which holds all the dimension keys.

●Created functions and called them from Informatica to enhance mapping performance.

●Scheduled workflows using Cisco Tidal Enterprise Scheduler.

●Worked on JSON to upload the dataset to Wave Analytics.

●Made enhancements to the UNIX scripts that compare the XML of mappings with template mappings to check for errors as part of the QE framework.

●Used a flat-file validation framework to check the structure of the flat files.

●Performed end-to-end testing to ensure the quality of the code.

●Deployed the code into production and worked on creating the support manual for the application.
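
Illustrative sketch of the partition-swap procedure described above, assuming an Oracle EXCHANGE PARTITION approach; all object names here are hypothetical, not taken from the actual project:

    -- Hypothetical names; the real procedure, tables, and partitions differ.
    CREATE OR REPLACE PROCEDURE swap_fact_partition (
        p_stage_table IN VARCHAR2,   -- staging table already loaded by the ETL
        p_fact_table  IN VARCHAR2,   -- target fact table in the warehouse (FDW)
        p_partition   IN VARCHAR2    -- partition for a country/opco/period combination
    ) AS
    BEGIN
        -- Exchanging segments is a metadata-only operation, so the fact "load"
        -- completes in seconds regardless of row counts.
        EXECUTE IMMEDIATE
            'ALTER TABLE ' || p_fact_table ||
            ' EXCHANGE PARTITION ' || p_partition ||
            ' WITH TABLE ' || p_stage_table ||
            ' INCLUDING INDEXES WITHOUT VALIDATION';
    END swap_fact_partition;
    /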

Environment: Informatica Power Center 10.1, Oracle 11g, Linux, Tidal, JSON, Wave Analytics.

Idea Crew, Washington, D.C. March 2016 – July 2017

Role: Informatica Developer

Responsibilities:

●Prepared design specification documents as per the inputs received from the Architect and the Business Analyst.

●Extracted data from heterogeneous source systems such as Oracle, SQL Server, and fixed-width and delimited flat files.

●Involved in cleansing and extraction of data and defined the quality process for the warehouse.

●Developed Informatica ETL mappings, sessions and workflows based on the technical specification document.

●Created Mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator

●Designed and developed the logic for handling Slowly Changing Dimension table loads by flagging records with the Update Strategy transformation to populate the desired target rows (a SQL analogue is sketched after this list).

●Developed reusable mapplets and transformations for reusable business calculations

●Used exception handling logic in all mappings to handle the null values or rejected rows

●Tuned the ETL components to improve performance and avoid disruptions to business continuity.

●Worked with persistent caches for conformed dimensions for better performance and faster data loads to the data warehouse.

●Involved in performance tuning and optimization of Informatica Mappings and Sessions using partitions and data/index cache to manage very large volume of data

●Performed query overrides in Lookup Transformation as and when required to improve the performance of the Mappings

●Developed Oracle PL/SQL components for row level processing.

●Dropped and recreated indexes before and after loading through pre-SQL and post-SQL.

●Developed UNIX scripts for processing Flat files.

●Scheduled the jobs in Autosys.

●Prepared Test Data and loaded it for Testing, Error handling and Analysis

●Prepared the test cases and tested the ETL components for end to end process.

●Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing and systems testing, including expected results.

●Created an issue log to identify errors and used it to prevent such errors in future development work.

●Worked on the production code fixes and data fixes

●Responsible for troubleshooting problems by monitoring all scheduled, running, and completed sessions, and used the Debugger for complex problem troubleshooting.

●Worked with Application support team in the deployment of the code to UAT and Production environments

●Involved in production support, working on mitigation tickets created while users were retrieving data from the database.

●Worked as part of a production support team and provided 24x7 support.

●Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Mappings

●Worked on deploying Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations, and reference data.
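
A rough SQL analogue of the Type 2 Slowly Changing Dimension flagging logic mentioned above (the actual implementation was an Informatica mapping with an Update Strategy transformation; table, column, and sequence names below are illustrative only):

    -- 1) Expire the current version of a dimension row when a tracked attribute changes.
    UPDATE dim_customer d
       SET d.current_flag     = 'N',
           d.effective_end_dt = TRUNC(SYSDATE) - 1
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- 2) Insert a new current version, with a fresh surrogate key, for new or changed records.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, segment,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.address      = s.address
                          AND d.segment      = s.segment);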

Environment: Informatica Power Center 9.6, MS SQL Server 2008, Tableau, Linux, Autosys

Cisco Systems (Accenture, India) September 2012 – January 2016

Role: Informatica Developer

Responsibilities:

●Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.

●Developed logical and physical data models that capture current-state and future-state data elements and data flows using Erwin.

●Involved in converting the Data Mart from Logical design to Physical design, defined data types, Constraints, Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the objects in the Database.

●Involved in designing and customizing Data Models for the Data Mart, supporting data from multiple sources in real time.

●Designed the Data Mart defining Entities, Attributes and Relationships between them.

●Extensively used the Erwin tool for forward and reverse engineering, following the corporate standards in naming conventions and using conformed dimensions whenever possible.

●Defined various Facts and Dimensions in the Data Mart, including factless facts, aggregate facts, and summary facts.

●Reviewed Source Systems and proposed data acquisition strategy.

●Designed and developed Informatica Mappings to load data from Source Systems to ODS and then to Data Mart.

●Designed the ETL processes using Informatica to load data from Oracle, Flat Files (fixed width), and Excel files to staging database and from staging to the target Teradata Warehouse database

●Extensively used Power Center to design multiple mappings with embedded business logic.

●Created complex mappings using Transformations like Connected / Unconnected Lookup, Joiner, Router, Rank, Sorter, Aggregator and Source Qualifier Transformations

●Created mapplets and used them in different mappings.

●Implemented CDC (Change Data Capture) using Informatica (a SQL sketch of the delta-extraction approach follows this list).

●Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.

●Improved the performance of the ETL by indexing and caching.

●Created metrics, attributes, filters, reports, and dashboards, and built advanced chart types, visualizations, and complex calculations to manipulate the data.

●Transformed Tableau into a managed service offering for consumption across Corporate Treasury and Corporate Investments.

●Acted as a point of contact/administrator for Data Interoperability, Analytics and Business Intelligence, and Production Support issue resolution.

●Designed and developed Oracle PL/SQL scripts for Data Import/Export.

●Extracted and loaded data using different Teradata tools such as BTEQ, FastLoad, MultiLoad, and FastExport.

●Worked with various Teradata utilities as external loaders in Informatica.

●Developed Unix Shell Scripts for automating the execution of workflows.

●Created various UNIX Shell Scripts for scheduling various Data Cleansing scripts and loading process.

●Maintained the batch processes using Unix Shell Scripts.

●Designed and deployed UNIX Shell Scripts.

●Managed Change control implementation and coordinating daily, monthly releases and reruns.

●Responsible for loading millions of records into the warehouse from various sources using SQL*Loader.

●Involved in migration of Mappings and Sessions from development repository to production repository

●Provided production support by executing the sessions, diagnosing problems, and fixing the mappings for changes in business logic.

●Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
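
A minimal SQL sketch of the timestamp-based delta extraction underlying the CDC work mentioned above (the production logic lived in Informatica; the control table and column names here are hypothetical):

    -- Pull only the rows changed since the last successful extract,
    -- as recorded in an ETL control table.
    INSERT INTO ods_orders (order_id, customer_id, amount, last_update_dt)
    SELECT s.order_id, s.customer_id, s.amount, s.last_update_dt
      FROM src_orders s
     WHERE s.last_update_dt >
           (SELECT NVL(MAX(c.last_extract_dt), DATE '1900-01-01')
              FROM etl_control c
             WHERE c.subject_area = 'ORDERS');

    -- Advance the high-water mark only after the load commits successfully.
    UPDATE etl_control
       SET last_extract_dt = SYSDATE
     WHERE subject_area = 'ORDERS';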

Environment: Informatica Power Center 8.x, Oracle, Teradata, Dollar Universe, SAP BO, Linux

EDUCATION:

●Bachelor's in Computer Information Systems.


