Profile:
●Around * years of experience in software design, development, maintenance, testing and troubleshooting of ETL/DWH applications, with expertise in handling multiple disparate source and target systems (flat files and RDBMS).
●Experience with Type 1, Type 2 and Type 3 slowly changing dimensions.
●Used various transformations such as Expression, Filter, Aggregator, Lookup, Router, Normalizer and Sequence Generator to load consistent data into Oracle and Teradata databases.
●Well versed in implementing ETL solutions using Informatica PowerCenter 9.x/8.x, Informatica PowerExchange, Test Data Management, PL/SQL and UNIX shell scripts.
●Adept at data modeling and data warehousing concepts such as star and snowflake schemas, fact tables and dimension tables.
●Strong experience using Python, SQL and PL/SQL: stored procedures/functions, triggers and packages, complex joins, correlated sub-queries, aggregate functions, analytical functions, materialized views, indexing, partitioning and performance tuning.
●Supported and monitored ETL loads, fixed bugs, analysed business requirements and prepared functional requirement documents, tuned SQL/PL-SQL scripts based on explain plans, and performed code reviews per ETL/Informatica standards and best practices.
Technical Skills:
ETL Tools: Informatica PowerCenter 8.x/9.x
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling
Technologies: ETL, Data Warehousing, Dimensional Data Modeling, Machine Learning basics, R basics, Business Intelligence
RDBMS: Oracle 11g, MS Access
Tools: TOAD, SQL*Plus, SQL Developer, Tectia, Teradata SQL Assistant, RStudio, Weka, TensorFlow
Languages: SQL, PL/SQL, Java, UNIX Shell Scripting, Teradata SQL, R
Operating Systems: MS-DOS, Windows 7, UNIX
Professional Experience:
Credit Suisse, Tata Consultancy Services Jan 2017 – Aug 2017
ETL Informatica Developer
Credit Suisse Group is a Swiss multinational investment bank and financial services holding company based in Zurich, advising clients on all aspects of finance across the globe. The project supports a data mart application that enables business users to access data and make the right decisions.
Responsibilities:
●Conducted data analysis on real-time banking client and non-client data.
●Built a data model for a potential client that generated close to $0.3 million in company revenue.
●Produced weekly status reports on project progress and worked with the team on integration solutions.
●Developed mappings to help the underwriting process select more profitable deals.
●Submitted recommendations to Credit Suisse for a recommender system to aid early identification of potential deals without adding cost.
●Identified the sources and analysed the source data.
●Created and stored metadata in the repository using Informatica Repository Manager.
●Cleansed the source data, extracted and transformed data per business rules, and built reusable mappings using Informatica PowerCenter Designer.
●Created, edited, scheduled and deleted sessions using Workflow Manager in Informatica.
●Implemented various data transformations using slowly changing dimensions (a sketch of typical SCD Type 2 SQL follows this list).
●Extensively worked with SQL queries, packages, triggers and views.
●Optimized Informatica mapping performance by configuring session properties and target options.
●Used the Debugger to test mappings and fix bugs; created Debugger sessions and breakpoints for better analysis of mappings.
●Created mapping variables and parameters and used them appropriately in mappings.
●Extensively used all the PowerCenter features, including Designer, Workflow Manager and Repository Manager.
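A minimal sketch of the SCD Type 2 pattern referenced in this list, expressed as database-side SQL; the table, column and sequence names are illustrative assumptions, not objects from the actual project, and NULL-safe comparisons are omitted for brevity:

  -- Expire the current version of customer rows whose tracked attributes changed
  UPDATE dim_customer d
     SET d.end_date = TRUNC(SYSDATE) - 1,
         d.current_flag = 'N'
   WHERE d.current_flag = 'Y'
     AND EXISTS (SELECT 1
                   FROM stg_customer s
                  WHERE s.customer_id = d.customer_id
                    AND (s.address <> d.address OR s.segment <> d.segment));

  -- Insert a fresh current version for every new or just-expired customer
  INSERT INTO dim_customer (customer_key, customer_id, address, segment,
                            start_date, end_date, current_flag)
  SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
         TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM stg_customer s
   WHERE NOT EXISTS (SELECT 1
                       FROM dim_customer d
                      WHERE d.customer_id = s.customer_id
                        AND d.current_flag = 'Y');

Running the UPDATE before the INSERT is what lets the NOT EXISTS clause pick up changed customers as well as brand-new ones.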
Environment: Informatica 9.1/9.6.1, Teradata, DB2, Oracle 11g, Agile, JIRA, MS SQL Server, UML, Tectia, Teradata SQL Assistant, Toad 12.1.0, flat files, XML files.
Credit Suisse June 2015 – Dec 2016
ETL Informatica Developer
The Data Staging team classifies data into client and non-client categories and serves as a source for many other teams in the project.
Responsibilities:
●Performed performance tuning at the functional level and mapping level; used relational SQL wherever possible to minimize data transfer over the network.
●Effectively used Informatica parameter files for defining mapping variables and workflow variables.
●Involved in enhancement and maintenance activities for the data warehouse, including tuning and modifying stored procedures for code enhancements.
●Effectively worked in a version-controlled Informatica environment and used deployment groups to migrate objects.
●Used the Debugger to identify bugs in existing mappings by analysing data flow and evaluating transformations.
●Worked effectively in an onsite/offshore delivery model.
●Used pre- and post-session variable assignments to pass variable values from one session to another.
●Designed workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks; used the Informatica scheduler to schedule jobs.
●Reviewed and analysed functional requirements and mapping documents; performed problem solving and troubleshooting.
●Performed unit testing at various levels of the ETL process and actively participated in team code reviews.
●Developed complex Informatica mappings using different transformations.
●Created workflows and sessions using Informatica Workflow Manager and monitored workflow runs and statistics in Informatica Workflow Monitor.
●Defined mapping parameters, variables and session parameters according to requirements, and used workflow variables to trigger emails.
●Tuned Informatica mappings to improve performance.
●Implemented complex ETL logic using SQL overrides in the Source Qualifier (see the sketch after this list).
●Performed unit testing of development work and validated results with business analysts.
●Developed UNIX scripts to update control table parameters based on the environment.
●Provided written status reports to management regarding project status, tasks, issues/risks and testing.
●Analysed requirements to create test cases and obtained client approval for execution.
●Created UNIX shell scripts for scheduling various data cleansing scripts and loading processes.
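To illustrate the SQL-override technique mentioned in this list: a query of this general shape replaces the Source Qualifier's default generated SQL so the join and filter are pushed down to the database. The table names and the $$LOAD_START_DATE mapping parameter are hypothetical:

  -- Source Qualifier override: push the join and filter down to the source database
  SELECT t.txn_id,
         t.account_id,
         a.branch_code,
         t.txn_amount,
         t.txn_date
    FROM transactions t
    JOIN accounts a
      ON a.account_id = t.account_id
   WHERE a.status = 'ACTIVE'
     -- $$LOAD_START_DATE is expanded from the parameter file before execution
     AND t.txn_date >= TO_DATE('$$LOAD_START_DATE', 'YYYY-MM-DD');

Keeping the date in a parameter file, as described above, lets the same mapping be promoted across environments without editing the override.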
Environment: Informatica 8.6.1, Oracle 10g, SQL, PL/SQL, MySQL, Teradata, TOAD, Shell Scripts, UNIX, Tectia
Morgan Stanley, India Aug 2014 – May 2015
ETL Informatica Developer
Morgan Stanley is a worldwide leader in investment banking, competing globally with other commercial and investment banks.
Responsibilities:
●Understood business and data requirements by interacting with multiple teams across the organization.
●Improved customer engagement by 12% using mappings and reports that included demographic data.
●Formulated key performance indicators (KPIs) to measure and track sales and revenue (an illustrative KPI query follows this list).
●Followed Agile methodology and used a project tracking system (JIRA) to manage business data requirements.
●Developed data marts and tested the mappings, resulting in an increase of $0.23 million in company revenue.
●Created mappings to move data from Oracle, using SCD Type 1 and SCD Type 2 logic.
●Provided technical guidance for re-engineering functions of Oracle warehouse operations.
●Extensively used transformations like router, lookup (connected and unconnected), update strategy, source qualifier, joiner, expression, stored procedures, aggregator and sequence generator transformation.
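As a sketch of the sales and revenue KPIs described in this list, an aggregate query of this shape is typical; the star-schema table and column names are illustrative assumptions, not the project's actual schema:

  -- Monthly deal-count and revenue KPIs per region over an assumed star schema
  SELECT r.region_name,
         TRUNC(f.sale_date, 'MM') AS sale_month,
         COUNT(*) AS deal_count,
         SUM(f.revenue_amount) AS total_revenue,
         AVG(f.revenue_amount) AS avg_deal_size
    FROM fact_sales f
    JOIN dim_region r
      ON r.region_key = f.region_key
   GROUP BY r.region_name, TRUNC(f.sale_date, 'MM')
   ORDER BY sale_month, r.region_name;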
Environment: Informatica 9.1/9.6.1, Teradata, DB2, Oracle 11g, Agile, JIRA, MS SQL Server, UML, Tectia, Teradata SQL Assistant, Toad 12.1.0, flat files, XML files.
Bank of America, India Jan 2014 – Aug 2014
ETL Informatica Developer
Bank of America Corporation is a multinational banking and financial services corporation headquartered in Charlotte, North Carolina, United States. By assets, it ranks second on the list of largest banks in the United States.
Responsibilities:
●Imported and exported scenarios between development and test environments.
●Analysed existing SQL / ETL Scripts and improved them to achieve better performance which reduced Financial Report generation time by 60%
●Developed, tested, integrated and deployed ETL routines.
●Created design documents such as data flow diagrams, process flow charts, and high-level and low-level design documents.
●Developed ETL data pipelines using SQL views, joins and PL/SQL procedures, improving the overall performance of the data warehouse and saving ~$200K in operational costs.
●Trained junior team members in data analysis, collecting business requirements and data warehouse implementation.
●Extensively used transformations such as Source Qualifier, Expression, Lookup, Update Strategy, Aggregator, Stored Procedure, Filter, Router and Joiner; implemented Lookup transformations to update existing target tables.
●Used Teradata utilities FastLoad and MultiLoad to load the data.
●Good knowledge of Teradata Manager, SQL Assistant and BTEQ.
●Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
●Understood the reporting requirements.
●Developed stored procedures and tested the application with TOAD. Developed PL/SQL procedures and functions implementing business rules to extract data from sources and load data to targets.
●Extensively wrote SQL queries (subqueries, correlated subqueries and join conditions) for data accuracy, data analysis and data extraction needs (an example accuracy check follows this list).
●Extensively used the Debugger to find errors in mappings and fixed them.
●Proactively evaluated data quality and integrity through unit and system testing of Informatica mappings according to business needs.
●Actively involved in creating test data for performance testing using various data generation tools.
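An example of the correlated-subquery style of data-accuracy check referred to in this list, comparing a warehouse total back to its source detail rows; the table and column names are illustrative:

  -- Flag warehouse orders whose total no longer matches the source line items
  SELECT t.order_id, t.total_amount
    FROM dw_orders t
   WHERE t.total_amount <> (SELECT NVL(SUM(s.line_amount), 0)
                              FROM src_order_lines s
                             WHERE s.order_id = t.order_id);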
Environment: Informatica PowerCenter 9.1.0, Teradata, IDQ, Agile, JIRA, DB2, MS SQL Server, Oracle, Flat files, MySQL
Education:
University of Houston-Clear Lake (Houston, TX) Aug 2017 – Present
MS, Computer Science – GPA: 3.33
Organizer of various data analytics events and Business Intelligence groups
Basaveshwar Engineering College (Bagalkot, India) Aug 2009 – Jun 2013
Bachelor of Engineering, Computer Science