
Data Manager

Location:
Cypress, TX
Posted:
October 05, 2020


Shiva Yalamanchili

adgovl@r.postjobfree.com

408-***-****

Professional Summary:

•9+ years of IT experience with expertise in analysis, design, development, and implementation of data warehousing and business intelligence applications, using ETL and BI tools with Oracle, DB2, and MS SQL Server databases on Windows and UNIX platforms.

•Expert-level experience in data integration and data warehousing using the ETL tool Informatica PowerCenter (Source Analyzer, Warehouse Designer, Mapping/Mapplet Designer, Sessions/Tasks, Worklets/Workflow Manager).

•Knowledge of the Informatica tools PowerExchange, PowerAnalyzer, and PowerConnect, as well as data marts, OLAP, and OLTP.

•Highly proficient in data modeling, covering RDBMS concepts, logical and physical data modeling, and multidimensional modeling schemas (star schema, snowflake modeling, facts and dimensions).

•Good knowledge of data warehouse methodologies, ODS, EDW, and metadata repositories.

•Analytical skills: experienced working with large amounts of data (facts, figures, and number crunching) and analyzing it to reach sound conclusions.

•Communication skills: able to present findings and translate data into understandable documents, communicating complex ideas clearly.

•Critical thinking: able to examine numbers, trends, and data and reach new conclusions based on the findings.

•Expertise in Oracle performance tuning using optimizer hints and the EXPLAIN PLAN tool.

•Run and monitor software performance tests on new and existing programs for the purposes of correcting errors, isolating areas for improvement, and general debugging.

•Used PL/SQL features such as built-in functions, analytic functions, cursors, cursor variables, native dynamic SQL, bulk binding, and packages/procedures/functions wherever applicable to process data efficiently with Oracle Loader (a bulk-binding sketch follows this list).

•Highly competent working with large data transfers, with experience in parallel processing.

•Proven experience writing and maintaining complex Oracle SQL, PL/SQL functions, and Oracle packages.

•Attention to detail: data is precise, and analysts have to be vigilant in their analysis to reach correct conclusions.

•Experienced with bitmap indexes, creating database indexes, tables, and other objects, parallelism, and partitioning/subpartitioning.

•Expert-level skill in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, and legacy/COBOL sources.

•Maintain development and unit testing of frameworks.

•Proficient in implementing complex business rules by creating reusable transformations, workflows/worklets, and mappings/mapplets.

•Experience in Performance Tuning of source, target, mappings, transformations and sessions.

•Implemented Slowly Changing Dimensions for incremental loading of the target database.

•Strong expertise in relational database systems like Oracle, SQL Server, and Netezza, with design and database development using SQL, PL/SQL, SQL*Plus, and TOAD.

•Data warehousing domain experience including investment banking, credit cards, and insurance. Proven ability to implement technology-based solutions for business problems.

•Excellent analytical, problem solving and communication skills, with ability to interact with individuals at all levels.

•Strong grasp of data warehousing concepts.

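The bulk-binding technique named above can be illustrated with a minimal PL/SQL sketch. This is not code from any engagement described here; the table names (src_orders, stg_orders) and the batch size are hypothetical placeholders.

  DECLARE
    CURSOR c_src IS
      SELECT order_id, order_amt FROM src_orders;        -- hypothetical source table
    TYPE t_order_tab IS TABLE OF c_src%ROWTYPE;
    l_rows t_order_tab;
  BEGIN
    OPEN c_src;
    LOOP
      FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;   -- fetch rows in batches
      EXIT WHEN l_rows.COUNT = 0;
      FORALL i IN 1 .. l_rows.COUNT                      -- one context switch per batch
        INSERT INTO stg_orders VALUES l_rows(i);         -- stg_orders assumed to match the row shape
    END LOOP;
    CLOSE c_src;
    COMMIT;
  END;
  /

Batching with BULK COLLECT ... LIMIT plus FORALL keeps memory bounded while avoiding the row-by-row context switching that makes plain cursor loops slow.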

Technical Skills:

ETL Tools: Informatica PowerCenter, PowerExchange, PowerConnect, Pervasive Data Integrator (Map Designer, IDE, Process Designer, Data Profiler), Microsoft DTS, SSIS, Oracle packages, SQL Server stored procedures.

Analysis/Reporting Tools: Business Objects, SSRS, Oracle Reports, Tableau, OBIEE.

Databases: Oracle 11g/10g/9i, MS SQL Server 2005/2008, DB2, Netezza.

DB Tools: SQL*Plus, TOAD, Aginity.

Operating Systems: UNIX, Windows XP/NT/2000/98/95.

Languages: C, SQL, PL/SQL, T-SQL, XML, JSON.

Data Modeling: AllFusion ERwin, Visio.

Scripting Languages: UNIX Shell Scripting.

Data Modeling/Methodologies: Logical/Physical/Dimensional, Star/Snowflake, ETL, OLAP, complete software development life cycle.

Work Experience

JPMORGAN CHASE June 24, 2019 - Present

Project Delivery Module Lead

•Working on a project analyzing data from a legacy application, converting it to Oracle, and supporting the new applications.

•Rewrote SQL Server stored procedures as Oracle packages, enhancing them in the process (see the conversion sketch at the end of this section).

•Fixed data quality issues and worked on data enhancement.

•Worked on WIS data loads into the Oracle database.

•Worked on the OLAP process to run the application against the most frequently used data.

•Worked on job scheduling with AutoSys using JIL files.

•Worked on Informatica for ETL applications.

•Worked on fixes to existing SSIS jobs.

•Prepared technical documents as well as application support documents in Confluence and SharePoint.

•Provided the required data support for Tableau reports.

•Knowledge of GIT, Bitbucket, Jenkins, SONAR, SPLUNK, Maven, AIM and Continuous Delivery tools.

•Knowledge of cloud computing (private cloud, public cloud, etc.), with working experience in cloud environments such as AWS.

•Familiarity with the Control-M and AutoSys job schedulers.
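As a rough illustration of the stored-procedure-to-package conversion work noted above: a common T-SQL IF EXISTS ... UPDATE/INSERT pattern collapses into a single Oracle MERGE inside a package. The names (legacy_conv_pkg, customers) are hypothetical placeholders, not objects from the actual project.

  CREATE OR REPLACE PACKAGE legacy_conv_pkg AS
    PROCEDURE upsert_customer(p_id IN NUMBER, p_name IN VARCHAR2);
  END legacy_conv_pkg;
  /

  CREATE OR REPLACE PACKAGE BODY legacy_conv_pkg AS
    -- Replaces a T-SQL IF EXISTS ... UPDATE/INSERT pair with one atomic MERGE.
    PROCEDURE upsert_customer(p_id IN NUMBER, p_name IN VARCHAR2) IS
    BEGIN
      MERGE INTO customers c
      USING (SELECT p_id AS id, p_name AS name FROM dual) s
      ON (c.id = s.id)
      WHEN MATCHED THEN UPDATE SET c.name = s.name
      WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);
    END upsert_customer;
  END legacy_conv_pkg;
  /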

Sprint, Overland Park, KS Sept 18 – June 20

Project Delivery Module Lead

•Excellent analytical, problem solving and communication skills, with ability to interact with individuals at all levels.

•Strong grasp of data warehousing concepts.

•Strong knowledge of domain-based design, data modeling, and data structures.

•Experience as a DB application developer.

•Good working knowledge of Oracle materialized views, Oracle DB Links, and Oracle GoldenGate.

•Experienced with bitmap indexes, creating database indexes, tables, and other objects, parallelism, and partitioning/subpartitioning.

•Good working knowledge of Oracle databases (12c/Exadata).

•Expertise in Oracle performance tuning using optimizer hints and the EXPLAIN PLAN tool.

•Run and monitor software performance tests on new and existing programs for the purposes of correcting errors, isolating areas for improvement, and general debugging.

•Used PL/SQL features such as built-in functions, analytic functions, cursors, cursor variables, native dynamic SQL, bulk binding, and packages/procedures/functions wherever applicable to process data efficiently with Oracle Loader.

•Experience in implementing data solutions in OLTP and ODS environments.

•Expertise in writing SQL to perform end-to-end testing, including data validation and unit testing.

•Perform tuning and optimization of SQL queries using EXPLAIN PLAN and TKPROF (a short example follows this section).

•Proficient in logical and physical data modeling and dimensional data modeling (star schema/snowflake schema), including creation of fact and dimension tables.

•Strong expertise in relational database systems like Oracle, SQL Server, and Netezza, with design and database development using SQL, PL/SQL, SQL*Plus, TOAD, and Aginity.
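A minimal sketch of the hint/EXPLAIN PLAN tuning workflow referenced above. The table (txn) and index (txn_dt_idx) are assumptions for illustration; the engagement's real queries and objects are not reproduced here.

  -- Generate and display the optimizer plan for a hinted query.
  EXPLAIN PLAN FOR
    SELECT /*+ INDEX(t txn_dt_idx) */ t.acct_id, SUM(t.amount)
    FROM   txn t
    WHERE  t.txn_dt >= DATE '2018-01-01'
    GROUP  BY t.acct_id;

  SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);  -- inspect access path, cost, and cardinality

Comparing the plan with and without the hint (and against TKPROF output from an actual run) shows whether the index access path really reduces I/O.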

Bank of the West, San Ramon, CA Sept 17 – Sept 18

Sr. ETL Developer/Data Analyst (Business Intelligence Data warehouse)

Job Duties/ Responsibilities:

•Analyze user requirements, current operational procedures, and functional specifications.

•Implement and maintain multiple ETL processes to synchronize data between different source systems and databases.

•Provide application support for CCAR project.

•Perform tuning and optimization of SQL queries using EXPLAIN PLAN and TKPROF.

•Review and alter database programs to increase operating efficiency.

•Develop database code for the feature enhancement in CCAR & ALMT.

•Test, troubleshoot, and debug database programs.

•Document, test, implement and provide ongoing support for Oracle and ETL applications.

•Involved in different phases of Data Warehouse Life Cycle of CCAR, ALMT including business reporting requirements gathering, source system analysis, logical/physical data modeling, ETL design/development and production support.

•Experienced ETL (Informatica PowerCenter) professional; developed complex mappings using almost all the transformations and latest features, with attention to performance tuning and data quality.

•Expertise in data visualization and dashboard reporting with Business Objects XI 3.1 and Crystal Reports.

•Proficient in logical and physical data modeling and dimensional data modeling (star schema/snowflake schema), including creation of fact and dimension tables using data warehouse construction methodologies.

•Used PL/SQL features such as built-in functions, analytic functions, cursors, cursor variables, native dynamic SQL, bulk binding, and packages/procedures/functions wherever applicable to process data efficiently with Oracle Loader.

•Highly competent working with large data transfers, with experience in parallel processing.

•Proven experience writing and maintaining complex Oracle SQL, PL/SQL functions, and Oracle packages.

•Expertise in writing SQL to perform end-to-end testing, including data validation and unit testing (see the validation sketch after this section).

•Over 9 years of financial domain experience with different banks, including ABN AMRO/RBS, Credit Suisse, and Bank of the West.

•Quick learner, excellent analytical, problem solving, and communication skills with ability to interact with individuals at all levels.

Specialties: Informatica PowerCenter 9.6.1, Oracle 10g/11g, SQL, PL/SQL, UNIX scripting, Tidal scheduler, Business Objects XI 3.1
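The end-to-end validation bullet above can be illustrated with two common reconciliation queries. The schema and table names (src_schema.accounts, dw_schema.dim_account) are placeholders for the sketch, not the bank's actual objects.

  -- Row-count reconciliation between source and target.
  SELECT (SELECT COUNT(*) FROM src_schema.accounts)   AS src_cnt,
         (SELECT COUNT(*) FROM dw_schema.dim_account) AS tgt_cnt
  FROM   dual;

  -- Business keys present in the source but missing from the target.
  SELECT account_id FROM src_schema.accounts
  MINUS
  SELECT account_nk FROM dw_schema.dim_account;

A zero-row MINUS result and matching counts give a quick pass/fail signal before deeper column-level validation.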

Blue Shield of California, San Francisco, CA Feb 16 – Sept 17

Informatica Developer/Analyst (Business Intelligence Data warehouse)

Responsibilities:

•The objective was to extract data from flat files and an Oracle database and load it into a single data warehouse repository in Oracle (Facets).

•Extensively used Informatica Power Center tool (Source Analyzer, Transformation Developer, Warehouse Designer, Mapping Designer and Mapplet Designer) for ETL process.

•Worked extensively on different types of Transformations like Source Qualifier, Expression, Filter, Aggregator, Rank, Update Strategy, Lookup (Connected & Unconnected), Stored Procedures, Sequence Generator, XML Parsing and Joiner.

•Worked in an IDE providing features for authoring, modifying, compiling, deploying, and debugging software.

•Used the crosswalk and CDM portal to reference static data and generate parameter files.

•Led a team of 5, ensuring the requirements and timelines for each release were met.

•Provided quick turnaround on defects and job issues.

•Developed Data Mappings between source systems and warehouse components. Analyzed the mappings to reposition the transformation for optimal performance.

•Responsible for Debugging & Performance Tuning of the Informatica mappings and sessions.

•Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.

•Tuned Performance of Informatica session for large Data Files by increasing block size, data cache size, Sequence buffer length and target-based commit interval.

•Developed reusable mapplets, transformations, and worklets; created and scheduled workflows. Used PL/SQL features such as built-in functions, analytic functions, cursors, cursor variables, native dynamic SQL, bulk binding, and packages/procedures/functions wherever applicable to process data efficiently with Oracle Loader (a native dynamic SQL sketch follows this section).

•Experience in implementing data solutions in OLTP and ODS environments.

•Expertise in writing SQL to perform end-to-end testing, including data validation and unit testing.

•Set up workflows to schedule the loads at different intervals using PowerCenter Workflow Manager; generated completion messages and status reports using Workflow Monitor.

•Analyzed session log files when sessions failed in order to resolve errors in mapping or session configurations.

•Involved in identifying issues in existing mappings by analyzing data flow, evaluating transformations using Debugger.

Environment: Informatica PowerCenter 9.6, PowerCenter Designer, Workflow Manager, Workflow Monitor, Oracle 11g/12c, PL/SQL Developer, SQL*Plus, UNIX, Windows XP/NT.
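A minimal sketch of the native dynamic SQL usage mentioned in the bullets above. The staging table names and the whitelist guard are illustrative assumptions, not the project's actual validation logic.

  CREATE OR REPLACE PROCEDURE reload_stage(p_table IN VARCHAR2) AS
  BEGIN
    -- DDL such as TRUNCATE must go through native dynamic SQL;
    -- the table name is checked against a (hypothetical) whitelist to avoid injection.
    IF p_table NOT IN ('STG_CLAIMS', 'STG_MEMBERS') THEN
      RAISE_APPLICATION_ERROR(-20001, 'Unknown staging table: ' || p_table);
    END IF;
    EXECUTE IMMEDIATE 'TRUNCATE TABLE ' || p_table;
    EXECUTE IMMEDIATE
      'INSERT INTO ' || p_table || ' SELECT * FROM ' || p_table || '_SRC';
    COMMIT;
  END reload_stage;
  /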

TMX Finance, Carrollton, TX April 14 – Feb 16

ETL Developer BIEDW (Business Intelligence Data warehouse)

Responsibilities:

•Worked with Power Center- Designer tool in developing mappings and mapplets to extract and load the data from various sources.

•Parsed High Level design specs to simple ETL coding and mapping standards.

•Developed various transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Stored Procedure, Expression, XML parsing using JSON, and Sequence Generator for loading the data into the target data mart.

•Modified the existing mappings and created the new ones based on specification documents.

•Made extensive use of Aggregators, SQL overrides in Lookups and Source Qualifiers, and Routers to manage data flow into multiple targets.

•Strong expertise in relational database systems like Oracle, SQL Server, and Netezza, with design and database development using SQL, PL/SQL, SQL*Plus, TOAD, and Aginity.

•Developed various reusable transformations using the Transformation Developer in the Informatica PowerCenter Designer.

•Maintained development and unit testing of the framework.

•Implemented Slowly Changing Dimensions for incremental loading of the target database (a Type 2 sketch follows this section).

•Complete knowledge of data warehouse methodologies, ODS, EDW and Metadata repository.

•Responsible for Debugging & Performance Tuning of the Informatica mappings and sessions.

•Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.

•Tuned Performance of Informatica session for large Data Files by increasing block size, data cache size, Sequence buffer length and target-based commit interval.

•Used Maestro for job scheduling.

•Used Korn Shell scripts for Informatica pre-session, post session procedures.

•Created reports per user requirements using extensive Business Objects functionality, allowing users to analyze the data.

Environment: Informatica PowerCenter 9.x, COBOL flat files from mainframes, Informatica PowerAnalyzer, ERwin, SSRS, Maestro, Web Services, InfoView, ClearCase, PL/SQL, SQL Server 2008, Aginity, Windows 2000, Netezza.
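The Slowly Changing Dimension bullet above can be sketched as the classic two-step Type 2 load: close out changed rows, then insert new versions. All object names (dim_customer, stg_customer, dim_customer_seq) and the tracked column (addr) are hypothetical.

  -- Step 1: expire the current version of customers whose tracked attribute changed.
  UPDATE dim_customer d
  SET    d.eff_end_dt  = TRUNC(SYSDATE) - 1,
         d.current_flg = 'N'
  WHERE  d.current_flg = 'Y'
  AND    EXISTS (SELECT 1 FROM stg_customer s
                 WHERE  s.customer_nk = d.customer_nk
                 AND    s.addr <> d.addr);

  -- Step 2: insert a fresh current row for any staged key with no open version
  -- (covers both just-expired customers and brand-new ones).
  INSERT INTO dim_customer
         (customer_sk, customer_nk, addr, eff_start_dt, eff_end_dt, current_flg)
  SELECT dim_customer_seq.NEXTVAL, s.customer_nk, s.addr,
         TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM   stg_customer s
  WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                     WHERE  d.customer_nk = s.customer_nk
                     AND    d.current_flg = 'Y');
  COMMIT;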

HSBC May 10 – Nov 13

Informatica Developer

Responsibilities:

•Created Logical & Physical models and used Erwin for data modeling and Dimensional Data Modeling.

•Provided application support for processes and data related to the Sales, Sales Operations, and Marketing systems.

•Wrote data-loading stored procedures and functions in PL/SQL to move data from source systems into operational data storage (a simplified sketch follows this section).

•Worked with Pervasive Data Integrator in developing mappings and process flows to extract and load the data from various sources.

•Worked with Map Designer, Process Designer, and Data Profiler to create transformations.

•Modified the existing mappings and created the new ones based on specification documents.

•Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.

•Used the Expression Builder with Rapid Integration Flow Language (RIFL) to customize existing pre-built functions as required.

•Developed various reusable transformations using Map Designer.

•Implemented Slowly Changing Dimensions for incremental loading of the target database.

•Responsible for Debugging & Performance Tuning of the Map Designer mappings and process flows.

•Complete knowledge of data warehouse methodologies, ODS, EDW and Metadata repository

•Used Parameters to increase the efficiency of the sessions in the Process flow Manager.

•Pervasive Data Integrator increases operational efficiency with real-time integration.

•Created reports per user requirements using extensive Business Objects functionality, allowing users to analyze the data.

Environment: Informatica PowerCenter 9.x, COBOL flat files from mainframes, Pervasive Data Integrator, SQL Server 2005/2000, mainframe, MS Visio, TOAD, PL/SQL Developer, Windows NT/XP.
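A simplified, hypothetical sketch of the kind of PL/SQL data-loading procedure described above. The tables (src_sales, ods_sales) and the incremental filter are assumptions for illustration only.

  CREATE OR REPLACE PROCEDURE load_ods_sales AS
    v_rows PLS_INTEGER;
  BEGIN
    INSERT INTO ods_sales (sale_id, sale_dt, amount)
    SELECT sale_id, sale_dt, amount
    FROM   src_sales
    WHERE  sale_dt >= TRUNC(SYSDATE) - 1;      -- incremental: yesterday onward
    v_rows := SQL%ROWCOUNT;
    COMMIT;
    DBMS_OUTPUT.PUT_LINE('Loaded rows: ' || v_rows);
  EXCEPTION
    WHEN OTHERS THEN
      ROLLBACK;
      RAISE;                                   -- surface the error to the caller or scheduler
  END load_ods_sales;
  /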

Sun Infotech, India July 09 – March 10

Developer- Solution/Production Development

Responsibilities:

•Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.

•Responsible for mapping and transforming existing feeds into the new data structures and standards utilizing Router, Lookups (Connected, Unconnected), Expression, Aggregator, Update strategy & stored procedure transformation.

•Responsible for identifying reusable logic to build several Mapplets, developed E-Mail Tasks in the Workflow Manager for sending successful or failure messages.

•Involved in documenting project activity throughout the course of the project, along with failure recovery plans.

•Well versed in C, C++, Java, applets, servlets, Oracle 9i, SQL, and Java fundamentals.

•Proficient at grasping new technical concepts quickly and utilizing them productively.

•Excellent proficiency with Apache Tomcat and IIS web servers.

•Outstanding knowledge of the SOAP, TCP/IP, HTTP, UDP, and FTP protocols.

•Exceptional abilities in using source code analysis tools, automated build processes, and unit testing.

•Designed and developed UNIX scripts for creating and dropping tables, used in scheduled jobs.

•Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Informatica Workflow Manager.

•Scheduled workflows to extract, transform, and load data into the warehouse per business requirements; improved session performance through pipeline partitioning.

•Optimized the performance of the mappings by various tests on sources, targets and transformations. Developed Procedures and Functions in PL/SQL for ETL.

Environment: Informatica PowerCenter 8.1.0, XML, DB2, Informatica PowerConnect, Workflow Monitor, ERwin, Windows 2000, Oracle 9i, PL/SQL Developer, SQL Server 2000, TOAD, UNIX, Microsoft Excel.

Education:

B. Tech in Computer Science from AIIT, Hyderabad, India (2009)

Master's in Computer Science from International Technological University (2015)


