
Data Developer

Location:
Hopkins, MN
Posted:
July 18, 2018


Resume:

Halya Dangeti

Phone: 408-***-****

Email: ac6bgc@r.postjobfree.com

SUMMARY

* ***** ** ********* ********** in IT as a PL/SQL/Oracle Developer, Netezza Developer, and Microsoft SQL Server Developer, with expertise in Design, Development, Testing, Technical Documentation, and Support.

Strong data warehousing ETL experience using Informatica Power Center client tools: Mapping Designer, Repository Manager, and Workflow Manager/Monitor.

Understanding of data warehousing concepts and dimensional modeling (star schema and snowflake schema).

Experience in working with relational databases such as Oracle and Netezza.

Worked as an Oracle PL/SQL Developer on platforms including Windows, UNIX, and Linux.

Worked on creating Netezza database objects.

Hands-on experience with the NZSQL and NZLOAD utilities and with control file creation.

Experience in writing complex SQL queries, Netezza procedures, and views.

Well versed in developing various database objects like Packages, Stored Procedures, Functions, Triggers, Tables, Indexes, Partitioning, Constraints, Materialized Views.

Created cursors and ref cursors in PL/SQL blocks.
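
For illustration, a minimal sketch of this kind of ref cursor usage (table and column names are hypothetical):

    DECLARE
      v_rc   SYS_REFCURSOR;   -- weakly typed ref cursor
      v_name VARCHAR2(100);
    BEGIN
      -- The query text is chosen at run time and bound with a placeholder.
      OPEN v_rc FOR 'SELECT ename FROM emp WHERE deptno = :1' USING 10;
      LOOP
        FETCH v_rc INTO v_name;
        EXIT WHEN v_rc%NOTFOUND;
        DBMS_OUTPUT.PUT_LINE(v_name);
      END LOOP;
      CLOSE v_rc;
    END;
    /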

Worked with functions and query rewrite in Oracle 11g/10g/9i databases.

Used EXPLAIN PLAN and hints for performance tuning of SQL and PL/SQL queries.
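
As a sketch, tuning of this kind typically pairs a hint with EXPLAIN PLAN to confirm the intended access path (table and index names are hypothetical):

    EXPLAIN PLAN FOR
      SELECT /*+ INDEX(o ord_cust_ix) */ o.*
      FROM   orders o
      WHERE  o.customer_id = 42;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);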

Worked on Dynamic SQL and Exception Handling.
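
A minimal sketch combining the two, assuming a hypothetical staging table:

    DECLARE
      v_cnt NUMBER;
    BEGIN
      -- Table name resolved at run time via native dynamic SQL.
      EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM stg_orders' INTO v_cnt;
      DBMS_OUTPUT.PUT_LINE('Rows staged: ' || v_cnt);
    EXCEPTION
      WHEN OTHERS THEN
        -- Log the Oracle error, then re-raise so the failure is not swallowed.
        DBMS_OUTPUT.PUT_LINE('Load check failed: ' || SQLERRM);
        RAISE;
    END;
    /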

Well experienced in writing complex views for the reporting team.

Solid experience in writing complex SQL queries and PL/SQL procedures to extract data from various source tables.

Proficient in loading data from flat files into database tables using SQL*Loader scripts.

Involved in SQL performance tuning.

Worked with DBMS_PROFILER to find performance bottlenecks in PL/SQL procedures.
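
In outline, a profiling run brackets the procedure under test (the package and procedure names here are hypothetical); line-level timings then land in the PLSQL_PROFILER_RUNS, PLSQL_PROFILER_UNITS, and PLSQL_PROFILER_DATA tables:

    BEGIN
      DBMS_PROFILER.START_PROFILER('nightly load run');
      my_pkg.load_orders;          -- procedure being profiled
      DBMS_PROFILER.STOP_PROFILER;
    END;
    /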

Extensive experience developing ETL programs for data extraction, transformation, and loading using Informatica Power Center.

Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Power Exchange, and Power Connect as ETL tools on Oracle and Netezza databases.

Hands-on experience creating Informatica mappings, sessions, and workflows.

Worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, Sequence Generator, XML Source qualifier, XML Parser, XML generator and Web service consumer/producer.

Worked extensively with slowly changing dimensions (SCD Types 1 and 2).
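
In SQL terms, the Type 1 (overwrite) pattern is roughly a MERGE like the sketch below; Type 2 additionally end-dates the current dimension row and inserts a new row with fresh effective dates (all names here are illustrative, not from any specific project):

    MERGE INTO dim_customer d
    USING src_customer s
    ON (d.customer_id = s.customer_id)
    WHEN MATCHED THEN UPDATE SET
      d.customer_name = s.customer_name,
      d.city          = s.city
    WHEN NOT MATCHED THEN INSERT (customer_id, customer_name, city)
      VALUES (s.customer_id, s.customer_name, s.city);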

Extensively worked on Informatica performance tuning, resolving source-, target-, and mapping-level bottlenecks.

Tuned existing mappings to reduce total ETL processing time.

Wrote stored procedures and UNIX scripts as a part of ETL Process.

Strong experience in providing technical documentation and status reports of the applications as per the requirement.

Involved in creating detailed high-level design (HLD) and low-level design (LLD) documentation covering all business requirements and technical specifications; also created Unit Test Plan (UTP) documents.

Efficient in creating test cases; performed data validation and process testing for applications moving into production.

Independently perform complex troubleshooting, root-cause analysis and solution development.

Good communication, interpersonal, and analytical skills, with a strong ability to perform as part of a team.

Ready to take up challenging tasks.

Quick to acquire additional application skills as needed to complete the job.

Skill Set

ETL Tools: Informatica

Development Tools: Oracle Developer, Toad, Aginity Workbench for Netezza

Other Tools: Quality Center (QC), Rational ClearCase, Discovery tool

Databases and Languages: Oracle 10g/11g, Netezza, PL/SQL, Microsoft SQL Server, MySQL, Python

Data Modelling: ERwin Data Modeler, Toad Data Modeler.

O/S: Windows 2000/XP, Windows 7, UNIX, Linux

Education:

Bachelor's in Information Technology, 2009, CIET, India

PROFESSIONAL EXPERIENCE

Cisco Systems – San Jose, CA Sep 2016 – June 2018

Role: PL/SQL / ETL Developer

Description: The key objective of this project is to provide a consolidated view of all the licenses, subscriptions, and their usage for each Cisco customer under one roof. It involves real-time data population from the various systems in which customers generate entitlements or start using the entitlements they purchased. The challenge is near-real-time data sync to canonicals for the vast volume of license transactions that occur across the different license entitlement and consumption channels.

Responsibilities:

Involved in writing Packages, Functions, Stored Procedures and Database Triggers.

Writing PL/SQL code using the technical and functional specifications.

Coordinated with the front-end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.

Designed, developed, and supported the Extraction, Transformation and Load (ETL) process for data migration with Informatica Power Center.

Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.

Created indexes on the tables for faster retrieval of the data to enhance database performance.

Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like Toad and PL/SQL Developer.

Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.

Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
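
For example, a materialized view for remote reporting might look like the following sketch (object and column names are hypothetical):

    CREATE MATERIALIZED VIEW mv_daily_sales
      REFRESH COMPLETE ON DEMAND
      AS SELECT TRUNC(order_date) AS order_day,
                SUM(amount)       AS total_amt
         FROM   orders
         GROUP  BY TRUNC(order_date);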

Involved in Data profiling using IDQ.

Enhanced previous-iteration Informatica mappings to load data from source systems to staging tables and then to the data mart.

Loaded incremental and historical data into data marts using Autosys jobs.

Created SCD1 and SCD2 mappings using CDC and Oracle procedures.

Created reusable components for defaulting and checksum generation.

Updated session statistics in control tables after session completion.

Implemented control tables for audit purposes.

Involved in creation of views for reporting purpose.

Involved in Data loading using SQL queries and procedures for reporting team.

Tracked defects on a regular basis and brainstormed with the offshore team to prevent recurrence of defects.

Generated SQL and PL/SQL scripts to install, create, and drop database objects including Tables, Views, Primary Keys, Indexes, Constraints, Sequences, and Synonyms.

Created stored procedures, functions, packages, database triggers and cursors based on requirement.

Involved in performance tuning of SQL queries.

Wrote shell scripts to remove Control-M (^M) characters from files.

Created SQL*Loader control files for moving the data from flat files to staging area tables.
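
A minimal control-file sketch for such a load (file, table, and column names are hypothetical):

    LOAD DATA
    INFILE 'customers.dat'
    BADFILE 'customers.bad'
    APPEND
    INTO TABLE stg_customer
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (customer_id, customer_name, city, load_dt SYSDATE)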

Created all database objects like Tables, Views, Sequences, Triggers and Materialized Views.

Wrote complex SQL queries, subqueries, and nested loops for faster data retrieval.

Worked extensively in SQL Developer for database development and tuning.

Developed migration scripts for initial full data loads in Test and Production environments.

Used bulk collections, indexes, and materialized views to improve query execution.
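
A short sketch of the bulk-binding pattern this refers to (table and column names are hypothetical):

    DECLARE
      TYPE t_ids IS TABLE OF orders.order_id%TYPE;
      v_ids t_ids;
    BEGIN
      -- One round trip to fetch the keys...
      SELECT order_id BULK COLLECT INTO v_ids
      FROM   orders
      WHERE  status = 'CANCELLED';

      -- ...then one bind-array round trip for the DML.
      FORALL i IN 1 .. v_ids.COUNT
        DELETE FROM order_lines WHERE order_id = v_ids(i);
    END;
    /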

Worked in SQL query tuning and performance improvement using Hints, Indexes and Partitions.

Developed complex queries to retrieve the data.

Implemented partitioning strategies for very large tables using range and list partitions and hash subpartitions.
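
For illustration, range partitioning with hash subpartitions on a hypothetical fact table:

    CREATE TABLE fact_txn (
      txn_id  NUMBER,
      txn_dt  DATE,
      cust_id NUMBER,
      amount  NUMBER
    )
    PARTITION BY RANGE (txn_dt)
    SUBPARTITION BY HASH (cust_id) SUBPARTITIONS 4
    (
      PARTITION p2018q1 VALUES LESS THAN (DATE '2018-04-01'),
      PARTITION p2018q2 VALUES LESS THAN (DATE '2018-07-01'),
      PARTITION pmax    VALUES LESS THAN (MAXVALUE)
    );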

Handled tables, indexes, and partitioning; involved in data cleaning, merging, and conversion.

Created indexes for faster access of data.

Used regular expressions in SQL queries.
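
For example, REGEXP_LIKE can validate a column against a pattern (the table and column here are hypothetical):

    SELECT *
    FROM   stg_customer
    WHERE  REGEXP_LIKE(phone, '^[0-9]{3}-[0-9]{3}-[0-9]{4}$');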

Created test data for unit testing.

Documented business rules for writing triggers and constraints, functional and technical design, test cases and user guides.

Involved in testing all forms and PL/SQL code for logic correctness; performed unit testing on queries and reports.

Environment: Informatica 9.6, Oracle 11g

Team Size: 20

Bank of America, IBM India Sep 2015 – Aug 2016

Role: PL/SQL Developer

Description: The Bank of America Corporation is an American multinational banking and financial services corporation headquartered in Charlotte, North Carolina. The bank's 2008 acquisition of Merrill Lynch made Bank of America the world's largest wealth management corporation and a major player in the investment banking market.

Responsibilities:

Involved in creating detailed Functional Documentation which includes all Business requirements and Technical specifications.

Interacted extensively with business users and Business Systems Analysts for requirements gathering and analysis.

Designed logical and physical data models for master data and transactional data.

Developed advanced packages, procedures, triggers, functions, indexes, materialized views, and collections to implement business logic.

Developed database objects including tables, indexes, views, sequences, packages, triggers, and procedures to troubleshoot database problems.

Understood legacy system data and designed the target DB schema.

Worked on database application development, query optimization, performance tuning, and DBA solutions, with implementation experience across the complete System Development Life Cycle.

Performance-tuned the PL/SQL code of a stored procedure that migrates multi-million-record data from a source table to a destination table.

Improved the performance of slow SQL queries by implementing indexes and using FORALL, BULK COLLECT, and hints, and developed various procedures, functions, and packages to implement new business logic.

Performed SQL, PL/SQL, and application tuning using EXPLAIN PLAN and DBMS_PROFILER.

Worked on migration from Oracle 11g to Oracle 12c.

Worked on Extraction, Transformation and Loading (ETL) of data using Informatica Power Center.

Designed and developed ETL mappings to extract master and transactional data from heterogeneous data feeds and load it.

Used the Bulk Collect feature to improve performance; experienced in performance tuning via table partitioning.

Worked extensively with cursors and reference cursors.

Used SQL*Loader to load data from flat files (CSV) into the development database.

Developed UNIX shell scripts to remove Control-M (^M) characters.

Wrote SQL statements to extract data into flat files for data loading, data conversion, and migration.

Involved in peer review of code/UTPs delivered by offshore.

Took responsibility for assigning tasks to team members and seeing them through to completion.

Environment: Oracle 10g

Team Size: 15

Bank of Montreal, IBM India Aug 2014 – Sep 2015

Role: SQL/ETL Developer

Description: Bank of Montreal Financial Group is one of the largest banks in Canada. The objective of this project is to convert ETL code from Oracle to SQL Server. The conversion process creates the SQL and builds SSIS packages; the converted SQL then goes through unit testing to verify the code, and the code is validated in the QA environment by comparing output from DB2 with the converted code in Netezza.

Responsibilities:

Developed scripts in Oracle to load data from source to target and executed them by creating packages in SSIS.

Tuned SQL queries that were taking a long time to load data into target tables.

Developed detailed test cases for all PL/SQL procedures.

Involved in deploying code to TEST/QA/PROD and promptly supported QA/UAT issues.

Tracked defects on a regular basis and fixed them without introducing new issues.

Interacted with data mappers on the client side to resolve and validate issues during analysis.

Attended onshore team calls on a daily basis.

Involved in peer review of code/UTPs delivered by offshore.

Helped team members whenever they got stuck while coding or fixing defects.

Gave knowledge transfer (KT) sessions to new team members joining the team.

Environment: Oracle database

Team Size: 15

CIBC (Canadian Imperial Bank of Commerce), IBM India Aug 2013 – July 2014

Role: Netezza Database Developer

Description: The purpose of the Business Banking Customer Profitability project (BBCP) of the client Canadian Imperial Bank of Commerce (CIBC) is to develop a new profitability analysis platform for Business Banking and expand its usage from the over $5MM credit segment to all client credit segments ($0 to over $5MM).

Other objectives of this project include:

Standardizing and automating the data feeds.

Consolidation of business rules and logic.

Elimination of data redundancy and improvement in data integrity.

Significantly reduce or eliminate the amount of manual intervention required.

Improve the current reconciliation capabilities with the General Ledger (GL).

Responsibilities:

Developed procedures in Netezza to load the data from source to target in the current iteration.

Developed database objects including tables, views, sequences.

Created procedures to develop functionality.

Used the NZLOAD utility to load data from files into temporary staging tables.

Created NZLOAD control files to load different data sets.
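
A typical invocation looks roughly like the line below (database, table, and file names are hypothetical; nzload can also read these settings from a control file via -cf):

    nzload -db dev_db -t stg_customer -df customers.dat -delim '|'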

Developed detailed Test cases for all Netezza procedures.

Used NZSQL utilities to extract data into files when required.

Involved in deploying code to TEST/QA/PROD and promptly supported QA/UAT issues.

Tracked defects on a regular basis and fixed them without introducing new issues.

Interacted with data mappers on the client side to resolve and validate issues during analysis.

Involved in peer review of code/UTPs delivered by offshore.

Helped team members whenever they got stuck while coding or fixing defects.

Gave knowledge transfer (KT) sessions to new team members joining the team.

Environment: Netezza database

Team Size: 10

Oakam, India Feb 2010 - July 2013

Role: PL/SQL / ETL Developer

Description: Oakam provides straightforward, simple financial services for people who might find it difficult to borrow from banks. As part of the project, we developed mappings for installment recovery, delinquency-rate calculation to assess customer risk, periodic reward-eligibility checks, and reporting for business analysis.

Responsibilities:

Designed, developed, and supported the Extraction, Transformation and Load (ETL) process for data migration with Informatica Power Center.

Involved in extracting the data from the Flat Files and Relational databases into staging area.

Developed mappings/sessions using Informatica Power Center for initial loading and incremental loading.

Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, and source filters in Source Qualifiers, and managed data flow into multiple targets using Router transformations.

Created mappings, which involved Slowly Changing Dimensions.

Created procedures for calculating loan total repayments and bonus rewards and used them in the Stored Procedure transformation.

Environment: Informatica 8.6, Oracle 10g

Team Size: 5


