
Data Developer

Location:
Princeton, NJ
Salary:
$70/hr
Posted:
October 11, 2019


Resume:

Bharath Malinedi

Somerville, NJ *****

Ph: 732-***-****

Email ID: adakhx@r.postjobfree.com

Summary:

8+ years of experience developing Oracle database applications and ETL solutions across the Insurance, Banking, Pharmaceutical, Retail, and Telecom industries.

Expertise in designing and developing high-quality database applications with Oracle SQL and PL/SQL, with proficiency in Oracle 12c/11g/10g/9i and UNIX shell scripting.

Extensive experience in database development and data modeling using tools such as ER/Studio and Erwin 3.5/3.x, covering physical, logical, and dimensional modeling (dimension and fact tables), with a strong understanding of star-schema and snowflake modeling; also experienced in ER modeling.

Proficient in developing and maintaining PL/SQL packages, procedures, functions, triggers, joins, subqueries, correlated subqueries, views, materialized views, partitioning, cursors and ref cursors, constraints, transactions, tables, DB links, and indexes.

Excellent skills in writing advanced SQL queries such as subqueries, nested subqueries, and correlated subqueries, along with set operators.

Expert in working with advanced PL/SQL concepts such as pipelined functions (PIPE ROW), nested tables, VARRAYs, records, object types, bulk collects, and bulk binds.
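
The pipelined-function technique mentioned above streams rows to a caller as they are produced instead of materializing a whole collection first. A minimal illustrative sketch (all names are hypothetical, not from any project described here):

```sql
-- Schema-level collection type the pipelined function returns.
CREATE OR REPLACE TYPE t_num_tab IS TABLE OF NUMBER;
/

-- Generates the numbers 1..p_limit, emitting each via PIPE ROW.
CREATE OR REPLACE FUNCTION gen_series (p_limit IN PLS_INTEGER)
  RETURN t_num_tab PIPELINED
IS
BEGIN
  FOR i IN 1 .. p_limit LOOP
    PIPE ROW (i);   -- each row is streamed to the caller immediately
  END LOOP;
  RETURN;
END;
/

-- Consumed like a regular table:
SELECT COLUMN_VALUE FROM TABLE(gen_series(5));
```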

As an ETL developer, expertise in data warehouse and reporting tools such as Informatica, Cognos, and QlikView, as well as OLTP/OLAP implementations and normalized data structures.

Experience in data warehousing application development using Informatica PowerCenter tools: Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, Informatica Server, and Repository Server Manager.

Experience in data warehouse design using star schema, snowflake schema, and star-joined schemas.

Experience across all stages of the Software Development Life Cycle (SDLC), including requirement gathering, design, development, unit testing, systems integration, user acceptance testing, and preparing technical design documents.

Good experience with the Erwin data modeler, working on ER diagrams during the design phase of projects.

Extensively used various performance tuning and optimization techniques such as hints, indexes, Explain Plan, the trace utility, and TKPROF.

Good experience in data migration techniques using Oracle external tables, SQL*Loader, Import/Export, and batch processing.
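
One common external-table pattern from the techniques listed above: expose a flat file as a queryable table, then load it with plain SQL. A hedged sketch (table, file, and directory names are illustrative; it assumes an Oracle DIRECTORY object already exists):

```sql
-- Hypothetical external table over a pipe-delimited flat file.
CREATE TABLE ext_orders (
  order_id   NUMBER,
  order_date VARCHAR2(10),           -- converted with TO_DATE on load
  amount     NUMBER(12,2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir         -- assumed DIRECTORY object
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('orders.dat')
)
REJECT LIMIT UNLIMITED;

-- Load into a target table with ordinary SQL:
INSERT INTO orders (order_id, order_date, amount)
SELECT order_id, TO_DATE(order_date, 'YYYY-MM-DD'), amount
FROM   ext_orders;
```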

Extensively worked with the incident management tool (ITSME), Service Center, the GIT version control tool, and other problem-management tools while serving as a production support executive on the TTSL project.

Extensive hands-on experience with tools such as Toad, SQL Developer, TRU Console, TRU Migrate, TRU Compare, PDF Generator, PuTTY, Autosys, UC4, HP Service Center, Control-M, and ITSME.

Knowledge of Core Java and Java frameworks such as Struts and Hibernate, along with some advanced concepts.

Expertise in migrating pharmaceutical clinical and non-clinical data from Documentum to the Veeva Vault cloud system.

Skill Set:

Databases

Oracle 12c, 11g, 10g, SQL Server 2012.

ETL Tools

Informatica PowerCenter 8.X/9.X, Cognos, QlikView, SSRS

Languages

Oracle PL/SQL, SQL, UNIX Shell Scripting, Python, Core Java, DQL, VQL

Tools

Toad, SQL Developer, PuTTY, SQL*Plus, FileZilla, Erwin Data Modeler, WinSCP, UC4, Autosys, Control-M, HP Service Center, GitHub, SVN version control, Amazon Redshift, ITSME

Operating Systems

Windows Server 2010, Windows 7, HP-UX, Linux

PROFESSIONAL EXPERIENCE:

Prudential Financial, Greater New York Area Jan 2019 – Present

Senior Oracle PL/SQL Developer

At Prudential, I worked on the Tax Data Warehouse project, a front-end display of forecasted tax data. I was responsible for loading data from different source applications, along with their estimated projections, for tax forecasting purposes.

Responsibilities:

Created scripts to create new tables, views, queries for new enhancement in the application using SQL Developer.

Worked on SQL*Loader to load data from flat files obtained from various facilities every day.

Developed PL/SQL triggers and master tables for automatic creation of primary keys.

Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.

Developed new PL/SQL code to archive, restore, and purge data from source to target using database links, for both stand-alone and master/detail tables.

Extensively used PIVOT and UNPIVOT to create reports from the views.
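
The PIVOT technique above turns row values into columns for reporting. A minimal hedged sketch (table and column names are illustrative, not from the Prudential schema):

```sql
-- Hypothetical: monthly totals per account flipped into columns.
SELECT *
FROM (
  SELECT account_id,
         TO_CHAR(txn_date, 'MON') AS txn_month,
         amount
  FROM   transactions
)
PIVOT (
  SUM(amount) FOR txn_month IN ('JAN' AS jan, 'FEB' AS feb, 'MAR' AS mar)
)
ORDER BY account_id;
```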

Developed complex queries using hierarchical and analytic query logic.
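
Combining the two query styles named above in one statement looks roughly like this; the sketch uses the classic EMP table shape as a stand-in for the real schema:

```sql
-- Hierarchical walk of a manager/employee tree plus an analytic
-- running total per department (illustrative names).
SELECT LEVEL,
       LPAD(' ', 2 * (LEVEL - 1)) || ename            AS org_chart,
       SUM(sal) OVER (PARTITION BY deptno
                      ORDER BY empno)                 AS dept_running_sal
FROM   emp
START WITH mgr IS NULL                 -- root of the hierarchy
CONNECT BY PRIOR empno = mgr;          -- child rows point at their manager
```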

Created indexes on the tables for faster retrieval of the data to enhance database performance.

Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.

Extensively used hints to direct the optimizer toward an optimal query execution plan.
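
A typical hint-plus-plan-check workflow for the bullet above, sketched with a hypothetical index name:

```sql
-- Force an index access path, then inspect the resulting plan.
EXPLAIN PLAN FOR
SELECT /*+ INDEX(o orders_date_ix) */ order_id, amount
FROM   orders o
WHERE  order_date >= DATE '2019-01-01';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```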

Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.
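The bulk pattern described above batches fetches and DML so the runtime switches between the SQL and PL/SQL engines once per batch rather than once per row. A minimal sketch, with hypothetical staging and mart tables:

```sql
DECLARE
  TYPE t_stage_tab IS TABLE OF staging_orders%ROWTYPE;
  l_rows t_stage_tab;
  CURSOR c_src IS SELECT * FROM staging_orders;
BEGIN
  OPEN c_src;
  LOOP
    -- LIMIT caps memory use per batch.
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;

    -- FORALL sends the whole batch to the SQL engine in one round trip.
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO mart_orders VALUES l_rows(i);

    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```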

Partitioned the fact tables and materialized views to enhance the performance.

Extensively used bulk collection in PL/SQL objects to improve performance.

Created records, tables, collections for improving Query performance by reducing context switching.

Used PRAGMA AUTONOMOUS_TRANSACTION to avoid the mutating-table problem in database triggers.
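
The autonomous-transaction pattern above lets a procedure commit independently of the calling transaction, which is why a trigger can safely log activity without touching the mutating table. An illustrative sketch (table and procedure names are hypothetical):

```sql
CREATE OR REPLACE PROCEDURE log_audit (p_msg IN VARCHAR2)
IS
  PRAGMA AUTONOMOUS_TRANSACTION;   -- runs in its own transaction
BEGIN
  INSERT INTO audit_log (logged_at, message)
  VALUES (SYSDATE, p_msg);
  COMMIT;                          -- commits only this autonomous transaction
END;
/
```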

Created and modified several UNIX Shell Scripts according to the changing needs of the project and client requirements.

Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.

Handled errors extensively with exception handling to ease debugging and to display error messages in the application.
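
The exception-handling style described above generally pairs named handlers with a catch-all that surfaces the Oracle error text. A hedged sketch with illustrative names:

```sql
BEGIN
  UPDATE accounts SET balance = balance - 100 WHERE account_id = 42;
  IF SQL%ROWCOUNT = 0 THEN
    RAISE NO_DATA_FOUND;           -- treat "no row updated" as an error
  END IF;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    DBMS_OUTPUT.PUT_LINE('Account not found');
  WHEN OTHERS THEN
    -- surface the Oracle error text for debugging, then re-raise
    DBMS_OUTPUT.PUT_LINE('Unexpected: ' || SQLERRM);
    RAISE;
END;
/
```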

Environment: Oracle 12c, PL/SQL, TOAD, SQL Developer, Informatica, Putty, UNIX, Shell Scripting, Autosys, BPC Tool, UC4, HP Service center.

Sanofi, Basking Ridge, NJ Apr 2017 – Dec 2018

Senior Oracle PL/SQL Developer

At Sanofi, I worked as a Senior Oracle PL/SQL Developer, primarily handling pharmaceutical data: managing pharma-clinical, regulatory, safety, and quality data for migration to the Veeva Vault CRM system, and working on the related database activities.

Responsibilities:

Involved in logical data modeling, including developing and maintaining data model diagrams (E-R diagrams), implementing physical designs, and facilitating data-model requirements gathering.

Gathered user requirements and requirement specifications from users for the outbound calling and Campaign Management modules.

Documented business requirements, functional specifications, and test requirements.

Participated in logical and physical database design and developed the logical dimensional models using Erwin 9.8.

Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like Toad and SQL Developer.

Designed and developed triggers and packages using PL/SQL.

Created indexes on tables and optimized stored procedure queries.

Wrote scripts for batch processing and data loading using SQL*Loader.

Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.

Created PL/SQL scripts to extract the data from the operational database into simple flat text files using UTL_FILE package.
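A minimal sketch of the UTL_FILE extract pattern described above; it assumes a DIRECTORY object (here called EXTRACT_DIR) and uses hypothetical table and column names:

```sql
DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'accounts.txt', 'w');
  FOR r IN (SELECT account_id, balance FROM accounts) LOOP
    -- write one pipe-delimited line per row
    UTL_FILE.PUT_LINE(l_file, r.account_id || '|' || r.balance);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);   -- don't leak the file handle on error
    END IF;
    RAISE;
END;
/
```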

Worked on PL/SQL code tuning, system analysis, design, coding, testing, development, and documentation.

Worked on both proactive and reactive SQL Tuning and Query Optimization Techniques for complex SQL Statements. Used Performance Monitor and SQL Profiler to optimize queries and enhance the performance of databases.

Partitioned the tables and materialized views to enhance the performance.

Extensively used bulk collection in PL/SQL objects to improve performance.

Environment: Oracle 12c, Toad, SQL Loader, UNIX Shell Scripts, Putty, ERWIN 9.8, Cognos, UC4, HP Service center, GIT Version control tool.

Norwegian Cruise Line, Miami, FL Jul 2016 – Mar 2017

Oracle PL/SQL Developer

At Norwegian Cruise Line, I worked on the Pronto project, a front-end display of upcoming cruise details. I was responsible for loading data on cruise deals, along with estimated pricing and other details used for business forecasting.

Responsibilities:

Coordinated with the front-end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.

Worked on SQL*Loader to load data from flat files obtained from various facilities every day.

Developed PL/SQL triggers and master tables for automatic creation of primary keys.

Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.

Created logins, users, and roles in the context of SQL Server security. Developed new PL/SQL code to archive, restore, and purge data from source to target using database links, for both stand-alone and master/detail tables.

Developed ETL programs (mappings, sessions, workflows) in Informatica and created complex reusable mapplets.

Created scripts to create new tables, views, queries for new enhancement in the application using TOAD.

Created indexes on the tables for faster retrieval of the data to enhance database performance.

Involved in data loading using PL/SQL and SQL*Loader calling UNIX scripts to download and manipulate files.

Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan.

Used Bulk Collections for better performance and easy retrieval of data, by reducing context switching between SQL and PL/SQL engines.

Partitioned the fact tables and materialized views to enhance the performance.

Extensively used bulk collection in PL/SQL objects to improve performance.

Created records, tables, collections for improving Query performance by reducing context switching.

Used PRAGMA AUTONOMOUS_TRANSACTION to avoid the mutating-table problem in database triggers.

Created and modified several UNIX Shell Scripts according to the changing needs of the project and client requirements.

Wrote UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from file names, unzipping files, and removing junk characters before loading them into the base tables.

Extensively used the advanced features of PL/SQL like Records, Tables, Object types and Dynamic SQL.

Handled errors using Exception Handling extensively for the ease of debugging and displaying the error messages in the application.

Environment: Oracle 11g, PL/SQL, TOAD, SQL*Plus, Informatica, PuTTY, UNIX, Shell Scripting, ERWIN 7.1, Cognos, UC4, HP Service Center, SSIS, SSRS.

Pfizer, Peapack, NJ Nov 2014 - Jun 2016

Senior PL/SQL & ETL Developer

At Pfizer, I worked on the E1 Integration project, whose purpose was to roll out a new financial system to Pfizer employees. Data from the PeopleSoft HRMS was migrated to the GRW database system.

Responsibilities:

Handled the requirement gathering by participating in meetings with the clients on regular intervals, closely worked with Business Analyst and Project manager for the proper requirements documentation.

Analyzed user requirements and prepared the technical design documents needed to proceed with development.

Performed impact analysis on the existing data objects in the GRW environment.

Created and maintained database objects like Tables, Views, Sequences, Synonyms, Indexes, Stored Procedures, Triggers, Transactions, Functions and Packages.

Created materialized views refreshed at regular intervals to make the data available to downstream users.
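A self-refreshing materialized view of the kind described above can be sketched as follows (view, table, and interval are illustrative, not from the GRW system):

```sql
-- Hypothetical MV that fully refreshes itself every hour.
CREATE MATERIALIZED VIEW mv_daily_positions
  BUILD IMMEDIATE
  REFRESH COMPLETE
  START WITH SYSDATE
  NEXT SYSDATE + 1/24          -- 1/24 of a day = hourly
AS
SELECT account_id,
       TRUNC(txn_date) AS as_of,
       SUM(amount)     AS net_amount
FROM   transactions
GROUP  BY account_id, TRUNC(txn_date);
```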

Responsible for implementing the functionality to load data from PeopleSoft environment to GRW.

Used Bulk Collections for better performance by reducing context switching between SQL and PL/SQL Engines.

Extensively used the advanced features of PL/SQL Records, Tables, Object Types and Dynamic SQL.

Developed Database Triggers for audit and validation purpose.

Wrote complex SQL Statements, Complex Joins, Co-related Sub-queries and SQL Statements with Analytical Functions.

Used table-partitioning techniques on tables holding large data volumes to improve data access and performance.
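
Range partitioning by date, as referenced above, can be sketched like this (hypothetical table; queries that filter on the partition key prune to the relevant partition):

```sql
CREATE TABLE payroll_fact (
  emp_id   NUMBER,
  pay_date DATE,
  amount   NUMBER(12,2)
)
PARTITION BY RANGE (pay_date) (
  PARTITION p_2015 VALUES LESS THAN (DATE '2016-01-01'),
  PARTITION p_2016 VALUES LESS THAN (DATE '2017-01-01'),
  PARTITION p_max  VALUES LESS THAN (MAXVALUE)   -- catch-all partition
);
```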

Worked on SQL*Loader and Import/Export to load data from flat files obtained from various facilities every day.

Used Exception Handling techniques extensively for the ease of debugging and displaying the error messages in the Application.

Created and modified UNIX shell Scripts according to the changing needs of the project.

Developed test cases and performed Unit Testing and Integration Testing of the Repositories.

Environment: Oracle 11g, PL/SQL, TOAD, SQL*Plus, PuTTY, ERWIN 7.1, UNIX, Shell Scripting, Cognos, UC4, HP Service Center, PeopleSoft HRMS, GIT Version control tool.

Estee Lauder (ESTL), New York Feb 2014 - Oct 2014

Oracle PL/ SQL Developer

At Estée Lauder, I was part of the Inventory Optimization team, responsible for loading order, purchase, and stock-related data from various vendors. We built up the database used for making strategic decisions such as business forecasting.

Responsibilities:

Responsible for loading order and purchase data into the SmartOps optimizer from different inbound systems, and for processing data back to the corresponding outbound systems.

Created entity relationship diagrams and multidimensional data models, reports and diagrams for marketing.

Used the Model Mart feature of Erwin for effective model management: sharing, dividing, and reusing model information and designs to improve productivity.

Worked on PL/SQL code tuning, system analysis, design, coding, testing, development, and documentation.

Created and maintained all database objects such as Tables, views, indexes, sequences, constraints, procedures, functions.

Created materialized views refreshed at regular intervals to make the data available to downstream users.

Designed and developed triggers and packages using PL/SQL.

Created indexes on tables and optimized stored procedure queries.

Wrote scripts for batch processing and data loading using SQL*Loader.

Monitored batch processing with the Control-M tool.

Environment: Oracle 10g, PL/SQL, Autosys, SQL plus, Putty, ERWIN Modeler, Toad, SQL Developer, UNIX Shell Scripting, Control M

KeyBank, Cleveland, Ohio Feb 2012 - Jan 2014

Informatica & PL/SQL Developer

At KeyBank, I worked on the Personalization project, in which we developed a database application that presents customers with offers from bank promotions tailored to their interests.

Responsibilities:

Analyzed the requirements and framed the business logic for the ETL Processes.

Designed Conceptual Model as an architectural migration target for new ODS model and redesigned the key structures of the Data Warehouse (tables, MVs, Indexes).

Developed Data Analysis for Erwin Logical and Physical model design and worked on ER diagrams.

Used Erwin relational tool for logical and physical model for target ODS model.

Documented and communicated changes to the OLTP and OLAP systems and the data warehouse teams, including the data modeling, ETL, and database teams.

Created database objects like Tables, Views, Sequences, Synonyms, Stored Procedures, Functions, Packages, Cursor, Ref Cursor and Triggers.

Developed Database Triggers for audit and validation purpose.

Wrote complex SQL Statements, Complex Joins, Co-related Sub-queries and SQL Statements with Analytical Functions.

Extensively used Hints to direct the optimizer to choose an optimum query Execution Plan.

Extensively used Bulk Collection in PL/SQL Objects for improving the performance.

Handled errors using Exception Handling extensively for debugging and maintainability.

Created mappings using the Designer, extracted data from flat files and other RDBMS sources into the staging area, and loaded it into the data warehouse.

Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and data-flow management into multiple targets using Router transformations.

Performed unit and integration testing to ensure the deliverables matched the business requirements.

Environment: Oracle 10g, PL/SQL, Informatica Power center, Erwin data modeler, SQL plus, Putty, Toad, UNIX, Shell Scripting.

Tata Teleservices Limited (TTSL), Hyderabad Mar 2011 – Jan 2012

Production Support Executive

As a production support executive, I supported the TIPPS front-end application, which was used by customer care centers across India. We maintained all Docomo customer data and were responsible for data migrations and for working on the various SRs and ARs raised by the business owners.

Responsibilities:

Maintaining the TIPPS front-end application, which was developed using Oracle Forms and Reports.

Generating daily, weekly, and monthly reports for end users using UNIX scripts.

Monitoring the daily, weekly and monthly jobs in Autosys tool.

Creating Cursors and Ref cursors as a part of the stored procedures to retrieve the selected data.

Extensively used partitioning of tables when working with tables having huge historical data.

Used Bulk Collections and Bulk Variables for better performance and easy retrieval of data by reducing context switching between SQL and PL/SQL engines.

Used Oracle external tables, SQL*Loader, and Import/Export to read flat files and apply business logic before saving these transactions.

Creating UNIX shell scripts to automate loading vendor data through SQL*Loader, and to connect to SQL*Plus and execute the packages and procedures created.

Worked with complex queries to retrieve information from the database based on user requirements (ARs).

Debugging and resolving production fixes (PFs).

Worked on resolving various analysis requirements (ARs), business requirements (BRs), and change requirements (CRs).

Environment: Oracle 10g, PL/SQL, TOAD, SQL*Loader, UNIX, Autosys, PuTTY, FTP, and the ITSME tool.

Education:

Bachelor of Technology (B. Tech) – Electronics - 2010, JNTUK, India.


