
Venkata Sathya Ramesh Kumar Maganti

Lead ETL Developer

Charlotte, NC

813-***-****

adx038@r.postjobfree.com

https://www.linkedin.com/in/ramesh-kumar-oracle/

PROFESSIONAL SUMMARY

15+ years of IT experience in development, enhancement, production support, data modeling, and PL/SQL programming, including business and system requirements gathering, for enterprise data warehouses in the healthcare, oil and gas, inventory, online transaction, finance, and entertainment domains.

8+ years of ETL data warehouse experience using Informatica PowerCenter 9.0x/10.x, ODI 12, Airflow, Control-M, shell scripting, and cron scheduling.

Strong experience in data profiling, data analysis, creating source-to-target mapping documents, and writing and reviewing BRDs and FRDs.

Experience in data modeling using dimensional data modeling, star/snowflake schema modeling, fact and dimension tables, and physical and logical data modeling with Erwin.

Expertise in designing and developing Informatica mappings with varied transformation logic using Filter, Expression, Rank, Router, Aggregator, Joiner, Lookup, and Update Strategy transformations.

Good working knowledge of Informatica workflows, session logs, parameter files, Workflow Monitor, and the workflow scheduler.

Tuned data warehouse performance by identifying bottlenecks, optimizing the data model and schema, partitioning data, processing in parallel, optimizing ETL code, using bulk loading, implementing caching and materialized views, and adding error handling and logging.
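
Purely as an illustration of two of the techniques above, range partitioning and direct-path bulk loading, a minimal Oracle SQL sketch with hypothetical table and column names:

    -- Hypothetical fact table partitioned by month so loads and queries
    -- touch only the relevant partition.
    CREATE TABLE sales_fact (
        sale_id   NUMBER,
        sale_date DATE,
        store_id  NUMBER,
        amount    NUMBER(12,2)
    )
    PARTITION BY RANGE (sale_date)
    INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))
    (PARTITION p_initial VALUES LESS THAN (DATE '2021-01-01'));

    -- Direct-path (bulk) load from a staging table via the APPEND hint.
    INSERT /*+ APPEND */ INTO sales_fact
    SELECT sale_id, sale_date, store_id, amount
    FROM   stg_sales;
    COMMIT;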

Implemented different types of SCDs using Informatica PowerCenter, wrote pre- and post-session commands, and used Source Qualifier overrides.

Experience with ODI Designer for designing interfaces, defining data stores, interfaces, and packages, using variables, and modifying the ODI Knowledge Modules (RKM, LKM, IKM, CKM, JKM) to create interfaces and mappings that cleanse, load, and transform data from source to target databases.

Experience in extraction, transformation, and loading of data from heterogeneous source systems such as flat files, Excel, XML, Oracle, and SQL Server.

Used the MDM suite of tools to design, develop, optimize, and support Master Data Management (MDM) for various domains.

Experience in ETL testing: data quality, source-to-target counts, performance, data transformation, data integration, production validation, and reports.
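
A simple example of the source-to-target count check used in this kind of testing; the staging and dimension table names are illustrative only:

    -- Compare row counts between the source staging table and the loaded target.
    SELECT 'STG_CUSTOMERS' AS object_name, COUNT(*) AS row_count FROM stg_customers
    UNION ALL
    SELECT 'DIM_CUSTOMER', COUNT(*) FROM dim_customer WHERE current_flag = 'Y';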

Developed, implemented, and maintained SSIS/DTS packages to extract data from SQL Server.

Excellent knowledge of analyzing the data warehouse by building SQL Server Analysis Services (SSAS) 2005/2008 models. Integrated data from multiple databases and APIs using Python scripts, ensuring data consistency and accuracy.

Created interactive and visually appealing reports and dashboards with Apache Airflow and Power BI Desktop. Used Apache Kafka 0.10 as the messaging system in ODI.

PL/SQL skill set includes proficiency in writing optimized PL/SQL code and creating and managing database objects such as tables, views, procedures, functions, and triggers.

Extensive experience in Oracle back-end (server-side) PL/SQL development, i.e., packages, stored procedures, functions, and triggers, using SQL Developer and Toad as development tools.

In-depth understanding of advanced PL/SQL programming constructs, including complex control-flow statements, advanced exception handling, and sophisticated cursor management, leveraged to write robust, maintainable, and high-performance code.

Strong experience with PL/SQL features such as collections, nested tables, ref cursors, materialized views, and dynamic SQL.
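
A minimal PL/SQL sketch that touches several of these features at once, a ref cursor opened over dynamic SQL and bulk-collected into a nested table; all object names are hypothetical:

    DECLARE
        TYPE t_emp_ids IS TABLE OF NUMBER;              -- nested table collection
        l_ids t_emp_ids;
        l_cur SYS_REFCURSOR;
        l_sql VARCHAR2(200) :=
            'SELECT employee_id FROM employees WHERE department_id = :dept';
    BEGIN
        OPEN l_cur FOR l_sql USING 10;                  -- dynamic SQL via ref cursor
        FETCH l_cur BULK COLLECT INTO l_ids;            -- bulk fetch into the collection
        CLOSE l_cur;
        DBMS_OUTPUT.PUT_LINE('Rows fetched: ' || l_ids.COUNT);
    END;
    /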

Extensively used analytic functions (ROW_NUMBER, RANK, DENSE_RANK, FIRST_VALUE, LAST_VALUE, LAG, LEAD, etc.) to tune SQL queries.
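
For example, a single ROW_NUMBER pass can replace a correlated subquery when picking the latest row per key, one common way these functions simplify and speed up a query; the orders table here is hypothetical:

    -- Latest order per customer without a self-join or correlated subquery.
    SELECT customer_id, order_id, order_date
    FROM (
        SELECT customer_id, order_id, order_date,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY order_date DESC) AS rn
        FROM   orders
    )
    WHERE rn = 1;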

Performance tuning through query optimization, indexing strategies, SQL performance profiling, system configuration and parameter tuning, and database partitioning, using tools such as EXPLAIN PLAN, SQL Trace, and SQL Tuning Advisor.
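
A typical EXPLAIN PLAN round trip looks like the following; the statement being tuned is only a placeholder:

    EXPLAIN PLAN FOR
    SELECT o.order_id, c.customer_name
    FROM   orders o JOIN customers c ON c.customer_id = o.customer_id
    WHERE  o.order_date >= DATE '2023-01-01';

    -- Display the optimizer's plan to check access paths, join order, and cost.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);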

Developed many packaged procedures with complex logic and validation to move data from staging to target tables. Generated and automated periodic reports using Oracle packaged procedures.

Well versed in Agile project management (Kanban) and JIRA.

Strong knowledge of and experience in Unix/Linux shell scripting and cron jobs.

Active contributor to the application lifecycle, including maintenance, release management, and critical bug fixing, ensuring error-free delivery of software solutions.

SKILLS

Database Systems: Oracle (19c/11g/10g/9i/8i), Postgres 9.x/10.x/11.x, SQL Server 2005/2014

Data Warehousing: Informatica PowerCenter 10.x/9.5.x, ODI 11/12, Power BI 2, Apache Kafka 0.10, Control-M 9, SSIS, SSAS

Languages: SQL, PL/SQL, Unix shell scripting, Python, Core Java

Development Tools: SQL*Plus, SQL Developer, TOAD 9.6/11/12/16, PL/SQL Developer, Oracle Forms & Reports, Python 1.4/1.5

Data Modeling Tools: ER/Studio 2016/9.7/9.5/8.5, Erwin r9.7/7.5.8/4.1.4, Visio 2016/2013

Web Technologies: REST APIs, JSON, XML, Microservices

Software Engineering: UML using Visio

Office Applications: MS Office Professional Plus 2016/2013/2010/2007 (Word, Excel, PowerPoint)

Configuration Management: Visual SourceSafe, Concurrent Versions System (CVS), GitHub, DevOps

EXPERIENCE

Baker & Taylor Inc

Charlotte, NC // Lead ETL/Informatica/PLSQL Developer // 02/2021 – present

Baker & Taylor is the premier worldwide distributor of books, digital content, and entertainment products. Baker & Taylor has been in business for over 190 years, offering an extensive selection of products and services for academic and public libraries. It is the only partner that can truly offer a one-stop shopping experience for all formats, including print, movies and music, and digital downloadable eBooks and audio content.

Responsibilities

Designing and developing data integration solutions using Informatica PowerCenter or other Informatica products. This involves extracting data from various sources, transforming and cleansing the data, and loading it into the target systems.

Building complex ETL (Extract, Transform, Load) mappings and workflows to meet business requirements. This includes understanding data models and creating mappings with Filter, Expression, Rank, Router, Aggregator, Joiner, Lookup, and Update Strategy transformations.

Ensuring data quality and integrity throughout the ETL process. This involves implementing data validation rules, error handling mechanisms, and data cleansing techniques to maintain data accuracy and consistency.

Optimizing the performance of ETL processes by identifying and resolving performance bottlenecks. This includes analyzing query execution plans, fine-tuning mappings and workflows, and implementing caching, partitioning, and parallel processing techniques.

Collaborating with database administrators and system administrators to manage Informatica environments. This includes installing and configuring Informatica software, monitoring system performance, managing metadata, and troubleshooting technical issues.

Evaluating the volume of data being processed and identifying opportunities to reduce it through filtering, aggregation, or data pruning techniques.

Optimizing the data extraction process by using incremental loading or change data capture mechanisms to extract only the changed or relevant data from the source systems.
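
As a simplified illustration of timestamp-based incremental extraction, with an assumed etl_control watermark table and column names:

    -- Pull only rows changed since the previous successful run.
    SELECT *
    FROM   src_orders
    WHERE  last_update_ts > (SELECT last_extract_ts
                             FROM   etl_control
                             WHERE  job_name = 'ORDERS_LOAD');

    -- After a successful load, advance the watermark.
    UPDATE etl_control
    SET    last_extract_ts = SYSTIMESTAMP
    WHERE  job_name = 'ORDERS_LOAD';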

Analyzing data sources and identifying performance issues or limitations that may impact ETL performance, for example slow database queries, network latency, or inefficient data access patterns.

Streamlining and optimizing the data transformation logic to minimize unnecessary computations and data manipulations.

Used efficient algorithms and techniques for data transformation operations such as joins, aggregations, sorting, and filtering.

Applied parallel processing and distributed computing techniques to distribute the processing load across multiple resources and improve performance.

Optimized the use of temporary storage, such as in-memory caching or disk-based buffers, to reduce I/O operations and improve data processing performance.

Implemented comprehensive performance monitoring and logging mechanisms to track and analyze the ETL process's performance metrics, such as data throughput, processing time, resource utilization, and bottlenecks.

Used tools that provide real-time monitoring, alerting, and performance analysis capabilities to identify performance issues and optimize the ETL processes accordingly.

Evaluated and optimized the database indexing strategy to ensure efficient data retrieval and query performance during the ETL process.

Identified and optimized the SQL queries or database operations that are frequently executed during the ETL process, such as lookups or data validations.

Utilized distributed or parallel processing frameworks and technologies to leverage multiple resources and scale the ETL process horizontally.

Working closely with business analysts and stakeholders to understand data integration requirements. This involves participating in meetings, conducting workshops, and translating business needs into technical specifications.

Developed mappings to load fact and dimension tables, including SCD Type 1 and SCD Type 2 dimensions and incremental loading, and unit tested the mappings.
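
The dimensions themselves were built as Informatica mappings; purely to illustrate the SCD Type 2 pattern those mappings implement, an equivalent SQL sketch with hypothetical staging and dimension tables would be:

    -- Expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer d
    SET    current_flag = 'N',
           end_date     = SYSDATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    -- Insert a new current version for changed or brand-new customers.
    INSERT INTO dim_customer (customer_id, address, start_date, end_date, current_flag)
    SELECT s.customer_id, s.address, SYSDATE, NULL, 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y'
                       AND    d.address = s.address);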

Creating technical documentation, such as design specifications, deployment guides, and user manuals. Additionally, generating reports on project progress, issue resolution, and system performance.

Collaborating with cross-functional teams, including data analysts, data architects, project managers, and business users. Effective communication skills are essential to understand requirements, provide status updates, and resolve issues.

Keeping up-to-date with Informatica and industry trends, exploring new tools and technologies, and identifying opportunities to enhance existing processes and systems.

Environment: Oracle 19c, Python 1.5/1.6, Informatica PowerCenter 9.0x/10.1x, Git, DevOps, Toad for Oracle, Jira, Windows XP, XML, Excel, Visio

Burlington Northern and Santa Fe Railway (BNSF)

Dallas, TX // Senior ETL/ODI/PLSQL Developer // 12/2018 – 01/2021

The BNSF Railway Company operates one of the largest railroad networks in North America. By integrating data modeling technology into its systems, BNSF dramatically increased efficiency, productivity, and savings. The goal of the project is to provide more accurate and timely data for better business decisions.

Responsibilities

Designing and implementing Kafka integration solutions within ODI. Configuring and managing Kafka connections, topics, and partitions within ODI.

Developing Kafka producers and consumers using ODI's built-in functionality or custom code.

Defining mappings and transformations to extract, transform, and load data between Kafka and target systems using ODI's graphical interface.

Implementing error handling and exception management for Kafka integration processes. Ensuring data consistency, reliability, and fault tolerance in Kafka-ODI integration solutions.

Collaborating with ODI developers and administrators to align Kafka integration with ODI workflows and architecture. Monitoring and optimizing data flows between Kafka and target systems, identifying and resolving performance bottlenecks.

Designing and developing ETL (Extract, Transform, Load) processes using ODI to integrate Kafka data with Oracle databases and other target systems.

Building and configuring ODI interfaces, packages, and scenarios to extract data from Kafka topics, transform it, and load it into target systems.

Writing and optimizing SQL queries, stored procedures, and scripts within ODI for data integration tasks.

Collaborating with business users and stakeholders to gather requirements and ensure the accuracy and completeness of data integration processes.

Monitoring and managing ODI repositories, agents, and sessions to ensure smooth execution and performance. Troubleshooting and resolving issues related to ODI workflows, data integration, and connectivity with Kafka. Ensuring data integrity, security, and compliance with Oracle and Kafka standards and best practices.

Performance tuning and optimizing ODI processes, including query optimization and indexing.

Environment: Oracle 12c, ODI 12, Apache Kafka 0.10, SQL Server 2000/2005/2008, Erwin 7.5.8, Windows 7, XML, Excel, Visio

Hospital Information System, Ministry of Health

Muscat, Sultanate of Oman // Senior ETL/PLSQL/SQL Server Developer // 03/2012 – 11/2018

HIMS (Al-Shifa) is a Health Information Management System implemented at the Ministry of Health hospitals, polyclinics, and health centers; currently, 180 institutions use the system. Al-Shifa 3+ offers compliance with HL7 and DICOM and with other international standards used in the US healthcare system, including ICD-9, ICD-10, and HCPCS Level 1 and Level 2. The system comprises 40 modules that meet the requirements of advanced hospital management. It provides a user-friendly environment, replacing pen and paper right from patient registration to the patient's discharge, and covers maintenance of stores (Medicine, Equipment & Surgical), E-Notification, and the Central Blood Bank. Al-Shifa 3+ is designed for each category of healthcare (primary, secondary, and tertiary), with configuration alone differentiating the same application for each category.

Responsibilities

Designing, developing, and implementing ETL (Extract, Transform, Load) processes to extract data from various sources, transform it based on business rules, and load it into the target data warehouse or system.

Writing scripts and queries using Shell Scripting and SQL to extract data from source systems. This may involve interacting with databases, files, APIs, or other data sources on IBM AIX or Windows Server environments.

Developing Shell Scripts and SQL queries to perform data transformations, data cleansing, and data enrichment as required. This includes applying business rules, data validation, and data mapping during the ETL process.

Utilizing ETL tools such as IBM InfoSphere DataStage or other similar tools to design and build ETL jobs and workflows. This involves configuring data flows, defining data mappings, and creating transformations using the ETL tool.

Writing Shell Scripts to automate ETL processes, including scheduling tasks using Cron jobs on IBM AIX or Windows Server. This involves creating scripts for data extraction, transformation, and loading, as well as job scheduling and monitoring.

Developing and optimizing SQL queries and PL/SQL scripts for data manipulation and data validation. This includes creating and modifying database objects, such as tables, views, and stored procedures, on Oracle 12c.

Configuring and managing ETL jobs and workflows using Control M or similar job scheduling tools. This includes creating job schedules, dependencies, and monitoring job execution for successful completion.

Analyzing and optimizing SQL queries, ETL processes, and database performance for improved efficiency. This involves identifying and resolving performance bottlenecks, optimizing data access methods, and tuning ETL workflows.

Configuring SSIS packages to connect and integrate data from diverse data sources such as databases, flat files, web services, and Excel spreadsheets. This includes defining connection managers, data source mappings, and handling data type conversions.

Built SQL Server Analysis Services (SSAS) solutions for online analytical processing (OLAP) and data mining; these multidimensional data models are used for efficient analysis and reporting.

Environment: Oracle 12c, Oracle PL/SQL, Informatica PowerCenter 9.0x, SQL Server 2014, Data Warehouse, Java web services

Inter-Tech LLC

Muscat, Sultanate of Oman // Oracle/PLSQL Developer // 04/2008 – 04/2012

Inventory Management System (In house Project)

Inter-Tech Oman is one of the leading companies supplying computer hardware in Oman. We designed a user-friendly, auto-computing system for handling stores and purchasing. The stores module was designed around receipt vouchers, indents, and issue vouchers, with all stock balance computation handled by PL/SQL procedures. The system also provided for issue returns and supplier returns.

Drilling Program Tool, PDO, Oman

The Drilling Program Tool (DPT) provides a simple workflow for generating drilling programs, including the well delivery process milestones and deliverables per DCAF. It automates the preparation of the Drilling Program documents used in the drilling process by integrating with the Well Planning Tool. The system reduces the time and cost required to prepare Drilling Program documents, implements a systematic approach in the business, and establishes a standard document format within the organization.

Responsibilities

Designing, developing, and maintaining Oracle databases using PL/SQL. This involves creating tables, views, indexes, and other database objects to store and manage hospital-related data.

Writing efficient and optimized PL/SQL code to implement business logic and data manipulation. This includes using control structures, loops, conditional statements, and exception handling to ensure data integrity and reliability.

Implementing and managing cursors to retrieve and manipulate data from the database. This involves using explicit and implicit cursors, cursor attributes, and cursor variables to fetch and process result sets.

Implementing error handling mechanisms in PL/SQL code to capture and handle exceptions. This includes using exception handlers, raising custom exceptions, and logging error information for troubleshooting and auditing purposes.
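
A compact sketch of this pattern, assuming a hypothetical error_log table and a sample key value:

    DECLARE
        e_no_rows   EXCEPTION;                           -- custom exception
        l_admission NUMBER := 1001;                      -- sample key, hypothetical
    BEGIN
        UPDATE admissions
        SET    status = 'CLOSED'
        WHERE  admission_id = l_admission;

        IF SQL%ROWCOUNT = 0 THEN
            RAISE e_no_rows;
        END IF;
        COMMIT;
    EXCEPTION
        WHEN e_no_rows THEN
            INSERT INTO error_log (logged_at, err_msg)
            VALUES (SYSDATE, 'No admission found for id ' || l_admission);
            COMMIT;
        WHEN OTHERS THEN
            INSERT INTO error_log (logged_at, err_msg)
            VALUES (SYSDATE, SQLERRM);
            COMMIT;
            RAISE;                                       -- re-raise after logging
    END;
    /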

Utilizing PL/SQL collections, such as arrays, nested tables, and associative arrays, to store and process multiple data values. This includes performing operations like bulk binds, nested table manipulations, and collection methods for efficient data handling.
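
For instance, a FORALL bulk bind driven by a collection avoids row-by-row context switching between PL/SQL and SQL; the tables here are hypothetical:

    DECLARE
        TYPE t_id_tab IS TABLE OF NUMBER INDEX BY PLS_INTEGER;  -- associative array
        l_ids t_id_tab;
    BEGIN
        SELECT patient_id BULK COLLECT INTO l_ids
        FROM   stg_patients;

        FORALL i IN 1 .. l_ids.COUNT                             -- one bulk DML call
            UPDATE patients
            SET    last_reviewed = SYSDATE
            WHERE  patient_id = l_ids(i);
        COMMIT;
    END;
    /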

Designing and creating views to provide customized and simplified access to the underlying data. This involves joining tables, selecting specific columns, and applying filters to create logical views of data for reporting and querying purposes.

Developing and maintaining database triggers to automate business rules and enforce data integrity. This includes using DML triggers, INSTEAD OF triggers, and database event triggers to perform actions before or after specific database operations.
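
A small example of the kind of DML trigger this covers, stamping hypothetical audit columns before each change:

    CREATE OR REPLACE TRIGGER trg_prescriptions_audit
    BEFORE INSERT OR UPDATE ON prescriptions
    FOR EACH ROW
    BEGIN
        :NEW.modified_by   := USER;     -- record the session user
        :NEW.modified_date := SYSDATE;  -- and the change timestamp
    END;
    /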

Creating and managing materialized views to improve query performance and data availability. This involves selecting appropriate columns, defining refresh schedules, and tuning materialized views for optimized data access.
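
An illustrative materialized view with a scheduled nightly refresh; all names are hypothetical:

    CREATE MATERIALIZED VIEW mv_daily_admissions
    BUILD IMMEDIATE
    REFRESH COMPLETE
    START WITH SYSDATE NEXT TRUNC(SYSDATE) + 1 + 2/24   -- refresh daily at 02:00
    AS
    SELECT TRUNC(admission_date) AS admission_day,
           department_id,
           COUNT(*)              AS admissions
    FROM   admissions
    GROUP  BY TRUNC(admission_date), department_id;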

Developing PL/SQL code to integrate with Oracle Forms 6i and Oracle Reports 6i. This includes writing triggers, PL/SQL libraries, and data manipulation code to provide seamless data entry, retrieval, and reporting capabilities within the hospital information system.

Analyzing and optimizing PL/SQL code, SQL queries, and database performance for efficient data processing. This involves identifying and resolving performance bottlenecks, optimizing SQL statements, and utilizing database performance tuning techniques.

Environment: Oracle 12c, Oracle PL/SQL, SQL Server 2014, Java web services

Key Projects Handled at Fore C Soft, Chennai as Oracle Developer

Material Management System, as Oracle Developer, June 2008 to March 2009

HIM System, B.R.S. Hospitals, Chennai, as Oracle Developer, December 2007 to May 2008

Field Quality System, ZENA Corporation, USA, as Oracle Developer, April 2007 to November 2007

Property Tax Module, NDMC, New Delhi, India, as Oracle Developer, November 2006 to March 2007

Roles and Responsibilities as Oracle Developer:

Development of applications using Forms 6i/9i and Reports 6i/9i on Oracle 11g

Development of unit and system test plans. Maintenance of the implemented system and attending to critical errors.

Data transformation, conversion and interface, data loading, database modeling and performance tuning.

Development of complex reports to meet user requirements using Oracle Reports 6i/9i.

Creating Tables, Views, Materialized Views

Creating Menu Items, Remote Refresh of Menu

Coding of procedures and packages in PL/SQL Developer

Integrating procedures with cursors. Created triggers as per client specifications

Created functions to implement business logic. Involved in integration of modules, unit testing, and system testing

Environment: Oracle 11g, PL/SQL, Forms 6i/9i, Reports 6i/9i, Windows XP, Windows 10, and MS Visio



