
Oracle Pl Management System

Location:
Emeryville, CA
Posted:
August 13, 2024


Ravi Kumar

**************@*****.***

510-***-****

PROFESSIONAL SUMMARY

About 7 years of IT experience with technologies including Oracle PL/SQL, SQL, OFSAA, AWS, Snowflake, and Talend, working with various multi-tier applications using the Oracle Relational Database Management System (RDBMS), SQL, and PL/SQL on platforms including Windows NT/2000/XP, UNIX, and Linux.

Around 3 years of experience with Snowflake.

Strong experience in migrating other databases to Snowflake.

Experience with Snowflake Multi-Cluster Warehouses.

Good understanding of Snowflake architecture.

Good knowledge of Snowflake warehouses, tables, schemas, and stages.

Experience working with Snowflake functions used in the COPY command and good knowledge of the file formats supported by the COPY command.

Created Snowpipe pipelines for continuous data loads.

Experience working with Snowflake Time Travel, Fail-safe, and Clone concepts.

Built the logical and physical data models for Snowflake as requirements changed.

Used COPY commands to load data into internal/external stages and into target Snowflake tables.

Experience loading JSON and Parquet data from S3 buckets into Snowflake tables.

In-depth knowledge of data warehousing and ETL Concepts.

2+ years of experience in Talend Open Studio (6.x/5.x) for Data Integration, Data Quality and Big Data.

1+ years of experience with Talend Admin Console (TAC).

Experience working with Data Warehousing Concepts like Ralph Kimball Methodology, Bill Inmon Methodology, OLAP, OLTP, Star Schema, Snowflake Schema, Fact Table, Dimension Table, Logical Data Modeling, Physical Modeling, Dimension Data Modeling.

Around 1 year of experience working as an AWS Cloud Engineer.

Worked as an OFSAA Consultant for 2 years, analyzing bank regulatory reports such as Liquidity Coverage Ratio, 2052A, Net Stable Funding Ratio, and Interest Rate Risk.

Experience in Batch Scheduling and Batch monitoring in OFSAA.

Maintaining integrity of data in the Database to avoid inconsistencies in 2052A, QRM and NSFR use case reporting.

Experienced in automating, configuring, and deploying instances on AWS; familiar with EC2, CloudWatch, CloudFormation, and managing security groups.

In depth knowledge of various AWS cloud services like IAM, storage, network, and compute.

Good experience in architecting and configuring secure cloud VPC using private and public networks through subnets in AWS.

Involved in designing and developing with Amazon EC2, Amazon S3, Amazon RDS, Elastic Load Balancing, Amazon SQS, SNS, and other services.

Experienced in creating and managing buckets on S3.

Good technical knowledge of AWS services including IAM, S3, VPC, SNS, SQS, and EC2.

Experience with Oracle SQL and PL/SQL programming, using Oracle utility tools such as Toad, SQL Developer, and SQL*Plus.

Expert in designing, developing, testing, and implementing PL/SQL triggers, cursors, stored procedures, packages, functions, collections at database level and reporting level.

Proficient in Dynamic SQL, SQL*Plus, UNIX shell scripting, Debuggers, Performance Tuning and Query Optimization.

Expertise in Tuning Queries for better performance with large volume of data.

Adept in key Oracle performance-related features such as Query Optimizer, Execution Plans, Explain plan, Hints, Indexes, Clusters, Partitioning.

Well-versed with different stages of Software Development Life Cycle (SDLC).

Worked with various user interface tools, database tools, scripting and notification tools, XML applications, etc.

Expert in working with various built-in packages such as DBMS_SQL, DBMS_SCHEDULER, DBMS_PROFILER, and UTL_FILE.

Extensively experienced in interacting with business analysts and developers to analyze the user requirements, functional specifications, and system specifications.

Performed various types of testing, such as functional, system, regression, and user acceptance testing.
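The stage/file-format/COPY loading pattern summarized above can be sketched roughly as follows. This is an illustrative sketch only: all object names (json_fmt, s3_int, my_s3_stage, raw_events) and the bucket path are hypothetical placeholders, not taken from any actual project, and the stage assumes a pre-existing storage integration.

```sql
-- Minimal sketch of loading JSON from S3 into a Snowflake table.
-- All names here are hypothetical.

-- File format describing the incoming JSON files.
CREATE OR REPLACE FILE FORMAT json_fmt
  TYPE = 'JSON'
  STRIP_OUTER_ARRAY = TRUE;

-- External stage over the S3 bucket (assumes a storage integration s3_int exists).
CREATE OR REPLACE STAGE my_s3_stage
  URL = 's3://example-bucket/landing/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = json_fmt;

-- Target table holding the raw semi-structured payload.
CREATE OR REPLACE TABLE raw_events (payload VARIANT);

-- Bulk load; ON_ERROR controls how problem files are handled.
COPY INTO raw_events
  FROM @my_s3_stage
  FILE_FORMAT = (FORMAT_NAME = 'json_fmt')
  ON_ERROR = 'CONTINUE';
```

The same COPY shape applies to Parquet by using TYPE = 'PARQUET' on the file format.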

EDUCATION

Master’s in Information Systems – Campbellsville University, KY.

Bachelor’s in Computer Science – Acharya Nagarjuna University, Andhra Pradesh, India.

Business/Technical Skills:

Operating systems: Windows 2000/NT/98/95, Linux, Unix

Query tools: Toad, SQL Developer, SQL*Loader, SQL*Plus

Languages: MySQL, SQL, PL/SQL, Python, UNIX shell scripting, SnowSQL

Databases: Oracle 12c/11g/10g/9i

Performance tuning: AUTOTRACE, Explain Plan, Execution Plan, DBMS_PROFILER, TKPROF, UTL_FILE

Protocols: TCP/IP, FTP, TELNET

Office software: MS Office suite

Oracle/GUI tools: OFSAA 8.1

ETL tools: Talend

Cloud services: AWS, Snowflake, SnowSQL, Snowpipe

Delta Airlines, Atlanta, GA April 2020 – Present

Role: ETL/Snowflake developer

Responsibilities:

Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON data into Snowflake tables.

Experience building Snowpipe pipelines for continuous data loads.

Experience in using Snowflake Clone and Time Travel.

In-depth knowledge of Snowflake Database, Schema and Table structures.

Experience in working with Streams and Tasks.

Experience building ETL pipelines to fetch data from various source systems and load it into Snowflake tables.

Defining virtual warehouse sizing for Snowflake for different types of workloads.

In-depth knowledge on different file formats and functions used in Copy command for data load.

Experience developing SQL queries using SnowSQL.

Hands-on experience with Snowflake utilities, SnowSQL, and Snowpipe.

Experience in working with Data pruning, Micro partitioning, Clustering.

Experience in migrating data from Netezza to Snowflake.

Implemented Change Data Capture using streams and tasks to load new records into the data warehouse.

Built the logical and physical data models for Snowflake as requirements changed.

In-depth knowledge of Cache concepts (Result Cache, warehouse cache) in Snowflake.

Used COPY to bulk load the data.

Created internal and external stages and transformed data during load.

Used Temporary and Transient tables on different datasets.

Used Informatica to load incremental data from source tables into the Snowflake warehouse.

Created Informatica mappings to load data into various fact and dimension tables.

Experience creating mappings, workflows, tasks, and lookup, expression, union, and router transformations in Informatica.

Experience with the Control-M scheduling tool to schedule Informatica jobs for data loads.

In-depth knowledge in data warehousing and ETL concepts.

Ability to develop scripts (UNIX, Python, etc.) to extract, load, and transform data.

Strong Linux experience with shell scripting and Python scripting.

Experience in writing python scripts for automating various use cases for QE.

Experience working in an agile environment.

Environment: Amazon Web Services (AWS), Snowflake, Snowpipe, SnowSQL, Oracle PL/SQL, Windows, UNIX shell scripting, XML, JSON, Python scripting.
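The stream-and-task Change Data Capture pattern described in this role can be sketched roughly as below. All object names (orders_stream, load_orders_task, etl_wh, and the raw/dw schemas) are hypothetical placeholders, not from the actual engagement.

```sql
-- Sketch of CDC with a Snowflake stream and a scheduled task.
-- All object names here are hypothetical.

-- Stream that records changes made to the source table.
CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders;

-- Task that runs every 5 minutes, but only when the stream has data.
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO dw.fact_orders (order_id, amount, load_ts)
  SELECT order_id, amount, CURRENT_TIMESTAMP()
  FROM orders_stream
  WHERE METADATA$ACTION = 'INSERT';

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK load_orders_task RESUME;
```

Consuming the stream in a DML statement advances its offset, so each change is processed once.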

Fast Model Sports, Chicago, IL March 2019 – April 2020

Role: Snowflake developer

Responsibilities:

Designed AWS CloudFormation templates to create EC2 instances, custom-sized VPCs, subnets, and NAT to ensure successful deployment of web applications and database templates.

Involved in maintaining performance of AWS EC2 instances.

Worked closely with build teams to create Lambda functions, SNS topics, and policies (S3 bucket policies).

Built and managed policies for S3 buckets and used S3 and Glacier for storage and backup on AWS.

Managed roles and permissions of users with the help of AWS IAM.

Experience with the version control tool Git, performing branching, merging, cloning, and pull requests.

Monitored various servers through SNS and CloudWatch services.

Managing AMI, Snapshots, EBS volumes.

Managed various Murex databases on Amazon RDS.

Worked with Nebula for registering various datasets in cloud.

Experience in working with various enterprise-wide cloud applications like FDRIFT, ECE, EFG, Datawise.

Created tables in the Snowflake database and moved files from various S3 buckets into Snowflake using COPY commands.

Experience in agile methodology and used JIRA tool. Participated in daily Agile Scrum Standup and Core Hours meetings.

Tools and Technologies used: Amazon Web Services (AWS), Windows, UNIX shell scripting, XML, JSON, Python Scripting.
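Moving files from S3 buckets into Snowflake with COPY, as described above, can be automated with Snowpipe. A rough sketch with hypothetical names (ingest_pipe, my_s3_stage, raw.landing_table), assuming an external stage over the bucket already exists:

```sql
-- Sketch of continuous ingestion with Snowpipe (hypothetical names).
-- AUTO_INGEST = TRUE relies on S3 event notifications to trigger loads
-- as new files land in the staged bucket.
CREATE OR REPLACE PIPE ingest_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.landing_table
  FROM @my_s3_stage
  FILE_FORMAT = (TYPE = 'JSON');
```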

Fitch Ratings, Wyoming, PA Jan 2018 – Feb 2019

Role: Senior developer

Responsibilities:

Developed a Single Page Web Application with Node.js, REST APIs, and MongoDB.

Developed a REST API in GoLang to pass data from the DB to another REST service. Involved in various Software Development Life Cycle (SDLC) phases of the project using Agile methodology.

Designed and coded application components in an Agile environment utilizing a test-driven development approach.

Used AWS CodePipeline to move code across different environments.

Involved in conversion of several paper documents to online documents.

Involved in putting the entries to external XML files which are read by doc-builder.

Worked on various technologies like HTML, JSP, and Servlets. Responsible for change requests and maintenance during development of the project.

Developed JUnit test cases for regression testing and integrated them with the Ant build.

Environment: Java 8, Agile, REST, SOAP UI, HTML, JSP, AWS, Cassandra DB, Servlets, TOAD, UNIX, Ant, JUnit, Oracle.

PIXELSUTRA, HYDERABAD, INDIA Feb 2014- Nov 2015

Role: Software Developer

Responsibilities:

Involved in SDLC requirements gathering, analysis, design, development, and testing of application developed using Agile methodology.

Used Waterfall Model for designing, implementing, and developing the Application.

Involved in developing web pages using HTML, CSS, JavaScript, JSP, and JSF.

Established Database Connectivity using JDBC, Hibernate O/R mapping with Spring ORM for Oracle.

Created use case diagrams, sequence diagrams, and preliminary class diagrams for the system using UML/Rational Rose. Designed and developed front view components using HTML, CSS, JavaScript and JSP.

Implemented Core Java concepts like Polymorphism, Inheritance, Multithreading etc.

Developed Java classes for implementing Business logics using EJB (Stateless session, entity, message driven beans).

Used SVN for version control and developed Scripts by using Ant.

Environment: JDK, Core Java (Multithreading, Collections), SVN, JavaScript, XML, HTML.


