Post Job Free


Data Engineer PL/SQL

Location:
McKinney, TX
Posted:
April 02, 2024

Contact this candidate

Resume:

KRISHNA NUVVALA

945-***-****; e-mail: ad4qt5@r.postjobfree.com

EDUCATION

Southern Arkansas University

M.S., Computer and Information Science; GPA: 3.33

TECHNICAL SKILLS

Web Technologies: HTML, CSS, JavaScript, Bootstrap

Programming: Python, R

Tools: TOAD

IDE: Eclipse

Databases: MySQL, Oracle SQL, and PL/SQL

SAP Platform: Native HANA, ABAP, BO, BW, Business Application Studio (BAS), BTP Cockpit, CAP DB modelling, Fiori, SDI, SAP Cloud Platform, SAP HANA XSA, and Cloud Foundry

ETL: DataStage

Cloud Technologies: Azure; AWS RDS, EC2, S3 buckets, EMR

Operating Systems: Linux (Red Hat, CentOS 7), Windows Systems

Big Data Ecosystem: Hadoop, HDFS, Hive, Sqoop, HBase, Oozie, Kafka, Spark, Airflow

Java Technologies: Basics of Core Java

PROFESSIONAL EXPERIENCE

Data Engineer - Intern Aug 2023 – Nov 2023

Responsibilities:

Mined and transformed complex healthcare data (RDBMS: Oracle, MySQL; SFTP) using Python and Shell Scripting, enabling data-driven decision making.

Worked on Sqoop jobs for ingesting data from MySQL into Amazon S3.

Created Hive external tables for querying the data.

Used Spark DataFrame APIs to ingest Oracle data into S3 and load it into Redshift; wrote a script to move RDBMS data into Redshift.

Developed automated data pipelines using Sqoop, Spark SQL, and AWS services (S3, EMR) to efficiently ingest, transform, and store data in desired formats (CSV, JSON).

Optimized Hive and Spark performance and identified errors using logs.

Scaled up EMR instances automatically based on data volume; applied transformation rules on top of DataFrames.

Ran and scheduled Spark scripts in EMR pipelines; processed Hive, CSV, JSON, and Oracle data together in a proof of concept (POC).

Validated and debugged scripts between source and destination, comparing source data against the final output.

Collaborated with the reporting team to create insightful dashboards for data visualization, supporting informed business strategies.

Demonstrated understanding of data quality principles through data validation and debugging processes.

Description:

Zocdoc is a healthcare-domain project: get the RDBMS data (Oracle/MySQL) and SFTP data and store it in AWS S3. Import and export data from Oracle and MySQL through Sqoop and land it in S3. Apply transformation rules on top of the different datasets and store the output in the desired format (CSV to JSON). Schedule the tasks in Oozie, scaling up automatically based on data volume. Finally, store the resulting data in Redshift and S3 in the desired format, and coordinate with the reporting team to create dashboards.
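The CSV-to-JSON conversion step in this pipeline can be sketched in plain Python (a minimal illustration only; the actual pipeline ran Spark transformations on EMR, and the column names below are hypothetical):

```python
import csv
import io
import json

def csv_to_json_records(csv_text):
    """Parse CSV text and return one JSON string per row."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [json.dumps(row, sort_keys=True) for row in reader]

# Hypothetical healthcare-style rows; the real column names are not given here.
sample = "patient_id,visit_date\n101,2023-09-01\n102,2023-09-02\n"
for record in csv_to_json_records(sample):
    print(record)
```

At cluster scale the same reshaping would be done with Spark DataFrame reads and writes (`spark.read.csv(...)` followed by `df.write.json(...)`) so it distributes across EMR nodes.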

Data Engineer - Intern Jan 2023 – July 2023

Description:

Snack Pack Delivery

The dataset used relates to car sales. Select the Name and Weight columns as the first two data columns, then order the dataset by Weight from highest to lowest and print only the first five rows. Accomplish this in Spark using three distinct methods; Google Colab or an HDP shell-in-a-box can be used for coding.

1) Worked with the RDD data structure.

2) Worked with the DataFrame API, which does not require any SQL queries.

3) Combined SQL queries with DataFrame data structures; read and uploaded the CSV and TXT files into a Spark Structured Streaming program in Google Colab.

4) Designed data structures and relationships specifically for cloud-based storage solutions.
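The select/sort/limit logic of this exercise can be sketched without a Spark cluster in plain Python (an illustration with made-up rows; the actual solutions used Spark RDDs, DataFrames, and Spark SQL as listed above):

```python
# Made-up car-sales rows standing in for the exercise's dataset.
cars = [
    {"Name": "sedan_a", "Weight": 1500},
    {"Name": "truck_b", "Weight": 2600},
    {"Name": "coupe_c", "Weight": 1300},
    {"Name": "suv_d", "Weight": 2100},
    {"Name": "van_e", "Weight": 1900},
    {"Name": "hatch_f", "Weight": 1100},
]

# Select Name and Weight, order by Weight (highest to lowest), keep five rows.
top_five = [
    (row["Name"], row["Weight"])
    for row in sorted(cars, key=lambda r: r["Weight"], reverse=True)[:5]
]
for name, weight in top_five:
    print(name, weight)
```

The DataFrame variant in PySpark would be roughly `df.select('Name', 'Weight').orderBy(df.Weight.desc()).show(5)`.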

Software Engineer – Accenture Jun 2021 – Aug 2022

Roles Performed:

Analyzed the data from different dimensions and brought it into the HANA database.

Hands-on experience in BAS.

Worked on cloud DB modelling.

Applied HANA modeling skills to design complex models (Calculated Views) in the model.cds file, optimizing data access and performance within the SAP HANA platform.

Worked on table functions.

Experienced in creating ABAP views.

Experienced in uploading CSV files in the BTP Cockpit and creating new tables.

Preparing Functional Specification, Unit Test Plan document and Operation guide.

Contributed to the development of the OSND Carrier Portal replacement utilizing SAP Cloud Platform (SCP App), leveraging an understanding of cloud-based solutions and application development.

Collaborated with the reporting team to create insightful dashboards for data visualization, supporting informed business strategies.

Involved in build and deployment and pushed the code to git.

Knowledge of SDI.

Description:

OSND Carrier Portal

It’s a carrier portal that allows carriers to submit OS&Ds (Overages, Shortages, and Damages) and OS&D claims, and to inquire about load status at DCs (5 days out).

The current portal is home-grown, built with 20-year-old technology, and unsupported – a major risk for the implementation of the S/4 project. If not replaced, it hinders the introduction of positive changes, e.g., the simplification of the shipment structure. The project intends to utilize an application based on the SAP Cloud Platform (SCP App) to replace the existing Carrier Portal.

A carrier can carry out the following functions:

Inquire about the status of loads in the upcoming days.

Submit Overage, Shortage and Damages. Upload supporting documents on the portal.

Submit Accessorial requests.

Create OS&D requests (by carriers, warehouse supervisors, and the transportation team) using an external-facing application.

View the report of all the OS&D requests raised in the last few days.

Software Engineer – Technosoft Oct 2019 – Mar 2021

Roles Performed:

Involved in requirement gathering, Analysis, Design, Development and UAT in Agile methodology.

Regular interaction with clients for requirement understanding, clarification, and UAT on offshore-driven projects.

Analyzed the data from different dimensions and brought it into the HANA database.

Created complex models (Attribute, Analytical, Calculated Views)

Preparing Functional Specification, Unit Test Plan document and Operation guide.

Followed Agile methodology

Broke down functional designs into highly detailed technical designs.

Involved in System Integration test with downstream systems.

Involved in designing, developing, and maintaining WebI reports.

Involved in various meetings with Business analysts and developers.

Description:

TMS - Transport Management System

It’s an easy-to-use database giving us flexibility for future reporting, consolidating information on where we source from and where we ship from with regard to weight, volume, values, carriers, and everything that matters to a logistics organization.

Below are the key areas in logistics:

To improve consolidation

To have better carrier selection

To have a central track & trace system

To have freight data up to the material/customer/order level

To have more accurate auditing of freight invoices

To have full freight spend visibility in SAP

Software Engineer – APEX_IT June 2017 – September 2019

Roles Performed:

Involved in requirement gathering, Analysis, Design, Development and UAT in Agile methodology.

Regular interaction with clients for requirement understanding, clarification, and UAT on offshore-driven projects.

Created Table Functions, Designing tables, Building HANA Views (Attribute, Analytical & Calc Views), testing with BI Reporting tools.

Understood complex Greenplum procedures and functions and realized them as HANA models and SQL scripts.

Analyzed the data from different dimensions and brought it into the HANA database.

Heavily involved in SQL scripting; provided test scripts to the team.

Preparing Functional Specification, Unit Test Plan document and Operation guide.

Prepared the test cases based on the detailed design specifications and executed the same.

Involved in System Integration test with downstream systems.

Involved in UNIX shell scripting.

Description:

EDW – HANA Migration

VMware currently uses a Greenplum-based EDW and reporting landscape covering different subject areas. Everything is now being brought onto the HANA platform, with reports built on top; the data needs to be validated in each and every business layer.

EDW – Information Models

Base tables from the existing Greenplum EDW will be realized as virtual data models (base information views/models) in HANA EDW incorporating the GP routine logic through HANA procedures and scripted calc views.


