
VINESH MANAGALAMPALLI

714-***-****

adkwya@r.postjobfree.com

SUMMARY

Over 8 years of experience in the development of client/server and multi-tiered applications using Python, PySpark, Spark, Hadoop, HDFS, Hive, AWS, Oracle Database, SQL, PL/SQL and T-SQL on platforms including Windows, UNIX and Linux.

IT experience in Data Warehousing, Data Analysis, ETL, BI and Business Analytics.

Strong experience in writing scripts with the Python, PySpark and Spark APIs to analyze data.

Extensively used Python libraries including PySpark, pytest, PyMongo, cx_Oracle, PyExcel, Boto3, psycopg2, embedPy, NumPy and BeautifulSoup.

Manipulated data from Hadoop, Spark and the NoSQL database MongoDB. Used Docker to move data into containers per business requirements.

Developed end-to-end Spark applications using PySpark to perform data cleansing, validation, transformation and summarization activities according to requirements.
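
For illustration, a minimal sketch of that kind of PySpark job; the table, columns and cleansing rules below are hypothetical stand-ins, not the actual application:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("cleanse-and-summarize")
             .enableHiveSupport()
             .getOrCreate())

    # Cleansing/validation: drop duplicate keys, reject rows missing the key,
    # and normalize the date column
    raw = spark.table("staging.claims_raw")
    clean = (raw.dropDuplicates(["claim_id"])
                .filter(F.col("claim_id").isNotNull())
                .withColumn("service_date",
                            F.to_date("service_date", "yyyy-MM-dd")))

    # Summarization: claim counts and paid totals per member per month
    summary = (clean.groupBy("member_id",
                             F.date_format("service_date", "yyyy-MM").alias("month"))
                    .agg(F.count("claim_id").alias("claim_count"),
                         F.sum("paid_amount").alias("total_paid")))

    summary.write.mode("overwrite").saveAsTable("analytics.claims_monthly")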

Experienced in developing data pipelines for Hadoop migrations from 2.x to 3.x environments using Python and UNIX.

Experienced in developing Python code to retrieve and manipulate data from AWS Redshift, Oracle 11g/12c, T-SQL, MongoDB, MS SQL Server, Excel and Flat files.

Experienced in configuring AWS environments to extract data from various sources and load it into the Redshift columnar database using distribution and sort keys.
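
A hedged sketch of such a load using psycopg2 (endpoint, credentials, table and S3 path are all placeholders); DISTKEY and SORTKEY are the Redshift features behind "distribution and sorting":

    import psycopg2

    conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                            port=5439, dbname="analytics",
                            user="etl_user", password="***")
    cur = conn.cursor()

    # DISTKEY co-locates rows joined on customer_id across slices;
    # SORTKEY speeds range scans on event_date
    cur.execute("""
        CREATE TABLE IF NOT EXISTS events (
            customer_id BIGINT,
            event_date  DATE,
            payload     VARCHAR(1024)
        )
        DISTKEY (customer_id)
        SORTKEY (event_date)
    """)

    # COPY is Redshift's bulk-ingestion path from S3
    cur.execute("""
        COPY events
        FROM 's3://example-bucket/events/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/ExampleCopyRole'
        FORMAT AS CSV
    """)
    conn.commit()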

Expertise in data extraction, cleansing, transformation, integration and analysis, and in logical/physical relational and dimensional database modeling and design.

Experience in designing and developing dashboards and reports in Tableau and Power BI by extracting data from different sources such as Oracle, flat files and Excel.

Experienced in building automated regression scripts in Python to validate ETL processes across multiple databases such as Oracle, SQL Server, Hive and MongoDB.
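
One common shape for such a validation script, sketched here with assumed connection details and table names, is a row-count reconciliation between source and target systems:

    import cx_Oracle
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    # Source-side count from Oracle
    ora = cx_Oracle.connect("etl_user", "***", "dbhost:1521/ORCL")
    cur = ora.cursor()
    cur.execute("SELECT COUNT(*) FROM claims")
    oracle_count = cur.fetchone()[0]

    # Target-side count from the Hive table the ETL populated
    hive_count = spark.sql(
        "SELECT COUNT(*) AS c FROM warehouse.claims").collect()[0]["c"]

    assert oracle_count == hive_count, (
        f"Row-count mismatch: Oracle={oracle_count}, Hive={hive_count}")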

Extensive experience in database analysis, development and maintenance of business applications using Oracle 12c/11g/10g/9i/8i and PL/SQL Developer.

Expertise in developing SQL and PL/SQL scripts in Oracle distributed environments for data analysis.

Experience in Database design using Normalization and E/R diagrams.

Extensively worked on PL/SQL Object Types, Dynamic SQL, Collections, Autonomous transaction, Compound triggers, Materialized Views and Table Partitioning.

Exposure to different types of testing, including automation, system and integration, functional, regression, smoke, database and performance testing.

Worked with data transformations and ETL processes.

Knowledge of Azure Data Factory.

Wrote simple and complex SQL queries using DML, DDL, table joins, group and grouping functions, analytical functions and the PARTITION BY clause for reports and application development. Extensively used Oracle tools such as TOAD and SQL Navigator.
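
As an example of the analytic-function style this refers to, a sketch (hypothetical table and columns) that picks each customer's latest order with ROW_NUMBER() over a PARTITION BY clause, run from Python via cx_Oracle:

    import cx_Oracle

    conn = cx_Oracle.connect("report_user", "***", "dbhost:1521/ORCL")
    cur = conn.cursor()

    # Latest order per customer via an analytic function
    cur.execute("""
        SELECT customer_id, order_id, order_date
        FROM (
            SELECT customer_id, order_id, order_date,
                   ROW_NUMBER() OVER (PARTITION BY customer_id
                                      ORDER BY order_date DESC) AS rn
            FROM orders
        )
        WHERE rn = 1
    """)
    for customer_id, order_id, order_date in cur:
        print(customer_id, order_id, order_date)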

Familiar with VBA coding; reverse-engineered Excel macros to prepare requirement documents. Experience in data extraction, migration, transformation and error handling.

Knowledge of data modeling for Data Warehouse/Data Mart development and of data analysis for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications. Experience in writing complex SQL queries and creating materialized views.

Excellent understanding of the software development lifecycle (SDLC).

Experience with Agile tools including Rally, Kanban and JIRA Agile.

Excellent communication and interpersonal skills, with the ability to manage responsibilities individually or as part of a team environment.

TECHNICAL SKILLS

Languages

Python, Spark, SQL, T-SQL, HiveQL, PL/SQL, C, C++

Databases

Oracle, SQL Server, AWS Redshift, MongoDB, Teradata, Vertica, Sybase, Microsoft Access, Hadoop-Spark, Hive, PostgreSQL.

DB Tools

TOAD, SQL Developer, DBeaver, SQL*Plus, PL/SQL Developer

Oracle Tools

Oracle Reports 10g/6i, Oracle Forms

ETL Tools

Informatica PowerCenter 8.6/9

Web Technologies

HTML5, JSP, XML, CSS, SERVLETS

Scripting Languages

UNIX shell Scripting, Perl

Test Automation Tools

AutoSys, WinRunner, QTP, TestDirector

Operating Systems

Windows, UNIX and LINUX

Reporting Tools

Tableau, MicroStrategy, Power BI, SCM, Git

WORK EXPERIENCE

Data Engineer/ Python Developer

Aetna, Hartford, CT April 2020 – Current

Designed and built ETL pipelines to automate ingestion of structured and unstructured data.

Worked with Python 2.x/3.x, SQL, Spark SQL and UNIX to build a performant, optimized framework for data pipelines supporting Hadoop migration applications running on UNIX, with data stored on Hadoop HDFS.

Understood existing system business logic and performed enhancement and impact analysis of the current applications against the migrated applications.

Used Python libraries such as PySpark, cx_Oracle and HiveContext depending on the module and business requirement.

Worked on bulk ingestion of data for inpatient, outpatient, dental and vision claims, together with ICD-9 and ICD-10 codes.

Converted, loaded, integrated and moved data from client applications to provider database applications and vice versa.

Cleansed and transformed data and created and managed primary database objects such as internal/external tables for member and claims data, enabling data science teams to build reports on current enrollment and predictions of future enrollment.

Involved in writing test scripts, unit testing, system testing and documentation.

Wrote complex queries in PySpark and Hive to validate loads across multiple systems.

Environment: Data Analytics, Python 2.7 & 3, Hadoop, HDFS, Hive, Spark, Linux, Oracle 12c, PyCharm, DBeaver, Jupyter Notebook, VS Code, Toad, SQL Developer.

Data Engineer/ Data Analyst / Python Developer

McKinsey, Dallas, TX Jul 2019 – Mar 2020

Worked with PySpark and Spark SQL to build a performant, optimized Spark framework for applications running on an AWS EMR cluster, with data stored in Hadoop HDFS and Hive tables.

Understood existing system business logic and performed enhancement and impact analysis of the applications.

Used Python libraries such as PySpark, pytest, cx_Oracle and PyMongo depending on the module and business requirement.

Developed Spark applications using RDD transformations, Spark Core, Spark Streaming and Spark SQL.

Used Go scripts to upload files to S3, and built a Go serverless application and deployed it to AWS Lambda.
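
The resume names Go for this step; for consistency with the other sketches here, an equivalent upload in Python with Boto3 would look like the following (bucket and key names are placeholders):

    import boto3

    s3 = boto3.client("s3")
    # Upload a local build artifact to S3 ahead of a Lambda deployment
    s3.upload_file("build/lambda_bundle.zip",
                   "example-deploy-bucket",
                   "artifacts/lambda_bundle.zip")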

Developed end-to-end Spark applications using PySpark to perform data cleansing, transformation and summarization activities according to business requirements.

Reviewed and altered Python scripts for data collection and analytical report development. Created data frames for data-type testing and performed normalization analysis on data supporting the digital data science team.

Wrote pre-processing queries in PySpark for internal Spark jobs and validated them using HQL.

Loaded data from many different sources into the local analytics area and data lakes, creating CI/CD pipelines for the environment.

Created reports for the BI team using Imply Druid, exporting data from HDFS and Hive.

Enhanced existing automated regression scripts in Python for validating ETL processes across multiple databases such as Oracle, SQL Server, Spark and MongoDB.

Wrote automated regression scripts and unit tests for process validation using pytest and unittest.
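
A minimal pytest-shaped sketch of that kind of check; the two count functions are hypothetical stand-ins for queries against the source and target systems:

    import pytest

    # Hypothetical stand-ins; a real suite would query Oracle and Spark here
    def source_row_count(table):
        return {"claims": 100, "members": 50}[table]

    def target_row_count(table):
        return {"claims": 100, "members": 50}[table]

    @pytest.mark.parametrize("table", ["claims", "members"])
    def test_load_is_complete(table):
        # A load passes only if every source row landed in the target
        assert source_row_count(table) == target_row_count(table)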

Responsible for checking for data discrepancies as data flowed in from different source systems.

Environment: Data Analytics, Python 3.6, AWS, Hadoop, HDFS, Hive, Imply, Spark, Linux, Oracle 12c, Power BI, PyCharm, DBeaver, Jupyter Notebook, VS Code, Toad, SQL Developer.

Data Engineer/ Data Analyst / Python Developer

AT&T, El Segundo, Los Angeles, CA Jan 2018 – Jun 2019

Responsibilities included analyzing, troubleshooting, resolving and documenting reports. Understood existing system business logic and performed enhancement and impact analysis of the applications.

Documented all requirements received from the business team and optimized them for effective solutions.

Coordinated with business analysts, business clients and various functional teams across the application to gather requirements and provide the best solution approach.

Developed ETL programs in Python to move data from source systems to the analytics area.

Used Python to retrieve and manipulate data from AWS Redshift, Oracle 11g/12c, MongoDB, T-SQL, MS SQL Server, Excel and flat files.

Demonstrated the ability to move data between production systems and across multiple platforms.

Once the data was loaded into the analytics area, identified the business requirements for transforming it for analytics purposes.

Developed automated regression scripts in Python to validate ETL processes across multiple databases including AWS Redshift, Oracle, MongoDB, T-SQL and SQL Server.

Used Python libraries such as pytest, PyMongo, cx_Oracle, PyExcel, Boto3, psycopg2, SOAP, embedPy, NumPy and BeautifulSoup depending on the module and business requirement.

Configured the AWS environment to extract data from various sources and load it into Redshift using distribution and sort keys.

Created complex data models and process flow diagrams. Created and maintained high-level design (HLD), detailed design (DD), unit test plan (UTP) and business-process documentation based on requirements gathered from business analysts and business users, using industry-standard methodology.

Developed and enhanced Oracle PL/SQL programs: created tables, views, sequences, database triggers, cursors, stored procedures and functions; handled exceptions and indexing; and optimized and tuned procedures and SQL queries to improve performance.

Developed Python programs and Excel functions using VBScript to move and transform data.

Developed and enhanced UNIX shell scripts and troubleshot production issues.

Developed data analysis tools using SQL and Python code.

Designed and maintained databases using Python; developed a Python-based RESTful web service API using Flask, SQLAlchemy and PostgreSQL; and implemented the client-side interface using React.
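
A minimal sketch of a Flask endpoint backed by SQLAlchemy and PostgreSQL (the route, table and connection string are assumptions, not the production service):

    from flask import Flask, jsonify
    from sqlalchemy import create_engine, text

    app = Flask(__name__)
    # Placeholder PostgreSQL connection string
    engine = create_engine("postgresql://api_user:***@localhost/appdb")

    @app.route("/members/<int:member_id>")
    def get_member(member_id):
        with engine.connect() as conn:
            row = conn.execute(
                text("SELECT id, name FROM members WHERE id = :id"),
                {"id": member_id},
            ).fetchone()
        if row is None:
            return jsonify(error="not found"), 404
        return jsonify(id=row.id, name=row.name)

    if __name__ == "__main__":
        app.run()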

Worked closely with upper management and onshore and offshore consultants across various teams on development, maintenance, QA and testing, and production support of the compensation system.

Designed, developed and enhanced various types of reports.

Worked on converting and producing reports using Python (PyExcel module).

Expertise in data quality, data organization, metadata and data profiling.

Environment: Python 3.2, AWS Redshift, Oracle 11g/12c, MS SQL Server 2008/2012, Teradata, Vertica, PL/SQL, Linux, Microsoft Access, Power BI, VBA, PyCharm, Toad, SQL Developer, DBeaver, Jupyter Notebook.

Data Engineer/ Python Developer/ PL/SQL Developer

Blue Cross Blue Shield, Detroit, MI Apr 2017- Dec 2017

Analyzed the requirements and designed the program flow.

Participated in the full life cycle of this project including information gathering, analysis, design, development, testing and support of this module.

Responsible for identifying the regression test cases.

Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity.

Designed and created database objects and developed ETL programs in Python for data extraction and transformation from multiple sources.

Created and edited procedures and functions to meet evolving business requirements.

Created complex stored procedures, SQL joins and other statements to maintain referential integrity and implement complex business logic.

Modified tables, synonyms, sequences, views, stored procedures and triggers.

Analyzed and defined critical test cases from a regression standpoint to be added to the master regression suite.

Prepared detailed design documentation including ETL data mapping documents and report specifications.

Worked in Agile Methodology (Scrum) to meet timelines with quality deliverables.

Reviewed the database program to produce UML models.

Developed a tool using Excel and VBA to connect to the database and retrieve a summary of instrument data for any given day. Implemented database triggers based on business rules and requirements.

Extensively worked in Oracle SQL, PL/SQL and SQL*Plus; tuned query performance; created DDL scripts; and created database objects such as tables, views, indexes, synonyms and sequences. Migrated existing data from MS Access to Oracle.

Performed regression testing for Golden Test Cases from the State (end-to-end test cases) and automated the process using Python scripts.

Developed parallel reports using SQL and Python to validate the daily, monthly and quarterly reports.
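
One plausible shape for that validation (query, table and connection details are placeholders): compute the report aggregate in SQL, recompute it independently in Python over the detail rows, and compare:

    import datetime
    import cx_Oracle

    conn = cx_Oracle.connect("report_user", "***", "dbhost:1521/ORCL")
    cur = conn.cursor()
    day = datetime.date(2017, 6, 1)

    # Report-side aggregate computed in SQL
    cur.execute("SELECT SUM(amount) FROM daily_payments WHERE pay_date = :d",
                {"d": day})
    sql_total = cur.fetchone()[0]

    # Independent recomputation in Python over the detail rows
    cur.execute("SELECT amount FROM daily_payments WHERE pay_date = :d",
                {"d": day})
    py_total = sum(amount for (amount,) in cur)

    assert sql_total == py_total, f"Report mismatch: {sql_total} vs {py_total}"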

Involved in writing JSP scripts to build the front-end application.

Performed unit testing on PL/SQL packages, procedures and functions according to business requirements.

Documented every phase, including technical specifications, source-to-target mappings, data notes and release notes. Assisted in testing and deployment of the application.

Responsible for importing and exporting data from various data sources such as SQL Server databases, flat files, MS Access, MS Excel and other OLE DB providers via the Import and Export Wizard. Created various visualizations and reports using VBA.

Independently handled deployment and support activities on the Finance and SCM modules.

Assisted with Functional Requirement Specifications and use case diagrams to streamline the business flow. Used SQL Developer to load and extract data into and from Excel files.

Environment: Oracle 10g/11g, PL/SQL, Toad, Python, PyCharm, VBA, HL7, Microsoft Access, MS SQL Server 2005, HTML5, MS Visio 2005, Visual Studio.

Oracle Developer

Promac IT Solutions Ltd, Manchester, UK Mar 2012 – Dec 2014

Client – Lloyds Bank

Banking Application Development

Completed a full study of the system and prepared logical and physical database designs.

Implemented triggers to enhance functionality, supplement validation and control navigation.

Worked extensively as a PL/SQL programmer and developed PL/SQL procedures that handle key business logic. Created front-end screens for opening new registrations using HTML5 and CSS.

Designed and created the database using SQL and PL/SQL; wrote backend stored procedures, functions and triggers that let users analyze daily, weekly and monthly connection requests, compare issued connections against projections, and track customer details monthly. Wrote SQL*Loader scripts to load data from flat files into the database.
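
For the SQL*Loader step, a hedged sketch of driving sqlldr from Python (control-file contents, table and credentials are placeholders):

    import subprocess

    # Minimal control file: append CSV rows into the CONNECTIONS table
    control = (
        "LOAD DATA\n"
        "INFILE 'connections.csv'\n"
        "APPEND INTO TABLE connections\n"
        "FIELDS TERMINATED BY ','\n"
        "(customer_id, request_date DATE 'YYYY-MM-DD', status)\n"
    )
    with open("connections.ctl", "w") as f:
        f.write(control)

    # Invoke sqlldr; the userid/connect string is a placeholder
    subprocess.run(["sqlldr", "userid=etl_user/***@ORCL",
                    "control=connections.ctl", "log=connections.log"],
                   check=True)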

Migrated databases from SQL Server 2008 R2 to Microsoft Access using SSIS.

Worked on the design and generation of different database objects such as tables, views, materialized views, synonyms, sequences, stored procedures, functions, PL/SQL packages and database triggers. Prepared test cases and data, and executed and verified test results.

Created PL/SQL program units containing DML and DDL statements using dynamic SQL.

Created database objects like Tables, Indexes, Sequences, Constraints, and Views.

Developed comprehensive test plans, scripts and cases and performed unit and comprehensive testing with system developers to ensure the overall functionality and technical quality of deliverables. Handled various exceptions to troubleshoot PL/SQL code.

Environment: Oracle 9i, PL/SQL, Unix, HTML, CSS, SQL*Loader, Toad, SQL Developer, SQL, Microsoft Access, Java.

EDUCATION

Master’s in Computer Information Systems, USA, 2016

Master’s in Managing Information Technology, University of Salford, UK, 2012

Bachelor’s in Computer Science, JNTU, Hyderabad, 2008


