
Data Developer

Location:
McKinney, TX
Posted:
February 08, 2020

Contact this candidate

Resume:

Venkata Pichela

214-***-****

adbofw@r.postjobfree.com

Data Engineer Data Analyst PLSQL/ETL Developer

CAREER SUMMARY

17 years of experience in Analysis, Design, Development, Support, Migration and Testing of applications across the Finance, Investment Banking, GRC, Healthcare, Insurance, Telecommunications, Banking, Manufacturing and Retail domains.

Worked with clients including Capital One, Fidelity, Cisco, Deloitte, CSC, Sony, General Dynamics, Nortel Networks, Verizon, Monsanto, Zurich Insurance & Kroger.

Experience in migrating application data to Amazon Simple Storage Service (AWS S3)

Experience in migrating data to the Amazon Redshift and Snowflake cloud databases

Expertise in Oracle 12c SQL & PL/SQL programming.

Expertise in T-SQL & PL/SQL programming

Experience in data extraction, transformation and loading (ETL) from sources such as Oracle, Teradata, SQL Server and flat files using SSIS Import/Export and SSIS packages.

Expertise in design & development of PL/SQL Procedures, Functions, Packages and Triggers

Expertise in Writing Oracle, SQL Server, Spark & Teradata Scripts

Experience in migrating scripts from SQL Server to PostgreSQL

Experience in Data modelling (Logical & Physical), Database Design & Development.

Good knowledge of the Big Data Hadoop framework & ecosystem tools Hive, Pig & Spark

Knowledge of Python & R programming

Extensive experience leading and coordinating onsite and offshore teams.

Experience in preparation of HLD (High Level Design) & LLD (Low Level Design) documents

Expertise in Preparation of Business, Functional & Technical Design Documents.

Experience in functional areas such as Bill of Materials, CRM (Customer Relationship Management), Payables, Receivables, Sales & Billing, Procure-to-Pay and Order-to-Cash management

Expertise in Oracle Forms & Reports (6i, 9i, 10g, 11g) web development (D2K)

Extensively worked on data extraction, Transformation and loading (ETL) data from various sources like Oracle, Teradata, SQL Server and Flat files.

Extensive experience in ETL design, development and maintenance using Oracle SQL, PL/SQL, SQL*Loader and Informatica PowerCenter 8.x/9.x.

Experience with Informatica client tools: PowerCenter Designer, Workflow Manager, Workflow Monitor and Repository Manager

Expertise in Implementing Row level data access using Oracle Label Security (Fine Grained Data Access Control)

Creating and managing schema objects such as Tables, Views, Synonyms, Indexes, Clusters, Tablespaces, User Roles/Privileges and Materialized Views

Extensively used PL/SQL concepts like Bulk collect, PL/SQL tables and Dynamic SQL, Autonomous Transactions, Cursors, REF Cursors & Analytical functions

SQL and PL/SQL Performance Tuning using Explain Plans, Auto trace, and TkProf.

Oracle utilities (Import, Export, Data Pump, SQL*Loader, UTL_FILE and External Tables)

Experience in UNIX Shell Scripting

Experience in Oracle E-Business Suite (ERP) & RICE components development

Good knowledge on using Oracle Business Intelligence (OBIEE) Reporting Tool

Experience with OLTP, OLAP and cloud database systems.

Good knowledge of Dimensional Data Modeling (Star Schema & Snowflake Schema)

Education: Bachelor of Engineering from Bangalore University

Technical Skills:

Databases

Oracle, SQL Server, Teradata, Hive, Redshift, Snowflake, PostgreSQL, Cassandra (NoSQL)

Big data Technologies

Spark SQL, Hive, Pig, Python

Tools and languages

SQL, PL/SQL, Core Java, E-Business Suite (11i, R12), UNIX Shell Programming, SQL*Loader, Erwin 7.1, PuTTY, PVCS, CVS, VSS, Remedy, GitHub, Control-M scheduling tool, Oracle Forms, Toad, SQL Developer

Data Visualization & Business Intelligence tools

Tableau, OBIEE, Oracle Reports Builder, Crystal Reports, SSRS, SSIS, Cognos, Informatica

Operating System

Unix, SUN Solaris, HP-UX 10.x/11, Windows

Web technologies

XML, JavaScript

Functional areas

Banking, Insurance, Healthcare, GRC, Financial Accounting, Retail, Manufacturing, Telecom, Finance, Bill of Materials, Payables, Receivables, Sales & Billing, Procure-to-Pay and Order-to-Cash management

Professional Experience:

Client: CBRE, Dallas, TX

Data Engineer Mar 2019 – Present

Refactoring Global Investments (GI) database into Enterprise data platform (EDP)

Creating source (Oracle DB) to target (Postgres DB) mapping documents.

Writing conversion scripts to migrate Oracle SQL scripts to Postgres scripts

Writing PL/SQL Stored procedures for change data capture for delta loads

Remediating PL/SQL stored procedures into PostgreSQL scripts.

Remediating PL/SQL stored procedures to fit the new dimensional model in PostgreSQL

Writing scripts for cell to cell validations of the data as part of data quality check

Database support for CKC (Client Knowledge Center) application

Environment: Oracle SQL, PL/SQL, PostgreSQL, AWS, UNIX
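The change-data-capture delta loads described above follow a common pattern: compare source rows against the target by primary key and classify each row as an insert or an update. A minimal Python sketch of that pattern (function, key and column names are illustrative, not from the project):

```python
def delta_changes(source_rows, target_rows, key="id"):
    """Classify source rows as inserts or updates relative to the target.

    source_rows / target_rows: lists of dicts sharing a primary-key column.
    Returns (inserts, updates): rows absent from the target, and rows
    whose columns differ from the target's copy.
    """
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)      # new row: not yet in the target
        elif existing != row:
            updates.append(row)      # changed row: delta to apply
    return inserts, updates
```

In a PL/SQL CDC procedure the same classification would typically be done with a MERGE statement or an outer join against the target table; this sketch only shows the row-level logic.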

Client: Verizon, Irving, TX

Big Data Engineer Oct 2018 – Jan 2019

Migrating Teradata MLoad jobs to the Hive data warehouse.

Writing conversion scripts to migrate Teradata BTEQ scripts to Hive scripts

Generating Hive DDL scripts for Teradata tables & Views.

Remediating SQL scripts into Hive database.

Running Spark Jobs for data validations from source to target.

Writing scripts for cell to cell validations of the data as part of data quality check

Environment: Teradata, Hive, Hadoop, UNIX

Client: Capital One, Plano, TX

Role: Data Analyst Feb 2018 – Sep 2018

Writing SQL scripts for business analysis and reporting.

Supporting Home Loans Servicing business for ad hoc requests

Development of scripts for Automation of QA testing controls.

Development of sub-servicing reports using PostgreSQL

Remediating SQL scripts and Teradata scripts into Snowflake database.

Writing Snowflake scripts by replacing Teradata scripts for enterprise reports.

Testing the migrated scripts with legacy scripts by cell to cell matching using Python scripts

Environment: SQL Server, Teradata, Snowflake, PostgreSQL, Python, Airflow
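The cell-to-cell matching mentioned above compares a legacy result set against a migrated one, column by column, keyed on a shared identifier. A minimal Python sketch of that check (names and data are hypothetical, not from the engagement):

```python
def cell_mismatches(legacy_rows, migrated_rows, key="id"):
    """Compare two result sets cell by cell, keyed on a shared column.

    Returns a list of (key value, column, legacy value, migrated value)
    for every differing cell, plus rows present on only one side
    (reported under the pseudo-column "<row>").
    """
    legacy = {r[key]: r for r in legacy_rows}
    migrated = {r[key]: r for r in migrated_rows}
    diffs = []
    for k in sorted(legacy.keys() | migrated.keys()):
        left, right = legacy.get(k), migrated.get(k)
        if left is None or right is None:
            diffs.append((k, "<row>", left, right))  # row on one side only
            continue
        for col in left:
            if left[col] != right.get(col):
                diffs.append((k, col, left[col], right.get(col)))
    return diffs
```

An empty result means the migrated scripts reproduce the legacy output exactly; any tuple returned pinpoints the table key and column that disagree.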

Client: Fidelity Investments, Merrimack, NH

Role: Senior Data Engineer Oct 2017 – Feb 2018

Designing Fact & Dimension tables for IGAR performance attribute reports

Creating relationships between Fact & dimension keys.

Created data models using Erwin for business processes

Writing PL/SQL packages and procedures to load operational data into ODS (Operational Data Store) stage tables

Created Informatica mappings & workflows for data migration from ODS stage to CDE (Central data enterprise) stage.

Created PL/SQL ETL Package interfaces to load data into Target Fact & Dimension tables.

Created Autosys Jobs to automate the loading process.

Environment:

Oracle 11g SQL, PL/SQL, TOAD, SQL Developer, Informatica 10.1, Tableau 10.3, IBM Netezza, UNIX, GitHub, AutoSys, Jenkins, PowerDesigner

Client: Capital One, Plano, TX

Role: Data Engineer May 2017 – Sep 2017

Collection of data dictionary information (databases, tables and columns) from existing Enterprise report SQL scripts to create a data model for each business process

Preparation of data lineage documents for derived fields in the reports

Creation of the Big Rationalized Script in Spark by transposing required business data from multiple tables used in a specific Line of Business in the Home Loans Origination system

Creation of Business Spark SQL Script for derived metrics based off Rationalized data elements in Amazon S3 Bucket

Testing Home equity Enterprise Tableau & Business Objects Reports running against Business Spark SQL Script

Environment:

AWS, Spark, SQL Server, Snowflake, PostgreSQL, Erwin data modeler, GitHub

Client: Capital One, Plano, TX

Role: Data Analyst Sep 2015 – Mar 2017

Understanding the Mortgage data warehouse (MDW) for Home Loans data analysis

Analyzing HMDA (Home Mortgage Disclosure Act) data for LAR (Loan Application Register) submission.

Writing Teradata & SQL scripts to develop Target Control Reports (TCRs) for the missing HMDA-reportable population

Writing SQL queries to provide data for business analysis and Testing

Writing SQL scripts to prepare data for testing of LAR data for Quality Analysis (QA) & Quality Control (QC) check.

Understanding the Oracle Risk Data warehouse (RDW) for Business risk data analysis

Developing scripts to identify potential exceptions for ICTT - CAN-SPAM Commercial E-mail Solicitation Opt-Out, FCRA Affiliate Sharing and FACTA Affiliate Marketing Opt-Out

Created UNIX shell scripts to invoke PL/SQL packages and stored procedures for scheduled batch jobs.

Writing SQL scripts for Automation of the Home Loans Mortgage Compliance Risk Controls to avoid manual testing.

Developing Business Compliance Reports for CPC (Credit policy controls) and CMP (Compliance Monitoring Program) controls using SQL scripts

Created SQL scripts to Automate Testing controls for Risk Management Team

Environment:

Teradata, Python, SQL Server, Empower, Tableau, Business Objects (BOBJ)

Client: ETC, Richardson, TX

Role: PL/SQL Lead Developer Feb 2015 – Sep 2015

Understanding the Current Legacy systems to Integrate into new System

Preparation of Data mapping documents from source systems to Target system

Defining translations and business rules that suit the new target system

Creation of schema objects like Tables, Public Synonyms and Sequences etc.

Writing PLSQL Packages to Extract, Transform and Load data into Target tables

Design and development of PL/SQL Interfaces to migrate the data in to New System.

Writing SQL scripts to test & validate the data after migration.

Performance tuning of the PL/SQL Procedures and SQL scripts.

Writing dynamic SQL scripts to enable & disable constraints & triggers

Compiling all database schema Packages, Procedures, Functions, Triggers, Sequences, Synonyms, Views after migration.

Environment:

Oracle Database 12c, PL/SQL, TOAD, Oracle Developer Suite 11g, Eclipse, Hibernate
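Enabling and disabling constraints dynamically, as in the migration above, typically means generating ALTER TABLE statements from the data dictionary and running each one via EXECUTE IMMEDIATE in PL/SQL. A small Python sketch of the statement generation (table and constraint names are invented for illustration):

```python
def constraint_toggle_sql(constraints, action):
    """Generate ALTER TABLE statements to enable or disable constraints.

    constraints: iterable of (table_name, constraint_name) pairs, e.g. as
    fetched from Oracle's USER_CONSTRAINTS view.
    action: "ENABLE" or "DISABLE".
    """
    action = action.upper()
    if action not in ("ENABLE", "DISABLE"):
        raise ValueError("action must be ENABLE or DISABLE")
    return [
        f"ALTER TABLE {table} {action} CONSTRAINT {name}"
        for table, name in constraints
    ]
```

Disabling foreign keys and triggers before a bulk load, then re-enabling them afterwards, avoids per-row validation overhead during the migration; the generated statements would be executed in a loop by the migration driver.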

Client: CISCO, Raleigh, NC

Role: Project Consultant Jul 2014 – Dec 2014

Involved in Preparation of ER Diagrams for different Business Flows

Design and Creation of the new database schema objects as part of Application Enhancement.

Involved in Performance Tuning of SQL Queries

Used SQL LOADER Utility to load data from Flat files into Staging tables

Writing PLSQL Procedures to Extract, Transform and Load data to Target tables

Writing SQL Scripts which can be called from Java as a web service.

Database migration across multiple environments (Development, Test, Stage & Production)

Validating the Database Objects and data after migration

Design and development of new database tables and integrate with existing tables to maintain the data integrity by applying Normalization concepts.

Environment:

Oracle Database 10g, PL/SQL, SQL*Loader, TOAD, SQL Developer, Eclipse, Hibernate, Erwin data modeler, GitHub, UNIX & Windows

Client: Kroger Pharmacy - Computer Sciences Corporation (CSC)

Role: Oracle PL/SQL Tech Lead Jul 2012 – Mar 2014

Understanding the Source & Target system with Business users and Functional Consultants

Preparation of Data Migration strategies

Involved in Business & Data flow diagrams

Data Modeling & Database Design of the Target system

Preparation of data mapping documents from the source system to the target system

Data modeling, Schema design & Development

Used the SQL*Loader utility to load legacy data from flat files

Writing PLSQL Interfaces & Conversions for Data migration

Used Oracle utilities IMPORT & EXPORT for schema migration from Oracle 9i to 10g

Involved in Performance Tuning of SQL Queries & PL/SQL programs

Writing SQL Scripts for Deployment

Used Dynamic SQL to Disable & Enable Constraints & database Triggers.

Shell scripting

Environment:

Oracle Database 10g/11g, PL/SQL, SQL*Loader, Java, Erwin data modeler, TOAD

Client: Zurich Farmers Insurance, Computer Sciences Corporation (CSC)

Role: Tech Lead & Developer Feb 2011 - May 2012

Process Keeper for Auto Insurance Functional Area

Handling Auto Insurance applications dealing with policy handling, claims and master data handling.

Created stored procedures, functions and packages as part of enhancement.

Debugging the PL/SQL Packages and Procedures

Preparation of RCA (Root Cause Analysis) Documents

Used Oracle utilities SQL*Loader and UTL_FILE to load data from external files

Involved in GUI Application Enhancements using Oracle forms developer.

Involved in Reports customizations and development using Oracle Reports tool.

Responsible for providing quality support within SLA compliance.

Preparation of Run book for the Business user

Environment:

Oracle SQL, PL/SQL, Java, Oracle Forms 10g, Reports 10g, PL/SQL Developer, TOAD

Client: Sony DADC, Computer Sciences Corporation (CSC)

Role: Oracle Tech Lead & Developer Mar 2010 - Dec 2010

Database Migration & Development

Oracle Web Forms Migration & Development

Writing PLSQL Interfaces (Stored Procedures, Functions & Packages)

Creation of Reusable GUI Templates using Oracle Forms Tool

Creation of new Reports using Oracle Reports Tool

Code Reviews

Writing Oracle Scripts for Deployment of the code

Preparation of test scripts.

Performance Tuning of SQL queries Used in Business Objects Reports

Environment:

Oracle Database 10g, Oracle Developer Suite (Forms 10g), TOAD, SQL*Loader, MS Visio, Oracle 10g Application Server, Mercury Quality Center, UNIX & Windows

Client: All Insurance, Computer Sciences Corporation (CSC)

Role: ETL Informatica Developer Jul 2009 - Feb 2010

Analyzed the business requirements and functional specifications.

Extracted data from Oracle databases and spreadsheets, staged it in a single place and applied business logic to load it into the central Oracle database.

Used Informatica PowerCenter 8.6 for extraction, transformation and loading (ETL) of data into the data warehouse.

Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression and Lookup.

Developed complex mappings in Informatica to load the data from various sources.

Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.

Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.

Used the PL/SQL procedures for Informatica mappings for truncating the data in target tables at run time.

Environment:

Oracle Database 10g, Informatica PowerCenter 8.6, TOAD, UNIX

Client: Monsanto, Computer Sciences Corporation (CSC)

Role: Senior Database Developer Nov 2007 - May 2009

Data Analysis of Legacy System for Customer Master and Accounts Receivables.

Preparation of Data Mapping documents from Legacy to SAP Applications

Working with SQL*Loader and PL/SQL packages for data extraction from the legacy system into the Oracle database.

Designing and Writing Oracle Interfaces for Data Extraction, Transformation & Loading (ETL Process).

Coordinating with SAP functional and business consultants.

Development of PL/SQL procedures and LSMW objects for uploading data into SAP R/3

Understanding the business in the Application Receivables, Payables and Material Management modules

Used Bulk Collect & Collections (PL/SQL tables) to improve Performance of interfaces

Working with the SAP R/3 system to upload data into SAP structures; SAP application testing

Environment:

SAP R/3, Oracle 10g, PL/SQL, TOAD, UNIX & Windows

Client: General Dynamics - Computer Sciences Corporation (CSC)

Role: Senior Application Developer Jan 2007 - Nov 2007

Designing Functional & Technical Specifications to implement Oracle Label Security (OLS)

Writing PL/SQL Procedures, Functions, Packages and Triggers to implement OLS

Customizing Oracle forms & Reports to maintain Oracle Label Security

Preparation of Test cases & Testing the OLS Procedures, Functions, Packages and Triggers

Writing SQL & PL/SQL Scripts for Implementation

Used Ref Cursors, Dynamic SQL (Execute Immediate)

Used Dynamic SQL to Enable & Disable database Integrity constraints & Triggers Dynamically

Involved in Deployment of Oracle Label Security.

Implemented the Row Level Security using OLS

Loading data from flat files into Oracle database using SQL * Loader

Involved in preparation of Unit Test cases & Testing

Environment:

Oracle 9i, SQL, PL/SQL, Oracle Forms & Reports, TOAD 6.5, SQL*Loader, SQL*Plus, UNIX Shell Scripts, UNIX, Windows

Client: Kroger, Computer Sciences Corporation (CSC)

Role: Senior Software Developer Aug 2006 - Jan 2007

Designed and developed PO requisition interface to load PO requisitions from the legacy System into Oracle EBS.

Designed and developed Payables Open Interface to load AP Invoices from the legacy System into Oracle EBS.

Designed and developed AR Interface to load AR Invoices from the legacy System into Oracle EBS.

Experience in implementing reporting and business intelligence projects, including design and implementation.

Identified and created aggregate tables to improve query performance.

Extensively utilized the Application Object Library to register SQL*Loader programs, PL/SQL packages and procedures, forms and reports.

Documented all the work using AIM Methodology. Created and updated MD070 Technical Specifications and Test Case documents for all the development work done.

Environment:

Oracle 10g, SQL, PL/SQL, Reports 6i and Oracle EBS Applications 11i (ERP), UNIX, Windows

Client: Valeant Pharmaceuticals International (Deloitte Consulting)

Role: Senior Systems Analyst Jul 2005 - Aug 2006

Understanding the SQL*LIMS Application.

Analyzing the Schema (Data Model) of the SQL*LIMS Application.

Writing PL/SQL scripts. (Stored procedures, Functions and Packages)

Preparation of Technical design documents for enhancements.

Customizing the existing Application Using Oracle Forms & Reports

Performance Tuning of the SQL queries used in Oracle Reports Builder

Code reviews for the team.

Preparation of Test plans and testing the Lab Samples.

Environment: Oracle 9i, PL/SQL, Oracle Forms & Reports (9i), SQL*Loader, SQL*LIMS

Client: Revenue Cycle Services (Deloitte Consulting)

Role: Senior Analyst Dec 2004 - May 2005

Writing PL/SQL procedures, functions and packages per client requirements.

Loading data using SQL*Loader from flat files & CSV files into the Oracle database

Running shell scripts to format the load data on a weekly and regular basis.

Running SQL scripts to load the data on a daily basis.

Writing PL/SQL scripts to automate the SQL scripts.

Loading the data by using Oracle utilities (Import, Export and SQL Loader).

Writing Shell Scripting

Environment: Oracle 9i, SQL, PL/SQL, Java, Toad, UNIX, Windows

Client: MCIS Zurich Insurance Malaysia

Role: Software Developer Jan 2003 – Sep 2004

Involved in developing the new Oracle Reports.

Customization of Existing Forms and Reports.

Developing PL/SQL procedures, functions, packages & database triggers.

Providing Technical support to the users of the CLAIMS module.

Preparing Technical Documentation (preparation of application flow, Data flow diagrams for Functions, Procedures, Packages and Database Triggers).

Environment: Oracle 9i, PL/SQL, Forms 6i and Reports 6i, TOAD, Visio, Windows XP

Client: Ministry of Education, Government of Malaysia

Role: Software Developer Sep 2002 – Dec 2002

Designing and developing new Form modules using Forms 6i & Reports 6i.

Migrating applications from Forms 4.5 to Forms 6i.

Migrating applications from ColdFusion 3.1 & 5.0 to ColdFusion MX 6.0

Customizing the existing forms & Reports

Database migration using IMPORT and EXPORT from Oracle 7.x to Oracle 9i

Preparation of Unit Test cases & Testing

Conducted UAT sessions with users.

Environment: Oracle 9i, PL/SQL, Forms 6i, Reports 6i, Crystal Reports 8.5, AIX UNIX, Windows


