
Head of Data Engineering and Analytics

Location:
Sunnyvale, CA
Posted:
September 13, 2024


Resume:

Basavanna Olekar

Cell: 650-***-**** Email: *********.******@*****.*** Address: Sunnyvale, CA 94089

A Database Consultant with strong knowledge of Oracle 19c, 12c, 11g, 10g, 9i, 8i, 8, and 7, as well as Hive (HQL), Spark, and Presto, covering designing, developing/coding, implementing, migrating/loading/converting data, optimizing, tuning, and maintaining databases up to petabytes in size.

Industry Experience (Domain) :

●Pharmaceutical Domain (OLTP, OLAP): Pfizer: NY

●Financial Domain (OLTP, OLAP, Analytics): Emerson: St. Louis, MO; Credit-Suisse: NY; Amazon: WA; Facebook, Inc.: Menlo Park, CA

●Banking Domain (OLTP): Citigroup: NY

●Supply Chain Management (OLAP): Nike, Inc.: Beaverton, OR; Emerson: St. Louis, MO; Puget Sound Energy (PSE): Bellevue, WA; Snap-On Business Solutions: Richfield, OH; Stage Stores: Houston, TX; Amazon: Seattle, WA; GAP: San Francisco, CA

●Electric and Gas Domain (OLTP, OLAP): Puget Sound Energy (PSE): Bellevue, WA

●Automobile Domain (OLTP, OLAP): Snap-On Business Solutions: Richfield, OH

●Retail Domain (OLTP, OLAP): Stage Stores: Houston, TX; Aeries: Santa Clara, CA; HP: Palo Alto, CA

●Tax Domain (OLAP): Amazon: Seattle, WA

●Data Migration: Levis: San Francisco, CA; Pfizer: NY; Puget Sound Energy (PSE): Bellevue, WA; Amazon: Seattle, WA; Nike, Inc.: Beaverton, OR

●HealthCare: Delta Dental: San Francisco, CA; WageWorks: San Mateo, CA; Anthem: Atlanta, GA

●Policy Domain: AirBnB: San Francisco, CA

Training/Presentation Experience :

Conducted corporate training classes, training thousands of employees at the clients/companies mentioned below.

●Clients/Companies: Infosys, Wipro, Zenieth, I-Gate, Robert Bosch, Mico Bosch, Itamic, and SLK, to name a few, on Oracle (SQL, PL-SQL, DBA, Best Practices, and Performance Tuning).

EDUCATION:

●MIT: Artificial Intelligence: Implications for Business Strategy program

●M.Tech in Information Technology, Allahabad University, India

●MBA in Systems, Madras University, India

CERTIFICATION:

Oracle 8i/9i Certified Associate (OCA) for Database Development and Administration track from Oracle Corporation.

MY PRODUCTS/ACHIEVEMENTS:

Built an intranet-based application for lawyers to speed up their court paperwork and track their productivity. Introducing this application increased performance/productivity by over 600%. Hundreds of lawyers in South India used this application from 2002 to 2006.

In 2004, built an intranet-based application for Sagas, used by most executives working for State Bank of India and State Bank of Mysore. The product sped up their paperwork and helped prevent misplacement or forgery of client documents, improving accuracy, reusability, productivity, and security.

SUMMARY

Over 20 years of results-driven experience in database technology, administration, programming, analytics, and training.

Have served as Database Architect (Jr.), Database Specialist, Solution Architect, Database Lead, Oracle DBA, Web/PL-SQL Developer, Corporate Trainer, and Sr. Database Consultant.

Expert in developing, optimizing, and tuning Oracle SQL statements and PL-SQL procedures, functions, triggers, packages, pipelined table functions, bulk collect/load statements, dynamic SQL, simple/ref/traditional cursors, exceptions, the PL-SQL optimizer, conditional compilation, compile-time warnings, error/inquiry directives, and error tables/exception packages, as well as Oracle instances, databases, and servers.
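As a minimal sketch of the bulk-processing style referred to above (table and column names are hypothetical, not from any specific engagement):

```sql
-- Hypothetical example: chunked load with BULK COLLECT / FORALL,
-- capturing per-row failures instead of aborting the whole load.
DECLARE
  CURSOR c_src IS SELECT * FROM stg_orders;          -- assumed staging table
  TYPE t_rows IS TABLE OF stg_orders%ROWTYPE;
  l_rows t_rows;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000; -- fetch in chunks of 1000

    FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS    -- one round trip per chunk
      INSERT INTO orders VALUES l_rows(i);

    EXIT WHEN c_src%NOTFOUND;
  END LOOP;
  CLOSE c_src;
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    -- SAVE EXCEPTIONS raises ORA-24381; failed rows are in SQL%BULK_EXCEPTIONS
    FOR i IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
      DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(i).ERROR_INDEX ||
                           ' failed with ORA-' || SQL%BULK_EXCEPTIONS(i).ERROR_CODE);
    END LOOP;
END;
/
```

The `LIMIT` clause bounds PGA memory per fetch, while `SAVE EXCEPTIONS` lets the batch continue past individual bad rows.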

Hands on experience in designing & developing highly active transaction applications

Expert in designing OLAP and OLTP Data Models.

Strong in developing ETL tools to migrate/feed data to downstream applications using SQL, PL-SQL, Perl, and shell scripts, and scheduling them as part of automation.

Developed strategies for data acquisitions, archive recovery, and implementation of a database.

Expert in writing PL-SQL procedures for Data Conversions, Data migrations, Batch Jobs/Processing and Applications business processing.

Worked on databases over a petabyte (PB) in size.

Hands-on Oracle database administration, including client and server installations, backup, recovery, and performance tuning.

Good knowledge of the SQL and PL-SQL optimizers, NESTED LOOPS, HASH JOIN, MERGE, SORT AGGREGATE, Explain Plan/Statspack, analytical functions, correlated sub-queries, hierarchical queries, partitioned tables/queries, joins, hints, set operators like MINUS, INTERSECT, and UNION, error tables, etc.
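A small sketch combining two of the features above, a hierarchical query and an analytic function, against a hypothetical employee table:

```sql
-- Hypothetical EMP(empno, ename, mgr, deptno, sal) table.
-- CONNECT BY walks the manager tree; RANK() numbers salaries per department.
SELECT LPAD(' ', 2 * (LEVEL - 1)) || ename                  AS org_chart,
       RANK() OVER (PARTITION BY deptno ORDER BY sal DESC)  AS dept_sal_rank
FROM   emp
START WITH mgr IS NULL              -- root of the hierarchy (the top manager)
CONNECT BY PRIOR empno = mgr;       -- child rows reference the parent's empno
```

`LEVEL` drives the indentation of the org chart, and the analytic `RANK()` avoids a self-join that a correlated sub-query would otherwise need.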

Hands-on experience understanding business challenges and translating them into process/technical solutions.

Expert in determining requirements and functional specifications required of the data model and associated architecture.

Strong in Oracle security i.e., Connection & Access Privileges/Roles, Auditing, Encryption & Decryption, database, schema and object level Triggers and many more.

Strong business analysis, software development and design, and a solid understanding of data warehouse and enterprise application architecture.

Proven ability to design, develop, and maintain databases, data warehouses, and extraction queries (relational/multidimensional/non-relational databases).

Strong experience with Data Analysis, Data Cleansing, Data Migration, Multidimensional Databases, Object Databases, etc.

Expert in BI solutions. Some of the tools used are Oracle Reports, SSRS, NaviCat, QlikView, Spotfire, Tableau, OBIEE, Cognos, and Business Objects.

Strong in maintaining Standards and Policies.

Expert in user administration, space management, redo-log management, and object management.

Expert in developing MySQL stored procedures/functions/triggers and managing MySQL instances/databases.

Strong in MS SQL Server 2000/2005: SQL, T-SQL, user/object management, DTS, SSRS (MS Reporting Services), and creating databases.

Strong with data migration, Oracle networking, user-managed hot and cold backups, logical backups, RMAN, and troubleshooting. As part of maintenance, observing physical and logical memory, processes, transactions, locks, latches, trace files, and alert log files, to name a few.

Excellent written, oral and interpersonal communication skills. Strengths include Leadership, problem solving, team-oriented and ability to work on multiple projects concurrently.

As a Team Lead, managed onsite and offshore teams: assigning work; reviewing work estimates and source code against CMM standards, functionality, and test cases; risk management; approving leave and timesheets; providing regular feedback to my seniors on team members' performance and promotions; motivating team members; and conducting regular seminars to train new hires on the technology and domain.

As part of training, trained clients on proper system and database functionality.

Hands-on experience in troubleshooting and investigating poor system, application, and database performance.

Worked on Physical & Logical design and Custom data loading procedures.

Experience defining rollback plans that bring the database back to its previous stable state as part of different recovery plans.

Audited database security configurations and the operating system security environment.

Created WinForms applications (.NET, C#).

Created test cases using company templates as well as Excel, QC, and JIRA; raised defects/bugs.

Created Database/System management and security programs and procedures.

Created Database/Unix users and associated them with a group and gave required permissions/privileges

Used popular UNIX tools such as awk, sed, top, iostat, prstat, mpstat, sar, dos2unix, nmon, and topas.

Automation: created and called/scheduled shell scripts from RC scripts, cron, Control-M, Autosys, and DBMS_SCHEDULER.
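As a hedged sketch of Oracle-side scheduling (job name, schedule, and called procedure are hypothetical), a DBMS_SCHEDULER job is created like this:

```sql
-- Hypothetical nightly job: schedule a PL/SQL cleanup block with DBMS_SCHEDULER.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'NIGHTLY_CLEANUP',               -- assumed job name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN purge_old_rows; END;',    -- assumed procedure
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=2',          -- run at 2 AM every day
    enabled         => TRUE);
END;
/
```

Unlike cron, the schedule lives inside the database, so it follows the instance through exports and failovers.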

Applied Packages and Patches for Solaris and Linux Operating system

Configured/applied kernel and environment variables

Unix Networks configuration, Telnet, FTP

Created Virtual Directory in IIS.

Created Intranet/Internet based (Web) applications using ASP, XML, CGI, PERL, Applet.

Writing complex queries in Vertica for BI and DMLs for populating fact tables; uploading data into Vertica and extracting data from Vertica projections to files; using COPY with parallel parsing

Vertica Rejection tables

Vertica partition management

Creating projections based on the need for query optimization, understanding of Vertica architecture (WOS/ROS)

Building shell scripts to automate cleanups and ETLs in Vertica
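The COPY-with-rejections pattern above can be sketched as follows; file paths and table name are hypothetical, and the exact clause syntax should be checked against your Vertica version:

```sql
-- Hypothetical load: bulk-load a CSV into Vertica, diverting bad rows to a
-- rejections file instead of failing the whole load.
COPY sales_fact
FROM '/data/sales.csv'
DELIMITER ','
REJECTED DATA '/data/sales.rejected'   -- rows that fail parsing land here
EXCEPTIONS '/data/sales.exceptions'    -- the reason recorded for each rejection
DIRECT;                                -- bypass WOS, write directly to ROS
```

`DIRECT` trades load latency for skipping the WOS-to-ROS moveout, which suits large batch loads like fact-table refreshes.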

Writing complex queries in Netezza for BI and DMLs for populating fact tables.

Building shell scripts to automate cleanups and ETLs in Netezza

TECHNICAL SKILLS

Cloud/Database/ETL tools: Oracle 8/8i/9i/10g/11g/12c/19c (SQL, PL-SQL, and Administration), Presto, Hive (HQL), Spark, Redshift, BigQuery, Vertica, Teradata, Netezza, SQL Server 2000/2005 (SQL), MySQL 5.0.1 (SQL), Greenplum, MS Access, DB2 (SQL), Postgres, OEM, Net Manager, Import, Export, Data Pump, SQL*Loader, TKPROF, STATSPACK, SQL Trace, MySQL Enterprise Dashboard, MySQL Administrator, loadjava, DBCA, DBUA, LogMiner, RMAN, Dataswarm, Airflow, OneFlow, Snowflake (completed online training), AWS, Azure, Azure Data Factory (ADF), Azure Synapse Analytics, Azure Pipelines.

Reports: MS SQL Reporting Services (SSRS), Report Builder (NaviCat), Business Objects (BO), OBIEE, Cognos, QlikView 11.2, Spotfire, Tableau, Incorta, Unidash (Facebook in-house tool), Superset, Looker (completed online training)

Web/Scripts: HTML, DHTML, VBScript, JavaScript, JScript, CGI, Perl, ASP, XML, Korn shell scripts (ksh), Bash shell scripts, PHP, JSP, PSP, Servlets, Applets, .NET, C#

Languages: Pascal, C, Java 1.2 (including JDBC, Applet, Swing), .Net, C#, Python.

Office Utilities: MS-Office (Word, Excel, PowerPoint, Access, Front-Page), Star Office (on Solaris, Windows), Open Office.

Editors/Utilities: TOAD for DB2 and Oracle, Rapid SQL, SQL-Tool, Oracle SQL Developer, SQL Worksheet, SQL*Plus, iSQL*Plus, MySQL Query Browser, TIBCO, SQuirreL for DB2, MS FrontPage, Visual InterDev, DBeaver for Postgres, Visual Studio (for developing MS Reporting Services reports), WASD, F-Secure, PuTTY, WinSCP, ExamDiff, to name a few.

Testing/Bug Management Tools: QC (Quality Center), JIRA, Bugzilla

Operating Systems: Windows, UNIX [Solaris, AIX, Red Hat Linux, SUSE Linux, HP-UX, System V], Workplace

Source Control/Change Management Tools: VSS, ClearCase, StarTeam, AccuRev, PVCS, SVN, HG, GitHub

Schedulers: Control-M, Chronos, Autosys, RC scripts, crontab, Oracle Scheduler (DBMS_SCHEDULER), Dataswarm, Airflow, OneFlow

PROFESSIONAL EXPERIENCE :

Client: Citibank, Florida Oct 2023 – Present

Architect Data Engineer/PL-SQL Developer (Anti-Money Laundering and Fraud Analytics):

Transformed Models into DB processes as per Analysts requests.

Built framework for Ingesting Different application data into our system

Developing, Optimizing and tuning SQL and PL-SQL code and processes

Supporting production issues

Worked on Presto POC

Technologies & Tools : Oracle 19c, SQL, PL-SQL, JSON, Presto.

Environment: Windows, Unix

Client: Anthem, Atlanta Sep 2022 – June 2023

Architect Data Engineer/ Analytics/ PL-SQL Developer ( Claims Department):

Built Data Quality check framework

Built PL-SQL processes for validating, processing, and prepping claims-related data for analytics and applications

Optimizing and tuning SQL and PL-SQL code and processes

Technologies & Tools: Oracle 19c, SQL, PL-SQL, JSON, Shell Script.

Environment: Windows, Unix

Client: AirBnB, San Francisco, CA Feb 2022 – May 2022

Data Engineer Manager (Head Of Policy Analytics):

Building a DE/Analytics team from scratch for Policy Analytics

Building Roadmaps

Tracking progress/achievements

Defining coding best practices for DEs

Strategy for partnering with other departments to support operations

Continuously studying and improving the customer experience

Build Policy Ecosystem ( Policy Analytics platform )

Build pipeline and schedule using Airflow in AWS

Build Dashboards in Superset and Tableau

Data Discovery and impact analysis

Built a framework for evaluating policy changes across multiple policy releases, and built reports to track historical/current/forecast values accordingly.

Conducting Daily standups, Weekly one-on-one meeting with team members.

Bi-weekly progress reports

Technologies & Tools: AWS Cloud, AWS Athena, AWS Glue, DataGrid, DJS (Distributed Job Scheduler), Python, Hive (HQL), Presto, Superset, Tableau, Airflow/OneFlow, GitHub, Excel, Shell Script.

Environment: Windows, Unix

Client: Facebook Inc., Menlo Park, CA Nov 2019 – Dec 2021

Lead Incorta Analytics/Data Engineer (Global Operations Revenue Analytics):

Built frameworks introducing artificial intelligence and machine learning into revenue, credit, and collections management analytics, and for data quality checks, using Python, Presto, Hive, and MySQL (shard).

Designed and developed a custom data model to store complex data so it can be used and processed efficiently by application/analytical teams

Introduced self-healing processes for analytics to catch missed data and backfill it, avoiding variances.

Built data pipelines as part of moving & processing complex data using Dataswarm (FaceBook internal framework)

Built a platform for enterprise analytics consumption of centralized data that acts as the source of truth.

Built many reports supporting revenue, credit, and collections management analytics using Tableau, Incorta, and Unidash (Facebook internal product)

Supported SOX, internal, and external quarterly audits for Tableau and Incorta reports.

Was the point of contact for major blockers on any Incorta-related issues.

Was part of the month-end close process for Accounts Receivable (AR)

Technologies & Tools: Oracle 19c SQL, PL-SQL, EBS, Oracle Fusion, Presto, MySQL (shard), Hive, Spark SQL, Python, Tableau, Incorta, Unidash, Matric 360, Dataswarm, Excel, Shell Script.

Environment: Windows, Unix

Client : TinoIQ, Cupertino, CA Oct 2017 – Nov-2019

Head Of Data Engineering/Analytics:

Building a DE/Analytics team from scratch.

Building Roadmaps

Tracking progress/achievements

Defining coding best practices for DEs

Continuously studying and improving the customer experience

Involved in Building Enterprise Application Architecture.

Built batch processes to move data from Postgres to Oracle (OLTP to data warehouse)

Design & Development i.e., New modules & Enhancements using Oracle PL-SQL.

Worked on Jira Tickets

Building data models and ETL processes (PL-SQL, Shell Script) for BI.

Ingested source system data into Snowflake

Built a POC for the NASA security department, automating heavy/complex processes that consume data from Excel and Postgres, using Python

Built Pipelines in Airflow to transform logging Data for infra analytics

Built superset Dashboards

Built a self-healing pipeline using Azure Data Factory (ADF).

Used DBT to move data from one place to another.

Development and support of PowerBI reports

Technologies & Tools: Oracle 12c SQL, PL-SQL, Postgres, Snowflake, Python, Excel, Shell Script, AWS (Airflow, Presto, Superset, S3, Redshift, Athena, DataGrid, Glue), Azure (Azure Data Factory [ADF], Azure Synapse Analytics, DBT), PowerBI, Power Query, DAX (Data Analysis Expressions).

Environment: Windows, Unix

Client : WageWorks, San Mateo, CA Feb 2017 – Oct 2017

Sr. Database Architect:

Optimizing and tuning performance issues, i.e., SQL and PL-SQL used in ETL processes, reports, or applications

Design & PL-SQL development i.e., New modules & Enhancements.

Reviewing and signing off DDL, DML, and PL-SQL object scripts

Actively involved in Enterprise Application Architecture designs, Data Modeling, developments & migrations/conversions

Optimize Tableau dashboard performance.

Technologies & Tools : Oracle 12c, SQL, PL-SQL, Shell Script.

Environment: Windows, Unix

Client: Delta Dental, San Francisco, CA August 2014 – Oct 26, 2016

Sr. Database Consultant:

Worked in an Enterprise Architecture team, where I was responsible for optimizing and tuning high-priority (production and development) performance issues, i.e., SQL and PL-SQL used in ETL processes, reports, or applications

PL-SQL ETL developments i.e., New modules & Enhancements.

Designed and developed a framework of PL-SQL processes to provide flexibility and reusability.

Automate existing manual PL-SQL process/Reports

Building Business Object Universe & Reports

Reviewing and signing off DDL, DML, and PL-SQL object scripts

Actively involved in Enterprise Application Architecture designs, Data Modeling, developments & migrations/conversions

Technologies & Tools : Oracle 11g SQL, PL-SQL, Vertica, Shell Script, Business Objects.

Environment: Windows, Unix

Client : GAP, San Francisco, CA Jan 2014 – May-2014

Sr. Database Consultant:

Work on PL-SQL ETL Enhancements.

Optimizing and tuning long running PL-SQL ETL processes

Automate existing manual PL-SQL process

Technologies & Tools : Oracle 11g SQL, PL-SQL, Shell Script, Netezza.

Environment: Windows, Unix

Client: Levis, San Francisco, CA Nov 2013 – Jan 2014

(Data Migration) Sr. Database Consultant:

Designed and developed a framework for the items below using SQL, PL-SQL, and shell scripts:

Built PL-SQL packages to validate the data for inconsistencies before the migration.

Built PL-SQL Packages for populating automated report for the same.

Built PL-SQL Packages for Migrating data from one CRM application to other.

Built PL-SQL Packages for populating automated report for the hand shake.

Created shell script wrapper for the above execution

Technologies & Tools : Oracle 10g, SQL, PL-SQL

Environment: Windows XP

Client : HP, Palo Alto, CA Mar 2013 – Sept 2013

(Retail/.Com –OLAP ) Sr. Database/BI Consultant:

Building/tuning ETL processes using Perl, PL-SQL, and shell scripts.

Development, Tuning and Supporting Qlikview & Spotfire Reports

Data Modeling for building Qlikview Dashboards

Writing scripts to load data into the Qlikview fact tables.

Defining Standards for building Qlikview Reports

Deploying the Qlikview Reports to Testing and Production environment using the HP Infrastructure Standards .

Communicating with Onsite/Offshore Teams to get the ETL Processes built.

Verifying objects in Vertica w.r.t. consistency

Writing Data Validation processes

Communicating with Top level Management & Business User w.r.t Requirement gathering

Helping Business Analyst in building New Business Model or Modify existing Business Model to benefit the revenue

Moving data from Vertica to SAS

Moving data from SAS to Vertica

Code Review

Technologies & Tools: Qlikview 11.2, Vertica, SVN, Spotfire, Oracle 11g SQL, PL-SQL

Environment: Windows 7, Solaris, Linux

Client : Aeries Communication, Santa Clara, CA Aug 2012 – Dec-2012

(OLAP ) Sr. Database Engineer:

Development, Tuning and Support for functional and PL-SQL batch Processes

Reverse engineering w.r.t. design/process/logical design; designing, building, optimizing, and tuning batch jobs and functional procedures/functions/packages.

Writing Shell, Perl and AutoSys Scripts

Defining Standards for coding DB process/batch jobs in Distributed Database environments

Creating LLD’s and HLD’s for functional and batch Processes

Technologies & Tools: Oracle 9i/11G SQL, PL-SQL, shell script (ksh), Perl, Putty, PL-SQL Developer

Environment: Solaris, Windows XP

Client : Amazon, Seattle, WA Aug 2011 – July 2012

(Tax / Supply Chain Management –OLAP ) Sr. Database Engineer:

Development, Tuning and Support for functional and batch Processes

Designing/Building/Optimizing and Tuning Batch Jobs and Functional procedure/function/packages.

Writing Shell, Perl and AutoSys Scripts

Defining Standards for coding DB process/batch jobs in Distributed and heterogeneous Database environments

Creating LLD’s and HLD’s for functional and batch Processes

Communicating with Business User w.r.t Requirement gathering

Technologies & Tools: Oracle 11G SQL, PL-SQL, shell script (ksh), Perl, Informatica, OBIEE 10.3.1.4, Putty, PL-SQL Developer.

Environment: Redhat Linux, Windows XP

Client : Stage Stores, Houston,TX Apr 2011 – July 2011

(Retail / Supply Chain Management –OLTP and OLAP ) Sr. Database Engineer:

Development and Support for PHP Application, Oracle 10g Forms, Oracle 10g Reports.

PL-SQL Processes Designing/Building/Optimizing and Tuning Batch Jobs and Functional procedure/function/packages.

Writing Shell, Perl and AutoSys Scripts

Defining Standards for coding DB process/batch jobs in Distributed and heterogeneous Database environments

Technologies & Tools: Oracle 9i/10G, SQL, PL-SQL, Oracle 10g Forms, Oracle 10G Reports, Putty, PL-SQL Developer, MySQL SQL, Stored Programs, PVCS, PHP, shell script, Autosys.

Environment: Solaris, Windows XP.

Client : SnapOn, Richfield, OH June 2010 – March 2011

(Supply Chain Management –OLAP ) Sr. Database Engineer:

Development and Support for Application and Database Issues

Designing/Optimizing and Tuning Batch Jobs and Functional procedure/function/packages.

Defining standards for coding DB processes/batch jobs; code reviewing

Technologies & Tools: Oracle 10G, SQL, PL-SQL, Putty, PL-SQL Developer, StarTeam, Accurev, JIRA

Environment: Solaris, Windows XP.

Achievements:

Optimized and tuned a query running 3 hours to complete in 1.5 minutes.

Optimized a 5-hour process to complete in 17 minutes

Optimized an 18+ hour process to complete in just 3 hours

Client: Puget Sound Energy (PSE), Bellevue, WA April 2010 – June 2010

(Supply Chain Management –OLAP ) Database and Unix Developer :

Created Shell Scripts for Automating the movement of files (both FTP and Local) and running the Batch jobs

Development and Support for Application and Database Issues

Optimizing and Tuning Batch Jobs.

Go-Live Support

Technologies & Tools: Oracle 10G, SQL, PL-SQL, Putty, WinSCP, Oracle SQL Developer

Environment: AIX, Windows XP.

Achievements:

Optimized and tuned a non-terminating process to complete in just 17 minutes.

Created shell scripts to automate the required batch jobs, saving thousands of man-hours

Optimized a 12+ hour process to complete in just an hour

Client : Nike Inc, Beaverton, OR, U.S.A Oct 2009 – March 2010

(Supply Chain Management –OLAP) Database Specialist / Architect (Jr)/Solution Architect:

Created cross-functional flow charts as part of Reverse Engineering

Physical Modeling for New Modules and Applications.

Requirement gathering.

Reviewing functional specs.

Created Technical Specifications, HLD’s, Process Design

Code Reviewing.

Production and staging deployment/support. Designed and developed automation for data loading and testing processes using Autosys, PL-SQL, and shell scripts.

Created Reports using SSRS

Optimized and Tuned Long running PL-SQL Batch Jobs and Reports

Monitoring production batch jobs using Autosys UniCenter.

Created Test Cases

Business Architecture Presentation for Clients.

Assign Tasks for Team Members follow-up/review the same.

Reviewing work estimation

Leading Off-Shore and On-Shore Team of Developers

Technologies & Tools: Teradata, Oracle 11G/10G/9i, SQL, PL-SQL, Administration and Maintenance, SQL Navigator for Oracle, Oracle10G Enterprise Manager (OEM), Visio, Erwin, Import/Export, SQL*LOADER, SSRS, Log-Miner, QC (Quality Center), VSS, Putty.

Environment: SOLARIS, Windows XP.

Achievements:

Optimized and tuned a 12-hour batch process down to 4 minutes.

Optimized a 30-minute application process down to 1 second

Client : Credit-Suisse, New York, U.S.A June 2009 – August 2009

(Financial Domain –OLAP) Senior Data Analyst :

Credit-Suisse is one of the major financial and banking companies; worked as a Senior Data Analyst for this company in New York.

Designed and Developed Automation for Data Loading and Testing processes using Control-M/PL-SQL/Shell Scripts.

Loaded data per requests from different teams, which use US GAAP and IFRS methodologies.

Analyzed issues with data, database objects and processes, shell scripts (ksh), and the UNIX environment.

Responsible for loading data from OLTP to the data warehouse and from the data warehouse to data marts.

Helped Support Team and Build Team to fix the Issues.

Tuned Data Load process.

Presentation on writing optimized queries for Report developers (BO Developer).

Worked on a partition-swapping approach to increase performance and maintain read consistency between the data-load and QA departments.
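The partition-swapping technique mentioned above typically relies on Oracle's partition exchange; the following sketch uses hypothetical table and partition names:

```sql
-- Hypothetical swap: load into a standalone staging table, then publish it
-- atomically by exchanging it with one partition of the reporting table.
ALTER TABLE sales_fact
  EXCHANGE PARTITION p_2009_q2
  WITH TABLE sales_fact_stage
  INCLUDING INDEXES
  WITHOUT VALIDATION;   -- metadata-only swap: no rows are physically moved
```

Because the exchange only updates the data dictionary, readers see the old partition until the instant the swap completes, which is what preserves read consistency during the load.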

Reverse Engineering to create ER Diagrams w.r.t Objects and Processes.

Created, Modified and Tuned Business Object Reports as per the requests.

Created, Modified and Tuned PL-SQL Procedures as per the Data Load requirements.

Communicated with Off-Shore Team to assign the jobs and review the same.

Took responsibility for delivering the data on Time during major releases.

Motivated Team members.

Raised defects in QC.

Created Test Cases in QC.

Manual Testing.

Batch Testing .

Technologies & Tools: Oracle 10G, SQL, PL-SQL, Administration and Maintenance, Rapid SQL for Oracle, SQL Tool for Oracle, Oracle10G Enterprise Manager (OEM), Visio, Erwin, Import/Export, SQL*LOADER, Log-Miner, Shell Scripts (Kshell), QC (Quality Center), BO (Business Objects), Clear Case 7, Putty.

Environment: SOLARIS, Windows XP.

Achievements: Optimized the data-loading process by introducing automation; created shell scripts and PL-SQL packages to eliminate manual processing.

RCC, Edison, New Jersey Jan 2009 – Jun 2009

Description : Supporting In-house product.

HMNA, Clark, New Jersey Dec 2007 – Nov 2008

Client: Pfizer, New York, U.S.A.; CitiGroup, NY, U.S.A. (Pharmaceutical Domain – OLTP and OLAP; Banking Domain – OLTP)

Senior Database Programmer/Database Specialist/Lead/Solution Architect:

Pfizer is a major pharmaceutical company; worked as a Senior Database Programmer/Database Specialist for the company, based in New York City. Created, maintained, and tuned HMNA development Oracle 9i/10G instances/databases, and MS SQL and MySQL 4.x/5.x databases.

Designed to develop and manipulate SQL databases, data warehouses and multidimensional databases.

Migrated large volumes of data from different sources to Oracle on a monthly and quarterly basis using PL-SQL.

Uploaded data from files to tables in Greenplum using the COPY command

Built highly de-normalized tables in Greenplum for reports.

Created new Procedures/Functions/Triggers/Materialized Views/Packages / Simple, Ref & Traditional Cursors / Dynamic SQL / Pipelined / Table functions as part of Project/Application requirements.

Handled exceptions to avoid process termination and to log the errors.
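A minimal sketch of that log-and-continue pattern, assuming a hypothetical `error_log` table and batch procedure:

```sql
-- Hypothetical error logger: an autonomous transaction lets the log row
-- commit even when the failing business transaction rolls back.
CREATE OR REPLACE PROCEDURE log_error(p_msg VARCHAR2) IS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO error_log (logged_at, message)   -- assumed log table
  VALUES (SYSTIMESTAMP, p_msg);
  COMMIT;                                      -- commits only the log insert
END;
/

-- Usage inside a batch loop: record the error and keep the batch alive.
BEGIN
  process_one_claim(12345);                    -- assumed business procedure
EXCEPTION
  WHEN OTHERS THEN
    log_error(SQLERRM);                        -- log instead of terminating
END;
/
```

The autonomous transaction is the key detail: without it, a rollback of the failed batch step would also discard the error record.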

Coded processes to standards, e.g., commenting tables/columns and including readable prologs in SQL and PL-SQL source code.

Created Tables/Views/Materialized Views/Indexes/Partition Tables/ Compressed Tables/Index Organized Tables/Objects type.

Wrote complex queries: sub-queries, joins, hierarchical queries, correlated sub-queries, traditional cursors used inside queries, inline views, simple/aggregate functions, and analytical functions, to name a few.

Designed and created PL-SQL automated process for a vertical storage system.

Backed up and restored the database using logical/cold/hot/RMAN backups as required.

Monitored processes, the alert log file, and trace files at regular intervals as part of maintenance.

Tuned complex SQL Statements used inside Merge statement, Materialized Views, Application and Reports, Optimized & Tuned PL-SQL Procedure/Function/Triggers /Packages.

Developed user interface screens and Report screens

Worked closely with Informatica Developer to Tune ETL Queries/Processes.

Responsible for technical support of applications through implementing predefined SQL/PL-SQL processes.

Designed, Developed & Implemented ETL processes using TOAD, SQL*Loader, SQL and PL-SQL to meet key project requirements.

Created transactions and ran tests to find errors and confirm program specifications.

Modified, Maintained, Optimized and tuned SQL and PL-SQL Source code as required.

Created reports using MS-Reporting Services. Requirement gathering as part of new enhancements.

Optimized and tuned Pfizer databases/instances/servers w.r.t. memory and hardware.

Migrated huge data from Oracle to MySQL on weekly, monthly and quarterly basis.

Migrated Oracle Parent Database data to Oracle Child Database.

Exported MySQL data to Flatfiles on requests.

Installed/configured MySQL Administrator, MySQL Query Browser, MySQL Enterprise Dashboard, and Navicat for accessing and maintaining MySQL databases.

Configured and tuned MySQL Instances using my.ini config-file

Created users, granted and revoked privileges in MySQL and Oracle Databases

Monitoring MySQL Instances w.r.t Connection Health, Memory Health, Status Variables, Agent Status, Server Status, CPU Usage, I/O, Lock Contention, Caches, server logs.

Analyzed, checked, optimized, and repaired MySQL tables; flushed internal caches and cached indexes.

Monitoring and optimizing the performance of the database.

Building Oracle 10g Reports.

Collecting statistical Information about memory like Shared Pool (Library cache, Dictionary cache), Buffer Caches, Large Pool, PGA to verify the Load and hence to tune same if required.

Collecting Statistics about the Temporary /Undo and/or Rollback segments to verify the I/O Load so to tune the same.

Collecting Statistics about the Permanent segments to verify the I/O Load, Chained Rows, Migrated Rows, Fragmented Extents so to tune the Performance, remove/avoid row chaining and row migration and to de-fragment the extents and segregating the same / moving to another place if required.

Administering, Monitoring and Providing high availability and Fault-tolerant MySQL and Oracle Databases.

Technologies & Tools: Oracle 9i/10G, SQL, PL-SQL, Administration and Maintenance, Oracle 10G Reports, MySQL


