
Project Management PL/SQL

Location:
Plano, TX
Posted:
December 05, 2023


Hariharan Anantharaman

469-***-****

ad1p3z@r.postjobfree.com

Highlights:

OCP, MCP, and ITIL V3 certified professional with an ambitious outlook toward excelling in project management; proficient in coaching and mentoring teams.

15 years of IT experience with excellent exposure to various project management methodologies: Agile Scrum, RAD, RUP, Lean, and traditional.

PROFESSIONAL SUMMARY:

Specialized in Oracle SQL and PL/SQL programming, with more than 15 years of experience

More than 5 years of experience with PostgreSQL (up to v15.2)

Strong knowledge of business processes in the telecom domain

Hands-on experience with the complete Agile and Scrum methodology

Expertise in all activities related to the development and support of backend Oracle SQL/PL/SQL environments and ETL processes for large-scale data warehouses using Ab Initio

Involved in problem management for increasing the scalability of the target data mart

Hands-on experience with the full ETL process flow: gather requirements, create the physical design, design the test plan, create the ETL process, and test the process

Well versed in database and data warehouse concepts such as dimensional data modeling, OLTP, OLAP, Star and Snowflake schemas, RDBMS, multidimensional modeling, views, physical and logical models, analytical functions, stored procedures, functions, and triggers.

Efficient in troubleshooting, performance tuning, system testing, debugging, and optimization of SQL queries, ETL, and reporting analysis

Profound experience in pre-processing data using UNIX Korn/Bash shell scripts

Solid management skills, demonstrated proficiency in leading and mentoring individuals to maximize levels of productivity, while forming cohesive team environments.

Worked closely with business analysts and DBAs to understand the various data sources

Extensively applied SQL and PL/SQL programming skills to optimize Ab Initio graph performance

Expertise in C/C++ and Pro*C using pointers, structures, macros, and functions

Expertise in creating new C/C++ and Pro*C batch programs with embedded SQL, PL/SQL procedure calls, arrays, and dynamic SQL

4 years of experience as an Oracle GoldenGate administrator, applying solid professional skills in Oracle Database, GoldenGate, RAC/ASM, Data Guard, Exadata, Unix/Linux, and shell scripting

4 years of solid hands-on experience installing, configuring, performance tuning, and troubleshooting Oracle GoldenGate in Oracle-to-Oracle environments

9+ years of hands-on experience building productionized data ingestion and processing pipelines using Java, Spark, Scala, etc., plus experience designing and implementing production-grade data warehousing solutions on large-scale data technologies

Strong experience in migrating other databases to Snowflake

Worked with domain experts, engineers, and other data scientists to develop, implement, and improve existing systems.

Participated in design meetings for creation of the data model and provided guidance on best data architecture practices.

Experience with Snowflake Multi-Cluster warehouses.

Experience in Splunk reporting system.

Understanding of Snowflake cloud technology.

Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables.
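
A minimal sketch of the kind of S3-to-Snowflake JSON load described above (Snowflake SQL; the stage, integration, bucket, and table names are illustrative placeholders, not project artifacts):

    -- Snowflake SQL; all object names and the S3 URL are hypothetical
    CREATE OR REPLACE STAGE raw_json_stage
      URL = 's3://example-bucket/events/'
      STORAGE_INTEGRATION = s3_int;        -- assumes an existing storage integration

    CREATE OR REPLACE TABLE raw_events (v VARIANT);

    -- Bulk-load nested JSON, one document per row, into a single VARIANT column
    COPY INTO raw_events
      FROM @raw_json_stage
      FILE_FORMAT = (TYPE = 'JSON');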

Professional knowledge of AWS Redshift.

Experience in building Snowpipe.
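
For continuous loading, a Snowpipe is essentially a COPY statement wrapped in a pipe; a minimal sketch reusing the hypothetical stage and table from the previous example:

    -- Auto-ingest picks up new files as they land in the external stage
    CREATE OR REPLACE PIPE raw_events_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_events
        FROM @raw_json_stage
        FILE_FORMAT = (TYPE = 'JSON');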

Experience in using Snowflake Clone and Time Travel.
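
Clone and Time Travel are single statements in Snowflake; an illustrative pair, again with hypothetical names:

    -- Zero-copy clone for a dev/test copy of the table
    CREATE TABLE raw_events_dev CLONE raw_events;

    -- Time Travel: read the table as it stood 24 hours ago (offset in seconds)
    SELECT COUNT(*) FROM raw_events AT (OFFSET => -86400);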

Participated in the development, improvement, and maintenance of Snowflake database applications.

In-depth understanding of Data Warehouse/ODS and ETL concepts and modeling structure principles.

Built logical and physical data models for Snowflake as per change requirements.

Defined virtual warehouse sizing in Snowflake for different types of workloads.

Worked with cloud architect to set up the environment.

Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe, and big data modeling techniques using Python/Java.

Built ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, writing SQL queries against Snowflake.

TECHNICAL EXPERTISE:

Languages: C, C++, Pro*C, XML, HTML, Java, DataStage 11.7

Network: Novell NetWare 3.12, MS-DOS 6.22, UNIX

Systems: Windows (95/98/NT/2000/XP/2003)

RDBMS: Oracle 19c (SQL/PL/SQL), PostgreSQL v15.2, SQL Server 2017 (v14.0), Sybase, MS Access 2003; Data Modeling

ETL Tools (Back End): Ab Initio (GDE 1.15.1, Co>Operating System 2.15.1), DataStage 11.7, SAP BusinessObjects 4.2

Front End Tools: Visual Basic 6.0, VC++, and Oracle Developer 9i (Forms 9i/Reports 9i)

Data Replication: Oracle GoldenGate 11gR2/12c, Export/Import, Data Pump, Snowplow

PROFESSIONAL EXPERIENCE:

Technology Lead Dec 2018 – Present

Verizon (Infosys Limited), Richardson, TX

Project Name: RespOrg / National Application

Team Size: 49

Synopsis: RespOrg stands for Responsible Organization, the entity responsible for every toll-free (TF) number. It is a TF order application that enables customers to order TF numbers for their organization. These TF numbers are used by end customers within the country or internationally, and the business pays for the calls made by those customers. The application allows customers to order TFs in three ways: 1) through a batch process, 2) through the RespOrg GUI, and 3) through web services.

Responsibilities:

Solid SQL and PL/SQL coding techniques; ability to create and maintain tables, constraints, triggers, views, stored procedures and functions, packages, and types; solid understanding of SQL tuning practices and indexing and partitioning strategies

Expert in writing complex and advanced SQL queries, and in writing, modifying, and optimizing large, complicated SQL statements and stored procedures

Deep understanding of relational database design concepts

Identified Toll-Free data management needs and developed solutions to cater to them

Experience in application tuning using various query optimization tools like TKPROF, EXPLAIN PLAN, AUTOTRACE, AWR reports and SQL*TRACE
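
As a generic illustration of that tuning workflow (Oracle SQL; the table and predicate are hypothetical, not project objects):

    -- Capture the optimizer's plan for a candidate statement
    EXPLAIN PLAN FOR
      SELECT order_id, status
      FROM   orders
      WHERE  status = 'PENDING';

    -- Render the captured plan
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- Trace the session; the resulting trace file is then formatted with TKPROF
    ALTER SESSION SET sql_trace = TRUE;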

Implemented ETL solutions for large volumes of data by applying data warehouse modeling techniques such as Star, Snowflake, and Galaxy schemas

Implementation experience with monitoring and measuring data quality using business rules; hands-on experience with data quality assessment and resolution

Maintained metadata, data glossary, and data catalog processes in a fast-paced environment

Implemented Agile methodology

Experience in Unix/Linux shell scripting

Experience in large-volume data loads via ETL (SQL*Loader, external tables, and Snowplow) and HVR replication strategies
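
A minimal external-table definition of the sort implied above (Oracle SQL; the directory object, file name, and columns are placeholders):

    -- Expose a flat file as a queryable external table for high-volume loads
    CREATE TABLE orders_ext (
        order_id  NUMBER,
        status    VARCHAR2(20)
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY data_dir          -- assumes an existing directory object
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
        )
        LOCATION ('orders.csv')
    );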

Experience in code optimization and performance tuning

Experience in the telecom industry with some risk background

Strong knowledge of Agile methodology

Managed and mentored the team to ensure timely deliverables

Coordinated with offshore and onsite teams to resolve issues during requirements, design, and development

Facilitated BAs' walkthroughs of the GD (General Document) with team members

Provided project tracking updates

Conducted detailed design walkthroughs

Facilitated release management for all enhancements

At the end of each release, generated a Test Defect Analysis Report and worked with the team to find the root cause of issues and provide preventive solutions to the teams

Implemented fixes for repeated production bugs based on defect reports obtained from the support team

Automated system performance and order status reporting to OM/AM using Oracle PL/SQL and UNIX shell scripts

Developed with advanced PostgreSQL, an enterprise-class open-source relational database used in my module to support both SQL (relational) and JSON (non-relational) querying
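
A small illustration of that dual querying model (PostgreSQL; the table and fields are hypothetical):

    -- One table serving both relational and JSON (non-relational) querying
    CREATE TABLE orders_json (
        order_id  bigint PRIMARY KEY,
        payload   jsonb NOT NULL
    );

    -- Relational predicate plus JSONB path extraction in a single query
    SELECT order_id,
           payload->>'status'            AS status,
           payload->'customer'->>'name'  AS customer_name
    FROM   orders_json
    WHERE  payload @> '{"status": "ACTIVE"}';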

Developed modules using PostgreSQL, PL/SQL scripts, Unix shell scripts, and Pro*C

Developed custom batch programs in Pro*C, along with custom functions, procedures, and package scripts, for the RETEK-12 implementation

Involved in upgrading/migrating databases from Oracle 11g to Oracle 12c and migrating data from Oracle to Snowflake

Querying JSON data in Snowflake is fast and effective. That means that when we load Snowplow data into Snowflake, we can load it all into a single table, with a single column per event and context type, and the individual event-specific and context-specific fields available as nested types in those columns. This is an attractive data structure because all your data is in a single, easy-to-reason-about table
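
For example, a query over such a single events table might look like the following sketch (Snowflake SQL); the table and column names echo Snowplow's conventions but are illustrative here, not taken from the project:

    -- Path notation reaches into the nested event- and context-specific fields
    SELECT event_id,
           contexts_web_page[0]:id::VARCHAR             AS page_view_id,
           unstruct_event_link_click:targetUrl::VARCHAR AS target_url
    FROM   atomic.events
    WHERE  collector_tstamp >= DATEADD(day, -7, CURRENT_TIMESTAMP());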

Redshift, by contrast, has much more limited JSON parsing support: JSON parsing is brittle (a single malformed JSON document will break a whole query) and querying JSON in Redshift is not performant. As a result, when we load Snowplow data into Redshift we do not use JSON: we shred each event and context type out into a dedicated table so that they can be queried in a performant way
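
Under that shredded layout, event-level questions become joins from the child tables back to atomic.events; a sketch (Redshift SQL), with the child table name following Snowplow's naming convention but hypothetical here:

    -- Each shredded context/event table keys back to its parent event row
    SELECT e.event_id,
           c.target_url
    FROM   atomic.events e
    JOIN   atomic.com_example_link_click_1 c
      ON   c.root_id = e.event_id
    WHERE  e.collector_tstamp >= DATEADD(day, -7, GETDATE());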

Resolved conflicts between target and source databases in GoldenGate

Migrated Oracle 11g databases from Linux boxes to Exadata machines using GoldenGate and Export & Import

Migrated Oracle 10g/11g/12c databases cross-platform from Solaris to Linux using transportable tablespaces/GoldenGate to reduce downtime

Performed database migration from non-Exadata to Exadata

Used GoldenGate to migrate a large database (15 TB)

Data modeling analyzes data objects and their mutual relations, while ETL applies those rules, inspects the data for anomalies, and loads it into a data warehouse or data mart

ETL data modeling starts by creating data models through which source data associations and constraints are described, categorized, and coded for storage in a data warehouse

Data modeling addresses the reliability of the ETL process through visual representation and information; it takes care of entity integrity and referential integrity with the help of keys
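
A generic dimensional-model illustration of those integrity constraints (Oracle SQL; all names hypothetical):

    -- Entity integrity via PRIMARY KEY, referential integrity via FOREIGN KEY
    CREATE TABLE dim_customer (
        customer_id  NUMBER        PRIMARY KEY,
        name         VARCHAR2(100) NOT NULL
    );

    CREATE TABLE fact_orders (
        order_id     NUMBER PRIMARY KEY,
        customer_id  NUMBER NOT NULL
                     REFERENCES dim_customer (customer_id),
        amount       NUMBER(12,2)
    );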

Collaborated with web application engineers and used Python scripts to load the data into the AWS cloud Cassandra database

Visualized over 40 datasets with Matplotlib

Involved in processing the streaming data as well as batch data using Apache Spark

Assisted in the development and maintenance of a large-scale e-commerce platform using Python, Django, and PostgreSQL

Developed and maintained backend systems for web applications using Python, Django and Flask, resulting in enhanced performance and scalability

Designed and implemented RESTful APIs to facilitate seamless communication between front-end and back-end systems, improving overall user experience

Environment: Oracle 19c, PostgreSQL v15.2 on Amazon RDS, Sybase, MS SQL Server, Pro*C, Java with UNIX, ETL – BusinessObjects 4.2, Ab Initio (2.10.15) and DataStage 11.7, Data Modeling, Python, AWS

Project Lead Nov 2006 - Dec 2018

Verizon Data Service, India

Project Name: EZStatus

Team Size: 45

Synopsis: EZStatus has become the primary enterprise order tracking and status application, providing information from more than 40 Verizon Business systems and significantly reducing the time needed to provide order status to Verizon Enterprise Solution customers, delivering the best customer service experience. The application is used by Implementation Managers (IMs) to track orders. In their role as the internal customer support organization for Verizon Business, IMs invest a considerable amount of time tracking orders and providing status to their customers. Previously a manual process, much of that time was spent looking up order information in multiple systems and creating spreadsheets for tracking purposes; it was not unusual for IMs to traverse 40 different systems to track an order from start to finish. To reduce the amount of time IMs must commit to tracking orders and to minimize the number of systems they need to access during the process, the EZStatus team set out to develop a new status and tracking application.

Responsibilities:

Solid SQL and PL/SQL coding techniques; ability to create and maintain tables, constraints, triggers, views, stored procedures and functions, packages, and types; solid understanding of SQL tuning practices and indexing and partitioning strategies

Expert in writing complex and advanced SQL queries, and in writing, modifying, and optimizing large, complicated SQL statements and stored procedures

Deep understanding of relational database design concepts

Identified Toll-Free data management needs and developed solutions to cater to them

Experience in application tuning using various query optimization tools like TKPROF, EXPLAIN PLAN, AUTOTRACE, AWR reports and SQL*TRACE

Implemented ETL solutions for large volumes of data by applying data warehouse modeling techniques such as Star, Snowflake, and Galaxy schemas

Implementation experience with monitoring and measuring data quality using business rules; hands-on experience with data quality assessment and resolution

Maintained metadata, data glossary, and data catalog processes in a fast-paced environment

Managed and mentored the team to ensure timely deliverables

Coordinated with offshore and onsite teams to resolve issues during requirements, design, and development

Generated weekly MIS (Management Information System) reports on application utilization by end users

Provided volume order reports (Pending/Active/Cancelled) to the Order Manager, Account Manager, and higher management

Facilitated BAs' walkthroughs of the GD (General Document) with team members

Allocated resources based on bandwidth

Provided project tracking updates

Conducted detailed design walkthroughs

Facilitated release management for all enhancements

At the end of each release, generated a Test Defect Analysis Report and worked with the team to find the root cause of issues and provide preventive solutions to the teams

Implemented fixes for repeated production bugs based on defect reports obtained from the support team

Automated system performance and order status reporting to OM/AM using Oracle PL/SQL and UNIX shell scripts

Environment: Oracle 12c, Pro*C, Java with UNIX, ETL – Ab Initio (2.10.15) and DataStage 9.1, Data Modeling, PostgreSQL

Senior Software Engineer Jul 2006 - Nov 2006

Arete Technologies, India

Project Name: eCOPS (e-Computerized Operations for Police Services)

Team Size: 19

Synopsis: eCOPS is an integrated, enterprise-wide IT tool for enhancing the performance of state police units in crime control, law & order and administrative operations.

The primary purpose of "electronically Computerized Operations for Police Services" (eCOPS) is to maintain information to monitor and enhance the performance of the department, and to take a critical look at its existing systems, processes, and procedures so as to identify and remove redundancy, thus making the department function better.

Responsibilities:

Developed procedures and package scripts in Oracle 10g.

Integrated the procedures and fixed bugs.

Provided database support for generating different types of reports.

Decentralized Client/Server architecture.

Every police office is self-sufficient with its own database; data is transferred and consolidated at various offices in the hierarchy.

Module deployment flexibility

Integration with FACTS for capturing criminal information and physical features of accused.

Workflow instructions are sent from one office to other offices as part of the workflow, using MS Exchange Server.

Data Transfer

Day-end / on-demand / automatic, in encrypted mode

Integration with emails

Gap memos and instructions to investigation officers

Queries

Search based on names of accused, age, date range, physical features, etc.

Alert system

Department-wide alerts to intended recipients.

Multi-Lingual

Data capture in the desired language – complainant details in the FIR and witness statements in Case Diary Part II

Application log

Logs information on files exported and imported, messages received, and the login and logout of all application users.

Analysis & Reporting

Crime statistics and percentage of property recovery in comparison with previous months and years.

All reports related to the unit offices, such as daily, weekly, fortnightly, monthly, and yearly reports.

Project Coordinator Nov 2002 - Jul 2006

Office Tiger Database Systems India Private Limited, India

Responsibilities:

Monitored the flow of messages.

Resolved issues regarding message flow.

Developed procedures and package scripts in Oracle 9i.

Integrated the procedures and fixed bugs.

Provided database support, such as generating different types of reports.

Project Name: WWL – Global Support Apr 2005 - Jul 2006

Team Size: 16

Environment: Oracle 9i, Developer 9i with UNIX

Project Name: LEAVE & MIS MANAGEMENT SYSTEM Dec 2004 – Apr 2005

Team Size: 3

Responsibilities: Designing, bug fixing, implementation, and functional support

Environment: Oracle 9i, Developer 9i

Project Name: ACTIVITY UTILIZATION & DCS (Data Capture System) Sep 2003 - Dec 2004

Team Size: 3

Environment: Oracle 9i, Developer 9i

Project Name: DMS PRODUCTIVITY MANAGEMENT SYSTEM Nov 2002 – Aug 2003

Team Size: 2

Environment: MS-Access, VBA

ADDITIONAL EXPERIENCE:

Trainee Programmer & Network Support Mar 2001 - Nov 2002

Dev Constructions Pvt Ltd, India

Environment: VB 6.0 & Oracle 8i

System Engineer Sep 1999 – Feb 2001

Super Auto Forge LTD, India

Responsibilities:

Handled incoming and outgoing mail.

Administered the Windows NT 4.0 server.

Provided user hardware and software support.

Took care of all kinds of technical network issues.

ACHIEVEMENTS:

Awarded Best Team Leader for 2009, 2014, 2018, and 2022

Received the Spot Award for best performance three consecutive times

CERTIFICATIONS:

OCP (Oracle Certified Professional) in Oracle Developer 9i Stream.

MCP (Microsoft Certified Professional) in SQL Server 2005.

ITIL (Information Technology Infrastructure Library) Version 3 Foundation

Education:

Master of Science in Information Technology - Annamalai University, 2009

Bachelor of Science in Botany - Madras University, 1999


