*************@*****.*** phone: 732-***-****
** ***** ** ************ ** experience as a PL/SQL Developer with hands-on experience building large-scale data processing systems. Proficient in all aspects of Data Warehousing, such as Data Modeling, Database Design, Data Analysis, ETL, and BI & Reporting.
PROFESSIONAL SUMMARY:
Proficient in all phases of the Software Development Life Cycle (SDLC); implemented methodologies such as Agile and Waterfall with timely delivery.
Experience working with AWS cloud services: S3 buckets, EC2 instances, Storage Gateway, EFS, Lambda functions, SNS, and SQS.
Experience working with Snowflake as an MPP big data database.
Experience working with the Matillion ELT tool to load data into Snowflake.
Experience with data warehouse data, with a keen focus on performance tuning, testing, and quality assurance.
Proficient in creating ER diagrams, logical and physical database design, and normalization techniques.
Experience in optimization of SQL and dynamic SQL statements.
Worked on debugging and tuning PL/SQL code, reading explain plans, and tuning queries for Oracle databases.
Experience in performance tuning using various types of hints, partitioning, and indexes.
Experience in data warehousing, OLTP, data cleansing, data modeling, data analysis, and data migration from various business applications into a homegrown Customer Relationship Management (CRM) application.
Extensive database experience using Oracle 11g/10g/9i, DB2, MS SQL Server 2010/2005, SQL, and PL/SQL.
Expert in Oracle Import/Export utilities, Oracle SQL*Loader, PL/SQL Developer, and SQL*Plus.
Developed UNIX shell scripts to load files using SQL*Loader and to call PL/SQL procedures and packages.
Developed Python scripts to automate data pipelines and retrieve files from REST APIs.
Expert knowledge in scheduling data migration from different data sources using Appworx, Autosys, and Control-M scheduling tools and cron jobs.
10+ years of experience in onsite offshore coordinated projects. Experience in project planning, resource allocation, and project tracking.
Demonstrated expertise in team building, interpersonal skills, and leadership with expert verbal and written communication.
TECHNICAL SKILL SET:
Cloud: AWS
Big Data Database: Snowflake
Relational Databases: Oracle 10g, 11g, and 12c; SQL Server 2008 and 2012
ETL Tools: Informatica 9.5.1/10.1.1, Talend, Matillion, Oracle Warehouse Builder
Versioning Tools: GitHub, Stash, Subversion (SVN)
Other Tools: TOAD, SQL Developer, PL/SQL Developer
Operating Systems: UNIX, Linux, Windows 10/8/7/Vista/XP/2000/98 SE/95
Languages: Shell Scripting, Python, SQL, PL/SQL, JavaScript
Reporting Tools: Tableau 2019
Scheduling Tools: Control-M, Appworx
PROFESSIONAL EXPERIENCE:
PRINCESS CRUISES / Valencia, CA (Jun 2016 – present)
Employer: Apolis
Senior PLSQL Developer
Princess Cruises is an American-owned cruise line, based in Santa Clarita, California and incorporated in Bermuda. Previously a subsidiary of P&O Princess Cruises, the company is now one of ten cruise ship brands owned by Carnival Corporation & plc and accounts for approximately a 19% share of its revenue. The project implements a data warehousing solution by interlinking information from different areas (Booking, Air Ticketing, Yield, CRM). As a team member in ACE, we provide ETL solutions using Oracle PL/SQL, OWB, and Informatica.
Environment: AWS Cloud, Snowflake, Redshift, Matillion (ETL tool), Python, Oracle SQL Developer, PL/SQL Developer, Toad, OWB 11.2.0.4, Informatica 10.1/9.5.1, Informatica MDM, SVN, Control-M, JIRA, Tableau 2019, UNIX, Linux, Shell Scripting
Responsibilities:
Involved in requirements gathering, business analysis, testing and project coordination
Designed, modeled, and developed an end-to-end Agent Scorecard application: a set of scorecards tracking an array of complex reservation agent KPIs sourced from multiple sub-systems, built with Informatica and advanced PL/SQL using dynamic queries and complex logic. Coordinated directly with the business on requirements, UAT, and performance, and created a Tableau dashboard on top of the database tables.
Created a metadata-driven framework to automate loading more than 200 files into staging and ODS tables using UNIX and PL/SQL packages, scheduled through the Control-M automation tool.
Developed framework-driven data pipelines using Python and Matillion to load data into Snowflake as an MPP database, and created a data lake in Snowflake.
Clear understanding of Snowflake database architecture and its advantages, including micro-partitioning, columnar storage, and separated data storage and compute.
Handled PII data in the Snowflake data warehouse when granting access to users.
Automated the process of loading files from AWS S3 buckets into Snowflake using Python (a minimal SQL sketch of this load pattern appears after this list).
Automated the process of retrieving files from a REST API and placing them in S3 buckets using Python.
Parsed JSON files into tables using Python and the Snowflake FLATTEN function, as shown in the sketch after this list.
Created a Python script to copy files to EFS on an AWS EC2 instance, move them to AWS S3, and then load them into Snowflake.
Worked on AWS S3, AWS Lambda, AWS SNS, and AWS SQS.
Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
Did a POC on AWS Redshift to load data and compared it with Snowflake.
Created Tableau dashboards for marketing data and error log monitoring.
Created a reliable, scalable, and optimized architecture that needs less maintenance and support going forward.
Involved in continuous enhancements to improve the performance of long-running, complex queries by following best practices: hints, bulk collects, partitioning, and analyzing explain plans.
Created a single error-logging point by using exception handling extensively across applications, making it easy to monitor, and built a Tableau dashboard on top of it for on-call support.
Fixed spec defects and technical coding defects so that data loads properly per business needs.
Created detailed ETL standards documents for design, development, release management, and production support.
Designed detailed ETL specs for offshore development and ensured quality ETL deliverables.
Created detailed ETL migration processes for the Informatica, Database, and Scheduling teams.
Automated, redesigned, and tuned several ETL processes for optimal utilization of time and resources.
Created and maintained tables, views, and complex PL/SQL procedures, functions, and packages; performed DML operations using insert, update, and delete statements.
Extensively used dynamic SQL and cursors and their attributes while developing PL/SQL objects.
Extensively used Oracle exception handling techniques for implementing all types of exceptions.
Created and maintained PLSQL procedures to load survey data sent in XML files into Oracle tables
Created UNIX shell scripts to load data from flat files into Oracle tables
Created UNIX shell scripts to sftp files to multiple vendors
Created UNIX shell scripts to migrate database, ETL, and Unix objects from Tortoise Subversion (code management tool) to the Oracle, OWB, and Unix environments.
Worked closely with business users in resolving the UAT Defects
Provided production support
Led the offshore team and other team members on key projects.
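Below is a minimal sketch of the S3-to-Snowflake load and JSON flattening pattern referenced above. The stage, file format, table, and column names are illustrative assumptions, not the production objects, and credentials are placeholders.

-- Hypothetical external stage over the landing S3 bucket.
CREATE OR REPLACE FILE FORMAT ff_json TYPE = 'JSON';
CREATE OR REPLACE STAGE stg_landing_s3
  URL = 's3://example-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
  FILE_FORMAT = (FORMAT_NAME = 'ff_json');

-- Land raw JSON documents into a VARIANT column.
CREATE OR REPLACE TABLE raw_survey (v VARIANT);
COPY INTO raw_survey FROM @stg_landing_s3 PATTERN = '.*[.]json';

-- Shred nested JSON into a relational staging table with FLATTEN.
CREATE OR REPLACE TABLE stg_survey AS
SELECT r.v:survey_id::NUMBER    AS survey_id,
       a.value:question::STRING AS question,
       a.value:answer::STRING   AS answer
FROM raw_survey r,
     LATERAL FLATTEN(INPUT => r.v:answers) a;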
YAHOO / Burbank, CA (Nov 2015 – May 2016)
Project: Search tools and data management
Senior Oracle ETL Developer
Employer: Apolis
Yahoo search products generate petabytes of data hosted on Hadoop HDFS; Hadoop ecosystem technologies and Oracle RAC are used to build data marts and various data extract solutions for reporting.
Environment: GRID, UNIX, Windows
Technology: Oracle Data warehouse, PL/SQL
Tools: Informatica power center, Informatica power Exchange, SQL Developer, GitHub, Jira
Responsibilities:
Directly interacted with the business to understand requirements, assisted with UAT, and performed data analytics on the search data feed.
Designed and developed Oracle data warehouse data marts for reporting.
Developed stored procedures and used them in the Stored Procedure transformation for data processing; used data migration tools.
Developed Unix and Oracle PL/SQL code modules and ETL using a metadata-driven architecture.
Built, tuned, and maintained SQL code for a data warehouse larger than 350 terabytes (big data).
Worked on HDFS Grid system and monitored the feeds from external systems
Resolving issues related to production support and providing permanent fixes for defects
Participating in code review, data analysis, data issue fixing and code performance optimization
Building ad hoc reports for business customers using Toad/SQL developer
Created a shell script to migrate from CFI to HDFS proxy to pull data from the grid to Oracle.
Developed UNIX shell scripts and PL/SQL code to migrate the ETL process from SQL*Loader to external tables (see the sketch after this list).
Extensively worked on creating dynamic partitions and Indexes using metadata.
Involved in preparing test plans, unit testing, System integration testing, implementation and maintenance.
Worked with curl commands, grid-to-Oracle transfers, and shell constructs: if/then, case, cut, grep, crontab, find, xargs, sed, sftp, and nohup.
Using Informatica PowerCenter Designer, analyzed the source data to extract and transform it from various source systems (Oracle 10g, DB2, SQL Server, and flat files) by incorporating business rules using the different objects and functions that the tool supports.
Using Informatica PowerCenter, created mappings and mapplets to transform the data according to the business rules.
Used various transformations like Source Qualifier, Joiner, Lookup, SQL, Router, Normalizer, Filter, Expression, and Update Strategy.
Implemented slowly changing dimensions (SCD) for the Tables as per user requirement.
Implemented the Change Data Capture process using Informatica PowerExchange.
Created new mapping designs using various tools in Informatica Designer, including Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.
Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
Developed Workflows using task developer, Worklet designer and workflow designer in Workflow manager and monitored the results using workflow monitor.
Followed Agile methodology and used Jira to maintain tickets.
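Below is a minimal sketch of the SQL*Loader-to-external-table migration mentioned above; the directory path, file, table, and column names are illustrative assumptions.

-- Hypothetical directory object pointing at the feed landing area.
CREATE OR REPLACE DIRECTORY feed_dir AS '/data/feeds/incoming';

-- External table reads the flat file in place, replacing a SQL*Loader run.
CREATE TABLE search_feed_ext (
  feed_date   VARCHAR2(10),
  query_count NUMBER,
  source_id   VARCHAR2(30)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY feed_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('search_feed.dat')
)
REJECT LIMIT UNLIMITED;

-- The load then becomes a plain set-based insert.
INSERT /*+ APPEND */ INTO search_feed_stg
SELECT TO_DATE(feed_date, 'YYYY-MM-DD'), query_count, source_id
FROM search_feed_ext;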
DIRECTV / El Segundo, CA (Jun 2013 – Oct 2015)
TAOS (Tracking Activation & Offers System)
Senior PL/SQL Developer
Employer: Infosys
TAOS stands for Tracking Activation & Offers System; it is the system through which DIRECTV processes credits and commitments for its customers, and it is mainly used to track backend offers and commitments.
TAOS processes approximately $2 billion in credits per year, with approximately 1,200 to 1,300 jobs running 24/7.
Environment: UNIX, Windows
Technology: SQL Server, Oracle PL/SQL, Oracle Forms and Reports
Tools: Talend, SQL*Loader, and Autosys
Responsibilities:
Extensively worked on the Extraction, Transformation, and Load (ETL) process using PL/SQL to populate tables.
Developed ETL mappings in PL/SQL via packages, stored procedures, functions, views, and triggers.
Built physical data structures such as tables, views, constraints, and indexes.
Used Explain Plan, Analyze, hints, and Autotrace to tune queries for better performance, with extensive use of indexes.
Worked with SQL*Loader and control files to load data into tables in different schemas.
Used analytical functions; analyzed and generated reports for data warehouse and OLTP data; analyzed data for cleanup; maintained data warehouse data; and created materialized views and range and hash partitions.
Implemented features including materialized views for better performance of summary tables, autonomous transactions, and dynamic SQL statements.
Created database objects such as tables, joins, stored procedures, and SQL queries using Oracle utilities like PL/SQL, SQL*Plus, and SQL Server.
Designed and developed complex Oracle Report & Forms as per the client's requirement in Oracle 10g/11g.
Designed and developed UNIX shell scripts to get files via FTP and call PL/SQL procedures and packages.
Worked with shell constructs: if/then, case, cut, grep, crontab, find, xargs, sed, sftp, and nohup.
Worked on getting the current date, untarring, zipping, and moving files, emailing files, checking return codes, and calling other scripts. Created shell scripts to automate the execution of PL/SQL subprograms and to move data into historical folders.
Used FTP to transfer files to different servers as needed by the business users.
Involved in preparing functional documents, user documents, process flow diagrams, and flow charts.
Created test scripts and complex queries to test the application and quality-assure the data.
Created UI screens using Oracle Forms and created the related reports.
Involved in creation/scheduling of Autosys Jobs (JIL file generation)
Involved in preparing test plans, unit testing, System integration testing, implementation and maintenance.
Worked on logical, physical, and conceptual database design, data modeling, and data architecture using MS Visio and Talend as modeling tools.
Developed PL/SQL packages using FORALL, BULK COLLECT, and bulk variables (see the sketch after this list).
Developed control files for SQL*Loader and PL/SQL programs for loading and validating data into the database.
Used Exception Handling extensively for the ease of debugging and displaying the error messages in the application.
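Below is a minimal sketch of the BULK COLLECT / FORALL pattern referenced above; the cursor, table, and column names are illustrative assumptions.

-- Batch rows with BULK COLLECT ... LIMIT, then bind the whole batch with FORALL.
DECLARE
  CURSOR c_offers IS
    SELECT offer_id, credit_amt FROM stg_offers;
  TYPE t_offers IS TABLE OF c_offers%ROWTYPE;
  l_offers t_offers;
BEGIN
  OPEN c_offers;
  LOOP
    FETCH c_offers BULK COLLECT INTO l_offers LIMIT 1000;
    EXIT WHEN l_offers.COUNT = 0;
    -- One SQL context switch per batch instead of one per row.
    FORALL i IN 1 .. l_offers.COUNT
      INSERT INTO offer_credits (offer_id, credit_amt)
      VALUES (l_offers(i).offer_id, l_offers(i).credit_amt);
    COMMIT;
  END LOOP;
  CLOSE c_offers;
END;
/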
WARNER MUSIC GROUP / New York, NY (Nov 2010 – May 2013)
PL/SQL Developer
Employer: Infosys
Warner Music Group Corp. has four application groups: Financials, Global and Digital, Royalties, and DBA. The activities in the scope of the project include application maintenance, break/fix, enhancements, user assistance (on-demand job runs, user queries, data fixes, etc.), unit testing, support for QA testing, documentation of fixes, regular support for identified batch jobs, online applications, and interfaces, after-business-hours support for identified batch jobs, online applications, and interfaces, UAT support, and assistance in conforming to SOX compliance activities.
Royalty accounting is one of the major applications of WMG, handling the interfaces between Royalties and Financials. WMG is migrating its Royalties and Financial systems from Oracle to EROS and SAP. This system migrates the data to the new systems and also implements the interfaces for the new systems.
Technology: SQL Server, Oracle PL/SQL, Oracle Forms and Reports
Responsibilities:
Involved in gathering functional requirements from business users and application users for the Royalty interface.
Worked on Logical, Physical and conceptual design of Database Data Modeling and Data architecture with the use of Erwin and MS VISIO as a modeling tool.
Writing PL/SQL code using the technical and functional specifications.
Generated SQL and PL/SQL scripts to install, create, and drop database objects, including tables, views, indexes, sequences, and synonyms.
Wrote packages, functions, stored procedures, database triggers, and cursors based on requirements.
Developed complex queries to retrieve the data and created indexes for faster access of data.
Documented business rules for writing triggers and constraints, functional and technical design, test cases and user guides.
Worked on Database tuning and performance monitoring.
Performed Unit testing on queries and reports.
Used PL/SQL tables and global variables, as well as IN and OUT parameters with %TYPE, %ROWTYPE, PL/SQL tables, and PL/SQL records (see the sketch after this list).
Involved in debugging and error handling, creating indexes, passing hints, and analyzing tables.
Prepared and provided the functional specifications to Data services and SAP team.
Designed and developed the Royalty account status tracker to migrate data from the legacy system to SAP Artist Ledger and EROS.
Designed and developed a user interface to modify and accept/reject data before migration.
Created Oracle database objects and PL/SQL procedures, packages, and functions.
Worked with Transoft as the front end and SQL Server as the backend for the Royalty account status tracker.
Involved in integration testing and regression testing for Legacy to EROS and SAP Artist ledger migration.
Coordinating with Offshore team, assigning work to team members and status tracking.
Coordinating with users and other technical teams, resolving issues and status tracking.
Worked on migrating data from XML, CSV, and TXT files and Oracle databases using the Talend tool.
ETL design and development: created Talend and SQL mappings, sessions, and workflows to implement the business logic.
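Below is a minimal sketch of the %TYPE / %ROWTYPE and IN/OUT parameter usage noted above; the procedure, table, and column names are illustrative assumptions.

-- Anchored declarations keep parameters and locals in sync with the table.
CREATE OR REPLACE PROCEDURE get_royalty_status (
  p_account_id IN  royalty_accounts.account_id%TYPE,
  p_status     OUT royalty_accounts.status%TYPE
) IS
  l_row royalty_accounts%ROWTYPE;
BEGIN
  SELECT * INTO l_row
  FROM royalty_accounts
  WHERE account_id = p_account_id;
  p_status := l_row.status;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    p_status := 'UNKNOWN';
END get_royalty_status;
/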
WARNER MUSIC GROUP / Hyderabad, India (Jun 2006 – Oct 2010)
Oracle Developer
Employer: Infosys
Environment: UNIX, Windows 2000/XP
Technology: Oracle PL/SQL, HTML, JavaScript, shell scripting
Tools: Actuate, Appworx, PVCS, Remedy, VSS, Toad, Change Management, Quality Center and XML Spy
Responsibilities:
Involved in designing, developing, and unit and integration testing of the Trans Oracle application and migrating it into production.
Improved database performance for US Digital Sales, which exceed 2.5 terabytes; gathered requirements and created technical designs, database designs, and new business process improvements.
Gathered requirements for the Trans application from the business users and technical users.
Migration of the existing Oracle application from 9i to 11g.
Involved in gathering new business rules and designing, developing, implementing, and testing enhancements to existing reporting solutions for the business users.
Designed and developed complex packages, procedures, and shell scripts for eTrans sales and GL posting.
Planned and tracked the implementation of all scheduled reporting solutions, which were delivered to Business Users automatically.
Involved in processing monthly/weekly digital sales from the CRM application (eTrans) and publishing reports to business users for month-end sales posting and booking, using scheduled Oracle jobs and Actuate reporting tools.
Coordinating with Offshore team, assigning work to team members and status tracking.
Mentoring other junior members of the team.
EDUCATION:
Bachelor of Technology, JNT University, India