
Data Developer

Location:
Charlotte, NC
Posted:
November 10, 2020


PROFESSIONAL SUMMARY

Over ** years of programming experience as an Oracle PL/SQL Developer in Analysis, Design and Implementation of Business Applications using the Oracle Relational Database Management System (RDBMS).

Involved in all phases of the SDLC (Software Development Life Cycle), from analysis and design through development, testing, implementation and maintenance, with timely delivery against aggressive deadlines.

Experience with Data flow diagrams, Data dictionary, Data Structure, Database normalization theory techniques, Entity relationship modeling and design techniques.

Expertise in Client-Server application development using Oracle 19c/12c/11g/10g/9i/8i, Exadata, PL/SQL, SQL*Plus, OEM, AWR, TOAD and SQL*Loader.

Effectively made use of Table Functions, Indexes, Table Partitioning, Collections, Analytic Functions, Materialized Views, Query Rewrite and Transportable Tablespaces.

Strong experience with Data Warehouse, Data Mart and Data Lake concepts and with ETL (Informatica/Talend).

Good knowledge of logical and physical data modeling (relational and dimensional, Star and Snowflake schemas) using normalization techniques.

Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.

Excellent technical and analytical skills with clear understanding of design goals of ER modeling for OLTP and dimension modeling for OLAP.

Experience in Oracle supplied packages, Dynamic SQL, Records and PL/SQL Tables.

Loaded Data into Oracle Tables using SQL Loader.

Experience with Oracle Supplied Packages such as DBMS_SQL, DBMS_JOB and UTL_FILE.

Created Packages and Procedures to automatically drop table indexes and create indexes for the tables.

Worked extensively on Ref Cursor, External Tables and Collections.

Expertise in Dynamic SQL, Collections and Exception handling.

Experience in SQL performance tuning using Cost-Based Optimization (CBO).

Good knowledge of key Oracle performance related features such as Query Optimizer, Execution Plans and Indexes.

Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.

Experience in ETL techniques, analysis and reporting, including hands-on experience with reporting tools such as Oracle Reports, Tableau and MicroStrategy.

Created Shell Scripts for invoking SQL scripts and scheduled them using crontab.
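The shell-plus-crontab pattern mentioned above can be sketched as follows. This is a minimal illustration, not code from the resume: the script, package and connection names are hypothetical, and the SQL*Plus call is guarded in case the Oracle client is not installed.

```shell
#!/bin/sh
# run_nightly_load.sh - hypothetical wrapper that invokes a SQL script via SQL*Plus.
# A crontab entry like this would schedule it daily at 2 AM:
#   0 2 * * * /opt/app/scripts/run_nightly_load.sh >> /var/log/nightly_load.log 2>&1

SQL_SCRIPT="nightly_load.sql"
LOG_FILE="nightly_load.log"

# Generate the SQL script to run (contents are illustrative).
cat > "$SQL_SCRIPT" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
EXEC batch_pkg.run_nightly_load;
EXIT
EOF

# Invoke SQL*Plus only if the client is available on this host.
if command -v sqlplus >/dev/null 2>&1; then
    sqlplus -s "$ORACLE_USER/$ORACLE_PASS@$ORACLE_SID" @"$SQL_SCRIPT" > "$LOG_FILE" 2>&1
else
    echo "sqlplus not found; skipping database call" > "$LOG_FILE"
fi
```
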

Experience with Big Data: Hadoop, HDFS, MapReduce, YARN, Spark, HBase, Hive, Sqoop, Cloudera Hue.

Experience with Cloud: AWS (S3, Redshift).

Excellent communication, interpersonal, analytical skills and strong ability to perform as part of a team.

Technical Skills:

Databases: Oracle 8i/9i/10g/11g/12c/19c (SQL, PL/SQL, Stored Procedures, Triggers), MS SQL Server 2000/2005/2008, Sybase SQL Server 15.x, T-SQL 12.0/13.0/14.0, DB2/UDB, Teradata, SAP tables and MS Access.

ETL Tools: Informatica (PowerCenter 7.1/8.6.1/9.1.0), Talend (ODI and BigData Platform 5.x/6.x/7.x), Oracle Change Data Capture (CDC) and SQL*Loader.

Reporting Tools: Oracle Reports 2.5, Tableau 7.0 and APEX 5.0

Operating Systems: UNIX (Sun Solaris, LINUX, HP UNIX, AIX), Windows NT/98/95/2000 & Windows XP/10.

Data Modeling: Erwin 3.5.2, 4.0/ MS Visio 2010.

Languages/Utilities: SQL, PL/SQL, T-SQL, HIVEQL, UNIX shell scripts, Java, XML.

Other Tools: AutoSys, OEM, AWR, ADDM, GIT, Nexus.

EXPERIENCE

Sr. Database Developer/Data Analyst (Oracle and BigData)

Bank of America, Charlotte, NC

(Nov. 2018 – Present)

•Developing batch processing for the Credit Risk Platform application in Oracle Exadata (12c/19c).

•Played a major role in the data migration from NZ to Oracle Exadata for the Forklift application.

•Created programs to perform DQ checks to sustain Data Integrity.

•Created Stored Procedures, Functions, Packages, Triggers and database jobs (DBMS_JOB) using SQL and PL/SQL.

•Participated in Data modeling design and Techniques (ER Diagrams and UML).

•Performed performance tuning to optimize the batch process across multiple record sources.

•Delivered report queries to the reporting team for presentation in the Tableau dashboard of daily batch status, reducing dependencies on batch-monitoring windows.

•Analyzed Oracle objects and implemented optimization techniques to reduce the processing time of DB queries within packages and procedures (during batch runs), and designed Informatica jobs (during data staging) for multimillion-row volumes.

•Developed Informatica jobs to transfer data files (flat/XML/RDBMS) from an SFTP server to a local RDBMS (Oracle/SQL), load them into HBase (using the Sqoop utility) and perform analytics operations on the HBase tables using HiveQL.

•Contributed to the Java Mapper/Reducer/Driver code that pushes analyzed results back to HDFS, from where the data is loaded into local databases using Sqoop.

•Developing Informatica jobs to archive/retrieve data to/from HDFS using Data Exchange, applying retention policies.

•Used a Git repository for version control of the code branches.

•Monitoring the batch process to troubleshoot the production run.

•Participating in Scrum ceremonies: utilizing JIRA stories (user and technical), epics and Confluence pages; taking part in Planning Poker, sprint meetings and grooming sessions to assign points to stories; and creating IFS (Interface Functional Specification), design and UTP (Unit Test Plan) documents to apply the Agile process to the current project.

Environments: Oracle 12c/19c, OEM, DBATools, Sqoop, Toad 12, Informatica 9.6, Autosys, WinSCP, JIRA 7, Agile, Nexus, Hadoop 2, Cloudera 5.8.
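The Oracle-to-Hadoop data movement with Sqoop described in this role typically follows the pattern sketched below. This is an illustration only: the JDBC URL, schema, table and HDFS paths are hypothetical, and the sqoop call is guarded since the tool may not be installed.

```shell
#!/bin/sh
# Hypothetical Sqoop import of an Oracle table into an HDFS staging directory.
JDBC_URL="jdbc:oracle:thin:@//dbhost:1521/ORCLPDB"   # hypothetical connection
TABLE="CREDIT_RISK_POSITIONS"                        # hypothetical source table
TARGET_DIR="/data/staging/credit_risk_positions"     # hypothetical HDFS path

# Build the import command (password would normally come from --password-file).
SQOOP_CMD="sqoop import --connect $JDBC_URL --username batch_user \
  --table $TABLE --target-dir $TARGET_DIR \
  --num-mappers 4 --as-textfile"

# Record the command, then run it only where sqoop is available.
echo "$SQOOP_CMD" > sqoop_import.cmd
if command -v sqoop >/dev/null 2>&1; then
    $SQOOP_CMD
fi
```
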

Sr. Database Developer/Data Analyst (Oracle and Bigdata)

Bank of America, Charlotte, NC

(Sep. 2014 – Nov. 2018)

•Responsible for end-to-end design, development, enhancement, reporting, production support and migration, using the applications and tools of the CPM (Counterparty Portfolio Management) Technology team.

•Played a vital role by creating the packages, procedures, and functions for the development and automation of the FED CCAR Report.

•Wrote UNIX shell scripts to generate dynamic control (.ctl) files for SQL*Loader (sqlldr), and created Autosys jobs to trigger the scripts that load data from external text files into Oracle once a file arrives on the box.

•Responsible for creating data mappings, implementing transformation logic, creating and monitoring workflows, and tuning existing workflows to improve performance, using PowerCenter 9.6 to load data from various sources.

•Responsible for loading Oracle data into HDFS using Informatica 9.4 and Sqoop 1.4, and for performing data analysis operations using HiveQL.

•Utilized AWS S3 features (storage classes, PCU, MD5 checksums, versioning and bucketing) and its reliability to store data in the AWS cloud from local systems and HDFS.

•Created and managed internal/external tables in Hive 1.4 and 2.1, using optimization techniques to achieve load performance and analyze the structured data.

•Used Hive UDF, UDAF and UDTF functions, map-side joins and data sampling.

•Responsible for performance tuning of the packages, procedures, creation of the indexes, partitions, materialized view for the improvement of Oracle Application and process.

Environment: Oracle 11g/12c, Toad 11, Informatica 9.6, Hadoop 2.7.3, Sqoop 1.4, AWS S3, Cloudera 5.8, HIVE 1.4 and 2.1, WinSCP
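The dynamic control-file technique mentioned in this role can be sketched as follows. All names are hypothetical (feed file, staging table, column list), and the sqlldr invocation is guarded since the Oracle client may not be present; the point is only that the control file is regenerated so its INFILE tracks the daily file name.

```shell
#!/bin/sh
# Generate a SQL*Loader control file for today's feed, then load it.
DATAFILE="ccar_feed_$(date +%Y%m%d).txt"   # hypothetical incoming file name
CTL_FILE="ccar_feed.ctl"

# Build the control file dynamically; $DATAFILE is expanded into INFILE.
cat > "$CTL_FILE" <<EOF
LOAD DATA
INFILE '$DATAFILE'
APPEND INTO TABLE ccar_staging
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(account_id, exposure_amt, as_of_date DATE "YYYY-MM-DD")
EOF

# Invoke sqlldr only where the Oracle client exists.
if command -v sqlldr >/dev/null 2>&1; then
    sqlldr userid="$ORACLE_USER/$ORACLE_PASS@$ORACLE_SID" control="$CTL_FILE" log=ccar_feed.log
fi
```
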

Sr. Oracle PL/SQL Developer, ETL Developer, Reports Developer

EXP Pharmaceuticals, Fremont, CA

(Jan. 2012 – Sep. 2014)

•Assisted the team in designing the DWH; involved in business modeling (logical and physical models) using ERWin and MS Visio.

•Assisted the DBA team in creating the facts and measures for the DWH (in a Snowflake schema).

•Designed the process to interchange data between the corporation and its trading partners using Oracle PL/SQL via an AS2 (Applicability Statement 2) connector.

•Created new and manipulated existing forms using Oracle Apex.

•Created Reports, using Tableau/APEX.

•Wrote cron jobs to download data from SFTP and load it into Oracle tables using SQL*Loader control (.ctl) files.

•Wrote UNIX scripts to create dynamic control (.ctl) files for loading data into Oracle tables via SQL*Loader, executed through cron jobs, and used ETL tools for data integration (Informatica 9.1).

•Wrote and updated oracle jobs using DBMS_JOB package.

•Assisted the team in enhancing data readability from the database to the APIs supporting web services, for better communication between multiple UI portals.

•Wrote and supported FAE (Field Application Engineering) jobs, a system for marketing/sales reps to gather information from end customers, and designed reports on performance, profitability and data analysis.

•Tuned Oracle procedures using ANSI standards to minimize cost and maximize performance.

•Created and maintained materialized views for quick Oracle Apps reports for the reporting team (the materialized views are then used by the .NET team for its Excel report extraction).

•Wrote Shell Scripts for calling Oracle Programs in HP-Unix environment

•Production Support and Quality Control

Environment: Oracle 10g/11g, Informatica 9.1, Tableau 7.0, APEX 4.1, ERWin 7.3, MS-Visio 2010, HP-Unix
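Job scheduling through the DBMS_JOB package, as used in this role, follows a pattern like the sketch below. The job procedure name and schedule are hypothetical, and the SQL*Plus call is guarded in case the client is absent; the shell wrapper simply writes and submits the PL/SQL block.

```shell
#!/bin/sh
# Write a SQL script that submits a nightly job via DBMS_JOB, then run it if sqlplus exists.
cat > submit_job.sql <<'EOF'
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  -- Schedule refresh_reports (hypothetical procedure) nightly at 1 AM.
  DBMS_JOB.SUBMIT(
    job       => l_job,
    what      => 'refresh_reports;',
    next_date => TRUNC(SYSDATE) + 1 + 1/24,
    interval  => 'TRUNC(SYSDATE) + 1 + 1/24');
  COMMIT;
END;
/
EXIT
EOF

if command -v sqlplus >/dev/null 2>&1; then
    sqlplus -s "$ORACLE_USER/$ORACLE_PASS@$ORACLE_SID" @submit_job.sql
fi
```
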

Sr. Oracle PL/SQL Developer/ ETL Developer

Time Warner Cable, Charlotte, NC

(Aug. 2010 – Dec. 2011)

•Responsible for the design, concept, modeling and implementation of the GIA (Go It Alone) project's OLAP (MOLAP) solution, based on stakeholders' requirements.

•Attended meetings with Business to gather the requirements and performed analysis.

•Assisted DBAs to design the tables along with its constraints.

•Assisted in the OLTP environment for performance tuning.

•Used Informatica for ETL and performed testing.

•Developed procedures for sanity checks based on T-0 to T-n rules/requirements

•Developed PL/SQL scripts for data loads using dynamic SQL*Loader scripts, and created UNIX triggers for automatic loading.

•Daily Report Generation of Telephone Migration Project Status using Oracle Reporting tools.

•Production Support and Quality Control.

•Prepared the test environment for the QA team to test the developed application end to end.

•Migrated scripts from Perl to an Oracle PL/SQL package.

•Developed tools to populate test data for Quality Assurance team

•Developed a framework for extensible automated production support in PL/SQL

•Documented the user guide for the system

Environment: Oracle 10g, ERWin, MS-Visio 7, Informatica 8.6, Oracle Report 6, HP-Unix, Windows NT

Sr. Technical (OBIEE) Developer

Dunkin' Donuts, Canton, MA

(Feb. 2009 – July 2010)

•Interacted with Business Analyst for analyzing the business requirements.

•Designed and built the data model; designed and customized Star and Snowflake schemas by extending fact and dimension tables using the Erwin tool.

•Assisted in functional support for all retail products

•Supported Oracle retail management

•Involved in the installation of OBIEE on both the windows and UNIX client systems.

•Involved in developing the OBIEE process design document.

•Involved in identifying the data sources and the granularity level of data populated in the sources.

•Played a major role in designing the dimensional model (star schema) for the enterprise data warehouse and ODS.

•Built metadata in the analytics repository using OBIEE Administration Tool in physical, business and presentation layers.

•Created dimensional hierarchies.

•Developed time-series measures to compare the various case types of service requests across months, quarters and years.

•Created connection pools, imported physical tables and defined joins in the physical layer of the repository.

•Implemented Basic Security and LDAP server Authentication.

•Generated Reports and Dashboards by using Report features like Pivot tables, charts and view selector.

•Created dashboards with global prompts, column selectors, navigation links, and automatic drill-down.

•Created materialized views and aggregates to improve performance of the queries generated by BI server.

•Worked on OBIEE publisher to publish the reports and dashboards to the users in different formats.

•Interacted with the ETL team and contributed to developing reusable and non-reusable mappings and source and target definitions using Informatica PowerCenter.

•Documented the Developed design for intuitive understanding and future reference.

•Worked closely with users, developers and administrators to resolve ongoing production issues, reviewing design changes made to production systems and making corresponding changes to the repository.

Environment: Oracle BIEE 10.1.3.4, Informatica PowerCenter 8.1.6, Oracle 10g, PL/SQL, DB2, Windows 2003 Server, IBM AIX, Oracle Retail RETEK (11.x/10.x/9.x/8.x/7.x)

Oracle PL/SQL Developer

Branch Banking & Trust (BB&T), Winston Salem, NC

(Dec. 2007 – Jan. 2009)

•Designed the ETL processes using Informatica 6.2 to load data from flat file sources into the staging area.

•Developed mappings using Informatica for data loading in dimensions.

•Developed mapping to load the data in staging area tables.

•Used Debugger to troubleshoot the mappings.

•Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Filter, Sorter and Source Qualifier.

•Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.

•Defined Target Load Order Plan for loading data into Target Tables.

Environment: Oracle 9i, Informatica 6.2, Business Objects 5.1, Windows 2000 Professional

Oracle PL/SQL Developer

Accelrys Software Solutions, San Diego, CA

(Jan. 2006 - Nov. 2007)

•Gathered requirements for the product.

•Designed the DB for the applications using ER-Studio

•Performed technical and functional reviews

•Designed and developed database tables, triggers, cursors, procedures, functions and packages to meet business requirements.

•Developed PL/SQL scripts to change server data across all components in the system.

•Developed triggers and PL/SQL procedures to automate job scheduling.

•Developed a tool, 'Meta-Load', to load bulk data, using core Java, the Eclipse IDE, JSP, PL/SQL and HTML.

•Supported Installation and configuration of Oracle 10g/9i

•Quality assurance

•Prepared the user manual and help documentation for the product using RoboHelp.

•Trained the team on domain and technology used

•Code review in PL/SQL procedures, functions and triggers

•Managed performance and tuning of SQL queries, and fixed slow-running queries in production with utilities such as Explain Plan, trace and stored outlines.

•Prepared schema statistics and prepared data by import/export from external databases into the development and quality-control databases.

Environment: Windows NT, Solaris 2.6, Oracle 10g Database, Oracle 10g Application Server, MS VSS (for source code control), RoboHelp, Meta-Load, XML, JavaScript, J2EE, Servlets, JDeveloper, JDBC, Internet Explorer 6.0, Mozilla, Java Web Server 2.0

Senior Oracle Consultant, PL/SQL Developer / DBA

Leo Technologies – Amritsar, India

(Jan. 2001 - Dec. 2005)

•Developed a framework for extensible automated production support in PL/SQL.

•Wrote large, complex PL/SQL procedures and functions to efficiently perform complex business logic for summarizing data.

•Created spreadsheets and reports using Oracle PL/SQL, Toad and SQL.

•Performed key functions in the database/application migration from 7.3 to 8.1.3; converted/updated hundreds of applications to be 8.1.3-compatible.

•Maintained the system and wrote Oracle SQL scripts to monitor batch processing for early warning of incorrect data processing.

•Installed the OS, software, applications and products on 5 Windows NT servers.

•Maintained Lotus applications and DB2 servers in clustered mode.

•Allocated system storage and planned future storage requirements for the database system.

•Planned for backup and recovery of database information.

•Managed primary database structures (tablespaces) and primary objects (tables, views, indexes).

•Exported the database and imported it into the development and test environments whenever required.

•Created database users and roles, managed security, and implemented effective database security practices and procedures.

•Ensured compliance with the Oracle license agreement.

•Controlled and monitored user access to the database.

•Monitored and optimized the performance of the database.

•Maintained archive data on the TSM server using Tivoli.

•Performed disaster recovery.

•Contacted Oracle Corporation for technical support.

•Maintained user access and antivirus definitions for all the 2000 systems.

•Involved in the configuration of the SAN switch.

•Maintained version control using CIAO.

Environment: RS/6000 AIX 4.3.3, IBM Netfinity Server, Oracle 7.0, Oracle 8.0, Oracle Enterprise Servers, Toad, SQL*Plus, Domino R5 5.0.8 Server, Domino R6 6.0 Server, Intel x255-series servers, LTO 3853, FAStT 200, TSM server, AIX, CIAO

EDUCATION / CERTIFICATIONS

Bachelor of Commerce

Graduate of NIIT (National Institute of Information Technology)

Oracle 10g Certification from Oracle University


