Resume


Data Oracle

Location:
Portland, Oregon, United States
Posted:
May 03, 2019


Possessing * years ** months of IT experience as a Hadoop Administrator on the Cloudera distribution, 3 years as an Oracle DBA, and 18 years of total IT experience spanning Linux/Unix, HDFS, YARN, Hive, Impala, Sentry, Sqoop, Flume, HBase, Kafka, Spark, Kerberos, ZooKeeper, LDAP, TLS encryption, HDFS encryption, KMS configuration, rack assignment in Hadoop, cluster tuning/monitoring, AWS, Oracle SQL and PL/SQL in OLTP and OLAP environments, Unix shell scripting, Oracle Warehouse Builder (ETL), Brio (OLAP), MySQL, DB2, Autosys, Nagios, Oracle Apex 4.2, Jenkins, Maven, SVN, CVS, Git, JIRA, Java, JavaScript, REST APIs and Agile (Scrum/Kanban) processes for project development.

Professional Summary

Possessing 2 years 10 months of Hadoop administration experience using Cloudera as the cluster management software.

Possessing 3 years of experience as an Oracle DBA and 18 years of total IT experience in analysis, design, development and maintenance of Data Warehouse/ OLTP/Web based E-commerce applications in Retail, Telecom, Health Care, HR, Supply Chain Management and Banking domains.

Worked as a tech lead and developer on several projects for various client organizations.

Hands on experience in managing, administering, tuning and troubleshooting of Hadoop cluster using Cloudera.

Acquired knowledge on various Hadoop concepts and features by becoming a member of Cloudera forums.

Expert in system capacity management, i.e. memory, I/O and disk space management.

Expert in installing and upgrading Hadoop components and dependencies like JDKs and OS packages if required.

Expertise in Unix activities, i.e. cron jobs, package installations, shell scripting and implementing Unix security features.

Hands on experience implementing Kerberos security features for Hadoop components.

Expertise in administering LDAP, Kerberos principals and keytabs for user and service authentication, and in implementing authorization features such as POSIX permissions, HDFS ACLs and Sentry roles.

Deep understanding of YARN container and resource pool architecture and of the resource management tools in a Hadoop cluster, i.e. the Fair, Capacity and FIFO schedulers.

Thorough knowledge and hands on experience implementing encryption at various levels, i.e. OS, HDFS and network, configuring public CA certificates, and configuring the KMS required for data encryption.

Thorough knowledge of managing and backing up Namenode metadata and of cluster disaster recovery options.

Hands on experience as a Database developer in building and deploying applications using Java/J2EE technologies.

Thorough knowledge and hands on experience with AWS features, i.e. IaaS, Regions, Availability Zones, Virtual Private Cloud, Subnets, Route Tables, NACLs, Security Groups and IAM.

Deep understanding of AWS services, i.e. EC2, S3, EBS, Glacier, RDS, Redshift, Amazon Aurora and DynamoDB.

In depth knowledge of data backup/recovery, data integration and data archiving methodologies.

Proven track record of building applications using Oracle SQL, PL/SQL, BRIO 8.0, Oracle Warehouse Builder 10.2, Unix (Linux/Aix/Sun Solaris) and Erwin.

Highly proficient in PL/SQL programming in OLAP/OLTP environments, and in Oracle query tuning and instance-level tuning.

Hands on experience as a data architect/modeler/analyst in OLAP and OLTP environments.

Hands on experience with configuration management tools like Git, ClearCase, Tortoise CVS, SCM, VSS, SVN and TFS.

Data modeling experience in designing Entity-Relationship and Star Schema data models.

Experience in QMS, defect prevention methodologies, HIPAA and data security features.

In depth knowledge of agile methodology approaches for project management.

Versatile team player with strong programming and problem solving skills. Extremely focused and able to quickly identify and respond to priority issues.

Possessing global IT experience, a drive to learn new technologies, and the ability to prioritize issues and handle critical tasks during software development and maintenance of existing projects.

Extensive working experience in Application Development with expertise in full software development life cycle.

Excellent technical and communication skills.

Certifications

AWS Certified Cloud Practitioner, Verification URL: https://aw.certmetrics.com/amazon/public/verification.aspx

License: 0TPSH491D2441S58

Cloudera Certified Hadoop Administrator, Verification URL: http://certification.cloudera.com/verify/

License: 100-020-783

Oracle PL/SQL Developer Certified Associate in 2006.

Education Summary

Certified course in Oracle Warehouse Builder from Oracle School, India in 2006.

Certified course in AS/400 from Pentasoft, India in 1999.

Diploma in advanced computer applications from Vyshnavi Software Consultancy, India in 1999.

Post Graduate diploma in computer applications from Prime Computers India in 1995.

Bachelor of Engineering from Andhra University, India in 1994.

Skills Summary

Languages : SQL, PL/SQL, Java, C, C++, COBOL, Fortran 77, ILE RPG, ILE CL.

GUI : Oracle Apex 4.2, Developer 2000, Screen Design Aid.

Operating Systems : AIX, Linux, CentOS, DOS, OS/400, Windows NT, Windows 2000, Windows XP.

Databases : Oracle 9i, Oracle 10g, Oracle 11g, MySQL, DB2 UDB 7.0, DynamoDB.

ETL & OLAP Tools : Oracle Warehouse Builder 10.2, Oracle Data Integrator (ODI), Brio 8.0.

Database Tools : Toad, Benthic, SQL Navigator, SQL Developer, PL/SQL Developer, Erwin

Scripting Languages : UNIX Shell Scripting, JavaScript, HTML, XML, Perl 6.0

Organizational Tools : JIRA, SVN, GIT, Tortoise CVS, Clearcase, VSS, Win CVS, SCM, MS Project.

Application Tools : Putty, Cygwin, Visual Inter Dev, MS Visio, WinScp, WsFTP.

PC Skills : MS Word, Excel, MS PowerPoint

Project Experiences

Sep 2015 – July 2018

System: Enterprise Data Analytics for Nike

This big data application builds an enterprise-level data store for Nike, using various Hadoop technologies to ingest, store and retrieve data in various formats for business and analytical operations. The platform stores data from various sources, i.e. Nike product forecast systems, planning systems, product creation databases, the marketplace data store, SAP inventory systems, order management, pricing systems, point of sale and EDI systems, for all regions across the globe and for all lines of business, i.e. Footwear, Apparel, Equipment and Visual Merchandise. The application continuously evolves with industry-standard, cutting-edge technologies and also supports product data feeds from non-Nike systems.

Environment

Cloudera Enterprise 5.16.1, Oracle JDK 7, CDH 5, RHEL 7.6, Kerberos 5, LDAP, ZooKeeper, Hive, Impala, Hue, Spark 1.6, Python, HttpFS for REST APIs, Apache Sentry, Pig, Sqoop, Flume, HBase, Kafka, Oozie, Cloudera Search, Oracle 11g, JIRA, WinSCP, Putty, SVN, Git.

Contribution

Deploying and administering Cloudera manager, CDH and other Hadoop services on Linux environment.

Installing and validating Kerberos client packages and OS packages like SSH and the Java JDK, performing Linux OS configurations as suggested by Cloudera (e.g. disabling access time), and setting up passwordless SSH from the Namenode and Cloudera Manager servers before adding new nodes to the cluster.

Validating LDAP user credentials and creating the Kerberos principals users need to access cluster nodes and services.
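The LDAP-validation and principal/keytab workflow described above can be sketched as follows. This is an illustrative outline for an MIT Kerberos setup, not the exact commands used on this cluster; the realm, host, user and file names are hypothetical.

```shell
# Confirm the user exists in LDAP before provisioning (names are examples)
ldapsearch -x -H ldaps://ldap.example.com -b "dc=example,dc=com" "(uid=jdoe)"

# Create a user principal and a service principal in the Kerberos KDC
kadmin -p admin/admin -q "addprinc -randkey jdoe@EXAMPLE.COM"
kadmin -p admin/admin -q "addprinc -randkey hdfs/worker01.example.com@EXAMPLE.COM"

# Export the service principal's keys to a keytab and verify its contents
kadmin -p admin/admin -q "xst -k hdfs-worker01.keytab hdfs/worker01.example.com@EXAMPLE.COM"
klist -kt hdfs-worker01.keytab

# Authenticate as the service non-interactively using the keytab
kinit -kt hdfs-worker01.keytab hdfs/worker01.example.com@EXAMPLE.COM
```

These commands require a reachable KDC and LDAP server, so they are shown as a sketch rather than a runnable script.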

Granting users the privileges needed to access servers and other Hadoop services, and revoking them as required.

Creating and administering Sentry roles, directory snapshots and HDFS ACLs to manage data integrity in HDFS.
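A minimal sketch of the ACL and snapshot administration mentioned above, using standard HDFS commands; the role, group, user and path names are hypothetical examples, not taken from this environment.

```shell
# Sentry grants are issued through HiveQL (e.g. via beeline), for example:
#   CREATE ROLE analyst_role;
#   GRANT ROLE analyst_role TO GROUP analysts;
#   GRANT SELECT ON DATABASE sales TO ROLE analyst_role;

# Add an HDFS ACL entry for a service account, then inspect the result
hdfs dfs -setfacl -m user:etl_svc:rwx /data/sales/raw
hdfs dfs -getfacl /data/sales/raw

# Enable snapshots on the directory and capture one before a risky change,
# giving a point-in-time copy to restore from if a load corrupts the data
hdfs dfsadmin -allowSnapshot /data/sales/raw
hdfs dfs -createSnapshot /data/sales/raw before_reload
```

All of these commands assume a running HDFS cluster (and Sentry service), so they are illustrative only.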

Enabling high availability for the Namenode and YARN services, configuring the Namenode to write its metadata to two locations on different disks, and regularly backing up Namenode metadata.

Performing Linux administration tasks like partitioning, formatting and mounting when required.

Resolving network and Kerberos connectivity issues in a timely manner so application code runs smoothly.

Performing thorough research and configuring the YARN container minimum memory size and minimum number of cores, Datanode heap size, HDFS cache size, Datanode disk space for mapper/reducer intermediate output, Namenode heap size, preemption, dynamic resource pools for various applications, and many other CDH parameters to improve cluster performance.

Debugging and troubleshooting service role logs (Spark, Sqoop, Flume, Kafka etc.), finding the root cause of process failures, taking immediate action to resolve issues temporarily, and planning and executing long-term actions to avoid such failures in the future.

Setting up alerts, and analyzing and configuring charts/reports/dashboards using Cloudera managed services to monitor system health and resolve issues in a timely manner to meet SLAs.

Creating and managing role groups as needed, and adding, decommissioning and removing nodes in the cluster.

Capacity planning in terms of memory, disk and I/O for new applications and for incoming data in HDFS.

Manage various Hadoop services, upgrade Cloudera Manager, CDH and other services as needed.

Rebalancing the cluster when new nodes are added, and periodically at off-peak hours (usually on weekends), to redistribute data blocks until the cluster's rebalancing threshold is reached.
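The rebalancing step above maps to the standard HDFS balancer tool; the threshold and bandwidth figures below are assumed example values, not settings stated in this resume.

```shell
# Run the balancer until no datanode's utilization differs from the
# cluster average by more than 10 percentage points
hdfs balancer -threshold 10

# Optionally cap balancer bandwidth per datanode (bytes/sec) so the
# rebalance does not starve production jobs; 50 MB/s is an example value
hdfs dfsadmin -setBalancerBandwidth 52428800
```

Running the balancer at off-peak hours, as described above, limits its I/O impact on scheduled jobs.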

Coordinating with other business teams, e.g. infrastructure, network, database and application teams, to work toward quick resolution of issues when needed.

Worked 24/7 on-call rotational production support for the Cloudera cluster, using JIRA as the SAFe implementation tool and prioritizing JIRA tickets based on severity and SLAs.

Coordinating with the Cloudera support team to minimize system downtime during patching and upgrade activities.

Jan 2015 – Aug 2015

System: Product Engine Solutions for Nike

This project, Product Engine Solutions (PES), is a centralized data integration solution built on an Oracle database, intended to manage and maintain Nike's core product database. It serves as an upstream data source for point of sale, Nike product forecast and planning systems, GoToMarket (GTM), EDI systems, data warehouses, physical warehouse systems, SAP and pricing applications for all regions across the globe. Nike's Product Engine Solutions bring together technology, data and product development processes to deliver solutions that support and enable Nike's global products for Footwear, Apparel, Equipment and Visual Merchandise. PES supports product creation from innovation through product release and feeds product information to all of Nike's other systems.

Environment

Oracle 11g on standalone and cloud servers, Jenkins, Maven, Flyway database plugin, SVN, Git, Stash, JIRA, Toad for Oracle, Linux, Apex 4.1, WinSCP, Putty, SAP MDM, Autosys and ServiceNow 2.0.

Contribution

Understanding retail domain functional concepts and business processes.

Building new and modifying several existing PL/SQL and SQL objects, Unix shell scripts, Autosys batch jobs and Apex applications in PES to meet various business requirements.

Resolving SQL and PL/SQL code-level and Oracle instance-level performance issues as they arise.

Implement advanced Oracle features in data integration programs and implement new ideas to build optimal software code.

Create technical specification scripts for data integration, migration, and conversion activities.

Learning and implementing Git/Stash/SourceTree as a new set of version control tools and providing training to other team members.

Acquiring expert knowledge of JIRA as a SAFe implementation tool and providing training to other team members on SAFe/JIRA processes.

Analyze and debug various UNIX shell scripts when batch failures occur.

Developing and modifying PES dashboard applications built in Oracle Apex 4.2 using Apex forms and reports.

Solving/answering various process-related issues received from other teams at Nike and from third-party vendors.

Feb 2013 – Jan 2015

System: Work Assure for Arris

This project, Work Assure (WA), is an extensive workforce management application intended to maintain and analyze the status of tech support tickets created in the cable industry when there is an issue with internet/TV/phone connections, and to measure technician performance. The application is widely used to provide the shortest route to customers' locations (homes, offices etc.) using Google Maps and to keep track of tickets logged by customers.

This application also maintains the number of hours technicians worked to resolve issues and tracks the various statuses of tickets. It provides an online web-based reporting system used to analyze technician performance and ticket statuses by pulling information from an Oracle database. The WA dashboard application presents these metrics as graphs/charts used by dispatchers and managers of MSOs (multiple system operators) to diagnose and improve their service quality.

Environment

Oracle 10g/11g, Windows 2003/2008, JIRA, TFS, PL/SQL Developer, Toad for Oracle, Windows 7, Remote Connect, Microsoft Visual Studio 2012, Java Script, ODBC, MSMQ, SOAP, Web services, XML.

Contribution

Understanding internet/telephony domain functional concepts and business processes.

Working on building new PL/SQL objects when enhancements are required in WA system.

Working on various modifications to PL/SQL and SQL objects in WA to meet new business requirements.

Working on upgrades and installations of WA Oracle database.

Implementing various performance tuning techniques in Oracle queries of WA application.

Working on solving/answering various process related issues received from the clients.

Provide technician related data to the clients after performing data reconciliation and quality analysis.

Worked on database installations, backup, recovery and data loading activities.

Performing data visualization of work orders assigned to technicians via several Work Assure application components.

Maintaining different database jobs that perform various aggregations of the performance metrics collected in WA.

Working 24/7 on-call rotational production support as the DBA for the WA application.

Oct 2011 – Jan 2013

System: Serve Assure Advanced for Arris

This project, Serve Assure Advanced (SAA), is an extensive application intended to maintain and analyze performance metrics of DOCSIS network components like cable modems, STBs (set-top boxes), MTAs (multimedia terminal adapters) and CMTSs (cable modem termination systems) in the internet/telephony domains, widely used by clients like Time Warner Cable, Cablevision NY, Comcast, UPC (Europe), Cablevision Argentina and many more. SAA gathers information about service failures and various performance metrics related to the DOCSIS elements mentioned above, and stores aggregated metrics measured at different dimension levels in the Oracle database. The UI part of SAA presents the performance metrics and service failures as graphs/charts used by MSOs (multiple system operators) and technical support teams to diagnose and improve their service quality.

Environment

Oracle 11g, Solaris 10, Linux, JIRA, SVN, PL/SQL Developer, Toad for Oracle, Windows 7, Remote Connect, Java/J2EE, JDBC, JMS, SOAP Webservices, XML, Eclipse, LDAP security server.

Contribution

Understanding internet/telephony domain functional concepts and business processes.

Working on building new PL/SQL objects when enhancements are required in SAA system.

Working on various modifications to PL/SQL and SQL objects in SAA to meet new business requirements.

Working on manual/automated upgrades and installations of SAA Oracle database.

Implementing various performance tuning techniques in Oracle queries of SAA application.

Working on maintaining/enhancing various shell scripts required to deploy SAA on different OS architectures.

Working on solving/answering various process related issues received from the clients.

Maintaining different database jobs that perform various aggregations of the performance metrics collected in SAA.

Taking part in data loading and data analysis activities required to gauge the performance of SAA data warehouse.

Making changes to SAA release scripts to support automated upgrades and installations of the Oracle database.

Working on 24/7 on-call rotational production support for SAA application.

Nov 2010 – Sep 2011

System: Continuous Product Engineering (CPE) - OMS Finance for Walmart.com

This project, CPE OMS Finance, is intended to maintain and enhance the payment processing system for Walmart.com, a robust e-commerce web application. OMS Finance processes payment transactions for various payment methods like credit card, debit card, BML, PayPal, physical gift card and e-gift card. The system performs various payment actions like Real-Time AUTH, Batch AUTH, BILL, REFUND, AUTH Void, Fraud Check, VBV verification and applying associate discounts. The project also maintains various payment processing subsystems like Value-Link, INCOMM, Red, VBV, Risk Server, the Credit Card Info DB and Live Processor.

Environment

Oracle RAC 11g, Perl 6.0, SQL Loader, Unix, Linux, Solaris, CVS, BMC Remedy, PL/SQL Developer, Toad for Oracle, Windows 2000, Remote Connect, Java/J2EE, JDBC, SOAP Webservices, Eclipse, Nagios, LDAP security server, MySQL.

Contribution

Understanding Retail/E-Commerce business concepts and business processes.

Worked on enhancing/maintaining the existing Perl CGI (Common Gateway Interface) modules.

Worked on building PL/SQL objects when enhancements are required in the OMS Finance system.

Involved in data loading and data analysis activities to populate the source tables when price adjustments are needed.

Worked on 24/7 production support for OMS Finance (payment processing) system.

Implemented various performance tuning techniques in Oracle queries in an OLTP environment.

Worked on solving/answering various process related issues from business teams.

Worked on scheduling/maintaining various shell scripts that execute critical payment processes.

Working on making changes to XML web service calls from PL/SQL for sales/shipping/wrap tax calculations.

Jun 2010 – Nov 2010

System: Delta Claim System (DCS) for Delta Dental, U.S.A

This project, DCS (Delta Claim System), with Delta Dental is intended to generate the several types of invoices, bills and reports produced by Delta's weekly/monthly billing processes for dental insurance claims. The DCS system contains several components used to maintain provider, claim, group and product information. The project involves changing various monthly/weekly billing processes to populate the data sources that serve as repositories for different reports, and enhancing the DCS processes that generate various invoices and bills.

Environment

Oracle RAC 11g, SQL Loader, Unix, SVN, Toad for Oracle, Windows 2000, Remote Connect, Dot Net.

Contribution

Understanding dental insurance business concepts and business processes.

Involved in data loading and data analysis activities to populate different repository tables in DCS.

Worked on building PL/SQL objects for data extraction and data loading activities.

Worked on data loading using SQL Loader.

Worked on building PL/SQL objects to generate various invoices/bills/reports from Oracle databases.

Implemented various performance tuning techniques in Oracle queries.

Jan 2010 – Jun 2010

System: Home Affordable Modification Program for JP Morgan Chase, U.S.A

This project with JPMorgan Chase implements the Obama administration's HAMP program, introduced to help delinquent home loan borrowers avoid foreclosure during the economic downturn. Its main objective is to fetch and provide various kinds of loan-level and portfolio-level information (payments, property details, broker/servicer/investor info) for loans that become eligible under the HAMP program based on guidelines set by the U.S. government. The project involves fetching loan-level info (eligible for the HAMP/Chase modification program) from different applications, i.e. TMQ/APC (Chase pre-qualification applications), AgentDesktop/Modcalc (Chase underwriting applications) and production tables that contain loan-level transactional data, and populating various HAMP tables. These HAMP tables are then used to provide loan details to the U.S. Treasury and to the GSE organizations (Fannie Mae and Freddie Mac).

Environment

Oracle RAC 11g, SQL Loader, Linux, Rational Clearcase, Toad for Oracle, Windows 2003, Remote Connect.

Contribution

Understanding mortgage business concepts and business processes.

Involved in data mining and data analysis activities to populate different repository tables and to provide data for ad-hoc requests from the U.S. Treasury and the Chase operations team.

Worked on building PL/SQL objects for data extraction and data loading activities.

Worked on data loading using SQL Loader.

Took part in building free-form and pivot reports in Excel for data analysis.

Worked on creating Unix shell scripts to schedule PL/SQL procedures that perform various data extracts from source databases.

Implemented various performance tuning techniques in Oracle queries.

May 2007 – Dec 2009

System: Cost Management System for Multiplan, USA.

This project with Multiplan maintains and enhances a cost management system through which Multiplan provides best cost management practices in the health care domain. Multiplan, a leader in cost management practices, acts as a mediator between insurance companies and its network subscribers (doctors and hospitals); it has major insurance companies as clients and renowned doctors and hospitals across the country as network subscribers. The project involves repricing various kinds of medical claims received from the insurance companies based on the prices contracted with its subscribers. It also involves various data warehouse activities to extract provider/ratecode data from the source systems and generate the flat files required by the clients (insurance companies), as well as developing various PL/SQL objects through which provider information can be extracted, modified and passed to clients per their requirements.

Environment

Oracle 10g, SQL Loader, Linux, Cold Fusion, SCM, Pervasive, Toad for Oracle, Windows 2003, Microsoft Remote Desktop Connection, JIRA CR tool, HP quality center, Java/J2EE.

Contribution

Senior Consultant (Team Size: 7)

Involved in creating various PL/SQL objects, preparing UTC scripts, Technical specification scripts, deployment documents for various objects.

Worked on rebuilding various PL/SQL objects using Oracle collections to implement batch processing techniques, improving performance and reducing processing time.

Worked on building various Application Programming Interfaces (APIs) using PL/SQL that would be accessed by different applications.

Worked on gathering requirements for data warehouse/web based applications and took part in various data architecture activities.

Extensively worked towards designing data models for different data marts and web based applications.

Extensively worked on data loading using SQL*Loader and on automatically generating SQL*Loader control files.
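The SQL*Loader workflow above can be illustrated with a minimal control file and invocation. The table, column, file and credential names here are hypothetical, and a generator script would emit such control files from table metadata rather than by hand.

```shell
# Write an example control file describing a pipe-delimited provider feed
cat > providers.ctl <<'EOF'
LOAD DATA
INFILE 'providers.dat'
APPEND INTO TABLE providers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(provider_id, provider_name, specialty_code,
 effective_date DATE 'YYYY-MM-DD')
EOF

# Invoke SQL*Loader; rejected rows go to providers.bad, details to the log
sqlldr userid=load_user/load_pwd@orcl control=providers.ctl \
       log=providers.log errors=100
```

Since this is a configuration sketch requiring an Oracle instance and input file, it is illustrative rather than runnable as-is.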

Took part in building PL/SQL objects for data extraction and loading into the data repositories.

Worked on developing Unix shell scripts used to schedule PL/SQL procedures that perform various data extracts from different databases.
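A typical shape for the scheduling scripts described above is a small wrapper that cron runs and that invokes the PL/SQL procedure through sqlplus. The schema, package and path names below are hypothetical placeholders.

```shell
#!/bin/sh
# Hypothetical nightly extract wrapper (e.g. /opt/etl/run_extract.sh);
# ETL_PWD is assumed to be supplied via the environment, not hard-coded.
LOG=/var/log/etl/extract_$(date +%Y%m%d).log

sqlplus -s "etl_user/${ETL_PWD}@orcl" >> "$LOG" 2>&1 <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
EXEC provider_pkg.run_daily_extract;
EXIT
EOF

# Sample crontab entry to run the wrapper at 01:30 every day:
# 30 1 * * * /opt/etl/run_extract.sh
```

WHENEVER SQLERROR EXIT FAILURE makes the script return a nonzero exit status on database errors, so cron-level monitoring can catch failed extracts.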

Worked on implementing Oracle Streams for messaging claim IDs, which are then processed by the claim processing engines.

Worked on preparing the process flow documents for the existing processes and documenting standard procedures to maintain the applications.

Worked on understanding the existing provider database and providing various kinds of provider information to the clients as and when needed.

Implemented various technical methods to capture change data for providers and ratecodes/rate schedules.

Worked for various Change Requests logged in JIRA change management application which involve different design changes to the data models and PL/SQL objects.

Implemented various performance tuning techniques in Oracle queries and instance level tuning techniques.

Worked on resolving various defects logged in HP’s quality center.

Worked on creating Perl scripts to validate huge data files.

Involved in Project planning activities like preparing different project schedules and master schedules using MS project.

Involved in user acceptance testing and production-level testing for different objects.

Responsible for 24x7 production support.


