
Data Security

Location:
Atlanta, GA
Posted:
January 19, 2021


Temi Amire

Cell: 210-***-****

Location: Texas

Email: adjjoq@r.postjobfree.com

Career objective

** ***** ** ******* *********** on relational database management systems, designing and deploying database architecture and systems infrastructure for corporate IT systems; creating strategies for performing repetitive tasks such as data acquisition, migration, data recoveries and new database implementations. I would like to be part of a challenging environment where my technical skills and analytical abilities can be used towards achieving the objectives and goals of your organization.

Summary

Hands on experience migrating data over from various databases/systems via SQL, PL/SQL, and/or XML.

Experienced with Oracle 10g, 11g, 12c, 18c, 19c, ASM, Grid Infrastructure and RAC on Exadata (CC/CS), ODA, AIX, Solaris, Unix operating systems, knowledge of cloud technologies and backup and restore techniques for medium to very large databases.

Extensive SME experience with Exadata (CC/CS) and Golden Gate replication (multi directional) and Conflict Detection & Resolution (CDR).

Hands on experience with automation and configuration management technologies such as Ansible, Puppet, or Chef. Deep knowledge of DB performance/scaling concepts and tuning best practices.

Consulting for data/document migrations, product development and systems integration.

Backup and recovery of databases (RDBMS/NoSQL) for production and non-production environments.

Strong experience performing tuning, database architectures, replication, problem diagnosis and resolution, shell scripting, PL/SQL development, and backup and recovery.

Implementing and supporting leading edge analytical solutions on Hadoop Clusters On-Prem and/or in GCP.

Good understanding of tool-based migration with SSMA, Ora2PG, Azure Data Factory, and Databricks.

Evaluated utilization of Exadata systems workloads to determine the strategy for use and migration moving forward.

Passionate about data, system administration, implementation challenges and learning new technologies, and willing to improve and support Hadoop clusters on-prem and in cloud.

Great understanding of metadata management, data modeling, and related tools.

Experienced with continuous integration tools like Jenkins and version control systems like SVN and Git. Good scripting skills, proficient in Bash, JavaScript, and Perl.

Developing and expanding existing data architecture.

Experience with data integration, data injection, data pipeline, data validation, or data mining.

Helping to design and build data pipelines, streams, reporting tools, data generators and a whole range of tools to provide information and insight.

Give colleagues and clients the tools to find and use data for routine and non-routine analysis.

Writing, modifying, troubleshooting and enhancing complex Unix shell scripts and Oracle PL/SQL.

Implementing large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.

Experience with Big Data tools such as Kubernetes, MongoDB.

Deploying and managing MongoDB databases in Linux environments.

In-depth understanding of MongoDB HA strategies, including replica sets and sharding.

Creating and managing schedules for data management (migration, integration, etc.) efforts, working with clients to validate migrated data, working with Agile development teams to understand changes and their impacts towards data migration efforts, among other tasks.

Formulating and executing sales strategy to exceed revenue objectives through the adoption of AWS.

Migrated Applications from one Database Engine to another.

Worked closely with AWS Platform Service Engineering and Architecture teams to help ensure the success of project consulting engagements with customer.

Hands on experience with various database structures such as tablespaces, data files, control files, redo logs, and archive log files using OEM 11g/12c, GUI tools, and the command line.

Experience using OEM for performance tuning in 12c and managed database using OEM on 11g and 12c and Monitoring Exadata Machine.

Great Understanding of table optimization, partitioning and large volume scaling.

Performed corrective measures for moderately complex code deficiencies and escalated alternative proposals.

Implementing monitoring procedures to maximize availability and performance of the database, while meeting defined targets.

Conducting one-to-few and one-to-many training sessions to transfer knowledge to customers considering or already using AWS.

Documenting data flows, data mappings, performs data lineage analysis as required.

Capturing and sharing best-practice knowledge amongst the AWS solutions architect community.

Hands on experience in both Golden Gate and Exadata (CS/CC) systems.

Performed fail-over testing and demonstrated expert level knowledge of networks, storage, encryption, security, cloud and backup and recovery technologies.

Managed the implementation of database upgrades and migrations.

Building deep relationships with senior technical individuals within customers to enable them to be cloud advocates.

Troubleshooting and resolving customer issues related to DB systems availability and performance.

Ensuring success in building and migrating applications, software, and services on the AWS platform.

Writing complex SQL and/or PL/SQL queries, joins, tables, etc.

Acting as a technical liaison between customers, service engineering teams and support.

Implementing, refining, and evolving tools and processes to maintain five-nines availability for 24/7 cloud-based SaaS solutions.

Online Migration of Oracle Database to the PostgreSQL database using replication tools.

Proactive in updating the latest security patches to Enterprise PostgreSQL database.

Configuring log analysis tools such as pgFouine and pgBadger.

Developing database functions, scripts, stored procedures, and triggers to support application development.

Hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, and ETL).

Implementing agile management ideals by facilitating exercises such as sprint planning and team leading standup.

Strong working knowledge of Data Warehouse/ Mart concepts.

Experience of backing up and restoring PostgreSQL database on a regular basis.

Hands on experience with a wide array of DevOps tools (Git, Maven, Ant, Gradle, GitLab, Artifactory, SonarQube).

Responsible for complex database performance tuning, security, database backup/recovery strategy and implementing monitoring procedures to maximize availability of the Exadata system including any databases.

Experience with data warehouse, data-lake, and enterprise big data platforms in multi-data-center contexts required.

Developing procedures to enhance reporting and query capabilities to improve efficiency and accuracy.

Involved in implementation of high availability solutions with 9i, 10g, 11g and 12c RAC, disaster recovery infrastructure (Physical and Logical Standby) using Oracle Active Data Guard, Materialized views, Golden Gate.

Establish performance tuning and analysis as well as best practices for the different Oracle environments residing on Exadata Linux systems, RAC environments, as well as consolidated environments running multiple database applications against the same Exadata system.

Experience with setup and administration of Oracle Database Security environment using Profiles, database Privileges and Roles.

Experience with geo-redundant database deployments (active/active, active/standby, etc.)

Solid Linux fundamentals including kernel and OS tuning, as they relate to DB performance and security.

Familiarity with Sarbanes Oxley, PCI DSS, and other security practices.

Great understanding on Postgres upgrades, backup/recovery, patching strategies/capabilities and day to day support of application development personnel.

Hands on experience with 2-3 live projects; knowledge of replication, backup, and disaster recovery design; knowledge of system sizing, capacity planning, and performance tuning on Cassandra DB.

Experience in writing, debugging, maintaining, and tuning Transact-SQL and stored procedures.

Performing analysis of Exadata/Oracle CPU, memory, storage, and I/O with regard to how our application (GEO) uses the database, and recommending sizing (for the database or the application) to optimize performance across all three releases.

Troubleshooting areas for possible bugs, maintaining knowledge of known bugs in Exadata, and working with Oracle Support/Design to have fixes applied to Exadata within 30 days of Oracle acknowledging the issue as a bug.

Knowledge of all Exadata/Oracle parameters, including underscore parameters, and how they are set to optimize performance.

Interpreting alert logs and the error messages generated by Oracle systems.

Generating and interpreting Oracle performance metrics to provide evidence of database performance and stress points.

Responsible for the development of the conceptual, logical, and physical data models, the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms.

Worked on multiple projects as an Oracle Architect/DBA or SME in database technology.

Familiar with Linux operating system and shell scripting skills.

Developed relationships with customers, comprehend and fulfill their needs, and maintain their satisfaction through regular communication and engagement with their environments.

In-depth knowledge of Cassandra databases architecture and internals.

Analyzing, testing, and implementing physical database designs to support and install Cassandra clusters.

Working with VLDB's (multiple terabyte) in a high availability, high volume transaction processing (OLTP), Batch and Reporting/DW environments on various platforms including Exadata (CS/CC), ODA, AIX, Linux, and Solaris.

Experience using Infrastructure as Code (IAC) tools such as Ansible, Terraform, etc. to automate system configurations and updates that can land in various target locations.

Collaborating with business partners, technical partners, and representatives from other groups to understand and evaluate their requirements regarding data integration, gap analysis, and ensure their needs are met in a timely fashion.

Analyzing system logs and identifying potential issues with computer systems.

Enabled DevOps development activities and complex development tasks that will involve working with tools such as Docker, Kafka and container management systems.

Design, configure, implement and support an automated workflow for cloud and virtual environment provisioning.

Strong understanding of Postgres performance tuning (partitioning, striping, OLAP, data warehouses, data marts and other advanced physical data modeling techniques).

Performed unit testing prior to moving code/configuration to the QA process.

Performed database administration activities in a safe, recoverable and professional manner that ensures optimal operation of all database environments.

Building and maintaining templates, rules, style sheets, workflows, data extractions and imports, interfaces, and object models.

Maintaining up-to-date documentation of all procedures, infrastructure diagrams, system configuration and baseline standards, implementation plans, system back out plans, system recovery plans, maintenance plans, system upgrades, and all system changes.

Areas of Expertise

High Availability Solutions (10)

Oracle Installation and Configuration (9+)

Golden Gate (6)

Data Modeling (8)

Disaster Recovery (10)

Exadata (6)

Data Guard (6)

Kafka (2)

Apache Hadoop (2)

Microsoft Azure (4)

Cloud Architecture

Performance Tuning (9)

Data Migration (6)

Backup and Recovery (10)

Database Security (9)

PL/SQL (7)

Postgres (3)

AWS (4)

MongoDB (3)

Replication, Log Shipping and Mirroring (7)

Installing and Managing Clustered Nodes (8)

Database Upgrade Patches (9)

Cassandra DB (2)

SQL Server (7)

Integration and reporting Services (6)

Analytical and Leadership Skills (3)

Agile Software Development (2)

Waterfall Methodology (1)

Strategic Planner (3)

Terraform

Education

Bachelor's Degree – [Computer Engineering] Obafemi Awolowo University September 2005

Associate Degree – [Computer Information Systems] Concordia University June 2010

Certifications

AZ-303

Upgrade Oracle 9i/10g/11g OCA to Oracle Database 12c OCP

Oracle Cloud Infrastructure 2019 Architect Professional

Training:

Oracle 12c RAC Training, ONLC Training Center, Atlanta, GA, May 2013

Oracle database Architecture 11g

Oracle Cloud Infrastructure

Technical Skills:

Operating System: Red Hat Linux, Windows 2003/2008 Server, VMware Server

Languages: SQL, PL/SQL, UNIX Shell Script, HTML

Database: Oracle RDBMS 10g, 11gR2, and 12c; Postgres 9.3 and 9.4; MongoDB 3.0 and 3.2

Tools/Software: TOAD, SQL Developer, SQL*Plus, SQL*Loader, Oracle Enterprise Manager (OEM), DBCA, Data Pump, RMAN, SFTP, Telnet, TCP/IP, SecureCRT, PuTTY, Terraform, Oracle Scheduler, Unix Shell Scripting, Microsoft Word, Excel, and PowerPoint, SNOW, Hadoop, EnterpriseDB.

Professional Experience

Rackspace - Austin, TX Feb 2018 – Present

Oracle Database Administrator

Designing, creating, and delivering routine to moderately complex program specifications for code development and support on multiple projects/issues with a wide understanding of the application / database to better align interactions and technologies.

Developing basic to moderate complex code using front and / or back end programming languages within multiple platforms as needed in collaboration with business and technology teams for internal and external client software solutions.

Analyzing the method of transforming existing data into a format for the new environment and the loading of this data into other database structures.

Developing SQL Script, analyzing data, creating data metrics, and integrating data using emerging technologies.

Ensuring that procedures and infrastructure details are properly documented and shared among the team.

Leading the engineering, integration, implementation, and operation of a virtualization strategy, including database server consolidation, storage consolidation, and automation.

Interacting with users continuously to address their issues and search for the optimum usage pattern to propagate.

Assumed ownership and responsibility for data accuracy, completeness, consistency, and integrity of data fields.

Monitoring and analyzing performance metrics and allocate database resources to achieve optimum database performance.

Maintaining security hardened baseline configurations and deploy security (patches), software maintenance (patches) and new software releases and supporting products infrastructure as directed by the system owner, policy, or the Cybersecurity Directorate.

Supporting multiple servers and multiple databases of medium to high complexity (complexity defined by database size, technology, transaction velocity and volume, system feeds and interfaces) with multiple concurrent users, ensuring control, integrity and accessibility of the data.

Consolidating, transforming, and migrating data from many source systems.

Writing and maintaining Unix/Linux shell scripts; identifying root causes of production-related database issues.

Collaborating with engineering teams on solutions; regularly analyzing queries executed across environments, discovering bottleneck queries, and proposing optimizations.

Solving issues, performance tuning, security remediation/patch management, and upkeep of the platforms.

Designing and implementing sharding and indexing strategies for MongoDB.
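A minimal sketch of what such a sharding and indexing setup can look like in the MongoDB 3.x shell (the `appdb`/`orders` names, the mongos host, and the hashed shard key are illustrative placeholders, not from any specific engagement):

```shell
# Connect to a mongos router, enable sharding, and shard a collection
# on a hashed key. All names below are hypothetical.
mongo --host mongos01:27017 <<'EOF'
sh.enableSharding("appdb")
// Supporting index for the shard key
db.getSiblingDB("appdb").orders.createIndex({ customerId: "hashed" })
sh.shardCollection("appdb.orders", { customerId: "hashed" })
EOF
```

A hashed key spreads writes evenly across shards at the cost of efficient range queries; a ranged key would be the alternative when queries are interval-based.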

Responsible for Mongo backups and restores.

Implementing and maintaining MongoDB OPS Manager.

Maintaining detailed documentation of database Design/Architecture and setup.

Solving MongoDB performance issues.

Deploying/implementing new instances of our financial product (Customer facing Analytical platform) in the cloud through cloud formation templates and/or on-prem setup.

Ensuring that every cluster and services are always available without performance issues, using monitoring and alerting tools.

Designing and implementing conflict detection and resolution (CDR) solutions for replication.

Assisting with determining data migration requirements and security controls.

Defining roles and responsibilities for the data migration activities using a data migration checklist and sign-off process to ensure completion of planned activities.

Performing Tuning in terms of system architecture, scalability, understanding symptoms and problems and knowing when to tune.

Configuring, administering and implementing complex database infrastructures, multiple replication technologies, multi-directional flows and other high availability architectural topologies.

Troubleshooting and resolving customer issues related to DB systems availability and performance.

Analyzing, modifying, and developing moderately complex code and unit tests in order to produce concise application documentation; performing testing and validation for moderately complex code changes.

Giving colleagues and clients the tools to find and use data for routine and non-routine analysis.

Building blocks for transforming enterprise data solutions.

Designing and building modern data pipelines, data streams, and data service Application Programming Interfaces (APIs).

Providing support to leadership for the design, development and enforcement of business / infrastructure application standards to include associated controls, procedures and monitoring to ensure compliance and accuracy of data.

Conducting and providing basic billable hours and resource estimates on initiatives, projects and issues.

Analyzing current business practices, processes, and procedures to spot future opportunities.

Coordinated data models, dictionaries, and other database documentation across multiple applications.

Crafting the architectures, data warehouses and databases that support access and Advanced Analytics, and bring them to life through modern visualization tools.

Implementing effective metrics and monitoring.

Implementing large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.

Reviewing database design and integration of systems, and makes recommendations regarding enhancements and/or improvements.

Installing and configuring connection pooling tools like PgPool and PgBouncer etc.

Installing and configuring PostgreSQL high availability features such as streaming replication, hot standby, and warm standby; maintaining PostgreSQL master-slave clusters utilizing streaming replication; installing and configuring PostgreSQL monitoring tools such as PEM and AppD.
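As an illustrative sketch of such a streaming-replication setup (hostnames, paths, and the `repl` role are placeholders; `wal_level = hot_standby` matches the 9.3/9.4 versions listed in this resume):

```shell
# Primary: enable WAL shipping for streaming replication
# (postgresql.conf excerpt; values are illustrative, not tuned).
cat >> "$PGDATA/postgresql.conf" <<'EOF'
wal_level = hot_standby
max_wal_senders = 5
hot_standby = on
EOF

# Standby: seed the data directory from the primary and write a
# recovery.conf pointing back at it (-R). "repl" is a hypothetical
# replication role created beforehand on the primary.
pg_basebackup -h primary-db -U repl -D /var/lib/pgsql/9.4/data -X stream -R
```

After restarting the standby, replication lag can be checked on the primary via `pg_stat_replication`.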

Online Migration of Oracle Database to the PostgreSQL database using replication tools.

Working with data transformation teams to ensure that the model design and development is properly communicated.

Investigating issues to provide a detailed analysis of your findings in a digestible format by email, instant message or voice.

Configuring log analysis tools such as pgFouine and pgBadger.

Contributing to requirements specifications for new and enhanced product features to ensure Operations requirements are captured and satisfied.

Developing and expanding existing data architecture.

Translating business needs into data models supporting long-term solutions.

Cloning of databases using RMAN and using latest RMAN features.

Working with VLDB's (multiple terabyte) in a high availability, high volume transaction processing (OLTP), Batch and Reporting/DW environments on various platforms including Exadata (CS/CC), ODA, AIX, Linux, and Solaris.

Developed procedures to enhance reporting and query capabilities to improve efficiency and accuracy.

Coordinated various efforts in maintaining a vast environment of over 70 databases.

Successfully migrated databases from 9i to 10g, 10g to 11g, and 11g to 12c.

Developing conceptual, logical, and physical data modeling.

Provided guidance on service oriented architectures, application frameworks, system integration methods, Enterprise Data Business and Technical Metadata Management.

Implemented classic GoldenGate replication as well as integrated replication.

Tested and Implemented different physical and logical Backup and Recovery methods like point in time, Media recovery, Full/partial export/import.

Responsible for the development of the conceptual, logical, and physical data models, the implementation of RDBMS, operational data store (ODS), data marts, and data lakes on target platforms.

Recovered databases from corrupted/lost datafiles in real-time on production systems.

Configuring GoldenGate high availability using Oracle DBFS and Active Data Guard.

Configuring GoldenGate downstream mining and integrated components for active-passive and active-active replication.

Implementing new technologies and supporting operations on numerous Oracle databases while working closely with internal clients, systems programmers, and application developers to define and resolve information flow and content issues, helping to transform business requirements into environment-specific databases.

Implementing backup schema through RMAN.
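A hedged sketch of the kind of RMAN backup script this refers to (the SID, level-0 strategy, and retention handling are illustrative assumptions, not the actual production configuration):

```shell
#!/bin/sh
# Nightly incremental level-0 backup plus archived logs, suitable for
# scheduling from cron. ORACLE_SID "PRODDB" is a placeholder.
export ORACLE_SID=PRODDB
rman target / <<'EOF'
CONFIGURE CONTROLFILE AUTOBACKUP ON;
BACKUP INCREMENTAL LEVEL 0 DATABASE PLUS ARCHIVELOG DELETE INPUT;
DELETE NOPROMPT OBSOLETE;
EOF
```

A cron entry such as `0 2 * * * /home/oracle/scripts/rman_backup.sh` (path hypothetical) would automate the nightly run.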

Performing tuning of SQLs and procedures, using trace dumps, TKPROF, Statspack, Net Trace, and specific Oracle performance parameters on a selected basis.

Installation, configuration, and administration of Oracle RAC on development boxes in 9i/10g.

Configured ASM on Windows on development boxes and recommended implementation on production sites as a High Availability feature.

Implemented Data Guard on Production with Disaster-Recovery and Standby database modes.

Writing shell scripts for hot backups and setting up cron jobs to automate backup procedures and clean up archives.

Writing PL/SQL scripts with mapping and validation to insert/update data from one instance into another's Oracle Applications base tables using a DB link.
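Sketched as a hypothetical example (the link name `remote_src`, table, and columns are placeholders; the actual mapping and validation logic would be application-specific):

```shell
# Upsert rows pulled over a database link into a base table, run via
# SQL*Plus. All object names below are illustrative.
sqlplus -s apps@PROD <<'EOF'
MERGE INTO app_base_table t
USING (SELECT id, name FROM app_base_table@remote_src) s
ON (t.id = s.id)
WHEN MATCHED THEN UPDATE SET t.name = s.name
WHEN NOT MATCHED THEN INSERT (id, name) VALUES (s.id, s.name);
COMMIT;
EOF
```

A single MERGE over the link keeps the insert/update logic in one statement; the same pattern can be wrapped in a PL/SQL block when row-by-row validation is needed.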

Analyzing and understanding the Specifications and Business Requirement documents.

Worked on SQL and PL/SQL which involves all the custom objects migration from 11.0.3 to R12.

Implemented audit features to locate culprit processes and unauthorized logins.

Setting up and maintaining GoldenGate on 10g and 11g databases.

Developing cross-functional relationships to drive an understanding of issues and developing solutions that meet business and technical requirements.

Supporting recurring day-to-day operation elements of the Oracle EPM solutions.

Performing routine financial cycle updates for forecasting, plans and actuals including manual updates to dimension members, substitution variable changes, and management of dimension security for opening/closing data for edit.

Documenting data flows, data mappings, performs data lineage analysis as required.

Working with multiple upstream systems, databases, document the changes for data requirements.

Collaborating with business partners, technical partners, and representatives from other groups to understand and evaluate their requirements regarding data integration, gap analysis, and ensure their needs are met in a timely fashion.

Delivering ETL, data warehouse and data analytics capabilities.

Reporting data visualization skills using software like Tableau or OBIEE.

Writing complex SQL and/or PL/SQL queries, joins, tables, etc.

Developing data cleansing strategies, validation, and action plans.

Coordinating with data owners to validate cleansed data (e.g., reconciling transaction details to summary records such as commitments and obligations to General Ledger (GL) balances).

Assisting with data transformation, data load testing, validity of the values of the data elements, and validating data relationships.

Pythian Group – Ottawa, ON Oct, 2016 – Feb, 2018

Database Administration Solutions Architect

Administered database security across all database environments.

Solved database and SQL performance issues using Oracle native tools including SQL Rewrite.

Maintain security hardened baseline configurations and deploy security (patches), software maintenance (patches) and new software releases and supporting products infrastructure.

Installed Webmin to provide a GUI for instance server details.

Installed and integrated Liferay, Webmin, OroCRM, and Joomla on AWS.

Set up NAS on an AWS instance and integrated instance backups to a NAS folder.

Installed and integrated a Samba server with an AWS instance.

Backup and recovery of databases (RDBMS/NoSQL) for production and non-production environments.

Led in creating a clear vision for the future; solved production problems and participated in a 24x7 on-call rotation.

Built and Deployed schema changes to production and non-production.

Writing and maintaining Unix/Linux shell scripts; identifying root causes of production-related database issues and collaborating with engineering teams on solutions on a regular basis.

Responsible for leading the architecture and setup of an AWS hosted Data Lake as well as the ingestion pipeline and processing for 100+ datasets, working closely with Agile software development team(s).

Creating and managing schedules for data management (migration, integration, etc.) efforts, working with clients to validate migrated data, working with Agile development teams to understand changes and their impacts towards data migration efforts, among other tasks.

Writing, modifying, troubleshooting and enhancing complex Unix shell scripts and Oracle PL/SQL.

Supported multiple servers and multiple databases of medium to high complexity (complexity defined by database size, technology, transaction velocity and volume, system feeds and interfaces) with multiple concurrent users, ensuring control, integrity and accessibility of the data.

Evaluating why certain query execution plans (QEPs) are chosen, with deep knowledge of optimizing the SQL behind those QEPs.

Troubleshoot and resolved customer issues related to DB systems availability and performance.

Aware of business issues as they impact overall project plans.

Working with data transformation teams to ensure that the model design and development is properly communicated.

Developed conceptual, logical, and physical data modeling.

Managed various database structures: tablespaces, data files, control files, redo logs, and archive log files using OEM, GUI tools, and the command line.

Developed relationships with customers, comprehend and fulfill their needs, and maintain their satisfaction through regular communication and engagement with their environments.

Developed procedures to enhance reporting and query capabilities to improve efficiency and accuracy.

Maintained a state-of-the-art knowledge of existing best practices in database administration.

Involved in 24/7 support for both production and development databases including 2-node production RAC database.

Conceptualized and provided Cassandra database solutions based on client requirements.

Responsible for complex database performance tuning, security, database backup/recovery strategy, and implementing monitoring procedures to maximize availability of the Exadata system including any databases.

Translating business needs into data models supporting long-term solutions.

Implementation of high availability solutions with 9i, 10g, 11g and 12c RAC, disaster recovery infrastructure (Physical and Logical Standby) using Oracle Active Data Guard, Materialized views, Golden Gate. The projects involve replication across geographical time zones.

Installed, upgraded, and patched Oracle databases across PROD, TEST, and DEV environments.

Established performance tuning and analysis as well as best practices for the different Oracle environments residing on Exadata Linux systems, RAC environments, as well as consolidated environments running multiple database applications against the same Exadata system.

Configured and implemented Oracle GoldenGate replication.

Assisted with data transformation, data load testing, validity of the values of the data elements, and validating data relationships.

Implemented new technologies and supported operations on numerous Oracle databases while working closely with internal clients, systems programmers, and application developers to define and resolve information flow and content issues, helping to transform business requirements into environment-specific databases.

Assisted with determining data migration requirements and security controls.

Resolved conflicts between the target and source databases in GoldenGate.

Experienced using OEM for performance tuning in 12c.

Coordinated data models, dictionaries, and other database documentation across multiple applications.

Ensured data integrity including entity, referential integrity, check constraints and database triggers.

Designed database schemas to align with development and business goals and objectives.

Diagnosed and resolved performance issues associated with SQL tuning and resource contention including locks and deadlocks.

Implemented Transparent Data Encryption (TDE); used TDE with encrypted columns and Data Pump.
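A minimal sketch of TDE column encryption using 12c keystore syntax (the keystore password, schema, table, and column below are hypothetical, and a keystore must already be configured in sqlnet.ora):

```shell
# Open the keystore, then encrypt a sensitive column in place.
# hr.employees/ssn and the password are illustrative placeholders.
sqlplus -s / as sysdba <<'EOF'
ADMINISTER KEY MANAGEMENT SET KEYSTORE OPEN IDENTIFIED BY "keystore_pwd";
ALTER TABLE hr.employees MODIFY (ssn ENCRYPT USING 'AES256');
EOF
```

Encrypted columns remain transparent to applications; Data Pump exports of such tables can additionally use the `ENCRYPTION` parameter to protect the dump files.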

Performed Cassandra software upgrades and patches and planned installations and configurations.

Provided daily Cassandra database technical administration and support, including processing change management requests, monitoring, performance tuning and database cloning.

Designed a high-performing database by multiplexing control files, mirroring redo logs using redo log groups, and configuring tablespaces.

Extensively used ILM tool for data masking.

Performed hot and cold backups and set up/utilized RMAN.


