


Data Modeler

Technically accomplished IT professional with over 18 years of experience across multiple industries, including over 7 years of hands-on experience in data modeling and over 7 years in Oracle database administration and enterprise application development.

Professional Summary:

•Experience in Data Modeling, with hands-on experience designing Hadoop Data Lakes, Data Mining, and Data Warehouse solutions

•Experience interacting with business analysts and end users to gather business requirements and convert them into data models

•Coordinated with development and testing teams to review test cases and code and identify any gaps

•Worked with platform teams on physical data model implementations and on storage and capacity planning estimates

•Developed conceptual, logical, and physical data models

•Developed best practices for standard naming conventions and coding practices to ensure consistency of data models against the set standards

•Optimized and updated logical and physical data models

•Understood and translated business needs into data models

•Identified key roles and responsibilities required within master data and transactional data

•Defined security roles and responsibilities based on the nature of the working teams

•Experience creating schemas and implementing physical data models on databases such as Hive, Vertica, Oracle, and MS SQL Server

•Knowledge of NoSQL databases such as HBase and MongoDB

•Familiar with data architecture, including data modeling and data ingestion pipeline design; experienced in optimizing ETL workflows

•Hands-on experience in data extraction, transformation, loading, and data visualization on the Hortonworks Hadoop platform

•Good experience importing and exporting data between HDFS and relational database management systems using Sqoop (a minimal command sketch follows this summary)

•Strategy planning and sizing preparation

•Data modeling experience using Erwin and ER/Studio

•Expert in Vertica database optimization, including designing storage projections and segmentation

•9 years of experience as an Oracle Database Administrator in configuration, performance tuning, backup and recovery, and data optimization

•Established best practices for the organization

•Applied in-depth and broad technical knowledge to provide maintenance solutions across one or more technology areas (e.g., Vertica, Hadoop, Oracle DBA, MS SQL Server)

•Involved in all project stages end to end, from scoping through implementation and end-user training

•Release management experience

•Strong working experience in requirements analysis, system design, storage estimation, and writing test cases

•Guided system analysts, engineers, programmers, and others on performance tuning of queries and interfaces

•Experience providing 24x7 on-call production DBA support as an application/development engineer
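
A hedged, minimal sketch of the Sqoop usage described in the summary above; the JDBC connection string, credentials, table names, and HDFS paths are illustrative assumptions rather than details from an actual engagement.

# Hypothetical Sqoop transfer between an Oracle source and HDFS;
# connection details, table names, and paths are assumptions for illustration.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table SALES_ORDERS \
  --target-dir /data/raw/sales_orders \
  --num-mappers 4

sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table SALES_SUMMARY \
  --export-dir /data/curated/sales_summary \
  --num-mappers 4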

Technical Skills:

Big Data Ecosystem

HDFS, HBase, Hive, Sqoop, Oozie, HDP, Beeline

SDLC Methodologies

Agile, Waterfall

Programming Languages

Shell scripting, vsql, SQL, and PL/SQL

NoSQL Databases

MongoDB

Operating Systems

Windows family, Linux, HP-UX, AIX

Scripting Languages

shell, SQL

Databases

Vertica, Oracle, MS SQL Server, MS Access, MongoDB, DB2

Tools

PuTTY, Nagios

Scheduling Tools

Tidal, Autosys

Certifications:

•Hortonworks HDP Certified Administrator.

•Oracle Certified Database Administrator.

Professional Experience

Hewlett-Packard Enterprise, Houston, TX Nov 2013 - Nov 2017

Data Modeler

Description: The objective of this project was to develop a Hadoop data lake to onboard data from multiple sources into a single cluster and to design reporting layers in Vertica to meet reporting needs and support new marketing strategies. The data comes from source systems such as life applications that feed the business intelligence system, as well as from flat files, social media, the EDW, and other databases. Designing and implementing the load of the relevant data into the data warehouse was the requirement.

Responsibilities:

•Worked closely with business teams and end users to collect requirements and transform them into the data model

•Identified data sets, data volumes, and estimated data growth rates

•Defined naming standards and coding rules

•Created data models for the transformational, reporting, and analytical data layers

•Documented logical and physical data models in clear, understandable language

•Created mapping documentation and data transformation process documents for the development, QA, and business teams

•Created Hive schemas, defined data transformation processes for the different layers, and defined profiles for each layer based on responsibilities

•Coordinated with the Hadoop admin team on defining, creating, and assigning security access profiles

•Worked with the development team to explain the data model and reviewed the ETL process and coding standards

•Worked with the testing team to explain the data model and reviewed test cases to identify any gaps against the requirements

•Defined the data ingestion pipeline and optimized ETL workflows

•Documented data migration strategies

•Implemented the reporting data model in the Vertica cluster and defined projections and segmentation

•Worked with the DBA team to create access profiles based on roles and responsibilities

•Involved in performance tuning for scalability

•Defined backup plans along with the systems admin and DBA teams

•Validated all security controls and data along with the testing team

•Automated all jobs that pull data from relational databases into Hive tables using Oozie workflows, with email alerts enabled for any failures (see the sketch following this role)

•Managed and reviewed Hadoop log files.

Environment: Erwin, Hadoop, HDFS, Hive, Pig, Sqoop, Oozie, Vertica, MS SQL Server, Oracle.
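
A minimal shell sketch of one load-and-alert step of the kind described in this role; in the project these steps were orchestrated through Oozie workflows, and the source host, credentials file, table, and alert address below are illustrative assumptions.

# Hypothetical daily load step: pull one source table into a Hive staging
# table with Sqoop and send an email alert if the load fails.
# All names, hosts, and addresses here are assumed for illustration.
if ! sqoop import \
      --connect jdbc:mysql://srchost:3306/sales \
      --username etl_user --password-file /user/etl/.dbpass \
      --table ORDERS \
      --hive-import --hive-database staging --hive-table orders \
      --num-mappers 4
then
  echo "Daily ORDERS load failed on $(hostname) at $(date)" \
    | mailx -s "Hive load failure: ORDERS" dl-data-alerts@example.com
  exit 1
fi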

Hewlett Packard Company, Houston, TX Nov 2011 - Oct 2014

Data Modeler

Description: Implemented the data model for a HAVEN-based solution with analytical and visualization capabilities to understand the correlation between CX metrics, sales, revenue, profit, and share. The solution provides a 360-degree view of the customer experience, including relevant operational metrics, internal surveys, and social media information, supporting better customer contact, new product introduction, training, and early warning about potential issues.

Responsibilities:

•Worked closely with business teams and end users to collect requirements and transform them into the data model

•Identified data sets, data volumes, and estimated data growth rates

•Defined naming standards and coding rules

•Created data models for the transformational, reporting, and analytical data layers

•Documented the logical and physical data models in clear, understandable language

•Created mapping documentation and data transformation process documents for the development, QA, and business teams

•Created a reverse-engineered model

•Coordinated with the Hadoop admin team on defining, creating, and assigning security access profiles

•Worked with the development team to explain the data model and reviewed the ETL process and coding standards

•Implemented the reporting data model in the Vertica cluster and defined projections and segmentation (a minimal sketch follows)
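
A minimal sketch of the projection and segmentation work mentioned in the last bullet; the schema, table, and column names are illustrative assumptions rather than project artifacts.

# Hypothetical segmented reporting projection created in Vertica via vsql;
# schema, table, and column names are assumptions for illustration.
vsql -h "$VERTICA_HOST" -U "$VERTICA_USER" -w "$VERTICA_PWD" <<'SQL'
CREATE PROJECTION reporting.sales_fact_p1
AS SELECT sale_id, customer_id, sale_date, amount
   FROM reporting.sales_fact
   ORDER BY sale_date, customer_id
SEGMENTED BY HASH(sale_id) ALL NODES KSAFE 1;
SELECT REFRESH('reporting.sales_fact');
SQL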

Hewlett Packard Company, Houston, TX Nov 2010 - Oct 2011

Data Modeler

Description: The objective of the project was to migrate historical data from multiple Oracle sources to an MS SQL database using SSIS, with jobs scheduled through the TIDAL scheduler for daily incremental data automation.

Responsibilities:

•Created data models using Erwin

•Estimated data growth and managed space

•Implemented the data model in the database

•Provided data migration strategies

•Designed the pipeline for automating and scheduling TIDAL jobs

•Worked closely with the DBAs and system administrators on performance tuning

Environment: Erwin, Oracle, MS SQL Server, Tidal, batch scripting, SQL

Hewlett-Packard, Houston, TX Nov 2005 - Oct 2010

Sr. Database Administrator

Description: The Global Database Administration team owned and supported all databases for Hewlett-Packard Company, providing 24x7 support for all types of issues, including incident management, release management, major incidents, escalations, and problem management.

Responsibilities:

•Provided 24x7 on-call production DBA support and application/development DBA support.

•Extensive knowledge of database administration for Oracle 8i, 9i, 10g, and 11g, with experience in very large-scale database environments and mission-critical OLTP and OLAP systems.

•Experience managing Oracle databases running on HP-UX, MS Windows, NT Server, UNIX, and AIX.

•Hands-on experience in Oracle 10g/11g RAC implementation and administration (Oracle Clusterware setup and configuration, RAC installation using ASM, Grid installation).

•Good experience in RAC Clusterware administration, configuration, and patching. Extensive experience in Data Guard configuration, implementing and maintaining standby databases. Hands-on experience with Grid Control, Streams, and GoldenGate replication.

•Experience in performance tuning using EXPLAIN PLAN, SQL Trace, TKPROF, STATSPACK, AWR, and ADDM.

•Set up Oracle Enterprise Manager (OEM) 10g and Grid Control (11g) for database monitoring, performance, and diagnostics.

•Extensive experience in database backups with RMAN (full/incremental backups) and in traditional hot/cold backups (see the sketch following this role).

•Experience in logical backups and database migration using Data Pump, Export/Import, and transportable tablespaces.

•Good experience writing and editing UNIX shell scripts, and in patching and upgrades.

•Expertise in upgrading Oracle Development and Production databases from 8i to 9i, 9i to 10g & 11g, 10g to 11g.

•Extensive knowledge and experience on managing very large Data Warehouse databases.

•Proficient in SQL, PL/SQL, Stored Procedures, Functions, Cursors and Triggers. Extensive experience in configuring and implementing ASM and proficiency in using ASMCMD and ASMLIB.

•Experience in Flash-Back Technology.

•Experience in Database Refresh and patch management and in Database cloning with RMAN and manual methods.

•Experience in capacity planning and benchmarking. Developed database monitoring and health-check alert scripts for database uptime/downtime status and sizing issues using Grid Control (OEM).

•Proficient in raising TARs with Oracle Support and using MetaLink to resolve bottlenecks in the database environments.

•Excellent communication and coordination skills, working with system administrators, business data analysts, and development teams

Environment: Oracle, OEM, RAC, HP-UX, Windows NT
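
A minimal sketch of the RMAN backup work mentioned in this role; the ORACLE_SID, Oracle home path, and backup/retention policy are illustrative assumptions, not the actual production scripts.

# Hypothetical nightly RMAN incremental backup wrapper; the SID, Oracle home,
# and backup/retention policy below are assumptions for illustration.
export ORACLE_SID=PRODDB
export ORACLE_HOME=/u01/app/oracle/product/11.2.0/dbhome_1
export PATH=$ORACLE_HOME/bin:$PATH
rman target / <<'EOF'
RUN {
  BACKUP INCREMENTAL LEVEL 1 DATABASE PLUS ARCHIVELOG DELETE INPUT;
  DELETE NOPROMPT OBSOLETE;
}
EOF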

Mazda Automobiles, Koln, Germany May 2005 - Oct 2005

Data Migration Engineer

Description: Migrated data from a DB2 database on IBM OS/390 into Oracle 10g on HP-UX. There were many issues connecting to the OS/390 system using ODBC/JDBC drivers, so a customized Java-based connection was developed for database communication. Provided the solution, migrated the data and database objects, and replicated data bidirectionally.

Responsibilities:

•Oracle 10g server installation and database setup

•Security profile management

•Data modeling and schema creation

•Successfully migrated the schema from IBM OS/390 DB2 7.1 to Oracle 10g on HP-UX and mapped data types

•Automated data migration

•Application and database tuning

•Involved in real time replication from DB2 to Oracle and Oracle to DB2

Environment: Oracle, HP-UX, IBM OS/390 DB2 7.1, SQLWays, Windows

Intel India Private LTD, Bangalore, India July 2004 - Apr 2005

Database Administrator

Description: Coordinated with development and support teams on implementing new changes, code migration, code release activities, and day-to-day activities

Responsibilities:

•Managing the database's storage structures

•Performing capacity planning

•Managing users and security

•Managing schema objects, such as tables, indexes, and views

•Making database backups and performing recovery when necessary

•Proactively monitoring the database's health and taking preventive or corrective action as required

•Monitoring and tuning performance

•Troubleshooting problems with databases, applications, and development tools

•Alert log monitoring and troubleshooting

•Monitor application related jobs and data replication activities

•Implement and maintain database security

Environment: UNIX, Oracle 8i, Autosys

Visaka Industries Ltd, Hyderabad, India Aug 1999 - Jun 2004

Application Developer

Description: Developed forms and reports using Oracle D2K for sales accounting, delivery and order management, material management, stock orders, sales invoicing, and payroll applications

Responsibilities:

•Built PL/SQL code base

•Created Forms and reports using Oracle D2K

•Migrated D2K to Developer 5.0

•End user training

•Provided support during new financial year changes

Environment: AIX, Windows NT, Oracle 7.1, D2k, Developer 5

Education:

Master's in Computer Science, India.


