
Data Sql

Location:
Littleton, Colorado, United States
Posted:
February 22, 2018

Ronnie Shahrabani

Aurora CO, *****, USA 720-***-****

e-mail: ac4kr7@r.postjobfree.com

LinkedIn: www.linkedin.com/pub/ronnie-shahrabani/8/194/b11/

An accomplished Software Engineer with over 30 years of international experience in all phases of the Software Development Life Cycle (SDLC).

Highly motivated, dependable, and thorough, with a demonstrated ability to perform well under pressure; I thrive in deadline-driven environments. I have excellent problem-solving and analytical skills and operate independently, requiring little to no supervision.

I am seeking an ETL and Database Development opportunity in the Denver, Colorado metro area, where my strong experience with ETL and Oracle databases - complex PL/SQL, procedures, functions, dynamic SQL, indexing, partitioning, and tuning, across OLAP Data Warehouses and Data Marts as well as OLTP transactional databases - can be a great contribution to the hiring company.

Summary of Qualifications

ETL architecture, development, and execution of ETL processes, requirements, and standards. Implementation of Kimball dimensional modeling.

Hands on experience with:

o Big Data technologies - Hadoop HDFS, Hive (partitioning, file formats), HBase, and MapReduce.

o PDI - Pentaho Data Integration with JavaScript – expert level.

o Informatica Power Center.

o Oracle 11g, DB2, and SQL Server (MSSQL 2008/2012/2014) – expert level.

o Expert in SQL, complex PL/SQL, procedures, functions, reports, dynamic SQL, indexing, partitioning, optimization, and tuning with Oracle databases - Data Warehouse and Data Mart.

Technical analysis, design, implementation, and maintenance of BI (Business Intelligence) and Data Warehousing solutions.

ORACLE DBA - OCP Certified, Microsoft SQL Server DBA - MCP Certified.

Strong relationship with customers and strong customer service attitude.

Commitment to quality work and high attention to detail.

Easily adapts to changing priorities.

Fast learner and hard worker with a strong can-do attitude: if I don't know how to do something, I will figure it out. I have a desire to keep learning and advancing my skills.

Technical Expertise

Programming and Development Tools/Languages:

Informatica Power Center

Pentaho and Hadoop Framework Fundamentals, HDFS, HIVE.

PDI - Pentaho Data Integration with JavaScript, on Oracle, DB2, and MSSQL databases.

SQL Server Management Studio, PL/SQL Developer, AquaData, DbVisualizer,

UNIX / Linux shell scripting, SQL, PL/SQL, T-SQL, SqlPlus, TOAD, BRIO, SubVersion SVN, JCS,

COBOL, DYL, SAS, RPG, FORTRAN, JCL, File-AID/MVS,

Quality Center QA testing tool, WinRunner, Microsoft Office Applications, Microsoft Project, Exceed

Operating Systems:

SUN/HP/UNIX, Linux, MS Windows, IBM MVS-OS/390, VAX/VMS, MS-DOS, IBM Mainframe, IBM 34/36/AS400

Databases:

Oracle 6 to 11g, SQL Server (MSSQL 2008/2012/2014), Informix, DB2, Microsoft Access, MySQL

Professional Experience

February 2016 – January 2018 – Travelport

Position: Senior ETL Data Warehouse Designer and Developer

Main Responsibilities:

Designed, developed, and implemented multiple projects using:

o Big Data technologies - Hadoop HDFS, Hive (partitioning, file formats).

o Informatica with Linux and DB2, with the relevant stored procedures.

o PDI Pentaho with Linux, MSSQL, and DB2.

o Designed, created, and fine-tuned the relevant schemas: Staging, Warehouse, and CurrentView.

Defined technical requirements and data architectures for the data warehouse and the analytics ecosystem.

Focused on cost savings, high performance, high reliability, quality of user experience, and architectural alignment of solutions.

Coached/mentored others in the application of data warehouse and analytics architectural standards.

Created models for new data warehouse development and/or changes to existing data stores.

Identified inefficiencies and gaps in current data warehouses and implemented solutions to address them.

Designed systems, policies, and procedures for disaster recovery and data archiving to ensure effective availability, protection, and integrity of data assets.

Planned capacity and resource expansion to ensure data warehouse scalability.

Identified data discrepancies and data quality issues, and worked to ensure data consistency and integrity.

Developed, implemented, and maintained change control and testing processes for modifications to data systems.

Designed, created, and fine-tuned the Staging, Warehouse, and CurrentView schemas, along with the relevant stored procedures, using Informatica with MSSQL and DB2.

Knowledge, Relevant Skills and Hands On Experience:

ETL experience across Pentaho Data Integration (Kettle, Spoon) and Informatica Power Center.

Big Data technologies such as Hadoop Hive (partitioning, file formats), HBase, and MapReduce.

MS SQL, DB2 and Oracle.

Implementing Data Marts and Enterprise Data Warehouses.

Linux environment.

Best practices associated with end to end technical architecture and technical support documentation requirements, design, development, testing and implementation practices as they relate to data warehouse processes.

Current knowledge of the data warehouse market, vendors, and standards bodies.

Advanced communication, problem solving, and conflict resolution with internal and external stakeholders.

Working in a team-oriented, collaborative and geographically diverse team environment.

September 2013 – January 2016 – Stoneriver

Position: Data Analyst ETL and Database Developer

Main Responsibilities:

Implemented and maintained Data Warehouse and BI - Business Intelligence applications for the P&C product lines.

Analyzed data loaded into the data warehouse for accuracy.

ETL architecture, development and execution of ETL processes, requirements and standards.

Technical analysis and design of Business Intelligence and Data Warehousing solutions. Autonomously coded ETL sub-systems and the necessary data structures from technical documentation.

Expert in SQL, PL/SQL programming, optimization, and tuning. Improved process efficiency.

Dramatically improved the team's knowledge base in SQL tuning on Oracle and SQL Server, JavaScript, and Pentaho performance, pushing Pentaho's capabilities to the limit.

Developed automatic "SCHEMA_STATS" gathering for the Data Warehouse and Data Mart, supporting both Linux/Unix and Windows NT, embedded at the critical points of the bulk and incremental loads.
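The resume does not show the implementation; as an illustrative sketch only (the helper `build_gather_stats_call` and the schema names are hypothetical), an automated statistics-gathering step for Oracle typically wraps the standard `DBMS_STATS.GATHER_SCHEMA_STATS` procedure, sizing the degree of parallelism to the host:

```python
import os

def build_gather_stats_call(schema, degree=None):
    """Build the PL/SQL block that gathers optimizer statistics
    for one schema. Degree of parallelism defaults to the host's
    CPU count, adapting the job to the machine it runs on."""
    if degree is None:
        degree = os.cpu_count() or 1
    return (
        "BEGIN\n"
        f"  DBMS_STATS.GATHER_SCHEMA_STATS(ownname => '{schema.upper()}',\n"
        f"                                 degree => {degree},\n"
        "                                 cascade => TRUE);\n"
        "END;"
    )

# Hypothetical 'DW' schema; the block would be executed via the
# database driver at the critical points of the load.
print(build_gather_stats_call("dw", degree=4))
```

Embedding such a call after the bulk load and before downstream queries keeps optimizer statistics fresh, which is the usual motivation for automating it.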

Reconstructed the build of the Data Warehouse, Data Mart, and Snapshot to support both denormalized and non-denormalized structures.

Provided automatic job generation procedures for the Data Warehouse and Data Mart.

Provided a comprehensive automatic dynamic recovery mechanism for the Data Warehouse, Data Mart, and Snapshot.

Provided a mechanism to automatically fire jobs in parallel across the various machine types, with the degree of parallelism based on the number of processors.
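The original mechanism is not shown; a minimal sketch of the idea (the `run_job` placeholder and the job names are hypothetical) sizes a worker pool to the machine's processor count and dispatches the jobs concurrently:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def run_job(job_name):
    # Placeholder: real code would launch an ETL job via the
    # scheduler or a Pentaho/Informatica command-line runner.
    return f"{job_name}: done"

def fire_jobs_in_parallel(jobs, workers=None):
    """Run jobs concurrently, sizing the pool to the host's processors."""
    workers = workers or os.cpu_count() or 1
    with ThreadPoolExecutor(max_workers=min(workers, len(jobs))) as pool:
        return list(pool.map(run_job, jobs))

print(fire_jobs_in_parallel(["dim_load", "fact_load", "snapshot_load"]))
```

Deriving the worker count from `os.cpu_count()` is what lets the same procedure run unmodified on machines of different sizes.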

Enhanced and improved the performance of all interfaces for the BWC client.

Established Procedures required to be run on every BI environment.

Dramatically improved the performance of the Data Warehouse and Data Mart (dimension and fact tables, as well as snapshot tables):

o Provided new indexes.

o Reconstructed the SELECT statements of selected transformations to minimize redundant outer joins.

o Refined the parallelism and cache for table access for a number of SELECT statements, implementing the "lessons learned" recommendations that came from the site's DBA.

o Established automatic parallelism of the dimension lookups (depending on the number of processors).

o Reconstructed the snapshot processes for Claims and Policies to work in parallel.

Provided documentation such as:

o Replacing the entire Pentaho catalog.

o Refreshing the Data Warehouse and the DM from scratch.

o Rebuilding the Data Warehouse and the DM from scratch.

o Recovery negative test scenarios.

Conversion system for a new prospective client:

o Provided conversion catalogs from the Stage DB/schema to the PowerSuite PCMP DB/schema.

o Provided two different catalogs: one for Oracle and one for MSSQL.

o Implemented with Pentaho 5.3, Oracle 11, and MSSQL 2012.

October 2012 – August 2013 – Epsilon

Position: ETL and Database Developer

Main Responsibilities:

Practiced ETL architecture, designed, developed and implemented ETL processes, requirements and standards for marketing database projects.

Developed and Programmed with UNIX scripting, Oracle 11G, PL/SQL complex SQL, SubVersion SVN, Toad.

Provided Tuning with OLAP Data Warehouse and Data Mart as well as OLTP Transactional Databases.

Expert in SQL, PL/SQL programming, optimization, and tuning. Improved the efficiency of the processes.

Provided complex PL/SQL, procedures, functions, dynamic SQL, indexes, partitioning, optimization, and tuning with Oracle databases - Data Warehouse and Data Mart.

Conducted appropriate testing of applications to ensure quality.

Significantly Improved Quality, hence Client Satisfaction:

o Enhanced processes in the Data Mart / Data Warehouse to work more efficiently and gain performance by applying MERGE, parallelism, hints, indexes, and cursors with BULK COLLECT, and by restructuring the SQL where needed.

o Noticeably enhanced the SFTP job to account for all types of failures, even when the failure is on the other end.

o Created generic procedures for directory cleanup with UNIX scripting.

o Dramatically improved project stabilization by conducting research to improve SQL and process efficiency and performance in the Data Mart / Data Warehouse (e.g., changed the DW table load to utilize MERGE).

o Improved the performance and run time of selected framework jobs (e.g., decreased a major process run time from 6 hours to 30 minutes), eliminating SLA violations.
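The work above used Oracle's MERGE; as a self-contained illustration of the same set-based load pattern (with hypothetical `stg_customer`/`dim_customer` tables), SQLite's `INSERT ... ON CONFLICT ... DO UPDATE` upsert replaces a slow row-by-row update/insert loop with one statement:

```python
import sqlite3

# Staging -> target incremental load, sketched with SQLite's upsert
# in place of Oracle MERGE. Table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customer (id INTEGER, name TEXT);
    CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO dim_customer VALUES (1, 'old name');
    INSERT INTO stg_customer VALUES (1, 'new name'), (2, 'fresh row');
""")

# One set-based statement: matched keys are updated, new keys inserted.
# (WHERE 1 disambiguates the SELECT from the ON CONFLICT clause.)
conn.execute("""
    INSERT INTO dim_customer (id, name)
    SELECT id, name FROM stg_customer WHERE 1
    ON CONFLICT(id) DO UPDATE SET name = excluded.name
""")

print(conn.execute("SELECT id, name FROM dim_customer ORDER BY id").fetchall())
```

Pushing the merge into a single SQL statement lets the engine parallelize and optimize the load, which is the usual source of the large speedups described above.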

Created new jobs and Fixed defects in existing jobs for Data Mart / Data Warehouse using Toad, Unix, SubVersion, JCS while maintaining high quality and performance efficiency to meet client requirements.

August 2009 – September 2011 – NBC - U.S. Department of the Interior

Finance and Procurement Systems Division at National Business Center. Contract position with SofTec / RTL.

Position: Software Expert

Main Responsibilities:

Designed Reports, Programmed and Maintained software of FFS - Federal Financial System on IBM Mainframe.

Provided Tuning with OLAP Data Warehouse and Data Mart as well as OLTP Transactional Databases.

Provided expertise in SQL and PL/SQL programming and tuning to improve the efficiency of the processes.

Developed complex PL/SQL, procedures, functions, reports, dynamic SQL, and indexes with Oracle Data Warehouse and very large databases.

Programmed, Performed complex data mining and ad hoc reports with COBOL, VSAM, File-AID/MVS, ORACLE, TOAD, BRIO, SQL, PL/SQL and Microsoft Access DB.

Analyzed and performed conversion of jobs / programs from DYL280 and SAS to COBOL on IBM Mainframe.

Developed and automated reports - saving NBC and its clients thousands of dollars in manual report preparation. Expertise was very valuable in NBC's major project with the deployment of the Financial Business Management System (FBMS). Provided help with data cleansing, developing queries to gather statistical data for NBC's metric reporting.

Redesigned and Reprogrammed an old legacy system tracking the costs of the Airplanes - Current Year versus Previous Year. This was done on PL/SQL on the ORACLE Data-Warehouse.

Designed Reports, Programmed, Maintained software of FFS - Federal Financial System through Complex PL/SQL, Procedures, Functions, Reports, Dynamic SQL, Indexes.

Tuned and optimized existing and new SQL and PL/SQL programs, anonymous blocks, and stored procedures to improve the efficiency of the processes.

June 1992 – August 2008 – AMDOCS Inc.

Position: Software Expert - Senior Team Leader

Various positions at the Israel Development Center and on relocation to various Amdocs customer sites in the US and Australia.

Main Responsibilities:

Performed Software Design, Development, Programming, Implementation in integrated environments.

Developed and maintained Batch and Online – Client/Server Applications and Integration Jobs using UNIX, COBOL, ORACLE, TOAD, SQL, SqlPlus, PL/SQL, Informix, DB2.

Developed Complex PL/SQLs, Procedures, Functions, Reports, Dynamic SQL, Indexes, Tuning, ERD Changes, Data Migration.

Performed Tuning with OLAP Data Warehouse and Data Mart as well as OLTP Transactional Databases.

Significantly improved the level of stability and reliability; troubleshot and resolved problems in Production and all other environments.

Liaised with client staff concerning software issues and customer training.

Planned future enhancements and software upgrades and releases.

Managed, trained, and supervised teams of programmers in software development, as well as teams of test designers and testers.

August 1984 – May 1992 – Solmal Data Processing Company Ltd. – Israel

Position: Designer and System Programmer, Team Leader (1-3 subordinates)

Main Responsibilities:

Significantly enhanced and improved systems stability and reliability, by redesigning and reprogramming several Legacy Systems.

Designed and programmed financial systems on MS-DOS, IBM Mainframe, and IBM 34/36/AS400.

Performed complex data mining and ad hoc reports (COBOL/RPG with VSAM files, as well as DB2).

Analyzed and directed the conversion from IBM-36 to IBM AS/400 and IBM PC systems.

Analyzed applications and system performance to identify improvements.

Managerial Experience

QA Senior Team Leader (5-10 subordinates)

Development Senior Team Leader (1-5 subordinates)

Education

Computer Software Engineering, Tel Aviv University, College of Engineering, Israel - Associate Degree.

Professional Courses

Pentaho Corporation - Pentaho and Hadoop Framework Fundamentals

ETL training:

o TDWI Dimensional Data Modeling.

o TDWI Dimensional Design.

o TDWI Predictive Analytics.

o Participated in the 4Sight BI training.

Data Processing, Systems Analyst certification, Sivan Computer College, Tel Aviv Israel

ORACLE DBA - OCP Certified, Microsoft SQL Server DBA - MCP Certified, Tel Aviv College, Israel

o Introduction to Oracle: SQL, PL/SQL

o Enterprise DBA: Architecture and Administration

o Enterprise DBA: Backup and Recovery

o Enterprise DBA: Performance and Tuning

o Enterprise DBA: Network and Administration

o Develop Data Models and Design Databases

Windows 2000 Server

Administering a Microsoft SQL Server 2000 Database

Microsoft ACCESS

UNIX

IBM-MVS(OS/390); IBM-AS400; IBM-DB2

WinRunner

Self-Training

Informatica Power Center.

o Informatica PowerCenter Designer.

o Informatica PowerCenter Workflow Manager.

o Informatica PowerCenter Workflow Monitor.

o Informatica PowerCenter Repository Manager.

C# 2010 and .NET 4.0 and Object Oriented Programming, Visual Studio 2010

o ADO.NET - Connected Layer

o ADO.NET - Disconnected Layer

o ADO.NET - Entity Framework

o WCF - Windows Communication Foundation

o WF - Windows Workflow Foundation 4.0

o WPF - Windows Presentation Foundation and XAML

o LINQ


