
Sr SAS DI Studio GRID Developer

Location:
Hawthorne, CA
Posted:
July 19, 2015


Resume:

Dwayne Hill

**** * ***** ******, ********* #C, Gardena, CA 90249

310-***-**** (Mobile)

acqtkj@r.postjobfree.com

PROFESSIONAL SUMMARY

Sr. SQL / SAS DI Studio Developer with twelve years of professional experience in technical leadership roles in financial services and big data analytics, serving as SAS Grid implementation and support manager, industry solutions architect, SAS platform administrator and developer, SAS principal, SAS trainer, and data warehousing consultant for Fortune 100 financial services companies.

This experience includes involvement in all phases of the software development life cycle: analysis, design, implementation, and system testing. Accustomed to communicating with cross-functional business units to gather business requirements and translate them into system solutions. Extensive experience in Windows and UNIX environments, managing multiple concurrent projects.

EDUCATION

University of Southern California, Los Angeles, CA

Bachelor of Arts, Mathematics – 9/1996 to 5/2000 (Graduated)

SKILLS

Tools: SAS/BASE, SAS/STAT, SAS/MACROS, SAS/GRAPH, SAS/ODS, SAS/ACCESS Interface to Oracle, SAS/ACCESS Interface to ODBC, SAS/ACCESS Interface to Teradata, SAS GRID, SAS/ENTERPRISE GUIDE, SAS/Connect to Oracle on Unix Server (AIX Version 6.1), SAS/Data Integration Studio, Web Studio, Information Maps, SQL, PL/SQL, T-SQL, PL/SQL Developer, TOAD, Oracle 11g/10g/9i, SQL Developer, Import/Export tools, SQL*Loader, SQL*Navigator, EXCEL, DataFlux, ACCESS, PowerPoint, Crystal Reports, MicroStrategy, CRM, Unica Affinium, Business Objects, and Teradata SQL Assistant

Databases: Oracle, Oracle Clinical, Teradata, MS SQL Server, Netezza, Sybase, SAS data sets, Excel, and Access

Environments: Windows and UNIX

PROFESSIONAL EXPERIENCE

Visa

01/2015 – 07/2015

Sr. SAS Consultant

Responsible for effective operation of the Central Analytics Environment (CAE).

Responsible for integration of the SAS Grid with the Cloudera Hadoop implementation.

Responsible for SAS middle-tier implementation.

Administering the SAS 9 software environment, including installation, configuration, tuning, upgrades, and hot fixes; developing and documenting administration processes and system configuration.

Monitoring and logging SAS servers, tuning servers for optimal performance, and managing storage to do more with less.

Monitoring and managing hosts, jobs, users, and queues in the SAS Grid environment (see the grid-submission sketch at the end of this section).

Evaluating newer versions of the SAS Grid environment and migrating to them.

Designing, implementing and maintaining the security framework for CAE.

Recommending appropriate tools and technology for modeling and analysis work.

Developing and delivering SAS training including SAS Grid for end users and administrators.

Optimizing use of environment resources for business.

Actively planning and prioritizing requests for resources.

Serving as the SAS SME supporting the CAE business user community.

Evaluating new software.

Supporting all production and development environment-related issues.

Applying SAS Analytics administration expertise to debug user issues.

Performing daily SAS/UNIX system administration tasks, including file system performance monitoring and reviewing OS patches.

Resource and infrastructure planning.

Environment: SAS Grid, Platform LSF, Management Console (with Metacoda), SASGSUB, Enterprise Guide, SAS Enterprise Miner, SAS EBI, Web Report Studio, DI Studio, Add-In for Microsoft Office, Stored Processes, OLAP Cube Studio, Analytics Pro, Base, Macros, Connect, ODS.
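
For illustration, a minimal sketch of fanning work out across the grid from a SAS/CONNECT session, assuming a grid-launched logical server named SASApp; the session names and the sashelp tables are placeholders, not actual Visa jobs.

/* A sketch only: enable grid launching of SAS/CONNECT sessions           */
/* (assumes a grid-launched logical server named SASApp in metadata)      */
%let rc = %sysfunc(grdsvc_enable(_all_, server=SASApp));

options autosignon;                /* sign on automatically at RSUBMIT    */

rsubmit task1 wait=no;             /* run one unit of work on a grid node */
   proc means data=sashelp.class;
   run;
endrsubmit;

rsubmit task2 wait=no;             /* run a second unit of work in parallel */
   proc freq data=sashelp.cars;
      tables make;
   run;
endrsubmit;

waitfor _all_ task1 task2;         /* block until both grid jobs finish   */
signoff _all_;                     /* release the grid sessions           */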

WineHouse Los Angeles

05/2014 - 09/2014

SAS BI Developer

Access data from the WIN08-PDCVH database (MS SQL Server)

Perform weekly checks on the WIN08-PDCVH database to validate Wine House SQL (POS database) replication to the SAS server (WHSASSRV); see the row-count sketch at the end of this section

Manage data across the following WIN08-PDCVH tables: Category, Customer, Department, Item, Quantity Discount, Supplier, Supplier List, Transaction, Transaction Entry

Check that SAS processes run error-free by reviewing SAS log files and underlying datasets

Check the Wine House SAS Portal via the web browser to ensure the presentation layer (portal reports) displays correct data

Environments: SAS Base, SAS macros, SAS procedures, SAS DATA step
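
As a sketch of the weekly replication check described above, assuming a SAS/ACCESS to ODBC connection to the WIN08-PDCVH POS database; the DSN, credentials, and the choice of tables are illustrative.

/* A sketch only: hypothetical ODBC connection to the WIN08-PDCVH POS database */
libname pos odbc dsn=WIN08_PDCVH user=sasbatch password="XXXXXXXX";

/* Compare source row counts for two of the replicated tables */
proc sql;
   create table work.replication_check as
   select 'Customer' as table_name, count(*) as src_rows from pos.Customer
   union all
   select 'Item'     as table_name, count(*) as src_rows from pos.Item;
quit;

proc print data=work.replication_check;
   title 'Weekly WIN08-PDCVH replication row counts';
run;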

Kaiser Permanente

05/2013 – 11/2013

Health Informatics Services – North West IT

SQL/SAS DI Studio Developer

Convert and streamline SAS projects and programs/analyses to run intuitively on UNIX, PC, and mainframe in batch mode.

Installed and maintained SAS GRID client applications like Base SAS, Enterprise Guide, and Enterprise Miner on user machines as needed.

Troubleshooting client applications such as Enterprise Guide and Base SAS in the GRID environment.

Involved in gathering and analyzing business requirements and in the detailed technical design of the ETL to support data for OpQ – Specialty Care and OpQ – Primary Care

Involved in constructing the ETL process: extracting data from multiple databases (SQL Server, Oracle, Teradata) using SAS DI Studio, SAS/Access, and SAS SQL procedures; creating permanent SAS data sets within UNIX directories using transformations according to the business requirements; and loading data into the appropriate OpQ data sets and OpQ Oracle history tables (see the extraction sketch at the end of this section)

Perform ETL tuning for OpQ by using lookups instead of joins where possible, removing unnecessary components, and making use of indexes on the tables

Register multiple data sources (raw external files, datasets, database tables) into metadata folders using DI Studio

Create and modify metadata definitions for raw external files, datasets, and database tables using DI Studio

Convert SAS DI Studio jobs into LSF flows that can be automatically triggered by time or file events to run on Platform Process Manager Server

Developed packages, SQL scripts, and database triggers to populate historic data in the staging OpQ Oracle history tables

Developed structural changes to database objects according to business logic

Used SQL*Loader scripts to load the flat files into the stage OpQ Oracle Database

Performed application SQL Tuning using SQL Trace and Explain plan

Created simple and complex views

Used various SQL Expressions in standalone procedures and functions

Wrote complex SQL scripts and was POC for performance tuning of SQL queries

Used Oracle Data Pump Export/Import utilities to move data to different testing environments

Worked on Unix Shell Scripting for scheduling batch jobs using Crontab

Environments: SAS DI Studio, SAS/Access, SAS SQL, UNIX, Oracle 11g/10g, SQL, PL/SQL, SQL*Plus 9.2.0.3, TOAD 10.6, SQL Developer 3.1, PL/SQL Developer, Visio
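
A minimal sketch of the kind of extraction step behind the OpQ ETL described above, using explicit SQL pass-through to Oracle and landing a permanent SAS data set on UNIX; the schema, table, column, and path names are illustrative, not the actual Kaiser objects.

/* A sketch only: extract an OpQ source table from Oracle and land it as a */
/* permanent SAS data set on UNIX (all names below are illustrative)       */
libname opqds "/data/opq/datasets";

proc sql;
   connect to oracle (user=opq_etl password="XXXXXXXX" path=ORCL);
   create table opqds.specialty_care_visits as
   select * from connection to oracle
      ( select visit_id, member_id, dept_code, visit_dt
        from   opq_stage.specialty_care_visits
        where  visit_dt >= date '2013-01-01' );
   disconnect from oracle;
quit;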

Pinnacle Solutions

SAS Developer

Responsibilities include producing a variety of reports useful to the airline industry on an EBI-based portal

Sourcing data from DOT (Department of Transportation)

Downloading, cleaning, and transforming this data (e.g., flights, airports, passenger load, schedules, and financial information) to create the reporting datasets that underlie the EBI-based portal (see the import sketch at the end of this section)

Assisting in revamping the complex legacy version of the cleaning and transformation process into a more automated, SAS-based process

Providing new airline schedules data monthly

Environments: SAS Base, SAS macros, SAS procedures, SAS DATA step, EBI, DOT
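
For illustration, a minimal sketch of importing and cleaning a downloaded DOT extract in Base SAS; the file path and column names are assumptions, not the actual DOT layout.

/* A sketch only: import a downloaded DOT extract (illustrative path and fields) */
proc import datafile="/data/dot/ontime_2013_01.csv"
            out=work.ontime_raw dbms=csv replace;
   guessingrows=5000;
run;

/* Basic cleaning: keep reporting fields and standardize carrier codes */
data work.ontime_clean;
   set work.ontime_raw;
   carrier = upcase(strip(carrier));
   if not missing(flight_date);
   keep carrier flight_date origin dest dep_delay arr_delay;
run;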

Freedom Specialty Insurance - New York, NY

Claims and Reporting/ SQL/SAS DI Studio Developer

Develop SAS programs using Enterprise Guide to prepare data, develop, evaluate and validate predictive models

Extensively used SAS Grid architecture on UNIX/Linux to perform job and workflow scheduling

Design and build highly interactive SAS applications using SAS BI suite and OLAP tools

Create interfaces between UNIX, Window and Web environments using SAS programs, DI Studio Jobs, and tools to obtain and maintain data from external vendors.

Provided programming/implementation support for various SAS tools (SAS Forecast Studio, SAS Enterprise Miner, SAS Enterprise Guide, Base SAS, SAS Grid)

Create, refine and manipulate SAS reports, graphs and dashboards using Web Report Studio

Sourcing data from Main SCA, ADR, Derivative Shareholder Actions, Investment Shareholder Actions, and Investment Banking E&O databases within Freedom Specialty’s Peregrine Model.

Extract, Transform, and Load data into Main SCA, ADR, Derivative Shareholder Actions, Investment Shareholder Actions, and Investment Banking E&O databases

Developed Packages, SQL Scripts and Database Triggers to populate the historic data in stage Oracle SCA Litigations Database History Tables

Developed structural changes to database objects according to business logic

Used SQL*Loader scripts to load the flat files into the stage Oracle SCA Litigations Database

Performed application SQL tuning using SQL Trace and Explain Plan (see the sketch at the end of this section)

Created simple and complex views

Used various SQL Expressions in standalone procedures and functions

Wrote complex SQL scripts and was POC for performance tuning of SQL queries

Used Oracle Data Pump Export/Import utilities to move data to different testing environments

Worked on Unix Shell Scripting for scheduling batch jobs using Crontab

Environments: SAS DI Studio, SAS/Access, SAS SQL, SAS BI suite and OLAP tools, UNIX, Main SCA, ADR, Derivative Shareholder Actions, Investment Shareholder Actions, and Investment Banking E&O Oracle databases, Oracle 11g/10g, SQL, PL/SQL, SQL*Plus 9.2.0.3, TOAD 10.6, SQL Developer 3.1, PL/SQL Developer, Visio
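
As a sketch of the Explain Plan tuning step noted above, run from SAS with explicit SQL pass-through to Oracle; the connection details, table, and columns are illustrative.

/* A sketch only: review an Oracle execution plan from a SAS session */
proc sql;
   connect to oracle (user=claims_rpt password="XXXXXXXX" path=ORCL);

   /* Ask Oracle to build an execution plan for the candidate query */
   execute ( explain plan for
             select claim_id, filing_dt, settlement_amt
             from   sca_litigation_hist
             where  filing_dt >= date '2012-01-01' ) by oracle;

   /* Pull the formatted plan back into SAS for review */
   select * from connection to oracle
      ( select * from table(dbms_xplan.display) );

   disconnect from oracle;
quit;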

Bank of America

07/2011 – 07/2012

IT– Home Loans/Credit Risk – Enterprise Data Warehouse Group

Sr. SAS DI Studio Developer

Responsibilities include extracting consumer home loans data from Teradata, Oracle, and SQL Server databases via SAS Enterprise Guide 4.3 and SAS Data Integration Studio 4.3 clients.

Employing DataFlux Data Management Server 2.5 and DataFlux Data Management Studio 2.5 to standardize and validate the quality of geographic and risk dimension data elements, their values, and formats across SQL Server, Netezza, Teradata, and SAS data marts for data aggregation and cross-referencing (merges).

Successfully implemented SAS GRID and performed migration from a stand-alone system, which drastically reduced analytical model development time and helped increase productivity

FTP SAS code (native to PC processing) from the SASPLEX remote server to the HLI LINUX environment to test cross-platform functionality and improve run-time efficiency.

Use the SAS/ACCESS engine, SAS Grid computing, and SAS Scalable Performance Data Server (SPDS) to process large volumes of data on the server; fit explanatory models and investigate multivariate relationships

Creating month-end prototypes of Excel tables and graphs in support of the SRR project using HTML tag scripting in SAS Enterprise Guide 4.3

Accessing RCHTERA (Teradata), Lynx (SQL Server), Pyxis (SQL Server), Hydra (SQL Server), Odysseus (SQL Server), and Laser (Oracle) databases via SAS Enterprise Guide 4.3, SAS Data Integration Studio 4.3, Toad for SQL Server, and Teradata SQL Assistant to map target data elements to source data elements (see the Teradata sketch at the end of this section).

Environments: RCHTERA (Teradata), Lynx (SQL Server), Pyxis (SQL Server), Hydra (SQL Server), Odysseus (SQL Server), and Laser (Oracle) databases via SAS Enterprise Guide 4.3, SAS Data Integration Studio 4.3, Toad for SQL Server, and Teradata SQL Assistant
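
For illustration, a minimal sketch of pulling a subset of loan-level data from the RCHTERA Teradata warehouse into SAS; the credentials, database, table, and column names are assumptions.

/* A sketch only: illustrative SAS/ACCESS to Teradata connection */
libname rchtera teradata server=rchtera user=hl_etl password="XXXXXXXX"
        database=home_loans;

/* Subset the loan-level table, then land it as a SAS data set */
proc sql;
   create table work.loan_extract as
   select loan_id, orig_dt, upb_amt, risk_grade
   from   rchtera.loan_master
   where  orig_dt >= '01JAN2011'd;
quit;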

Wells Fargo (Wachovia) Dealer Services

02/2010 – 07/2011

IT – Data Warehouse - Enterprise Data Integration Group, SQL/SAS Developer/SAS Administrator

Responsibilities include extracting data from wds/wfs local and remote SAS servers via SAS Enterprise Guide 4.0 (Connect to Oracle).

Formatting and manipulating data according to business requirements and audiences.

Employing DataFlux Data Management Server 2.5 and DataFlux Data Management Studio 2.5 to standardize and validate the quality of geographic data elements, their values, and formats across Oracle, Teradata, and SAS data marts for data aggregation and cross-referencing (merges).

Building month-end datasets in SAS to report on expected loss for auto and unsecured loan accounts (see the month-end sketch at the end of this section).

Upgraded the SAS data warehouse from Enterprise Guide to DI Studio, converting SAS Base code into DI Studio jobs.

Importing loss forecast model data from Informatica into SAS.

Subsetting, sorting, and indexing individual datasets for use in STAR Basel and New Loss Forecasting models.

Creating and maintaining database model documentation for moderate complexity projects, and gathering requirements.

Assisting more senior database analysts in the development and implementation of data solutions.

Developed structural changes to database objects according to business logic

Used SQL*Loader scripts to load the flat files into the stage Oracle Database

Performed application SQL Tuning using SQL Trace and Explain plan

Created simple and complex views

Used various SQL Expressions in standalone procedures and functions

Wrote complex SQL scripts and was POC for performance tuning of SQL queries

Used Oracle Data Pump Export/Import utilities to move data to different testing environments

Worked on Unix Shell Scripting for scheduling batch jobs using Crontab

Environments: SAS DI Studio, SAS/Access, SAS SQL, SAS BI suite and OLAP tools, UNIX, Oracle 11g/10g, SQL, PL/SQL, SQL*Plus 9.2.0.3, TOAD 10.6, SQL Developer 3.1, PL/SQL Developer, Visio
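
A minimal sketch of the month-end build pattern described above: resolve the prior month-end date into a macro variable, then roll loan-level expected loss up to a summary. The library, data set, and variable names are illustrative.

/* A sketch only: resolve the most recent month-end date at run time */
%let monthend = %sysfunc(intnx(month,%sysfunc(today()),-1,end),date9.);
%put NOTE: Building expected-loss summary for &monthend.;

/* Roll loan-level expected loss up to portfolio level as of month end */
proc means data=edw.auto_loans noprint;
   where  snapshot_dt = "&monthend."d;
   class  portfolio;
   var    expected_loss;
   output out=work.el_monthend sum=total_expected_loss;
run;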

WellPoint

10/2009 – 12/2009

Individual Membership Retention – Metrics and Data for Membership and Enrollment / SAS Programmer/SAS Developer

Responsibilities included serving as an expert in data analysis and reporting, formulating recommendations, and providing guidance to other data analysts.

Primary duties included, but were not limited to, creating and maintaining databases to track business performance.

Analyzed data and summarized performance using summary statistical procedures (see the sketch at the end of this section).

Additional responsibilities involved developing and analyzing business performance reports (e.g. for claims data, provider data, utilization data).

Providing notations of performance deviations and anomalies, creating and publishing periodic reports, as well as any necessary ad hoc reports.

Taking business issues and devising best ways to develop appropriate diagnostic and/or tracking data that will translate business requirements into usable decision support tools.

Required to make recommendations based upon data analysis and provide analytic consultation to other business areas, leadership or external customers.

Environments: SAS Base, SAS macros, SAS procedures, SAS DATA step, MS Access
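
For illustration, the kind of summary-statistics step used to track membership performance; the input data set and variables are assumptions.

/* A sketch only: summarize retention metrics by plan type (illustrative names) */
proc means data=retain.member_activity n mean min max maxdec=2;
   class plan_type;
   var   months_enrolled premium_amt;
run;

/* Frequency of disenrollment reasons for the same period */
proc freq data=retain.member_activity;
   tables disenroll_reason / nocum;
run;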

Client: Amgen - Thousand Oaks, CA

08/2008 – 10/2009

Planning and Platform Services – Clinical Metrics and Reporting/Business Analyst (SAS)

Responsibilities included designing, generating, validating, and managing reports in SAS Base 9.1, Microsoft Office Excel 2003, and Cognos Report Builder.

Provided input in the implementation of new technologies and systems.

Liaised between business and IS, developing user acceptance tests in SAS, verifying data integrity across SAS datasets/tables and Cognos reports (see the comparison sketch at the end of this section), and managing system configuration.

Provided ongoing data support, troubleshooting and debugging previously written SAS code, and tracking changes, issues, and recommended enhancements to reports and systems.

Performing data entry, capturing departmental metrics, and assisting in data analysis of Protocol/Site/Subject data as it relates to CRO, and FSP designations.

Highlighting issues regarding data quality and integrity; and completing action items that arise from departmental project needs.

Additional responsibilities included but are not limited to assisting department with internal/external activities and special projects as requested.

Environments: SAS Base, SAS macros, SAS procedures, SAS DATA step, Microsoft Office Excel 2003, and Cognos Report Builder.
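
A minimal sketch of the data-integrity check used during UAT, comparing a SAS analysis data set against an extract of the corresponding Cognos report; both inputs and the key variables are illustrative.

/* A sketch only: sort both sources on the common key before comparison */
proc sort data=metrics.protocol_site_subject out=work.sas_side;
   by protocol_id site_id;
run;
proc sort data=uat.cognos_extract out=work.cognos_side;
   by protocol_id site_id;
run;

/* Flag any row- or value-level differences between the two sources */
proc compare base=work.sas_side compare=work.cognos_side
             out=work.diffs outnoequal noprint;
   id protocol_id site_id;
run;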

Employer: Warner Bros. Home Entertainment Inc., Warner Home Video Division 07/2007 – 10/2007

Burbank, CA. - Supervisor, Sales Planning and Analysis – Temporary/Contract

Responsibilities included determining the appropriate forecasting method, predictor variables, and statistical model validation techniques for catalog titles.

Provided POS and Shipment Base Forecasts for all active/pre-active titles.

Assisted in forecast promotion projects to VMI accounts.

Worked with manager to determine the appropriate forecasting method and forecast validation technique.

Communicated catalog forecast/actual results to Account Executives, the Manager of Catalog Forecasting, the Director of Sales Forecasting, and WHV Executive Management, and updated the Daily Seasonal Index for VMI accounts (see the sketch at the end of this section).

Additional responsibilities included ensuring company goals were consistent with departmental forecast, creating and maintaining hybrid data sets such as Teradata (Bus. Obj., MSI, SAS) and Local (SAS, Access) for WHV and competitive (Nielsen) Catalog titles.

Used SAS DI Studio in this role.

Worked with VMI, IT groups, and customers to develop sources for promotion and re-price information.

Worked with IT in defining key metrics for causal model development, developed Post Mortem analysis for base forecasts to monitor forecast accuracy and improve models based on research findings.

Designing automated sensitivity analyses of forecasting processes, including model selection and testing, and defining and executing summary reports, including Top Line Summary Report.

Environments: Business Objects, MSI, SAS Base, SAS macros, SAS procedures, SAS DATA step
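
For illustration, one way a day-of-week seasonal index could be refreshed in Base SAS from daily POS history; the data set and variable names are assumptions.

/* A sketch only: refresh a day-of-week seasonal index from daily POS history */
/* (whv.pos_daily, weekday, and units_sold are illustrative names)            */
proc means data=whv.pos_daily noprint;
   class weekday;
   var   units_sold;
   output out=work.means mean=mean_units;
run;

data work.seasonal_index;
   retain overall_mean;
   set work.means;
   if _type_ = 0 then overall_mean = mean_units;   /* overall mean row    */
   else do;                                        /* one row per weekday */
      seasonal_index = mean_units / overall_mean;
      output;
   end;
   keep weekday mean_units seasonal_index;
run;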

Employer: Kelly Scientific Resources (Agency), 1/2006 – 3/2006

Client: Getty Conservation Institute - Los Angeles, CA – Research Laboratory Assistant/Data Analyst – Temporary/Contract

Responsibilities included utilizing a PC SAS programming environment at the Getty Conservation Institute.

These included processing incoming environmental data using a set of existing programs and modifying those programs as needed, using SAS Base and SAS/GRAPH (see the plotting sketch at the end of this section).

Additional responsibilities included assisting in compiling, analyzing, and interpreting results, contributing to reports on research results, and assisting scientists with basic laboratory work. The use of SAS Base and SAS/GRAPH, MS Office suite programs, and Adobe Photoshop was required to competently perform basic scientific support functions.

Familiarity with AutoCAD and the Java programming language was useful in updating web templates that presented monthly analyses of environmental data results.

Environments: SAS Base, SAS macros, SAS procedures, SAS DATA step, MS Office suite programs, and Adobe Photoshop
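
As a sketch of the monthly environmental-data plots produced with SAS/GRAPH, assuming a data set of logged readings; the library, variables, and output path are illustrative.

/* A sketch only: plot monthly relative-humidity readings with SAS/GRAPH */
filename rhplot "/reports/rh_trend.png";
goptions device=png gsfname=rhplot gsfmode=replace;

symbol1 interpol=join value=none;                     /* join points, no markers */
axis1 label=(angle=90 'Relative humidity (%)');

proc gplot data=env.readings;
   where site = 'Gallery-1';
   format reading_dt date9.;
   plot rel_humidity * reading_dt / vaxis=axis1;
run;
quit;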

Employer: Office Depot/Viking Office Products, 3/2002 - 7/2005

Delray Beach, FL. - Sr. Analyst

Responsibilities included a wide range of projects, from budgeting to test analysis, as well as ad hoc projects from different marketing business units.

Statistical Software packages used to carry out these projects were SAS base, SAS macros, SAS proc procedures, and SAS graph.

Used Extract, Transform, Load (ETL) processes.

These tools facilitated development work aimed at improving financial revenue and performance.

Another primary responsibility for this position involved developing and executing analysis of retention initiatives using automated reporting built from SAS base programming. This was an ongoing effort that was accomplished on a quarterly basis.

This position was also responsible for the determination of sales forecasting and seasonal strategy by product category.

Utilizing various analyses from product managers and linear/logistic regression techniques in SAS Base, created and automated a product affinity database to report sales metrics by product on a quarterly basis (see the regression sketch at the end of this section).

High priority was given to all stages of the catalog campaign process.

These analyses were created from automated reports built using a combination of SAS base and ESSBASE reports in EXCEL.

Through these analyses, a targeted group/universe of customers was selected from a master database of customer accounts and filtered further at the request of the Marketing Manager to produce a customer file relevant to the season and catalog type being mailed.

After creating this customer file, performed audits and QA analysis on the file to ensure that the customer selection matched the selection criteria. This also was accomplished through SAS base. This file was placed on an FTP server and electronically sent to a processor where physical catalogs were created.

Environments: Excel, SAS Base, SAS macros, SAS procedures, SAS DATA step, and SAS/GRAPH.
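
A minimal sketch of the logistic-regression step behind the retention and affinity analysis; the customer-level data set and predictors are illustrative.

/* A sketch only: model probability of customer retention from order behavior */
proc logistic data=mktg.customer_quarter;
   class segment / param=ref;
   model retained(event='1') = recency_days order_count avg_order_amt segment;
   output out=work.scored p=p_retained;   /* scored probabilities for reporting */
run;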

PROFESSIONAL REFERENCES

Sahay Anthonisami, US Tech Solutions, Work: (201) 524 – 9600 Ext. 315, Mobile: (973) 337 – 0096

Harold Kurt, Kaiser Permanente, Work: (503) 813 – 4096

Rob Craige, Bank of America, Work: (980) 387 – 7783

Mary Webb, Recruit Networks, Work: (949) 922 – 1470

Latha Asuri, Wells Fargo Dealer Services, Work: (949) 753 – 3207

Edward Suda, Wells Fargo Dealer Services, Work: (949) 753 – 3207

Pierre Howard, ACT-1/Apple One, Work: (805) 277 – 5719

Tracy Sparks, Amgen, Work: (805) 447 – 6149

Lindsay Lippert, Amgen, Work: (805) 447 - 6149


