
Data Developer

Location:
Fairfield, OH, 45014
Posted:
June 08, 2020


Joseph Bryan Bates

** ******** ******, ********* ** 45014

[Cell] 702-***-**** [Email] addovn@r.postjobfree.com

Senior Data Architect/Developer, Data Warehouse Architect/Developer, ETL Developer, Database Developer

Highly adept IT professional who facilitates business systems requirements gathering, safeguards data integrity and performance, and delivers successfully developed solutions.

Broad-minded, dedicated and highly proficient IT professional with experience in Systems Architecture, Solution Provider Consultant and Senior Developer roles. Expert at optimizing business performance while leading full life cycle development initiatives, contingency planning and product research. Excellent problem-resolution, troubleshooting and collaboration skills applied toward organizational goals. Confident and sincere interpersonal communication style.

Requirements Analysis / Data Architect / Data Modeling / Data Flow Analysis / Database Design & Administration / Data Warehouse Architect / Identity Analytics Specialist / Integration Architect / Full Life Cycle Development / Development, Testing & QA / Implementation Documentation / Client Training & Presentation

Technical Proficiencies:

Programming Languages:

MIST, BIML, R, C++; Visual C#, Visual Basic; Visual J#, Visual J++, Visual InterDev; Visual Studio .Net 2005/2003, Visual Studio .Net, Visual Studio 4; ASP.Net; Java; JavaScript; Java Server Pages (JSP); XML; XSL; MSXML; HTML; DHTML; COM/DCOM; Windows 32 API & Registry; VBScript; WMI; SQL; SQL Server DTS; ADODB/ADO.Net; Clipper; QMF; MMS; Ledger Warehouse; Rexx; Command.Com; Virtual Machine; Winsock 32 API; Lotus Notes via OLE and API; Python.

Databases:

MS SQL Server 2016, 2014, 2012, 2008, 2005, 2000, 7.0 & 6.5; MS Access 97 to 2010; Oracle 7.3 & 8 Enterprise Edition; Sybase; dBase 3+ to 5.0; DB2; Informix

Object Access:

SSIS; DTS (Data Transformation Services); ADO; ADO.Net; RDO; ActiveX; Java conduits; IBM Client Access; various ETL tools; Cognos; SQL and SQL-92 (including the manipulation of cursors); Oracle PL/SQL; SQL*Plus; stored procedures, triggers and user-defined functions/data types.

Architecture:

Very large scale database architecture (2TB+); “Follow the Sun” support methodology; cross-platform integrations; rapid-response and zero-fault-tolerance data-critical architectures; performance-optimized architectures; database farm architecture; active/passive multi-server architecture; migration architecture from physical hard drives to solid state drives (memory drives, yielding a 1000% increase in performance).

ER/Win, PowerDesigner & Visio Database data modeling and data mapping; “pen and pencil” analysis; UML packages and other business analysis tools. Metadata Modeling, ER Diagramming and other flowcharting techniques.

OLAP

BIML (Business Intelligence Markup Language); Mist 4; Excel & Access OLE DB; XML; Crystal Reports; Cognos and PowerPlay; SQL Server 6.5, 7.0, 2000, 2005, 2008; SSIS (10 years), SSRS (10 years), SSAS (6 years), SSNS (6 years; since replaced by StreamInsight); MicroStrategy; IBM Identity Analytics

Professional Experience

GE Military Aviation Division 12/2019 – Present

ETL ARCHITECT

• Responsible for re-architecting ETL solutions to begin migration to a cloud-based solution on either AWS or MS Azure.

• Helped restructure the CVS repository for Java code to bring it up to GE's current standards of practice.

• Worked with engineers to design reports for metrics on military aircraft engines.

• Fixed errors in Java programs that had become obsolete due to the era in which they were written and the Java upgrades that have occurred since.

• Optimized queries on Oracle databases to be more efficient and run considerably faster than previous iterations.

• Troubleshot issues with Oracle, Java and Informatica to resolve critical problems.

• Documented all aspects of code, ETL and solutions, then disseminated the documentation to management, clients and peer developers.

Sugar Creek 07/2019 – 10/2019

ETL ARCHITECT

• Responsible for converting T-SQL scripts to SSIS for loading data warehouses and data marts; optimizing code for accuracy and efficiency; performance-tuning SQL Server, data marts, and queries; and pointing out potential trouble areas.

• Optimized an existing data warehouse load from approximately 90 to 120 minutes down to 5 minutes, utilizing SSIS with parallel processing and optimized data loading.

• Created custom logging that recorded start and stop times for each table, from staging to production.

• Using Visual Basic.Net, created summary emails sent after each ETL run to inform administrators of process start, stop, run times and success or failure. Created custom error handling that retried failed processes and sent failure alerts to the appropriate parties.

• With Visual Basic.Net and C#.Net, created step-by-step XML logging so the entire run of the ETL processes could be traced, to analyze areas needing improvement or restructuring for better efficiency.

• Created proper indexing on a multi-billion-row table to allow timely querying: a count that previously took 7 to 9 minutes now returns in under 30 seconds, even for complex aggregates within time-range periods.
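The per-table logging-with-retry pattern described in the bullets above can be sketched as follows. This is an illustrative Python/sqlite3 sketch only; the original was built in T-SQL, SSIS and VB.Net, and the table, column and function names here are hypothetical:

```python
import sqlite3
import time

# Hypothetical log table modeled on the "start/stop time per table" logging
# described above (names are illustrative, not from the actual project).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE etl_log (
    table_name TEXT, started REAL, finished REAL, status TEXT)""")

def run_with_logging(table_name, load_fn, retries=2):
    """Log start/stop times for one table load, retrying before giving up."""
    for attempt in range(retries + 1):
        started = time.time()
        try:
            load_fn()  # the real system would run the SSIS load step here
            conn.execute("INSERT INTO etl_log VALUES (?, ?, ?, 'success')",
                         (table_name, started, time.time()))
            return True
        except Exception:
            conn.execute("INSERT INTO etl_log VALUES (?, ?, ?, 'failure')",
                         (table_name, started, time.time()))
    return False  # the caller would send a failure alert email here

run_with_logging("dim_customer", lambda: None)
rows = conn.execute("SELECT table_name, status FROM etl_log").fetchall()
print(rows)  # [('dim_customer', 'success')]
```

The same log rows feed both the summary emails and the restart logic: a failed table can be re-run in isolation because its status is recorded independently.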

Applied Systems 04/2014 – 06/2019

BI Team Lead

Applied Systems develops software for the US, Canadian and European insurance markets.

• Converted the legacy TAM system (DBASE III), which used Pervasive for data migration, to SSIS, migrating to the new EPIC system (SQL Server). The previous Pervasive code took approximately a full day for a single client conversion; after converting to the SSIS architecture, the average client took 1 hour.

• Utilized Visual Basic and Visual C# to create complex data transformations for manipulation of data.

• The process of taking the DBASE database at the start and ending up with client data in the EDW involved multiple steps. I created a master SSIS package that ran all of these steps for each client, with scheduling, parallel processing, database backup, error recovery, step-by-step logging, restart capability, and metrics reporting. This turned what had been half a day of manual monitoring and execution by a single person into a fully automated process with email alerts for status, failures and metrics. Every option is configurable through a database back end, per server.
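The master-package orchestration described above (many clients, parallel workers, per-client success/failure collection for the status email) can be sketched like this. A hedged Python sketch only; the real orchestration was an SSIS package, and the step and client names are invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Hypothetical conversion steps, standing in for the real SSIS steps.
STEPS = ["backup", "stage", "transform", "load", "verify"]

def convert_client(client_id):
    """Run every step for one client; return the per-step log."""
    log = []
    for step in STEPS:
        log.append((client_id, step))  # the real code executes the step here
    return client_id, log

def run_all(client_ids, workers=4):
    """Convert clients in parallel, collecting results and failures."""
    results, failures = {}, []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(convert_client, c): c for c in client_ids}
        for fut in as_completed(futures):
            try:
                cid, log = fut.result()
                results[cid] = log
            except Exception:
                failures.append(futures[fut])  # feeds the failure alert email
    return results, failures

results, failures = run_all(["acme", "globex"])
print(sorted(results), failures)  # ['acme', 'globex'] []
```

Keeping results and failures separate is what allows restart capability: only the clients in the failure list need to be re-submitted.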

• Evaluated interns and made hiring recommendations. Once hired, I was responsible for teaching the new hire (who had no BI exposure or IT work experience and was still in school) Business Intelligence, ETL, SSIS development, methodology, and architecture.

• Frequently required to create or update .Net code for several applications, ranging from the app that parses DBASE .dbf files and imports them into SQL Server, to a package-versioning app that displayed the version and build dates for all SSIS packages, aiding version control and verification that the correct .dtsx files were deployed.

• Mentored my boss, the VP of Data Migrations, in advanced SSIS and SQL concepts.

• Assisted junior developers with advanced programming concepts and with troubleshooting problems in their projects.

• Acted as the general go-to resource for Microsoft Business Intelligence products, .Net code, SQL queries and query optimization, and SQL Server administration.

Adreima (Contractor) 05/2013 – 08/2013

Database Architect/Business Intelligence Developer

Adreima is a third party Medicare/Medicaid billing processor that also handles legal and clinical trials for dozens of hospital groups across the US.

Optimized a SQL Server instance to show more than a 300% increase in performance by eliminating extensive use of cursors, optimizing stored procedures and removing obsolete triggers.
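The cursor-elimination idea mentioned above boils down to replacing row-by-row processing with a single set-based statement. A minimal sketch using Python's built-in sqlite3 module (the schema and multiplier are hypothetical; the original work was in SQL Server stored procedures):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO claims (amount) VALUES (?)",
                 [(i,) for i in range(1000)])

# Cursor-style processing: one statement per row, as the original
# stored procedures did. Each row costs a separate round trip.
for (row_id,) in conn.execute("SELECT id FROM claims").fetchall():
    conn.execute("UPDATE claims SET amount = amount * 1.1 WHERE id = ?",
                 (row_id,))

# Set-based form: one statement touches every row at once, letting the
# engine optimize the whole operation. This is the optimized replacement.
conn.execute("UPDATE claims SET amount = amount * 1.1")

total = conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
print(total)  # 1000
```

Both passes produce the same data change; the set-based one simply does it in a single statement, which is where the performance gain on a real server comes from.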

Created automated SSIS packages to import data from client systems delivered in flat-file format, each client with different specifications. This improved on the previous import procedures, cutting approximately 30 minutes per 1,000 records to under 1 minute while maintaining all business logic.

Trained other employees to use SSIS to simplify their data projects.

Implemented database standards to be followed by the development team for optimized query times.

Bechtel Corporation (Contractor) 01/2013 – 04/2013

Senior Business Intelligence Developer

Bechtel Corporation is one of the world's largest engineering firms, specializing in everything from civil construction and power generation to mega-scale engineering (such as the Hoover Dam and the San Francisco Bridge).

●Worked to create financial reports for senior management on demand using SQL Server Reporting Services.

●Utilized SQL Server Integration Services to load data from the source system into the report data warehouse/data marts.

●Specialized in data architecture and query optimization to extract the greatest performance from a system with more than 60,000 users pulling reports in any given month.

●Mentored other employees in areas where they were not at the top of their game, such as SSIS, data architecture, performance optimization and proper structuring of queries.

●Increased performance on the majority of queries by roughly 20%, and created automation processes for loading data and checking overall data quality and assurance.

●Responsible for writing technical documentation, risk assessments and contingency plans.

Accelerated Payment Technologies 04/2012 -- 08/2012

Senior Developer

Accelerated Payment Technologies is a financial gateway processing company.

●Created migration architecture that moved all customers from a third-party processing gateway to internal gateway processing, while updating their legacy software and creating new identifying credentials that enabled each client to utilize the newly created web portal for gift card processing and reporting.

●Trained additional employees in the use of SSIS 2008 for the actual migration and report generation.

●Created all of the architecture's documentation, from project proposal to data mapping, software switch-over, risk assessment & contingency planning, success criteria, and lessons learned at the end of the project phase.

●Assisted with peer review of the programming behind the legacy processing software, which was (and still is) developed in Borland C++ Builder 4.0, to catch any critical flaws or security holes.

●Solved two persistent bugs in the primary legacy application that had been phantom problems for the previous 15 years of the company's history. The problems had been misidentified as floating-point masking errors, differences between the old Borland and Microsoft development environments, and even problems with running unmanaged code in a managed environment. It was simply a matter of an improperly structured nested try/catch and try/finally error-checking routine.
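The failure mode described above, an error handler nested at the wrong level silently aborting work, can be illustrated in a few lines. This is a Python sketch of the general pattern, not the original Borland C++ code; the data and function names are invented:

```python
def buggy(records):
    """Handler placed OUTSIDE the loop: one bad record aborts the batch."""
    total = 0
    try:
        for r in records:
            try:
                total += int(r)
            finally:
                pass  # per-record cleanup runs, but the error escapes...
    except ValueError:
        # ...to here, silently dropping every record after the bad one.
        pass
    return total

def fixed(records):
    """Handler placed INSIDE the loop: only the bad record is skipped."""
    total = 0
    for r in records:
        try:
            total += int(r)
        except ValueError:
            continue  # skip just this record, keep processing the rest
    return total

data = ["1", "oops", "2"]
print(buggy(data), fixed(data))  # 1 3
```

The two versions differ only in where the handler sits relative to the loop, which is exactly the kind of structural mistake that can masquerade for years as an environment or floating-point issue.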

Westpac Banking Group (Contractor) 05/2011 – 11/2011

Business Intelligence Reports Developer

Brought into Westpac on a crisis management team in the Merchants Facility division to develop reports and business processes that consolidate data across several different data sources to isolate process failures resulting in negative customer experiences with merchant terminal transaction processing. Delivered ad hoc reports for metrics on several measurable KPIs, then turned those ad hoc requests into reporting processes that business analysts could reuse in the future.

Worked across several technology domains, Microsoft BI, Oracle BI, Teradata BI, SQL Server 2005 & 2008 and other data sources for Westpac Banking Group, then converted those same metrics into processes that could also run for the St George Banking Group division.

Worked with various members of the crisis management team to develop the key metrics responsible for business loss/gain and turned those into weekly dashboards. There was a dashboard for the Westpac Banking Group with its own set of metrics, and another for the St George Banking Group, which incorporates St George Bank, Bank of Queensland, Bank of Melbourne, Southern Australia Bank, and one or two other minor branch banks. As a result, data was kept in several different formats, which had to be resolved into a single view for the St George Banking Group.

Proposed to my boss and senior management that the next steps would be to move the dashboards away from Excel and manual queries and into a SQL Server SSIS ETL process for gathering and storing the data, with SSRS for reporting and displaying it. The final step would take the merchant facility data and create a Merchant Facility OLAP cube for use by the analysts. My recommendations were very well received, and I was asked to put together the framework and architecture for the SSIS and SSRS work.

New South Wales Dept of Health 12/2009 – 04/2011

Business Intelligence Specialist

Team Technical Leader

Dialog IT is a professional services organization contracted by NSW Dept of Health to build their Data Warehouse, utilizing SQL Server 2008 for storage, SSIS for the ETL layer, SSAS for the cubes and Business Objects for reporting.

My responsibilities at NSW Dept of Health included mentoring the other developers on more advanced database development concepts, Business Intelligence conceptualization, analytics, and SSAS cube creation & generation; optimizing database access and security; working with the project manager to formulate strategic business approaches to proposed client functionality while staying within the pricing structure of a fixed-price bid; setting up and configuring servers (database, application, OLAP, etc.); and reviewing proposed development concepts for validity and practicality.

Key accomplishments to date include creating an SSIS ETL package that executes a third-party address-cleansing software product and then updates the Enterprise Data Warehouse tables with the cleansed address results.

Created a Visual C# console application, launched from SSIS, that utilizes the Business Objects API. This application runs all of the Business Objects reports for the Data Error Detail cube, refreshes the report data with and without parameters, and then emails the reports to all subscribed users.

Created a Visual C# DLL, utilized from a dynamically generated SSIS Script Component, to process the more than 750 business logic rules per data stream, stored as metadata for the Business Validation SSIS package, and to write any violations of the business rules into tables that then flow into the Data Error Mart and Data Error cubes.

Created ETL routines to transform the data and populate the Data Marts.

Created the OLAP Cubes for the BA’s to perform Analysis and Trend Analysis on.

Created a Master Control Business Intelligence Admin Interface web application as the central management point for running the entire Data Warehouse, recycling failed jobs, triggering report runs, and handling a multitude of other related BI tasks.

Created an SVN rollout application that contains all of the notes and bug fixes per release and per file fixed. The application let you select the environments you wished to roll out to and would deploy the SSIS packages to all of the listed servers, as well as executing SQL scripts for updated stored procedures, user functions and other SQL items. It came with a front end so users could see the historical changes per file or release. Finally, since the deployment was not connected to SVN, it tagged each file's properties with the SVN version number, SVN author, deployment date and release version.

Zenith Solutions (Contractor) 08/2009 – 10/2009

Senior Data Warehouse Architect/Developer

Zenith Solutions is a consulting firm specializing in Data Warehousing that was contracted by Hudson Global to review and assist with their project of uniting all of their various source systems into a single data store, which would then be used to populate a data warehouse. Initial responsibilities were to review all SSIS packages created to date and correct any errors, optimizing for performance and efficiency; next, to produce a best-practices document for further development to follow; then to assist with creating the SSIS ETL packages that completed the population of the single data store. Finally, I directed and assisted the team in transforming the data from the single data store into de-normalized structures to populate the data warehouse. This phase also included creating some of the SSAS cubes and several SSRS reports, as well as modifying custom Visual Basic & Visual C# code for specific data loaders.

Servian (Contractor) 05/2009 – 08/2009

Senior Architect/Developer

Primary responsibilities included creating SSIS ETL packages to import the new 2010 financial codes into the data warehouse and extending the OLAP cube to contain them, reconciling new data against old data, and verifying that matched old codes were indeed updated with the new 2010 codes. Also identified and corrected bugs in the ASP.Net C# 3.5 front end, and corrected issues with stored procedures, user functions, SQL Server Reporting Services reports, SQL Server Integration Services packages and SQL Server Analysis Services.

In addition to fixing issues, I worked on new change management items for new Data Warehouse load ETL packages; optimized various stored procedures, queries, functions, triggers and indexes that were working inefficiently; and was charged with creating several different types of reports, ranging from SSRS to grid-displayed, editable content on the .Net presentation layer.

Aristocrat Technologies Inc. (Contractor) 09/2008 – 02/2009

Reports Developer (Contracted through TekSystems)

I was contracted through TekSystems to assist Aristocrat Technologies Inc. with updating and developing new reports for the 2009 version of their Oasis product. The reports were developed with Crystal Reports 9 and SQL Server 2008 Reporting Services, with SQL Server 2008 as the database.

Found a few stored procedures with errors in their logic and sent them back to the DBAs for correction, helping to improve the overall accuracy of the product.

Assisted developers by showing them a Visual C# .Net recompile trick for Crystal Reports 9 that causes report updates to be picked up instead of skipped over, which had been happening. Handling this process more efficiently saved the developers time throughout the year.

Audited and corrected existing reports for defects known from the time I had spent working with Aristocrat Technologies Inc. customers as a consultant.

Black Gaming, Mesquite, NV (Consultant) 04/2007 – 06/2008

Senior Data Architect & Senior Developer (Independent Consultant)

Primary deliverables were:

Using SQL stored procedures and SSIS, imported customer data from all three production casino management servers into a single staging server.

Using SSIS, created routines to merge individuals from each casino's management system and create a master ID for each person, so that campaign management could target specific individuals based on play across all of the casinos instead of the single-casino view then in use. Since Mesquite is a small town and Black Gaming owns three of its four casinos, local residents would play at multiple Black Gaming casinos, and because each casino had its own casino management system, someone like John Smith would have a customer number at each. The routines merged these into a single corporate-level view, based on attribute pattern matching and fuzzy lookup logic against name and address.
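The attribute-matching idea above can be sketched with a simple similarity score over name and address. This is a hedged illustration only: the actual implementation used SSIS fuzzy lookup logic, and the records, field names and threshold here are invented:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def same_customer(rec1, rec2, threshold=0.85):
    """Average the name and address similarity; merge above the threshold."""
    name_score = similarity(rec1["name"], rec2["name"])
    addr_score = similarity(rec1["address"], rec2["address"])
    return (name_score + addr_score) / 2 >= threshold

# Two records for the same person, entered slightly differently at
# two different casinos (hypothetical data).
a = {"name": "John Smith", "address": "12 Mesa Dr, Mesquite NV"}
b = {"name": "Jon Smith",  "address": "12 Mesa Drive, Mesquite NV"}
print(same_customer(a, b))
```

In a real match/merge pipeline, pairs that clear the threshold would be assigned the same master ID, and borderline scores would typically be queued for human review rather than merged automatically.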

Created the campaign management reports used to send out marketing mailers, with built-in denial logic. Many customers visited these particular casinos for golf, massages or to get away from the busier Las Vegas area, and quite a few of them did not play the gaming machines. The denial logic filtered out these individuals so that marketing mailers, with an estimated value of $250 per mailer, were not sent to non-revenue-generating customers.

Utilizing SSRS, I created several financial reports built on the merged individual master ID. These reports allowed a customer-by-customer view within a single casino, comparisons between casinos, and a top-level corporate view.

Created Visual C# Windows services that monitored the overall health of the database and launched routines to perform backups, re-indexing and statistics recalculation, data archiving and other maintenance, on an as-needed basis instead of a fixed schedule.

The new reports and the match/merge routines that created a single customer ID across the multiple casinos saved close to a hundred man-hours a month by eliminating employees' manual attempts to remove duplicate campaign marketing mailers.

The new reports, campaign mailers, and single customer view dramatically increased customer satisfaction: customers were now recognized at their proper level of play and received credit for all of it, allowing frequent players to be promoted from a lower rewards group to a higher one, since the new reports included their play across all three casinos instead of tracking each casino separately.

Moving the campaign marketing mailer process from the individual casinos to the corporate division saved hundreds of thousands of dollars that customers had been claiming as a result of receiving duplicate rewards mailers from multiple casinos.

Acted as DBA for all servers involved in the Data Warehouse project.

Boyd Gaming, Las Vegas, NV (Consultant) 07/2005 – 02/2007

Senior Data Architect/Data Warehouse Architect (Independent Consultant)

Acted as Senior Lead under the direction of Deborah Cooper, the Project Manager, allocating resources to meet our implementation schedule. Gathered all requirements for data elements to be used in the Data Warehouse/Data Mart project and worked with internal resources to set up routines that captured data in real time from the AS/400 DB2 database servers and inserted it into the SQL Servers running the Entity Analytics Data Warehouse. Delivered specifications, design plans and implementation schedules to senior-level personnel at Boyd Gaming. Created detailed data flow charts, data modeling diagrams, OLAP cubes, and schema extensions. Utilized SSIS to create the ETL streams that loaded the Data Warehouse and SSAS to create the cubes that populated the Data Mart. The cubes were developed on a per-property basis (22 properties) with an overall cube for the corporate view. Report and data mining designs were pre-built so users could easily drill down and manipulate data for marketing; the overall corporate view was used for financial overview.

Was charged with developing special custom code sections on the .Net 2.0 and .Net 1.1 Frameworks in Visual Studio 2003 C# and Visual Basic to work around the large holes and bugs in the vendor's software. Trained each new employee on the Data Warehouse software, from basic usage all the way to DBA-level training. Migrated 120 corporate financial reports from DB2 to SSRS with automated delivery, so that each senior management employee automatically received their financial reports on the day the reports were scheduled to run. Previously a VP would have to direct an employee to run the reports, with small manual changes made for each separate senior management employee; the new SSRS reports incorporated logic that substituted the unique values required by each.

Utilizing the Visual Studio .Net Development Frameworks 2.0 & 1.1, I was able to effect fixes in the code base that had put the entire project on hold. Worked with other vendors, encompassing IBM, Microsoft and several additional vendors, to unite and document a unified overall system architecture. Was a critical contributor to the implementation of Boyd Gaming's Data Warehouse; without my knowledge and skill, the deployment would have failed.

Key Achievements:

Created several critical workarounds for the original vendor's software that allowed the Data Warehouse to function properly in a casino environment, fixes which the initial vendor refused to implement even though they agreed they were program defects. These were created with Visual Studio 2003 C# and Visual Basic.

Credited with several critical applications (designed with .Net 2003 C# and Visual Basic, utilizing the .Net 2.0 & 1.1 frameworks) that self-correct problems they detect, making the Data Warehouse/Data Mart environment a more stable platform running at greater efficiency.

Identified critical flaws in the original vendor's application and created additional tools to compensate, such as a data audit utility that audited the data after the ETL tool's transformation and load. This increased Boyd Gaming's data quality by orders of magnitude.

Credited with the technical success of the overall project. The data warehouse/data mart is now used across 22 properties in the United States and has grown to 1.5 TB, with average processing on the order of 75 million records per day. Talks are underway to migrate away from physical hardware SANs to SSDs for increased performance.

Acted as DBA for all servers involved in the real-time Data Warehouse/Data Mart.

Honeywell Aerospace Systems, Phoenix, AZ (Contractor) 01/2005 – 10/2005

Senior Data Architect & Senior Developer (Independent Consultant)

Contracted by Honeywell to review the Oracle database design and the ASP.Net 1.1 C# & Visual Basic web-based code for their automated physical warehouse system, which would continuously try to place objects into already-filled bins. No one onsite could determine whether it was a coding problem or a problem with the Oracle database design. Honeywell had brought in several consultants over the course of a year without any progress on why the system continued to behave this way, and the original vendor was no longer available to offer support.

Key Achievements:

In less than six months, found several problems in triggers, stored procedures, user-defined functions and Visual C# code that caused erratic data loss, making filled bins appear empty. Corrected these problems and brought the physical warehouse system back to standard operational parameters.

Located flaws in the .Net 1.1 security routines that wasted over 60% of server resources by running a full security loop on every single action, when all that was needed was to set a session variable once and check it on each action. This correction improved the speed and performance of the system by 60%.
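The fix described above, replacing a per-action security check with a session flag set once, can be sketched in a few lines. An illustrative Python sketch (the original was .Net 1.1; the function and session names are hypothetical):

```python
# Counter standing in for the cost of the expensive security check.
CHECK_CALLS = 0

def full_security_check(user):
    """The expensive check (directory lookup, role resolution, etc.)."""
    global CHECK_CALLS
    CHECK_CALLS += 1
    return user == "alice"  # stand-in authorization rule

def handle_action(session, user):
    """Run the full check only once per session, then reuse the flag."""
    if "authorized" not in session:
        session["authorized"] = full_security_check(user)
    return session["authorized"]

session = {}
for _ in range(100):          # 100 user actions in one session...
    handle_action(session, "alice")
print(CHECK_CALLS)            # ...but the expensive check ran only once
```

The buggy version effectively called `full_security_check` inside every action handler, so the counter would read 100 instead of 1, which is the wasted work the correction removed.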

Utilizing Visual Studio C# and the Microsoft .Net 1.1 Frameworks, modified several sections of poorly optimized code that were causing page crashes, long load times and overall system inefficiency.

While onsite, a situation arose in which I was able to use my disaster recovery skills: the warehouse's overhead fire system had gone off, flooding a large area and soaking several PCs and other electronic equipment in fire-retardant water. I formed a group of individuals and directed them in appropriate actions, resulting in 98% of the equipment being saved and put back into operation within two days.

Performed the duties of an Oracle DBA for the servers involved in the ProMove project.

Systems Research & Development, Las Vegas, NV 08/2003 – 11/2004

Professional Services Implementation Architect/Senior Professional Services Developer

Worked closely with clients to determine project scope and requirements and tailored tools and systems to meet client goals, utilizing Visual Studio .Net, the Microsoft .Net 1.1 Framework, Visual C++, Visual C# and Visual Basic. Composed business plans and drafted statements of work and related documentation. Completed risk assessment, contingency planning, test scheduling and analysis, and resource allocation. Maintained up-to-date knowledge of technical and methodological advances in data warehousing and identity establishment. Presented project outlines and progress reports to senior-level management. Liaised between clients and vendors. Trained company and client staff members.

Key Achievements:

Earned a solid reputation among diverse clients as an innovative and adept professional committed to quality and efficiency.

Re-established a strong working relationship with a previously discouraged client, resulting in a 17-property professional services agreement.

Designed and implemented a data quality audit application, used as a data warehouse corrective tool, that verified and tracked data synchronization; built on the .Net 1.1 Framework and written in Visual C# .Net 2003.

Developed a suite of internal monitoring tools to aid support staff in monitoring,


