• email@example.com • www.linkedin.com/in/phillipjulian
As a data scientist with strong analytical skills and expertise in big data, I bridge the gap between data and knowledge so that business decisions can be based on what the data “says”. Clear presentations make the analysis effective: they promote understanding and lead to action on the recommendations.
I have sought challenging projects in many industries because I was confident of success and wanted to broaden my expertise. My expertise includes SAS, UNIX, SQL, JMP, databases, BI, data visualization tools, and big data experience since 1998. I was an internet pioneer who created one leg of the first email and news network in 1983. As a member of the team that designed, wrote, and implemented the SAS System on the “minicomputers”, I helped save SAS Institute by helping to win a landmark software copyright case. I designed and created an end-to-end marketing and sales reporting solution from the ground up – from IMS tapes to emailed Excel reports. In 2009, I made a data-based decision to pursue a career in banking because SAS earned 42% of its revenue from the financial sector. Now I’m ready for the next challenge, or an everyday routine that needs dependable quality.
The PNC Financial Services Group – Raleigh, NC March 2014 to Present
VP Data Expert, EDM Analytics Competency Center
Analyst, programmer, and expert for Basel Retail Risk Modeling Support, SMT (SAS Modeling Tool), and the underlying data and software. Responsible for running the modeling code, researching issues, designing solutions, developing code updates, and presenting analytical results as requested by business owners, modelers, other analysts, or the Basel committee.
•Planned and facilitated an SMT upgrade that required new versions of SAS CSB (Credit Scoring for Banking), SAS DIS (Data Integration Studio), and Teradata. We lacked a crucial piece of software and had no time to hire consultants, so I wrote my first DIS custom transformation to perform Teradata ETL, and created other DIS metadata and programs, which saved enough time that the team could take a genuine vacation at year’s end.
•Created the RCMR Run Book and the DDS ABT Run Book for Basel Retail Risk Modeling Support, and supplied the supporting documents for Basel compliance. Updated the RCMR Run Book for process changes and to maintain Basel compliance. The DDS ABT Run Book also covers setup, maintenance, validation, debugging, and recovery.
•Migrated and debugged all jobs, code, and shells from our Solaris server to the Linux SAS grid. When a co-worker suddenly left the company, I became the sole developer and architect for DDS and ABT. The migration was completed fairly quickly, yet it was still a massive debugging and discovery effort spanning 6 job runs per month, 260 programs, and 305,432 lines of SAS code.
•Created the QA programs for the SAS grid, which required modified shell scripts and more than 250 SAS program changes for directory locations, database names, and autoexec files.
•During code review for Basel Retail Risk Modeling, some crucial columns were not found. My SAS dictionary of all datasets located those columns in datasets that we were not searching.
•Analyzed performance on the Linux system using shell commands, with FULLSTIMER statistics from the SAS logs.
•Presented several papers at SAS user groups and at SESUG regional conferences from 2010 to the present. My papers show sophisticated uses of the SAS Data Dictionary to profile, explore, and validate SAS datasets. Special techniques show how to use this information to validate development and production datasets by dictionary attributes and by Proc Compare code that is automatically generated for all indexed or sorted SAS datasets.
•From 2010 to 2015, I published 3 papers in the Lex Jansen repository:
oUsing Dictionary Tables to Explore SAS® Datasets
oUsing Dictionary Tables to Profile SAS® Datasets
oMigrating Databases from Oracle to Teradata
SAS Institute – Garden City, NY Sept. 2013 – Dec. 2013
SAS Application Developer – Model Implementation
Contracted to JP Morgan Chase (JPMC) through SAS and Vector Consulting. JPMC held the largest share of the automobile loan market. Duties involved risk management, implementing CCAR processes, and predictive modeling per LOB (line of business). Improved CCAR processes, performance, governance, quality, scalability, and analytics. The FRB (Federal Reserve Board) accepted JPMC’s process and the analytical results before the contract ended.
•Translated DB2 pass-through SQL code into native SAS SQL code, and debugged the resulting code.
oCreated methods to translate between SAS date values and DB2 date values.
•Discovered errors in a validation spreadsheet that incorrectly showed a successful test.
•Created Windows batch file automation with error checking and aborts for severe errors.
•Improved and ran existing SAS code for CSL RR and Write Down models, and verified that the final numbers matched within an error tolerance of 10⁻⁷.
Bank of America – Charlotte, NC Dec. 2010 – July 2013
VP Market Information Manager, Analytics & Reporting – Campaign Analytics
Analyst for marketing campaigns, executive scorecards, and customer contact databases. Responsibilities included campaign requirements design, DOE (Design of Experiment), measurement plans, data gathering, data mining, modeling, analysis, and presenting results and recommendations for improvements.
•Marketing campaign analyst for 2011 and 2012 DFS (Dealer Financial Services) refinance campaigns. Data mining and categorical analysis were used to discover significant factors for submitting vehicle loan applications and gaining loan approval.
oDecision tree formulas were adapted for scoring the 2012 campaign and new data for 2013.
oThe application approval rate increased from 30% to 61%, with minimal losses in funded apps.
oCampaign and marketing changes may gain more approvals from customers “in the money”.
oThe current lender from THOR, and the new DFS lender, greatly influenced app approvals.
•Retail Marketing Scorecard analyst. Defined metrics, located business owners and data sources, and helped automate executive reporting. Two Silver Awards from business partners.
•Marketing campaign analyst and reporting resource for BankAmeriDeals™ digital channels.
oAnalyzed monthly CRM campaigns, and created DOEs with test and control populations.
oCreated a unified cell matrix for analyzing trends in all BankAmeriDeals™ campaigns.
oLevels of customer engagement in the BankAmeriDeals™ program were derived from the data and applied to the unified dataset of all campaigns.
For example, listing creatives for all “new viewers” confirmed the results of VOC research.
•Created the first integrated, customer-level view of digital marketing data from multiple channels.
oCreated attribution of offers to booked units by mapping campaign names to the product hierarchy, which was linked to financial measures for expense, profit, and 5-year values (SVA).
oMy digital market mapping was adopted into the bank’s offer attribution framework, TOMS.
•Created the first analysis of all digital markets with offers linked to bank products, bookings, and SVA, with customer segmentation (CVF) levels defined for each customer.
•Uncovered data errors in digital channels (SASO, TOLA), which led to significant corrections in 2012.
•Appraised data quality in the Epsilon ADM database. Gaps and alignment issues were discovered by comparing it to similar data from Harte Hanks. Recommended changes were implemented in 2012.
•Marketing campaign analyst for RDS (Relationship Deepening Strategy) and for eStatements.
•Mortgage Funds Forecast for the digital channels, including current and funded applications.
Bank of America – Charlotte, NC June 2010 – Dec. 2010
SAS Programmer Analyst, Digital Response & Analytical Marketing (contract: Techlink Systems)
Enhanced automation and reporting for a set of programs that ran 100 digital marketing campaigns per week.
•Created rule-based SAS programs that translated SQL into English, and greatly improved communications with campaign managers. Campaign reports became easier for non-programmers to understand, and this reduced errors and misunderstandings of the campaign criteria.
•SAS formats and macros provided SQL translations for almost any report, and they were used in waterfall reports, pre-sizing reports, and audit reports. Automated SAS batch jobs ran on UNIX and created the reports, which were multi-tab Excel workbooks that required no editing and no special formatting.
•Consulted on SAS programming methods, and debugged SAS programs for other analysts.
Wachovia, a Wells Fargo Company – Charlotte, NC Nov. 2009 – June 2010
Lead SAS Programmer Analyst, Targeted Marketing & Analytics (contract: Strategic Staffing Solutions)
Execution lists and analysis for direct marketing campaigns at Wells Fargo Home Mortgage (WFHM).
•Defined execution requirements to meet campaign goals, including audits and post-marketing analysis.
•Created SAS programs that produced customer lists for execution by external vendors.
•Analyzed the mortgage refinance market.
•Improved automation by fixing a formatting issue in the weekly PDF reports on mortgage campaigns.
•Mentored other SAS analysts, and consulted on SAS programming, UNIX, and database schema.
Inspire Pharmaceuticals – Durham, NC Jan. 2005 – Mar. 2009
Senior Analyst, Sales Operations and Marketing
Joined a biopharmaceutical company (NASDAQ:ISPH; 235 employees; acquired by Merck in 2010) to single-handedly create an original reporting solution for a 69-person sales force. Program inputs included IMS data tapes, detail calls and samples, prescriber information, sales force alignment, targets, rosters, and sales forecasts.
•In the first 120 days, I created an automated reporting system by using SAS, Excel, DDE, and VBA.
•Customized email and Excel reports for each person, with a total of 73 tabs and 358 reports.
•Produced continuous improvement with testing plans, spreadsheet comparisons, and root cause analysis.
•Created SAS aggregates for ad hoc reports, issue resolution, compensation reporting, targeting analysis, business development, sales operations, and the analysis of market potential.
•Created a macro-based system that ran 6 (or more) different versions of the same SAS code for different versions of the IMS tapes, and also made it easier to test changes in the data.
o Developed a macro to compare any two datasets with identical column names and structure.
The macro results were much easier to read than Proc Compare results.
o Compared two versions of IMS Rx, and discovered a bias towards scripts for larger products.
From 2006 to 2007, managed StayInFront SFA and Target Mobile SFA, while the sales force nearly doubled.
•Provided support, training, custom ETL, reporting, and user acceptance testing for StayInFront SFA.
•Managed the project to replace StayInFront SFA with Target Mobile SFA.
•Exported and migrated all StayInFront SFA data to Target Mobile SFA.
From 2007 to 2009, managed the project to implement a data warehouse and migrate reports to MicroStrategy.
•Created MicroStrategy metrics, filters, time rollups, reports, and documents to replace the 358 reports.
•Migrated all the SAS reports to MicroStrategy, and delivered the Excel documents by Narrowcast.
Other major projects:
•Analyzed refill prescription data in the AzaSite market, and discovered a subset of chronic off-label usage to treat blepharitis. Estimated the market potential, and produced target lists of high prescribers.
Merck started Phase 2 clinical trials in 2009, and the product is close to being approved by the FDA.
•At the 2007 National Sales meeting, my system design hastened the delivery of Force Analyzer, a QlikView application based upon a MicroStrategy data mart.
Pharmalink FHI – Research Triangle Park, NC Nov. 2004 – Dec. 2004
Statistical Programmer (contract: MATRIX)
SAS statistical programming for a CRO conducting Phase 1 and 2 clinical trials. Team validation of statistical programming.
Remaining resume contents may not be in chronological order.
GlaxoSmithKline (GSK) – Durham, NC 3 Contracts from 1995 to October 2004
Statistical Programmer (contract: Bogier Consulting) GSK Jan. 2004 – Oct. 2004
Phase 4 clinical research on CHORUS (Collaborations in HIV Outcomes Research – US), the world’s broadest cohort of HIV patients, including surveys, demographics, labs, and electronic medical records (EMR) from A4.
•Team SAS programming with epidemiologists, statisticians, clinical trial managers, and medical boards.
•Produced high-quality research that was published in medical journals and presented at international conferences.
Senior Data Warehouse Analyst (contract: Sapphire) GSK Jan. 2002 – Nov. 2003
Managed the PRx warehouse of SAS datasets, containing 5 terabytes of data and 5.3 billion records for scripts in all markets (Wolters Kluwer data), physician demographics, calls, details, samples, patient level data, and census data. Used by Commercial Analytics for market research, targeting, and promotional response analysis.
•Improved loading performance by 50%, by spawning multiple SAS jobs that ran simultaneously.
•Technical SAS lead on the Sun Starfire upgrade team. Improved performance by 80% with system enhancements, and reduced overall data warehouse load time from 10 days to 1.5 days.
•Consistently delivered high quality results on time and with very few errors discovered by the QA team.
•Created UNIX and Windows shell programs to automate the system and track the loading process.
•Created a process document to standardize data loading and the production of QA reports.
•Managed changes in PRx, PAS, GrowthTrac, SAS, and IPO (recoding of all product keys).
•Redesigned PAS as a .NET application making API calls to SAS Integration Technologies middleware.
•Migrated SAS datasets and applications from SAS v6 to SAS v8, and verified datasets and programs.
oAlso adapted X commands and file pipes to UNIX, as needed.
•Researched compatibility issues for copying SAS version 8 datasets between UNIX and Windows NT.
SAS and Database Programmer Analyst (contract: Metro IS) GSK 1995 – 1997
Ad hoc SAS reporting on large pharmaceutical databases of sales, marketing, and promotional activity data.
•Analyzed prescription data, and produced custom target lists for brand managers and sales promotions.
•Defined prescriber matching algorithms for GSK’s data warehouse, when PRx was in its infancy.
Nortel Networks – Research Triangle Park, NC Regular Full-time Employee: 1998 to 2001
3 Contracts from 1995 to 1998
Software Engineer Nortel 1998 – 2001
Joined R&D Project Management Office at Nortel (TSX:NT), a global telecommunications company. Created reports and websites to manage new product features and to produce the highest quality software in the world.
•Transformed a 60,000 line program (CSAT), written in Oracle Pro*C, into a 3,000 line SAS program. Reduced program maintenance effort by a factor of 40, and reduced runtime from 12 hours to 2 hours.
•Fully automated the CSAT application, which used several host systems – UNIX crons, Windows AT, FTP scripts, and a Perl routine on Windows that killed and restarted the job when severe errors occurred.
•Enhanced CSAT by adding drill down (and up) to detail and to Oracle, using SAS ODS to HTML.
•Boosted software defect coverage from 30% to 99% by using LDAP data to find responsible developers.
•Purchased the departmental server, and became DBA with 3 Oracle databases and 5 Intranet websites.
•Fully automated several orphaned applications, which minimized efforts required for daily processing.
SAS Database and Metrics Analyst (contract: COMSYS) Nortel 1997 – 1998
Programming for EMERALD, a system that predicted the risk of software errors in a source code module.
•Designed and produced SAS datasets and reports for module risk prediction.
•Improved database access, when migrating the SAS/Access routines from Ingres to Oracle.
SAS Programmer Analyst (contract: AiC) Nortel 1997
Analyzed customer satisfaction data, and created quality metric reports, for the Broadband Division of Nortel.
•Created SAS reports on elementary statistics and correlating factors from a Macintosh 4D database, which was imported into SAS by custom ETL, and used for analyzing customer satisfaction.
•Created “SAS-on-the-Web” before SAS/IntrNet existed. SAS jobs were submitted from a web page.
SAS and Database Programmer Analyst (contract: CDI) Nortel 1995
Project Management Office of BNR (Bell Northern Research), overseeing software quality in new products.
•Produced reports on software and design metrics, which were deposited into the BNR data warehouse.
•Created an automated system that merged data from several databases – a hierarchical database (IBM VM), an Ingres database of metrics, and UNIX hosts running cron jobs, shells, and SAS batch jobs.
SAS Institute, Inc. – Cary, NC 1984 – 1995
Systems Developer
Developed the SAS System at one of the largest privately-held software companies. In 1984, SAS had 320 employees and annual revenues of $52.6M. Today SAS has more than 14,175 employees and annual revenues of $3.24B.
•Worked on a highly talented team to create the first SAS System on minicomputers. If books chronicled those days at SAS, you would see similarities between the early days of SAS and the early days of Microsoft Windows.
•Our success with the initial versions of SAS prevented a software copyright challenge by L&H Systems.
•Designed and developed most of the technological innovations in today’s SAS System – from the initial versions through SAS 6.11 for VAX, Data General, Prime, and UNIX.
•Designed, developed, tested, and debugged the SAS System for the minicomputer R&D team.
•Created the first release of SAS/Access to Oracle for VAX, Data General, Prime, and UNIX.
•Created the first release of C-Kermit, an open-source file transfer program, for Data General AOS/VS.
North Carolina State University – Raleigh, NC 1981 – 1984
Systems Programmer / Applications Programmer
User support, custom programming, PC support, and systems programming for a major research university.
•An internet pioneer in 1983, creating internet email and news between NCSU, UNC-CH, and Duke.
•Created an ASCII protocol converter on an IBM Series/1, which let simple terminals communicate like IBM 3270 terminals. IBM later commercialized this prototype, based upon the YACC 3270 emulator.
•Upgraded a surplus X-ray spectrometer by programming an Apple II+ with a custom PC interface to operate the machine and analyze the output results. The spectrometer was run by a BASIC program that sent low-level commands to a 6522 VIA chip.
TECHNICAL SKILLS
•Languages -- SAS, SQL, JMP, MicroStrategy Desktop and Web, perl, C, C#, .NET, Excel VBA.
•Databases – Teradata, DB2, Oracle, Microsoft SQL Server, Microsoft Access, Ingres, Oracle DBA.
•SAS skills -- Base, Data Step, Macro, ODS, SQL, SQL Dictionary tables. Access to Teradata / Oracle / DB2 / Ingres. Access to PC file formats, ODBC, Microsoft SQL Server. STAT, GRAPH, Insight, Connect, Enterprise Guide, Integration Technologies, FTP and PIPE file access methods, SAS DDE to Excel, ODS tagsets for Excel workbooks, HTML, and PDF.
•Systems and shells -- UNIX, UNIX shells, GNU emulators, VAX, IBM Mainframes, Windows.
•Web delivery -- HTML, SAS ODS to HTML, perl packages for CGI, Oracle, DBI, and ARS Remedy.
•Pharmaceutical Data – IMS Xponent, PlanTrak, Formulary Focus, CoPay, EarlyView; Wolters Kluwer (NDC) monthly and weekly scripts (GrowthTrac).
•Sales Force Automation (SFA) -- StayInFront, Target Mobile SFA, migration from StayInFront to Target.
EDUCATION and CERTIFICATIONS
MS studies (no degree) Computer Science North Carolina State University, Raleigh, NC
7 years of graduate-level classes.
BA Mathematics University of North Carolina, Chapel Hill, NC
SAS Certified Advanced Programmer for SAS 9
MicroStrategy Certified Enterprise Reporter (CER)
MicroStrategy Certified Report Developer (CRD)
MicroStrategy Certified Project Designer (CPD)
Tufte seminar, 2008: Presenting Data and Information.
Wake Technical IT Academy, 2003 – 2004: C#, ADO.NET, Oracle OCP DBA Certification for Oracle 9i.
Oracle Corporation, 2000: Configuring and Tuning Oracle on Windows NT.
Other, 1999 – 2000: Introduction to Oracle for Experienced SQL Users; Clarify; Clearcase for UNIX; CGI programming with perl; Front Page for Windows NT.
PRESENTATIONS and PAPERS
•Using Dictionary Tables to Explore SAS® Datasets. Charlotte SAS Users Group, 2/26/2010. Poster at SESUG 2010 and SESUG 2011. One-hour presentation and poster at IFSUG 2012 and SESUG 2012.
•Using Dictionary Tables to Profile SAS® Datasets. A companion paper to the one above.
•For copies of the paper and SAS code, see:
oIFSUG in the “2012 Proceedings” tab -- http://www.ifsug.org
oMy presentations on SAS Community, or search my name on Lex Jansen’s repository – http://www.lexjansen.com