
Data Developer

Location:
Libertyville, IL
Posted:
December 19, 2017


John I. Lee *********@*****.***


OBJECTIVE

Detail-oriented, precise, and resourceful computer software professional with extensive experience developing sophisticated and innovative cost-reduction and productivity solutions to complex business and IT problems for numerous Fortune 500 companies, seeking an IT consultant position as an application programmer.

TECHNICAL SKILLS

Operating Systems:

Windows 95/98/2000/NT/XP, HP-UX, AIX, Solaris, PC DOS, IBM OS/390 (MVS), UNIX (Bourne/Korn shell scripting, vi, ex)

Programming Environments:

SAS (Base, Macro, Stat, Graphs, UNIX-PC-Mainframe), Ab Initio (ETL, GDE, EME, Co>Operating System), Teradata (MLOAD, FASTLOAD, BTEQ/ITEQ, TPUMP, TPT, FASTEXPORT, SQL Assistant, SAS/ACCESS interface to Teradata, DB2),

Teradata Table Compression (MVC, ALC, BLC),

Teradata (DBQL (logtbl, steptbl, objtbl, sqltbl, explaintbl), Viewpoint, SWS, TVI, BAR, SLES), SAS DDE (dynamic data exchange), Oracle (SQL*Plus, PL/SQL, SQL*Loader, ADO, SAS/ACCESS to Oracle), Cognos, SAS/AF (Frame, SCL, Components, Attributes w/dot notation, SCL to Frame, Properties, Form/Table Viewer, SAS Base), SAS Grid Manager,

SAS Metadata Server (Platform Job Scheduler for SAS, which includes Platform LSF, an execution agent), Load Sharing Facility commands (bsub, bmod, bkill, bstop, bresume, bbot, btop, bjobs, bhist, bpeek, bqueues, bparams),

SAS Graphs/Gchart (block, hbar, hbar3d, vbar, vbar3d, pie, pie3d, donut, star),

Metadata-driven programming technique using SAS macros (parameters) and DATA steps for validation and imputation of user inputs.

Internet, OOP, MS-Office, Miscellaneous Technologies:

MS-Office (Access, Excel, Word), ODBC, HTML, Cold Fusion/IIS, Java (J2SE), JavaScript, VBScript/ASP, Visual Basic 6.0, MS SQL Server, TCP/IP, FTP, Lotus Notes, Outlook, PeopleSoft Tools, SQR, NEON Shadow Direct, Microsoft SMS, Q&E, ESSBASE, Micro Focus COBOL, SQL, Microsoft VB, VBA (Visual Basic for Applications; utilized macros, sub procedures, function procedures, properties, and methods in Excel).

Mainframe Technologies:

DB2 (SQL, QMF, SPUFI, DCLGEN, COBOL, SAS/ACCESS to DB2), FOCUS, COBOL/COBOLII, Panvalet, Command Level CICS, JCL/PROCs, TSO, ISPF, SPF, PDF, IBM Utilities, Abend Aid, IBM Report Writer, Easytrieve/Easytrieve Plus, IDCAMS, VSAM/KSDS, ICETOOL/SYNCSORT, FileAid/Dumper

EDUCATION

B.S. in Computer Science, Mathematics

CONTINUING PROFESSIONAL EDUCATION

SAS / Teradata Certified.

SOLEX Academy E-Commerce Program – Web-related applications development using HTML, DHTML, VBScript, SQL, JavaScript, ASP, ADO, OOP, JAVA (J2SE, AWT, Swing)

LAN\MIND – UNIX

CARA – Professional Systems Analyst Development

Chicago Educational Center – Advanced SQL, SQL*PLUS, PL/SQL, Oracle 7 Applications Tuning

PeopleSoft Corporation – PeopleSoft nVision Reporting, PeopleSoft General Ledger I, PeopleTools I

Microsoft – Visual Basic

Arbor – ESSBASE

PROFESSIONAL EXPERIENCE

PureTech Information Services, Inc. Oct 2013 - Present

Senior Consultant / Contractor

Provide and support a variety of highly effective techniques (SAS, ETL, Teradata, UNIX, Windows, Mainframe) for various organizations.

Support financial organizations such as Citigroup, Discover, and Capital One through PureTech Information Services, Inc.

Responsibilities included designing and planning new systems, updating an existing Teradata system containing 100+ tables, and creating many new tables under a different database system.

Manage/monitor system performance, including workload management and throttling in TASM, query performance and tuning, physical node behavior, and AMP distribution.

Consult on physical database implementations, DML, DDL, PPI, skew analysis, join methods, and data access paths.

Strong knowledge of Teradata architecture: nodes (AMPs and PEs), vprocs, BYNETs.

Efficient usage of Teradata DDL Statements – Derived Tables, Volatile Tables and Global Temporary Tables
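For illustration, a minimal sketch of creating a Teradata volatile table through SAS explicit pass-through; the server, credentials, and table/column names here are hypothetical placeholders.

    /* Sketch: create a Teradata volatile table via SAS explicit
       pass-through. Server, credentials, and names are placeholders.
       The volatile table lives only for this session, so it would be
       queried before disconnecting. */
    proc sql;
      connect to teradata (server=tdprod user=jlee password="&tdpw.");
      execute (
        create volatile table vt_active_accts as (
          select acct_id, open_dt
          from risk_db.account_master
          where status_cd = 'A'
        ) with data
        primary index (acct_id)
        on commit preserve rows
      ) by teradata;
      disconnect from teradata;
    quit;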

Hands-on knowledge of Teradata concepts – Primary Index, Partitioned Primary Index, Secondary Index, Join Index and compression techniques.

Understand the concept of locking mechanisms in Teradata.

Experience with query tuning using indexes and proper usage of statistics in Teradata.

Utilized Teradata utilities (MLOAD, FASTLOAD, BTEQ/ITEQ, TPUMP, TPT export, TPT load, FASTEXPORT) and SQL Assistant.

On the new database system, tested, debugged, reported to management, implemented, and mentored others.

Utilized the SAS/ACCESS interface to connect to the new Teradata database and pull data into SAS platforms, as in the sketch below.
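A minimal sketch of the SAS/ACCESS LIBNAME approach, assuming hypothetical server, credential, and table names:

    /* Sketch: SAS/ACCESS LIBNAME engine to Teradata; all names are
       placeholders. */
    libname td teradata server=tdprod user=jlee password="&tdpw."
            database=risk_db;

    /* Pull a Teradata table into the SAS WORK library. */
    data work.account_master;
      set td.account_master;
    run;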

Enjoy examining inefficient existing SAS programs to find places where enhancements can be made, quantifying and justifying the enhancements, then making, testing, debugging, and implementing the changes.

Abbott/AbbVie – Waukegan, IL April 2013 – Oct 2013

Senior SAS / Teradata developer

Supported a production system (Teradata, ETL, SQL, UNIX and SAS) processing high volumes (tens of millions of rows) of input and output data.

Utilized DBQL to trace inefficient SQL for the Suspect Query Report; a sketch follows.
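As a hedged sketch of the DBQL technique: a pass-through query against DBC.DBQLogTbl could surface high-CPU candidates for a suspect query report. The threshold, column choices, and connection values are illustrative, and actual DBQL setup varies by site.

    /* Sketch: find resource-heavy queries in the DBQL log. */
    proc sql;
      connect to teradata (server=tdprod user=jlee password="&tdpw.");
      create table work.suspect_sql as
      select * from connection to teradata (
        select UserName, QueryID, StartTime,
               AMPCPUTime, TotalIOCount
        from DBC.DBQLogTbl
        where AMPCPUTime > 1000
        order by AMPCPUTime desc
      );
      disconnect from teradata;
    quit;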

Teradata (v13) / SAS (9.3) application developer; work centered on the AMP (access module processor), the core virtual processor in the Teradata RDBMS that manages the database, handles file tasks, and manipulates the disk subsystem for multi-tasking and parallel processing.

Created tables with Unique and Primary keys as defined constraints so that rows are distributed evenly across the pre-defined number of AMPs.

Utilized hash functions to verify even data distribution over all AMPs, as in the sketch below.
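The standard Teradata idiom pairs HASHROW, HASHBUCKET and HASHAMP to count rows per AMP; a sketch with placeholder connection and column names:

    /* Sketch: rows per AMP for a candidate primary-index column. A
       large spread across amp_no values indicates skew. */
    proc sql;
      connect to teradata (server=tdprod user=jlee password="&tdpw.");
      select * from connection to teradata (
        select hashamp(hashbucket(hashrow(acct_id))) as amp_no,
               count(*) as row_cnt
        from risk_db.account_master
        group by 1
        order by 2 desc
      );
      disconnect from teradata;
    quit;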

Worked with the seven index types the Teradata RDBMS supports (UPI, USI, NUPI, NUSI, join index, hash index, PPI) to achieve good query performance.

Teradata utilities: TPT, TPump (5 DML), FastLoad, MultiLoad (5 DML), FastExport, BTEQ; performance analysis, SQL optimization, Teradata SQL Assistant.

SAS 9.3/9.2, SAS EG 4.3/5.1, Teradata 13.10, Linux, UNIX, Windows 7 (PC) environment.

Supported PPD (pharmaceutical products), creating many SAS support programs that help end users make decisions on promotional marketing data in both SAS and Teradata.

Worked on healthcare claims using SAS/Base, SAS/SQL, SAS/Macros, SAS/Stat, SAS/EG, and SAS PROCs.

Utilized the SAS/ACCESS interface and pass-through (PROC SQL with LIBNAME) to databases (Teradata 13.10, DB2, Oracle).

Extracted historical data from the data warehouse (Teradata 13.10) into SAS 9.3/9.2 environments and reshaped it, transposing vertically or horizontally for users; see the PROC TRANSPOSE sketch below.
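A minimal PROC TRANSPOSE sketch of the vertical-to-horizontal reshaping, with hypothetical dataset and variable names:

    /* Sketch: pivot one balance row per account-month into one wide
       row per account. Input must be sorted by the BY variable. */
    proc sort data=work.bal_hist;
      by acct_id;
    run;

    proc transpose data=work.bal_hist out=work.bal_wide prefix=mth_;
      by acct_id;
      id stmt_month;   /* ID values become the column suffixes */
      var balance;
    run;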

Utilized PROC IMPORT to read Excel-related files (csv, tab, dlm) into SAS datasets that feed the SAS process.

Utilized PROC EXPORT to write SAS datasets out in Excel-readable (tab, dlm, csv) formats; sketches of both directions follow.
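Minimal PROC IMPORT / PROC EXPORT sketches; the file paths and dataset names are placeholders:

    /* Sketch: read a CSV into a SAS dataset, then write a summary
       dataset back out as CSV. */
    proc import datafile="/data/in/promos.csv"
                out=work.promos dbms=csv replace;
      getnames=yes;
    run;

    proc export data=work.promo_summary
                outfile="/data/out/promo_summary.csv"
                dbms=csv replace;
    run;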

Merged many SAS files to consolidate them into one central location.

Created many high-level SAS macros and utilized SAS arrays to improve and simplify existing unorganized data processes; used positional and keyword parameters in macro calls, as sketched below.
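A short sketch of a macro taking one positional and two keyword parameters (all names are hypothetical):

    /* Sketch: positional parameter DSN, keyword parameters VAR= and
       OUTDS= with a default value. */
    %macro freq_report(dsn, var=, outds=work.freq_out);
      proc freq data=&dsn noprint;
        tables &var / out=&outds;
      run;
    %mend freq_report;

    /* Positional call plus a keyword override. */
    %freq_report(work.accounts, var=status_cd)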

SAS arrays and DO loops: DO OVER, DO UNTIL, DO WHILE, and iterative DO with DIM and multi-dimensional arrays.
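An array/DO-loop sketch using DIM, with placeholder variable names:

    /* Sketch: zero-fill missing values across twelve monthly balance
       columns with an array and an iterative DO loop. */
    data work.bal_clean;
      set work.bal_raw;
      array bal{12} bal1-bal12;
      do i = 1 to dim(bal);
        if bal{i} = . then bal{i} = 0;
      end;
      drop i;
    run;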

SAS/Base, SAS/AF (Frame), SAS/SQL, SAS/Macros, SAS/Stat (summary, means, freq), SAS/EG (4.3, 5.1); SAS PROCs: REPORT; SUMMARY with NWAY, CHARTYPE, AUTONAME, AUTOLABEL; MEANS; TABULATE with COLPCTN, COLPCTSUM, PAGEPCTN, PAGEPCTSUM, REPPCTN, REPPCTSUM, ROWPCTN, ROWPCTSUM, PCTN, PCTSUM.
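A PROC TABULATE sketch using one of the percentage statistics listed above; the dataset and class variables are hypothetical:

    /* Sketch: counts and column percentages by region and status. */
    proc tabulate data=work.accounts;
      class region status_cd;
      table region, status_cd*(n colpctn);
    run;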

Analyzed high volumes of SAS datasets and reported abnormal cases.

Created many nested SAS macro jobs (macro jobs within a macro job) and automated them in UNIX using the at and crontab commands.

Utilized the SAS Grid Manager client utility to process large amounts of data as quickly as possible for data-intensive projects.

Base SAS ODS (HTML, RTF, PDF, CSV, CSVALL, PROCLABEL); a short sketch follows.
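A minimal ODS sketch routing a report to PDF; the path and dataset are placeholders:

    /* Sketch: wrap a report in an ODS PDF destination. */
    ods pdf file="/reports/acct_summary.pdf";
    ods proclabel "Account Summary";

    proc report data=work.acct_summary;
      column region n_accts tot_bal;
      define region / group;
    run;

    ods pdf close;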

Utilized SAS Enterprise Guide to submit SAS programs to the grid and to generate ODS output on the grid.

Utilized PROC ANOVA to analyze variance for balanced data, using the MODEL statement with a dependent variable and given independent variables.

Utilized PROC REG to estimate and analyze relationships among variables, including modeling how the dependent variable changes for given independent variables; a sketch follows.
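A minimal PROC REG sketch under hypothetical variable names:

    /* Sketch: regress balance on three candidate predictors. */
    proc reg data=work.model_ds;
      model balance = credit_limit utilization tenure_mths;
    run;
    quit;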

Utilized PROC GCHART to generate chart reports from data.

Utilized a metadata-driven programming technique using SAS macros (parameters) and DATA steps, as sketched below.
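One common shape of this pattern, sketched with hypothetical names: a control dataset of table/variable pairs drives generated macro calls via CALL EXECUTE.

    /* Sketch: metadata-driven validation. WORK.CTL_CHECKS is a
       hypothetical control dataset with character columns DSN and VAR. */
    %macro validate(dsn, var);
      proc freq data=&dsn;
        tables &var / missing;
      run;
    %mend validate;

    data _null_;
      set work.ctl_checks;
      call execute(cats('%validate(', dsn, ',', var, ')'));
    run;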

Utilized the CDISC (Clinical Data Interchange Standards Consortium) procedure to import an XML document that conforms to the CDISC ODM version.

CAPITAL ONE – Mettawa, IL April 2012 – December 2012

Senior SAS / Teradata developer

Supported a production system (Teradata, ETL, SQL, UNIX and SAS) processing high volumes (tens of millions of rows) of input and output data.

Teradata (v13) / SAS (9.3) application developer; work centered on the AMP (access module processor), the core virtual processor in the Teradata RDBMS that manages the database, handles file tasks, and manipulates the disk subsystem for multi-tasking and parallel processing.

Utilized DBQL to trace inefficient SQL for the Suspect Query Report.

Created tables with Unique and Primary keys as defined constraints so that rows are distributed evenly across the pre-defined number of AMPs.

To improve query performance, worked with the seven index types the Teradata RDBMS supports (UPI, USI, NUPI, NUSI, join index, hash index, PPI).

Utilized hash functions to verify even data distribution over all AMPs.

Created many high-level SAS macros and utilized SAS arrays to improve and simplify existing unorganized data processes; used positional and keyword parameters in macro calls.

SAS arrays and DO loops: DO OVER, DO UNTIL, DO WHILE, and iterative DO with DIM and multi-dimensional arrays.

SAS/Base, SAS/AF (Frame), SAS/SQL, SAS/Macros, SAS/Stat (summary, means, freq), SAS/EG (4.3, 5.1); SAS PROCs: REPORT; SUMMARY with NWAY, CHARTYPE, AUTONAME, AUTOLABEL; MEANS; TABULATE with COLPCTN, COLPCTSUM, PAGEPCTN, PAGEPCTSUM, REPPCTN, REPPCTSUM, ROWPCTN, ROWPCTSUM, PCTN, PCTSUM.

Analyzed high volumes of SAS datasets and reported abnormal cases.

Created many nested SAS macro jobs (macro jobs within a macro job) and automated them in UNIX using the at and crontab commands.

SAS 9.3/9.2, SAS EG 4.3/5.1, Teradata 13.10, Linux, UNIX, Windows 7 (PC) environment.

Utilized Teradata utilities to load mainframe flat files, SAS datasets and Excel (.csv) files into the Teradata environment; one SAS-side sketch follows.
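One hedged sketch of this kind of load from the SAS side, assuming the SAS/ACCESS FASTLOAD= data set option and placeholder names; standalone FastLoad/MultiLoad scripts are another route the resume mentions.

    /* Sketch: bulk-load a SAS dataset into an empty Teradata table
       using the SAS/ACCESS FastLoad facility. Names are placeholders. */
    libname td teradata server=tdprod user=jlee password="&tdpw."
            database=stage_db;

    data td.promo_hist(fastload=yes);
      set work.promo_hist;
    run;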

Generated customized, decision-making SQL reports for the business community.

Utilized Teradata options such as TENACITY and SLEEP to avoid an abort when the maximum number of concurrent Teradata utilities is reached.

Utilized Teradata SQL Assistant, an ODBC-based client utility used to access and manipulate data.

Used the EXPLAIN statement on every query to help identify potential performance issues.

Teradata utilities: TPT, TPump (5 DML), FastLoad, MultiLoad (5 DML), FastExport, BTEQ; performance analysis, SQL optimization, Teradata SQL Assistant.

CITIGROUP – Elk Grove Village, IL May 2011 – December 2011

Senior SAS / Teradata developer

Supported a production system (Teradata, ETL, SQL, UNIX and SAS) processing high volumes (tens of millions of rows) of input and output data.

Utilized DBQL to trace inefficient SQL for the Suspect Query Report.

Teradata (v13) / SAS (9.3) application developer; work centered on the AMP (access module processor), the core virtual processor in the Teradata RDBMS that manages the database, handles file tasks, and manipulates the disk subsystem for multi-tasking and parallel processing.

SAS/Base, SAS/AF (Frame), SAS/SQL, SAS/Macros, SAS/Stat (summary, means, freq), SAS/EG (4.3, 5.1); SAS PROCs: REPORT; SUMMARY with NWAY, CHARTYPE, AUTONAME, AUTOLABEL; MEANS; TABULATE with COLPCTN, COLPCTSUM, PAGEPCTN, PAGEPCTSUM, REPPCTN, REPPCTSUM, ROWPCTN, ROWPCTSUM, PCTN, PCTSUM.

Created many high-level SAS macros and utilized SAS arrays to improve and simplify existing unorganized data processes; used positional and keyword parameters in macro calls.

SAS arrays and DO loops: DO OVER, DO UNTIL, DO WHILE, and iterative DO with DIM and multi-dimensional arrays.

Utilized SAS data views, which store a definition of data residing in a shared environment within the organization; a minimal sketch follows.
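A minimal DATA step view sketch with placeholder names; the view stores only the definition, and the data are read from the source table each time the view is executed.

    /* Sketch: a DATA step view over a shared source table. Assumes
       the SHARED libref is already assigned. */
    data shared.active_accts_v / view=shared.active_accts_v;
      set shared.account_master;
      where status_cd = 'A';
    run;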

Wrote many Korn shell scripts that execute SAS programs to create numerous SAS tables for business analysts.

Handled cardholder files and account master files for Sears, oil companies, Home Depot and other retailers.

Used many SAS stored procedures in conjunction with the DATA step process.

Procedures: SQL; SORT; FREQ; FORMAT (CNTLIN, CNTLOUT); SUMMARY (AUTONAME); COMPARE (comparing two SAS datasets for discrepancies – see the sketch below); TRANSPOSE; PRINT; REPORT; CONTENTS; DATASETS; IMPORT; EXPORT; APPEND (FORCE).
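A PROC COMPARE sketch for the dataset-discrepancy check named above; the dataset and key names are hypothetical:

    /* Sketch: compare two extracts by key and keep only unequal
       observations for review. */
    proc compare base=work.prod_extract
                 compare=work.new_extract
                 out=work.diffs outnoequal noprint;
      id acct_id;
    run;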

Created functional specifications and performed validation of reports.

Strong SAS developer with business analysis skills who can analyze code and suggest solutions for automation and process improvements; able to communicate with business users as well as the IT project team.

Worked with the five index types the Teradata RDBMS supports (UPI, USI, NUPI, NUSI, join index), chosen based on business needs for higher performance and efficiency, and used data management techniques such as indexing, joining, sorting and aggregation.

Used pass-through to connect to Teradata from UNIX and bring Teradata data into the SAS/UNIX environment.

Created and manipulated account credit master related tables in the SAS/UNIX environment for users.

Loaded SAS tables into Teradata using utilities such as FastLoad, MultiLoad, BulkLoad and TPT (Teradata Parallel Transporter).

Generated customized SQL reports for the business community.

Utilized the SAS/ACCESS interface to the Teradata database via pass-through to extract millions of rows of data using join options: inner join, left join, right join, full join.

Used the Teradata utilities MultiLoad and FastLoad to create Teradata tables with proper Primary Indexes.

To improve query performance, utilized USI, NUSI, PPI, hash indexes and join indexes.

Used the EXPLAIN statement on every query to help identify potential performance issues.

Teradata utilities: TPT, TPump (5 DML), FastLoad, MultiLoad (5 DML), FastExport, BTEQ; performance analysis, SQL optimization, Teradata SQL Assistant.

DISCOVER FINANCIAL SERVICES / DISCOVER CARD – Riverwoods, IL August 2006 – December 2010

Senior SAS / Teradata developer

Supported a production system (Teradata, ETL, SQL, UNIX and SAS) processing high volumes (tens of millions of rows) of input and output data.

SAS 9.2 (Base, Macros, ODS, Stat, Access), Linux, UNIX (AIX, Korn shell, vi editor), Teradata (v12), Oracle, Ab Initio, Cognos, Mainframe, SAS Grid environment.

Utilized the SAS/ACCESS interface and pass-through (PROC SQL with LIBNAME) to databases (Teradata, DB2, Oracle) in UNIX and mainframe environments.

Used SAS, SQL and Ab Initio to extract and transform data from a wide variety of enterprise data sources – a large-scale Teradata data warehouse (tables with billions of rows), an Oracle data mart, and conventional and partitioned flat files – on the UNIX (AIX, Korn shell), PC and mainframe platforms.

Utilized DBQL to trace inefficient SQL for the Suspect Query Report.

Utilized the SAS Grid Manager command-line client utility to process large datasets.

Base SAS ODS (HTML, RTF, PDF, CSV, CSVALL, PROCLABEL).

Strong working knowledge of Unix and Shell Scripting

Good mainframe knowledge with extensive SAS data manipulation experience.

Automated scoring code initially developed by in-house statisticians and outside vendors to allow it to be more easily run on an ongoing basis, using shell scripting and SAS macros to eliminate error-prone manual intervention and provide required audit reports

Automated the Reage and Performance systems and converted Python programs to SAS programs, using SAS macro functions and the UNIX cron and at commands to set up automatic job schedules and generate ad hoc reports for users.

Automated the Direct Payment systems using SAS macro functions and the UNIX cron and at commands to schedule the jobs and generate ad hoc reports for users.

Procedures: SQL; SORT; FREQ; FORMAT (CNTLIN, CNTLOUT); SUMMARY (AUTONAME); COMPARE (comparing two SAS datasets for discrepancies); TRANSPOSE; PRINT; REPORT; CONTENTS; DATASETS; IMPORT; EXPORT; APPEND (FORCE).

Created functional specifications and performed validation of reports.

Strong SAS developer with business analysis skills who can analyze code and suggest solutions for automation and process improvements; able to communicate with business users as well as the IT project team.

Analyzed high volumes of SAS datasets and reported abnormal cases.

SAS/Base, SAS/AF (Frame), SAS/SQL, SAS/Macros, SAS/Stat (summary, means, freq), SAS/EG (4.3, 5.1); SAS PROCs: REPORT; SUMMARY with NWAY, CHARTYPE, AUTONAME, AUTOLABEL; MEANS; TABULATE with COLPCTN, COLPCTSUM, PAGEPCTN, PAGEPCTSUM, REPPCTN, REPPCTSUM, ROWPCTN, ROWPCTSUM, PCTN, PCTSUM.

Teradata utilities: TPT, TPump, FastLoad, MultiLoad, FastExport, BTEQ; performance analysis, SQL optimization, Teradata SQL Assistant.

Provided IT and data sourcing support for business analysts and statisticians in the Risk Management Department

Development environment was SAS 9.1.3 (Base, Macros, ODS, Stat, Access), Linux, UNIX (AIX, Korn Shell, vi editor), Teradata, Oracle, Ab Initio, Cognos, Mainframe.

Used SAS, SQL and Ab Initio to extract and transform data from a wide variety of enterprise data sources – a large-scale Teradata Data Warehouse (tables with billions of rows), an Oracle Data Mart, conventional and partitioned flat files – on the UNIX (AIX, Korn shell, vi editor), PC and Mainframe platforms

Supported statisticians' modeling efforts on both the front-end (sourcing and transforming data) and back-end (reporting results, QA). Responsible for tuning statistician's SQL and SAS code, reducing model scoring run-time in one case from days to under an hour

Responsible for the initial set-up and day-to-day operations and support of a Teradata-based reporting Data Mart for Discover Personal Loans, which was used enterprise-wide to track the growth of the product from launch to over $1 billion in outstanding loans. Also supported and maintained daily standard report processing. Applied changes to reflect product enhancements, including a conversion from an in-house loan-servicing product to an external vendor's banking product (FISERV)

Worked with analysts and statisticians to implement line management strategies, including the generation of production transactions used by IT to effect changes to cardholders' accounts (credit line increases and decreases), as well as the associated required reporting

Worked with Marketing and Risk to develop and run processing to supply Consumer and Business Card data to an outside vendor (Argus) on a monthly basis to enable Discover to participate in industry-wide competitive analysis studies

Worked with the Risk and IT groups to transition code from successful model and strategy tests to permanent production status, including a Tradeline model that earned the Discover employees a President's Award

Processed numerous quick-turnaround ad hoc reporting and data requests for Discover and its outside consulting firms, e.g., Credit Card reform legislation requests. Results were delivered in a variety of formats – flat files, SAS data sets, Teradata tables, Excel, etc

Supported analysts' efforts to explore new sources of data, including new scores, mortgage data, and enhanced Credit Bureau information. Worked with analysts to pull samples to provide to the Credit Bureaus and outside consulting firms, and created SAS data sets and Teradata tables with returned data for follow-up, in-house analysis

Provided mainframe storage usage reports that enabled Risk Management to identify unused archived data (disk and tape), allowing the department to dramatically reduce recurring monthly storage charges

As a Teradata (v13) / SAS (9.2) application developer, worked with the AMP (access module processor), the core virtual processor in the Teradata RDBMS that manages the database, handles file tasks, and manipulates the disk subsystem for multi-tasking and parallel processing.

Created tables with Unique and Primary keys as defined constraints so that rows are distributed evenly across the pre-defined number of AMPs.

Worked with the five index types the Teradata RDBMS supports (UPI, USI, NUPI, NUSI, join index), chosen based on business needs for higher performance and efficiency, and used data management techniques such as indexing, joining, sorting and aggregation.

Used pass-through to connect to Teradata from UNIX, bringing Teradata data into the SAS/UNIX environment, and created/manipulated account credit master related tables in the SAS/UNIX environment for users.

Loaded SAS tables into Teradata using utilities such as FastLoad, MultiLoad and BulkLoad; generated customized, decision-making SQL reports for the business community.

Utilized the SAS/ACCESS interface to the Teradata database via pass-through to extract millions of rows of data using join options: inner join, left join, right join, full join.

Used the Teradata utilities MultiLoad and FastLoad to create Teradata tables with proper Primary Indexes, avoiding table skew for good performance.

Utilized Teradata options such as TENACITY and SLEEP to avoid an abort when the maximum number of concurrent Teradata utilities is reached.

Utilized Teradata SQL Assistant, an ODBC-based client utility used to access and manipulate data.

To improve query performance, utilized USI, NUSI, PPI, hash indexes and join indexes.

Used the EXPLAIN statement on every query to help identify potential performance issues.

CITIGROUP RISK MANAGEMENT, Elk Grove Village, IL March 2006 – June 2006.

Senior SAS / Teradata developer

Supported a production system (Teradata, ETL, SQL, UNIX and SAS) processing high volumes (tens of millions of rows) of input and output data.

Utilized the SAS/ACCESS interface and pass-through (PROC SQL with LIBNAME) to databases (Teradata, DB2, Oracle) in UNIX and mainframe environments.

Defined, designed and developed systems to support Management. Also investigated, analyzed, documented and resolved system issues

Development environment included SAS 9.1.3, UNIX Solaris 5.9, vi editor, IBM Mainframe (MVS), PC and Microsoft products

Responsible for populating a large UNIX-based Reporting Data Mart (20+ GB, 1000 variables)

Supported ETL processes that loaded MSR (Mailed System Records) mainframe data into the SAS environment

Worked on the Historic Repository conversion, migrating processing from the TSYS credit card system to the FDR credit card system

Developed and tested SAS programs to maintain SAS datasets for the Account, Balance, Rewards and Transaction applications

Generated statistical analytics reports for Business Clients

Helped less-experienced programmers finish their programs and ready them for production implementation

Processed large-scale SAS datasets – e.g., 80 million observations, 2100 variables – using SAS indexing to improve performance and compression to reduce space requirements
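A sketch of the indexing-plus-compression combination described above, with placeholder dataset and key names:

    /* Sketch: build a compressed dataset with a simple index on the
       account key to speed keyed lookups and WHERE processing. */
    data work.big_accts(compress=yes index=(acct_id));
      set src.big_accts_raw;
    run;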

Developed and tested numerous SAS programs and UNIX scripts, ensuring that the resulting processing was production-ready – return codes checked, variables verified for valid and meaningful values, UNIX scripts created for automated job scheduling and notification

Utilized custom SAS processing (DATA _NULL_, Merge), as well as a number of SAS PROCs, including Means, SQL, Summary, Forms, Print, Tabulate, Freq, Report, Rank, CORR, Univariate, Standard, Append, Explode, Export, Format, Registry, Release, Datasets, Compare, Import, Sort, Contents, Options, Source, Convert, Optload, Copy, Optsave, Transpose, Printto, Trantab

Employed SAS macros and arrays to help minimize coding redundancy

Tuned SAS programs to reduce CPU resource usage and boost performance.

Procedures: SQL; SORT; FREQ; FORMAT (CNTLIN, CNTLOUT); SUMMARY (AUTONAME); COMPARE (comparing two SAS datasets for discrepancies); TRANSPOSE; PRINT; REPORT; CONTENTS; DATASETS; IMPORT; EXPORT; APPEND (FORCE).

SAS/Base, SAS/AF (Frame), SAS/SQL, SAS/Macros, SAS/Stat (summary, means, freq), SAS/EG (4.3, 5.1); SAS PROCs: REPORT; SUMMARY with NWAY, CHARTYPE, AUTONAME, AUTOLABEL; MEANS; TABULATE with COLPCTN, COLPCTSUM, PAGEPCTN, PAGEPCTSUM, REPPCTN, REPPCTSUM, ROWPCTN, ROWPCTSUM, PCTN, PCTSUM.

ABBOTT LABORATORIES (CHR BUSINESS SYSTEMS) – Abbott Park, IL February 2004 - December 2005.

Senior SAS / Teradata developer

Supported a production system (Teradata, ETL, SQL, UNIX and SAS) processing high volumes (tens of millions of rows) of input and output data.

Data management, data manipulation, statistical analysis and reporting using SAS 8, UNIX, DB2, Mainframe (MVS), PC and Microsoft products (Excel, Access, VBA, pivot tables, charting). Designed, developed, implemented, tuned and supported many SAS programs, SAS datasets and SAS macros

Worked on healthcare claims using SAS/Base, SAS/SQL, SAS/Macros, SAS/Stat, SAS/EG, and SAS PROCs.

Assisted internal business clients with their requirements gathering, design, development, construction, and implementation efforts

Supported the production Corporate Human Resource System – 300 SAS jobs in a mainframe, Web, UNIX and PC environment

Designed, developed, implemented and supported SAS-based New Hire and Transferred Employees Status weekly, monthly, quarterly and annual processing

Helped build company’s new HOSPIRA system (150 SAS jobs), which processed a variety of files and reports

Created 200 statistical and analysis SAS ad-hoc reports to satisfy various departments’ requirements for timely analysis

Developed Web-based Corporate Employee File Organization (CEFO) application in HTML and Cold Fusion that was used to maintain employees’ records on-line

Responsible for the support of 75 AIMS DB2 tables, which stored current and historical data for use by ad hoc requestors from various departments. Provided tech support for the system’s users

Mentored less-experienced employees, providing guidance on how to program and maintain jobs efficiently

Created Infopac (on-line report viewing system) and Dispatch tables

Maintained a number of COBOL programs used to populate DB2 tables with up-to-date information

Supported COBOL/UNIX data manipulation processes, including UNIX Bourne/Korn scripts, to handle feeds to/from various systems

Worked on a team that created large mainframe (MVS) flat files used to load the MENTOR system’s Oracle database tables

Extracted HR data using Tesseract in conjunction with COBOL/DB2

Utilized IBM DB2 catalog tables to identify prime keys for multiple-table joins so appropriate changes could be made to improve performance

Created many new mainframe PROCs with symbolic parameters

Loaded various DB2 tables using DB2 utilities; created many VSAM files (KSDS) with alternate indices and path entries

Used Excel and Access VBA to manipulate .txt, .csv and Excel files. Created pivot tables for analysis, charting, reporting purposes

Procedures: SQL; SORT; FREQ; FORMAT (CNTLIN, CNTLOUT); SUMMARY (AUTONAME); COMPARE (comparing two SAS datasets for discrepancies); TRANSPOSE; PRINT; REPORT; CONTENTS; DATASETS; IMPORT; EXPORT; APPEND (FORCE).

WALGREEN CO., Deerfield, IL January 2003 – January 2004

Senior SAS / Teradata developer

Supported a production system (Teradata, ETL, SQL, UNIX and SAS) processing high volumes (tens of millions of rows) of input and output data.

Performed requirements gathering, design, development, construction, and implementation for internal business clients

Supported legacy AP, DMS (Check Print System), FA (Future Rent) and Lawson AP Systems

Built Travel Meal System using MicroFocus COBOL (Express 3.1) in conjunction with Oracle Database via ODBC

Upgraded /modified various MicroFocus COBOL programs in the DMS system to improve efficiency and boost performance

Created and maintained UNIX Bourne shell scripts and COBOL programs used to process data feeds

Maintained and enhanced COBOL programs used to update DB2 tables

Created, maintained and tested mainframe MVS JCL and numerous SAS programs – SAS procedures used included PROC Summary, PROC Tabulate, PROC Means, PROC Access, PROC Report, PROC Freq, PROC SQL, PROC Sort, PROC Print, and PROC Forms

Used SAS DATA _NULL_ steps to convert SAS data to mainframe flat files and generate customized reports, as in the sketch below.
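A DATA _NULL_ sketch writing a fixed-layout flat file of the kind fed to mainframe processes; the layout, path, and variable names are hypothetical:

    /* Sketch: write a fixed-column flat file from a SAS dataset. */
    data _null_;
      set work.ap_detail;
      file '/data/out/ap_feed.txt' lrecl=80;
      put @1  acct_id   $10.
          @11 trans_dt  yymmddn8.
          @19 amount    12.2;
    run;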

Performed complex data manipulation against master and transaction files using SAS (update, merge)

Responsible for archiving SAS data to tape and disk backup for history purposes

CNA INSURANCE, Chicago, IL July 1990- February 2002

IT Application Specialist / Systems Analyst

Supported the needs of a number of internal business clients throughout the entire systems lifecycle – requirements, design, development, testing, implementation, maintenance and support

Development was done in a variety of environments and platforms, including the mainframe (MVS, COBOL/ COBOL II), UNIX (HP), PC (Visual Basic), database (DB2, Oracle, Teradata), SAS (multiple platforms)

Assumed full responsibility for the Information Diversify Facility System (Oracle data warehouse), as well as the Claims Verification System (DB2 database, Visual Basic scripts)

Developed a number of on-line applications that utilized Visual Basic 6.0 on the front-end, COM/DCOM/MTS on the middle-tier, and UNIX/Oracle on the back-end. Created documentation for the systems, implemented them, and also trained users

Designed, wrote, tested and implemented a Claim/Policy Verification System that employed Visual Basic scripts which communicated with DB2 via ODBC. Provided documentation for the system and trained users

Implemented improved Visual Basic programs and forms to address users’ requests

Enhanced many COBOL/COBOLII programs, and made technical recommendations to project managers

Converted Visual Basic 3.1 16-bit applications to 32-bit Visual Basic 6.0 and implemented the upgraded applications. Provided VB mentoring and training to less-experienced team members

Built a large-scale Oracle data warehouse on HP-UNIX for CNA’s Monthly Financial Close

Generated status reports and documented problems and their resolution for team members and managers

Assumed a group DBA role


