Data Developer

Location: Lewisville, TX
Posted: November 08, 2017

SYNOPSIS:

Over *+ years of diversified IT experience, with extensive knowledge and hands-on work in the design, development, testing, implementation, support, and maintenance of client-server applications in data warehousing environments.

Over 7 years of industry experience in developing strategies for ETL (Extraction, Transformation and Loading) using the Ab Initio tool in complex, high-volume data warehousing projects on both Windows and UNIX.

EXPERIENCE SUMMARY:

Worked with various heterogeneous source systems such as Oracle, DB2 UDB, Teradata, MS SQL Server, flat files, and legacy systems.

Strong hands-on experience with Ab Initio GDE (1.15/1.14/1.13/3.0.6) and Co>Operating System (2.15/2.14/2.13/3.0.6).

Used UNIX commands to copy, delete, and create files and directories, along with utilities such as top and crontab.

Strong experience in Data Engineering, Data Quality, Data Management, Data Analysis, and Data Presentation.

Experience in building data products from transaction-level data; designed a data product for cross-border payments (FX) to identify all international wire payments.

Good experience in Event Categorization and Business Name Standardization from transaction-level data.

Rich experience in developing interactive dashboards using Tableau.

Well versed with various Ab Initio components such as Partition by Round Robin, Join, Rollup, Partition by Key, Gather, Merge, Interleave, Dedup Sorted, Scan, Validate, and FTP.

Experience in writing UNIX shell scripts (Wrapper Scripts) for Autosys Scheduling.

Thorough understanding of the Software Development Life Cycle (SDLC), including requirements analysis, system analysis, design, development, documentation, training, implementation, and post-implementation review.

Good knowledge of the Financial and Asset Management domains.

Worked on SAS/Base, SAS/Stat, SAS/Access, SAS Graphs and SAS/Macro.

Expertise with various Ab Initio parallelism techniques; implemented a number of Ab Initio graphs using data parallelism and Multi File System (MFS) techniques.

Specialized in ETL using Ab Initio (GDE, Co>Operating System, EME) and SQL*Loader for data migration to the enterprise data warehouse.

Exceptional skills in writing complex SQL queries and procedures involving multiple tables, constraints, and relationships for efficient data retrieval and data validation against relational databases using SQL Query Analyzer and TOAD.

Expertise in all GDE components for creating, executing, testing, and maintaining Ab Initio graphs, along with experience using the Ab Initio Co>Operating System for application tuning and debugging. Experienced in parameterizing graphs and working with PSETs.

Proficient in QA processes, Test strategies, and in performing Functional, GUI, Regression, Database, Integration and Load Testing.

Developed a wide variety of DMLs for data from different sources; developed many conditional DMLs and vectors, and created several .dbc files for different data sources.

Performed troubleshooting, maintenance, and performance tuning of Ab Initio graphs, PL/SQL procedures, and Korn shell scripts.

Used Continuous Flows for inserting and updating the transactional data as per the requirement.

Delivered the data mapping validation strategy and reviewed/examined legacy data with the Ab Initio Data Profiler.

Developed various UNIX shell scripts to run Ab Initio and database jobs.

Expertise in the concepts of ER Modeling, Dimensional Modeling, Data Marts, and Fact and Dimension tables.

Well versed with UNIX shell wrappers, KSH and Oracle PL/SQL programming and stored procedures. Technical Mentor with excellent communication and client liaison skills.

Extensively worked on EME for version control.

Performed metadata mapping from legacy source systems to target database fields and created Ab Initio DMLs from COBOL copybooks.

Excellent knowledge of data warehousing and business intelligence concepts, working with data modeling methodologies and architectures (Star Schema/Snowflake Schema).

Excellent Communication & Interpersonal skills.

Quick learner and excellent team player, with the ability to meet tight deadlines and work under pressure.

TECHNICAL SKILLS:

ETL Tools: Ab Initio (GDE, Co>Operating System), Informatica 9.5.

SAS Tools: SAS V8, V9, SAS/BASE, SAS/GRAPHS, SAS MACROS, SAS/ODS.

Reporting Tools: Tableau 10.3, ClearQuest 6.5, Business Objects 5.i, COGNOS, Crystal Reports 8.5

RDBMS: Oracle 9i/8i, DB2 UDB EEE, Teradata V2R5, SQL Server 2000/7.0

Operating Systems: UNIX (AIX, Sun Solaris), Linux (Red Hat, Slackware, SUSE), Windows NT/XP/2000

Data Modeling Tools: Erwin 4.0, Star Schema Modeling, Snowflakes Modeling, MS Visio

Scheduling Tools: Autosys, Control-M.

Development Tools: MS Project, MS Excel, MS PowerPoint, MS Word, Macromedia Dreamweaver, Flash

Methodologies: Ralph Kimball’s Data modeling, Star and Snowflake Schema Modeling, OOAD

Development Languages: SQL, PL/SQL, Unix Shell Scripting, C/C++, Java, PHP, Perl, HTML, DHTML, ASP, JDBC, ODBC, JSP

QUALIFICATIONS:

Bachelor of Engineering in Computer Science, Bellary, Karnataka, India

PROJECT EXPERIENCE:

Bank of America, New York, NY Oct '16 – Present

Data Scientist/Tableau Developer

Responsibilities:

Understood the requirements of various business segment client managers and treasury officers in building dashboards related to cross-border payments.

Understood the data flow of international wires coming from various source systems.

Developed an end-to-end data product which captures all the international wire payments initiated from the bank.

Developed code to identify the destination (beneficiary) country from the semi-structured/structured data populated in various columns/fields, using SAS and Teradata SQL.

Developed logic using SWIFT codes, IBAN (International Bank Account Number) values, bank address fields, and routing numbers available in various fields of the Teradata tables in semi-structured format.
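
Illustrative sketch of this derivation logic (the real work was done in SAS and Teradata SQL; the table, column, and logon names below are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: derive the beneficiary country from IBAN / SWIFT BIC fields.
# wire_stage, iban_nbr, swift_bic and the logon string are hypothetical placeholders.
bteq <<'EOF'
.LOGON tdpid/etl_user,password;

SELECT wire_id,
       /* IBAN: first two characters are the ISO country code;
          SWIFT BIC: characters 5-6 are the ISO country code. */
       COALESCE(
           CASE WHEN iban_nbr  IS NOT NULL THEN SUBSTR(iban_nbr, 1, 2) END,
           CASE WHEN swift_bic IS NOT NULL THEN SUBSTR(swift_bic, 5, 2) END
       ) AS beneficiary_ctry_cd
FROM   sandbox_db.wire_stage;

.LOGOFF;
.QUIT;
EOF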

Tested the code/logic to ensure there are no invalid values, false positives, and wrong assignments while identifying the destination country.

Shared the final output files with the respective business partners for validation.

Streamlined the SAS and SQL code files and moved the final code into production after receiving sign-off from business partners.

Used Tableau visualization software and built interactive dashboards for various business segment (Small Business, Large Corporates, Middle Market, Business Banking) client managers to get better insights from the data.

The dashboard helps client managers and treasury officers tap revenue opportunities by selling FX services to various clients of the bank.

Strong experience in building dashboards using Tableau Desktop and Tableau Server products.

Proficient in leveraging Tableau capabilities such as Calculations, Filters, Parameters, Sets, Table Calculations, Mark Card changes, and Tooltip formatting to create focused and effective visualizations in Tableau Desktop.

Represented the cross-border payment views using Geographic Maps, Pie Charts, Bar Charts, Line Charts, Cross Tabulations, and Heat Maps.

Created a view for top-N analysis to identify top beneficiary parties, currency formats, and share of wallet.

Created calculated fields using arithmetic calculations, custom aggregations, date math, and quick table calculations.

Worked with merging multiple data sources using cross-database joins and data blending concepts to pull various data fields.

Implemented in-memory computing by using the Tableau data extract option to improve the performance of the dashboard.

The dashboard highlights the total number of transactions and the dollar amount sent by each client for the rolling 12 months.
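
A simplified sketch of the kind of Teradata aggregation feeding this rolling 12-month view (shown as a bteq harness; all object names are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: rolling 12-month transaction count and dollar amount per client.
# wire_fact, client_id, txn_dt, usd_amt and the logon string are hypothetical placeholders.
bteq <<'EOF'
.LOGON tdpid/etl_user,password;

SELECT client_id,
       COUNT(*)     AS txn_cnt,
       SUM(usd_amt) AS total_usd_amt
FROM   sandbox_db.wire_fact
WHERE  txn_dt >= ADD_MONTHS(CURRENT_DATE, -12)   -- rolling 12 months
GROUP BY client_id;

.LOGOFF;
.QUIT;
EOF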

The dashboard can be drilled down to see which Client Manager and Treasury Officer are responsible for selling FX services to each client.

For any given client, the dashboard shows the number of transactions sent to various countries, the currency format in which they were sent, whether the bank's FX services were used or not, and the beneficiary party and beneficiary financial institution details.

The dashboard also serves as a tool to identify third-party FX service providers.

Published the dashboards to Tableau Server and scheduled the extracts.

Assigned appropriate permissions to end users in the various business segments.

Responsible for adding, removing, and setting permissions for the users of the dashboard.

Wells Fargo, San Francisco, CA Jan '14 – Jul '16

Data Scientist/Tableau Developer

Responsibilities:

Designed a dashboard which serves as a tool to monitor the health of customers (360-degree view).

Responsible for building the end-to-end dashboard, from requirement gathering to delivering the Tableau dashboard to end users.

The dashboard gives a snapshot of various network trends of buyers, suppliers, and employees for a given client.

The dashboard shows employer-to-employee coverage, the commercial product breakdown, and consumer and commercial exposure for a given client.

The dashboard also allows users to drill down to the employees of a client to see their product mix, FICO mix, product balances, exposure values, and geographical distribution.

The dashboard further provides insights at the industry and sector level for a given client.

Connected to Teradata through Tableau and wrote custom SQL queries to pull the data from the database into Tableau.
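
A hypothetical sketch of such a custom SQL pull, shown here as a bteq test harness of the kind used to validate the query before placing it in Tableau (all database, table, and column names are placeholders):

#!/bin/ksh
# Illustrative only: client 360 pull joining client, exposure and product tables.
# All database, table and column names are hypothetical placeholders.
bteq <<'EOF'
.LOGON tdpid/etl_user,password;

SELECT c.client_id,
       c.client_nm,
       e.consumer_exposure_amt,
       e.commercial_exposure_amt,
       p.product_cd,
       p.product_bal_amt
FROM       sandbox_db.client_dim      c
INNER JOIN sandbox_db.client_exposure e ON c.client_id = e.client_id
LEFT JOIN  sandbox_db.product_bal     p ON c.client_id = p.client_id;

.LOGOFF;
.QUIT;
EOF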

Used groups, parameters, sorts, sets, and filters to create focused and effective visualizations in Tableau.

Made use of actions, images, maps, trend lines, groups, hierarchies, and sets to create detail-level summary reports.

Worked on creating calculated fields for custom requirements using different functions such as number functions, string functions, date functions, and table calculations.

Built network data views using Geographic Maps, Pie Charts, Bar Charts, Line Charts, Cross Tabulations, and Heat Maps.

Designed and developed various network views from multiple data sources by blending data on a single worksheet in Tableau Desktop.

Worked on creating various plots and drill-down analyses using Tableau Desktop.

Worked on scheduling reports, managing data sources, and refreshing extracts.

Developed the dashboard in the Dev environment and migrated it to the UAT and PROD environments.

Worked on Tableau Server for publishing, scheduling data refresh frequency, security, and access control.

Administered users, user groups, and scheduled instances for reports in Tableau.

Resolved various performance issues and analyzed the best process distribution for different projects.

Fannie Mae, Herndon, VA Jul ’13 – Present

Sr. ETL Developer

Responsibilities:

Designed and deployed ETL graphs by studying the business requirements from the functional design document and from the business users.

Worked on the creation of the low-level design documents.

Worked with the DBA team for the creation of the required database objects in the data mart.

Created mapping documents for the development of ETL.

Worked extensively with Ab Initio vector functions to deal with complex business rules; functions such as vector_concat, vector_sort_dedup_first, vector_append, and vector_select were used.

Worked heavily with PLANs, PSETs, and PDL to make the ETL flow more dynamic and versatile and to set dependencies.

Worked with PSETs to execute the same job for different source systems with different sets of parameter values.

Developed DMLs for data from different sources such as flat files; developed conditional DMLs for data containing header, body, and trailer records.

Developed complex Ab Initio XFRs to derive new fields and satisfy various business requirements.

Compared different versions of code in the EME Console using the EME graphical and textual difference functionality.

Used different EME air commands such as air ls, air object version, and air project export.

Used phasing and checkpoints in the graphs to avoid deadlocks and to recover completed stages of the graph in case of a failure.

Worked on the creation of CSV files as output from the ETL.

Worked extensively on the creation of ad hoc Excel reports for business reviews, which included the use of components such as Write Excel Flow and Read Excel Flow.

Worked extensively on Logical and Physical data modeling.

Created UNIX shell scripts (Wrapper Scripts) for Autosys Scheduling.
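
A minimal sketch of such an Autosys wrapper script (the environment file, paths, and graph name are hypothetical placeholders; a non-zero exit code marks the Autosys job as failed):

#!/bin/ksh
# Illustrative wrapper invoked by an Autosys command job.
# The sourced environment file, AI_RUN, log path and graph name are hypothetical placeholders.
. /app/etl/env/project_env.ksh              # source sandbox/project parameters (sets AI_RUN, etc.)

GRAPH=load_customer_dim.ksh
LOG=/app/etl/logs/${GRAPH%.ksh}_$(date +%Y%m%d_%H%M%S).log

print "Starting $GRAPH ..." | tee -a "$LOG"
"$AI_RUN/$GRAPH" >> "$LOG" 2>&1
RC=$?

if [[ $RC -ne 0 ]]; then
    print "ERROR: $GRAPH failed with return code $RC" | tee -a "$LOG"
    exit $RC                                # non-zero exit marks the Autosys job as FAILURE
fi

print "$GRAPH completed successfully" | tee -a "$LOG"
exit 0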

Used the Autosys scheduler for scheduling, updating, creating, and querying box/command jobs in a multi-server environment.

Wrote wrapper scripts to automate batch jobs.

Experience in building Ab Initio BRE (Business Rules Engine) rules and calling the BRE XFRs in Ab Initio.

Took care of dependencies while creating the box jobs in Autosys.

Improved the performance of the ETLs by optimizing the use of components such as Sort that break the parallel processing of data.

Responsible for creating test case scenarios and ensuring that the data from the source is available in the target database in the proper format.

Created UNIX scripts to email the files generated by the ETL and to SFTP the files to the target servers.
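
A simplified sketch of that kind of script, using mailx and batch-mode sftp (recipients, hosts, and paths are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: notify users and push ETL output files to a target server.
# Recipients, host names and directories are hypothetical placeholders.
OUT_DIR=/app/etl/outbound
TARGET=etluser@target-host.example.com

# Email the extract summary to the business distribution list
mailx -s "Daily extract files generated" dl-business@example.com < "$OUT_DIR/extract_summary.txt"

# SFTP the generated files using a batch file
cat > /tmp/sftp_batch.$$ <<EOF
cd /incoming
put $OUT_DIR/*.csv
bye
EOF

sftp -b /tmp/sftp_batch.$$ "$TARGET"
RC=$?
rm -f /tmp/sftp_batch.$$
exit $RC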

Performed integration testing and involved in code reviews and UAT.

Coordinated with the testing team on test results and the creation of the test scripts; worked on creating the target queries for the test scripts.
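
A minimal sketch of a target validation query of this kind, run through sqlplus as a source-to-target reconciliation (connect string, schemas, and table names are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: compare staging vs. target row counts after an ETL load.
# The connect string, schemas and table names are hypothetical placeholders.
sqlplus -s etl_user/password@DWDEV <<'EOF'
SET PAGESIZE 100 LINESIZE 200

SELECT 'STG_CUSTOMER' AS table_name, COUNT(*) AS row_cnt FROM stg.stg_customer
UNION ALL
SELECT 'DIM_CUSTOMER', COUNT(*) FROM dm.dim_customer;

-- Keys present in staging but missing from the target
SELECT s.customer_id
FROM   stg.stg_customer s
MINUS
SELECT d.customer_id
FROM   dm.dim_customer d;

EXIT;
EOF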

Environment: Ab Initio GDE 3.1.3/3.1.2, Co>Op 3.1.3/3.0, UNIX, Oracle 11i, Toad for Oracle, Windows 2003, EME Console, Autosys and Op-Console

VISA Inc. Foster City, CA Sep ’12 – Jul ’13

Sr. ETL Developer

Responsibilities:

Designed and developed various Ab Initio Graphs.

Developed Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by key, Partition by round robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Merge etc.

Involved in estimation activity for development efforts.

Involved in unit testing and documentation of Ab Initio Graphs.

Involved in meetings to gather information and requirements from the business.

Used Psets and PLAN to create Ab Initio jobs.

Involved in establishing and continuously improving development guidelines, standards, and procedures.

Good Knowledge of Dimensional data modeling, Fact and Dimensional tables, Physical and Logical data modeling.

Experience in relational databases: Oracle, DB2, SQL Server, and Teradata.

Extensively used Ab Initio tool’s feature of Component, Data and Pipeline parallelism.

Used Meta Programming functions and PDL to create generic applications.

Developed and troubleshot Teradata data loads using FastLoad, MultiLoad, and TPump techniques.
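
A minimal FastLoad sketch of the kind implied here, invoked from a shell script (logon string, staging table, and file layout are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: bulk-load a pipe-delimited flat file into an empty staging table.
# tdpid, credentials, table and column names are hypothetical placeholders.
fastload <<'EOF'
LOGON tdpid/etl_user,password;

SET RECORD VARTEXT "|";
DEFINE txn_id  (VARCHAR(20)),
       txn_dt  (VARCHAR(10)),
       usd_amt (VARCHAR(18))
FILE = /app/etl/inbound/txn_daily.dat;

BEGIN LOADING stg_db.stg_txn
      ERRORFILES stg_db.stg_txn_err1, stg_db.stg_txn_err2;

INSERT INTO stg_db.stg_txn (txn_id, txn_dt, usd_amt)
VALUES (:txn_id, :txn_dt, :usd_amt);

END LOADING;
LOGOFF;
EOF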

Responsible for loading configuration data from DEV environment to QA and then to PROD.

Analyzed the issues for unmatched records and provided code fix to the issues.

Responsible for designing and developing shell scripts (Wrapper) to validate data.

Used Teradata SQL Assistant to query data in the target Teradata tables.

Developed and automated Load utility graphs which refreshes the data across databases.

Developed shell scripts to automate file manipulation and data loading.

Developed UNIX wrapper scripts to run multiple Ab Initio graphs together in a sequence based on requirements.
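
A condensed sketch of such a sequencing wrapper; each deployed graph runs only if the previous one succeeded (graph names and the AI_RUN path are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: run deployed Ab Initio graphs in order, stopping on the first failure.
# AI_RUN is assumed to be set by the project environment; graph names are hypothetical.
for GRAPH in extract_txn.ksh transform_txn.ksh load_txn.ksh
do
    print "Running $GRAPH ..."
    "$AI_RUN/$GRAPH" || { print "ERROR: $GRAPH failed"; exit 1; }
done

print "All graphs completed successfully"
exit 0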

Worked on Ab Initio Multi File System (MFS) and multifiles to make use of data parallelism and to achieve better performance.

Used Ab Initio EME for check in/ Checkout and Version Control.

Developed data validation controls and Exception processes per business requirements to validate transformations in Ab Initio Graphs and report exceptions.

Involved in understanding source system, documenting issues, following up on issues and seeking resolutions.

Involved in writing user defined functions for business process to improve the performance of the application.

Created PSETs to run a generic graph with different parameters and avoided code redundancy.

Created test plans and test cases and responsible for Unit testing.

Environment: Ab Initio (GDE 3.0.6, Co>Op 3.0.6), Oracle 10g, Teradata, SQL Developer, EME, Control-M, Unix Shell Scripting

Medco, Franklin Lakes, NJ Nov ’11 – Sep ’12

Sr. ETL Developer

Responsibilities:

Worked in the Data Management team on data extraction, fictionalization, subsetting, data cleansing, and data validation.

Used Ab Initio as ETL tool to pull data from source systems, cleanse, transform and load data into database.

Prepared logical/physical diagrams of the DW and presented them to business leaders; used Erwin for model design.

Extensive use of the multifile system, where data is partitioned into four partitions for parallel processing.

Implemented lookups instead of joins and used in-memory sorts to minimize execution times while dealing with huge volumes of data.

Developed Generic graphs for data cleansing, data validation and data transformation.

Worked with De-partition components like Concatenate, Gather, Interleave and Merge in order to de-partition and repartition data from multi files accordingly.

Deployed and ran the graphs as executable Korn shell scripts in the applications system.

Modified Ab Initio graphs to utilize data parallelism and fine-tune execution times by using multifile systems and lookup files whenever required, thereby improving overall performance.

Worked with Partition components like Partition by key, Partition-by-expression and Partition-by-round robin to partition the data from serial file.

Created common graphs to perform common data conversions that can be reused across applications, using a parameterized approach with conditional DMLs.

Responsible for cleansing the data from source systems using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression, and Rollup.

Used Toad to verify the counts and results of the graphs.

Worked with Teradata utilities such as MultiLoad, FastLoad, TPump, and FastExport to load or extract tables.

Implemented business validation rules in the staging area using Ab Initio built-in functions such as is_valid, is_null, is_defined, is_error, and is_blank.

Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.

Used Enterprise Meta Environment [EME] for version control, Control-M for scheduling purposes.

Worked on Continuous Flows in Ab Initio.

Extensively used database components such as Run SQL, Multi Update Table, Update Table, and Join with DB.

Involved in setting up and maintaining sandbox parameters.

Created several ad-hoc graphs to provide reports for urgent requests from the business.

Wrote wrapper scripts to automate batch jobs and provided production support when needed.

Tuning of Ab Initio graphs for better performance.

Involved in System and Integration testing of project.

Worked closely with QA to help them understand the system to meet all the required test cases to be verified before the code is migrated to subsequent environments UAT and PROD.

Wrote several shell scripts to remove old files and move raw logs to archives.
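
A short sketch of that kind of housekeeping script (directories and retention windows are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: archive raw logs and purge old files.
# Directories and the 30/90-day retention windows are hypothetical placeholders.
LOG_DIR=/app/etl/logs
ARCH_DIR=/app/etl/archive

# Compress raw logs older than 30 days and move the compressed files to the archive
find "$LOG_DIR" -name "*.log" -mtime +30 -exec gzip {} \;
find "$LOG_DIR" -name "*.log.gz" -mtime +30 -exec mv {} "$ARCH_DIR" \;

# Purge archived files older than 90 days
find "$ARCH_DIR" -name "*.log.gz" -mtime +90 -exec rm -f {} \;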

Processed and transformed daily delta feeds of customer data.

Developed dynamic graphs to load data from data sources into tables and to parse records.

Environment: Ab Initio [Co-op 2.15, GDE 1.15.9.2], UNIX, EME, Teradata, TOAD, Erwin, Control-M and Windows.

Fannie Mae, Virginia, VA Apr ’10 – Nov ’11

Sr. ETL Developer

Responsibilities:

Extensively worked on understanding the business requirements and responsible for Categorizing and Prioritizing the requirements according to the time frame, system resources, human resources and other related factors.

Involved in all phases of the Software Development Life Cycle: analysis, business modeling, and data modeling.

Used Ab Initio to extract, transform, and load data from multiple input sources, such as flat files and Oracle, into the database.

Designed, developed and unit tested relational data warehouse project at Fannie Mae using Ab Initio ETL tool.

Created various Ab Initio graphs with 8-way Multi File Systems (MFS), composed of individual files on different nodes that are partitioned and stored in distributed directories (using multidirectories).

Involved in preparing graph design documents.

Developed several partition-based Ab Initio graphs for a high-volume data warehouse.

Developed various parameterized graphs in GDE.

Extensively used Ab Initio tool’s feature of component, data and pipeline parallelism.

Extensively used aggregating components like Rollup, Scan and Scan with Rollup in transformations to consolidate the data.

Involved in tuning the graphs by creating lookup files, minimizing the number of components in the flows, and implementing MFS.

Developed/modified subject area graphs based on business requirements using various Ab Initio components such as Filter by Expression, Partition by Expression, Reformat, Join, Gather, Merge, Rollup, Normalize, Denormalize, Scan, and Replicate.

Involved in writing shell scripts.

Involved in performance tuning of SQL queries, views using TOAD.

Implemented data parallelism in graphs, in which data is divided into segments and each segment is processed simultaneously, using Ab Initio partition components to segment the data.

Developed database objects using SQL, indexes, constraints, etc.; good experience in SQL optimization, primarily on Oracle.

Good knowledge of development methodologies.

Used phases and checkpoints in the graphs to avoid the deadlocks, improve the performance and recover the graphs from the last successful checkpoint.

Extensively used Ab Initio built in string, math and date functions.

Used Ab Initio web interface to navigate the EME to view graphs, files and datasets and examine the dependencies among objects.

Efficiently used Graph level parameters in building and executing graphs.

Experience using the Autosys scheduler, including scheduling, updating, creating, and querying box/command jobs in a multi-server environment.

Created UNIX shell scripts (Wrapper Scripts) for Autosys Scheduling.

Performed various data cleansing and data quality exercises using various Ab Initio functions.

Provided level2 production support activities for the application.

Scheduled the development team to walk through their process with the DTOPS and test/acceptance teams.

Involved in creating the LOE (level of effort) estimate and presenting it to the application manager.

Worked with the PM assigned to the application to provide input to the project plan.

Performed the tasks outlined in the plan.

Worked with the application manager and PM to add dates to the dashboard.

Involved in evaluating the code to determine what needed to change to conform to operational requirements.

Provided status updates to the assigned PM on the tasks assigned to the team.

Worked extensively on Logical and Physical data modeling.

Environment: Ab Initio GDE (1.15/1.14), Co>Op (2.15/2.14), EME, Autosys, UNIX, Clear Quest, Oracle 10g, Toad, UNIX Script (KShell).

Amgen, Thousand Oaks, CA Jul ’08 – Apr ’10

Sr. ETL Developer

Responsibilities:

Performed analysis and designed and prepared the functional design documents, technical design documents, and code specifications.

Based on the requirements created Functional design documents and Technical design specification documents for transforming and loading.

Developed and supported the extraction, transformation, and load (ETL) process for a data warehouse from OLTP systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.

Responsible for all pre-ETL tasks upon which the data warehouse depends, including the management and collection of various existing data sources.

Involved in developing UNIX Korn Shell wrappers to run various Ab Initio Scripts.

Developed Ab Initio XFR’s to derive new fields and solve various business requirements.

Developed number of Ab Initio Graphs based on business requirements using various Ab Initio Components such as Partition by Key, Partition by round robin, reformat, rollup, join, scan, normalize, gather, Broadcast, merge etc.

Debugged and modified shell scripts using edit script and vi editor in UNIX environment in determining various graph paths and run file path for job request engine.

Determined the level of testing required to validate new or revised system capabilities.

Parameterized and sectorized the graphs from local to global parameters for various job request loads.

Expertise in the concepts of Data Warehousing, Data Marts, ER Modeling, Dimensional Modeling, and Fact and Dimension tables, with data modeling tools Erwin and Sybase PowerDesigner.

Experience in automated Client Reporting.

Worked on improving the performance of Ab Initio graphs by using various Ab Initio performance techniques, such as using lookups instead of joins.

Implemented lookups, lookup_local, in-memory joins, and rollups to speed up various Ab Initio graphs.

Created design documentation for the developed graphs.

Migrated the code from the DEV/BID environments to the QA environment as per the EDW deployment guide, and verified that the UNIX, Teradata (TD), and Ab Initio code (ETL code standards for mappings, workflows, and sessions) met the standards set by the business and the team.

Created mapping document and ETL design document.

Updated and inserted the transaction data according to business changes using Continuous Flows.

Good understanding of newer Ab Initio features such as Component Folding, Parameter Definition Language (PDL), Continuous Flows, Queues, and publisher and subscriber components.

Involved in writing complex SQL queries based on the given requirements.

Worked with DBA team to ensure implementation of the databases for the physical data models intended for the data marts.

Used FastLoad, MultiLoad, and TPump for loading data into the Teradata database.

Created UNIX shell scripts to automate and schedule the jobs.

Created the migration scripts and test scripts for testing the applications, and created and supported the Business Objects reports.

Involved in creating high level design documents.

Produced the required documents like Mapping documents, Design documents and Use Case documents.

Provided 24x7 production support, including monitoring batch jobs and investigating and resolving problems.

Involved in the integration testing with ETL developers and User Acceptance Testing (UAT) with Business Analysts.

Experience in coordinating with offshore on development/maintenance projects.

Performed physical data modeling and regular refreshes of the development and test database environments using the Export/Import utility.
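
A brief sketch of such a refresh, assuming the classic Oracle exp/imp utilities were used (connect strings, schemas, and file names are hypothetical placeholders):

#!/bin/ksh
# Illustrative only: refresh a test schema from a development export.
# Connect strings, schema names and dump/log file paths are hypothetical placeholders.
DUMP=/app/dba/dumps/dw_dev_$(date +%Y%m%d).dmp

# Export the development schema
exp system/password@DWDEV owner=DW_OWNER file="$DUMP" log=/app/dba/logs/exp_dw_dev.log

# Import it into the test schema (existing objects are assumed to have been dropped beforehand)
imp system/password@DWTST fromuser=DW_OWNER touser=DW_OWNER_TST \
    file="$DUMP" log=/app/dba/logs/imp_dw_tst.log ignore=y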

Environment: Ab Initio (GDE 1.15, Co>Operating System 2.15), Oracle 10g, DB2, Teradata V2R6, Toad, Erwin 7.3, Control-M, PL/SQL, SQL*Loader

Home Depot, Atlanta, GA Mar ’07 – Jul ’08

Sr. ETL Developer

Responsibilities:

Developed and supported the extraction, transformation and load process (ETL) for a Data mart from their OLTP systems using Ab Initio.

Developed metadata mapping from legacy source system to target database fields and experience in creating Ab Initio DMLs.

Developed number of Ab Initio Graphs based on business requirements using various Ab Initio Components such as Partition by Key, Partition by round robin, reformat, join, scan, gather, Broadcast, compress, uncompress etc.

Conducted analysis and generated tables, listings and graphs using SAS.

Worked closely with data managers.

Created functional and technical detailed design documents for ETL.

Extensively used Ab Initio EME Data store/sandbox for version control, code promotion and impact analysis.

Created Source to Target data mappings.

Performed validation on derived datasets, following the standard operating procedures during the validation process.

Experience in Star-Schema model and designing fact and dimension tables.

Extensively used the Ab Initio tool’s feature of Component, Data and Pipeline parallelism.

Experience in developing Unix Korn Shell wrappers to run various Ab Initio Scripts.

Implemented Lookups, lookup_local, In-Memory Joins and Scan to speed up various Ab Initio Graphs.

Worked on the sales data mart, moving enterprise data like a queue system using Continuous Flows.

Experienced in performance tuning of graphs and in troubleshooting techniques.

Developed various Ab Initio graphs for data cleansing using Ab Initio functions such as is_valid, is_error, is_defined, and string_* functions.

Also used Ab Initio with EME for the ETL process.

Developed Ab Initio XFRs to derive new fields and solve various business requirements.

Experience in using Conditional Components and Conditional DML.

Created drafts in Control-M to schedule jobs and create dependencies.

Attended weekly collision meetings and staff meetings to avoid project component overlap with other multi-channel projects.

Involved in tracking and ensuring resolution for production issues.

Worked in close coordination with business architects, testing and production support groups.

Documented the process as per the PMM methodology.

Interacted with the source team and the business to validate the data.

Drove run activities and UAT coordination.

Loaded test data files into database tables using the Output Table component.

Performed unit testing and integration testing.

Involved in creating source to target mapping and low-level design documents.

Deployed graphs to the test and staging areas and scheduled batch jobs.

Raised change requests to implement code in production.

Migrated scripts and Control-M drafts to testing and staging.

Experience in scheduling Control-M jobs in production.

Involved in the implementation during production.

Supported the code after post production deployment.

Environment: SAS/GRAPH, SAS/STAT, SAS/ODS


