PRADEEP KUMAR REDDY BOYILLA
*********@*****.***
Summary of Experience
Over 8 years of extensive industry experience in the analysis, design,
and development of statistical data models, including data extraction and
manipulation, writing macros, and reporting on various projects in the
financial industry using SAS and its tools.
. SAS professional with around 7 years of experience in developing
statistical methods routinely used in financial analysis.
. Around 1.5 years of experience with Hadoop, HDFS, HBASE, HIVE, IMPALA,
OOZIE, SQOOP, and MapReduce programming.
. Around 1 year of experience in R programming.
. Started working on Tableau dashboards.
. Fair knowledge of Python scripting.
. Expertise in SAS admin activities such as installing SAS and
maintaining users using SAS Management Console.
. Expertise in using Teradata SQL Assistant for Teradata, Aginity
Workbench for Netezza, and SQL Developer for Oracle.
. Experience in SAS/BASE, SAS/EG, SAS/SQL, SAS/MACROS, SAS/ODS,
SAS/STAT, SAS/ACCESS, SAS/CONNECT, SAS/DI STUDIO
. Solid understanding and proficient use of various procedures such as
proc sql, proc report, proc freq, proc means, proc contents, proc
datasets, proc append, proc import, proc export, proc print, proc
printto, and proc sort, plus drop, keep, retain, date manipulations,
formats, and informats.
. Excellent command of writing complex routines for data validation,
data extraction, transformation, and loading to target decision support
systems using MS Excel macros and SAS in various environments.
. Proven skills in Data Cleansing, Data Archival, Data Migration, ad-hoc
reporting, and coding utilizing SAS on UNIX and Windows.
. Experience in using various MS office tools for analyzing, arranging
and presenting data
. Strong experience in coding using SQL, procedures/functions, and SAS
macros.
. Experience in creating SAS programs for edit checks, performing data
validation on raw data sets, and creating data sets for analysis.
. Experience in the different methods of extraction, transformation and
loading of data into the warehouse (ETL)
. In-depth experience in SAS DI Studio, creating SAS jobs using various
transformations (File Reader, File Writer, Extract, SQL Join, Lookup,
Sort, User Written, SCD Type 2 Loader, etc.).
. Good analytical skills in understanding a business domain, coupled
with excellent teamwork and strong communication skills.
. Hard-working, ethical, and highly result-oriented, with the ability to
quickly grasp and apply new concepts and technologies.
Technical Skills
SAS Tools SAS/BASE, SAS/EG, SAS/SQL, SAS/MACROS, SAS/ODS, SAS/STAT,
SAS/ACCESS, SAS/CONNECT, SAS/DI STUDIO
Programming R, Python, SQL, PL/SQL and Unix Shell Scripting
Databases Oracle 11g, Teradata, Netezza
Environment Windows and Unix.
Education
B Tech in Electronics and Communication Engineering from JNTU Hyderabad
in 2005
Certifications
SAS Certified Base Programmer for SAS 9.
Received the Best Employee award from the client PBS for the year 2009.
Experience
. Working as Senior Analyst/Developer at Dish Network LLC from March
2013 to date.
. Worked as Sr. Software Engineer in Larsen & Toubro Infotech Ltd from
Feb 2008 to March 2013.
. Worked as SAS Programmer in IVenture Technologies, Bangalore from Dec
2005 to Jan 2008.
Professional Summary
Enterprise Analytics, Dish Network, CO               March 2013 - Till Date
Senior Developer/Analyst
As a team, we build predictive models for different teams across Dish
and generate fraud reports based on fuzzy logic.
Responsibilities:
. Collecting the requirements from retailer services group for Fraud
reporting.
. Generating the daily, weekly and monthly fraud reports based on fuzzy
fraud logic.
. Fuzzy logic is implemented using the SAS built-in function SPEDIS.
. Connecting to Netezza to get the activations, deactivations, and work
order data.
. Connecting to Teradata to get the demographics, accounts and neighborhood
data.
. Working to get the training and scoring data for models.
. Working to get the data for Upsell and Winback and Retention modeling.
. Requirement gathering for need-based segmentation Tableau dashboard
reports.
. Working on viewership data to get the list of customers who watched
the most VOD, in order to target promotions and recommendations.
. Working in R programming for address scrubbing for external vendors such
as Altitude and Trost.
. Generating a daily report with the current status (AC, DV, NP, etc.)
for each account.
. Worked on a Python script to parse XML files and put the data in Hadoop.
. Working on Hadoop, using Sqoop to import tables from Netezza.
. Using OOZIE to create the workflow with SQOOP, HIVE, shell actions.
. Using the workflow coordinator to schedule the workflow on a weekly basis.
. Using Impala to get the data from the HBASE tables.
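The SPEDIS-based fuzzy matching above can be sketched in SAS. This is a minimal illustration only: the dataset and variable names (retailers, fraud_names, account_id, name) and the distance cutoff are hypothetical placeholders, not the production logic.

```sas
/* Compare each retailer name against a list of known-fraud names.     */
/* SPEDIS(query, keyword) returns an asymmetric spelling-distance      */
/* cost normalized by the length of the query; 0 is an exact match.    */
proc sql;
  create table fuzzy_hits as
  select r.account_id,
         r.name,
         f.fraud_name,
         spedis(lowcase(r.name), lowcase(f.fraud_name)) as distance
  from retailers r, fraud_names f
  /* The cutoff of 15 is a tunable assumption for illustration only. */
  where calculated distance <= 15;
quit;
```

Lower cutoffs flag only near-exact spellings; higher ones trade precision for recall, so in practice the threshold is tuned against reviewed fraud cases.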
Environment: SAS 9.3, SAS/EG 5.1, BASE/SAS, SAS/MACRO, SAS/SQL,
SAS/CONNECT, SAS/ACCESS, SAS/ODS, R, Hadoop, HBASE, HDFS, HIVE, Impala,
SQOOP, OOZIE, Oracle 11g, Netezza, Teradata, Windows XP.
GCBC (Global Consumer BASEL Consolidator), Citigroup, NJ    November 2010 - March 2013
Sr. Software Engineer
Worked on the implementation of BASEL II revised framework on international
convergence of capital measurement and capital standards.
Responsibilities:
. Collected user requirements from business users in the form of a
business requirement document (BRD) and prepared functional
specification and technical design documents.
. Created the folder structure and updated the file monitor program in
UNIX for each file upload.
. Understood and updated the existing code, or created new code using
BASE/SAS, SAS/MACROS, and SAS/SQL, to implement new requirements.
. Uploaded files from the PeopleSoft front end and checked the SAS log in UNIX.
. Populated the status table at each stage and the error table if any
error occurred, and loaded the ODS and Warehouse tables if the file
processed successfully.
. Worked on enhancements for the CLTV and Master table load process with
utilization band.
. Developed and supported the CLTV and Master table load process.
. Designed and developed the Retail, Treasury and Recon portfolios file
upload functionality.
. Wrote various format and integrity edit checks on CSV files and on
staging SAS datasets.
. Developed SAS code to validate data of different portfolios.
. Modified existing SAS programs and created new programs using SAS macro
variables to improve ease and speed of modification as well as
consistency of results.
. Analyzed and implemented the code and table changes to improve
performance and enhance data quality.
. Performed code optimization using SAS macros.
. Kept track of any data validation issues using MQC (Mercury Quality
Center).
Environment: SAS 9.3, BASE/SAS, SAS/MACRO, SAS/SQL, SAS/CONNECT,
SAS/ACCESS, SAS/ODS, Oracle 11g, TOAD, AIX-Unix 5.3 and Windows XP.
GCBC Automation, Citigroup, NY                     May 2010 - October 2010
Sr. Software Engineer
Re-engineered SAS to Ab Initio to automate the file creation process.
Responsibilities:
. Understood the BASEL BRD to know the significance of each column in
the CSV file.
. Obtained the production SAS code and analyzed it.
. Created an understandable Excel template to start data mapping for
each line of business.
. Worked on source-to-target data mapping of all the target fields for
each business line.
. Extensively utilized SAS/BASE, SAS/SQL and SAS/MACRO.
. Extensively used procedures such as PROC APPEND, PROC IMPORT, PROC
MEANS, PROC SUMMARY, PROC CONTENTS and PROC SQL.
. Involved in project review meetings with the respective business SMEs.
. Generated the genealogy using Excel macros to make it more
understandable to end users.
Environment: SAS 9.1.3, BASE/SAS, SAS/MACRO, SAS/SQL, SAS/CONNECT,
SAS/ACCESS, SAS/ODS, Oracle 11g, TOAD, AIX-Unix 5.3 and Windows XP.
Statistics from Betalings Services, NETS, DENMARK    September 2009 - April 2010
Software Engineer
The customers for this solution are creditors, banks, internal supporters
and sales representatives/account managers. The purpose is to make data in
Betalings Service available and accessible to customers for statistics and
data extracts. The solution provides the same data as available in the
existing BS, supplemented with a few transforms and additional features
requested by the customers.
Responsibilities:
. Analyzed the business requirements.
. Created the conceptual and logical data models.
. Created the physical data model.
. Created the technical specification document, including the ETL
process flow.
. Created SAS jobs using DI Studio to process the source files and to
load the SAS SPD tables.
. Created SAS jobs to migrate the data from SAS SPD tables into SAS
tables.
. Created and revised the common macros.
. Created and executed test cases as part of unit, integration, system,
and performance testing.
. Consolidated the defects identified during reviews and testing, and
analyzed them to prevent recurrence.
. Created miscellaneous documents such as developer guidelines, a
troubleshooting guide, optimized process flows, parallel processing
notes, SAS defect prevention, and code review checklists.
Environment: SAS DI Studio 3.4, SAS 9.1.3, BASE/SAS, SAS/MACRO, SAS/SQL,
SAS/CONNECT, SAS/ODS, SAS/ACCESS, DB2 and Windows XP.
BS Re-Engineering, NETS, DENMARK               February 2008 - August 2009
Software Engineer
PBS is a leading payment service provider in Denmark owned by major
Danish banks, specialized in electronic payments and card services. To
meet its ambitious plan for growth in a cost-effective manner, PBS has
been re-engineering its existing payments application with a different
technology and added functionality.
Responsibilities:
. Analyzed the functional requirement document.
. Analyzed the impact of new change requests and estimated the effort
for the same.
. Involved in creation of Logical data modeling.
. Created the Physical data modeling.
. Created technical specification document.
. Created the SAS jobs using DI Studio to process the source files and to
load the DB2 tables.
. Created the SAS jobs using DI Studio to fetch the data from DB2 tables
and to generate the output files.
. Used various transformations like File Reader, File writer, Extract, SQL
Join, Lookup, Sort, User written and SCD Type2 Loader in SAS jobs to
perform ETL operation.
. Created and executed test cases as part of unit, integration, system
and performance testing.
. Consolidated the defects identified during reviews and testing, and
analyzed them to prevent recurrence.
. Created miscellaneous documents such as developer guidelines, a
troubleshooting guide, optimized process flows, parallel processing
notes, SAS defect prevention, and code review checklists.
Environment: SAS DI Studio 3.4, SAS 9.1.3, Base/SAS, SAS/MACRO, SAS/SQL,
SAS/CONNECT, SAS/ODS, PROC FREQ, PROC MEANS, DB2 and Windows XP.
Bank Credit Analysis, Commerce Bank, INDIA          May 2006 - January 2008
SAS Programmer
The objective is to develop a credit risk rule that can be used to
determine whether a new applicant is a good, bad, or average credit
risk, using a number of explanatory variables from historical data.
Essentially, this means developing a function of several variables that
allows us to classify a new applicant into one of three categories
(good, bad, or average) using a logistic regression classification
procedure.
Responsibilities:
. Reading data from different sources such as CSV, Excel, and
tab-delimited files.
. Perform various business checks on the source data to confirm whether
data is correct or not.
. Compare the source data with historical data to get some statistical
analysis.
. Perform transformations like Merge, Sort, Update, etc., to get the
data in the required format.
. Generate Reports in user required format by using ODS and PROC Report.
. SAS data sets are validated using SAS procedures like Proc Means, Proc
Freq and Proc Univariate.
. Performed data integrity checks and data cleaning using SAS MACROS and
data steps, created HTML outputs with table of contents using ODS HTML.
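The logistic regression classification described above can be sketched in SAS. This is a minimal illustration: the dataset and predictor names (history, bad_risk, income, debt_ratio, num_loans) and the probability bands are hypothetical placeholders, not the project's actual model.

```sas
/* Fit a logistic model of bad-credit risk on historical applicants. */
proc logistic data=history descending;
  /* DESCENDING models the probability of bad_risk = 1. */
  model bad_risk = income debt_ratio num_loans;
  output out=scored p=phat;   /* phat = predicted probability of bad risk */
run;

/* Band the predicted probability into the three risk categories. */
data classified;
  set scored;
  length risk $7;
  if phat < 0.2 then risk = 'good';
  else if phat < 0.5 then risk = 'average';
  else risk = 'bad';
run;
```

The band boundaries (0.2 and 0.5 here) are illustrative assumptions; in practice they would be chosen from the score distribution and the bank's risk appetite.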
Environment: SAS 8, BASE/SAS, SAS/MACRO, SAS/SQL, SAS/CONNECT, SAS/ODS, SQL
Server 2000 and Windows XP.