PROFESSIONAL SUMMARY:
Over ** years of professional experience in the modeling, analysis, design, development, and implementation of statistical/data models and applications, writing macros, generating reports and graphs, and troubleshooting for major banking, financial, and investment domains. Experience includes a clear understanding of, and work within, all phases of the SDLC. Analytical, methodical, and resourceful approach to problem solving, identifying and documenting root causes and taking corrective actions to meet short- and long-term business needs. Able to understand business propositions and adapt quickly to add value to the team.
SUMMARY:
●Extensive experience in SAS (versions 9.1.3, 9.1, and 8.2) programming and data analysis using Base SAS software in Windows and UNIX environments.
●Experienced in the complete Software Development Life Cycle (SDLC) across multiple projects, from requirements gathering and System Requirement Specification (SRS) contribution through understanding High-Level Design (HLD) and Service Level Agreements (SLA), developing and reviewing code, change management, issue tracking, and quality assurance as a Business Analyst.
●Utilized SAS procedures, macros, and other SAS applications for data extraction, data cleansing, data loading, and reporting. Involved in data analysis, database creation, and applying database constraints to make data ready to load into the database.
●Experience writing advanced SQL programs for joining multiple tables, sorting data, creating SQL views, creating indexes, and performing metadata analysis. Experience performing ad hoc queries for various applications and reports on a daily, weekly, and monthly basis using complex SQL queries.
●Adept at managing teams, including subcontractors and offshore resources, and collaborating with cross-functional teams to deliver successful MDM and Digital Asset Management (DAM) programs.
●Experience in SAS Anti Money Laundering (AML) Solutions, AML Alert Generation Process, Scenario Management and SAS Fraud Management.
●Created concepts for data pipelines using different AI tools to extract, transform, and load data from multiple sources into target systems.
●Conceptualizing, designing, and configuring MDM UI, workflows, and rules for complex business processes.
●Collaborated with data architects and business analysts to understand data requirements for integrating AI with SQL databases and other data sources.
●Designed data warehousing solutions using AWS Redshift, resulting in a 45% performance improvement for business intelligence and reporting tools.
●Resolve moderately complex issues and lead a team to meet existing or potential new clients’ needs, leveraging a solid understanding of the function, policies, procedures, and compliance requirements.
●Strong communication and presentation skills. Detail oriented and able to multitask effectively.
●Hands-on experience in SAS programming for extracting data from CSV files, Excel spreadsheets, Microsoft Access files, flat files, and external RDBMS (Oracle) tables using the LIBNAME statement and the SQL pass-through facility (see the sketch following this summary).
●Hands-on experience in SAS, SQL and R programming for extracting data from Flat files, Excel spreadsheets, Sybase and SQL Server tables to generate data reports.
●Extracting multi-million record files from Teradata database (RDBMS), data mining and analysis, creating marketing incentive reports.
●Understanding of regulatory guidelines (CCAR, Basel) around Model Risk Management
●Participate in design and code reviews with developers using Python, R and other technologies.
●Reviewing client requests and applying the necessary technical updates in existing Python processes
●Designed and created data extracts supporting reporting applications built on SSRS, Power BI, and other visualization tools.
●Ability to design and deploy software to provide automation within an existing infrastructure
●Experience with traditional relational SQL Databases, specifically Azure SQL.
●Deep understanding of data and how it relates to architecture (data ingestion, real-time, batch, SQL, NoSQL, analytics, etc.)
●Demonstrated willingness as well as the ability to learn new technologies
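For illustration, a minimal SAS sketch of the LIBNAME and SQL pass-through extraction described above; the Oracle connection details, schema, table, and column names are hypothetical placeholders, not an actual client configuration:

  /* hypothetical Oracle connection -- placeholders only */
  libname oralib oracle user=scott password=XXXXX path='findb' schema=finance;

  data work.accounts;                  /* copy a table via the LIBNAME engine */
    set oralib.accounts;
  run;

  proc sql;                            /* same extract via SQL pass-through,  */
    connect to oracle as ora           /* pushing the WHERE clause to Oracle  */
      (user=scott password=XXXXX path='findb');
    create table work.recent_accts as
    select * from connection to ora
      ( select acct_id, open_dt, balance
          from finance.accounts
         where open_dt >= to_date('2024-01-01','YYYY-MM-DD') );
    disconnect from ora;
  quit;

The LIBNAME engine is convenient for whole-table access, while pass-through lets the database execute native SQL before SAS sees the rows.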
TECHNICAL SKILLS:
Operating Systems
Windows 98/2000/NT/XP, MS-DOS, UNIX, Linux
Languages
C/ C++, SAS, PL/SQL, Python, R Programming, VB, HTML, CSS, JavaScript, Snowflake
Database
MS-Access, Sybase IQ, SQL Server 2000/7.0/6.5, Oracle, Teradata, MS-Excel
Tools
WINSQL, Putty, Reflection FTP Client, MS Project, MS Visio, MS Word, MS PowerPoint, MS Excel, MS Management Studio, Sybase Power Designer, Fivetran, Business Objects, Erwin, PowerBI, Azure, AWS.
SAS Skills
SAS/BASE, SAS/MACRO, SAS/SQL, SAS/STAT, SAS/ACCESS, SAS/CONNECT, SAS/ODS, SAS/VIYA, SAS/CPM, SAS/ETL, SAS Risk Stratum, SAS/FM, SAS GRID, SAS/GRAPH, SAS Enterprise Guide, SAS Customer Intelligence
Fidelity Investments, Smithfield, RI Apr 22 – Present
SAS Consultant/ Data Analyst
Fidelity Investments is an American multinational financial services corporation based in Boston, Massachusetts. Fidelity Investments operates a brokerage firm; manages a large family of mutual funds; and provides fund distribution and investment advice, retirement services, index funds, wealth management, securities execution and clearance, asset custody, and life insurance. As part of a 401(k) department team at Fidelity, I collect all the clients' 401(k)-related information from prior 401(k) providers and convert it into the Fidelity format. I have worked with many clients, e.g., SONOVA and Charles River.
Responsibilities:
●Work closely with business analysts and project managers to understand requirements and create Functional Design Documents, Detailed Design Documents, Data Sourcing Documents, and Unit Test Plans.
●Involved in developing, testing, and writing programs and reports in SAS according to specifications as required.
●Serve as an integral member of the project team; set up project team meetings with clients and other team members in each phase (test, refresh, and live).
●Experience working in an Agile/Scrum project environment with project team members in multiple locations, including offshore, if required.
●Extensively used statistical procedures such as PROC FREQ, PROC MEANS, PROC REG, and PROC UNIVARIATE for analysis.
●Code SAS utility macros, and write and implement test plans to support SAS macro development (a minimal sketch follows this list).
●Created a Python program to handle PL/SQL constructs like cursors and loops, which are not supported by Snowflake.
●Generated high quality reports in the form of listing, HTML, RTF and PDF formats using SAS ODS.
●Take full responsibility for end-to-end deliverables (analysis, design, development, testing, and production deployment).
●Modifying existing SAS programs to change from one Teradata source to another.
●Analyzed and converted complex retirement plan requirements and developed a pension benefit validation platform that met the scheduled release date, using Excel, Visual Basic, VBA macros, and Sybase.
●Complete detailed unit/acceptance testing and certification to ensure development is delivered on time and in conjunction with defined release management practices.
●Involved in the team implementing and customizing the SAS AML and SAS Fraud Management suite.
●Documented SAS code, processes, and methodologies to ensure reproducibility and knowledge sharing within the team, facilitating a culture of transparency and collaboration.
●Design, develop and present risk/fraud related data, insights, and recommendations to senior management, providing valuable inputs for decision-making and strategic planning.
●Scheduled and automated data extracts from various sources, including relational databases, Excel, Access DB, Snowflake, and Salesforce (using SOQL).
●Coded customized ad-hoc programs for quick tests to verify client specifications.
●Audit existing Power BI efforts with a focus on improvement.
●Utilized state-of-the-art software components for performance metrics, data visualization and business intelligence.
●Designed and developed Power BI graphical and visualization solutions with business requirement documents and plans for creating interactive dashboards
●Work closely with system analysts, project managers and clients.
●Ensure designs meet performance, security, usability, and reliability and scalability requirements.
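A minimal sketch of the kind of utility macro and ODS reporting referenced in the bullets above; the dataset, variable, and output file names are hypothetical, not actual client artifacts:

  %macro ods_summary(ds=, classvar=, anlvar=, out=);
    ods pdf file="&out";                 /* HTML or RTF destinations work the same way */
    proc freq data=&ds;                  /* distribution of the class variable */
      tables &classvar / missing;
    run;
    proc means data=&ds n mean std min max maxdec=2;
      class &classvar;                   /* summary statistics by class */
      var &anlvar;
    run;
    ods pdf close;
  %mend ods_summary;

  /* hypothetical call for a 401(k) conversion file */
  %ods_summary(ds=work.plan_data, classvar=plan_type,
               anlvar=balance, out=plan_summary.pdf);

Wrapping the procedures in a macro lets the same report be regenerated per client file by changing only the parameters.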
Environment:
SAS/MACRO, SAS/SQL, SAS/ODS, SAS/FM, UNIX, IMS DBMS, SAS ETL Studio 9.1, SAS/Advance, JIRA, MS Access, Snowflake, Power BI
Global Atlantic Financial Group, Boston, MA Feb 19 – Mar 22
SAS Consultant / Data management
Global Atlantic Financial Group is a leader in the U.S. life insurance and annuity industry, serving the needs of individuals and institutions. Global Atlantic is a majority-owned subsidiary of KKR, a leading global investment firm that offers alternative asset management across multiple strategies and capital markets solutions. In this project I was part of the SAS team, where we collected raw data in different file formats along with business requirements, automated the SAS process from start to end, and generated monthly and quarterly reports for different clients, e.g., MassMutual.
Responsibilities:
●Involved in developing, testing, and writing programs and reports in SAS/SQL according to specifications as required.
●Performed extensive data cleansing during the ETL extraction and loading phases in Fivetran and Alteryx by analyzing the raw data and writing complex, reusable processes and pipelines.
●Used SAS/ACCESS to read datasets from databases, Salesforce (using SQL), flat files, and CSV, and to read and write other PC file formats.
●Develop dashboards to display cost vs. revenue performance, margin analytics, and cost drivers.
●Conducted data cleansing, transformation, and integration, ensuring data quality and integrity while preparing data for analysis using SAS EG.
●Manipulated an existing Oracle database by accessing the data using the SQL pass-through facility.
●Worked on pre-existing macros for data validation, checking data distributions and comparing them against standard data (see the sketch following this list).
●Analyzed various tables using Data manipulation techniques like merging, appending and sorting.
●Implemented a Digital Asset Management (DAM) platform, coordinating with cross-functional teams to standardize taxonomy and metadata for improved user experience.
●Used SAS Platform RTM and Environment manager to manage SAS Grid environment.
●Utilizing technical and business expertise, developed and led PIM and MDM projects critical for digital transformation.
●Load and reconcile financial, operational, and policy data into the CPM environment.
●Set up Grid Option sets for various departments to run their jobs in queues that were assigned to them
●Successfully converted existing SAS code into SAS Visual Analytics reports and dashboards, enhancing data presentation and accessibility for stakeholders.
●Transferred and Migrated Data from one platform to another to be used for further analysis.
●Used the Dynamic Data Exchange (DDE) feature of SAS for importing and exporting data between SAS and Excel. Conducted meetings with business users and data architects to define metadata for tables and perform the ETL process using Fivetran or Alteryx.
●Involved in designing and implementing solutions related to the cost and profitability management (CPM) area.
●Experienced with specialized automation software and scripting, including job scheduling, process and decision automation, data capture, and workflow orchestration using Control-M.
●Collaborate closely with cross-functional teams to understand data requirements and translate them into robust and efficient data models using Erwin.
●Integrated conceptual, logical, and physical data model views in Erwin to help business and technical stakeholders understand data structures and their meaning.
●Conceptualized and modeled MDM domains, ensuring alignment with business requirements and data governance policies.
●Supported all stages of the SDLC, incorporating robust data modeling techniques in Snowflake to ensure a structured and efficient database architecture.
●Wrote and executed various MySQL database queries from Python using the Python MySQL connector and MySQLdb package.
●Conduct exploratory data analysis using SAS/SQL to uncover patterns, trends, and anomalies in large datasets.
●Collaborate with data analysts, business stakeholders, and data governance teams to understand data requirements, propose optimal solutions, establish data quality standards, and ensure compliance.
●Participated in developing various Python scripts to find vulnerabilities in SQL queries via SQL injection testing, permission checks, and performance analysis, and developed scripts to migrate data from a proprietary database to PostgreSQL.
●Reviewing client requests and applying the necessary technical updates in existing Python processes
●Efforts included solving business issues technically via system development based on business needs, implementing and validating the changes, documenting them, and supporting ongoing processing.
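A minimal sketch of the distribution-check validation macros mentioned above; the dataset and variable names are hypothetical and the comparison tolerance is illustrative only:

  %macro check_dist(new=, base=, var=);
    /* frequency distribution of the incoming data */
    proc freq data=&new noprint;
      tables &var / out=work._new_dist(keep=&var percent);
    run;
    /* frequency distribution of the baseline (standard) data */
    proc freq data=&base noprint;
      tables &var / out=work._base_dist(keep=&var percent);
    run;
    /* report categories whose percentage drifts from the baseline */
    proc compare base=work._base_dist compare=work._new_dist
                 criterion=0.5;        /* allow 0.5-point drift, illustrative */
    run;
  %mend check_dist;

  %check_dist(new=work.policy_feed, base=work.policy_std, var=status);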
Environment:
SAS EG, SAS/VIYA, SAS/Macro, SAS/Stat, SAS/SQL, SAS/ODS, SAS/GRID, SAS CPM, Oracle 9i, Fivetran, PL/SQL, Control-M, Python, Erwin, Alteryx and MS Access.
Chubb Personal Insurance, Whitehouse Station, NJ Apr 17 – Feb 19
SAS Programmer / Data Analyst
The Chubb Corporation is a leading global insurance organization that receives high ratings and provides property, casualty, and specialty insurance to individuals and businesses around the world. The project included writing SAS code to maintain the Revenue Neutral TRG (Tibus Rating Group) premium for policies in specific states. My job involved file handling and the transformation of data through the ETL (Extraction, Transformation, and Loading) process, with data received from IBM’s Information Management System (IMS) DBMS.
Responsibilities:
●Worked with various data formats such as ASCII and binary, and extensively used different kinds of informats to read these data files (a minimal sketch follows this list).
●Handled files having many missing values and special characters, and generated datasets based on conditional requirements for the special characters and missing values present in the raw data files.
●Also cleaned the data using SAS code (SAS Macros) and VB Script and developed SAS Code on UNIX according to the specifications.
●Administered SAS Visual Analytics, SAS Visual Statistics, SAS Data Preparation, and SAS Data Mining software.
●Imported raw data files extracted from IBM’s Information Management System (IMS) mainframe database into SAS using import techniques, and modified existing datasets using various conditional statements.
●Utilized technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), Microsoft Office Suite, and MS Project.
●Integrated SAS Viya Server with an identity and access management system using SAML and other means.
●Developed and maintained data pipelines, ETL processes, and data integration solutions using Azure services and open-source technologies.
●Collaborate with clients to define data strategy, assess existing data infrastructure, and develop migration plans to Azure.
●Utilized Azure technologies, including Azure Data Factory, Azure Synapse and Azure SQL to enhance data integration and processing.
●Conducted performance optimization and troubleshooting of data pipelines and SQL databases
●Participated in data migration projects, transitioning on-premises data to Azure cloud services and Google Cloud in cross-functional, cross-team projects.
●Evaluate and recommend Azure services and features to enhance data solutions, while managing costs effectively.
●Extensively used SQL statement functionalities for analysis of large datasets (million records) and for data manipulation like Indexing and Appending various datasets.
●Interacted and coordinated with key departments to analyze areas and discuss the primary model requirements for the project.
●Responsible for analyzing and collecting specific policies for Testing the Calculation of Revenue Neutral TRG premium and the deviation results generated through the SAS datasets.
●Created new State datasets for Territories, Policies, States, Vehicles, Drivers etc. and other key domains. Analyzed the various Tiers of specific states and the Tier Factors that reflect the final Deviation produced through the calculation in the TRG Premium.
●Creating new procedures or generating reports using the data modelling tool Erwin.
●Visualize compelling data stories on the report canvas using Erwin.
●Optimized database performance by analyzing query execution plans and implementing indexing strategies.
●Loaded various old and new rates data files retrieved from the Actuarial department, in a specific format, into SAS through the import mechanism.
●Extensively detected, described, and verified defects, tracked bugs, and modified existing datasets to improve the existing system.
●Documented methodology, data reports and model results and communicated with the Project Team / Manager to share knowledge.
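A minimal sketch of reading a delimited raw file with informats and conditionally flagging missing or special-character values, as described above; the file layout, variable names, and flag rule are hypothetical:

  data work.policies;
    infile 'policies_raw.dat' dlm='|' dsd missover;
    /* list input with informats for dates and comma-formatted amounts */
    input policy_id  :$10.
          eff_dt     :mmddyy10.
          premium    :comma12.2
          state      :$2.;
    format eff_dt date9.;
    /* flag records with missing premiums or special characters in the id */
    if missing(premium) or indexc(policy_id, '#@*') > 0 then bad_rec = 1;
    else bad_rec = 0;
  run;

  proc freq data=work.policies;        /* quick check of flagged records */
    tables bad_rec;
  run;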
Environment:
SAS/BASE, SAS/MACRO, SAS/SQL, SAS/ODS, SAS/Viya, UNIX, Mainframe, IMS DBMS, SAS ETL Studio 9.1, Erwin, Azure services
JP MORGAN CHASE, COLUMBUS, OH May 16- Mar 17
SAS Data Analyst
The project analyzed the short-term and mid-term credit needs of the company’s consumers in order to assess their purchasing power and subsequently support business activities such as risk analysis and management, database marketing, credit policy, management reporting and analysis, and information support. As a member of the Auto Finance team, I was involved in the creation and conversion of SAS datasets from tables and flat files, analyzing the datasets and generating reports through complex data manipulation. The analysis produced helped the company minimize the manual process of analyzing risk and make appropriate risk decisions.
Responsibilities:
●Involved in the migration of auto loans from one system to another (a project called the VLS-to-ALS conversion), combining the auto loans per business requirements.
●Involved in extracting data from flat files, Excel spreadsheets and external RDBMS tables using LIBNAME and SQL PASSTHRU facility.
●Expertly manage and analyze over one hundred SMF/RMF record types, ensuring accurate and efficient data handling.
●Modified existing reports as needed due to migration of loans and updated logic to maintain consistency during migration.
●Contributed to the bank's CECL implementation strategy, focusing on data quality and model validation.
●Developed summary reports providing a daily summary of newly issued auto loans as well as summarizing data from delinquent loans (see the sketch following this list).
●As a DB2 developer, I worked closely with data stewards, data architects, business architects, solution architects, and other DBAs to support business needs.
●Also developed logic that summarizes the number of new or used vehicles sold on a demographic basis versus the number of vehicles pulled back due to delayed payment.
●Perform in-depth problem resolution for system health conditions, ensuring minimal downtime and optimal performance.
●Designed and implemented highly scalable and fault-tolerant cloud solutions using AWS services, with a focus on AWS Step Functions and Lambda, for automating critical business processes, resulting in a 30% reduction in operational costs.
●Manage coupling facility structures, set LPAR weights, and implement capping strategies to optimize resource allocation.
●Designed and implemented credit scoring models using SAS, improving risk assessment accuracy by 25%.
●Designed and developed complex and efficient Oracle SQL scripts to support data analytics initiatives.
●Leveraged AWS SNS and SQS to establish event-driven communication between microservices, enabling real-time processing of data and reducing system latency by 40%.
●Generated highly customized reports using SAS/MACRO as per the requirements.
●Applied knowledge of Banking KYC requirements to ensure compliance within data models, supporting regulatory needs.
●Assisted in the transition from the incurred loss model to CECL, focusing on data collection and analysis using SAS.
●Utilized data modeling tools such as ERwin and ER/Studio to create and maintain accurate data models.
●Develop proof of concept (POC) for new products and establish centers of excellence within the organization.
●Generated customized reports based on statistical analysis done using SAS EG /BASE SAS in the form of ODS, HTML and PDF to help business to make decisions.
●Involved in process optimization on processes taking a long time to run.
●Debugged, created, maintained and documented ad-hoc reports on demand on a daily basis.
●Monitor Daily, Weekly, Monthly & Quarterly jobs.
●Extensively worked with Excel and CSV files, merging them with VLOOKUP.
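A minimal sketch of the daily new-loan and delinquency summaries described above; the dataset, variable, and band names are hypothetical, not the bank's actual schema:

  proc sql;
    /* daily count and volume of newly issued auto loans */
    create table work.new_loan_summary as
    select issue_dt,
           count(*)      as n_loans,
           sum(loan_amt) as total_amt format=dollar16.2
      from work.auto_loans
     where issue_dt = today() - 1     /* prior day, simplified */
     group by issue_dt;
  quit;

  proc means data=work.delinquent n sum mean maxdec=2;
    class dpd_band;                   /* days-past-due band */
    var outstanding_bal;              /* delinquent balances by band */
  run;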
Environment:
SAS/EG, SAS/MACRO, SAS/ODS, SAS/ACCESS, SAS Risk Stratum, FileZilla, JCL, Putty, LPAR, SQL Server Management Studio, AWS Lambda, SNS.
Fulcrum Analytics, Inc., Fairfield, CT Jan 16 – Apr 16
Business Data Analyst
The project involved updating and maintaining the data warehouse and generating various reports. It also involved managing the data warehouse, cleaning the customer database, extracting data, creating SAS datasets, and writing reports for analysis.
Responsibilities:
●Used SAS Web Tools for extracting the data from Oracle database and created SAS data sets.
●Supported the direct marketing efforts for Consumer and Small Business products, which includes Mortgage, Home Equity, Deposits, Student Lending, and Insurance Services.
●Worked closely with the data modeling team and management (e.g., marketing managers, campaign managers, financial managers) to analyze customer-related data and generate custom reports, tables, listings, and graphs.
●Acted as a system analyst for data analytics, performance management system, and data warehousing projects.
●Implementing strong security measures in AWS environments, following AWS best practices.
●Built Continuous Integration/Continuous Deployment (CI/CD) pipelines with AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy.
●Good knowledge of AWS concepts such as EMR and EC2 web services, which provide fast and efficient processing of big data.
●Managed data storage and transfer with AWS S3, optimizing data pipelines for high-throughput and secure data storage, achieving a 50% improvement in data access times.
●Collaborated in an Agile environment, utilizing Snowflake and data modeling skills to facilitate seamless data integration, migration, and transformation between different platforms.
●Optimized SQL scripts in Snowflake, focusing on query execution plans and efficiency improvements, which enabled deeper exploration of the data and enhanced insights.
●Architect complex data models, schemas, and ETL pipelines to ensure optimal performance and scalability.
●Collaborate with cross-functional teams to analyze business requirements and translate them into efficient Snowflake solutions.
●Develop and/or maintain SAS programs for data conversions.
●Analyze the content, structure and format of source data.
●Developed, modified, and generated Monthly Business Review (MBR) reports and Quarterly Business Review (QBR) reports summarizing business activity.
●Develop programs for extensive statistical analysis of data using SAS procedures.
●Created and executed SAS code to extract direct marketing lists from the CDW.
●Experienced with planning, architecture, and design of Teradata data warehousing and SQL optimization
●Assessing the validity of statistical and non-statistical models used for credit risk management
●Generated monthly, yearly, and historical graphs using SAS/GRAPH (a minimal sketch follows this list).
●Participated in the planning and developing of sales and marketing campaigns utilizing the Campaign Management System.
●Data mining to identify significant relationships among customer behaviors and descriptive characteristics.
●Managed and implemented various projects for customers using the Teradata relational database system.
●Wrote validation programs and tested them against the client data for data integrity.
●Regularly interacted with the financial analysts for the presentation of reports.
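A minimal SAS/GRAPH sketch of the monthly trend graphs noted above; the dataset, variables, and titles are hypothetical placeholders:

  goptions device=png;                 /* illustrative output device */
  symbol1 interpol=join value=dot;     /* join points with a line */
  axis1 label=('Month');
  axis2 label=(angle=90 'Response Rate (%)');

  proc gplot data=work.monthly_response;
    title 'Campaign Response Rate by Month';
    plot resp_rate * month / haxis=axis1 vaxis=axis2;
  run;
  quit;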
Environment:
Base SAS 9.1.3, SAS/Access, SAS/Connect, SAS/Stat, SAS/Graph, SAS/SQL, SAS/ODS, SAS/Macros, SAS Enterprise Miner, Teradata, MS Access, PL/SQL, MS Excel, HP-UX, AWS EC2, S3, Snowflake
American Express, Phoenix, AZ Aug 14 – Dec 15
SAS Programmer
Responsible for creating new SAS code utilizing existing code and maintaining data in SAS. The project involved working with analysts to provide analysis of the credit card customer base on a demographic basis and to forecast risk. Permission to grant a credit card to an individual was based on the analysis performed.
Responsibilities:
●Used SAS v8.2 in a Sun UNIX environment, programming in Base SAS and macros.
●Extracted required data from the IBM mainframe to the local SAS environment using SAS FTP and SyncSort.
●Works closely with relevant stakeholders and subject matter experts to understand all aspects of the workflow processes managed for model validation, model performance monitoring, and model issue management.
●Developed application for measuring financial performance of newly acquired accounts: Forecast Vs Actual and developed SAS programs for generating reports on Key financials, Income Statements, and Balance Sheet.
●Prepare and analyze model risk reporting and data for senior management and regulators.
●Used SAS procedures and prepared daily, monthly, yearly reports.
●Part of the model risk governance process and involved in the model identification process and model inventory management.
●Create periodic reporting of model risk metrics or initiatives.
●Worked with SAS/ODS and HTML to produce dynamic web interfaces from SAS programs.
●Used SAS import facilities and SAS/ACCESS to import external files. Worked with SAS/GRAPH to produce graphical reports.
●Responsible for maintenance and enhancement work in the reporting system using SAS/Base, SAS/Macro, SAS/Graph, and SAS/AF.
●Support Model risk management tool implementation efforts, including participation in User Acceptance Testing
●Wrote SAS programs to generate reports, creating RTF and HTML listings, tables, and reports using SAS ODS for ad-hoc report generation.
●Used SAS, SAS macros, Oracle, and procedures such as SQL, FREQ, UNIVARIATE, SUMMARY, MEANS, TRANSPOSE, TABULATE, and SORT, along with arrays, to extract data, prepare an integrated database, and analyze it (see the sketch following this list).
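A minimal sketch of the SUMMARY/TRANSPOSE-style extraction referenced above; the dataset and variable names (accounts, segment, spend) are hypothetical:

  proc summary data=work.accounts nway;
    class segment;                     /* e.g., customer risk segment */
    var spend;
    output out=work.seg_stats(drop=_type_ _freq_)
           mean=avg_spend sum=tot_spend;
  run;

  proc transpose data=work.seg_stats out=work.seg_wide prefix=seg_;
    id segment;                        /* one column per segment */
    var avg_spend;
  run;

  proc print data=work.seg_wide noobs; /* wide report for review */
  run;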
Environment:
BASE SAS, SAS/MACRO, SAS/STAT, SAS/GRAPH, SAS/CONNECT, SAS/ACCESS, MS Excel, SAS/ODS, Oracle 8i, PL/SQL, UNIX, Windows NT, IBM Mainframes.
EDUCATION:
●Master of Science in Computer Information Systems (MSCIS)
California University of Management and Sciences - March 2018
●Bachelor of Computer Applications (BCA)
Tilak Maharashtra Vidyapeeth University - May 2010