
Business Analyst Quality Assurance

Location:
Medford, MA, 02155
Salary:
$72/hr
Posted:
December 26, 2023


PROFESSIONAL SUMMARY:

Over * years of professional experience in the modeling, analysis, design, development, and implementation of statistical/data models and applications, writing macros, generating reports and graphs, and troubleshooting for major banking and financial companies and other domains. Experience includes a clear understanding of, and work within, all phases of the SDLC. Analytical, methodical, and resourceful approach to problem solving, identifying and documenting root causes while taking corrective actions to meet short- and long-term business needs. Able to understand business propositions and adapt quickly to add value to the team.

SUMMARY:

●Extensive experience in SAS programming (versions 9.1.3, 9.1, and 8.2) and data analysis using Base SAS software in Windows and UNIX environments.

●Experienced in the complete Software Development Life Cycle (SDLC) across multiple projects as a Business Analyst, from requirements gathering, System Requirement Specification (SRS) contribution, and understanding of High-Level Design (HLD) and Service Level Agreements (SLA), through developing and reviewing code, change management, issue tracking, and quality assurance.

●Utilized SAS procedures, macros, and other SAS applications for data extraction, data cleansing, data loading, and reporting. Involved in data analysis, database creation, applying database constraints, and preparing data for loading into the database.

●Excellent knowledge of various SAS report generating procedures like PROC PRINT, PROC REPORT, PROC FREQ, PROC MEANS, PROC TABULATE, PROC TRANSPOSE, PROC SQL.

●Experience writing advanced SQL programs for joining multiple tables, sorting data, creating SQL views, creating indexes and metadata analysis. Experience in performing Ad Hoc Queries for various applications and reports on a daily, weekly and monthly basis using complex SQL Queries.

●Experience in writing complex SQL Queries (PROC SQL, SQL, and PL/SQL) for the analysis of large health care tables like Claims, Membership, medical provider, sales and support data.

●Implementing strong security measures in AWS environments, following AWS best practices.

●Managed and deployed Kubernetes clusters on cloud platforms such as AWS or Azure.

●Architect data lakes and data storage solutions using Azure Data Lake Storage and Azure Blob Storage for efficient data management and storage.

●Evaluate and recommend Azure services and features to enhance data solutions, while managing costs effectively.

●Collaborated with data architects and business analysts to understand data requirements and design Azure data solutions.

●Designed data warehousing solutions using AWS Redshift, resulting in a 45% performance improvement for business intelligence and reporting tools.

●Resolve moderately complex issues and lead a team to meet existing or potential new clients’ needs while leveraging a solid understanding of the function, policies, procedures, and compliance requirements.

●Created output files/reports in different file formats like HTML, PDF, CSV, XLS and RTF using SAS ODS.

●Extensively used SAS/MACRO for creating macro variables and macro programs to modify existing SAS programs for ease of modification while maintaining consistency of results.

●Strong communication and presentation skills. Detail oriented and able to multitask effectively.

●Hands-on experience in SAS programming for extracting data from CSV files, Excel spreadsheets, Microsoft Access files, flat files, and external RDBMS (Oracle) tables using LIBNAME and the SQL PASSTHRU facility (see the sketch after this list).

●Hands-on experience in SAS programming for extracting data from Flat files, Excel spreadsheets, Sybase and SQL Server tables.

●Hands-on experience in Teradata design, development, testing, implementation, maintenance, and support of applications for the banking and financial domains in environments such as mainframes and Windows.

●Extracted multi-million-record files from the Teradata database (RDBMS), performed data mining and analysis, and created marketing incentive reports.

●Designed and implemented Kubernetes-based solutions for clients.

●Participate in design and code reviews with developers using Python

●Hands-on shell scripting experience in Unix/Linux environments.

●Reviewing client requests and applying the necessary technical updates in existing Python processes

●Hands on experience with SAS/STAT procedures like PROC FREQ, PROC MEANS, PROC SUMMARY, PROC UNIVARIATE, PROC CORR, PROC REG, PROC GLM, PROC ANOVA

●Accessed spreadsheets and external RDBMS (Oracle) tables using LIBNAME and the SQL PASSTHRU facility.

●Extensive experience of data merging, data subsetting with the use of PROC SQL, MERGE and SET statements

●Created interactive dashboards and storytelling using data visualization tools like PowerBI and Tableau.

●Experience with traditional relational SQL Databases, specifically Azure SQL.

●Deep understanding of data and how it relates to architecture (data ingestion, real-time, batch, SQL, NoSQL, analytics, etc.).

●Create contact database and event-triggered actions using SAS Customer Intelligence

●Effective team player with strong communication & interpersonal skills.
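
A minimal sketch of the LIBNAME and SQL PASSTHRU access mentioned above, shown against a hypothetical Oracle source; the credentials, schema, and table names are placeholders, not actual client values.

    /* Engine LIBNAME: Oracle tables become visible as SAS librefs (credentials are placeholders) */
    libname ora oracle user=XXXX password=XXXX path=orclprod schema=finance;

    data work.claims;
       set ora.claims;   /* read through the LIBNAME engine */
    run;

    /* SQL pass-through: the inner query executes inside Oracle itself */
    proc sql;
       connect to oracle (user=XXXX password=XXXX path=orclprod);
       create table work.membership as
       select * from connection to oracle
          (select member_id, plan_code, start_dt
             from finance.membership
            where start_dt >= date '2023-01-01');
       disconnect from oracle;
    quit;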

TECHNICAL SKILLS:

Operating Systems

Windows 98/2000/NT/XP, MS-DOS, UNIX, Linux

Languages

C/ C++, SAS, PL/SQL, Python, VB, HTML, CSS, JavaScript

Database

MS-Access, Sybase IQ, SQL Server 2000/7.0/6.5, Oracle, Teradata, MS-Excel

Tools

WINSQL, Putty, Reflection FTP Client, MS Project, MS Visio, MS Word, MS PowerPoint, MS Excel, MS Management Studio, Sybase Power Designer, Business Objects, PowerBI, Tableau, Azure, AWS

SAS Skills

SAS/BASE, SAS/MACRO, SAS/SQL, SAS/STAT, SAS/ACCESS, SAS/CONNECT, SAS/ODS, SAS/SHARE, SAS/ETL, SAS/AML, SAS Viya, SAS/GRAPH, SAS Enterprise Guide, SAS Customer Intelligence

Fidelity Investments, Smithfield, RI Apr 22 – Present

SAS Consultant/Software Engineer

Fidelity Investments is an American multinational financial services corporation based in Boston, Massachusetts. Fidelity Investments operates a brokerage firm, manages a large family of mutual funds, provides fund distribution and investment advice, retirement services, index funds, wealth management, securities execution and clearance, asset custody, and life insurance. As part of the 401(k) department team at Fidelity, I collect clients’ 401(k) information from prior 401(k) providers and convert it into Fidelity’s format. I have worked with many clients, e.g., SONOVA and Charles River.

Responsibilities:

●Work closely with business analysts and project managers to understand requirements and create Functional Design Documents, Detailed Design Documents, Data Sourcing Documents, and Unit Test Plans.

●Involved in developing, testing, and writing programs and reports in SAS according to specifications as required.

●Be an integral member of the project team; set up project team meetings with clients and other team members in each phase: test, refresh, and live.

●Integrate and process data from multiple sources using tools such as Azure Data Factory.

●Participate in the development of cloud data warehouses, data as a service, and business intelligence solutions.

●Use Azure big data tools to design batch and streaming feature pipelines, as well as pipelines to populate a data lake.

●Work with other Azure stack modules like Azure Data Lakes and SQL DW.

●Created data models and data warehouses using Azure SQL Data Warehouse and Azure Analysis Services.

●Optimize and fine-tune performance of Azure SQL databases, leveraging tools like Query Performance Insight and Azure Monitor.

●Experience working in an Agile/Scrum project environment with project team members in multiple locations, including offshore, if required.

●Extensively used various statistical procedures such as Proc Freq, Proc Means, Proc Reg, and Proc Univariate for analysis.

●Code SAS utility macros; write and implement test plans to support SAS macro development.

●Read raw data files with many missing values using the MISSOVER and TRUNCOVER options (see the sketch after this list).

●Used LIBNAME and the SQL PASSTHRU facility to read data from other sources into SAS EG.

●Generated high-quality reports in listing, HTML, RTF, and PDF formats using SAS ODS.

●Take full responsibility for end-to-end deliverables (analysis, design, development, testing, and deployment).

●Modifying existing SAS programs to change from one Teradata source to another.

●Analyze complex retirement plan requirements and develop them on an actuarial pension benefit validation platform to meet the scheduled release date, using Excel, Visual Basic, VBA macros, and Sybase.

●Complete detailed unit/acceptance testing and certification to ensure development is delivered on time and in conjunction with defined release management practices.

●Documented SAS code, processes, and methodologies to ensure reproducibility and knowledge sharing within the team, facilitating a culture of transparency and collaboration.

●Code customized ad-hoc programs for quick tests to verify client specifications.

●Audit existing Power BI efforts with a focus on improvement.

●Work closely with system analysts, project managers and clients.

●Ensure designs meet performance, security, usability, reliability, and scalability requirements.
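
A minimal sketch of the MISSOVER/TRUNCOVER reads referenced in the list above, assuming a hypothetical pipe-delimited feed file; field names and layout are illustrative only.

    /* TRUNCOVER keeps whatever is present on a short record;         */
    /* MISSOVER would instead set all remaining variables to missing. */
    data work.plan_feed;
       infile 'plan_feed.txt' dlm='|' dsd truncover;
       input plan_id  :$10.
             emp_name :$40.
             balance  :comma14.;
    run;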

Environment:

SAS/BASE, SAS/MACRO, SAS/SQL, SAS/ODS, UNIX, Mainframe, IMS DBMS, SAS ETL Studio 9.1, SAS/Advanced, JIRA, MS Access, PowerBI, Azure SQL, Azure Data Factory, Azure services.

Global Atlantic Financial Group, Boston, MA Feb 19 – Mar 22

SAS Consultant/Programmer

Global Atlantic Financial Group is a leader in the U.S. life insurance and annuity industry, serving the needs of individuals and institutions. Global Atlantic is a majority-owned subsidiary of KKR, a leading global investment firm that offers alternative asset management across multiple strategies and capital markets solutions. In this project I was part of the SAS team, where we collected raw data in different file formats along with the business requirements, automated the SAS process from start to end, and generated monthly and quarterly reports for different clients, e.g., MassMutual.

Responsibilities:

●Involved in developing, testing, and writing programs and reports in SAS according to specifications as required.

●Extensively performed data cleansing during the ETL extraction and loading phases by analyzing the raw data, writing SAS EG programs, and creating complex reusable macros.

●Used SAS/ACCESS to read datasets from databases, flat files, and CSV files, and to read and write other PC file formats.

●Performed extraction, transformation, and loading from large Oracle tables.

●Conducted data cleansing, transformation, and integration, ensuring data quality and integrity while preparing data for analysis.

●Manipulated an existing Oracle database by accessing the data using the SQL pass-through facility.

●Worked on pre-existing macros for data validation by checking data distribution and comparison to standard data.

●Generated highly customized reports using SAS macro facility, Proc Report, Proc Tabulate, and PROC SQL.

●Analyzed various tables using Data manipulation techniques like merging, appending and sorting.

●Performed data cleansing by analyzing and eliminating duplicate and inaccurate data using PROC FREQ, PROC COMPARE, PROC UNIVARIATE, and macros in SAS (see the sketch after this list).

●Created informative and interactive data visualizations using SAS Viya, effectively conveying complex data to business stakeholders.

●Validated, documented and tested component programs in an efficient manner for inclusion in integrated reports.

●Successfully converted existing SAS code into SAS Visual Analytics reports and dashboards, enhancing data presentation and accessibility for stakeholders.

●Transferred and Migrated Data from one platform to another to be used for further analysis.

●Built simple to complex pipelines and dataflows.

●Used Dynamic Data Exchange (DDE) feature of SAS for importing and exporting data between SAS and SQL. Conducted meetings with business users and data architects to define metadata for tables to perform ETL process.

●Supported and developed a management reporting system which used a web front end to generate user queries.

●Participate in design and code reviews with developers using Python

●Reviewing client requests and applying the necessary technical updates in existing Python processes

●Create SAS Information Maps that provide a simplified layer between nontechnical business users and the complexities of databases and query languages, using SAS Customer Intelligence.

●Efforts include solving business issues technically via system development based on business needs, implementing and validating the changes, documenting them, and supporting the ongoing processing.

●Developed and maintained CI/CD pipelines for Kubernetes-based applications.

●Deployed applications on Kubernetes clusters using Helm charts.
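
A hedged sketch of the duplicate-elimination step noted in the list above; the dataset and key variables are assumed for illustration.

    /* Count duplicated policy keys before removing them */
    proc freq data=work.policies noprint;
       tables policy_id / out=work.dup_counts(where=(count > 1));
    run;

    /* Keep one row per policy_id/eff_date combination */
    proc sort data=work.policies out=work.policies_dedup nodupkey;
       by policy_id eff_date;
    run;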

Environment:

SAS EG, SAS VIYA, SAS/Base, SAS/Access, SAS/Connect, SAS/Stat, SAS/Graph, SAS/SQL, SAS/ODS, SAS/Macros, Oracle 9i, DB2, PL/SQL, MS Excel and MS Access.

Chubb Personal Insurance, Whitehouse Station, NJ Apr 17 – Feb 19

SAS Programmer Analyst

The Chubb Corporation is a leading global insurance organization that receives high ratings and provides property, casualty, and specialty insurance to individuals and businesses around the world. The project included writing SAS code to maintain the revenue-neutral TRG (Tibus Rating Group) premium for policies in specific states. My job involved file handling and transformation of data received from IBM’s Information Management System (IMS) DBMS through the ETL (Extraction, Transformation, and Loading) process.

Responsibilities:

●Worked with data in formats such as ASCII and binary, and extensively used different kinds of informats to read these data files (see the sketch after this list).

●Handled files with many missing values and special characters, and generated datasets based on conditional requirements for the special characters and missing values present in the raw data files.

●Also cleaned the data using SAS code (SAS macros) and VBScript, and developed SAS code on UNIX according to the specifications.

●Imported raw data files extracted from IBM’s Information Management System (IMS) mainframe database into SAS using import techniques, and modified existing datasets through various conditional statements.

●Developed and maintained data pipelines, ETL processes, and data integration solutions using Azure services and open-source technologies.

●Collaborate with clients to define data strategy, assess existing data infrastructure, and develop migration plans to Azure.

●Conducted performance optimization and troubleshooting of data pipelines and SQL databases.

●Participated in data migration projects, transitioning on-premises data to Azure cloud services.

●Evaluate and recommend Azure services and features to enhance data solutions, while managing costs effectively.

●Extensively used SQL statement functionalities for the analysis of large datasets (millions of records) and for data manipulation such as indexing and appending various datasets.

●Interacted and coordinated with key departments to analyze areas and discuss the primary model requirements for the project.

●Responsible for analyzing and collecting specific policies for Testing the Calculation of Revenue Neutral TRG premium and the deviation results generated through the SAS datasets.

●Created new State datasets for Territories, Policies, States, Vehicles, Drivers etc. and other key domains. Analyzed the various Tiers of specific states and the Tier Factors that reflect the final Deviation produced through the calculation in the TRG Premium.

●Loaded various old and new rates data files retrieved from the Actuarial department into SAS in a specific format through the import mechanism.

●Extensively detected, described, and verified defects, tracked bugs, and modified existing datasets to improve the existing system.

●Documented methodology, data reports and model results and communicated with the Project Team / Manager to share knowledge.
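
A rough illustration of reading the mainframe extracts with informats, as referenced in the list above; the fixed-width layout, field names, and file path are assumptions.

    data work.trg_policies;
       infile 'ims_extract.dat' lrecl=200 truncover;
       input @1  policy_no $char10.   /* character informat          */
             @11 state     $char2.
             @13 premium   comma10.   /* strips commas from amounts  */
             @23 eff_date  yymmdd8.;  /* date informat               */
       format eff_date date9.;
    run;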

Environment:

SAS/BASE, SAS/MACRO, SAS/SQL, SAS/ODS, UNIX, Mainframe, IMS DBMS, SAS ETL Studio 9.1, Azure

JP MORGAN CHASE, COLUMBUS, OH May 16 – Mar 17

SAS Data Analyst

The project analyzed the short-term and mid-term credit needs of consumers in order to assess their purchasing power and subsequently support business activities such as risk analysis and management, database marketing, credit policy, management reporting and analysis, and information support. As a member of the Auto Finance team, I was involved in the creation and conversion of SAS datasets from tables and flat files, analyzing the datasets, and generating reports through complex data manipulation. The analysis produced helped the company minimize the manual process of analyzing risk and make appropriate risk decisions.

Responsibilities:

●Involved in the migration of auto loans from one system to another, combining the auto loans as per business requirements, in a project called the VLS-to-ALS conversion.

●Involved in extracting data from flat files, Excel spreadsheets and external RDBMS tables using LIBNAME and SQL PASSTHRU facility.

●Modified existing reports as needed due to migration of loans and updated logic to maintain consistency during migration.

●Developed summary reports to provide a daily summary of newly issued auto loans as well as to summarize data from delinquent loans (see the sketch after this list).

●Also developed logic that summarizes the number of new and used vehicles sold on a demographic basis versus the number of vehicles repossessed due to delayed payment.

●Designed and implemented highly scalable and fault-tolerant cloud solutions using AWS services, with a focus on AWS Step Functions and Lambda, for automating critical business processes, resulting in a 30% reduction in operational costs.

●Leveraged AWS SNS and SQS to establish event-driven communication between microservices, enabling real-time processing of data and reducing system latency by 40%.

●Modified macros for report generation using SAS Macros as per the requirements. Generated highly customized reports using SAS/MACRO facility.

●Generated customized reports based on statistical analysis done using SAS EG /BASE SAS in the form of ODS, HTML and PDF to help business to make decisions.

●Involved in optimizing processes that took a long time to run.

●Debugged, created, maintained and documented ad-hoc reports on demand on a daily basis.

●Monitor Daily, Weekly, Monthly & Quarterly jobs.

●Extensively worked with Excel and CSV files, merging them using VLOOKUP.
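
A minimal sketch of the daily new-loan summary described in the list above; the input dataset and variable names are assumptions for illustration.

    /* One output row per issue date: count of loans and total amount */
    proc means data=work.auto_loans noprint;
       class issue_date;
       var loan_amt;
       output out=work.daily_summary n=loans_issued sum=total_amt;
    run;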

Environment: SAS/EG, SAS/MACRO, SAS/ODS, SAS/ACCESS, SAS/CONNECT, FileZilla, Putty, Developer X Change, Subversion, SQL Server Management Studio, AWS Lambda, SNS.

Fulcrum Analytics, Inc., Fairfield, CT Jan 16 – Apr 16

SAS Analyst

The project involved the update and maintenance of the data warehouse and the generation of various reports. It also involved managing the data warehouse, cleaning the customer database, extracting data, creating SAS datasets, and writing reports for analysis.

Responsibilities:

●Used SAS Web Tools for extracting the data from Oracle database and created SAS data sets.

●Supported the direct marketing efforts for Consumer and Small Business products, which includes Mortgage, Home Equity, Deposits, Student Lending, and Insurance Services.

●Worked closely with the data modeling team and management, such as marketing managers, campaign managers, and financial managers, to analyze customer-related data and generate custom reports, tables, listings, and graphs.

●Prepared test cases and test data; created data manipulation and definition scripts using the Teradata BTEQ utility; involved in the analysis and design of the system and in testing the prototype.

●Prepared detailed documents to be shared across the organization; created export scripts using the Teradata FastExport utility.

●Created Teradata objects such as Databases, Users, Profiles, Roles, Tables, Views, and Macros.

●Implementing strong security measures in AWS environments, following AWS best practices.

●Continuous Integration/Continuous Deployment (CI/CD) pipelines with AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy.

●Good knowledge of Amazon AWS concepts like EMR and EC2 web services, which provide fast and efficient processing of big data.

●Managed data storage and transfer with AWS S3, optimizing data pipelines for high-throughput and secure data storage, achieving a 50% improvement in data access times.

●Part of the Direct Response and Analytical Marketing Team.

●Develop conversion jobs using Base SAS/SAS Data Integration Studio.

●Perform peer review and quality control of SAS Dataset.

●Analyze the content, structure and format of source data.

●Developed, modified, and generated Monthly Business Review (MBR) reports and Quarterly Business Review (QBR) reports summarizing business activity.

●Develop programs for extensive statistical analysis of data using SAS procedures.

●Created and executed SAS code to extract direct marketing lists from the CDW (see the sketch after this list).

●Creating HTML, PDF & RTF reports using SAS ODS.

●Experienced with planning, architecture, and design of Teradata data warehousing and SQL optimization

●Created BTEQ, FastLoad, and MultiLoad scripts and wrote queries to move data from source to destination.

●Assessing the validity of statistical and non-statistical models used for credit risk management

●Generating the monthly, yearly and historical graphs using SAS/Graph.

●Participated in the planning and developing of sales and marketing campaigns utilizing the Campaign Management System.

●Data mining to identify significant relationships among customer behaviors and descriptive characteristics.

●Managed and implemented various projects for customers using the Teradata relational database system.

●Wrote validation programs and tested them against the client data for data integrity.

●Regularly interacted with the financial analysts for the presentation of reports.

●Worked extensively in sorting, merging, concatenating, and modifying SAS data sets.

●Extensively used various Base SAS procedures like Proc SQL, Proc Report, Proc Format, Proc Tabulate, Proc Print, Proc Sort, etc. for reporting purposes.
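
A hedged sketch of extracting a direct marketing list from the CDW, shown here via SQL pass-through to Teradata rather than a BTEQ script; credentials, schema, and column names are placeholders.

    proc sql;
       connect to teradata (user=XXXX password=XXXX server=tdprod);
       create table work.dm_list as
       select * from connection to teradata
          (select acct_id, segment, region
             from cdw.marketing_accts
            where opt_in = 'Y');
       disconnect from teradata;
    quit;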

Environment:

Base SAS 9.1.3, SAS/Access, SAS/Connect, SAS/Stat, SAS/Graph, SAS/SQL, SAS/ODS, SAS/Macros, SAS/Enterprise Miner, Teradata, MS Access, PL/SQL, MS Excel, HP-UX, AWS EC2, S3.

American Express, Phoenix, AZ Aug 14 – Dec 15

SAS Programmer

Responsible for creating new SAS code utilizing existing code and maintaining data in SAS. The project involved working with analysts to provide analysis of credit card customer base on demographic basis and forecasting risk. The permissions to grant a credit card to an individual were based on the analysis done.

Responsibilities:

●Used SAS v8.2 in a Sun UNIX environment, programming in Base SAS and macros.

●Extracted required data from the IBM mainframe to the local SAS environment using SAS FTP and SyncSort.

●Developed an application for measuring the financial performance of newly acquired accounts (Forecast vs. Actual) and developed SAS programs for generating reports on key financials, income statements, and balance sheets.

●Used SAS procedures and prepared daily, monthly, yearly reports.

●Worked with SAS/ODS and HTML to produce dynamic web interfaces from SAS programs.

●Used SAS import facilities and SAS/ACCESS to import external files. Worked with SAS/Graph to produce graphical reports.

●Responsible for maintenance and enhancement work in the reporting system using SAS/Base, SAS/Macro, SAS/Graph, and SAS/AF.

●Wrote SAS programs to generate reports, creating RTF and HTML listings, tables, and reports using SAS ODS for ad-hoc report generation (see the sketch after this list).

●Used SAS, SAS macros, Oracle, and procedures such as SQL, FREQ, UNIVARIATE, SUMMARY, MEANS, TRANSPOSE, TABULATE, and SORT, along with arrays, to extract data, prepare an integrated database, and analyze it.
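
A minimal sketch of the ODS-based ad-hoc reporting mentioned in the list above; the summary dataset and column names are hypothetical.

    ods rtf file='key_financials.rtf';
    proc report data=work.key_financials nowd;
       column account_type n_accts total_balance;
       define account_type  / group 'Account Type';
       define n_accts       / analysis sum 'Accounts';
       define total_balance / analysis sum format=dollar14.2 'Total Balance';
    run;
    ods rtf close;   /* the same step can target HTML or PDF destinations */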

Environment:

BASE SAS, SAS/MACRO, SAS/STAT, SAS/GRAPH, SAS/CONNECT, SAS/ACCESS, MS-Excel, SAS/ODS, Oracle 8i, PL/SQL, UNIX, Windows NT, and IBM Mainframes.

EDUCATION:

●Bachelor of Computer Application, Mumbai, India.


