Vasanth Samula
+1-360-***-**** *************@*****.*** LinkedIn Seattle, WA
PROFESSIONAL SUMMARY:
•Over 5 years of experience as a Data Analyst, specializing in PostgreSQL and Apache Hadoop for managing and analyzing big data.
•Proficient in developing SQL queries and scripts to extract, transform, and analyze large datasets across multiple platforms.
•Expert in using Microsoft Excel for complex data analysis, financial modeling, and creating dynamic reports and dashboards.
•Skilled in utilizing QlikView for interactive data visualization, enhancing business intelligence and decision-making processes.
•Experienced in using Apache Spark to process large data sets in real-time, improving operational efficiency and data throughput.
•Managed version control and source code effectively using Git, ensuring robust project tracking and team collaboration.
•Leveraged Informatica for data integration and ETL processes, optimizing data warehousing and business intelligence capabilities.
•Developed and deployed Power BI solutions for dashboarding and visualization, providing clear insights into business metrics and trends.
•Skilled in comprehensive documentation practices, ensuring that all data processes and analyses are well-recorded and easy to understand.
•Produced detailed and actionable reports using advanced data analytics, significantly improving business strategies and operations.
•Utilized Python, including libraries like NumPy and pandas, to perform data manipulation, statistical analysis, and machine learning tasks (a brief sketch follows this summary).
•Experienced in Jupyter Notebook for developing and presenting data science projects, facilitating reproducible research and collaborative development.
•Developed predictive models using TensorFlow, enhancing business forecasting and decision-making capabilities in dynamic markets.
•Applied PyTorch in deep learning projects, optimizing algorithms for pattern recognition and predictive analytics.
•Utilized MATLAB for numerical and algorithmic computing, solving complex mathematical problems relevant to data analysis.
•Leveraged Microsoft Azure's cloud capabilities for scalable data storage, computing, and analytics solutions.
•Implemented Agile and Scrum methodologies in project management, ensuring timely delivery of data analytics projects with high precision.
•Experienced with Azure ML for creating and training machine learning models, improving accuracy in predictive analytics.
•Utilized Azure Databricks for big data analytics, enabling streamlined data processing and exploration at scale.
•Expert in creating business use cases and reporting, translating complex data into actionable insights for strategic planning.
•Skilled in applying advanced analytics and machine learning to solve real-world business problems in the insurance and financial sectors.
•Dedicated to continuous learning and applying the latest technologies in data science to remain at the forefront of industry developments.
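A minimal, illustrative pandas/NumPy sketch of the data-manipulation work described above; the file name and columns (transactions.csv, txn_date, segment, amount) are hypothetical placeholders, not artifacts from any engagement listed below.

import numpy as np
import pandas as pd

# Hypothetical input file and schema, used purely for illustration.
df = pd.read_csv("transactions.csv", parse_dates=["txn_date"])

# Basic cleaning: drop exact duplicates, fill missing amounts.
df = df.drop_duplicates()
df["amount"] = df["amount"].fillna(0.0)

# Monthly totals per customer segment, the kind of aggregate that
# feeds a dashboard or a downstream model.
monthly = (
    df.assign(month=df["txn_date"].dt.to_period("M"))
      .groupby(["segment", "month"])["amount"]
      .agg(total="sum", average="mean")
      .reset_index()
)

# Flag months whose totals sit more than two standard deviations
# from the mean, a simple screen for anomalies worth investigating.
monthly["z_score"] = (monthly["total"] - monthly["total"].mean()) / monthly["total"].std()
print(monthly[np.abs(monthly["z_score"]) > 2])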
TECHNICAL SKILLS:
•Data Analysis and Processing: SQL (5+ years), Advanced MS Excel, Python (NumPy, pandas), PostgreSQL, Apache Hadoop, Apache Spark, MATLAB, R
•Machine Learning: TensorFlow, PyTorch, AWS SageMaker, Azure ML
•Visualization and Reporting: Microsoft Excel, Power BI, Tableau, QlikView
•Cloud Technologies: AWS (S3, Redshift, and related cloud services), Microsoft Azure, Azure Databricks
•Programming and Tools: Alteryx, Power BI, Tableau, QlikView, Amazon SageMaker, Azure ML, Confluence, SharePoint
•Project Management: Agile, Scrum
•Documentation Tools: SharePoint, Confluence
PROFESSIONAL EXPERIENCE:
Client: Capital One, McLean, VA Apr 2024 - Present
Role: Data Analyst
Roles & Responsibilities:
•Developed complex SQL queries to analyze customer financial data, enhancing the decision-making process using AWS Cloud technologies.
•Designed and managed interactive Tableau dashboards that presented complex financial data to management, improving strategic financial planning.
•Built and trained machine learning models using Amazon SageMaker, accurately predicting customer behaviors to tailor financial services.
•Implemented data automation workflows with Alteryx, significantly enhancing the efficiency and accuracy of financial reporting processes.
•Configured AWS Redshift for robust data warehousing solutions, facilitating faster and more reliable data analysis and retrieval.
•Conducted advanced analytics using Apache Spark's ALS (alternating least squares) algorithm, optimizing financial product offerings and refining customer service strategies (sketched after the Environment line below).
•Produced comprehensive reports using SharePoint, ensuring project outcomes aligned with strategic business objectives and compliance requirements.
•Documented analytics processes and outcomes in Confluence comprehensively, providing detailed records for audit and future reference.
•Utilized AWS S3 for effective data storage and management, supporting large-scale data analytics projects and backup solutions (also sketched below).
•Applied statistical methods in R to analyze complex datasets, enhancing the development of financial models and risk management strategies.
•Leveraged Excel to perform detailed financial analysis and create spreadsheets that support budgeting and financial forecasting.
•Used Confluence to document project milestones and results, ensuring clear communication and documentation standards were maintained.
•Analyzed data with SQL, developing insights that led to more effective customer segmentation and marketing strategies.
•Utilized SharePoint for managing project workflows, enhancing team collaboration and document management across departments.
•Deployed AWS Rekognition to analyze customer feedback and service interactions, improving customer relationship management.
•Integrated R analyses into financial models, providing deeper insights into investment risks and opportunities.
•Managed AWS Cloud services to ensure data security and scalability, supporting the growing needs of business analytics.
•Developed predictive models using Amazon SageMaker, which accurately forecasted customer financial behavior and product preferences.
•Created dynamic visualizations in Tableau, which helped in quick decision-making by visualizing complex financial scenarios.
•Utilized Alteryx for automating data preparation and blending processes, reducing manual data handling time and errors.
Environment: SQL, Tableau, Amazon SageMaker, Alteryx, AWS Redshift, Apache Spark (ALS), SharePoint, Confluence, AWS S3, R, Excel, AWS Rekognition, AWS Cloud services.
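An illustrative PySpark sketch of ALS-based collaborative filtering in the spirit of the Spark work above; the input path, schema (customer_id, product_id, score), and split are assumptions for demonstration, not production details.

from pyspark.sql import SparkSession
from pyspark.ml.recommendation import ALS
from pyspark.ml.evaluation import RegressionEvaluator

spark = SparkSession.builder.appName("product-affinity").getOrCreate()

# Hypothetical customer/product interaction data; the path is a placeholder.
ratings = spark.read.parquet("s3://example-bucket/interactions/")
train, test = ratings.randomSplit([0.8, 0.2], seed=42)

# Alternating least squares factorizes the interaction matrix into
# latent customer and product factors.
als = ALS(
    userCol="customer_id",
    itemCol="product_id",
    ratingCol="score",
    coldStartStrategy="drop",  # skip users/items unseen during training
)
model = als.fit(train)

# Hold-out error, then top-10 product suggestions per customer.
rmse = RegressionEvaluator(
    metricName="rmse", labelCol="score", predictionCol="prediction"
).evaluate(model.transform(test))
print(f"Hold-out RMSE: {rmse:.3f}")
recommendations = model.recommendForAllUsers(10)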
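A minimal boto3 sketch of the S3 storage pattern also mentioned above; the bucket name, key prefix, and local file are illustrative assumptions.

import boto3

s3 = boto3.client("s3")
BUCKET = "example-analytics-bucket"  # placeholder name

# Stage a local extract for downstream analytics jobs.
s3.upload_file("daily_extract.csv", BUCKET, "raw/daily_extract.csv")

# Enumerate staged objects under a prefix, e.g. to drive a batch loop.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix="raw/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])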
Client: AIG, New York, NY Dec 2021 - Nov 2023
Role: Data Mining Specialist
Roles & Responsibilities:
•Analyzed insurance risk using SQL and Python, enhancing data mining processes and risk assessment accuracy.
•Implemented TensorFlow models to predict insurance claims, reducing fraudulent activity and optimizing claim processing (sketched after the Environment line below).
•Utilized Microsoft Azure for managing and analyzing large datasets, improving the speed and reliability of data operations.
•Developed predictive analytics models using Azure ML, enhancing underwriting accuracy and claims management efficiency.
•Created detailed documentation and reports in Excel and Power BI, providing stakeholders with actionable business insights.
•Employed Jupyter Notebook to develop and test Python scripts, facilitating efficient algorithm implementation and testing.
•Managed projects using Agile methodologies, ensuring timely delivery of analytics solutions within the insurance sector.
•Designed MATLAB algorithms for premium calculations, improving financial accuracy and risk classification in insurance products.
•Configured Azure Databricks for streamlined data processing and collaboration, enhancing team productivity and data insights.
•Used Scrum techniques to manage data analytics tasks, optimizing project workflows and team collaboration.
•Applied Python libraries like NumPy and pandas for data analysis, improving operational efficiency and data quality.
•Leveraged TensorFlow and PyTorch for deep learning tasks, enhancing predictive capabilities and data model sophistication.
•Integrated data across platforms using Azure, ensuring cohesive data management and security in cloud environments.
•Documented all processes and findings thoroughly, ensuring compliance with regulatory standards and corporate governance.
•Employed Azure ML to streamline the development of machine learning models, reducing time-to-market for new analytics features.
•Developed and maintained insurance analytics models in MATLAB, providing robust solutions for actuarial data analysis.
•Utilized Azure Service Bus to manage data flows between services, enhancing data integration and real-time analytics capabilities.
•Created Azure Databricks notebooks for collaborative data analysis, fostering a culture of data-driven decision-making.
Environment: SQL, Python, TensorFlow, Microsoft Azure, Azure ML, Excel, Power BI, Jupyter Notebook, MATLAB, Azure Databricks, NumPy, pandas, PyTorch, Azure Service Bus.
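An illustrative TensorFlow (Keras) sketch of a claims-risk classifier consistent with the modeling described above; the synthetic features, network shape, and class balance are assumptions, not client specifics.

import numpy as np
import tensorflow as tf

# Synthetic stand-in for engineered claim features; real inputs would
# come from the Azure pipeline described above.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12)).astype("float32")
y = (rng.random(1000) > 0.8).astype("float32")  # ~20% flagged claims

model = tf.keras.Sequential([
    tf.keras.Input(shape=(12,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # claim-risk probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=0)

# Score new claims; the decision threshold would be tuned against
# the business cost of false positives versus missed fraud.
print(model.predict(X[:5], verbose=0).ravel())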
Client: FuGenX Technologies, Hyderabad, India Nov 2019 - Nov 2021
Role: Data Analyst
Roles & Responsibilities:
•Managed large datasets using PostgreSQL, supporting robust analytics and operational decision-making.
•Developed interactive dashboards in Power BI, enabling real-time business monitoring and decision support.
•Utilized Apache Hadoop and Spark to process large-scale data, improving insights into customer behaviors and market trends.
•Implemented version control with Git, enhancing code quality and collaboration among development teams.
•Prepared extensive reports and documentation, ensuring accuracy and thorough understanding of data-driven insights.
•Applied QlikView for developing detailed visualizations, facilitating easier interpretation and application of business intelligence.
•Engineered data pipelines using Informatica, optimizing data extraction, transformation, and loading processes.
•Supported analytics and business intelligence efforts using Microsoft Excel, providing detailed financial and operational reports.
•Utilized Apache Hadoop for distributed data storage and processing, enhancing scalability and data management capabilities.
•Developed SQL scripts for data manipulation and querying, ensuring efficient data retrieval and analysis (see the sketch after the Environment line below).
•Employed Spark for real-time data processing tasks, increasing responsiveness and analytical capabilities.
•Managed documentation processes using Git, ensuring all data processes were well-recorded and version-controlled.
•Created visual reports and dashboards in Power BI, providing executives with key data points for strategic decision-making.
•Used PostgreSQL for database management, ensuring robust data integrity and performance.
•Applied Informatica tools for data integration, improving data accuracy and availability across business units.
Environment: PostgreSQL, Power BI, Apache Hadoop, Spark, Git, QlikView, Informatica, Microsoft Excel.
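An illustrative sketch of a PostgreSQL reporting query driven from Python, consistent with the SQL work above; the connection string and tables (orders, customers) are hypothetical.

import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; real credentials were environment-specific.
engine = create_engine("postgresql://user:password@localhost:5432/analytics")

# Monthly order volume and revenue by region, from hypothetical tables.
query = """
    SELECT c.region,
           date_trunc('month', o.order_date) AS month,
           COUNT(*)      AS orders,
           SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    GROUP BY c.region, date_trunc('month', o.order_date)
    ORDER BY month, c.region
"""
df = pd.read_sql(query, engine)
print(df.head())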
Education:
Master of Science: Computer Science 12/2024
Central Michigan University