
Data Analyst

Location:
Grand Prairie, TX, 75052
Posted:
April 23, 2025

Contact this candidate


Ashrithaa K

Contact: +1-603-***-**** ******************@*****.***

PROFESSIONAL SUMMARY

Over 5 years of experience as a Data Analyst with a strong understanding of data modeling, data warehousing, and client/server applications.

Well-versed in all phases of the SDLC, including requirement analysis, implementation, and maintenance, and proficient in both Agile and Waterfall methodologies.

Good knowledge of Python-based data analytics and visualization, including Pandas, NumPy, and Matplotlib, along with Tableau, Power BI, and Excel data extracts.

Experience with data visualization tools such as Tableau, Power BI, and Microsoft Excel.

Experience with AWS cloud services for data storage, processing, and visualization.

Proficient in SQL, Snowflake, and Oracle.

Experience in building data pipelines for automating the extraction, transformation, and loading (ETL) of geospatial data, ensuring data integrity and performance in GIS applications.

Proficiency in using data analytics tools to analyze supply chain performance, identify trends, and make data-driven decisions.

Skilled in implementing business and IT data requirements through new data strategies and designs across all data platforms, including relational, dimensional, and NoSQL databases.

Experience with Google Sheets for data analysis, collaboration, and reporting.

Expertise extends to various data tools for reporting, visualization, analytics, and machine learning.

Experience with databases such as MySQL, Microsoft SQL Server, and Oracle.

Proficient in all phases of the project life cycle, including data cleaning, pre-processing, and data visualization.

Proven expertise in building dashboards, reports, and supporting innovative AI/ML POCs.

Providing documentation, training, and consulting for Data stakeholders.

Designing, communicating, and executing end-to-end solutions.

Performing outreach activities to new business areas engaged in modeling.

Developing and maintaining business-unit and enterprise-wide model risk metrics and reports.

Working knowledge of GitHub, including push/pull workflows and repository management, applied to data manipulation, statistical modeling, and visualization projects.

SKILLS:

Languages

C, SQL, Python, R, MATLAB

Methodology

Agile, Waterfall

Servers

Apache Tomcat

IDE

Visual Studio Code, Jupyter Notebook, PySpark, Eclipse

Packages

NumPy, Pandas, Seaborn

Visualization tools

Qlik Sense, Tableau, Power BI, Google Sheets, SAS, MS Excel

Database

MySQL, PostgreSQL, MS SQL Server, NoSQL, Oracle

Operating System

Windows, Linux

Version Control & CI/CD Tools

GitHub, Jenkins

Cloud

AWS (S3, EC2, Athena, Glue, DynamoDB, Redshift, Cradle)

Other Skills

Data Cleaning, Analytical Skills, Critical Thinking, Communication, Presentation, Problem Solving

EXPERIENCE:

Client: Sutherland, USA Sep’22-Present

Role: Data Analyst

Responsibilities:

Interpreted data and analyzed results using statistical techniques, providing ongoing reports to stakeholders.

Acquired data from primary and secondary sources, maintaining databases and data systems.

Leveraged BigQuery for high-performance querying of large datasets, improving reporting efficiency by 30%.

Designed and optimized data warehousing solutions using AWS Redshift, enhancing data storage and retrieval efficiency for large-scale analytics.

Designed & optimized Power BI dashboards using DAX & Power Query, ensuring seamless Snowflake integration for high-performance reporting.

Conducted A/B testing and advised product managers and engineers on sample sizes, optimal metrics, and experimental design, ensuring robust numerical findings.

Designed and implemented KPI measurement strategies in partnership with business stakeholders, optimizing reporting efficiency and delivering actionable insights.

Developed KPIs and dashboards to monitor SaaS product adoption, customer churn, and conversion flows.

Developed and automated Weekly Business Review decks and performance monitoring reports using Quicksight and Excel, delivering insights on key business metrics.

Integrated data visualizations with AWS Redshift for real-time reporting.

Applied AWS core concepts, including EC2, S3 and Lambda, to build scalable and cost-effective data solutions, ensuring high availability and performance.

Conducted trend analysis and statistical modeling using Python and R to predict outcomes, identify business opportunities, and explore applications of Generative AI in enhancing operational efficiencies.

Built robust ETL pipelines utilizing AWS Glue, Redshift, S3 and Athena to support large-scale data processing and modeling workflows.

Leveraged AWS Redshift for tracking supply chain metrics, improving decision-making.

Participated in Agile sprints, refining backlog and collaborating with cross-functional teams to deliver automation solutions.

Integrated Power BI with multiple data sources, including SQL databases and cloud services, to ensure seamless data flow and up-to-date reporting.

Environment: AWS, PySpark, Pandas, NumPy, Snowflake, Matplotlib, ggplot2, Seaborn, SAS, R, GitHub, Microsoft Excel, Power BI, SQL, Windows/Linux, Apache Tomcat.

Client: BigBasket, Bangalore-India Feb’21-Apr’22

Role: Data Analyst

Responsibilities:

Implemented customer segmentation models using Python and R, increasing campaign effectiveness by 15%.

Conducted in-depth data analysis and created detailed reports using AWS Athena, facilitating data-driven decision-making.

Built Power BI dashboards for vendor cost tracking, aiding procurement decisions.

Used Python and SQL for data cleaning, enhancing vendor compliance reporting.

Designed A/B experiments to test the effectiveness of marketing and product changes, leveraging Stata, Python, and R for statistical analysis.

Automated ETL pipelines and data integration using Tableau and Data Lake.

Developed integrated workflows with Alteryx combining data preparation with visual reporting tools like Tableau and Power BI.

Streamlined data workflows using Snowflake, improving data processing efficiency by 20%.

Designed and optimized data workflows incorporating AWS DynamoDB and NoSQL databases to manage dynamic operational datasets.

Migrated reports from Oracle OBIEE to Power BI, ensuring smooth transition and performance optimization.

Designed and implemented data profiling techniques using Snowflake and SQL to identify patterns and anomalies in enterprise data assets.

Built and maintained data quality dashboards to monitor data integrity across enterprise assets, ensuring compliance with governance standards.

Conducted exploratory analysis on pricing impact and user onboarding trends to support strategic initiatives.

Provided ad-hoc reporting and deep-dive analysis on customer service trends, helping stakeholders drive data-driven decisions.

Participated in the development of AI-driven insights for optimizing vendor management strategies, contributing to enhanced ML model performance.

Implemented scheduled report delivery using Tableau, automating the distribution of reports to stakeholders on a regular basis.

Environment: ETL, Python, AWS, Power BI, Snowflake, Looker, NumPy, Pandas, Tableau, Matplotlib, SciPy, ggplot2, R, GitHub, SAP, Microsoft Excel, SAS, ESG, Microsoft Office 365, SQL, Excel pivot tables, Google Sheets.

Client: Fabindia, New Delhi-India June’19-Feb’21

Role: Data Analyst

Responsibilities:

Developed Tableau dashboards showcasing location-based sales trends, leading to a 15% revenue increase.

Created interactive Power BI dashboards to monitor cost-saving initiatives and track spend analysis.

Utilized AWS Athena for rapid querying of large datasets, enhancing data analysis capabilities and reducing query time.

Analyzed large datasets using SQL and Python, extracting actionable insights that informed strategic decision-making and cost-saving initiatives.

Actively contributed to Scrum sprints, refining user stories and collaborating with developers to optimize reporting automation.

Utilized MS Excel, including pivot tables, LOOKUP functions, and advanced techniques, for efficient data entry and analysis.

Developed stored procedures and complex views in SQL Server using joins for robust and swift data retrieval.

Conducted spend analysis with SQL and Power BI to identify cost-saving opportunities.

Designed data cleansing workflows in Excel and SQL, improving data accuracy by 25%.

Supported data lineage documentation to ensure compliance with governance policies.

Supported KPI tracking and performance measurement through tailored Tableau dashboards, integrating capacity management insights to optimize operations.

Leveraged AWS S3 for secure data storage and Tableau Prep for data cleaning and transformation.

Conducted regression analysis and hypothesis testing using Python and R, resulting in data-driven pricing strategies and a 10% profit margin improvement.

Supported data lineage documentation for SQL-based systems, streamlining metadata management and ensuring compliance during audits.

Improved data quality and accuracy through data cleaning and efficient data entry and analysis using Excel features.

Utilized EC2 instances for batch processing of procurement data, ensuring high availability and quick access to processed data.

Worked on database automation and versioning for MySQL and Oracle databases.

Developed interactive Tableau dashboards with maps and charts to present location-based delinquency reports.

Collaborated with cross-functional teams to gather requirements and deliver tailored Power BI solutions that meet diverse business needs.

Supported the documentation of data flow processes and integrations, ensuring alignment with compliance guidelines.

Optimized SQL stored procedures for swift and robust data retrieval.

Implemented data-driven pricing adjustments derived from hypothesis testing and regression analysis using Python and R.

Environment: MySQL, Python, Microsoft Excel, Tableau, Power BI, Google Sheets, Microsoft Office 365, AWS, SQL, NumPy, Pandas, Seaborn, SciPy, Excel pivot tables.
