
Data Analyst Business Intelligence

Location:
Elk Grove, CA
Salary:
75000
Posted:
March 02, 2025


Resume:

RUGVEDA SAMHITHA PALADUGU

DATA ANALYST

***************@*****.*** | +1-945-***-****

PROFESSIONAL SUMMARY:

Results-driven Data Analyst with 3 years of experience in data analytics, business intelligence, and reporting, leveraging data-driven insights to enhance decision-making and optimize business processes.

Strong expertise in SQL, Power BI, and Python for data extraction, transformation, analysis, and visualization, ensuring effective data storytelling and actionable insights.

Experienced in working with cloud platforms (AWS, Azure, GCP) for storing, processing, and analyzing large-scale datasets in scalable and cost-efficient environments.

Proficient in ETL development, data pipelines, and data quality monitoring, ensuring accuracy and consistency across multiple data sources. Hands-on experience with Apache Airflow for workflow automation and Great Expectations for data validation.

Adept at translating business requirements into logical and physical data models, optimizing database performance, and ensuring efficient query execution for high-performance analytics.

Experienced in managing cross-functional teams, prioritizing tasks, and coordinating projects, ensuring timely delivery of high-quality analytics solutions.

Skilled in analyzing large datasets to uncover trends, drive strategic decision-making, and improve operational efficiency through data-driven insights.

Successfully developed dynamic tools and systems, including a Dynamic Currency Conversion (DCC) Tool, CAP360 Analyzer, and an IoT-based Real-Time Air Pollution Monitoring System, integrating static & dynamic analysis with IoT solutions.

Strong ability to communicate insights to both technical and non-technical stakeholders, conduct Root Cause Analysis (RCA), and collaborate with clients to resolve issues efficiently.

Capable of delivering well-documented applications, ensuring adherence to industry standards, data governance, and project deadlines.

Passionate about leveraging emerging technologies, automation, and AI-driven analytics to enhance business intelligence capabilities and streamline data operations.

WORK EXPERIENCE:

Capgemini Hyderabad, India Feb 2021 – Dec 2022

Role: Data Analyst

Key Responsibilities:

IoT Data Processing Pipeline:

Built a data pipeline to collect, process, and analyze data from IoT devices (sensors, smart devices). Used Apache Kafka for data streaming and Apache Spark for processing large volumes of data. Stored the processed data in a cloud-based data warehouse (AWS Redshift, Google BigQuery).

Impact: Enabled efficient monitoring of IoT devices, improving response times to anomalies and enhancing operational efficiency.
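The anomaly detection at the heart of this pipeline can be sketched, stripped of the Kafka/Spark plumbing, as a rolling z-score check over a window of sensor readings (function names, window size, and threshold are illustrative, not taken from the original project):

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate more than `threshold` standard
    deviations from the trailing window mean (a z-score check).
    In the real pipeline this logic would run inside a Spark job
    consuming from Kafka; here it is a stdlib sketch."""
    recent = deque(maxlen=window)
    anomalies = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalies.append((i, value))
        recent.append(value)
    return anomalies
```

A spike of 100 in an otherwise stable sensor series is flagged with its position, which is what drives the faster response to anomalies described above.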

Tools: Kafka, Spark, AWS (Redshift, S3), Python, SQL.

Scalable Cloud Data Lake Implementation:

Designed and implemented a scalable data lake architecture on AWS to store and process large volumes of structured and unstructured data. Used AWS S3 for storage, AWS Glue for ETL, and Athena for querying. Integrated with Apache Hive for data cataloging.

Impact: Improved data accessibility and reduced query times, enabling faster insights for business teams.
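The query-time gains in a lake like this come largely from Hive-style partitioning of the S3 layout, so Athena can prune irrelevant files. A minimal sketch of building such a partition key (bucket prefix and event fields are hypothetical):

```python
from datetime import datetime

def partition_key(prefix, event):
    """Build a Hive-style partition path (year=/month=/day=) from an
    event's timestamp, the layout AWS Glue catalogs and Athena prunes.
    `prefix` and the event schema here are illustrative."""
    ts = datetime.fromisoformat(event["timestamp"])
    return (f"{prefix}/year={ts.year}/month={ts.month:02d}/day={ts.day:02d}/"
            f"{event['id']}.json")
```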

Tools: AWS (S3, Glue, Athena), Apache Hive, Python, SQL.

Financial Data ETL Pipeline:

Developed an ETL pipeline to process and analyze financial transaction data from multiple sources (CSV files, APIs, databases). Used Apache Airflow for workflow orchestration and Talend for data transformation. Loaded the data into a data warehouse for reporting and analytics.

Impact: Streamlined financial data processing, improving the accuracy and timeliness of financial reports.
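The transform step of such a pipeline, reduced to a stdlib sketch: parse CSV transactions and aggregate per-account totals using Decimal, since float arithmetic silently loses cents on currency (the column names are illustrative, not from the original pipeline):

```python
import csv
import io
from collections import defaultdict
from decimal import Decimal

def summarize_transactions(csv_text):
    """Parse transaction rows and aggregate totals per account.
    Decimal is used instead of float so currency sums stay exact;
    in the real pipeline this step would run under Airflow/Talend."""
    totals = defaultdict(Decimal)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["account"]] += Decimal(row["amount"])
    return dict(totals)
```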

Tools: Apache Airflow, Talend, SQL, Python, AWS Redshift.

Cloud Data Warehouse Migration:

Migrated an on-premise data warehouse to a cloud-based solution (Snowflake, AWS Redshift). Designed the new schema, optimized queries, and ensured data integrity during the migration process. Automated the migration using Python scripts and AWS Lambda.

Impact: Reduced infrastructure costs and improved query performance, enabling faster access to data for business teams.
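Ensuring data integrity during a migration like this typically means fingerprinting each table on both sides and comparing. A sketch using sqlite3 as a stand-in for the real source/target warehouses (table and column names are hypothetical):

```python
import sqlite3

def table_fingerprint(conn, table):
    """Row count plus an order-insensitive checksum of all rows, used
    to verify that source and target agree after a migration. sqlite3
    stands in here for Snowflake/Redshift connections."""
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    return len(rows), sum(hash(r) for r in rows) & 0xFFFFFFFF

def migration_ok(src, dst, table):
    return table_fingerprint(src, table) == table_fingerprint(dst, table)
```

Any missing, extra, or altered row changes the count or checksum, so a mismatch surfaces immediately instead of during downstream reporting.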

Tools: Snowflake, AWS Redshift, Python, SQL, AWS Lambda.

Server Log Analytics System:

Built a pipeline to process and analyze server log data using Apache Hadoop (HDFS, MapReduce) and Apache Spark. Stored the processed data in Elasticsearch for real-time search and visualized it in Kibana.

Impact: Enabled efficient monitoring of server performance, reducing downtime and improving system reliability.
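The map-and-count core of a log analytics job like this, collapsed into one stdlib pass: parse Common Log Format lines with a regex and tally HTTP status codes (in the real system the same logic ran distributed via MapReduce/Spark before indexing into Elasticsearch):

```python
import re
from collections import Counter

# Common Log Format: ip ident user [timestamp] "request" status size
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def status_counts(lines):
    """Tally HTTP status codes from log lines; a single-process stand-in
    for the distributed map/reduce step described above."""
    counts = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m:
            counts[m.group("status")] += 1
    return counts
```

A sudden rise in 5xx counts is the kind of signal that, surfaced in Kibana, reduced the downtime mentioned above.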

Tools: Hadoop (HDFS, MapReduce), Spark, Elasticsearch, Kibana, Python.

Synergy Data INC Texas, USA July 2024 – Present

Role: Software Analyst - II

Key Responsibilities:

E-Commerce Data Pipeline:

Created a data pipeline to collect and analyze e-commerce transaction data from multiple sources, including web logs and databases. Used Apache NiFi for data ingestion, Apache Spark for processing, and Tableau for visualization.

Impact: Provided actionable insights into customer behavior, supporting targeted marketing strategies and improving sales performance.

Tools: Apache NiFi, Spark, Tableau, SQL, Python.

Data Quality Monitoring Platform:

Developed a data quality monitoring system to ensure the accuracy and reliability of data across multiple sources. Used Great Expectations for data validation and Apache Airflow for scheduling and monitoring data quality checks. Integrated with Slack for real-time alerts on data quality issues.

Impact: Improved the reliability of business reports by reducing data errors and enhancing the ability to identify and resolve issues.
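The idea behind such a platform can be sketched without the Great Expectations API itself: run named column-level expectations over rows and report failures, which is what would feed the Airflow-scheduled checks and Slack alerts (the expectation names and row schema are illustrative):

```python
def check_expectations(rows, expectations):
    """Run simple named expectations (in the spirit of Great
    Expectations, not its actual API) over a list of dict rows.
    Returns (expectation_name, failing_row_count) pairs, which a
    scheduler could turn into alerts."""
    failures = []
    for name, predicate in expectations.items():
        bad = [r for r in rows if not predicate(r)]
        if bad:
            failures.append((name, len(bad)))
    return failures
```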

Tools: Great Expectations, Apache Airflow, Slack API, Python, SQL.

Automated Customer Enrollment Portal:

Built a customer enrollment portal using Python and DB2 to manage and maintain bank account details for new customers. The portal automated the onboarding process, reducing manual data entry and errors. Integrated with existing banking systems to ensure data consistency and security.

Impact: Improved the efficiency of customer onboarding, reducing manual effort and errors while enhancing customer experience.

Tools: Python, DB2, SQL, Flask.
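Reducing manual data-entry errors in a portal like this comes down to validating the enrollment form before anything reaches the accounts database. A hedged sketch of such a validator (field names and the account-number rule are hypothetical, not the bank's actual schema):

```python
import re

def validate_enrollment(form):
    """Validate a new-customer enrollment form before insertion into
    the accounts database; returns a list of field errors. The fields
    and formats here are illustrative stand-ins."""
    errors = []
    if not form.get("name", "").strip():
        errors.append("name is required")
    if not re.fullmatch(r"\d{9,18}", form.get("account_number", "")):
        errors.append("account_number must be 9-18 digits")
    if "@" not in form.get("email", ""):
        errors.append("email is invalid")
    return errors
```

In a Flask route this list would be returned to the user on failure, so malformed records never reach DB2.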

Social Media Analytics Dashboard:

Designed and implemented an analytics dashboard to monitor and analyze social media metrics (engagement, reach, impressions) for a marketing team. Used Apache Kafka for data streaming, Apache Spark for processing, and Power BI for visualization. Data was ingested from multiple social media APIs (Twitter, Facebook) and stored in a cloud-based data warehouse (AWS Redshift).

Impact: Enabled the marketing team to monitor social media performance effectively, allowing for quick adjustments to campaigns and improving overall campaign effectiveness.
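The headline number on a dashboard like this is usually engagement rate, computed per post before aggregation. A minimal sketch (the post fields are illustrative of what social APIs return, not the exact payloads):

```python
def engagement_rate(post):
    """Engagement rate = total engagements / impressions, the per-post
    metric a dashboard aggregates; guards against divide-by-zero for
    posts with no recorded impressions."""
    engagements = (post.get("likes", 0)
                   + post.get("comments", 0)
                   + post.get("shares", 0))
    impressions = post.get("impressions", 0)
    return engagements / impressions if impressions else 0.0
```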

Tools: Kafka, Spark, Power BI, AWS (Redshift, S3), Python, SQL.

Technical Expertise:

BI Tools: Power BI, Tableau, Excel Power Query

Languages: Python, C

ETL Tools: SSIS, Talend, Informatica

Hadoop Ecosystem: HDFS, Hive, Pig, MapReduce

Big Data Frameworks: Apache Spark, HBase

GIT: Branching, Merging, GitLab, GitHub

Database Systems: SQL, MySQL, PostgreSQL, Microsoft SQL Server

Data Modeling: Star Schema, Snowflake Schema, ERD

Cloud & Tools: Azure Data Studio, AWS (Fundamentals), Jupyter Notebook

INTERNSHIP:

BHARAT SANCHAR NIGAM LIMITED (BSNL) Vijayawada, India Jan 2019 – Jan 2020

Role: Telecommunications Intern

Key Responsibilities:

• Assisted in the configuration, troubleshooting, and maintenance of network equipment including routers, switches, and modems.

• Conducted analysis of network performance data to identify trends and areas for improvement, contributing to enhanced service quality.

• Monitored telecommunication systems and networks to ensure optimal performance and to quickly address any issues.

• Supported various telecommunication projects, including the implementation of new technologies and the upgrade of existing systems.

• Prepared and maintained detailed documentation of network configurations, procedures, and troubleshooting steps.

• Assisted in fieldwork for setting up and testing telecommunication equipment at various sites.

• Provided technical support to customers, helping to resolve issues related to network connectivity and service disruptions.

EDUCATION:

• Bachelor’s in Electronics and Communication Engineering, Prasad V Potluri Siddhartha Institute of Technology, Vijayawada, India.

• Master’s in Computer Science, Fitchburg State University, Fitchburg, MA, USA.

CERTIFICATIONS:

• Introduction to the Internet of Things and Embedded Systems, Coursera (offered by the University of California), May 2019.

• The Arduino Platform and C Programming May 2019.

• The Raspberry Pi Platform and Python Programming for the Raspberry Pi May 2019.

• Programming for the Internet of Things May 2019.

• Business Intelligence using Power BI Feb 2024.

• ChatGPT and AI Tools Feb 2024.


