Name: Likitha Dakkili
Contact: 347-***-****
Email: **************@*****.***
LinkedIn: https://www.linkedin.com/in/likitha-dakkili-9318b4239/
PROFESSIONAL SUMMARY:
7+ years of experience as a Data Analyst, specializing in data visualization, reporting, and business intelligence across the finance, retail, and cloud security domains.
Extensive experience creating interactive dashboards, metric-driven reports, and executive-level insights using Tableau, Grafana, Google Data Studio, and Amazon QuickSight.
Capable of creating, automating, and refining ETL pipelines with AWS Glue and GCP Dataflow, Dataproc, and Dataprep to provide precise, timely, and clean datasets for analytics and reporting.
Demonstrated capacity to deliver real-time reporting solutions with QuickSight SPICE, Cloud Pub/Sub, and Cloud Composer, allowing proactive monitoring of risk metrics and business processes.
Strong SQL, Python (Pandas, NumPy), and Snowflake expertise for data extraction, transformation, and metric calculations, supporting dashboards and audit-ready reporting.
Proficient in monitoring, alerting, and pipeline orchestration using Cloud Monitoring, CloudWatch, Jenkins, Sh-Airflow, and SNS to ensure pipeline reliability and system health.
Skilled in implementing automated reporting workflows and dashboards for operational and ML use cases, driving efficiency, visibility, and data-driven decision making.
Skilled in risk management analytics for financial services, including tech risk operations, FS resiliency and recovery, and SCAN tool automation for sensitive data compliance.
Practical knowledge of centralizing and integrating data from many sources, such as MySQL, MongoDB, Aurora RDS, DynamoDB, S3, OneLake, and warehouse systems.
Collaborative professional with experience working under Agile/Scrum methodologies, partnering with cross-functional teams to translate business requirements into actionable dashboards, reports, and analytics solutions.
Experienced with version control systems such as GitHub and Bitbucket.
TECHNICAL SKILLS:
GCP: Cloud Dataproc, Cloud Dataflow, BigQuery, Cloud Storage, Google Data Studio, Cloud Functions, Cloud Monitoring, Cloud Key Management Service, Cloud Pub/Sub, Cloud Composer.
AWS: IAM, EC2, EMR, S3 buckets, RDS, QuickSight, Lambda, SNS, CloudFormation (CFT), Glue, CloudWatch, CodePipeline, Secrets Manager, Redshift.
Visualization: Tableau, Power BI, Grafana.
Databases: MySQL, Oracle, MongoDB.
Other technologies: ER/Studio, Sh-Airflow, Splunk, Jenkins, Python, Snowflake.
WORK EXPERIENCE:
CLIENT: Discover Financial Services, Chicago IL Mar 25 – Present
ROLE: Data Analyst
Project Description: Discover is a financial services corporation offering credit cards, personal loans, auto loans, and other products. The Managed AU Page team manages and analyzes data for Discover’s Authorized User (AU) page, ensuring the accuracy of AU information through SQL- and Python-driven data validation and reconciliation.
RESPONSIBILITIES:
Maintained and improved 10+ Amazon QuickSight dashboards that tracked certification compliance across the whole enterprise and provided manager- and LOB-level performance insights with 99% data accuracy.
Collaborated with 50+ engineering leads to validate database access and permissions, ensuring 100% adherence to internal security and compliance policies.
Used Cloud Radar to identify and catalog active AWS resources, such as RDS and DynamoDB instances, to support thorough Application Profile Reviews (APRs).
Automated data ingestion to QuickSight by uploading weekly Git-based updates, reducing dashboard refresh latency by 40% and enabling real-time APR status monitoring (a refresh-automation sketch follows this role's tool stack).
Conducted access reviews across native applications, identifying and mitigating 30+ SoD violations and toxic permission combinations.
Collaborated with multiple teams to onboard application databases to CloudSentry, improving the ETIP metric from 90% to 96% through enhanced secure access and compliance tracking.
Developed and maintained ETIP reporting dashboards to monitor the progress of CSDB onboarding, offering insight into team adoption and achieving 99% ETIP compliance through focused follow-ups.
Developed and improved APR dashboards (Remediation, Historical Trends, and Peer Reviewer Insights) to strengthen reporting transparency and audit preparedness across all LOBs; these dashboards track compliance status, remediation deadlines, and reviewer performance.
Tool Stack: AWS IAM, S3, Aurora RDS, DynamoDB, Neptune, QuickSight, SQL, Python, Grafana, Snowflake, Tableau, Agile (Scrum).
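A minimal sketch of the kind of QuickSight refresh automation described in this role, assuming boto3 and placeholder account/dataset IDs (not Discover values); create_ingestion starts a SPICE refresh so dashboards pick up the latest data:
    import uuid
    import boto3

    quicksight = boto3.client("quicksight", region_name="us-east-1")
    ACCOUNT_ID = "123456789012"        # placeholder AWS account ID
    DATASET_ID = "apr-status-dataset"  # hypothetical SPICE dataset ID

    def refresh_apr_dashboard_data() -> str:
        """Start a SPICE ingestion so dashboards reflect the latest APR data."""
        ingestion_id = str(uuid.uuid4())
        quicksight.create_ingestion(
            AwsAccountId=ACCOUNT_ID,
            DataSetId=DATASET_ID,
            IngestionId=ingestion_id,
        )
        return ingestion_id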
CLIENT: UBS, Weehawken NJ Jun 23 – Feb 25
ROLE: Data Analyst
Project Description: UBS is a financial services corporation. I worked predominantly on managing and analyzing data for UBS Online Services, ensuring the accuracy and integrity of client transaction and account information through SQL- and Python-based validation. Collaborated with cross-functional teams to monitor system performance, automate reports, and implement improvements for an enhanced digital client experience.
RESPONSIBILITIES:
Developed and maintained interactive dashboards in Tableau and Google Data Studio to monitor client accounts, transaction trends, and digital service adoption, enabling data-driven decision-making.
Designed and optimized ETL pipelines using Python, SQL, and GCP Dataflow to prepare structured and unstructured client and transaction data for analytics and reporting.
Centralized transactional, account, and customer data from Oracle, MySQL, and internal banking systems into a secure Cloud Data Lake, ensuring consistent and reliable reporting.
Automated data ingestion and refresh workflows with Cloud Pub/Sub and Cloud Composer, enabling near real-time updates to dashboards and reporting pipelines for online services (see the Pub/Sub sketch after this list).
Built metric-driven reports for risk and operations teams to monitor digital service usage, detect anomalies, and improve client experience.
Implemented pipeline deployment automation using Cloud Build, ensuring dashboards and reports always reflect the latest financial and operational data with minimal manual intervention.
Monitored data pipelines and reporting workflows using Cloud Monitoring, providing actionable alerts and reducing downtime for client-facing digital reporting.
Enforced data security and access controls using GCP Key Management Service (KMS) and internal banking compliance protocols to protect sensitive client information.
Collaborated with cross-functional teams under Agile Scrum, translating business requirements into actionable visualizations and reports for operational, compliance, and product stakeholders.
Delivered reporting solutions and dashboards that improved visibility into online service usage, enabling UBS teams to proactively manage client needs and optimize digital operations.
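A minimal sketch of the Pub/Sub-driven refresh workflow referenced above, using the google-cloud-pubsub client; the project and topic names are hypothetical, not UBS identifiers:
    import json
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    # Hypothetical project and topic names, for illustration only.
    topic_path = publisher.topic_path("analytics-project", "dashboard-refresh")

    def notify_refresh(table_name: str) -> None:
        """Publish a refresh event that downstream Composer jobs subscribe to."""
        payload = json.dumps({"table": table_name, "action": "refresh"}).encode("utf-8")
        future = publisher.publish(topic_path, payload)
        future.result()  # wait for the broker to acknowledge the publish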
Tool Stack: Tableau, Google Data Studio, Python, SQL, GCP Dataflow, Cloud Storage Data Lake, Cloud Pub/Sub, Cloud Composer, Cloud Build, Oracle, MySQL.
CLIENT: Capital One, Plano TX Sep 22 – May 23
ROLE: Data Analyst
Project Description: Capital One is a financial corporation offering personal loans, auto loans, credit cards, and deposit products. The Loan Team focuses on ensuring accurate and efficient loan operations while protecting sensitive customer data. Automated manual SCAN tool processes to reduce BAU tasks, improve data accuracy, and enhance operational efficiency.
RESPONSIBILITIES:
Developed a SCAN tool to identify potential sensitive loan data (HSHD) in OneLake and AWS S3 datasets.
Integrated loan data from S3 into the SCAN tool to automate manual record maintenance previously done in Google Sheets.
Wrote Python scripts to detect sensitive loan data via pattern matching across OneLake and S3 buckets (a minimal pattern-matching sketch follows this list).
Used AWS Glue crawlers to store SCAN tool results in S3 and load into Snowflake for metric calculation.
Automated status updates in the SCAN UI using built-in connectors in AWS Lambda.
Built interactive Grafana dashboards to provide insights on SCAN case metrics and status.
Automated email notifications to dataset owners twice a week using SNS topics and subscribers.
Calculated SCAN tool metrics using complex SQL queries in Snowflake.
Developed QuickSight visualizations to share metrics with senior management.
Used Apache Airflow (Sh-Airflow) for scheduling, orchestrating, and monitoring data workflows and DAG deployment.
Managed sensitive credentials with Chamber of Secrets for secure storage and retrieval of database and API keys.
Implemented CI/CD pipelines using Jenkins for deployment and testing of data pipelines.
Developed and maintained Splunk dashboards and alerts to monitor system health and detect anomalies, reducing downtime.
Collaborated with dataset owners, cross-functional teams, PDS, and developers in an Agile Scrum environment.
Developed and maintained documentation of data pipelines and data architecture in Confluence and Lucid Charts.
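A minimal sketch of the pattern-matching approach referenced above; the regexes are generic illustrations, since the actual HSHD definitions are internal:
    import re

    # Illustrative patterns only; the real HSHD rules are internal.
    SENSITIVE_PATTERNS = {
        "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "card_number": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
    }

    def scan_record(record: str) -> list:
        """Return the names of sensitive-data patterns found in one record."""
        return [name for name, pattern in SENSITIVE_PATTERNS.items()
                if pattern.search(record)]

    # Example: scan_record("card 4111-1111-1111-1111") -> ["card_number"]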
Tool Stack: Snowflake, AWS S3 buckets, RDS, Grafana, QuickSight, Lambda, SNS, Glue jobs, Agile, SQL, Python, OneLake, Jenkins, Sh-Airflow, Splunk.
CLIENT: APPLE, Austin TX Mar 20 – Jul 22
ROLE: Software Developer / Data Analyst
Project Description: Worked on an internal Apple application designed to monitor employee activity and detect potential security threats. As a Software Developer transitioning to Data Analyst, I ingested, processed, and visualized security log data, building dashboards and automated alerts to support real-time threat monitoring and operational decision-making.
RESPONSIBILITIES:
Developed data pipelines in Glue to process, clean, and transform logs related to malware, unauthorized access, and phishing for analysis.
Integrated processed data from S3 to Amazon QuickSight, creating a unified view for stakeholders.
Built real-time dashboards using QuickSight SPICE, visualizing potential security threats detected by machine learning models.
Collaborated with security teams to validate and refine threat detection algorithms based on operational feedback.
Monitored and optimized dashboard and pipeline performance using AWS CloudWatch, ensuring reliable and timely threat detection (a custom-metric sketch follows this list).
Defined and deployed AWS resources and configurations using CloudFormation Templates (CFT).
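A minimal sketch of the CloudWatch monitoring referenced above, assuming a hypothetical custom namespace and metric name rather than the application's actual values:
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-west-2")

    def report_pipeline_lag(lag_seconds: float) -> None:
        """Emit a custom metric that a CloudWatch alarm can watch for delays."""
        cloudwatch.put_metric_data(
            Namespace="SecurityLogPipeline",  # hypothetical namespace
            MetricData=[{
                "MetricName": "ProcessingLagSeconds",
                "Value": lag_seconds,
                "Unit": "Seconds",
            }],
        )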
Tool Stack: AWS Glue, Amazon S3, Amazon QuickSight, SPICE, CloudFormation Templates, Amazon SNS, CloudWatch, Python, SQL.
CLIENT: United Technologies, India Jun 18 – Dec 19
ROLE: Software Developer
RESPONSIBILITIES:
Developed and maintained Java-based backend services for the internal employee monitoring application, ensuring secure and efficient processing of log data.
Integrated APIs to ingest security logs into backend systems and implemented data validation and transformation logic using Java.
Designed and optimized Java modules for alerting and notification workflows, enabling timely detection of potential security threats.
Collaborated with cross-functional teams to enhance application features, troubleshoot issues, and ensure seamless integration with AWS services and dashboards.