
Senior Data Analyst

Location:
Hyderabad, Telangana, India
Posted:
October 15, 2025


Resume:

SaiKumar Reddy Puchakayala

Montgomery, AL ************@*****.*** LinkedIn 656-***-****

PROFESSIONAL SUMMARY

Senior Data Analyst and Community Associate – Intelligence Analyst with 5+ years of experience delivering end-to-end data analysis, investigative research, and integrity assessments across the public and healthcare sectors. Proven ability to assess complex data sets, detect anomalies, and deliver actionable intelligence supporting law enforcement and compliance initiatives.

Expert in conducting background investigations, compiling and analyzing research findings, preparing detailed narrative intelligence reports, and building case files to support regulatory decisions. Adept at working with proprietary tools and city databases such as LexisNexis, CLEAR, ACRIS, and PASSPort.

Highly proficient in leveraging Microsoft Office Suite, conducting online and database searches, and preparing case documentation aligned with regulatory requirements. Experienced in supporting law enforcement liaisons, regulatory compliance teams, and legal analysts through comprehensive intelligence briefs and cross-agency collaboration.

Known for superior organizational skills, attention to detail, and critical thinking in reviewing documentation, validating identities, and analyzing patterns related to corruption, fraud, and organized crime.

Possess strong communication skills and the ability to interact with multi-level stakeholders including law enforcement and regulatory authorities. Experienced in developing scalable intelligence processes and investigative protocols.

Passionate about public service integrity, information transparency, and supporting a collaborative environment that safeguards regulatory and criminal justice standards.

TECHNICAL SKILLS

Programming & Scripting: Python (Pandas, NumPy, PySpark), SQL, PL/SQL, Excel VBA

Automation & Scripting: PowerShell, Bash, Azure CLI, Python

Data Engineering: Microsoft Fabric (conceptual), Databricks (Delta Lake, Workflows), Azure Data Factory, Airflow, Spark, DBT

Data Modeling: Kimball Dimensional Modeling (Star & Snowflake schemas), Power BI Semantic Models, DAX

Databases: MySQL, PostgreSQL, BigQuery, Redshift, NoSQL (MongoDB)

Microsoft Stack: SQL Server, SSIS, ActiveBatch (familiarity)

Graph Databases: Neo4j (familiarity), Gremlin (concepts)

Big Data Tools: Spark, Hadoop, Hive, EMR, Kafka

Streaming & Event Processing: Kafka, Spark Streaming, Flink (concepts), Kinesis (familiarity)

Snowflake Ecosystem: Snowpark (Data Processing), Snowpipe, Warehouse Optimization

Cloud Platforms: Azure (Fabric, Data Lake, DevOps), AWS (S3, Redshift, Lambda, Glue), GCP (BigQuery), Snowflake

Cloud Infrastructure: Azure (IaaS, PaaS, SaaS), Azure AD, Azure Monitor, Log Analytics, Resource Groups, ARM Templates, AWS (EC2, S3, Lambda, CloudWatch)

Containerization: Docker, Kubernetes (basic)

CI/CD & DevOps: Azure DevOps (Pipelines, Branching, Release), GitHub Actions, Git, Jenkins, CI/CD Automation

Data Integration & APIs: REST API Data Ingestion, JSON/XML Parsing, Requests Library

Data Governance: Metadata Management, Data Lineage Tracking, Data Quality Frameworks, Quality Automation, Access Control

Pipeline Monitoring & Optimization: Logging, Error Handling, Job Scheduling, ETL Performance Tuning, Failure Recovery

Monitoring & Maintenance: Azure Monitor, Grafana, CloudWatch, Alerting, Performance Tuning, Patch Management

Visualization: Power BI, Tableau (basic), Grafana (familiarity), Excel Dashboards

Security & Compliance: Data Encryption, Secure Data Transfer (SFTP, HTTPS), Audit Logging, Role-Based Access Control (RBAC), Regulatory Access Controls, Regulatory Standards, Data Privacy

Compliance & Risk Analytics: Regulatory Data Review, Fraud Detection

Industrial Systems: SCADA (concepts), Test Data Integration, Sensor Data Streams

Systems & Servers: Microsoft Windows Server, Active Directory, Group Policy Objects (GPO), O365 Administration, Virtualization (Azure Local, Hyper-V)

Disaster Recovery & Backup: Azure Recovery Services Vault, Snapshot Management, Version Control, System Restoration

ITIL / ITSM: Incident Management, Problem Escalation, Service Delivery, Change Control

Office Suite Proficiency: Microsoft Word, Excel, Outlook, Teams

Analytical Reasoning: Criminal Pattern Detection, Root Cause Analysis

Communication: Stakeholder Liaison, Report Presentation

Time Management: Case Tracking, Deadline-Driven Execution

Detail Orientation: Document Review, Data Verification

Self-Motivation: Independent Case Handling

Database Research: Online Search, Internal Systems Navigation

Case Management: File Updates, Status Logs

Documentation Standards: Report Formatting, Record Maintenance

Narrative Writing: Clear, Concise Analytical Writing

PROFESSIONAL EXPERIENCE

Data Engineer, Citi Bank, USA (May 2023 – Present)

Project: Insurance Operations Analytics

Managed Azure IaaS environments, provisioning and maintaining virtual machines, resource groups, and storage accounts for production workloads.

Supported Azure AD (Active Directory) for user provisioning, access control, and group policy management.

Monitored system health using Azure Monitor, Log Analytics, and Grafana, ensuring uptime and alert-based remediation.

Automated VM backup and restoration workflows using Azure Recovery Services Vault and PowerShell scripts.

Implemented and tested disaster recovery procedures and documented system recovery runbooks.

Participated in 24/7 on-call rotation, responding to high-priority incidents and ensuring minimal downtime for business-critical systems.

Maintained and supported O365 collaboration tools (Teams, SharePoint, Outlook) for internal departments.

Applied patch management and lifecycle maintenance for cloud resources and on-prem hybrid systems.

Supported project delivery and system enhancements by coordinating with cross-functional DevOps and security teams.

Collaborated with engineering teams to simulate CAN and TCP/IP data ingestion, validating pipeline readiness for high-frequency sensor data.

Contributed to DevOps practices (CI/CD, IaC) for scalable deployment of data assets.

Supported projects involving data governance, encryption, and lineage tracking.

Extracted and verified regulatory information using SQL and proprietary tools.

Developed reports identifying suspicious patterns related to financial activity.

Prepared compliance-ready documentation aligned with internal governance.

Integrated Snowflake for optimized data storage and analytics, implementing schema models for scalability.

Collaborated with DevOps to deploy automated pipelines through Azure Data Factory and AWS Glue.

Implemented API data ingestion scripts using Python for automated data capture and validation.
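
A minimal sketch of the capture-and-validate pattern described above. The endpoint, field names (record_id, amount), and validation rules are hypothetical, not the actual Citi systems or schemas:

```python
import json
from urllib.request import urlopen  # would be used for live API fetches; not exercised here

# Hypothetical required schema for an ingested record.
REQUIRED_FIELDS = {"record_id", "amount"}

def validate_records(payload: str) -> list[dict]:
    """Parse a JSON array and keep only records carrying the required fields."""
    records = json.loads(payload)
    valid = []
    for rec in records:
        # Keep records that have every required field and a numeric amount.
        if REQUIRED_FIELDS <= rec.keys() and isinstance(rec["amount"], (int, float)):
            valid.append(rec)
    return valid

sample = '[{"record_id": 1, "amount": 250.0}, {"record_id": 2}]'
print(len(validate_records(sample)))  # 1 -- only the complete record passes
```

In a real pipeline the payload would come from `urlopen` (or the requests library) and rejected records would be logged rather than silently dropped.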

Monitored ETL pipeline performance in production, improving load times by 20%.

Coordinated with security and legal teams for cross-functional case updates.

Reviewed and analyzed client data for regulatory inconsistencies.

Led sessions addressing potential fraud indicators and mitigation.

Maintained confidential case logs and documented all investigative actions.

Communicated risk analysis findings to executive stakeholders.

Produced detailed analytical narratives for compliance officers.

Supported incident escalation workflows with analytical backup.

Collaborated with third-party investigators on shared risk profiles.

Conducted validation audits of regulatory record updates.

Streamlined case tracking through Excel and reporting tools.

Assisted in data collection efforts for law enforcement response teams.

Designed and standardized CI/CD pipelines for ETL workflows using Azure DevOps, Git, and Jenkins.
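
A standardized ETL pipeline of this shape can be sketched as an `azure-pipelines.yml` skeleton. The stage, job, and script names below are illustrative placeholders, not the actual pipeline definition:

```yaml
# Hypothetical azure-pipelines.yml skeleton: validate on every push to main,
# then deploy only if validation succeeds.
trigger:
  branches:
    include: [main]

stages:
  - stage: Validate
    jobs:
      - job: Lint_And_Test
        pool: { vmImage: ubuntu-latest }
        steps:
          - script: pip install -r requirements.txt
          - script: pytest tests/
  - stage: Deploy
    dependsOn: Validate
    jobs:
      - job: Release_ETL
        steps:
          - script: python deploy_etl.py   # placeholder deployment step
```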

Contributed to DBT model development and administration for data transformation and automation in Snowflake.

Leveraged Python and AI-based automation for metadata management, code validation, and data profiling.

Mentored analysts and junior engineers on ETL best practices, code review, and platform standards.

Collaborated with cross-functional teams to drive Data Engineering Standards CoE initiatives and process improvements.

Key Achievements:

Reduced financial compliance incident review time by 40%.

Created analytic templates used across 4 internal units.

Increased fraud pattern detection accuracy with enhanced tools.

Developed new escalation protocol adopted by compliance team.

Technologies Used: LexisNexis, SQL, Excel, Outlook, Compliance Reporting Tools, Narrative Writing, Risk Review Templates, Fraud Detection, Regulatory Compliance, Stakeholder Briefings, Data Verification, Incident Escalation, Pattern Analysis, Team Collaboration, Documentation Protocols, Microsoft Teams, CLEAR, Case Audit Support, Compliance Analytics, Secure Data Handling.

Data Analyst, NextGen Healthcare, India (Jan 2020 – Dec 2022)

Project: Payer Analytics & Fraud Monitoring

Built automated ETL pipelines for time-series and event-based healthcare device data, simulating hardware telemetry integration.

Implemented data quality and anomaly detection checks for sensor-like data streams using Python and SQL triggers.
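
One common form of such a check is a trailing-window z-score test, sketched below with the standard library. The window size and threshold are illustrative; the production checks also ran as SQL triggers:

```python
from statistics import mean, stdev

def flag_anomalies(values, window=5, threshold=3.0):
    """Flag points more than `threshold` std devs from the trailing window mean."""
    flagged = []
    for i in range(window, len(values)):
        ref = values[i - window:i]          # trailing reference window
        mu, sigma = mean(ref), stdev(ref)
        if sigma > 0 and abs(values[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

readings = [10.0, 10.2, 9.9, 10.1, 10.0, 42.0, 10.1]
print(flag_anomalies(readings))  # [5] -- the 42.0 spike
```

Note the spike then inflates the window's standard deviation, which is why production systems usually exclude flagged points from subsequent reference windows.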

Deployed AWS data storage and streaming architectures supporting analytics on millions of data points per day.

Managed Azure Virtualization hosts (Hyper-V / Azure Local) for healthcare data analytics workloads.

Deployed and maintained PaaS and SaaS-based environments, including Azure SQL, App Services, and secure API gateways.

Developed and followed standard operating procedures (SOPs) and ITIL-based change management for production updates.

Conducted cloud system performance reviews to optimize compute utilization and reduce cost by 15%.

Configured system-level monitoring and alerts through Azure Monitor and Service Health for proactive incident response.

Created real-time visualizations in Tableau and Grafana for equipment metrics and validation dashboards.

Containerized ingestion services using Docker to ensure scalability and fault tolerance.

Conducted data validation and calibration routines to improve pipeline precision for device monitoring systems.

Verified medical claim records for inconsistencies and legal compliance.

Built intelligence reports for internal regulatory reviews.

Coordinated with data integrity officers on pattern identification.

Reviewed operational files for potential violations or anomalies.

Created dashboards for regulatory case assessments.

Analyzed healthcare data to identify fraud patterns.

Provided case analysis summaries to investigative leads.

Supported stakeholder communications on regulatory file statuses.

Led workshops on compliance-focused data documentation.

Maintained confidential logs of patient record reviews.

Validated internal systems for adherence to healthcare laws.

Partnered with legal and audit teams for periodic investigations.

Authored case write-ups for law enforcement handoffs.

Built Microsoft Excel templates for document analysis automation.

Processed and transformed large healthcare datasets using PySpark and SQL for scalable data warehousing.
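
The warehouse-style SQL transformation referred to above is sketched here with the standard-library sqlite3 module rather than PySpark, for self-containment. The table and column names (claims, payer, amount) are invented for the example:

```python
import sqlite3

# In-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (payer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?)",
    [("A", 100.0), ("A", 250.0), ("B", 75.0)],
)

# Warehouse-style rollup: total claim amount per payer.
rows = conn.execute(
    "SELECT payer, SUM(amount) FROM claims GROUP BY payer ORDER BY payer"
).fetchall()
print(rows)  # [('A', 350.0), ('B', 75.0)]
```

The same GROUP BY pattern translates directly to Spark SQL over partitioned healthcare datasets.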

Supported migration of analytics dashboards to AWS Redshift and Power BI for centralized reporting.

Monitored data streaming from Kafka topics for fraud monitoring pipelines (proof-of-concept).

Conducted validation testing and logging on Linux-based environments for data reliability.

Cloud & Orchestration: AWS (S3, Glue, Lambda), Databricks, Airflow, Azure Data Factory

Testing & Validation: PyTest, Automated Regression Scripts, Unit & Integration Testing

Languages & Frameworks: Python, Java (familiarity), Spark, Spring Boot (concepts)

Key Achievements:

Boosted document audit success rate by 30%.

Delivered compliance reporting toolkits across 2 regional offices.

Developed dashboards that identified 50+ fraud triggers.

Presented case findings to multi-agency review board.

Technologies Used: LexisNexis, Excel, Microsoft Word, SQL, Healthcare Compliance Tools, Case Management Systems, Fraud Analytics, Narrative Reports, ACRIS, PASSPort, Online Search Techniques, Data Review Templates, Internal Audits, Data Integrity Reports, Pattern Identification, Background Verification, Report Automation, Communication Protocols, Law Enforcement Support, Case File Summaries.

EDUCATION

Master’s in Computer Science – Campbellsville University, USA

Bachelor’s Degree – Guru Nanak Institutions, India


