SIVANAGARAJU DATA ANALYST
+1-832-***-**** ***********@*****.*** linkedin.com/in/sivanagaraju09
SUMMARY
●5+ years of experience in data analytics, business analytics, database development, and ETL design, specializing in financial analytics, operational reporting, data modeling, dashboarding, KPI tracking, and scalable cloud-based data solutions using modern tools and platforms.
●Skilled in Python, SQL, PL/SQL, and PySpark, with expertise in writing complex queries, procedures, functions, triggers, indexes, and views for large-scale data.
●Proficient in cloud platforms including AWS (EC2, S3, Redshift, Lambda, Glue, CloudFormation) and Azure (Data Factory, Data Lake Gen2, Microsoft Fabric), enabling end-to-end development, orchestration, automation, and optimization of scalable, high-performance data pipelines and cloud analytics solutions.
●Experienced in data warehousing, ETL/ELT processes, and database design/normalization, ensuring high-performance, scalable, and reliable analytics solutions; designing optimized data models, integrating and transforming data from diverse sources, and enabling actionable insights through structured reporting and BI tools.
●Hands-on experience with Snowflake, Databricks, SQL Server, and DBT, implementing efficient transformations and building robust data models.
●Proficient in Python libraries for advanced analytics and machine learning (NumPy, Pandas, Matplotlib, Seaborn, SciPy, Scikit-learn, TensorFlow) to support statistical modeling, predictive analytics, and data-driven decision-making.
●Strong background in data visualization and business intelligence using Tableau, Power BI, and advanced Excel, including dashboard development, advanced DAX, KPI/KRI tracking, performance optimization, and analytics for financial domains.
●Domain expertise in financial analytics across banking and pharmacy supply chain, including capital and liquidity risk reporting, underwriting model evaluation, FDIC regulatory compliance, fraud detection, reconciliations, and audit support.
TECHNICAL SKILLS
Programming & Data Processing: Python, R, PySpark, SQL
ETL Tools: Azure Databricks, Microsoft Fabric Notebooks, Azure Data Factory, Snowflake
Reporting Tools: Power BI, Tableau, PowerPoint, Advanced Excel
Database/Cloud Technology: Azure, Microsoft Fabric, MySQL, SQL Server, Snowflake
AI & Other Tools: GitHub, Workiva, Jira, Confluence, Copilot Bot Automation, A/B Testing
Finance Domain Knowledge: Financial Statements, Banking Operations, Claims Analytics, Risk Management, Audit Processes, Key Risk Indicators (KRIs), Key Performance Indicators (KPIs), Regulatory Compliance (Basel, FASB, IFRS), Loan Underwriting & Evaluation, Fraud Detection, Financial Reporting, Reconciliation & Controls
Core Competencies & Soft Skills: Stakeholder Communication, Problem Solving, Critical Thinking, Data-Driven Decision Making, Operational Efficiency, Business Process Improvement.
PROFESSIONAL EXPERIENCE
AT&T | Data Analyst | Austin, TX | August 2023 – Present
●Collaborated closely with business users, compliance, and data governance teams to gather requirements, enforce data standards, and ensure accurate reporting, while implementing best practices for data quality, validation, and governance across enterprise-scale datasets.
●Designed, implemented, and optimized scalable ETL pipelines using Azure Data Factory, Microsoft Fabric, and Databricks with Python, PySpark, and SQL, consolidating and transforming enterprise-scale financial data.
●Performed data preprocessing, complex transformations, and validation in Fabric Notebooks and Databricks, ensuring high-quality, consistent, and reliable analytics.
●Automated hourly, daily, and monthly data refresh schedules in Microsoft Fabric Pipelines and Notebooks, ensuring timely reporting across multiple departments.
●Developed Python scripts to extract and process data from REST APIs into Azure Data Lake via Data Factory, enhancing workflow efficiency and system integration.
●Built Copilot automation solutions, including KPI and process-automation bots, streamlining compliance, regulatory, and audit operations and saving up to one week of manual effort.
●Implemented ETL and R scripts within Microsoft Fabric to automate computation and tracking of 30+ KPIs, integrating results with Power BI and other BI tools for enhanced risk reporting, real-time monitoring, and actionable insights that improved regulatory compliance and operational efficiency.
●Developed and deployed machine learning models using Autoencoders and Isolation Forests to detect anomalies in journal entries, payments, and claims, enhancing fraud detection, improving data accuracy, and supporting risk mitigation strategies across financial operations.
●Leveraged Snowflake on Azure for ETL, cleaning, transformation, and large-scale data analysis, ensuring data integrity and enabling advanced analytics.
●Created interactive dashboards and reports in Power BI and Tableau, transforming complex financial datasets into actionable insights, enabling KPI monitoring, trend analysis, performance tracking, and data-driven decision-making across multiple business units.
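A minimal sketch of the Isolation Forest anomaly-detection approach described above, using scikit-learn on synthetic journal-entry amounts; all data, column names, and thresholds here are illustrative, not from the actual engagement:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic journal entries: mostly routine postings, a few injected outliers.
entries = pd.DataFrame({
    "entry_id": range(1000),
    "amount": np.concatenate([
        rng.normal(5_000, 800, 995),                   # routine postings
        [95_000, -40_000, 120_000, 88_000, -60_000],   # injected anomalies
    ]),
})

# Fit an Isolation Forest; contamination reflects the expected anomaly rate.
model = IsolationForest(contamination=0.005, random_state=42)
entries["flag"] = model.fit_predict(entries[["amount"]])  # -1 = anomaly

flagged = entries[entries["flag"] == -1]
print(f"Flagged {len(flagged)} of {len(entries)} entries for review")
```

In practice the flagged entries would feed a review queue rather than being auto-rejected, since unsupervised models surface candidates, not confirmed fraud.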
Infosys | Data Analyst | Hyderabad, India | April 2020 – December 2022
●Collaborated with business, compliance, audit, and risk surveillance teams to translate requirements into effective, scalable analytical and reporting solutions.
●Designed and architected scalable ETL pipelines using Azure Data Factory, Azure Databricks, Data Lake Storage, Blob Storage, and Azure SQL Database, enabling automated ingestion, transformation, and validation of multi-source data and improving efficiency, consistency, and reliability in downstream reporting.
●Extracted and integrated structured/semi-structured datasets (CSV, JSON, XML, SQL Server, Snowflake) using PySpark and SQL, applying advanced cleansing, aggregations, and transformation logic to produce analysis-ready, high-quality datasets.
●Built and optimized Snowflake tables, materialized views, stored procedures, and SQL transformations, improving query performance, resource usage, and data processing efficiency for reporting and analytics.
●Performed detailed data validation, reconciliation, and integrity checks across multi-source datasets using SQL, Python, and Excel automation, and implemented data quality rules and anomaly alerts to detect inaccuracies, inconsistencies, and irregular patterns early in the reporting process.
●Built fraud detection and anomaly detection scripts using SQL and Python (Pandas, NumPy, PySpark), reducing oversight gaps and improving operational transparency across compliance and audit functions.
●Documented fraud rules, validation criteria, field mapping, data lineage, and issue-tracking outcomes in Jira and Confluence, ensuring transparency, audit readiness, and repeatability for future reporting cycles.
●Developed automated, interactive Power BI and Tableau dashboards to track KPIs, KRIs, and operational performance, reducing manual reporting time by ~45% and improving real-time visibility for leadership and audit teams.
●Utilized advanced Excel (Power Query, VBA macros, PivotTables, XLOOKUP, INDEX/MATCH, COUNTIFS/SUMIFS) to streamline reconciliation, validation, financial analysis, and operational reporting workflows, improving accuracy, reducing manual effort, and accelerating delivery of critical business insights.
●Maintained version control using Git/GitHub to ensure traceability, repeatability, and clear documentation across reporting and analytics workflows.
●Delivered accurate dashboards and reports with 100% SLA/TAT compliance while presenting actionable insights to stakeholders, supporting timely, data-driven decisions and contributing to improved operational efficiency and performance.
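A hedged sketch of the cross-source reconciliation checks described above: compare two extracts on a shared key, surface rows missing from either side, and flag amount mismatches beyond a tolerance. Table names, column names, and the 0.01 tolerance are illustrative assumptions:

```python
import pandas as pd

# Illustrative extracts from two sources being reconciled.
ledger = pd.DataFrame({
    "txn_id": [101, 102, 103, 104],
    "amount": [250.00, 1200.50, 75.25, 980.00],
})
warehouse = pd.DataFrame({
    "txn_id": [101, 102, 103, 105],
    "amount": [250.00, 1200.50, 80.25, 410.00],
})

# Outer merge with indicator to catch rows missing on either side.
recon = ledger.merge(warehouse, on="txn_id", how="outer",
                     suffixes=("_ledger", "_wh"), indicator=True)

# Rows present in only one source.
missing = recon[recon["_merge"] != "both"]

# Rows present in both sources whose amounts disagree beyond tolerance.
mismatched = recon[
    (recon["_merge"] == "both")
    & ((recon["amount_ledger"] - recon["amount_wh"]).abs() > 0.01)
]
print(f"{len(missing)} missing rows, {len(mismatched)} amount mismatches")
```

The same pattern scales to the multi-source SQL/Python validation workflows above by parameterizing the key columns and tolerance per dataset.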
EDUCATION
Master of Computer Science
Lamar University, USA