
Data Analyst Senior

Location:
Visakhapatnam, Andhra Pradesh, India
Posted:
October 15, 2025

Resume:

Name: REDDY KRISHNA REDDY YEDDULA

+1-971-***-**** ***************@*****.***

Senior Data Analyst

Professional Summary:

Senior Data Analyst with 10+ years of experience designing, developing, and optimizing enterprise-scale data solutions across the healthcare, finance, and retail domains.

Expertise in data modeling, ETL, data warehousing, and BI solutions, leveraging industry best practices and modern cloud platforms for high-performance analytics.

Skilled in AWS (EC2, S3, Redshift, Lambda, Glue) and Azure (Data Factory, Synapse, Databricks), delivering scalable, secure, and cost-efficient cloud-based data solutions.

Strong proficiency in Angular 16, React 18, Python 3.11, SQL Server 2022, Snowflake 7.x, and Tableau 2023 for building interactive dashboards and data pipelines.

Experienced in Agile-Scrum environments, collaborating closely with product owners, business analysts, and developers for on-time, high-quality deliverables.

Expertise in ETL tools such as Informatica 10.x, SSIS 2022, and Azure Data Factory for large-scale data ingestion and transformation.

Skilled in creating data visualizations using Power BI, Tableau, and DAX expressions to enable data-driven decision-making.

Strong background in statistical analysis and predictive modeling using Python (NumPy, Pandas, Scikit-learn) and R.

Proficient in data migration projects, including cloud-to-cloud and on-premises-to-cloud transitions, ensuring minimal downtime.

Hands-on experience with big data tools such as Spark 3.x, Hive 4.x, and Kafka 3.x for processing and streaming large datasets.

Solid understanding of dimensional modeling, star schema, snowflake schema, and Kimball & Inmon methodologies.

Skilled in data governance, data quality management, and lineage tracking, ensuring compliance with regulatory standards.

Experienced in building cloud data solutions with Snowflake, reporting data lakes, AWS, and Looker.

Strong experience in data analysis, data migration, data cleansing, transformation, integration, import, and export using ETL tools such as Ab Initio and Informatica PowerCenter. Experienced in testing and writing SQL and PL/SQL statements, including stored procedures, functions, triggers, and packages.

Worked with the AWS Cloud platform and its features, including EC2, VPC, RDS, EBS, S3, CloudWatch, CloudTrail, CloudFormation, and Auto Scaling.

Involved in the MDM process, including data modeling and ETL, and prepared data mapping documents based on graph requirements.

Experience in loading large datasets into HDFS and processing them using Hive and Pig.

Experience writing Hive queries to analyze data in the Hive warehouse using Hive Query Language (HQL).

Designed and implemented effective Analytics solutions and models with Snowflake.

Prepared reports in Excel using pivot tables, VLOOKUPs, and macros.

Technical Skills:

Languages: SQL (2022), PL/SQL (19c), Python 3.11, R 4.3, JavaScript (ES2023), HTML5, CSS3, DAX.

Cloud: AWS (EC2, S3, Redshift, Glue, Lambda, CloudFormation), Azure (ADF, Synapse, Databricks), Snowflake 7.x.

Frameworks: Angular (16/12/10/8), React 18, Spark 3.5, Hive (4.x–2.x).

Databases: SQL Server (2012–2022), Oracle (10g–19c), Teradata 17, PostgreSQL 15.

ETL & BI: Informatica (8.x–10.x), SSIS 2022, Power BI 2023, Tableau 2023.

Big Data: Databricks 12.x, AWS EMR, Kafka 3.6.

Tools: Git, Jenkins, ERwin 12.x, JIRA.

Project Experience:

Client: Homesite Insurance, Phoenix, AZ Apr 2024 – Present

Senior Data Analyst

Responsibilities:

Led the development of enterprise data pipelines in AWS using Angular 16, AWS Glue, and Redshift, enabling real-time reporting for over 200 business metrics.

Designed and implemented Snowflake data warehouse models, optimizing query performance and reducing execution time by 40%.

Built Angular-based data visualization dashboards, integrating AWS APIs for live data retrieval and user interactivity.

Developed ETL workflows in AWS Glue for ingesting and transforming data from multiple on-premises and SaaS systems.

Managed AWS Redshift clusters including provisioning, scaling, and query tuning to support analytics workloads.

Created Python automation scripts for scheduled data extraction from APIs and S3, reducing manual processing time by 60%.
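
A minimal sketch of how such an extraction script might look, using boto3 and requests; the bucket name, key, and API endpoint are hypothetical placeholders:

    import boto3
    import requests

    BUCKET = "analytics-raw-zone"                    # hypothetical bucket
    API_URL = "https://api.example.com/v1/metrics"   # hypothetical endpoint

    def extract_to_s3():
        s3 = boto3.client("s3")
        # Pull the latest payload from the source API
        response = requests.get(API_URL, timeout=30)
        response.raise_for_status()
        # Land the raw JSON in S3 for downstream processing
        s3.put_object(Bucket=BUCKET, Key="api/metrics/latest.json",
                      Body=response.content)

    if __name__ == "__main__":
        extract_to_s3()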

Implemented MDM solutions to standardize and deduplicate customer, product, and vendor data across systems.

Designed AWS Lambda functions for data processing triggers, ensuring near real-time updates to dashboards.
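
A skeletal Lambda handler of the kind described, assuming an S3 put-event trigger; the bucket names and the filter step are illustrative only:

    import json
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # Each record corresponds to one object written to the source bucket
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            rows = json.loads(body)
            # Hypothetical transform before the dashboard store is refreshed
            processed = [r for r in rows if r.get("status") == "active"]
            s3.put_object(Bucket="analytics-processed-zone",   # placeholder
                          Key="processed/" + key,
                          Body=json.dumps(processed).encode("utf-8"))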

Collaborated with data governance teams to establish lineage tracking and audit compliance across AWS datasets.

Integrated Angular UI with AWS Cognito for secure role-based access to reporting tools.

Performed exploratory data analysis in Python 3.11 and published insights in Power BI for executive review.

Used Hive on EMR for processing raw event logs, enriching them for downstream analytics.

Applied AWS CloudFormation templates to automate infrastructure deployments for analytics environments.

Implemented Python modules to send automated emails to clients at regular intervals.
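
A minimal sketch of such an email module using only the standard library; the SMTP host and addresses are placeholders, and scheduling would be handled externally (for example, by cron):

    import smtplib
    from email.message import EmailMessage

    def send_report(body):
        msg = EmailMessage()
        msg["Subject"] = "Daily data report"
        msg["From"] = "reports@example.com"   # placeholder sender
        msg["To"] = "client@example.com"      # placeholder recipient
        msg.set_content(body)
        # Placeholder SMTP relay; credentials omitted
        with smtplib.SMTP("smtp.example.com", 587) as server:
            server.starttls()
            server.send_message(msg)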

Developed scripts to create connections to different data sources and migrate data to Snowflake.

Loaded data files into the AWS environment and performed analysis using SQL on AWS Redshift and Snowflake.

Managed Amazon Redshift clusters, including launching clusters with a specified number of nodes and running data analysis queries.

Gathered requirements for MDM tool implementation and user workflow processes.

Worked with data governance, data quality, data lineage, and data architecture teams to design various models and processes.

Developed complex metrics required for daily business reports in Spark SQL and Snowflake SQL.
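
As one illustration, a daily-metric aggregation of this kind in Spark SQL might look like the following; the table and column names are assumptions:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("daily_metrics").getOrCreate()

    # claims_events is a hypothetical source table in the metastore
    daily_metrics = spark.sql("""
        SELECT event_date,
               COUNT(DISTINCT customer_id) AS active_customers,
               SUM(amount)                 AS total_amount
        FROM claims_events
        GROUP BY event_date
    """)
    daily_metrics.write.mode("overwrite").saveAsTable("reporting.daily_metrics")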

Created repositories in GitHub and developed wrapper scripts, maintained in GitHub, to automate download processes.

Environment: AWS EC2, S3, Redshift 1.0, Lambda, Glue, Angular 16, Snowflake 7.x, Python 3.11, Hive 4.x, Power BI 2023, Tableau 2023, JIRA, GitHub.

Client: Centene Healthcare, St. Louis, Missouri Dec 2021 – Mar 2024

Senior Data Analyst

Responsibilities:

Led Azure Data Factory pipelines for ingesting structured and semi-structured datasets from various sources.

Built React 18-based dashboards for operational monitoring, integrating with Azure APIs.

Migrated on-premises ETL workloads to Azure Synapse and Databricks, improving scalability and reducing costs.

Developed Snowflake SQL scripts for analytics, creating aggregated datasets for executive reporting.

Implemented data quality validation in ADF pipelines to ensure accuracy and compliance.

Created REST APIs to integrate Azure-based datasets with third-party applications.

Designed Power BI dashboards using DAX expressions for detailed KPI monitoring.

Led MDM initiatives to clean, merge, and standardize customer data across multiple regions.

Developed Python scripts to transform raw IoT data into time-series analytics.
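
A compact pandas sketch of the raw-to-time-series step described here; the file name and columns are assumptions:

    import pandas as pd

    # sensor_readings.json is a hypothetical raw IoT extract
    raw = pd.read_json("sensor_readings.json")
    raw["timestamp"] = pd.to_datetime(raw["timestamp"])

    # Resample to hourly means per device for an analysis-ready series
    hourly = (raw.set_index("timestamp")
                 .groupby("device_id")["temperature"]
                 .resample("1h")
                 .mean()
                 .reset_index())
    hourly.to_parquet("hourly_temperature.parquet")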

Implemented Azure Blob Storage lifecycle policies to optimize cost and storage performance.

Coordinated UAT sessions with stakeholders to validate ETL and reporting accuracy.

Designed automated CI/CD pipelines for ADF deployments using Azure DevOps.

Integrated Azure Event Hub with Databricks for real-time streaming analytics.

Created a Snowflake external table holding the entire dataset, including history, and built a view on top of it that exposes only the current state of the data by filtering out historical rows.
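
A sketch of that pattern issued through the Snowflake Python connector; the stage, table, and key columns are placeholders:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="...",  # placeholders
        warehouse="ANALYTICS_WH", database="RAW", schema="PUBLIC")
    cur = conn.cursor()

    # External table over a stage holds all rows, including history
    cur.execute("""
        CREATE OR REPLACE EXTERNAL TABLE customer_ext
        WITH LOCATION = @raw_stage/customer/
        FILE_FORMAT = (TYPE = PARQUET)
    """)

    # The view keeps only the latest row per key, hiding history
    cur.execute("""
        CREATE OR REPLACE VIEW customer_current AS
        SELECT * FROM customer_ext
        QUALIFY ROW_NUMBER() OVER (
            PARTITION BY value:customer_id ORDER BY value:updated_at DESC) = 1
    """)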

Used Power Query in Power BI to pivot and unpivot the data model for data cleansing and shaping.

Created automated solutions using Databricks, Spark, Python, Snowflake, and HTML.

Worked extensively with the Tableau business intelligence tool to develop various dashboards.

Wrote Python scripts to parse files and load the data into the database, extract weekly information from the files, and clean the raw data.
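
In outline, such a parse-and-load script could look like this; the file layout, table name, and connection string are assumptions:

    import pandas as pd
    from sqlalchemy import create_engine

    # Placeholder connection string for the target reporting database
    engine = create_engine("postgresql://user:pass@host:5432/reporting")

    # Parse the weekly extract, normalize headers, drop incomplete rows
    weekly = pd.read_csv("weekly_extract.csv")
    weekly.columns = [c.strip().lower() for c in weekly.columns]
    weekly = weekly.dropna(subset=["member_id"])  # hypothetical required key

    # Append the cleaned rows to a staging table
    weekly.to_sql("stg_weekly_extract", engine, if_exists="append", index=False)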

Raised risks and issues facing the project and worked toward resolving them.

Environment: Azure Data Factory, Synapse Analytics, Databricks 12.x, React 18, Power BI 2022, Snowflake 6.x, Python 3.10, Azure Blob Storage, Azure Event Hub, SQL Server 2019, JIRA, Git.

Client: Charter Communications, Maryland Heights, MO Jun 2020 – Nov 2021

Data Analyst

Responsibilities:

Designed and developed AWS-based ETL pipelines to process structured and semi-structured datasets, enabling faster analytics and reporting.

Built Angular 12 dashboards for data visualization, integrating APIs from AWS Redshift for live reporting.

Migrated large datasets from on-premises systems to AWS S3 and Redshift, ensuring data integrity during transfer.

Developed Python 3.8 scripts for automated data cleansing, standardization, and validation.

Created Snowflake SQL procedures for business KPIs, optimizing for query efficiency and reduced execution time.

Implemented MDM processes to unify customer and product data across multiple departments.

Managed AWS IAM policies to control secure access to analytics environments.

Conducted UAT and stakeholder reviews, ensuring requirements were met before production release.

Created metadata documentation and data dictionaries for all datasets in Redshift and Snowflake.

Collaborated with cross-functional teams to define data governance policies and lineage tracking.

Developed Angular components for reusable, interactive charts and tables.

Used Hive 3.x for analyzing log data stored in AWS EMR.

Analyzed data from heterogeneous sources, including flat files, ASCII data, EBCDIC data, and relational databases (Oracle, DB2 UDB, MS SQL Server).

Involved in testing XML files and verifying that the data was parsed and loaded into staging tables.

Executed SAS jobs in batch mode through UNIX shell scripts.

Created remote SAS sessions to run jobs in parallel, reducing extraction time since the datasets were generated simultaneously.

Reviewed and modified SAS programs to create customized ad-hoc reports and processed data for publishing business reports.

Tested the ETL process both before and after data validation; tested the messages published by the ETL tool and the data loaded into the various databases.

Created UNIX scripts for file transfer and file manipulation.

Environment: AWS S3, Redshift 1.0, EMR, Hive 3.x, Angular 12, Snowflake 5.x, Python 3.8, Power BI 2021, Tableau 2021, SQL Server 2017, JIRA, Git.

Client: USPS, Columbus, OH Jan 2018 – May 2020

Data Analyst

Responsibilities:

Designed data integration workflows in AWS for ingesting files from external vendors into Redshift.

Developed Angular 10 dashboards for internal reporting teams, displaying metrics from AWS datasets.

Wrote Python 3.6 ETL scripts to process and clean large volumes of CSV and JSON data.
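
A minimal example of the kind of cleansing pass described, with hypothetical file and column names:

    import pandas as pd

    # Combine a CSV feed and a JSON feed into one standardized frame
    csv_part = pd.read_csv("usage_feed.csv")
    json_part = pd.read_json("usage_feed.json")
    combined = pd.concat([csv_part, json_part], ignore_index=True)

    # Standardize types and remove duplicates before loading
    combined["account_id"] = combined["account_id"].astype(str).str.strip()
    combined = combined.drop_duplicates(subset=["account_id", "usage_date"])
    combined.to_csv("usage_clean.csv", index=False)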

Built Informatica mappings to move data between AWS S3, Redshift, and on-premises databases.

Optimized SQL Server stored procedures for faster execution of analytical queries.

Created Snowflake staging tables to handle incremental loads from AWS S3.

Collaborated with business stakeholders to define KPI requirements and reporting formats.

Used AWS CloudFormation templates for provisioning analytics infrastructure.

Designed and documented star and snowflake schemas for reporting needs.

Built Angular services to connect dashboards to AWS APIs for near real-time updates.

Configured AWS CloudWatch alerts for monitoring ETL job performance.

Worked with data governance teams to establish compliance protocols for AWS-based analytics.

Performed exploratory data analysis using Pandas and Matplotlib to identify trends.
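
A short EDA sketch in the spirit of this bullet; the dataset and columns are assumptions:

    import pandas as pd
    import matplotlib.pyplot as plt

    volume = pd.read_csv("mail_volume.csv", parse_dates=["ship_date"])

    # Monthly trend of shipped volume: a typical first look at the data
    monthly = volume.resample("MS", on="ship_date")["pieces"].sum()
    monthly.plot(title="Monthly shipped volume")
    plt.ylabel("pieces")
    plt.savefig("monthly_volume.png")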

Designed and developed data cleansing, data validation, and ETL load processes using Oracle SQL, PL/SQL, and UNIX.

Created reports using Oracle Discoverer, in which any report can further pivot, sort, stoplight, or graph the data presented.

Analyzed source system data and mapped it to the target system.

Created Informatica mappings, mapplets, workflows, and tasks, including scheduling and monitoring.

Developed PL/SQL reports and forms using Report Builder; managed end users in the data analytics and reporting views.

Completed initial-level Informatica training and gained hands-on experience.

Environment: AWS S3, Redshift 0.9, CloudFormation, CloudWatch, Angular 10, Snowflake 4.x, Python 3.6, Informatica 9.x, SQL Server 2014, JIRA.

Client: Metamor Software Solutions, Hyderabad, India July 2014 – Aug 2017

ETL Developer

Responsibilities:

Developed Informatica mappings to integrate flat files and relational databases into AWS Redshift.

Created Angular 8 admin dashboards to monitor ETL job execution statuses.

Designed PL/SQL stored procedures for staging, transformation, and loading of transactional data.

Managed AWS S3 buckets for storing raw and processed data.

Built unit and integration tests for ETL workflows to ensure data consistency.

Created Hive queries for summary-level analytics on AWS EMR clusters.

Documented ETL workflows and maintained version control in Git repositories.

Performed performance tuning of Informatica sessions and workflows.

Assisted in MDM integration to eliminate duplicates and ensure consistent reference data.

Developed Angular components to provide interactive error logs for ETL monitoring.

Conducted code reviews for ETL mappings and Python scripts.

Provided production support for AWS-based analytics pipelines.

Designed automation scripts in Python 3.5 for file transfer between on-premises and AWS S3.
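
A bare-bones transfer script of the kind mentioned, kept Python 3.5-compatible; the directory, file pattern, and bucket are placeholders:

    import boto3
    from pathlib import Path

    BUCKET = "etl-landing-zone"   # hypothetical bucket

    def sync_outbound(local_dir="/data/outbound"):
        s3 = boto3.client("s3")
        for path in Path(local_dir).glob("*.dat"):
            # Mirror each outbound file under an S3 prefix
            s3.upload_file(str(path), BUCKET, "incoming/" + path.name)

    if __name__ == "__main__":
        sync_outbound()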

Coordinated with QA teams for UAT testing and validation of ETL outputs.

Developed and tested stored procedures, functions, and packages in PL/SQL for data ETL.

Created complex Cognos reports using calculated data items and multiple lists in a single report.

Prepared functional and technical documentation of the reports created for future reference.

Worked with data modelers to prepare logical and physical data models, adding and deleting necessary fields using ERwin.

Environment: AWS S3, Redshift 0.8, EMR, Hive 2.x, Angular 8, Informatica 8.x, Python 3.5, PL/SQL, Git, SQL Server 2012.


