
Data Analyst Warehousing

Location: Irving, TX
Posted: October 09, 2024


VIVEK KUMAR REDDY TADIGOTLA

Mobile: +1-571-***-**** | Email: *****.*********@*****.***

SUMMARY OF SKILLS:

• Data Analyst with 7+ years of experience and expertise in data analysis, data engineering, business analysis, ETL/ELT, multiple programming languages and BI tools, big data, automation, data warehousing, MDM, and multiple databases, handling diverse projects across domains and platforms.

• Hands-on experience with BI tools such as Microsoft Power BI, Qlik Sense, Amplitude, Amazon QuickSight, Tableau, and Excel for data visualizations that drove company-wide understanding of complex datasets.

• Proficient in SQL, dbt, NoSQL, Google BigQuery, Azure SQL, MongoDB, Hadoop Hive, Teradata, PostgreSQL, and Redshift.

• Hands-on experience with SAS and SQL Server services such as Integration Services (SSIS), Analysis Services (SSAS), and Reporting Services (SSRS).

• Experienced in handling data warehousing ETL projects, with a strong understanding of database and data warehousing concepts.

• Experienced with Amazon Web Services (AWS), Azure DevOps, AWS Athena, Apache Kafka, PyCharm, the Azure cloud environment, and GCP for near-real-time applications.

• Experienced in developing and maintaining data pipelines and designing end-to-end business intelligence solutions using Azure Data Factory and Azure Databricks.

• Worked extensively in the Azure Databricks cloud using ADF, ADLS Gen2, SQL, and Blob Storage.

• Strong exposure to projects in the financial services and technology, banking, insurance, retail, telecom, and health domains.

• Hands-on experience in SaaS business environments.

• Hands-on experience with A/B and multivariate testing.

• Experienced in gathering requirements and creating Functional Requirements Documents (FRD) and Business Requirements Documents (BRD).

• Strong understanding of SDLC methodologies such as Agile and Waterfall.

• Known for excellent analytical and problem-solving skills.

• Expertise in customer support and teamwork.

TECHNICAL SKILLS:

CLOUD TECHNOLOGIES: Azure, AWS, GCP

DATABASES: SQL, Oracle, Hadoop Hive, Amazon Redshift, Power Query, Azure SQL, DynamoDB, MongoDB, Teradata, and PL/SQL

PROGRAMMING LANGUAGES: Python, Java, and R (PyCharm as primary IDE)

OPERATING SYSTEMS: Unix/Linux, Windows

TOOLS: Amazon ECS, Salesforce, AWS Athena, SAP HANA, EWFM, SAP LOGON, Tableau, JIRA, Amazon QuickSight, Power BI, MS Excel, MS Word, MS PowerPoint, UFT, PuTTY, Informatica PowerCenter, Informatica PIM360, HP ALM, VSTS, and QA Complete

PLATFORMS: Apache Kafka, Apache Spark, Apache Superset, Azure Databricks, Windows, Linux, Unix

ETL & VISUALIZATION: Informatica, Cognos, Microsoft Power BI, Tableau, Amazon QuickSight

EXPERTISE IN: DAX formulas; ETL and data pipeline design and development; Adobe Analytics integration; data visualization; data warehousing; data cleaning and preparation; data modeling; data structures and algorithms

KNOWLEDGE ON: .NET, MDX, HTML, Amazon Connect, Django, C, C++, Flask

EDUCATION:

Master’s in Data Analytics Engineering, George Mason University (Jan 2021 – Jul 2022), GPA 3.78/4

Bachelor of Engineering in Electronics and Communication Engineering, Sathyabama University (Aug 2011 – Apr 2015), CGPA 7.3/10

Board of Intermediate Education, Narayana Junior College (2011), 89.7%

Secondary School Education, Nagarjuna English Medium High School (2009), 88.8%

PROFESSIONAL WORK EXPERIENCE:

Senior Data Analyst, Amazon August 2022 – Present

• Working as a Senior Data Analyst at Amazon since August 2022.

• Part of the PubTech Supply-Side Technology team, responsible for building services that enable targeted ad experiences and provide a Publisher Ad Server interface supporting Entertainment and Devices partner teams across Amazon.

• The team owns the ad intelligence, supply-demand matching, and publisher tools domains.

• Building the infrastructure required for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies.

• Building analytical tools to utilize the data pipeline, providing actionable insight into key business performance metrics including operational efficiency and customer acquisition.

• Applied advanced SQL to extract and analyze data from a Redshift data warehouse, creating and optimizing complex queries and performing data segmentation and aggregation from scratch to derive insights on ad performance and audience engagement.
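
A minimal sketch of this kind of segmentation and aggregation query, run from Python with psycopg2 (Redshift is wire-compatible with PostgreSQL); the ad_events schema, column names, and connection string are hypothetical, not the actual production assets:

```python
# Sketch: segment ad events by campaign and device, then aggregate
# engagement metrics on Redshift. Schema and DSN are placeholders.
import psycopg2

QUERY = """
    SELECT campaign_id,
           device_type,
           COUNT(*)                                 AS impressions,
           SUM(CASE WHEN clicked THEN 1 ELSE 0 END) AS clicks,
           SUM(CASE WHEN clicked THEN 1 ELSE 0 END)::float
               / NULLIF(COUNT(*), 0)                AS ctr
    FROM ad_events
    WHERE event_date BETWEEN %s AND %s
    GROUP BY campaign_id, device_type
    ORDER BY ctr DESC;
"""

def ad_performance(dsn: str, start: str, end: str):
    """Return (campaign_id, device_type, impressions, clicks, ctr) rows."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(QUERY, (start, end))
        return cur.fetchall()

if __name__ == "__main__":
    rows = ad_performance("host=my-cluster dbname=ads user=analyst",
                          "2023-01-01", "2023-01-31")
    for row in rows:
        print(row)
```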

• Leveraged Amazon Redshift for data warehousing and complex queries, and Amazon Athena for interactive querying of data stored in Amazon S3. Applied these tools to analyze advertising performance, optimize data processing workflows, and generate actionable insights for campaign strategies.

• Built and published 20+ customized interactive Tableau reports and dashboards, with scheduled data refreshes, using Tableau Desktop and Tableau Server.

• Developed and implemented data models in Python, applying libraries and frameworks to build predictive models, analyze advertising data, and derive insights to optimize campaign strategies and performance.
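
For illustration, a self-contained scikit-learn sketch of this kind of predictive model, trained on synthetic ad data; the feature names, coefficients, and click target are assumptions, not the production model:

```python
# Sketch: logistic-regression CTR model on synthetic advertising data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000
X = np.column_stack([
    rng.uniform(0, 1, n),    # bid_price (normalized) - illustrative feature
    rng.integers(0, 24, n),  # hour_of_day
    rng.integers(0, 2, n),   # is_mobile
])
# Synthetic target: clicks more likely on mobile with higher bids.
logits = 2.0 * X[:, 0] + 0.8 * X[:, 2] - 2.5
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logits))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```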

• Developed ETL pipelines using PostgreSQL and Python to extract, transform, and load data from various sources, ensuring data integrity and consistency.
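
A minimal sketch of such a pipeline, assuming a hypothetical orders.csv source, an orders_clean target table, and a placeholder connection string:

```python
# Sketch: CSV -> pandas transform -> PostgreSQL load.
import pandas as pd
import psycopg2
from psycopg2.extras import execute_values

def run_etl(csv_path: str, dsn: str) -> None:
    # Extract
    df = pd.read_csv(csv_path, parse_dates=["order_date"])
    # Transform: enforce integrity and consistency before loading
    df = df.drop_duplicates(subset="order_id")
    df["region"] = df["region"].str.strip().str.upper()
    df = df[df["amount"] >= 0]  # reject negative amounts
    # Load
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        execute_values(
            cur,
            "INSERT INTO orders_clean (order_id, order_date, region, amount) VALUES %s",
            list(df[["order_id", "order_date", "region", "amount"]]
                 .itertuples(index=False, name=None)),
        )
```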

• Assisted in the development and implementation of ETL processes.

• Participated in every phase of the software development cycle, including planning, design, and execution.

• Deliver code implementations that are consistently high quality, well tested, and production-ready, and ensure that the root causes of operational issues are identified and resolved.

• Created multiple COEs (Correction of Error documents), a process for improving quality by documenting and addressing issues.

• Participate in daily Scrum calls, sprint planning, on-call cycles, design reviews, and code reviews.

Data Analyst, Cognizant Limited February 2019 – December 2020

• Worked as a Data Analyst at Cognizant.

• Analyzed Agile requirements and source systems and validated the data.

• Built the infrastructure required for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies.

• Worked with Delta Lake, on top of Azure Data Lake Storage (ADLS Gen2), to read and write large datasets.

• Used Azure Databricks to run Spark applications and Azure Data Factory to perform ETL as part of pre-processing.

• Built frameworks and modules to process large datasets using Spark SQL and DataFrames, loaded them into the use-case extraction layer, developed complex queries including aggregation and grouping logic, and generated the final datasets consumed by data scientists.
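
A compact PySpark sketch of that pattern: read raw data into a DataFrame, aggregate with grouping logic, register a view for ad-hoc Spark SQL, and persist the extraction layer. The input path, columns, and output location are hypothetical:

```python
# Sketch: Spark aggregation feeding a use-case extraction layer.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("usecase-extraction").getOrCreate()

raw = spark.read.parquet("/mnt/adls/raw/transactions")  # hypothetical ADLS mount

summary = (
    raw.groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
       .agg(
           F.count("*").alias("txn_count"),
           F.sum("amount").alias("total_amount"),
       )
)

# Expose for downstream Spark SQL, then persist the curated dataset.
summary.createOrReplaceTempView("daily_customer_summary")
summary.write.mode("overwrite").parquet("/mnt/adls/curated/daily_customer_summary")
```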

• Designed and managed data architecture and Snowflake data warehouses, implemented ETL pipelines, and optimized performance. Experienced with Snowflake features such as automatic scaling, data sharing, and semi-structured data handling.

• Employed NoSQL databases to manage and analyze unstructured data for a project. This involved leveraging NoSQL technologies to efficiently handle large volumes of data, facilitating flexible data modeling, and performing in-depth analysis to support project objectives.

• Leveraged Power BI, MS Access, MS Excel, MS PowerPoint to create dynamic dashboards and reports for a ULTA retail project, providing real-time insights into sales performance and trends. Developed visualizations that identified key metrics, such as sales growth and regional performance, leading to a 20% increase in sales team efficiency by streamlining reporting processes.

• Executed data modeling for Venerable Insurance project using PyCharm as the development environment, employing Python libraries such as pandas, scikit-learn, and NumPy to construct and analyze predictive models and derive actionable insights from complex datasets.

• Created custom KPIs and data visualizations using DAX, which improved decision-making and provided actionable insights for the sales team.

• Developed complex DAX formulas to create calculated columns and measures for dynamic data analysis in Power BI.
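
To keep all sketches here in one language, the following is a pandas analogue (not the original DAX) of the two constructs named above: a calculated column evaluated per row, versus a measure evaluated over an aggregation context. The sales table and columns are illustrative:

```python
# Sketch: pandas analogue of a DAX calculated column vs. a measure.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "revenue": [120.0, 80.0, 200.0, 50.0],
    "cost":    [90.0, 60.0, 120.0, 55.0],
})

# Calculated column: computed row by row, stored with the table
# (in DAX, roughly: Margin = [revenue] - [cost]).
sales["margin"] = sales["revenue"] - sales["cost"]

# Measure: computed over the current aggregation context at query time
# (in DAX, roughly: Margin % = DIVIDE(SUM(revenue) - SUM(cost), SUM(revenue))).
by_region = sales.groupby("region").agg(revenue=("revenue", "sum"),
                                        cost=("cost", "sum"))
by_region["margin_pct"] = (by_region["revenue"] - by_region["cost"]) / by_region["revenue"]
print(by_region)
```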

• Designed and built ETL pipelines to automate ingestion of structured and unstructured data.

• Created SSIS packages to load files of different formats (Excel, flat files) and databases (SQL Server) into the SQL Server data warehouse using SQL Server Integration Services (SSIS).

• Extracted large volumes of data from various data sources and loaded them into target data stores, applying a range of transformations using SQL Server Integration Services (SSIS).

• Developed Spark code using PySpark for faster testing and data processing.

• Performed defect detection and root-cause analysis for defects.

• Developed an end-to-end understanding of data flows and framed queries for the complex logic involved.

• Created an Excel Comparator automation tool using macros for data validation.
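
The original tool used Excel macros (VBA); a pandas analogue of the same comparison idea is sketched below, with file names and the key column assumed. Both extracts are assumed to share the same columns and keys:

```python
# Sketch: diff two Excel extracts cell-by-cell on a shared key column.
import pandas as pd

def compare_workbooks(expected_path: str, actual_path: str,
                      key: str = "record_id") -> pd.DataFrame:
    expected = pd.read_excel(expected_path).set_index(key).sort_index()
    actual = pd.read_excel(actual_path).set_index(key).sort_index()
    # DataFrame.compare returns only the cells that differ.
    return expected.compare(actual)

# Usage (hypothetical files):
# diffs = compare_workbooks("expected.xlsx", "actual.xlsx")
# print(diffs if not diffs.empty else "Workbooks match.")
```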

• Worked on various tools like SQL, QA Complete, JIRA, Excel, Selenium and Macros.

• Gained good exposure to banking, health, and insurance domain projects.

Junior Data Analyst, Infosys Limited May 2015 – October 2018

• Worked as a Junior Data Analyst at Infosys.

• Worked closely with clients and other specializations to extract, transform, and load data from multiple sources, performing exploratory data analysis, exception identification, data architecture, data modeling, and feature engineering using a combination of manual processes, SQL queries, automated scripts, algorithms, and macros as necessary.

• Assisted in the development and implementation of ETL processes.

• Experienced in navigating and analyzing interactions within business systems such as Salesforce; for example, led a project integrating Salesforce data to optimize customer support workflows, resulting in a 15% improvement in response times, and leveraged EWFM to enhance workforce planning accuracy.

• Applied SAS to perform advanced data analysis for the LEVIS client, including data manipulation, statistical modeling, and predictive analytics. Developed comprehensive reports and visualizations that informed key business decisions and enhanced data-driven strategies.

• Utilized IBM Cognos for data analysis and reporting, creating detailed dashboards and reports to provide actionable insights and support strategic decision-making. Designed and implemented data models, performed complex queries, and visualized key metrics to enhance business performance and operational efficiency.

• Leveraged tools such as SQL for querying databases, Python for developing predictive models, and Power BI for creating interactive dashboards in a financial service project. Analyzed transaction data to assess credit risk, provided insights for strategic decision-making, and improved risk management processes.

• Conducted data cleaning and preparation tasks.

• Collaborated with data engineers to develop data pipelines to improve data quality and accessibility.

• Ensured that fast loads and sources were populated as per business requirements.

• Performed data validation across Redshift, Hive, SQL, and Teradata databases.
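
A generic sketch of this kind of cross-database validation, comparing row counts and a simple aggregate through standard DB-API cursors; the connections, table name, and amount column are hypothetical, and a real reconciliation would compare more than two figures:

```python
# Sketch: reconcile a table across two warehouses (e.g. Redshift vs. Teradata)
# by comparing row counts and a column sum. Any DB-API connection works.
def validate_table(conn_a, conn_b, table: str, amount_col: str = "amount") -> bool:
    query = f"SELECT COUNT(*), SUM({amount_col}) FROM {table}"
    results = []
    for conn in (conn_a, conn_b):
        cur = conn.cursor()
        cur.execute(query)
        results.append(cur.fetchone())
    (count_a, sum_a), (count_b, sum_b) = results
    return count_a == count_b and sum_a == sum_b

# Usage (hypothetical connections):
# ok = validate_table(redshift_conn, teradata_conn, "daily_sales")
# print("match" if ok else "mismatch")
```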

• Gained knowledge of Informatica, Amazon Web Services (AWS), QlikView, and many other tools.

• Posted and tracked bugs using JIRA, HP ALM, and QA Complete; analyzed issues and defects.

• Gained good exposure to Retail and Telecom domain projects.

• Skilled in MDM, big data, and Hadoop.

• Automated IDTW (an Infosys internal tool) for large-scale data validation using Java.


