Akanksha Reddy
Senior Data Analyst
Phone: +1-312-***-****
Email: ****************@*****.***
PROFESSIONAL SUMMARY:
Over 9 years of experience delivering comprehensive data solutions, currently serving as Senior Data Analyst. Excellent understanding of business operations and analytics tools for effective analysis of data.
Expertise in various software development life cycles, including Waterfall and Agile.
Experience working on different databases and data warehouses such as Teradata, Oracle, AWS Redshift, and Snowflake.
Proven expertise with databases such as Teradata, Oracle, MySQL, and MS SQL Server, as well as reporting and data visualization skills with Tableau, advanced MS Excel, and Google Sheets.
Working knowledge of big data tools such as PySpark, Spark SQL, Hadoop, Spark, Hive, and Pig.
Experience with the Git version control system; built analytic modules and data pipelines using Kafka.
Good experience in AWS using EC2, volume and snapshot management, DynamoDB, S3, RDS, VPC, Route 53, Elastic Beanstalk, and IAM services.
Developed SQL queries, stored procedures, views, indexes, and constraints, and tuned queries using the query optimizer.
Created numerous Teradata BTEQ, MLOAD, FLOAD, and PL/SQL scripts.
Handled all domain and technical interaction with application users, analyzed client business processes, and documented business requirements.
Extensive experience working with Business Intelligence data visualization tools, with specialization in Tableau Desktop, Server, Mobile, and Public, MicroStrategy, Business Objects, and SSRS.
Experience in designing/building the Universes and assigning user privileges.
Experience in performance tuning and optimization of SQL queries; used SQL scripts against relational databases.
Developed report mock-up samples ahead of user interaction to analyze business requirements.
Extracted data from existing data sources and developed and executed business reports for performance monitoring.
Programming experience in statistical languages such as Python and R, and in plotting tools such as Gplot.
Expertise in all aspects of analytics/data warehousing solutions (Database issues, Data modeling, Data mapping, ETL Development).
Experience using Power BI Power Pivot to develop data analysis prototypes, and used Power View and Power Map to visualize reports.
Extensive experience working with GCP services such as Cloud Dataflow, BigQuery, Cloud Storage, and Pub/Sub.
Created reports using Business Objects functionalities like Queries, Graphs, Charts, Slice and Dice, Drill Down, Functions, Cross Tab, Master Detail and Formulas, etc.
Knowledge of Cassandra and familiarity with NoSQL databases like MongoDB (Robo Mongo).
Experience working with Python libraries including NumPy, SciPy, and Pandas, as well as Spark, for loading and extracting data.
Extensive experience in implementing Microsoft BI/Azure BI solutions like Azure Data Factory, Power BI, Azure Databricks, Azure Analysis Services, SQL Server Integration Services, SQL Server Reporting Services and Tableau.
Worked with providers and Medicare and/or Medicaid entities to validate EDI transaction sets and internet portals.
Knowledge of developing ETL packages with SSIS to verify, extract, transform, and load data into data warehouse databases, as well as handling SSAS cubes to store data in OLAP databases.
Experience in report writing using SQL Server Reporting Services (SSRS) and creating various types of reports like drill down, Parameterized, Cascading, Conditional, Table, Matrix, Chart and Sub Reports.
Provided support on production issues, including troubleshooting and coordination with IT and end users.
TECHNICAL SKILLS:
Programming & Statistical Analysis: R, Python, SAS E-Miner, SAS Programming, MATLAB, Minitab
Databases: Snowflake, SQL Server, MS-Access, Oracle, Teradata, Cassandra, Neo4j, MongoDB
DWH / BI Tools: Microsoft Power BI, Tableau, SSIS, SSRS, SSAS, Business Intelligence Development Studio (BIDS), Visual Studio, Crystal Reports, Informatica
Data Visualization Tools: Tableau, Power BI, Python Visualizations, Excel Dashboard
Database Design Tools and Data Modelling: MS Visio, ERWIN, Teradata SQL Assistant, Toad SQL, PowerDesigner, ER/Studio
Big Data Tools: Hadoop, Spark, Hive, Pig, AWS S3, AWS Redshift, AWS DynamoDB, AWS RDS, AWS VPC, AWS EC2, AWS Elastic Beanstalk, AWS IAM, Databricks, Kafka
Version Control: Git, GitHub
Other Tools: Anaconda Navigator, Jupyter Notebook, Azure Data Factory, Azure Databricks, Azure Analysis Services, Looker, MongoDB (Robo Mongo), Smart View, Nexus
Languages: T-SQL, Java, Scala, PL/SQL, SQL, C, C++, XML, HTTP, MATLAB, DAX, Python, R, SAS E-miner, SAS
PROFESSIONAL EXPERIENCE:
Liberty Mutual Insurance - Boston, MA Feb 2022 to Present
Senior Data Analyst
Responsibilities:
Performing T-SQL tuning and optimization of long-running queries using MS SQL Profiler and Database Engine Tuning Advisor.
Demonstrated ability to create advanced analytics solutions using Qlik's associative data model, enabling dynamic data exploration and discovery for users across different business functions.
Proficient in Adobe Analytics, with hands-on experience in setting up and configuring tracking parameters to capture user interactions across digital channels.
Utilized Adobe Analytics to conduct in-depth analysis of website traffic, user engagement, and conversion funnels, providing actionable insights to improve online customer experiences.
Developed custom reports and dashboards in Adobe Analytics to visualize key performance metrics, such as page views, bounce rates, and conversion rates, tailored to meet business objectives.
Implemented segmentation strategies in Adobe Analytics to identify and target high-value customer segments, enabling personalized marketing campaigns and content recommendations.
Implemented and maintained custom objects, fields, and page layouts in Salesforce to support business processes and data requirements, ensuring system integrity and scalability.
Designed and executed data migration strategies to migrate legacy data into Salesforce, leveraging data loader tools and best practices to ensure data accuracy and completeness.
Integrated Salesforce with other systems and applications, such as marketing automation platforms and customer service tools, to streamline data flow and enhance cross-functional collaboration.
Skilled in customizing Looker interfaces and implementing advanced features such as dynamic filtering, parameterized reports, and drill-down functionality to provide users with interactive and insightful data experiences.
Developed Confluence templates and macros to streamline documentation processes and improve consistency.
Implemented Confluence spaces for project documentation, meeting notes, and technical specifications.
Used Teradata SQL Assistant to implement SQL logic later used in DataStream to perform ETL jobs or to create simple Excel reports.
Expertise in building custom extensions and visualizations in Qlik to meet specific business requirements and enhance the overall user experience of analytical applications.
Proficient in Qlik scripting language (QlikView Scripting and Qlik Sense Scripting), allowing for data manipulation, aggregation, and calculation tasks to support complex analytical needs.
Experienced in conducting training sessions and workshops to educate end-users on Looker best practices, empowering them to effectively leverage the platform for data analysis and reporting tasks.
Configured Azure Kubernetes Service nodes for dev, staging, and perf environments, followed by performance testing.
Expertise in full life cycle Business Intelligence implementations and an understanding of all aspects of an implementation project using OBIEE 10.1.3.x/Siebel Analytics 7.x.
Continuously learning and staying updated on emerging trends and technologies in visualization design, BigQuery, and SQL, to incorporate innovative techniques and tools into projects and enhance skills to deliver maximum value for the organization.
Extensive experience in IT data analytics projects; hands-on experience migrating on-premises ETLs to Google Cloud Platform (GCP) using cloud-native tools.
Demonstrated experience in conducting User Acceptance Testing (UAT) and documenting test cases in both personal and commercial lines of property and casualty insurance.
Reverse engineered the backend database to revise T-SQL scripts and create views, stored procedures, triggers, and functions, drastically improving performance.
Developed merge jobs in Python to extract and load data into a MySQL database; also worked on Python ETL file loading and the use of regular expressions (a brief sketch appears after this list).
Worked with Import and DirectQuery modes and created custom tables in Power BI.
Merged and appended queries, removed and split columns, and selected the required columns in the data.
Environment: ER Studio 9.7, Tableau 9.03, AWS, Teradata 15, MDM, Git, Unix, Python 3.5.2, MLlib, SAS, regression, logistic regression, Hadoop, NoSQL, OLTP, random forest, OLAP, HDFS, ODS, NLTK, SVM, JSON, XML, MapReduce, Google Dialogflow.
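A minimal sketch of the Python merge-and-load ETL mentioned above, assuming pandas and SQLAlchemy; the file names, columns, table name, regular expression, and connection string are hypothetical placeholders rather than the actual production job:

    # Illustrative Python merge job: extract two CSV files, clean a column
    # with a regular expression, merge them, and load the result into MySQL.
    import re

    import pandas as pd
    from sqlalchemy import create_engine


    def clean_phone(value):
        # Keep digits only, e.g. "(312) 555-0100" -> "3125550100".
        return re.sub(r"\D", "", str(value))


    def run_etl(customers_csv, orders_csv, mysql_url):
        customers = pd.read_csv(customers_csv)
        orders = pd.read_csv(orders_csv)

        # Standardize a column with a regular expression before merging.
        customers["phone"] = customers["phone"].map(clean_phone)

        # Merge the two extracts on their shared key.
        merged = customers.merge(orders, on="customer_id", how="inner")

        # Load the merged result into MySQL; append keeps existing rows.
        engine = create_engine(mysql_url)  # e.g. "mysql+pymysql://user:pwd@host/db"
        merged.to_sql("customer_orders", engine, if_exists="append", index=False)


    if __name__ == "__main__":
        run_etl("customers.csv", "orders.csv", "mysql+pymysql://user:pwd@localhost/analytics")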
U.S. Bancorp - Minneapolis, Minnesota Aug 2020 to Jan 2022
Senior Data Analyst
Responsibilities:
Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
Analysed business requirements and employed Unified Modeling Language (UML) to develop high-level and low-level Use Cases, Activity Diagrams, Sequence Diagrams, Class Diagrams, Data flow Diagrams.
Analysed business requirements and segregated them into high-level and low-level use cases and activity diagrams using Rational Rose, following UML methodology, thus defining the data process models.
Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
Utilized the corporation-developed Agile SDLC methodology. Used Scrum Work Pro and Microsoft Office software to perform required job functions.
Used shared containers and created reusable components for local and shared use in the ETL process.
Created and maintained sales reports using MS Excel, SQL in Teradata, and MS Access; produced performance reports and implemented changes for improved reporting (a brief sketch appears after this list).
Worked with regional and country AML Compliance leads to support the start-up of compliance-led projects at regional and country levels, including defining the subsequent phases: training, data migration, the uplift strategy (updating customer information to bring it to the new KYC standards), and review of customer documentation.
Provided weekly, monthly & ad hoc web analytics reports using Adobe Site Catalyst & Google Analytics.
Worked on developing Tableau data visualizations using cross maps, scatter plots, geographic maps, pie charts, bar charts, page trails, and density charts.
Used Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
Substantial report development experience utilizing SQL Server Reporting Services (SSRS), and Microsoft Excel.
Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet, and Transformations.
Helped the team with forecasting by providing accurate results through ad-hoc queries and data support, including in-depth analysis and reporting.
Writing PL/SQL procedures for processing business logic in the database. Tuning of SQL queries for better performance.
Created Excel templates, built pivot tables, and utilized VLOOKUPs with complex formulas.
Utilized complex Excel functions such as pivot tables and VLOOKUPs to manage large data sets and make information readable for other teams.
Involved in developing SQL-based data warehouse environments and created multiple custom database applications for data archiving, analysis, and reporting purposes.
Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data within the Oracle database.
Worked with the Enterprise Data warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
Environment: Teradata, UNIX Shell Scripts, SQL, Snowflake, MS Azure, Databricks, Power BI, Python, Azure Analysis Services, SQL Server, Azure Data Lake, NumPy, Pandas, Matplotlib, Docker, AWS, Airflow, Spark, AWS CloudSearch, Azure Databricks.
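A minimal sketch of the kind of ad-hoc Teradata query and Excel report noted above, assuming pandas, SQLAlchemy, and a Teradata SQLAlchemy dialect are available; the connection string, query, columns, and file name are hypothetical:

    # Illustrative ad-hoc extract pivoted into an Excel report.
    import pandas as pd
    from sqlalchemy import create_engine


    def build_sales_report(conn_url, out_path):
        engine = create_engine(conn_url)  # e.g. "teradatasql://user:pwd@tdhost"

        # Ad-hoc extract of monthly revenue by region.
        sales = pd.read_sql(
            "SELECT region, sales_month, revenue FROM sales_summary", engine
        )

        # Pivot the extract the same way an Excel pivot table would.
        report = sales.pivot_table(
            index="region", columns="sales_month", values="revenue", aggfunc="sum"
        )

        # Write the result to an Excel workbook for the business teams.
        report.to_excel(out_path, sheet_name="monthly_sales")


    if __name__ == "__main__":
        build_sales_report("teradatasql://user:pwd@tdhost", "sales_report.xlsx")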
Mayo Clinic - Rochester, Minnesota March 2018 to July 2020
Data Analyst
Responsibilities:
Good understanding of Teradata SQL Assistant and of data analytics, data reporting, graphs, pivot tables, and OLAP reporting.
Used R to validate the graphs and modify the variables to fit the model, then generated a comprehensive report.
Designed and developed artifacts for Data lineage & ILDM (Source to target mapping).
Captured and stored execution plans using SQL Server Profiler and created the required indexes.
Designed and created ETL packages using SSIS to generate Data Warehouses from various tables and file sources, including Excel and flat files. The data was transformed using a variety of SSIS techniques, including derived columns, aggregations, merge joins, count, conditional split, and more.
Employed Informatica PowerCenter to perform extraction, transformation, and loading.
Optimized existing MS Excel procedures using Tableau and Python to provide complete business intelligence solutions.
Migrated Teradata database to PostgreSQL database in AWS.
Involved in testing the SQL Scripts for report development, Tableau reports, Dashboards, Scorecards and handled the performance issues effectively.
Worked extensively with Tableau Business Intelligence tool to develop various dashboards.
Built business metrics in Spark SQL to create business value.
Stored and retrieved data from data-warehouses using Amazon Redshift.
Developed a streaming application using Spark Streaming and Kafka for real-time event ingestion and processing to be used for fraud defence mechanisms (see the sketch after this list).
Handled Tableau Server logons and security checks.
Prepared dashboards using calculations and parameters in Tableau.
Produced a range of graphical reports with NumPy and Matplotlib, two Python libraries.
Used Python SciPy to do data exploration in order to identify trends and choose features.
Worked extensively on Hive, Impala and Pig for data munging and analysis.
Developed complex database objects such as stored procedures, functions, packages, and triggers using Oracle Database, SQL, and PL/SQL.
Used Python scripts to update content in the database and manipulate files.
Used Metadata Hub to import metadata from the repository, set up new job categories, and create new data elements.
Environment: Teradata, UNIX Shell Scripts, MS Excel, MS PowerPoint, AWS, Tableau, PostgreSQL, SQL, Spark SQL, Amazon Redshift, Spark Streaming, Kafka, R, Python, data lineage tools, ILDM tools, Teradata SQL Assistant, NumPy, Matplotlib, Oracle Database, PL/SQL, SQL Server Profiler, SSIS, Hive.
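A minimal sketch of a Spark Structured Streaming job consuming Kafka events, in the spirit of the streaming bullet above; the topic name, broker address, schema, and flagging rule are hypothetical placeholders, not the actual fraud-defence logic:

    # Illustrative PySpark job: read events from Kafka, parse the JSON payload,
    # flag large transactions, and write the flagged records to the console.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import DoubleType, StringType, StructField, StructType

    spark = SparkSession.builder.appName("fraud-events").getOrCreate()

    event_schema = StructType([
        StructField("event_id", StringType()),
        StructField("account_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    # Read the raw event stream from a Kafka topic.
    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker:9092")
           .option("subscribe", "transactions")
           .load())

    # Parse the JSON payload and flag unusually large transactions.
    events = (raw.selectExpr("CAST(value AS STRING) AS json")
              .select(F.from_json("json", event_schema).alias("e"))
              .select("e.*")
              .withColumn("suspicious", F.col("amount") > 10000))

    # A production job would write to a sink feeding the fraud-defence process.
    query = (events.filter("suspicious")
             .writeStream
             .outputMode("append")
             .format("console")
             .start())

    query.awaitTermination()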
Mastercard - Hyderabad, India May 2014 to Nov 2017
Data Analyst
Responsibilities:
Developed storytelling dashboards in Tableau Desktop and published them to Tableau Server, allowing end users to understand the data on the fly with quick filters for on-demand information.
Developed and created classes with Dimension, Detail & Measure objects and developed Custom hierarchies to support drill down reports.
Responsible for gathering requirements from Business Analysts and identifying the data sources required for the requests.
Wrote SAS Programs to convert Excel data into Teradata tables.
Actively reviewed over 208 unique variables and 4,700 rows of data using Excel and Python.
Used multiple Python data science packages such as Pandas, NumPy, SciPy, Scikit-learn, and NLTK.
Involved in code-review sessions on existing SSRS reports' performance and dataset query tuning.
Used MDX scripting for querying OLAP cubes.
Performed detailed data analysis (i.e., determining the structure, content, and quality of the data through examination of source systems and data samples) using SQL and Python (a brief sketch appears after this list).
Created drill-down and drill-up functionality in Tableau worksheets.
Developed Teradata SQL scripts using OLAP functions such as RANK() OVER to improve query performance while pulling data from large tables.
Worked on importing/exporting large amounts of data from files to Teradata and vice versa.
Created Derived tables in the BO Designer to enhance the capabilities and performance of the universe.
Created report schedules on Tableau server.
Used inner and outer joins while creating tables from multiple tables.
Designed and developed weekly, monthly reports related to the logistics and manufacturing departments using Teradata SQL.
Environment: Tableau, Teradata, BTEQ, VBA, Python, SAS, UNIX, SQL, Windows, Business Objects, Excel, Pandas, NumPy, SciPy, Scikit-learn, NLTK, SSRS, MDX, Tabular Modeling, SSDT, Analysis Services, OLAP.
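A minimal sketch of the structure/content/quality profiling step described above, done in pandas; the file name and columns are hypothetical, and the real analysis also used SQL against the source systems:

    # Illustrative data-profiling pass: dtypes, missing values, distinct counts.
    import pandas as pd


    def profile(path):
        df = pd.read_csv(path)

        # Structure: column names and data types.
        summary = pd.DataFrame({"dtype": df.dtypes.astype(str)})

        # Content and quality: missing-value share, distinct counts, a sample value.
        summary["null_pct"] = df.isna().mean().round(3)
        summary["n_unique"] = df.nunique()
        summary["sample_value"] = df.apply(
            lambda col: col.dropna().iloc[0] if col.notna().any() else None
        )
        return summary


    if __name__ == "__main__":
        print(profile("transactions_sample.csv"))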