
Software Engineering Data Analytics

Location:
Richardson, TX
Posted:
April 10, 2024


VENKATA KALYANI MUNAGALA

Dallas, Texas ***** 682-***-**** ad4xbh@r.postjobfree.com www.linkedin.com/in/Kalyani-Munagala

EDUCATION

The University of Texas at Dallas (Master's in Business Analytics (Data Science)), Dallas, USA. Aug 2022 – Dec 2023

Jawaharlal Nehru Technological University (Master of Business Administration), Anantapur, India. Aug 2009 – July 2011

Sri Venkateswara University (Bachelor of Commerce (Computer Applications)), Tirupati, India. July 2006 – April 2009

PROFESSIONAL EXPERIENCE

Citi, Dallas, TX, USA Sep 2023 – Present

Data Analyst

Responsibilities:

Conducted comprehensive data quality assessments, data profiling and executed data cleansing activities in Snowflake, resulting in a 25% increase in data accuracy and reliability for critical business reports and analysis.

Automated data pipelines in Informatica ETL, integrating diverse data sources, streamlining data processing workflows, and enhancing overall data accessibility and availability.

Prepared the complete data mapping for all the migrated jobs using SSIS.

Involved in Azure data movement and transformation capabilities (Azure Data Factory, Data Lake Analytics, Databricks, and Stream Analytics).

Administered SQL databases and developed and deployed ETL packages in the production environment.

Designed and implemented complex data pipelines using Spark, enabling seamless ETL (Extract, Transform, Load) operations, ensuring data consistency and integrity throughout the process.

Used Tableau Desktop to connect to various data sources, worked with different visualizations, and created dashboards.

Created page-level, report-level, and visual-level filters in Tableau according to the requirements.

Utilized Tableau desktop to create various analytical dashboards that depict critical KPIs.

Developed SQL joins, queries, views, test tables, and scripts, and tuned SQL in the development environment.

Published Tableau reports to the required organizations and made Tableau dashboards available to web clients. Successfully integrated custom visuals based on business requirements in Tableau Desktop.

Tech Stack: Snowflake SQL, SQL, PL/SQL, Star Schema, DWH, Azure Cloud, Informatica PowerCenter, Tableau Desktop & Server, ETL, T-SQL, Pivot, CSV, Python, Excel.

JPMorgan Chase, Bangalore, India Mar 2019 – Aug 2022

Data Analyst

Responsibilities:

Gathered and analyzed business requirements and developed workflows in Informatica PowerCenter.

Designed, implemented, and supported Unix scripts using Git, Bitbucket, Jules, and Jenkins.

Managed production systems, solved user problems, and automated operational tasks. Engaged and collaborated with other scrum teams to resolve critical and complex issues. Reviewed the assignment of incidents and ensured maintenance of a healthy backlog. Wrote Python scripts to automate manual data entry tasks, minimizing the risk of human error.

Migrated the legacy jobs and PL/SQL scripts to SQL and Informatica, and automated the entire worker and seat assignment process in Autosys, making it easy to troubleshoot, add transformations, and update SQL queries.

Undertook EMS onboarding, trained users globally, developed their comfort with the new system, and overcame resistance to change.

Created Splunk dashboards and monitored multiple applications; maintained knowledge base content to capture new learning for reuse throughout the company and user base.

Employed Tableau Desktop to create data visualizations of property and seat assignment data, aiding stakeholders in making informed decisions.

Crafted Tableau dashboards utilizing workforce data.

Created reports using data acquisition tools for monthly, quarterly, and ad-hoc reports.

Performed continuous improvement on existing report templates and developed templates for new projects.

Routinely ran report templates, emphasizing clarity and regularity in result dissemination.

Evaluated data quality, documenting issues with reliability, validity, precision, and accuracy.

Helped devise and execute strategies to correct data quality where needed, maintaining clear records of data changes.

Collaborated with the Manager/Clients/Stakeholders to provide updates on completed and pending tasks for monthly and quarterly data.

Worked with 3rd party entities and internal stakeholders for ad hoc requests.

Familiarized with regulatory reporting requirements for each project.

Ensured meticulous QA/UAT on large datasets.

Engaged in technical communications to share best practices and learn new technologies and other ecosystem applications.

Identified customers, articulated customer expectations, and established continuous improvements to meet those expectations.

Provided exemplary customer service to internal and external customers, demonstrating professionalism, courtesy, and efficiency in addressing inquiries, issues, and support needs.

Utilized the ITSM/ServiceNow ticketing system to receive, process, and respond to technical support requests via email, telephone, and other support channels, ensuring timely resolution and documentation of all interactions.

Collaborated with Tier 2/3 team members and customer support leadership to escalate and resolve complex technical issues, contributing to the continuous improvement of departmental operations and service delivery.

Performed data-driven analyses to quantify the current software state and identify ways to improve the software experience.

Performed advanced monitoring and alerting, debugging, and troubleshooting.

Performed performance monitoring and capacity management of large systems using various tools. Worked in Agile mode; proficient in continuous integration and continuous delivery.

Presented proposed solutions and project status to senior management. Participated in weekly, daily, and ad-hoc meetings.

Demonstrated strong attention to detail in writing medium to complex stored procedures, optimizing database performance and enhancing data accessibility.

Collaborated effectively within interdisciplinary teams to design and implement data pipelines, data warehouses, and data marts, focusing on performance tuning and automation.

Applied extensive knowledge of data movement, ETL, and OLAP techniques to streamline data processing workflows and ensure best practices for data tracking.

Communicated effectively with stakeholders, sharing knowledge and insights to facilitate continuous improvement and iterate on engineering solutions.

Demonstrated self-motivation and a proactive approach to problem-solving, focusing on delivering results and driving innovation.

Applied Big Data expertise (Hadoop, MongoDB) to optimize data processing and storage, leveraging the right tools for the job.

Tech Stack: SQL, PL/SQL, Python, Oracle, MS SQL, WinSCP, PuTTY, SQL Server, FileMaker, ETL Informatica PowerCenter, Databricks, DWH, Tableau (Desktop & Server), Power BI, Microsoft Excel, Snowflake, Data Formatting, Data Analysis, Data Visualization, Query Optimization, Business Requirement Documents (BRDs), Statistical Analysis, Hypothesis Testing, Data Analysis Tool, Dashboard Design, Application Configuration, ITSM, ServiceNow.

Cisco Systems (India) Pvt Ltd, Bangalore, India Aug 2018 – Nov 2018

Tableau Developer

Responsibilities:

Developed workbooks & BI reports from multiple data sources using data blending, dashboards and visualizations that empowered decision makers using Tableau desktop.

Interacted with business managers to gather requirements, converted them to technical design, and developed and delivered reports after testing new requirements.

Monitored jobs that process terabytes of data for the daily refresh of the Tableau data sets.

Optimized multiple data sets in Tableau to enhance performance. Quickly and effectively diagnosed requests regarding data integrity and functional or interactive issues in existing dashboards.

Analyzed the specifications provided by the client to understand the business.

Gathered analytical requirements from business users and translated them into technical requirements.

Created rich graphic visualizations with drill-downs and parameters using Tableau.

Created Tableau worksheets to meet customer requirements.

Prepared dashboards using calculations and parameters in Tableau.

Applied local and global data filters on worksheets and dashboards.

Published and scheduled dashboards on Tableau Server, managing permissions as a site admin.

Worked on Tableau Server administration: managing users/groups, security, backups, and dashboard scheduling.

Tech Stack: SQL, Oracle, MS SQL, WinSCP, PuTTY, Tableau (Desktop & Server), Microsoft Excel, Alteryx, Data Formatting, Data Analysis, Data Visualization, Data Analysis Tool, Dashboard Design, Application Configuration, Hive/Impala, Big Data.

Hewlett-Packard/Hewlett Packard Enterprise, Bangalore, India Feb 2015 – Aug 2018

Senior Software Engineer

Responsibilities:

Developed, tested, and deployed data extraction, transformation, and loading (ETL) solutions across multiple ETL tools as well as Hadoop. Collaborated with the US stakeholders to understand business requirements, converted them to technical design, and developed, tested, and delivered data-driven solutions for the HP Sales division.

Brought down data processing time by 35% by innovatively changing the process from extract-transform-load (ETL) to extract-load-transform (ELT) when dealing with large data. Utilized Python to clean and preprocess large datasets for further analysis.

Trained two junior team members and reviewed tasks allocated to them. Acted as a single point of contact for the SFDC (Salesforce) team, explaining requirements and fixing issues.

Migrated projects from legacy ETL systems to Big Data clusters. Created a visual dashboard to predict future sales.

Worked on all aspects of Power BI/Tableau dashboards, such as performance, testing, dashboard design, and tuning.

Worked closely with the DXC and Deloitte teams.

Involved in gap analysis meetings to understand existing SQL Server data structures. Involved in the analysis of the SaaS applications (SFDC & Apttus) and in the design of staging data structures. Served as a Module Lead for the SFDC & Apttus subject areas, handling a team of 3 developers.

Involved in all phases of the SDLC process and acted as a liaison point between IT and the rest of the business/clients.

Involved in requirement analysis. Dealt directly with the US business owners/clients to understand the requirements. Presented proposed solutions and project status to senior management.

Created rich graphic visualizations with drill-downs and parameters using Tableau. Created Tableau worksheets to meet customer requirements. Responded to users regarding access privileges and questions about data.

Leveraged SQL and PL/SQL experience across OLTP, Data Warehouse, and Big Data databases, emphasizing PL/SQL proficiency to design and implement efficient database systems.

Built dashboards using sets, parameters, groups, hierarchies, and quick table calculations. Applied local and global data filters on worksheets and dashboards. Published and scheduled the dashboards on Tableau Servers.

Followed the ELT approach to leverage the power of Vertica and make the data available to users in one-third the time taken by the ETL approach.

Created tables, sequences, indexes, and synonyms. Designed a new process in which the entire ELT is built using MuleSoft and vSQL, helping reduce the overall data load time.

Automated the ELT process with the help of Cisco Tidal Enterprise Scheduler. Designed strategies to handle faulty records while loading huge volumes of data.

Performance-tuned the Vertica database by rewriting vSQL queries, running analyze statistics, and purging delete vectors from Vertica. Involved in the phased migration process and fixed the code for the issues identified.

Involved in post-production support by providing solutions and fixes for ETL issues.

Delivered exceptional customer service to both internal and external stakeholders, consistently exhibiting professionalism, courtesy, and efficiency in resolving inquiries and support needs.

Managed technical support requests using the ITSM/HPSM ticketing system across multiple channels, ensuring prompt response, resolution, and comprehensive documentation of all interactions.

Tech Stack: SQL, PL/SQL, Oracle, Vertica, MS SQL, WinSCP, PuTTY, SQL Server, ETL Informatica PowerCenter, Alteryx, DWH, Tableau (Desktop & Server), Power BI, Microsoft Excel, Snowflake, Data Formatting, Data Analysis, Data Visualization, Query Optimization, Business Requirement Documents (BRDs), Data Analysis Tool, Dashboard Design, Application Configuration, ITSM, HPSM.

ACADEMIC PROJECT

Database Foundations for Business Analytics Aug 2022 – Dec 2022

Project completed separately in both Oracle SQL and MongoDB Atlas (NoSQL technology).

•Created a client-server application for the university librarian through the design, development, and implementation of the existing library database. Made the entity-relationship diagram for the schema using crow's-foot notation in MS Visio.

•Created DB tables from the high-level schema, normalized existing data, generated data for new book and borrower records, and implemented complete book-search and availability-check functions in both Oracle SQL and MongoDB Atlas.

ACHIEVEMENTS

Google Analytics Certified in 2023.

Udemy certified in Tableau Desktop & Server and AWS Developer.

Promoted to Associate at JPMorgan Chase in 2021.

Alteryx Foundational Micro-Credential certified by Alteryx.

Promoted to Senior Software Engineer at HP in 2017.

Innovative extract-load-transform (ELT) approach published in HP's Center of Excellence journal in 2016.

Elected Vice President, Women's Association (2010 and 2011); annual inter-college competition (2010 and 2011).

TECHNICAL SKILLS

Programming Languages : SQL, PL/SQL, Unix, R, Python, NoSQL

Key Technical Skills : Hadoop, HDFS, Hive, SQL, Arcadia, Informatica 10.2/9.6/9.5, Excel, Tableau (Desktop & Server), Vertica, SQL Server, SQuirreL SQL Client 3.7.1, AWS, Splunk, Power BI

RDBMS : Oracle Database, SQL Server, MySQL, Teradata, SQL Workbench, Snowflake

Methodologies : Agile, Waterfall

Operating Systems : Windows, Linux

Version Control : GitLab, SVN, Bitbucket, Jules

Schedulers : Jenkins, Cisco Tidal Enterprise Scheduler, Autosys

Eligibility: Eligible to work in the U.S. for internships & full-time employment without sponsorship


