Name: Triveni Ganta
Email: *********@*****.***
Contact: 612-***-****
SUMMARY:
IT professional with 3+ years of functional and industry experience and accomplished process and project responsibility, spanning data analysis, design, development, quality assurance, user acceptance, and performance management disciplines, using Lean and Agile techniques with teams of varying sizes composed of permanent and contractor resources.
Experience in Data Modeling and Architecture, Database Administration, Data Conversion Validation, Data Warehouse Development, Report Creation, Data Conversion, Applications Testing, Software Quality Assurance, User Acceptance Testing, Training and Support.
Experienced and knowledgeable in the Systems Development Life Cycle (SDLC), including requirements gathering, analysis, design, and implementation, as well as Agile (Scrum) software development.
Knowledge of and experience in banking applications covering deposits, credit cards, home loans, and auto loans.
Strong Teradata skills, including building and maintaining Teradata tables, views, constraints, indexes, SQL and PL/SQL scripts, functions, triggers, and stored procedures.
Experience creating Teradata objects, including Volatile, Derived, Global Temporary, and Multiset tables, as needed for retrieving data.
Extensive experience creating ad hoc reports using Teradata, SQL Server, BTEQ scripts, and Unix.
Created and utilized subqueries, views, and macros as needed on the job.
Used aggregations, set operators (UNION, MINUS, INTERSECT), CASE expressions, and string expressions to retrieve data in the required form from multiple tables (illustrated in the sketch after this summary).
Good knowledge of the Tableau reporting tool and hands-on experience creating heat maps.
Loaded data from flat files into Teradata tables using SAS PROC IMPORT and FastLoad techniques.
Expertise in data mining, querying and mining large datasets to discover transition patterns and examine financial data.
Hands-on experience troubleshooting test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
Developed dynamic Excel macros to clean, transform, and analyze HR/payroll/sales data.
Created custom dashboards and reports using VBA to enhance decision-making.
Experience developing data applications with Python in Linux/Windows and Teradata environments.
Experienced in conducting gap analysis to identify the delta between the current and potential performance of existing software applications.
Recognized for partnering with business leaders and technical teams to plan, integrate, document and execute complex project plans on time and on budget.
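The Teradata techniques listed above (volatile tables, aggregations, set operators, CASE expressions) are illustrated by the following minimal sketch. It is not code from any of the projects below; the ODBC DSN, database, table, and column names are hypothetical, and pyodbc with a configured Teradata ODBC driver is assumed.

```python
# Minimal sketch: staging data in a Teradata volatile table, then retrieving it
# with a CASE expression, aggregation, and a set operator. All object names
# (DSN "TDPROD", database "edw_db", tables/columns) are hypothetical.
import pyodbc

conn = pyodbc.connect("DSN=TDPROD", autocommit=True)
cur = conn.cursor()

# Volatile table holding the current snapshot for this session only.
cur.execute("""
    CREATE VOLATILE TABLE stg_accounts AS (
        SELECT acct_id, product_cd, balance_amt
        FROM   edw_db.accounts
        WHERE  snapshot_dt = CURRENT_DATE
    ) WITH DATA ON COMMIT PRESERVE ROWS
""")

# Aggregate deposits with a CASE expression and combine with credit cards via UNION.
cur.execute("""
    SELECT product_cd,
           SUM(CASE WHEN balance_amt > 0 THEN balance_amt ELSE 0 END) AS total_bal
    FROM   stg_accounts
    WHERE  product_cd = 'DEPOSIT'
    GROUP BY 1
    UNION
    SELECT product_cd, SUM(balance_amt) AS total_bal
    FROM   stg_accounts
    WHERE  product_cd = 'CREDIT_CARD'
    GROUP BY 1
""")
for product_cd, total_bal in cur.fetchall():
    print(product_cd, total_bal)

conn.close()
```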
TECHNICAL SKILLS:
ETL & Big Data: Informatica 9.1/8.6/7.1.2, SSIS, DataStage 8.x, Hadoop, Hive
GUI Reporting Tools: Business Objects 6.5, Brio, Hyperion, Tableau, Unica Affinium Campaign
Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin
Testing Tools: WinRunner, LoadRunner, TestDirector, Mercury Quality Center, Rational ClearQuest
RDBMS: Oracle 11g/10g/9i/8i/7.x, MS SQL Server, UDB DB2 9.x, Teradata V2R6/R12/R13/R14, MS Access 7.0
Programming: SQL, PL/SQL, UNIX Shell Scripting, VB Script, Python
Environment: Windows (95, 98, 2000, NT, XP), UNIX
Other Tools: TOAD, AWS, MS-Office suite (Word, Excel, Project and Outlook), BTEQ, Teradata V2R6/R12/R13 SQL Assistant
PROFESSIONAL EXPERIENCE:
Data Analyst
BCBS March 2024 – Present
Responsibilities:
Involved in analyzing, designing, and documenting business requirements and data specifications; supported data warehouse extraction programs, end-user reports, and quality assurance.
Worked on numerous ad-hoc data pulls for business analysis and monitoring by writing SQL scripts.
Created monthly and quarterly business monitoring reports by writing Teradata SQL queries that include system calendars, inner joins, and outer joins to retrieve data from multiple tables.
Designed predictive and spatial analytics models in Alteryx to support business decision-making.
Developed BTEQ scripts in Unix using PuTTY and used crontab to automate the batch scripts and execute scheduled jobs in Unix.
Performed verification and validation for accuracy of data in the monthly/quarterly reports.
Analyzed and validated data in the Hadoop data lake by querying Hive tables.
Proficient in using VLOOKUP, HLOOKUP, and INDEX/MATCH for dynamic data retrieval and cross-referencing across multiple datasets.
Skilled in utilizing Power Query, text functions, and conditional formatting to preprocess and structure raw data.
Created reports and charts by querying data using Hive Query Language and reported gaps in the data loaded into the lake.
Developed Teradata SQL scripts using RANK functions to improve the query performance while pulling the data from large tables.
Designed and optimized Alteryx workflows to clean, blend, and analyze large datasets.
Automated data preparation, reducing processing time from X hours to Y minutes.
Experience performing dual data validation on various business-critical reports, working with another analyst.
Developed Alteryx workflows to integrate HR/finance/supply chain data from multiple sources.
Created dashboards in Tableau/Power BI to visualize A/B test performance and report insights to stakeholders.
Designed and executed A/B tests to optimize user experience, conversion rates, and product performance.
Supported agile best practices, contributing to process improvements and team efficiency.
Designed compelling visualizations using Tableau, publishing and presenting dashboards on web and desktop platforms.
Developed Python programs to read data from various Teradata tables, consolidate it into a single CSV file, and update content in the database tables (a brief sketch follows this section).
Technical Skills: Teradata SQL Assistant, Teradata, Teradata loading utilities (BTEQ, FastLoad, MultiLoad), Python, Unica Affinium Campaign, Hadoop, Hive, UNIX Shell Scripts, Tableau, MS Excel, MS PowerPoint, Agile, Alteryx.
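As a brief illustration of the Python-to-CSV consolidation described in this section, the sketch below reads several Teradata tables, stacks them into one CSV, and pushes an update back to a table. It is a hedged example, not the production program: the DSN, table names, column names, and file path are hypothetical placeholders, and pandas plus pyodbc with a Teradata ODBC driver are assumed.

```python
# Sketch only: consolidate several Teradata tables into one CSV and update a
# status table. DSN "TDPROD", database "rpt_db", and all table/column names
# are hypothetical placeholders.
import pandas as pd
import pyodbc

TABLES = ["claims_2024_q1", "claims_2024_q2"]   # hypothetical source tables

conn = pyodbc.connect("DSN=TDPROD", autocommit=True)

# Read each table into a DataFrame and stack them into one dataset.
frames = [pd.read_sql(f"SELECT * FROM rpt_db.{t}", conn) for t in TABLES]
combined = pd.concat(frames, ignore_index=True)

# Write the consolidated data to a single CSV for downstream reporting.
combined.to_csv("combined_claims.csv", index=False)

# Update content in a database table from the combined data.
cur = conn.cursor()
cur.executemany(
    "UPDATE rpt_db.claim_status SET status_cd = ? WHERE claim_id = ?",
    list(combined[["status_cd", "claim_id"]].itertuples(index=False, name=None)),
)
conn.close()
```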
Data Analyst
Walmart Mar 2023 – Feb 2024
Responsibilities:
Responsible for gathering requirements from business analysts and operational analysts; identified the data sources required for the reports needed by customers.
Used Python programs to automate the process of combining large datasets and data files and loading them into Teradata tables for data analysis (a sketch of this pattern follows this section).
Created automated Python programs to archive large, unused database tables into mainframe folders.
Developed Python programs to manipulate arrays using libraries such as NumPy.
Automated reporting process using Alteryx, improving data accuracy and reducing turnaround time.
Capable of leveraging Excel macros (VBA) and Power Pivot for process automation and handling large datasets.
Strong command of nested Ifs, SUMIFS, COUNTIFS, XLOOKUP, and array formulas for complex data manipulation.
Integrated Excel with external databases (SQL, Access) via VBA to streamline data retrieval.
Wrote SQL scripts for large data pulls and ad hoc reports for analysis, using advanced Teradata techniques such as RANK and ROW_NUMBER.
Generated graphs using MS Excel pivot tables and created presentations using PowerPoint.
Generated reports using PROC TABULATE, PROC REPORT, DATA _NULL_, PROC SQL, and macros. Used OLAP functions such as SUM, COUNT, and CSUM.
Analyzed A/B test results using statistical methods (confidence intervals, p-values) to ensure data-driven decision-making.
Created predictive models using Alteryx for employee attrition/sales forecasting.
Created Set, Multiset, Derived, Volatile, and Global Temporary tables to retrieve data from multiple tables.
Experience writing Korn shell scripts to automate jobs. Automated reports by connecting to Teradata from MS Excel using ODBC.
Technical Skills: Teradata, Teradata utilities (SQL Assistant, BTEQ, FastLoad, FastExport), Hadoop, Alteryx.
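As referenced in the first bullet of this section, a minimal sketch of combining flat data files and loading them into a Teradata table follows. The file pattern, DSN, and table definition are hypothetical; pandas and pyodbc with a Teradata ODBC driver are assumed, and a bulk utility such as FastLoad would normally be used for very large volumes.

```python
# Sketch only: combine daily extract files and load them into a hypothetical
# Teradata multiset staging table via ODBC.
import glob
import pandas as pd
import pyodbc

# Combine the daily extract files into a single DataFrame (hypothetical pattern).
files = sorted(glob.glob("extracts/sales_*.csv"))
combined = pd.concat((pd.read_csv(f) for f in files), ignore_index=True)
combined["sale_dt"] = pd.to_datetime(combined["sale_dt"]).dt.date

conn = pyodbc.connect("DSN=TDPROD", autocommit=True)
cur = conn.cursor()

# Create a multiset staging table (Teradata syntax; names are placeholders).
cur.execute("""
    CREATE MULTISET TABLE stg_db.sales_combined (
        store_id INTEGER,
        sale_dt  DATE,
        amount   DECIMAL(12,2)
    ) PRIMARY INDEX (store_id)
""")

# Row-by-row insert of the combined data for analysis.
cur.executemany(
    "INSERT INTO stg_db.sales_combined (store_id, sale_dt, amount) VALUES (?, ?, ?)",
    list(combined[["store_id", "sale_dt", "amount"]].itertuples(index=False, name=None)),
)
conn.close()
```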
Junior Data Analyst
Atos Aug 2021 – Nov 2022
Responsibilities:
Created multiset tables and volatile tables from existing tables and collected statistics on tables to improve performance.
Implemented point-of-view security on Tableau dashboards to facilitate visibility across various levels of the organization.
Communicated with business users and analysts on business requirements; gathered and documented technical and business metadata about the data.
Worked on numerous ad-hoc data pulls for business analysis and monitoring by writing SQL scripts.
Collaborated with cross-functional teams to ensure smooth sprint execution and timely deliveries using Agile.
Developed Python programs to manipulate arrays using libraries such as NumPy (a brief sketch follows this section).
Collaborated with cross-functional teams to ensure data integrity and accuracy in reports and analyses.
Created Set, Multiset, Derived, Volatile, and Global Temporary tables to retrieve data from multiple tables.
Experience building automated and interactive dashboards using PivotCharts, slicers, and data validation.
Technical Skills: Teradata, Teradata utilities (SQL Assistant, BTEQ, FastLoad, FastExport), Hadoop, Python, Unix, Agile.
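The NumPy array manipulation mentioned in this section can be illustrated with a short, self-contained sketch. The numbers and variable names are made up purely for illustration.

```python
# Minimal NumPy sketch: summarizing a small, made-up matrix of monthly amounts
# for three regions (rows) across four months (columns).
import numpy as np

amounts = np.array([
    [120.0,  95.5, 130.2, 110.0],
    [200.1, 180.4, 175.0, 190.9],
    [ 80.0,  82.3,  79.8,  85.1],
])

row_totals  = amounts.sum(axis=1)      # total per region
month_means = amounts.mean(axis=0)     # average per month (broadcast below)
above_mean  = amounts > month_means    # boolean mask per cell
normalized  = (amounts - month_means) / amounts.std(axis=0)

print(row_totals)
print(month_means)
print(above_mean.sum(), "cells above their monthly mean")
print(normalized.round(2))
```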
EDUCATION:
Master of Science in Information Technology in Management, GPA 3.84/4.0, May 2024, Concordia University, St. Paul, MN, USA.
Bachelor of Engineering and Technology in Information Technology Engineering, GPA 3.4/4.0, April 2022, Mallineni Lakshmaiah Women’s Engineering College, Guntur, India.