Data Analyst

Location:
Dallas, TX
Salary:
70
Posted:
February 17, 2023

Resume:

Name: Yajvanth Sai

Email: advei5@r.postjobfree.com

Phone: 469-***-****

ETL, Data Analysis, Data Reporting, Tableau

Data Analyst with around 8 years of experience in data warehousing, data engineering, feature engineering, big data, ETL/ELT, and business intelligence, specializing in relational databases and visualization tools such as Tableau and Power BI.

PROFESSIONAL PROFILE

Around 8 years of industry experience as a Data Analyst with a solid understanding of data modeling and evaluating data sources, and a strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP, and client/server applications.

Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Informatica PowerCenter. Experience in testing and writing SQL statements: Stored Procedures, Functions, Triggers, and Packages. Proficient in AWS and SQL Server.

Experience in Data Analysis and emphasis over Business processes and tools using Excel, Teradata SQL Assistant, Query Analyzer, and SQL Developer.

Involved in all phases of the SDLC: Analysis, Development, Testing, Production Support, and User Training.

Interfacing and implementation of applications using Snowflake, Teradata 15.0, and SQL Server. Extensive experience in SQL Server Stored Procedures, Functions, Database Triggers, Views, Indexes, and advanced database concepts.

Used AWS SageMaker to quickly build, train, and deploy machine learning models.

Proficient in writing Teradata loader scripts such as BTEQ, MultiLoad, FastLoad, TPump, and FastExport.

Experience with Data Remediation, Data Profiling, Data cleansing and Data Validation.

Thorough knowledge of Data Warehouse/Data Mart/Database modeling concepts.

Well Versed with all aspects of Data Engineering, Data Modeling, Analysis, Design, Development and Visualization of various Business applications in related Database platforms.

Worked on cleaning, exploring, and manipulating source data and transforming it into the target system using Python and tools such as Pandas, NumPy, and Matplotlib.
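
A minimal sketch of that kind of source-to-target cleaning, shown here with plain Python (the field names and cleaning rules are hypothetical, for illustration only):

```python
def clean_records(rows):
    """Trim text fields, coerce amounts to float, drop bad rows and duplicates."""
    seen, out = set(), []
    for raw in rows:
        name = raw.get("customer", "").strip().title()
        try:
            amount = float(raw.get("amount", ""))
        except ValueError:
            continue  # unparseable amount: drop the row
        key = (name, amount)
        if name and key not in seen:
            seen.add(key)
            out.append({"customer": name, "amount": amount})
    return out

rows = [
    {"customer": "  alice ", "amount": "10.5"},
    {"customer": "BOB", "amount": "7"},
    {"customer": "BOB", "amount": "7"},       # exact duplicate, dropped
    {"customer": "carol", "amount": "oops"},  # bad amount, dropped
]
cleaned = clean_records(rows)
```

In practice the same trim/coerce/deduplicate steps would be expressed with Pandas (`str.strip`, `pd.to_numeric`, `drop_duplicates`) rather than hand-rolled loops.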

Created detailed reports on diesel engines for reliability analysis using Python.

Extracted, cleansed, and combined data from multiple sources and systems using R and Python.

Experience in using MS Office (Word, Excel, Access & PowerPoint), MS Visio, and Office 365.

Extensively worked on Tableau 10.1 for creating Tableau dashboards.

Involved in troubleshooting and performance tuning of reports, resolving issues within Tableau Server and Tableau Desktop. Expertise in installation, configuration, and administration of Tableau Server in a multi-server, multi-tier environment.

Excellent Excel skills in Functions, Charts, Pivot tables, Data Validation and importing and export data with other Databases and Applications.

Experienced in Software Development Lifecycle (SDLC) using SCRUM, Agile methodologies.

Technical Skills:

GUI & Reporting Tools

Business Objects, Brio, Hyperion, Tableau, Unica Affinium Campaign

Web Technologies / Other components

J2EE, XML, Log4j, HTML, CSS, JavaScript

Server Side Scripting

UNIX Shell, PowerShell Scripting

Databases

Oracle, Microsoft SQL Server, MySQL, Teradata

Programming Languages

Java, Python (Flask), R

Data Modeling

Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin

Reporting Tools:

Tableau 10.1, Tableau 8.1

Other Tools:

TOAD, MS Office suite (Word, Excel, Project, and Outlook), BTEQ, Teradata V2R6/R12/R13 SQL Assistant, Power BI

BI Tools

Tableau Desktop 10.x/9.x/8.3/7, Tableau Server 8.2, OBIEE 11g/10.1.3.4, Power BI 2.73

Education:

Bachelor's in Information Technology and Engineering from Osmania University

Master's in Management Information Systems from Northern Illinois University, DeKalb, Illinois

CAREER SUMMARY

Client: Symplr Healthcare, Kansas (Remote) Mar 2021 – Present

Role: Data Analyst

Responsibilities:

●As a Data Analyst, worked closely with Business Analysts to gather requirements and design reliable and scalable data pipelines.

●Maintained Power BI dashboards and handled data using SQL, Tableau, and Python.

●Used Microsoft Excel features such as charts, filters, VLOOKUP, and pivot tables.

●Created and published multiple dashboards and reports using Tableau Server.

●Worked on both batch and streaming data sources; used Spark Streaming and Kafka for streaming data processing.

●Created multiple custom SQL queries in Teradata SQL Workbench on the AWS and Cloudera platforms to prepare the right data sets for Tableau dashboards. Queries retrieved data from multiple tables using various join conditions, enabling efficiently optimized data extracts for Tableau workbooks.

●Performed data analysis and data profiling using complex SQL in AWS.

●Prepared dashboards using calculations and parameters in Tableau.

●Implemented the full lifecycle of data warehouses and business data marts with star and snowflake schemas.

●Built and maintained large-scale datasets in Snowflake databases for model training and online inference.

●Working knowledge on table design and data management using HDFS, Hive, Impala, Sqoop, MySQL, and Kafka.

●Developed Python scripts for data cleaning, analysis, and automating day-to-day activities.

●Developed automated reports using Tableau, Python, and MySQL, reducing manual intervention and saving 20 hours a month.

●Experienced working with source formats including CSV, JSON, Avro, and Parquet.

●Worked as a Tableau Server administrator and Tableau Desktop developer; responsible for preparing report design specifications based on user requirements.

●Generated reports using SAS procedures (PROC TAB, PROC REPORT, DATA _NULL_, PROC SQL) and macros. Used OLAP functions such as SUM, COUNT, and CSUM.

●Communicated with business users and analysts on business requirements. Gathered and documented technical and business metadata about the data.

●Created numerous processes and flow charts to meet the business needs and interacted with business users to understand their data needs.

●Created Set, Multiset, Derived, Volatile, Global Temporary tables to retrieve the data from multiple tables.

●Experienced in writing Korn shell scripts for automating jobs; automated reports by connecting to Teradata from MS Excel using ODBC.

●Documented scripts, specifications, other processes and preparation of Technical Design Documents.

●Designed visualizations using Tableau and published and presented dashboards on web and desktop platforms.
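
The Excel techniques cited above (VLOOKUP-style lookups, pivot-table summaries) have direct analogues in code; a minimal Python sketch with hypothetical account and sales data:

```python
from collections import defaultdict

# Hypothetical sales rows: (account, quarter, amount).
sales = [("A001", "Q1", 100), ("A002", "Q1", 50), ("A001", "Q2", 70)]

# Lookup table mapping account to region -- the VLOOKUP analogue.
regions = {"A001": "East", "A002": "West"}

# Pivot-table analogue: sum of amount by (region, quarter).
pivot = defaultdict(float)
for account, quarter, amount in sales:
    pivot[(regions[account], quarter)] += amount
```

A dict lookup plays the role of VLOOKUP's exact-match mode, and the accumulating dict keyed on (row label, column label) is the pivot table's cross-tabulation.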

Environment: SQL Assistant, Teradata, Teradata loading utilities (BTEQ, FastLoad, MultiLoad), Python, Unica Affinium Campaign, Hive, UNIX Shell Scripts, Tableau, Snowflake, MS Excel, MS PowerPoint, Hadoop, HIPAA, EDI

Client: HSBC, Chicago, IL Sep 2018 – Jan 2021

Role: Data Analyst

Responsibilities:

●Facilitated business meetings to work on the functional/non-functional requirements from BAs/Product Managers.

●Conducted analysis that linked customer sentiment with user behavior and product usage data.

●Developed calculations to measure NPS, Financial Confidence, and product usage.

●Designed and automated interactive KPI dashboards with sentiment scores in Tableau.

●Created interactive visualizations and dashboards using Tableau that enabled business users and executives to explore product usage and customer trends.

●Created custom SQL queries on various databases such as Teradata, MySQL, DB2 for data analysis and data validation.

●Performed data profiling and analysis on various datasets like customer profile data, branch hierarchy, contact history and email campaign data.

●Developed Python jobs to extract and load data into a MySQL database and used Apache Sqoop to load data into HDFS.

●Experience in Tableau calculations & applying complex calculations to large, complex data sets.

●Responsible for Splunk Searching and Reporting modules, knowledge objects, administration, add-ons, dashboards, clustering, and forwarder management.

●Developed Splunk queries and dashboards targeted at understanding application performance and capacity, and analyzed survey data and marketing email data from third-party tools such as Mailchimp using Excel and Tableau.

●Utilized Digital analytics data in extracting business insights and visualized the trends from the customer events tracked.

●Analyzed and developed reports using customer transactional data to create a multi-dimensional customer segmentation based on frequency of usage and product usage.

●Extensively worked on providing end to end Web Analytics services from reporting, analysis, business strategy (optimization) and implementation.

●Designed and executed A/B testing and Experience optimization campaigns on Adobe Target.

●Developed and maintained reports for both ad-hoc and ongoing business operating needs.

●Prepared dashboards with drill down functions such as date filters, parameters, actions using Tableau to reflect the data behavior over time.

●Involved in ETL processes to handle data migration from multiple business units and sources, including Oracle, Postgres, MSSQL, and Access.

●Developed Informatica jobs to populate the claims data to data warehouse - star schema.

●Administered user groups and scheduled instances for reports on large volumes of data in Tableau Server.

●Configured data extraction and scheduled incremental refreshes for data sources on Tableau server to improve performance of reports.

●Assisted in developing working documents to identify and get access permissions to database resources.

●Participated in grooming backlog, daily scrums, retrospectives and sprint reviews with product owner and technology partners to meet release timelines and adding business value.

Environment: Teradata, Teradata utilities (SQL Assistant, BTEQ, FastLoad, FastExport), Python, HDFS, Hadoop, Hive, UNIX Shell Scripts, Tableau, Snowflake, MS Excel, MS PowerPoint.

Client: McLane Global, Houston, TX Jan 2017 – Aug 2018

Role: Data Analyst

Responsibilities:

●Successfully interpreted data to draw conclusions for managerial action and strategy.

●Built custom SQL reports to monitor loss data and alert the business if the threshold value was exceeded or any unforeseen exceptions occurred.

●Created mockups and designs for Tableau reporting requirements, including the join and filter conditions.

●Provided regular status updates on the progress of remediation to the project leads.

●Created reports using SQL queries in Teradata SQL Assistant.

●Ensured data quality, data organization, metadata, and data profiling, and provided technical support to data warehouse teams.

●Created Tableau reports and published them to Tableau Server for leadership teams.

●Worked on Tableau 10.1 to build interactive dashboards with drill-down capabilities for data migration status.

●Developed automated data pipelines in Python from various external data sources (web pages, APIs, etc.) into the internal data warehouse (SQL Server), then exported the data to reporting tools.

●Performed system testing by coordinating with the business and development teams.

●Worked on data quality, data organization, metadata, and data profiling.
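
A toy version of the Python pipeline described above (external source to warehouse), using a literal JSON payload in place of a live API response and an in-memory SQLite database as a stand-in for the SQL Server warehouse (all names hypothetical):

```python
import json
import sqlite3

# Pretend API response; a real pipeline would fetch this over HTTP.
payload = json.loads('[{"sku": "X1", "qty": 3}, {"sku": "X2", "qty": 5}]')

# Load into the "warehouse" (SQLite standing in for SQL Server).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (sku TEXT PRIMARY KEY, qty INTEGER)")
conn.executemany("INSERT INTO inventory (sku, qty) VALUES (:sku, :qty)", payload)
conn.commit()

# Downstream reporting query.
total = conn.execute("SELECT SUM(qty) FROM inventory").fetchone()[0]
```

The shape (fetch, parse, bulk insert, query for reporting) is the same regardless of which warehouse and reporting tool sit at either end.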

Environment: Teradata, Windows XP, UNIX, .NET, SQL, MS Office, Tableau, Office 365

Company: Ranbaxy Laboratories, India May 2014 – Nov 2015

Role: Data Analyst

Responsibilities:

●Worked on Asset Management, Wealth Management, Investment Banking, and derivative products such as Options and Futures, as well as Credit Card, Insurance, Bonds, Sarbanes-Oxley Act (SOX), GAAP, GAAS, IFRS, CCAR, and Salesforce CRM.

●Documented Business Requirements Documents (BRD), Functional Requirements Documents (FRD), Use Case Specifications, Functional Specifications (FSD), and User Stories.

●Elicited and detailed functional and non-functional system requirements from cross-functional technical teams to identify the needs of the new analytical environment.

●Wrote complex SQL queries involving joins, subqueries, and temporary/volatile tables, using Teradata SQL Assistant and SQL Developer.

●Worked with chairside QA in identifying test scenarios for data consumption, orders consumption, and order schedules; performed data mapping and created data mapping documents in Excel.

●Developed set and multiset tables, views, and volatile tables for the team's project data.

●Created Data mapping documents to capture the source to target data flow and any transformation and business logic involved.

●Built Pivot tables and charts in Excel for daily, weekly and monthly reports.

●Created tools and queries to obtain data from a central data repository for reporting and quantitative/qualitative data analysis, and presented the final analysis to customers.

●Prepared scripts in Python and shell for automation of administration tasks.

●Worked on Tableau 9.1 to build interactive transaction migration dashboards with drill-down capabilities for reporting and monitoring. Developed SQL scripts for accessing data from Teradata/SQL Server tables. Worked on data quality, data organization, metadata, and data profiling.

●Performed data analysis to provide insight into data defects and future-state issues that might arise over time during the new data migration.

●Assisted the Financial Analyst in creating actuarial reports and performed data verification.

●Responsible for testing the data flow, navigation flow, system testing, and functionality testing.

●Created test cases, analyzed the results, tracked defects by generating reports, and mapped test cases to requirements to create traceability matrices.
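
For illustration, the join/subquery/temporary-table pattern described in the bullets above, sketched against an in-memory SQLite database (SQLite temporary tables stand in for Teradata volatile tables; the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO orders VALUES (1, 1, 120.0), (2, 1, 30.0), (3, 2, 45.0);
-- Temporary table, analogous to a Teradata volatile table.
CREATE TEMP TABLE big_spenders AS
    SELECT customer_id, SUM(amount) AS total
    FROM orders GROUP BY customer_id
    HAVING total > (SELECT AVG(amount) FROM orders);  -- scalar subquery
""")

# Join the temporary result back to a base table for reporting.
rows = conn.execute("""
    SELECT c.name, b.total
    FROM big_spenders b
    JOIN customers c ON c.id = b.customer_id
    ORDER BY b.total DESC
""").fetchall()
```

On Teradata the temporary table would be created as `CREATE VOLATILE TABLE ... ON COMMIT PRESERVE ROWS`, but the query shape is the same.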

Environment: Teradata, Teradata SQL Assistant, SQL Developer, Excel, SQL Server, UNIX, Windows, .NET, Python, SQL, MS Office, Tableau 9.1, Power BI


