Sindhu Priyanka Immanneni
**************@*****.***
Professional Summary:
12+ years of experience in design, analysis, and development in Business Intelligence and Data Warehousing, including roles as a data analyst.
8+ years of experience working with Tableau Desktop, Tableau Server, and Tableau Reader across Tableau versions 2023.x/2020.x/10.x/9.x/8.x/7.x.
2+ years of experience in working with Google Looker.
2+ years of experience in working with Power BI.
Strong experience in creating reporting solutions using Tableau; certified Tableau Desktop Specialist.
Outstanding knowledge of and working experience with slice-and-dice analysis in Looker.
In-depth functional knowledge of the Manufacturing and Travel domains.
Expertise in SQL, C, C++, LookML and Python Scripting.
Expertise in creating LookML models, Looks, and interactive dashboards, and in performance-tuning Looker dashboards.
Implementation of data storage configurations, including tablespaces, datafiles, and filegroups.
Proven track record of leveraging data-driven insights to inform business decision-making and enhance operational efficiency.
Thorough understanding of Waterfall & Agile Software Development Life Cycle (SDLC) model, including requirements analysis, system analysis, design, development, testing, documentation, user stories, training, implementation and post-implementation process.
Proficiency in data manipulation, statistical analysis, and visualization techniques to extract actionable intelligence from complex datasets.
Monitoring and auditing of DDL operations for security and compliance purposes.
Proficient in the Tableau data visualization tool, using it to analyze and obtain insights from large datasets and to create visually compelling, actionable interactive reports and dashboards.
Provide guidance and insight on data visualization and dashboard design best practices; develop visually informative dashboards and customized reports based on business requirements and UI specifications.
Proficient in understanding business processes/requirements, performing technical data analysis (TDA), and translating them into technical documents, i.e., mapping documents and high- and low-level design documents.
Experience in building reports on major RDBMSs such as Oracle 12c Exadata, DB2, SQL Server, and the Snowflake DWH.
Expertise in creating and scheduling reports using Tableau Server Client (TSC) Python scripting; a minimal sketch appears at the end of this summary.
Skilled in implementing and enforcing data governance policies to ensure data integrity, security, and regulatory compliance.
Conducted thorough data analysis to identify trends, patterns, and insights that drove strategic business decisions.
Skilled in designing and deploying rich graphic visualizations with drill-down and drop-down menu options and parameterization in Tableau.
Experience in generating reports using SQL Server Reporting Services (SSRS), MicroStrategy, Tableau, and Crystal Reports.
Collaborative approach to working with cross-functional teams to achieve organizational objectives and drive strategic initiatives.
Expertise in SharePoint workflows and administration of SharePoint sites.
Tableau development skillset includes visualization development, use of parameters, and report optimization. Identified key business metrics measurements and developed methods to represent data in support of the measurements.
Integration with other database management tasks such as data manipulation and query processing.
Have experience in Dealer Lending, SOX, Risk, and Treasury business processes.
Creation of filters, reports, and dashboards. Created advanced chart types, visualizations and complex calculations to manipulate the data.
Strong ability to ramp up quickly on new technologies.
Excellent Communication and Interpersonal skills with proven exceptional Customer Interaction capabilities.
Strong Analytical and Problem-Solving Skills with proven ability to provide Solutions to Complex requirements & use cases.
Willingness to learn, Creative & Innovative and have proven flexibility & quick adaptability to different Challenging environments.
Experience in coding SQL and PL/SQL using procedures, triggers, and packages.
Experience working with multiple teams across geographies, including North America, Europe, Southeast Asia, and India.
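For illustration, a minimal sketch of the Tableau Server Client (TSC) Python scripting referenced above, signing in and queueing a workbook extract refresh; the server URL, token, site, and workbook name are hypothetical placeholders, not details from any engagement described here:

```python
# Minimal sketch: sign in to Tableau Server with the tableauserverclient (TSC)
# library and queue an extract refresh. All connection details and the
# workbook name below are hypothetical placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("refresh-token", "token-secret", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Look up a published workbook by name.
    opts = TSC.RequestOptions()
    opts.filter.add(TSC.Filter(TSC.RequestOptions.Field.Name,
                               TSC.RequestOptions.Operator.Equals,
                               "Daily Sales"))
    workbooks, _ = server.workbooks.get(opts)
    if workbooks:
        # Queue a server-side extract refresh job for the workbook.
        job = server.workbooks.refresh(workbooks[0])
        print(f"Refresh job queued: {job.id}")
```

Scheduling a script like this (e.g., via cron or an enterprise scheduler) is one way to automate recurring report refreshes.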
Technical Skills:
Operating Systems: Windows 10, Windows Server 2012, IBM AIX, Red Hat Linux
Languages: SQL, C, C++, Python scripting, LookML
Tools: Tableau Desktop, Tableau Server, Looker 6.x–7.x, Power BI, OBIEE
RDBMS: Oracle 12c Exadata, IBM DB2, MS Access, SQL Server, Snowflake
Education:
Bachelor of Engineering from Jawaharlal Nehru Technological University, India, 2008.
Certification:
Tableau Desktop Specialist
Achievements:
Received Mark of Excellence Award from Cyient
Received Associate of the Month Award from Cyient
Received Special Recognition Award from CMC
Received GDC Recognition Award from CMC
Professional Experience:
Infosys, Toyota Financial Services, Plano, TX, USA Mar 2023 – Present
Sr. Developer
Technology/software: Tableau 2023.4, Webi
Platform/Database: Oracle 12c Exadata, Snowflake
Project Description:
Toyota Financial Services has chosen Infosys as its partner for creating the reports for migrating commercial loans from the legacy system to the wholesale data intelligence system. We created the analytics and provided advanced features such as self-service reporting in Tableau for the business users. This project is operated in Agile methodology, as it is a huge transformation program.
Responsibilities:
Gathered requirements from business users, built customized interactive reports and dashboards, and published them to the server.
Extensive practical knowledge of importing data for use in reporting software (Tableau), graphs, and flowcharts.
Created the Lineage and KPIs for the reporting attributes.
Developed the Looker environment and supported the business decision-making process.
Created action filters, parameters, sets, groups for preparing dashboards and worksheets using Tableau.
Created dashboards by extracting data from different sources.
Involved in creating interactive dashboards and applying actions (filter, highlight, and URL) to dashboards.
Involved in creating calculated fields and hierarchies.
Managed and optimized Snowflake data warehouses, including provisioning and resizing compute resources, monitoring performance, and optimizing storage utilization (see the sketch after this list).
Developed parameter-based reports, drill-down reports, and charts in Looker and enabled ad-hoc reporting.
Strong experience with Git workflows for version control.
Execution of data definition commands to define, modify, or delete database structures.
Involved in testing SQL, Tableau dashboards, and Snowflake DB scripts.
Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
Developed and maintained complex SQL queries to extract, manipulate, and analyze large datasets from various sources, including databases, spreadsheets, and APIs.
Performed complex performance tuning to optimize individual queries and overall database performance.
Worked with backend engineers to optimize existing API calls, creating efficiencies by deprecating unneeded calls.
Utilized advanced Tableau features to link data from different connections on one dashboard and to filter data in multiple views at once.
Utilized advanced Excel functions, such as pivot tables, VLOOKUP, and macros, to transform raw data into actionable insights and visualizations.
Tested dashboards to ensure data matched business requirements and to detect changes in underlying data.
Hands-on development assisting users in creating and modifying worksheets and data visualization dashboards.
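As a minimal sketch of the Snowflake warehouse-management work noted in this list (resizing compute and checking recent query times), using the snowflake-connector-python driver; the account, credentials, and warehouse name are hypothetical placeholders:

```python
# Minimal sketch: resize a Snowflake virtual warehouse and list the slowest
# recent queries via snowflake-connector-python. Connection details and the
# warehouse name are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="analyst", password="***",
    warehouse="REPORTING_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Scale the warehouse up for a heavy reporting window; auto-suspend after
# 5 idle minutes to control credit spend.
cur.execute(
    "ALTER WAREHOUSE REPORTING_WH SET WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 300"
)

# Surface the ten slowest recent queries on this warehouse.
cur.execute("""
    SELECT query_id, total_elapsed_time / 1000 AS seconds
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 100))
    WHERE warehouse_name = 'REPORTING_WH'
    ORDER BY total_elapsed_time DESC
    LIMIT 10
""")
for query_id, seconds in cur.fetchall():
    print(query_id, seconds)
conn.close()
```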
Infosys, Toyota Financial Services, Plano, TX, USA Nov 2021 – Mar 2023
Sr. Developer
Technology/software: Snowflake, Tableau 2020.6, Webi
Platform/Database: Oracle 12c Exadata, Snowflake
Project Description:
Toyota Financial Services has chosen Infosys as its partner for creating the reports for TFS Data Modernization, which involved migrating data to a cloud platform and redeveloping legacy OBIEE reports in Tableau and Looker. We created the analytics and provided advanced features such as self-service reporting in Tableau for the business users. This project is operated in Agile methodology, as it is a huge transformation program.
Responsibilities:
Gathered requirements from business users, built customized interactive reports and dashboards, and published them to the server.
Extensive practical knowledge of importing data for use in reporting software (Tableau and Looker), graphs, and flowcharts.
Created the Lineage and KPIs for the reporting attributes.
Developed the Looker environment and supported the business decision-making process.
Created action filters, parameters, sets, groups for preparing dashboards and worksheets using Tableau.
Created dashboards by extracting data from different sources.
Designed and implemented efficient data models in Snowflake, including schema design, table partitioning, and clustering keys, to support analytical queries and optimize query performance (see the sketch after this list).
Involved in creating interactive dashboards and applying actions (filter, highlight, and URL) to dashboards.
Involved in creating calculated fields and hierarchies.
Handling of data dictionary information, including metadata about database objects and their relationships.
Developed parameter-based reports, drill-down reports, and charts in Looker and enabled ad-hoc reporting.
Created interactive dashboards and reports using data visualization tools like Tableau, Power BI, or QlikView to communicate findings and key performance indicators effectively.
Strong experience with Git workflows for version control.
Involved in testing SQL, Tableau dashboards, and Snowflake DB scripts.
Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
Collaborated with cross-functional teams to define project requirements, develop analytical solutions, and present findings to stakeholders.
Granting and revoking permissions on database objects to control access and security.
Utilized advanced Tableau features to link data from different connections on one dashboard and to filter data in multiple views at once.
Tested dashboards to ensure data matched business requirements and to detect changes in underlying data.
Hands-on development assisting users in creating and modifying worksheets and data visualization dashboards.
Created report schedules, data connections, projects, groups in Looker.
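A minimal sketch of the Snowflake data-modeling approach mentioned in this list; the table, columns, and clustering keys are hypothetical placeholders (Snowflake divides tables into micro-partitions automatically, and CLUSTER BY guides how they are organized):

```python
# Minimal sketch: create a clustered Snowflake fact table through the Python
# connector. Database, table, and column names are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="analyst", password="***"
)
conn.cursor().execute("""
    CREATE TABLE IF NOT EXISTS analytics.public.loan_payments (
        payment_id   NUMBER,
        dealer_id    NUMBER,
        payment_date DATE,
        amount       NUMBER(12, 2)
    )
    -- Cluster on the columns most often used in range filters so that
    -- micro-partition pruning keeps analytical scans narrow.
    CLUSTER BY (payment_date, dealer_id)
""")
conn.close()
```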
Infosys, Toyota Financial Services, Plano, TX May 2021 – Jul 2021
Sr. Developer
Technology/software: Snowflake, Tableau, Looker 6.x–7.x, Webi
Platform/Database: Oracle 12c Exadata, Snowflake
Project Description:
Toyota Financial Services has chosen Infosys as its partner for creating the reports for the Private Label Growth (PLG) project. In the PLG project, we created reports on the modernization platform (Google Looker) for Day 1, Day 30, Day 60, and Day 100.
Responsibilities:
Gathered requirements from business users, built customized interactive reports and dashboards, and published them to the server.
Extensive practical knowledge of importing data for use in reporting software (Webi and Looker), graphs, and flowcharts.
Created the Lineage and KPIs for the reporting attributes.
Developed the Looker environment and supported the business decision-making process.
Experience with BI tools such as Tableau Desktop, Tableau Server, and MicroStrategy for front-end reporting, and with automation scheduling tools such as Tidal and One Automation.
Wrote and optimized SQL queries for performance in Snowflake, leveraging features like query hints, materialized views, and query profiling to improve query execution times.
Created action filters, parameters, sets, groups for preparing dashboards and worksheets using Tableau.
Created dashboards by extracting data from different sources.
Integrated data from various sources into Snowflake using ETL (Extract, Transform, Load) processes and data ingestion tools like Snowpipe, ensuring data quality and consistency (see the sketch after this list).
Involved in creating interactive dashboards and applying actions (filter, highlight, and URL) to dashboards.
Involved in creating calculated fields and hierarchies.
Conducted A/B testing and statistical analysis to evaluate the effectiveness of marketing campaigns, product features, and business strategies.
Modification of database schema by adding, modifying, or dropping database objects.
Developed parameter-based reports, drill-down reports, and charts in Looker and enabled ad-hoc reporting.
Strong experience with Git workflows for version control.
Involved in testing SQL, LookML in Looker, and Snowflake DB scripts.
Tested dashboards to ensure data matched business requirements and to detect changes in underlying data.
Hands-on development assisting users in creating and modifying worksheets and data visualization dashboards.
Created report schedules, data connections, projects, groups in Looker.
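For illustration, a minimal sketch of the Snowpipe-style ingestion mentioned in this list; the stage, pipe, table, and file format are hypothetical placeholders:

```python
# Minimal sketch: define a Snowpipe that auto-ingests staged CSV files into a
# landing table, issued via the Python connector. All object names are
# hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="loader", password="***"
)
cur = conn.cursor()

# Continuously load newly arrived files from an external stage.
cur.execute("""
    CREATE PIPE IF NOT EXISTS analytics.public.bookings_pipe
      AUTO_INGEST = TRUE
      AS
      COPY INTO analytics.public.bookings_raw
      FROM @analytics.public.bookings_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")

# Confirm the pipe is running and files are being picked up.
cur.execute("SELECT SYSTEM$PIPE_STATUS('analytics.public.bookings_pipe')")
print(cur.fetchone()[0])
conn.close()
```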
Rockline Solutions, Bombardier, Chicago, IL, USA Feb 2018 – Nov 2020
Sr. Developer
Technology/software: Informatica 9.x, Oracle 12c, Tableau 9.x
Project Description:
Bombardier has chosen Rockline Solutions as its partner for automating the Daily Cost Management reporting solution. These reports were previously generated manually using Excel macros, with data imported by hand into Excel workbooks from various source systems. We automated this by creating a reporting data mart with a star schema model in Oracle, integrating data from the different sources via Informatica ETLs on a nightly basis, and building models and the required reports in Tableau.
Responsibilities:
Responsible for supporting the Daily Cost Management solution, used by 100s of users every day, and handling critical live production issues.
Worked as a Tableau developer on a team of 4 members.
Developed trend reports to provide progress updates on changes in daily costs by department.
Gathered requirements from the business users and built customized interactive reports and dashboards using Tableau 9.0 for quick reviews presented to business and IT users.
Implemented and managed security policies and access controls in Snowflake, including role-based access control (RBAC), encryption, and auditing, to protect sensitive data and ensure compliance with regulatory requirements (see the sketch after this list).
Developed POCs by building reports and dashboards using Tableau.
Created and maintained reports to display the status and performance of deployed models and algorithms with Tableau.
Applied data mining techniques to discover meaningful patterns and relationships in large datasets, contributing to business growth and competitive advantage.
Performed conditional formatting for all the dashboards using Tableau.
Developed Customer scorecard dashboards for Tableau Mobile application. Created interactive dashboards for Tableau Mobile.
Designed and developed various analytical reports from multiple data sources by blending data on a single worksheet in Tableau Desktop.
Utilized advanced Tableau features to link data from different connections on one dashboard and to filter data in multiple views at once.
Created action filters, parameters, sets, groups for preparing dashboards and worksheets using Tableau.
Defined best practices for creating Tableau dashboards by matching requirements to appropriate chart types, choosing color patterns per users' needs, and standardizing dashboard size, look, and feel.
Documented each dashboard, describing its purpose, all the elements used in it, and any additional notes.
Created Tableau Dashboards with interactive views, trends and drill downs along with user level security.
Created customized calculations, Conditions and Filter (Local, Global) for various analytical reports and dashboards.
Management of constraints to ensure data integrity, such as primary keys, foreign keys, and unique constraints.
New KPI generation for Scorecards.
Involved in creating interactive dashboards and applying actions (filter, highlight, and URL) to dashboards.
Involved in creating calculated fields and hierarchies.
Administered user, user groups, and scheduled instances for reports in Tableau.
Developed parameter-based reports, drill-down reports, charts, and tabular reports using Tableau and enabled ad-hoc reporting.
Tested dashboards to ensure data matched business requirements and to detect changes in underlying data.
Hands-on development assisting users in creating and modifying worksheets and data visualization dashboards.
Documented all dashboard-related information, including an overview of each dashboard and the calculations performed in it.
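A minimal sketch of the Snowflake role-based access control (RBAC) work mentioned in this list; the role, database, schema, and service-user names are hypothetical placeholders:

```python
# Minimal sketch: set up a read-only reporting role in Snowflake via the
# Python connector. Role, object, and user names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="admin", password="***"
)
cur = conn.cursor()

# Grant the minimum needed for read-only reporting: usage on the container
# objects plus SELECT on the tables, then hand the role to a BI service user.
for stmt in [
    "CREATE ROLE IF NOT EXISTS reporting_reader",
    "GRANT USAGE ON DATABASE analytics TO ROLE reporting_reader",
    "GRANT USAGE ON SCHEMA analytics.public TO ROLE reporting_reader",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.public TO ROLE reporting_reader",
    "GRANT ROLE reporting_reader TO USER tableau_svc",
]:
    cur.execute(stmt)
conn.close()
```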
TUI Travel PLC (formerly known as TUI AG), USA (offshore) Hyderabad, India May 2015 – Nov 2017
Developer
Technology/software: Informatica 9.x, Tableau 9.x, OBIEE
Platform: Client/Server, UNIX, Windows, Oracle 9i
Description:
TUI Germany has changed its booking system from IRIS to @Comres, whose data is loaded into the BACE DWH. The Datahub is an area that obtains reservation information from several booking systems, both owned and not owned by TUI Germany. TUI Germany added new reservation systems/service providers such as Bahn, Cruises, and Houseboat. There was no unified system to capture all the data from the Datahub and report it to TUI Germany sales and other downstream systems. There are four different business models on the basis of which data is loaded from the reservation systems into the data warehouse: Tour operator (Veranstalter); Agent model (Mittlermodell), which covers Houseboat and Hamburg Tourism; Tour operator with direct link to service provider (Veranstalter mit Direktanbindung an Leistungsanbieter), e.g., Bahn; and Service contract (Dienstleistungsvertrag), e.g., TUI Cruises.
Roles and Responsibilities
Created worksheets in Tableau for production support and showcased business data using story points.
Implemented data blending to blend related data from different data sources.
Developed aggregated measures in worksheets and merged multiple worksheets to create workbooks using Tableau for data analysis purposes.
Developed ad-hoc reports for users using Tableau and OBIEE based on business need and user accessibility.
Created hierarchies and calculated complex measures in Tableau worksheets.
Conducted performance tuning of Tableau and OBIEE reports.
Worked with various Tableau views such as crosstabs, scatter plots, heat maps, geographic maps, pie charts, bar charts, etc.
Collaborated with business stakeholders to define key performance indicators (KPIs) and performance metrics, translating business requirements into data-driven solutions.
Implemented backup and recovery strategies for Snowflake databases and data warehouses to ensure data availability and disaster recovery readiness.
Specification of the structure and properties of data elements within a database.
Established and enforced data governance policies and standards within Snowflake, including data classification, data lineage tracking, and metadata management, to ensure data quality, integrity, and compliance.
Designed dashboards using filters, parameters, and context filters in Tableau and OBIEE.
Created different KPIs using calculated key figures and parameters.
Experience in building groups, hierarchies, and sets to create detail-level summary reports and dashboards using KPIs.
Involved in creating a dual-axis bar chart with multiple measures.
Worked in a multi-user development (MUD) environment with different teams working on the repository at the same time.
Involved in designing and developing of new subject areas in Repository (RPD) catering to the OLAP reporting needs of BI Project.
Designed and executed experiments to test hypotheses and validate assumptions, leveraging statistical methods and hypothesis testing.
Created/Modified/Customized OBIEE Metadata Repository based on the requirement by defining Physical Data Layer, and developed Business Model and Presentation Layer using OBIEE Administration Tool.
Created the required aliases, views in Physical layer. Implemented Level Based Metrics in the BMM layer. Implemented the Time Series Wizard for comparing measures in different time Periods. Created Compound facts in the BMM as per the client requirement.
Created Dimension Hierarchies in the BMM layer to drill down and drill across on various dimensions as per client requirement.
Monitored and analyzed Snowflake performance metrics, such as query execution times and resource utilization, and proactively identified and addressed performance bottlenecks to optimize system performance.
Created complex financial reports using subqueries, unions, and advanced filters based on the results of other queries, guiding the reports by implementing conditional sections behind the scenes. Built proof-of-concept (POC) reports to show users alternate options for any functionality not readily available in OBIEE.
Involved in OBIEE 11g to 12c upgrade.
Conducted customer segmentation analysis to identify target customer segments and personalize marketing strategies and campaigns.
Definition and creation of database objects such as tables, views, indexes, and sequences.
Involved in Root Cause Analysis of issues and day-to-day production support.
Involved in OBIEE RPD and Web Catalog objects migration from DEV to both TEST and PROD environments.
Designed custom style sheets using HTML and CSS coding in several BI dashboards.
Debugged and resolved data matching issues ranging from the base tables to BI dashboard reports.
Unit testing of the reports by writing my own SQL queries and executing on database and checking the results against the BI reports.
Worked on ETL loads using PL/SQL procedures.
Created database objects such as tables, privileges, procedures, and packages using PL/SQL Developer to load data into warehouse tables.
Built primary, unique and foreign key relations across tables to constrain related data together to avoid redundancy and ensure data normalization.
Created indexes on the tables for faster retrieval of the data to enhance database performance.
Implemented and executed dynamic SQL queries to improve code flexibility (a minimal sketch appears after this list).
Worked on SQL Loader to load data into different environments.
Involved in debugging packages and fixing PL/SQL code to resolve issues.
Worked on Beyond Compare tool to compare and debug PL/SQL queries.
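For illustration, a minimal sketch of the dynamic-SQL pattern mentioned in this list, rendered in Python with the cx_Oracle driver rather than PL/SQL EXECUTE IMMEDIATE; the table names, column, and connection details are hypothetical placeholders:

```python
# Minimal sketch: build a query dynamically but safely against Oracle with
# cx_Oracle: whitelist the identifier (identifiers cannot be bound) and bind
# the filter value. All names and connection details are hypothetical.
import cx_Oracle

conn = cx_Oracle.connect(user="dwh", password="***", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

def load_count(table_name: str, booking_system: str) -> int:
    allowed = {"BOOKINGS_IRIS", "BOOKINGS_COMRES"}  # known landing tables
    if table_name not in allowed:
        raise ValueError(f"unexpected table: {table_name}")
    sql = f"SELECT COUNT(*) FROM {table_name} WHERE booking_system = :sys"
    cur.execute(sql, sys=booking_system)
    return cur.fetchone()[0]

print(load_count("BOOKINGS_COMRES", "@Comres"))
```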
Cyient, Bombardier, Hyderabad, India Jun 2013 – Apr 2015
Sr. Developer
Project Description:
VAST is a simulation tool used to test the vehicle software and production cradle. VAST simulates wayside components such as RATP (Region Automatic Train Protection), RATO (Region Automatic Train Operation), Central Control, and alarms. VAST also simulates train-borne components such as the propulsion and brake system, doors, all digital I/O train lines (like pushbuttons and manual controls), the norming-point reader, and propulsion signals. VAST also simulates a hot-standby environment for testing hot-standby VATC, as well as project-specific requirements such as the RF modem interface, CAN bus interface, and Pulse Width Modulation (PWM). VAST provides a configuration file that can be modified according to the project requirements. VAST reads project map files used for route generation, norming-point data, speed limits, and vehicle information. VAST simulates all Interface Control Document (ICD) data for the configured project, and simulates the propulsion and brake module as tachometer pulses in train automatic mode based on the propulsion signal input from the VATC.
Responsibilities
Involved in requirement study of the VATC System.
Involved in Designing and development of the project.
Conducted thorough data analysis for strategic decision-making.
Manage and optimize Snowflake data warehouses.
Collaborated with cross-functional teams for project success.
Proficient in SQL queries for data extraction and manipulation.
Conducted A/B testing for marketing effectiveness.
Involved in Planning and estimating the efforts.
Integrate data from various sources into Snowflake.
Created test cases to perform testing on the integration of modules.
Responsible for creating the Lifecycle documentations.
Created and executed test cases for Unit test, performance test.
Ansaldo STS, Pittsburgh, USA, Hyderabad, India Jun 2010 – May 2013
Developer
Description:
The UD0902 consists of a Microlok CPU PCB that has been modified to accept a daughter card. The daughter card in this case is the same as that used in the AF90x development. This daughter card is considered a co-processor that performs logic (vitally) on the I/O of the Microlok. The UD0902 development is to address the needs of the Positive Train Control (PTC) safety case, both in dark territory and in areas where there may be insufficient coverage for the physical attributes associated with freight commerce. The design will use the AF90x development as a base. The UD0902 must achieve a Safety Integrity Level (SIL) of 4 (very high). The system will use a two-out-of-two voting scheme for all SIL-4 data and data determination. There will be a separate processor for general (SIL-0) communication, event handling, and display interfaces. Some information will be SIL-4 messages that are transferred from the vital CPUs to the non-vital Communication CPU. The vitality of the message will be embedded in the protocol of the message by the vital CPUs.
Roles and Responsibilities
Software/Hardware Integration Testing and System Validation.
Application builder Testing and Wayside Test Tool testing
Unit testing and regression testing, both on-target and in VectorCAST.
Development of Test case and Test report documents.
Development of Test Applications.
Static Analysis of code using Logiscope tool and fixing the violations.
Raising PRs (Problem Reports) and tracking them to closure.
Implemented predictive modeling for forecasting.
Ensured data quality and accuracy through cleansing techniques.
Responsible for reviewing results before delivery.
Responsible for updating the metrics sheet and defect data.
Participated in weekly status calls with the client.
Estimated effort for all activities.
Ansaldo STS, Pittsburgh, USA (offshore) Hyderabad, India Feb 2009 – May 2010
Developer
Description:
STM (Specific Transmission Module) Montreal car borne System is an Automatic Train Control (ATC) System in 1:1 hot standby configuration. Each ATC consists of the vital Automatic Train Protection subsystem (ATP) and the non-vital Automatic Train Operation (PA) subsystem. The ATP primarily maintains safe vehicle speed and eliminates over speed conditions by activating emergency brakes; PA is primarily for Automatic Train Operation. Other subsystems are SESAM-CT which monitors data relevant to the train control system and records that data for a window of time around set events and alarms. The SESAM-CT communicates with a wayside data collector through a radio link. The scope included verification of ATP, PA, and SESAM-CT and Wayside Data Collector software.
Roles and Responsibilities
Software/Hardware Integration Testing and System Validation.
Application builder Testing and Wayside Test Tool testing
Unit testing and regression testing, both on-target and in VectorCAST.
Development of Test case and Test report documents.
Development of Test Applications.
Presented insights to stakeholders through clear visualizations.
Automated ETL processes for improved efficiency.
Static Analysis of code using Logiscope tool and fixing the violations.
Raising PRs (Problem Reports) and tracking them to closure.
Responsible for reviewing results before delivery.
Responsible for updating the metrics sheet and defect data.
Participated in weekly status calls with the client.
Estimated effort for all activities.
TRW Automotive, Farmington Hills, Hyderabad, India Oct 2008 – Jan 2009
Developer
Description:
The Airbag Electronic Control Unit (ECU) system controls airbag deployment in automobiles. The system periodically diagnoses the inputs from the on-board and remote accelerometer sensors and determines a crash event based on the received data. Once a valid crash is determined, the corresponding bags are deployed depending on the type of crash (front, side, or rear). The ECU always checks the inputs received from the sensors to determine whether valid data is being received.
Roles and Responsibilities
Involved in timing analysis testing.
Conducted exploratory data analysis for pattern discovery.
Developed predictive models for business optimization.
Ensured data security and regulatory compliance.
Execution of the Test Cases.
Preparation of Software Verification Reports.