
Data Analyst Quality

Location: Austin, TX
Posted: July 27, 2023


Vijay Thammanna

Email: adykap@r.postjobfree.com | Mobile: +1-479-***-****

PROFESSIONAL SUMMARY

16+ years of experience in Data Modeling, Data Engineering, Migration Consulting, Data Profiling, and Data Analyst roles.

Expertise in the Banking, Finance, Retail, and Payroll domains.

Experience in creating visual representations of logical and physical data models.

Experience with cloud platforms and related technologies: AWS (Redshift, RDS), PostgreSQL, Big Data, Spark, Microsoft Azure, Google Cloud Platform, and GitHub.

ETL and analytical tools such as Alteryx, Informatica, SAP Information Steward, SAP BODS, Tableau, and Power BI.

Data integration, REST/SOAP UI, and functional, regression, system, load, UAT, and GUI testing.

SQL, PL/SQL, T-SQL, NoSQL.

Metadata management and Star/Snowflake schema design; analyzed source systems, staging areas, and fact and dimension tables in the target data warehouse.

Agile methodologies and CI/CD pipelines to build, deploy, and automate processes.

Python scripting for automating job processing and execution.

Data modeling with Erwin, and ETL tools such as SAP BODS, Informatica, Alteryx, and IBM DataStage.

Experience with DevOps tools for release management.

Experience using query tools for Oracle, Teradata, DB2, and MS SQL Server to validate data and troubleshoot data quality issues.

Implemented SCD Type 2 and performed data integration testing with SAP systems, SAP BW, and non-SAP systems as sources to BODS/Informatica, building data warehouses with Star and Snowflake schemas.
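
A minimal sketch of the SCD Type 2 pattern referenced above, with SQL driven from Python; the customer_dim and customer_stg tables and the psycopg2 connection are hypothetical stand-ins for logic that in practice lived in BODS/Informatica mappings:

    import psycopg2  # assumption: a PostgreSQL-compatible warehouse

    SCD2_SQL = """
    -- Expire the current row for customers whose tracked attribute changed
    UPDATE customer_dim d
    SET    end_date = CURRENT_DATE, is_current = FALSE
    FROM   customer_stg s
    WHERE  d.customer_id = s.customer_id
      AND  d.is_current
      AND  d.address <> s.address;

    -- Insert a fresh current row for new and changed customers
    INSERT INTO customer_dim (customer_id, address, start_date, end_date, is_current)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', TRUE
    FROM   customer_stg s
    LEFT   JOIN customer_dim d
           ON d.customer_id = s.customer_id AND d.is_current
    WHERE  d.customer_id IS NULL;
    """

    def run_scd2_load(conn):
        # Both statements run in one transaction so the dimension
        # never shows a customer without a current row.
        with conn.cursor() as cur:
            cur.execute(SCD2_SQL)
        conn.commit()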

HP ALM, Jira, and Kanban for bug tracking and defect reporting.

Report testing in SAP BO Reports, Crystal Reports, Tableau, and Power BI.

Strong working experience in UNIX environments.

WORK EXPERIENCE

Title: Lead Data Engineer Mar 2022 – Present

Client: Charles Schwab - Finance

Platform: Big Data, Teradata, Google Cloud Platform, Informatica, Tableau, Control-M, UNIX

Responsibilities:

Designed logical data mapping and relational mapping documents from Teradata to GCP tables.

Performed verification, validation, and transformations on Big Data and GCP target databases.

Resolved performance and complex logic issues in the ETL process (performance tuning), identifying bottlenecks and critical paths of data transformation in sessions for different interfaces.

Extracted, transformed, and loaded data across various heterogeneous data sources and destinations.

Engaged with business analysts in requirements gathering, analysis, testing, and project coordination.

Prepared standard design documents: technical design (TD) specifications, ETL specifications, and migration documents.

Designed and reviewed the interface/conversion ETL code before moving to QA to ensure design standards were met.

Migrated and validated Tableau reports.

Automated data validation processes using Python scripting.
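
A rough sketch of what such a validation script can look like; the check queries, table names, and pyodbc connections are illustrative assumptions, not the project's actual code:

    import pyodbc  # assumption: ODBC access to both source and target

    # Paired source/target probes: a row count and a balancing total.
    CHECKS = [
        ("SELECT COUNT(*) FROM src.accounts", "SELECT COUNT(*) FROM tgt.accounts"),
        ("SELECT SUM(balance) FROM src.accounts", "SELECT SUM(balance) FROM tgt.accounts"),
    ]

    def scalar(conn, sql):
        # Run one probe and return its single value.
        cur = conn.cursor()
        cur.execute(sql)
        return cur.fetchone()[0]

    def validate(src_conn, tgt_conn):
        # Collect every check whose source and target values disagree.
        failures = []
        for src_sql, tgt_sql in CHECKS:
            s, t = scalar(src_conn, src_sql), scalar(tgt_conn, tgt_sql)
            if s != t:
                failures.append((src_sql, s, t))
        return failures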

Participated in Agile sprint planning sessions to prioritize requirements and estimate effort.

Reported defects in ALM/Jira and tracked them to closure. Provided training and assisted team members in resolving technical issues.

Oversaw the inbound and outbound interface development process, working closely with functional analysts, developers, and testers.

Supported the PMO with project time and resource management.

Maintained daily tech tracker/issue logs for updates, object-related issues, and progress.

Designed and implemented a scheduler to run the interfaces at different times using shell scripting in UNIX.

Title: Lead Data Engineer Nov 2019 – Mar 2022

Client: PNC Bank - Finance

Platform: AWS RDS/Redshift, AWS Glue, PostgreSQL, Informatica, Oracle, Erwin, Big Data, Teradata, Python, SAP Information Steward

Responsibilities:

Migrated a large on-premises Oracle database to Oracle on AWS using AWS Database Migration Service.

Implemented and managed continuous delivery systems and methodologies on AWS.

Configured schemas and tables in AWS Redshift; stored and retrieved data from the Redshift database.
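
A small sketch of Redshift DDL issued through psycopg2 (Redshift speaks the PostgreSQL wire protocol); the cluster endpoint, schema, and key choices are made-up examples:

    import psycopg2

    DDL = """
    CREATE SCHEMA IF NOT EXISTS finance;

    CREATE TABLE IF NOT EXISTS finance.transactions (
        txn_id     BIGINT NOT NULL,
        account_id BIGINT NOT NULL,
        txn_date   DATE   NOT NULL,
        amount     DECIMAL(18,2)
    )
    DISTSTYLE KEY
    DISTKEY (account_id)   -- co-locate rows for per-account joins
    SORTKEY (txn_date);    -- prune scans on date-range filters
    """

    conn = psycopg2.connect(host="example.redshift.amazonaws.com",
                            port=5439, dbname="dev",
                            user="admin", password="...")
    with conn, conn.cursor() as cur:  # commits on success
        cur.execute(DDL)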

Used AWS Glue for data transformation, validation, and cleansing.
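
A condensed sketch of a Glue job of this shape; the catalog database, table, column mappings, and S3 path are hypothetical:

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.transforms import ApplyMapping, DropNullFields
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue = GlueContext(SparkContext.getOrCreate())

    # Read the raw table registered in the Glue Data Catalog.
    src = glue.create_dynamic_frame.from_catalog(
        database="finance_raw", table_name="accounts")

    # Rename/retype columns (transform) and drop null-only fields (cleanse).
    mapped = ApplyMapping.apply(frame=src, mappings=[
        ("acct_no", "string", "account_id", "long"),
        ("bal", "string", "balance", "double"),
    ])
    clean = DropNullFields.apply(frame=mapped)

    # Land the curated output as Parquet on S3.
    glue.write_dynamic_frame.from_options(
        frame=clean, connection_type="s3",
        connection_options={"path": "s3://example-bucket/curated/accounts/"},
        format="parquet")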

Designed logical and physical data modeling documents using the Erwin data modeling tool.

Migrated Oracle to PostgreSQL using the Ora2Pg tool.

Managed source-to-target mapping between structured and unstructured tables.

Developed data pipelines and data flows using AWS Redshift.

Developed design documents detailing key attributes, table hierarchies, and column metrics, and the relationships among them.

Prepared standard design documents: functional design (FD), technical design (TD), ETL specifications, and migration documents.

Created Test Cases using the SDLC procedures and reviewed them with the Test Manager.

Designed and reviewed the interface/conversion ETL code before moving to QA to ensure design standards were met.

Created RESTful application programming interfaces (APIs) for streamlining, monitoring, and reporting.

Designed and implemented a scheduler to run the interfaces at different times using shell scripting in UNIX.
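
The project scheduler was shell-based; as a rough Python analogue of the same staggered-run pattern (script names and offsets are invented):

    import sched
    import subprocess
    import time

    scheduler = sched.scheduler(time.time, time.sleep)

    def run(cmd):
        # Launch one interface load and report its exit status.
        result = subprocess.run(cmd, shell=True)
        print(f"{cmd!r} exited with {result.returncode}")

    # Stagger the interfaces at different offsets (seconds from now).
    scheduler.enter(0, 1, run, ("./load_accounts.sh",))
    scheduler.enter(1800, 1, run, ("./load_transactions.sh",))
    scheduler.run()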

Resolved performance and complex logic issues in the ETL process (performance tuning), identifying bottlenecks and critical paths of data transformation in sessions for different interfaces.

Used transformations such as Connected and Unconnected Lookup, Router, Aggregator, Source Qualifier, Joiner, Expression, Update Strategy, and Sequence Generator to convert data before loading to the target.

Performed verification, validation, and transformations on Big Data, Oracle, and Teradata target databases; also responsible for designing and generating BI reports for customers.

Automated data validation processes using Python scripting.

Participated in Agile sprint planning sessions to prioritize requirements and estimate effort.

Analyzed requirements, designed test scenarios, and drafted test cases for each scenario.

Gathered requirements and reviewed functional mapping documents.

Conducted and attended regular and ad hoc meetings with project teams to share daily status and escalate issues and concerns.

Supported the PMO with project time and resource management.

Worked with business analysts in requirements gathering, analysis, testing, and project coordination.

Title: Data Architect Jun 2019 – Oct 2019

Client: Bahrain Electricity and Water Authority - Payroll

Platform: SAP SuccessFactors, SAP BODS, DB2

Responsibilities:

Designed payroll data visual representations and source-to-target table mappings from on-premises to the SuccessFactors cloud environment.

Created visualizations showing key attributes, table hierarchies, and column metrics.

Loaded data from all source systems to CSV files, which were pulled by an interface into SuccessFactors.

Validated transformation loads for all modules and created data quality reports for the functional and business teams to fix issues.

Architected the data migration and validation strategy.

Created the deduplication strategy.

Planned, managed, and budgeted a team of 5 data conversion analysts (onshore and offshore) through the ETL process from blueprint to realization using SAP Data Services.

Managed the implementation of SAP Information Steward for data profiling and data quality.

Represented data migration in project leadership meetings.

Responsible for teaching and guiding internal business analysts on the conversion process.

Guided business analysts on functional specification writing, data mapping, data validation, and data cleansing.

Gathered data requirements for 3 business units from 11 different legacy systems.

Engaged with the business to establish core requirements for a data governance organization.

Title: Lead Data Engineer Feb 2016 – May 2019

Client: Walmart, US - Retail

Platform: DB2, TRIRIGA, Power BI, SAP BODS, Alteryx, SAP Information Steward

Responsibilities:

Designed data mapping and functional design documents based on customer requirements.

Performed data transformations and ETL processing from source to target environments.

Performed data profiling using SAP Information Steward.

Participated in Agile sprint planning sessions to prioritize requirements and estimate effort.

Analyzed requirements, designed test scenarios, and drafted test cases for each scenario.

Responsible for designing processes and loading data from source to target DB2 tables, and for testing transformation logic by comparing source and target data.

Performed verification, validation, and transformations using Alteryx before loading into the target database; also responsible for designing and generating BI reports for customers.

Prepared sample data sets for data testing and report analysis.

Performed TRIRIGA functional and data validation, including validation of BIRT reports.

Reported defects in HP ALM and tracked them to closure.

Provided training and assisted team members in resolving technical issues.

Wrote stand-alone data management tools that run as scheduled jobs.

Maintained code and design documents.

Title: Senior Data Analyst Mar 2010 – Feb 2016

Client: MARSH, US - Finance

Platform: Teradata, SAP BO XI R3.1, Power BI

Responsibilities:

Engaged in understanding business processes and coordinated with business analysts to capture specific user requirements.

Tested data extracted from sources such as Excel files, flat files, etc.

Performed extensive data validation using SQL queries and back-end testing.

Designed and developed detailed test strategies, test plans, test cases, and test procedures for functional and regression testing.

Worked in an Agile methodology and attended Scrum meetings.

Gathered and analyzed requirements.

Developed complex universes and reports as per requirements.

Automated BO reports.

Title: Data Quality Analyst Jan 2009 – Mar 2010

Client: Philips Healthcare SPS, GLOBAL

Platform: Data monitoring (web-based ETL) tool, SAP, SQL Server 2000, BO

Responsibilities:

Analyzed business requirements through extensive interactions with users and reporting teams, and assisted in developing low-level design documents.

Used a web-based monitoring tool (an interface between SAP and Oracle) to monitor data loads and recover failed loads.

Extracted and analyzed data loads from BO XI (InfoView web tool).

Worked with business owners to identify key data elements to be monitored on an ongoing basis, ensuring a high level of data quality within the warehouse.

Worked closely with functional departments and IT analysts to attain in-depth knowledge of business processes and procedures.

Analyzed the data mapping and developed scenarios to verify the data migration from different data sources.

Designed and created data quality baseline flow diagrams, including error handling and test plan flow data.

Performed requirements gathering, process mapping, business process reengineering, release management, testing, and end-user support.

Title: Data Analyst Jun 2006 – Jan 2008

Client: Hewlett Packard, Bangalore

Platform: Business Objects (WebI), SAP ODS, SQL Server, Informatica

Responsibilities:

Used Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Designed and customized data models for a data warehouse supporting data from multiple sources in real time.

Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.

Involved in dimensional modeling (Star Schema) of the data warehouse and used Erwin to design business processes, dimensions, and measured facts.

Extracted data from flat files and other databases into the staging area and populated it into the data warehouse.

EDUCATION:

Bachelor’s Degree in Computer Science, VTU, 2006


