
Priyadharsini Mani Kirthigha

Data Analyst

San Antonio, TX 210-***-**** ad00jc@r.postjobfree.com

SUMMARY

●8+ years of experience in Information Technology, data analysis, and data validation, covering project requirements, data management, and project reporting. Highly motivated, with strong data analytical and data operations skills; also involved in process improvement and Data Quality assurance.

●Requirement gathering, analysis and implementation of Report inventory metadata for all Tier 1, Tier 2, and Tier 3 Regulatory Reports of USAA BANK.

●Established and documented a Data Quality methodology with repeatable processes for determining Data Quality issues, maintaining quality data, and defining Data Quality audit procedures for the SDM project.

●Experience with the Report Certification Framework and data controls.

●Extensive knowledge of Excel, including VBA macros and PivotTables.

●Perform data analysis on results and prepare presentations and reports in Tableau and Business Objects for business users and the senior leadership team.

●Experienced in Extraction, Transformation and Loading (ETL) processes using DataStage, Informatica, UNIX scripting, and Oracle PL/SQL, with DB2, SQL Server, flat-file, and XML sources.

●Strong problem-solving abilities and excellent ETL coding and debugging skills.

●Responsible for periodically scanning various databases such as Netezza, Oracle, and DB2 to identify non-compliant PCI data.

●Resourceful team player, capable of delivering within stringent timelines, and an effective servant leader.

EXPERIENCE

Client: USAA (Bank) August 2018 – June 2023

Company: HCL America

Role: Data Analyst

Project# 1: Semi Structured Sensitive Data Management (Bank-HADOOP)

Project Overview: The main objective of this project is to ensure the Bank remains compliant with remediation SLAs. Findings come from the enterprise team's scans with the DataGuise tool and originate in semi-structured environments owned by the Bank, including HADOOP, Documentum, and Snowflake. Asset owners are responsible for providing remediation information to the Bank SDM team.

●Used Excel for data cleansing. Parsed the scan-result CSV files and converted them into workbooks in a readable format for the end user, the asset's point of contact (POC); a simplified parsing sketch follows this project's bullets.

●From the raw CSV file of enterprise scan results, created workbooks with rules and data validations that were later used in the remediation process by data owners/POCs and the SDM team.

●Validated whether unprotected sensitive data reported by the scan was a true finding, using SQL queries in Coginiti Pro.

●Collaborated with Agile Teams, Product Owners, Data Owners, Data Architects, and other Data Science professionals to communicate needs and solutions for remediation of sensitive data.

●Maintained numerous trackers to keep all the data organized.

●Submitted the set of Excel files containing the tables that needed to be tokenized, and validated the tokenized information in Coginiti against the Agile team's results.

●Developed a semi-automated solution to monitor, assess, and support sensitive data compliance and remediation efforts for Semi-Structured data.

●Developed Excel solutions to help improve dynamic reporting and data cleansing processes.

●Trained offshore team members on operating new tools, the current business processes, and on writing efficient queries for data analysis.
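
Below is a minimal sketch of the CSV-to-workbook parsing step described above, written in Python with pandas. The column names ("poc", "asset_name") and file names are assumptions for illustration only; the actual DataGuise output schema is not reproduced here.

# Illustrative sketch: split a scan-result CSV into one workbook per POC.
# Column names "poc" and "asset_name" are assumed, not the real scan schema.
from pathlib import Path

import pandas as pd

def build_poc_workbooks(scan_csv: str, out_dir: str) -> None:
    """Group raw scan findings by POC and write one readable workbook per POC."""
    findings = pd.read_csv(scan_csv)
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for poc, poc_findings in findings.groupby("poc"):
        workbook_path = out / f"{poc}_findings.xlsx"
        with pd.ExcelWriter(workbook_path, engine="openpyxl") as writer:
            # One sheet per asset keeps each owner's remediation list readable.
            for asset, asset_findings in poc_findings.groupby("asset_name"):
                sheet_name = str(asset)[:31]  # Excel caps sheet names at 31 chars
                asset_findings.to_excel(writer, sheet_name=sheet_name, index=False)

if __name__ == "__main__":
    build_poc_workbooks("enterprise_scan_results.csv", "poc_workbooks")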

Project# 2: Structured Remediation

Project Objective: Findings come from structured environments owned by the Bank, including production and non-production databases. Stewards are responsible for providing the information to the Bank SDM team. The enterprise SDM team scans structured environments across the Bank and pursues remediation whenever sensitive data is flagged in files or databases. The enterprise results are uploaded to ARCHER.

●Created data visualizations and reports based on the tracker in SharePoint.

●Set up the tracker to update automatically on a periodic basis.

●Communicated with the database owners and stewards to drive the remediation process.

Project# 3: Information Governance Catalog (IGC)

●Create a high-level design for data extraction from Netezza and create data-element rules in IBM Information Analyzer (IA).

●Capture business requirements, data usage, and report fields, and map them to the underlying source data elements.

●Document business and critical metadata and make it available in the enterprise-approved repository.

●Perform Data Profiling in Information Analyzer (IA) to evaluate quality of data.

●Review the Enterprise Repository and document Business and critical metadata that is available for reporting.

●Evaluate Quality of data using Data Profiling in Information Analyzer (IA) and DQE (Data Quality Engine).

●Participate in Data quality plan review meetings with Subject Matter Experts and Information stewards to finalize Data Quality rules.

●Implement and test Data Quality rules for all six dimensions and ensure monitoring is in place for critical data; a simplified rule-check sketch follows this project's bullets.

●Requirement gathering, analysis and implementation of Report Certification Framework Process for all the Active USAA BANK Regulatory Tier 1, Tier 2, and Tier 3 Reports.

●Establish a Data Quality methodology documenting a repeatable set of processes for determining, investigating, and resolving Data Quality issues, establishing an on-going process for maintaining quality data, and defining Data Quality audit procedures.

●Collaborate directly with the business data owners to establish the quality business rules that will provide the foundation of the organization's Data Quality improvement plan.

●Work with the business teams to validate business rules as they impact quality. Articulate the need for and benefits of Data Quality to the organization

●Perform Data Certification to ensure measurable and meaningful results are produced.

●Identify and profile the critical data elements to analyze their business impact and establish baseline.

●Define Governance Structure including roles and responsibilities and ownership model.

●Identify and measure Data Quality Checkpoints where Data Quality Rules will be integrated.

●Generate aggregated risk data to meet a broad range of on-demand, ad hoc risk-management reporting requests, including requests during stress/crisis situations, requests driven by changing internal needs, and requests to meet supervisory queries.

●Data lineage ingestion in IGC: automate the data lineage ingestion into IGC through DataStage.

●Report Certification Framework and Data Certification:

1.Requirement gathering, analysis and implementation of Report inventory metadata for all Tier 1, Tier 2, and Tier 3 Regulatory Reports of USAA BANK.

2.Primary Focus on the Regulatory Tier 1 BANK Reports.

3.Ensure that reports are developed using approved data sources.

4.Ensure all applicable controls for the reporting environment are implemented and automated.

5.Document the solution design approach for items that are specific to a report or report sections.

●Interact with business and IT teams to understand the analytical data preparation method, build business models based on the business requirements, and implement automation.
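
Below is a minimal sketch of how two of the Data Quality dimension checks (completeness and validity) can be expressed in Python with pandas. The real rules were implemented in IBM Information Analyzer and the DQ Engine; the "account_id" column and its format pattern are hypothetical.

# Illustrative sketch of two Data Quality dimension checks; the column name
# and format pattern are assumptions, not the project's actual rules.
import pandas as pd

def completeness(df: pd.DataFrame, column: str) -> float:
    """Share of rows where the critical data element is populated."""
    return float(df[column].notna().mean())

def validity(df: pd.DataFrame, column: str, pattern: str) -> float:
    """Share of populated rows matching the expected format."""
    populated = df[column].dropna().astype(str)
    if populated.empty:
        return 0.0
    return float(populated.str.fullmatch(pattern).mean())

if __name__ == "__main__":
    sample = pd.DataFrame({"account_id": ["A1001", "A1002", None, "B-20"]})
    print("completeness:", completeness(sample, "account_id"))      # 0.75
    print("validity:", validity(sample, "account_id", r"[A-Z]\d+"))  # ~0.67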

Project# 4: Sensitive Data Management (SDM) – PCI-DSS (Shared Drive & Employee Home Drives)

●Responsible for creating file sets and sending mail notifications.

●Responsible for source data cleansing, refinement, and validation.

●Performed data mining and predictive analysis on the data.

●Responsible for various analyses of data according to the requirements of the project.

●Responsible for extracting, transforming, and loading data into the database that serves as the source for the Tableau reporting dashboard.

●Responsible for extracting the required information from data sources using SQL, Unix, Java, and Python scripting.

●Responsible for gap-analysis validation and for publishing the discrepancies.

●Responsible for periodically scanning various databases such as Netezza, Oracle, and DB2 to identify non-compliant PCI data.

●Created a Java program that sends an email notification to each employee with the flagged-file list as an attachment.

●Responsible for identifying, analyzing, and interpreting trends and patterns in complex data files.

●Created macros to extract the data and send it to the owners of the flagged files.

●Responsible for redacting/masking the PCI content in the flagged files; a simplified masking sketch follows this list.

●Participated in performance and data-volume analysis and design.

●Participated in performance improvement activities; identified and applied potential improvements to the application environment.

●Provided input to training and/or documentation materials regarding the latest technical and functional design changes.

●Responsible for developing various batch jobs that help the team in the day-to-day remediation process.
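
Below is a minimal sketch of the PCI masking idea referenced above: flag card-number-like values with a Luhn check and mask all but the last four digits. The regex, the Luhn filter, and the masking policy are simplifying assumptions for illustration, not the bank's actual redaction tooling.

# Illustrative sketch: mask PAN-like values in text; all patterns and the
# "keep last four digits" policy are assumptions made for this example.
import re

PAN_CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_ok(digits: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    total, alt = 0, False
    for ch in reversed(digits):
        d = int(ch)
        if alt:
            d *= 2
            if d > 9:
                d -= 9
        total += d
        alt = not alt
    return total % 10 == 0

def mask_pans(text: str) -> str:
    """Replace likely PANs with a masked form that keeps the last four digits."""
    def _mask(match: re.Match) -> str:
        digits = re.sub(r"\D", "", match.group(0))
        if luhn_ok(digits):
            return "*" * (len(digits) - 4) + digits[-4:]
        return match.group(0)
    return PAN_CANDIDATE.sub(_mask, text)

if __name__ == "__main__":
    print(mask_pans("Payment card 4111 1111 1111 1111 found in export.txt"))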

Client: Lloyds Bank, Scotland June 2009 – December 2012

Company: HCL Technologies (Offshore)

Role: Data Analyst & ETL Developer (Datastage)

Project Name: Securitization

●As a core team member, was involved in the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software.

●Prepared Data Mapping Documents (DMDs) and designed the ETL jobs, based on the DMD, with the required tables in the Dev environment.

●Responsible for configuring the DataStage development and SIT environments.

●Retrieved the existing “.dsx” packages from production and imported them into the Development and SIT environments.

●Worked extensively with DataStage Manager, Designer, Administrator, and Director.

●Created DataStage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.

●Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.

●Involved in validation of the data with respect to requirements.

SKILLS

Project Management Tools:

Jira, Asana, Salesforce Front Door

Tools and Applications:

IBM Information Analyzer (IA), IBM Information Governance Catalog (IGC), IBM InfoSphere, DQ Engine, JIRA, Rally

Documentation Tools:

Confluence, MS Word, MS Excel, MS PowerPoint, MS Visio, MS Project, MS SharePoint

Visualization Tools:

Tableau, Business Objects, Microsoft Excel

ETL Tools:

DataStage, Informatica, SSRS

Database:

MS Access 2000, MySQL, Oracle 9i, 10g, SQL Server

Other Skills:

Data Cleaning, Data Wrangling, Critical Thinking, Presentation Skills, Problem-Solving

MS-Suite:

MS Excel, MS Word, MS Outlook, MS Office

Operating System:

Windows Vista, Windows 7, Windows 10, Unix

EDUCATION

Master of Computer Applications (MCA) from Kongu Engineering College, Tamil Nadu, India.

Bachelor of Computer Applications (BCA) from Kongu Arts & Science College, Tamil Nadu, India.

CERTIFICATION

Alation Data Analyst Certification June 2023

Alation Data Steward Certification June 2023


