
Data Warehousing Business Intelligence

Location:
Stone Ridge, VA
Posted:
April 16, 2025


Sarika Harikrishnan ******************@*****.***

+1-571-***-****

PROFESSIONAL SUMMARY:

13+ years of IT experience in the analysis, design, development, implementation, testing, and support of Data Warehousing and Data Integration solutions.

3+ years of experience in the Healthcare domain and 11+ years in the Finance domain.

11+ years of experience with SQL-based relational databases, data warehousing, data modelling, and architecture.

10+ years of experience in ETL Informatica PowerCenter, Data Quality and MDM.

4+ years of experience in Python

4+ years of experience in UNIX

2+ years of experience in Data Analytics, Business Intelligence and Reporting with Cognos, Power BI, Tableau and Crystal Reports.

1+ years of experience in Informatica IDQ and Data transformations

Knowledge in Full Life Cycle development of Data Warehousing projects.

Involved in projects using the Agile development model.

Proficiency in SQL and advanced PL/SQL across relational databases such as Oracle, SQL Server, Teradata, MySQL, and PostgreSQL.

Experience with data mapping, dimensional modelling using star and snowflake schema models and Entity relationship models.

Extensively worked on developing ETL programs supporting data extraction, transformation, and loading using Informatica PowerCenter, IDQ, and Data Transformations.

Extensively worked with Informatica and database performance tuning activities.

Created UNIX shell scripts to run the Informatica workflows and to control ETL flow.

Worked on Python for data sourcing and data calculations.

Proficient in data analysis, validating business logic, and requirements gathering.

Proficiency in creating analytical reports for users and stakeholders.

Extensively worked on estimation, requirements gathering, forecasting and revenue management, change/release and deployment management, release planning, compliance tracking, and status reporting.

Independently perform root-cause analysis and solution development; able to meet deadlines and manage multiple tasks.

Strong leadership qualities, flexibility in work schedules, effective communication skills; a team player with analytical and problem-solving skills.

EXPERIENCE SUMMARY

Employer | Start Date | End Date
Cognizant Technology Solutions | Dec 2011 | Apr 2017
Bank of America | Apr 2017 | Mar 2025

TECHNICAL SKILLS

ETL: Informatica PowerCenter, Informatica Data Quality, Informatica Data Transformations, Azure

Languages: Python, PL/SQL, HTML, XML, JSON, Java basics

Database: Oracle, Teradata, SQL Server, Netezza, MySQL, PostgreSQL

Data Modelling: Dimensional, Physical and Logical Data Modelling, ER Studio, Erwin Data Modeler, Toad Data Modeler, Lucid Chart.

OLAP/BI Tools: Cognos, Power BI, Tableau, Crystal Reports.

Tools: Autosys, SQL Developer, Teradata SQL Assistant, and Visio

Scheduler: Tivoli, Jenkins

Versioning: SVN, GIT, XLR

ACHIEVEMENTS

Twice awarded Tech Guru Certification in Cognizant Technology Solutions for technical expertise.

Awarded STAR Performer in Cognizant Technology Solutions for successful project implementation.

Runner-up in an innovation contest (hackathon) organized within the account.

Completed Big Data Enablement program.

Awarded Women Achiever for exceptional work and career progress.

Completed training on banking in global markets.

Won Prizes in Keyboard (piano) competition.

Won President and Governor Awards in Scouts and Guides activity.

Professional Profile

Employer: Bank of America

Overall Duration: Apr 2017 to Mar 2025

Project 2: Enterprise Capital Management

Duration: June 2020 to March 2025
Role: Feature Lead – Data Modeler and Business Analyst
Tools and Technologies: Informatica PowerCenter, Python, Oracle, ER Studio, Unix, Power BI, Tableau, JIRA, BPMN, Agile, Six Sigma

Project details

Building a data platform for US regulatory and economic capital calculation and reporting. The platform calculates the bank’s Risk-Weighted Assets (RWA), capital, and budgeting for multiple products across lines of business such as Wholesale, Retail, Trading, Securitization, Assets, and Commercial Finance, including treasury, equipment lending, leasing, and debt/equity underwriting. It also forecasts regulatory capital for use in the bank’s CCAR submission and risk management.
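The RWA and capital figures such a platform produces can be illustrated with a minimal sketch of the Basel standardized approach, where RWA is the sum of each exposure times its risk weight and the minimum total capital requirement is 8% of RWA. The asset classes, weights, and amounts below are simplified examples for illustration, not the platform's actual rules:

```python
# Minimal sketch of a standardized-approach RWA calculation.
# Asset classes and risk weights below are illustrative only.
RISK_WEIGHTS = {"sovereign": 0.0, "retail": 0.75, "corporate": 1.0}

def rwa(exposures):
    """Sum exposure-at-default times risk weight for each position."""
    return sum(ead * RISK_WEIGHTS[asset_class] for asset_class, ead in exposures)

def minimum_capital(total_rwa, ratio=0.08):
    """Basel minimum total capital requirement: 8% of RWA."""
    return total_rwa * ratio

positions = [("retail", 1_000_000), ("corporate", 500_000), ("sovereign", 2_000_000)]
total_rwa = rwa(positions)            # 750,000 + 500,000 + 0 = 1,250,000
capital = minimum_capital(total_rwa)  # 8% of 1,250,000 = 100,000
```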

Responsibilities

Understand the concepts of economic and regulatory capital, the equipment finance lifecycle, and finance and budget derivation logic, and analyze the data model to fit multiple sources into a single data model.

Work with business users to gather details on capital, finance, and budget calculation factors, and provide data design solutions and data lineage.

Build and establish data models with required naming standards to cover various lines of business.

Support ETL to source data into the data model from the multiple systems in scope for RWA and capital calculations, including Wholesale, Retail, Trading, Securitization, equipment leasing, lending, asset financing, and standard rate-fixing systems.

Engaging with stakeholders to analyze the data and complex requirements to prioritize the requirements, identify gaps, and align the project activities with business goals.

Document requirements clearly and concisely in formats like BRDs or user stories, ensuring stakeholder validation and traceability.

Conduct end-to-end data mapping between source and target systems to support system integration, ensuring accurate data flow aligning to business requirements and rules.

Work with technical teams and stakeholders to define data transformation logic, identify gaps, and validate mappings, by developing and maintaining detailed data mapping documents to define field-level transformations between source and target systems.

Collaborate with cross-functional teams to identify data requirements, resolve mapping gaps, and ensure alignment with business rules and data governance standards to ensure accuracy, completeness, and consistency across the system.

Experienced in process modeling using flowcharts, BPMN, and use case modeling to optimize workflows and improve efficiency, applying Lean and Six Sigma for continuous improvement.

Perform data quality activities including profiling, merging, de-duplication, and standardization.

Guide the team and review ETL flows built to handle changing data from various sources using SCD and CDC concepts.

Use SQL/PLSQL scripts to analyze and load data into the database, handling large datasets with a focus on query optimization and tuning for better performance.

Work on reporting and Business Intelligence in Power BI and Tableau to design dashboards and reports per user requirements, aligned with the data model.

Analyze and align data and user requirements to the standards of Basel, MACRS, IFRS 16, fair lending laws, taxation, and operating and capital leases.

Engage in scrum discussions and take responsibility for analyzing requirements, and for designing and developing data model, ETL, and DB strategies.

Provide status reporting of team activities against the program plan or schedule to business users.
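The SCD handling referenced in the responsibilities above can be sketched as a minimal Slowly Changing Dimension Type 2 close-and-insert pattern: when a tracked attribute changes, the current row is expired and a new current row is inserted. The row layout and column names (`key`, `attr`, `eff_date`, `end_date`, `current`) are hypothetical, not the project's actual schema:

```python
from datetime import date

# Minimal SCD Type 2 sketch: expire the current dimension row on change,
# then insert a new current row. Column names are hypothetical.
def apply_scd2(dim_rows, incoming, today=None):
    """dim_rows: list of dicts with keys key, attr, eff_date, end_date, current.
    incoming: dict with keys key, attr. Mutates and returns dim_rows."""
    today = today or date.today()
    for row in dim_rows:
        if row["key"] == incoming["key"] and row["current"]:
            if row["attr"] == incoming["attr"]:
                return dim_rows  # no change: nothing to do
            row["current"] = False   # close the old version
            row["end_date"] = today
            break
    dim_rows.append({"key": incoming["key"], "attr": incoming["attr"],
                     "eff_date": today, "end_date": None, "current": True})
    return dim_rows
```

In a warehouse this same pattern is usually expressed as an UPDATE plus INSERT (or a single MERGE) keyed on the natural key and the current-row flag.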

Project 1: Enterprise Finance Data Mart

Duration: Apr 2017 to Jun 2020
Role: Lead Developer
Environment: Informatica, Unix, Windows, Oracle, Crystal Reports, Financial Regulations, Lucid Chart, ER Studio, JIRA

Project details

Building a data platform for multiple products across lines of business such as Wholesale, Retail, Trading, Securitization, Assets, and Commercial Finance, including treasury, equipment lending, leasing, and debt/equity underwriting, with the scope of providing a centralized data mart for business use cases.

Responsibilities

Work with business users to understand the financial data required for building the data mart for regulatory and economic capital calculation and reporting.

Coordinate with multiple workstream stakeholders and business users to get the data required for building the financial data store.

Document requirements in BRDs, FRDs, and user stories, and review them with business users and internal teams.

Build flow charts and data mapping sheets to load data into the data mart from the source, and help the team align with requirements.

Perform data profiling activities to understand the source data, translate findings into business requirements, and ensure data alignment.

Support the design of ETL code and DB components per the regulations to source data from upstream, analyzing the source data with SCD and CDC concepts.

Write SQL/PLSQL scripts to analyze and load data into the database per the data mapping, handling large datasets with a focus on query optimization and tuning for better performance.

Work on reporting and Business Intelligence in Power BI and Tableau to design dashboards and reports per user requirements, aligned with the data model.

Provide the financial calculation rules and logic on the source data for generating reports and reconciling data to produce monthly and quarterly capital calculations.

Analyze and forecast financial values for the next three years based on data loaded into the data mart for business user review.

Employer: Cognizant Technology Solutions

Overall Duration: Dec 2011 to Apr 2017

Project 2: Data Services Platform

Duration: Dec 2014 to Apr 2017
Role: Lead ETL Developer
Environment: Informatica, Unix, Teradata, Oracle

Project details

The Data Services Platform aims to provide a 360-degree view of financial data, including customers, assets, trading, wholesale and retail products, financial lending, leasing, and structured assets, to support an innovation program used for deriving analytics value, with a focus on securely and effectively harnessing the available financial information at a central data services platform.

Responsibilities

Understand and analyze the requirement documents and create a design to handle a multi-client environment and multiple data sources.

Work with business users to clarify requirements and translate them into technical specifications.

Responsible for analyzing, designing, and developing ETL strategies and processes; writing ETL specifications for developers; and Informatica development, administration, and mentoring.

Work with Informatica IDQ on profiling and data quality reports for review with stakeholders.

Build SQL and PL/SQL components, perform SQL tuning and optimization, and design Informatica ETL mappings using various basic transformations.

Build audit balancing control (ABC) for the batch process, record balancing across layers, and restartability.

Perform unit and integration testing; report and fix bugs, facilitate QA testing, and create the high-level QA test document.

Established work plans for the development and testing effort as per business requirements.

Provide status reporting of team activities against the program plan or schedule, and guide the team on project delivery.

Analyze the type, impact, cost, and benefits of an enhancement and provide estimates based on that analysis.

Assist and support cross-functional teams.
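The audit balancing control (ABC) mentioned above can be sketched as a simple record-count reconciliation across load layers: each layer's row count must equal the previous layer's count minus any explained rejects. The layer names and the rejects structure below are assumptions for illustration, not the project's actual framework:

```python
# Sketch of an ABC record-balancing check across ETL layers.
# Layer names and the rejects dict are illustrative assumptions.
def balance_check(counts, rejects=None):
    """counts: dict of layer -> row count, in load order.
    rejects: dict of layer -> rows legitimately rejected entering that layer.
    Returns a list of imbalance descriptions; empty means the batch balances."""
    rejects = rejects or {}
    layers = list(counts.items())
    issues = []
    for (prev_name, prev_n), (cur_name, cur_n) in zip(layers, layers[1:]):
        expected = prev_n - rejects.get(cur_name, 0)
        if expected != cur_n:
            issues.append(f"{prev_name}->{cur_name}: expected {expected}, got {cur_n}")
    return issues

# A batch that balances: 2 rows rejected entering staging, rest carried through.
assert balance_check({"landing": 100, "staging": 98, "target": 98},
                     {"staging": 2}) == []
```

In practice these counts would be captured per batch in an audit table and the check run before downstream jobs are released.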

Project 1: Integrated Data Store (IDS)

Duration: Dec 2011 to Nov 2014
Role: ETL Developer
Environment: Informatica, Unix, Windows, Oracle

Project details

Data from multiple healthcare source systems, consisting of provider and member information, is loaded into an Integrated Data Store (IDS) that functions as a data warehouse for downstream applications and as the reporting platform for the healthcare organization. The project involves end-to-end design and development to extract data, transform it by applying business rules, and load it into IDS using a standard layered architecture along with Master Data Management.

Responsibilities

Responsible for ETL development, testing and review of code along with supporting documents like Unit test case and technical handover documents.

Develop ETL mappings/sessions using Informatica PowerCenter to source data from various formats, including databases, flat files, XML, and JSON.

Create UNIX shell scripts to run the Informatica workflows and control the ETL flow.

Work with Business Analysts and business users to clarify requirements and translate them into technical specifications.

Create stored procedures and packages for remediation scripts in the database for data loading.

Analyze job performance and tune long-running jobs.

Involved in production deployment activities and support for system testing and QA testing.

Provide production environment support to ensure system continuity.

Academic Details:

DEGREE/STANDARD | BOARD/UNIVERSITY | INSTITUTION STUDIED | YEAR OF PASSING | PERCENTAGE
B.Tech (Information Technology) | Anna University | St. Peter’s Engineering College, India | 2011 | 85
XII | State Board - HSC | Bentinck Higher Secondary School for Girls, India | 2007 | 88.08
X | State Board - SSLC | Bentinck Higher Secondary School for Girls, India | 2005 | 89.6


