
Business Intelligence Data Warehouse

Location: Spring City, PA

Posted: April 05, 2025


SreeVishnu Vajjala

E-mail : *******.******@*****.***

Phone : +1-201-***-****

Professional Summary:

●18+ years of overall experience, with a deep understanding of technology and a focus on delivering business solutions, application development, and the design of data virtualization, data warehousing, and Business Intelligence projects.

●Proficient in Snowflake and in data virtualization using Denodo Virtual DataPort 7.x and 8.x.

●Currently working as a Snowflake/Denodo consultant on the DQLC project at Santander Bank, involved in the end-to-end life cycle of data quality rule development and enhancement, from prototyping and analysis through development, testing, delivery, and support.

●Experience installing and configuring Denodo Solution Manager and VQL servers at enterprise scale.

●Has worked as a data modeler, data architect, developer, and designer on DWH, BI, and system-integration projects. Solid understanding of ETL design principles and strong practical knowledge of ETL design and development using Informatica PowerCenter.

●Strong understanding of OLAP/OLTP systems, dimensional modeling, and star/snowflake schemas; has worked on the different layers of a data warehouse: staging, operational data stores, the enterprise data warehouse, and data marts.

●Performed heavy Oracle data loads in shorter time frames using Informatica bulk loading, Oracle exchange partitions, session partitions, and pushdown optimization techniques.

●Sound knowledge of SQL and PL/SQL development; experienced in writing scheduled jobs, procedures, functions, materialized views, and packages. Worked with parallel processing and Virtual Private Database concepts.

●Used Informatica PowerCenter 10.2, 9.6, and 9.5.1 to fix defects in data flows from various OLTP databases and other applications to the data mart and downstream applications.

●Monitored and tuned ETL processes for performance improvements; identified, researched, and resolved data warehouse load issues.

●Worked with schedulers such as Control-M, Autosys (JIL), and Oracle Scheduler.

●Has domain knowledge of Investment Banking, Wealth Management and Telecom.

●Prepared business requirement documents and technical specifications, and developed Informatica ETL mappings to load data into various tables conforming to business rules.

●Good understanding of the software development life cycle, working with Agile methodology. Excellent communication, presentation, and interpersonal skills.

●Good at performance tuning and troubleshooting.
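As an illustration of the exchange-partition loading pattern mentioned above, a minimal Oracle SQL sketch (table, partition, and column names are hypothetical):

```sql
-- Hypothetical names: SALES_FACT is the partitioned target, SALES_STG a staging
-- table with an identical structure, SALES_SRC the source.
-- 1. Bulk-load the staging table (direct-path insert; Informatica bulk mode
--    achieves the same effect).
INSERT /*+ APPEND */ INTO sales_stg
SELECT * FROM sales_src WHERE load_date = DATE '2024-01-31';
COMMIT;

-- 2. Swap the loaded segment into the target partition. This is a
--    metadata-only operation, so no second physical copy of the data is made.
ALTER TABLE sales_fact
  EXCHANGE PARTITION p_2024_01 WITH TABLE sales_stg
  INCLUDING INDEXES WITHOUT VALIDATION;
```

Because the exchange is a dictionary update rather than a data movement, load windows shrink dramatically for large partitions.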

Education:

Degree | University/College | Year of Passing

M.C.A | Osmania University / Loyola Academy | 2006

B.C.A | Osmania University / Princeton Degree College | 2003

Technical Skills

Languages: SQL, PL/SQL, Shell scripting, Perl, Python

Databases: Snowflake, Oracle 10g/11g/12c/19c, Exadata, MS SQL Server

SOAP Requests: XML

Schedulers: Autosys, Oracle Scheduler, Control-M

Tools & Utilities: TOAD, SQL Developer, SQL Navigator, Oracle Warehouse Builder, Informatica PowerCenter 9.5.1/9.6/10.2 HF2/10.5.5, Denodo Virtual DataPort 7.0 and 8.0

Domain Knowledge: Telecom, Insurance, Pharma, Banking

Platforms: DOS, Windows XP, Unix, Linux

Source Control: Visual SourceSafe, Borland StarTeam, Subversion, GitHub, Bitbucket

Modeling Tools: MS Visio, Toad Data Modeler, ER/Studio Data Architect 11.0

Development Methodologies: Agile

Awards & Certifications:

●Above and Beyond contribution award for skinny-statements efforts in PSI-31, Jan 2020.

●Above and Beyond contribution award in PSI-30, Oct 2019.

●Received the FIMC (Feather In My Cap) award from UBS for efforts on the Brazil client migration in the DQI project.

●Received the Expert Star award from Cisco for contributions to the C3BL project.

●Received the Cookies award twice in a row from BT for contributions to the team.

●Snowflake SnowPro Core certified.

●Secured 98% in Oracle Certified Associate.

●Secured 83% in Sun Certified Java Programmer for platform 1.4.

Work Experience:

Project Name: Data Quality – DQLC

Client: Santander Bank N.A.

Location: Boston, MA, USA

Role: Sr. Data Virtualization Consultant

Duration: 02/2024 – Present

Team Size: 20

Environment / Technologies: Denodo Virtual DataPort 8.0, SQL Server, Snowflake, Oracle 12c

Project Description

Santander Bank N.A. is an American bank, a wholly owned subsidiary of the Spanish Santander Group, operating in the commercial and retail banking sectors.

The Data Quality project quantitatively, qualitatively, and objectively assesses the state of data quality and reports quality statistics to data owners. Data quality rules such as Completeness, Conformance, Consistency, Uniqueness, and Validity are defined by data stewards and prototyped by data quality analysts. Rules are built and executed at frequent intervals, and results are reported to management. These reports are then used to cleanse the data.

As Denodo Consultant:

●Analyse prototyped rule queries to simplify them and convert them to VQL.

●Build base, derived, and interface views, splitting rule logic to improve reuse and efficiency.

●Participate in unit testing and QA cycles with data stewards.

●Review code and mentor junior developers.

●Performance-tune views and optimize VQL.

●Provide deployment support for developed rules.

●Provide production and warranty support for released code.
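For context, layering a data quality rule over reusable Denodo views might look like the following VQL sketch (view, source, and column names are hypothetical):

```sql
-- Hypothetical derived view over a base view (bv_customer) that wraps a
-- Snowflake customer table imported into Denodo.
CREATE OR REPLACE VIEW dv_customer_core AS
SELECT customer_id, email, country_code
FROM bv_customer;

-- Completeness rule as its own derived view: customers missing an email.
-- Keeping the rule separate from dv_customer_core lets other rules reuse
-- the same core projection.
CREATE OR REPLACE VIEW dv_rule_email_completeness AS
SELECT customer_id
FROM dv_customer_core
WHERE email IS NULL OR TRIM(email) = '';
```

Splitting the core projection from each rule view is what enables the reuse and efficiency mentioned above: Denodo can push the shared logic down to the source once.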

Project Name: Anti-Money Laundering – AML

Client: CNO Financial Group

Location: Carmel, IN, USA

Role: Sr. Data Virtualization Architect

Duration: 02/2022 – 02/2024

Team Size: 10

Environment / Technologies: Denodo Virtual DataPort 8.0, Bash scripting, SQL Server, Azure Databricks, Oracle 12c, Autosys, Git

Project Description

CNO Financial Group Inc. is a holding company. CNO's insurance subsidiaries – principally Bankers Life and Casualty Company, Washington National Insurance Company, and Colonial Penn Life Insurance Company – serve pre-retiree and retired Americans by helping them protect against financial adversity and provide for a more secure retirement. CNO provides life and health insurance, annuities, financial services, and workforce benefits solutions through a family of brands.

The AML Compliance project is a response to an internal audit finding, which identified that CNO needed to enhance both its prospective and retrospective transaction monitoring systems. AML is designed to ensure compliance with various federal regulations, including the Bank Secrecy Act, the USA PATRIOT Act, and FinCEN and OFAC requirements. These regulations are designed to control the risk of money laundering and financial crimes.

As Denodo Architect:

●Design Denodo platform setup with installation and configuration of VDP on Dev, test, QA and production environments.

●Create virtual data models for data consumption by CNO internal users.

●Create and set up data sources: SQL Server, Oracle, Databricks, and file-based inputs.

●Design and mentor for development of virtual layer.

●Create a data security layer with users and roles by leveraging Role Based Access Control.

●Configure Data Catalog for data discovery, lineage and exporting results.

●Monitor the trends and tune memory allocations.

●Laid out the version-control process for check-outs, check-ins, and merges.

Project Name: AD – NA Automation

Client: Becton Dickinson

Location: Franklin Lakes, NJ, USA

Role: Data Virtualization Consultant

Duration: 09/2021 – 02/2022

Team Size: 3

Environment / Technologies: Denodo Virtual DataPort 7.0, Hive

Project Description

Becton Dickinson is one of the largest medical technology companies in the world, advancing the world of health by improving medical discovery, diagnostics, and the delivery of care.

As a Denodo SME, responsible for developing the data virtualization layer: pulling data from heterogeneous systems to form a canonical view, and developing base, derived, and reporting views.

Project Name: Systems Integration – Concord

Client: SEI Investments

Location: Oaks, PA, USA

Role: Data Virtualization Consultant / Integration Specialist

Duration: 12/2019 – 09/2021

Team Size: 5

Environment / Technologies:

Languages: PL/SQL, Shell scripting

Databases: Oracle 11g, 19c

Tools: SQL Developer, Toad, Informatica 10.2, Denodo Virtual DataPort 7.0 and 8.0

Scheduler: Control-M

Project Description

SEI Investments is a leading global provider of asset management, investment processing, and investment operations solutions for institutional and personal wealth management. It provides a variety of solutions under the SEI Wealth Platform (SWP) to private banks, investment advisors, investment managers, and institutional investors.

The Systems Integration team is focused on the implementation and support of loaders, web services, and extracts from SWP. The team is also responsible for the development, implementation, and support of custom and standard interface files for SWP firms or supporting third parties. Concord works with the Product Manager on developing and enhancing interfaces.

Responsibilities:

●Creating High Level Technical Design Document having high level description and picture of various components and processes involved in achieving the desired functionality.

●Creating and providing Sample File Layout to Clients based on requirements so that client can start their development in parallel for consuming the outbound extracts.

●Preparation of low level design based on finalized High Level Design Documents. This involves creating the Actual Structure PLSQL Programs to be developed along with all the objects to be used. Peer reviews of low level designs.

●Designed and developed high-quality integration solutions by using Denodo virtualization tool by reading from Oracle database, flat files.

●Creating custom base views, developing derived views, integrating multiple views, and building final-form views used to generate pipe-delimited extracts via Denodo Scheduler.

●Developed shell-script utilities that perform semantic checks on generated files.

●Scheduling new Control-M jobs for extracts and triggering them on an ad-hoc basis to support testing.

●Performance tuning on developed derived views by using cache views and wrote cache jobs. Analyzing and identifying the reusable components to performance tune the extraction process. Fine tune and optimize the performance of complex views.

●Creating technical mapping document for Informatica. This includes the mapping logic from source to target along with all the validations and data flow details.

●Built restful web services from Denodo and exposed it to third party web applications.

●Creating and updating Database Package/Procedure/Function using PL/SQL.

●Creating database Object like Table, View, Synonyms, Sequences etc.

●Reviewing PLSQL code to make sure standards are followed and queries are performance tuned.

●Performing various kinds of testing like Unit Testing and Component Testing etc.

Project Name: Statements

Client: SEI Investments

Location: Malvern, PA, USA

Role: Data Architect / Data Modeler, Technical Lead

Duration: 07/2018 – 11/2019

Team Size: 15

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 11g

Tools: SQL Developer, Informatica 9.6 and 10.2, ER/Studio Data Architect 11.0

Project Description

SEI Investments is a financial services company and a global provider of investment processing, investment management, and investment operations solutions. SEI provides products and services to corporations, financial institutions, financial advisors, and ultra-high-net-worth individuals. A statement is a formal record of the financial activities of a business, person, or other entity, presenting relevant financial information in a structured manner and in a form that is easy to understand. Firms have a great deal of flexibility regarding the kinds of documents they produce, what is in these documents, and how the documents are configured. For example, the Platform supports Statements as well as Transaction Advices or Contract Notes, Fee Advices, and Client Income & Disposal Reports. Each of these client communications will vary in content, format, audience, frequency, volume, and size.

Responsibilities:

●As Data Architect, created blueprints for new structures and enhancements to existing structures after discussing requirements with tech leads, ensuring standards were met.

●Liaised with enterprise data architects to walk through the design blueprints and seek approvals.

●As Data Modeler, converted logical models to physical models and worked with admins to deploy the structures. Stored the models in the project's central repository.

●As Tech Lead, gathered requirements, assessed the associated risks, and converted business requirements into technical requirements.

●Coding, pair programming, code reviews, and unit test case reviews. Identified bottlenecks causing performance issues and rewrote the underlying SQL.

●Developed new packages for skinny-statement generation.

●Involved in defect discussions and fixes.

●Enhanced ETLs that process street-side RetireOne data.

●Involved in technical documentation and reviews.

Project Name: GSP (Group Shareholding Programme)

Client: Union Bank of Switzerland (Finance IT)

Location: Weehawken, NJ, USA

Role: Technical Lead

Duration: 12/2016 – 07/2018

Team Size: 18

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 11g

Tools: TOAD, Informatica 9.5.1, Autosys scheduler, SAP BO XI R2

Platform: UNIX

Project Description

GSP is the Group Shareholding Programme, part of the Compliance and Operational Risk Control division of UBS. GSP is an alerting system for UBS compliance officers, who are obliged to report UBS positions to regional regulators at certain intervals or on demand. GSP is designed to alert Compliance when UBS is at, or approaching, shareholding thresholds in the different jurisdictions. Compliance gathers further evidence (essentially validating the positions held in GSP and performing exception processing) and, if required, prepares a filing for the regulator.

GSP is essentially an Oracle data warehouse. Approximately 80 position data feeds are received daily and loaded into the database using Informatica (invoked from the Autosys scheduler). Users enter business rules for how positions are to be aggregated through a Java front end. A batch "gateclose" process runs for each region to generate "pre-canned" data for the alerts and reports, using a combination of position data, user-defined rule data, and instrument static data. Alerts are highlighted on the Java front end in so-called "Dashboards". Users can drill into dashboards and pull further details from B.O. InfoView reports. Users also have a series of Java front-end data maintenance tools (repair queue, data mapping, etc.).

Responsibilities:

●Defining and coordinating the technical tasks for the development of the software and/or interfaces.

●Interacted with our customers and gathered requirements based on changing needs. Incorporated identified factors into Informatica mappings to build Data Warehouse.

●Expert in Informatica tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

●Using Informatica Power Center to build the new feeds.

●Have experience in adherence to coding standards & best practices using Informatica Power Center.

●Worked with clients, serving as the interface between the developers and the project manager.

●Act as a mentor for the developers in the team. Has strong technical skills and developed the architecturally significant components of the software system and/or interfaces.

●Worked with the L2 support team on production issues, performing root cause analysis and applying fixes.

●Lead development team to create Informatica mappings for new feeds according to BRDs.

●Ensured all developers understood the big picture of their work and its implications.

●Know the status of the developer’s work and detect slippage.

●Communicated technical issues and decisions effectively to the PM and sponsor in business terms.

Project Name: FDR (Financial Detail Reporting)

Client: Union Bank of Switzerland (Finance IT)

Location: London, UK

Role: Lead Developer

Duration: 03/2015 – 11/2016

Team Size: 3

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 10g

Tools: TOAD, Informatica, Autosys scheduler

Platform: UNIX

Project Description

FDR is the Financial Detail Reporting project, one of the profit-and-loss reporting platforms for UBS. FDR is a new target-state infrastructure that allows Finance, across both the Equity and FICC functions, to apply a holistic approach to P&L and balance sheet reporting. FDR makes up part of strategic reporting and incorporates components of RMCR (Risk Management Control Renewal) and Finance Transformation.

Contribution:

●Oversaw team of on-site and off-shore developers to create ETL in Informatica mappings.

●Responsibilities include customer interaction to gather requirements, analyze the current system and proposed system, code for new feeds, unit testing, supporting business users, liaising with L2 team for deployment of code.

●Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, SQL, Union, Lookup, Joiner, and XML Source Qualifier transformations.

●Debugging the workflow/Session log files and assigning the ticket to the support based on the error.

●Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.

●Preparing low level design documents.

●Designed and coded all ETL routines using Informatica.

●Preparing unit test case documents.

●Updating Confluence to maintain the knowledge repository.

●Developed ETL mappings and transformations using Informatica PowerCenter 9.5.1.

●Helping teammates understand the architecture of the system.

●Generating ad-hoc reports to support business users. Participated in peer reviews.

●Involved in the data cleanup process, doing bulk uploads using Informatica and unit testing for the same.

●Engaged to maintain and code feeds, keeping the system running smoothly and free from data issues, with data feeding into the system by means of migrations.

●Supported the L2 team in successful implementation of code into production.

●Generated reports from database systems to cater to ad-hoc requests from business analysts for their analysis.

Project Name: DQI (Data Quality Initiative)

Client: Union Bank of Switzerland (Data IT)

Location: Singapore

Role: Independent Contributor

Duration: 03/2012 – 02/2015

Team Size: 10

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 10g

Tools: TOAD, Groovy script, Informatica

Platform: UNIX

Project Description

The Data Quality Initiative is a project that ensures cleanliness and high quality of client reference data across the DATA-IT platform. It comprises the COT (Customer Onboarding Tool), Entity Master, Masterfiles, and EROS applications. COT is a workflow-based tool to onboard clients and acts as a source of data, whereas Entity Master is a repository of client reference data and a downstream system of COT. Masterfiles is downstream of Entity Master and acts as the source system for account-related data, whereas EROS is the system for settlement instructions.

Contribution

As an Independent Contributor,

●Involved in the data cleanup process, doing bulk uploads using Informatica and PLSQL and unit testing for the same.

●Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

●Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

●Developed a shell wrapper script that, with a single command, generates a pre-migration report, runs the migration or bulk upload, and generates a post-migration report, drastically reducing the burden on the L2 team.

●Extensively optimized all the Informatica sources, targets, mappings and sessions by finding the bottlenecks in different areas, and debugging some existing mappings using the Debugger to test and fix the mappings.

●Engaged to maintain data cleanups, keeping the system running smoothly and free from data issues, with data feeding into the system by means of migrations.

●Performed code reviews for all developments using Informatica, and facilitated and ensured optimal ETL performance for all Informatica mappings and workflows.

●Acted as SPOC for all COT and Entity Master migration activities.

●Supported L2 Team in successful implementation of migration activities and bulk uploads.

●Generated reports from database systems to cater to ad-hoc requests from business analysts for their analysis.

Project Name: Socrates

Client: Union Bank of Switzerland (Finance IT)

Location: Singapore

Role: Team Member

Duration: 05/2011 – 02/2012

Team Size: 6

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 10g

Tools: TOAD, Informatica

Platform: UNIX

Project Description

Socrates is a fixed-income, multi-product P&L reporting application for traders. It is used to collate, report, and analyse P&L across the Finance Control Division of UBS. It captures various components of P&L (Profit and Loss) from different upstream systems via daily data feed processes, stores them in a repository, calculates certain other P&L components, and provides a total P&L against which Business Unit Controllers (BUCs) perform their daily checks and post adjustments if required. It acts as an upstream system for CONSOL, Arisk, and many other systems.

Contribution

As a team member,

●Responsible for creating workflows and sessions using Informatica Workflow Manager, and monitoring workflow runs and statistics in Informatica Workflow Monitor.

●Involved in coding for enhancements and minor changes, and unit testing for the same.

●Acted as SPOC for all database issues.

●Prepared Technical design document.

●Supported L2 Team for production issues.

Project Name: SDP – C3BL

Client: Cisco

Location: Bangalore, India

Role: Team Member

Duration: 09/2010 – 04/2011

Team Size: 4

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 10g

Tools: TOAD, Oracle Warehouse Builder

Platform: UNIX

Project Description

Service Delivery Performance is part of the C3BL system (Cisco Customer Care Centre Business Layer). C3 acts as the upstream system for C3BL, and BO is the means of accessing C3BL data. Data flows from C3 into the BL staging layer; this is termed the Extract process. The $U job-scheduling tool is used in the ETL process. From the BL staging schema, data then flows into the BL reporting schema; this is termed the Transform/Load process. Finally, data is taken from the BL reporting layer for BO report generation. The whole process uses an ETL mechanism.

Contribution

As a team member,

●Involved in coding for enhancements and minor changes, and preparing unit tests for the same.

●Responsible for designing and maintaining the ETL solution using OWB and Oracle PL/SQL, including application data and dimension modeling, as well as project management for various projects in Information Management.

●Prepared Technical design document.

●Supported QA Team.

●Launching and monitoring the loads.

Project Name: SSO DM

Client: Cisco

Location: Bangalore, India

Role: Team Member

Duration: 06/2010 – 08/2010

Team Size: 2

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 10g

Tools: TOAD

Platform: UNIX

Project Description

SSO DM is a data repository that pulls data from the source system, Cisco Customer Care Centre (C3), and converts it into a structure suitable for reporting. This extracted data is used by many downstream systems such as SCM BI, EBI, and CIBER BI. SSO DM uses an ETL mechanism to extract data from the source.

Contribution

As a team member,

●Involved in coding ETL enhancements using PL/SQL and minor changes, and preparing unit tests for the same.

●Prepared Technical design document.

●Supported QA Team.

●Knowledge transfer to Support team.

Project Name: eDCA (Electronic Data Capture Application)

Client: BT (British Telecom)

Location: Kolkata, Pune & Bangalore, India

Role: Team Member

Duration: 03/2007 – 05/2010

Team Size: Project: 80; Module: 15

Environment / Technologies:

Languages: PL/SQL

Database: Oracle 9i

Tools: TOAD, SQL Navigator

Platform: Windows XP

Project Description

eDCA is a web-based tool to place orders for BT Network service customers and validate them according to business rules. Orders cover services such as Access, VPN, and infrastructure needs for customer premise equipment such as routers and modems.

eDCA comprises the following teams:

1. FE

2. Databuild

3. Quarantine

Quarantine is a back-end process that interacts with Classic over an XML interface. It acts as a bridge between the online DCA front end and the Classic application. It comprises database tables and database objects such as stored procedures, functions, and views. It is the communication interface between DD DCA / Decomp Tool and Classic, used by Decomp Tool, DD DCA, and Classic. Quarantine converts the order-detail format received from Decomp Tool or DD DCA into a format understood by Classic, and vice versa.

Contribution

As a team member,

●Involved in coding enhancements and minor changes, and preparing unit tests for the same.

●Played a significant role in launching BT's new product, BT OneVoice.

●Worked on all propositions of the E2E stack.

●Provided the support for production, SIT and D2D.

●Catering to ad-hoc requests as and when users requested.

●Coordinating with and assisting the team.

●Inducting new members into team.

●Prepared KT Document for the project.
