
Data Informatica

Location:
Lovettsville, VA
Posted:
June 30, 2020


SRINIVAS G. Email: add8kc@r.postjobfree.com

Mobile : 609-***-****

Professional Profile

10+ years of experience in data warehousing technologies, designing data integration and business intelligence applications covering the analysis, design, development, and implementation of business application systems.

Experience in MDM Hub configurations - Data modeling, Data Mappings, Data validation, Match and Merge rules, Hierarchy Manager, customizing/configuring Informatica Data Director.

Experience in ETL, ELT, data integration, data quality and data governance skills using Talend Data Integration, Talend Data Quality, Talend Big Data, SQL and PL/SQL, T-SQL.

Defined, designed and developed an end-to-end reporting solution using MongoDB, Java and Talend ETL.

Familiar with Talend features such as context variables, triggers, connectors for databases and flat files (tMySqlInput, tMySqlConnection, tOracle, tFileCopy, tFileInputDelimited, tFileExists) and processing components (tMap, tFlowToIterate, tAggregate, tSort, tJoin, tUnique, tDie).

Experience in creating multi-schema targets with flat files, creating split output files, and defining metadata structures for databases and XMLs in Talend Open Studio.

Analyzed regulatory restrictions and the need to identify the golden record for master data.

Experience in Informatica Data Quality (IDQ), Power Center, Data Cleansing, Data profiling, Data quality measurement and Data validation processing.

Worked in both waterfall and Agile SDLC project implementations.

Worked on Oracle Database supporting OLTP and Data Warehouses.

Proficient in understanding business processes / requirements and translating them into technical requirements.

Experience in writing Test Plans, Developing and Maintaining Test scripts.

Experience in performance tuning of ETL processes; reduced execution time for huge data volumes on company merger projects.

Worked on project implementation techniques such as star and snowflake schema data models.

Good team player; strong interpersonal and communication skills combined with self-motivation, initiative and a positive attitude.

Areas of Expertise

Data Modeling

Requirements analysis

Knowledge transfer

Preparing BRD, TD documentation.

Test Case Design and Execution

Develop Test Scripts

Supporting applications in both AD and AM models

Release Management

Admin Activities/Scrum Master

Industries

Insurance Domain

Telecommunications

Pharmaceuticals

HR Payroll

Healthcare

Banking, Finance

Oil and Gas Resources

Media and Entertainment

Experience and Skills

AML (Anti-Money Laundering): Actimize SAM/CDD/WLF tool set

Informatica PowerCenter 9.5.1/10

Informatica MDM Multi Domain Edition 9.5

Informatica Data Director (IDD) 9.5

Informatica Data Quality (IDQ) 9.5

Talend 4.2.4/5.2.2

Talend Open Studio (TOS) for Data Integration 5.5.2.

SQL server 2005 & 2008

Oracle 9i,10g and 11g

SQL, PL/SQL

Netezza

Teradata

Tableau

Qlikview

Knowledge of BO XI R2

Cognos 8

MicroStrategy

Windows XP, NT, ME, 2000, 98 & 95

UNIX/Linux

Educational Qualifications

Master's in Electronics from Osmania University (2006).

Bachelor of Computer Applications from Osmania University (2002).

Professional Experience

Automatic Data Processing [ADP].

Industry : Payroll/Product Development

Location : Parsippany /NJ

Position : Senior Application Developer

Duration : February 2017 to present

Environment : Informatica 10, Oracle 10g/11g, NICE Actimize suite for AML.

Project Scope

AML (Anti-Money Laundering): a set of laws, regulations, and procedures intended to prevent criminals from disguising illegally obtained funds as legitimate income.

SAM (Suspicious Activity Monitoring): building transactional suspicious-activity monitoring for the ALINE card.

Integrating the solution with the MB file from Meta Bank and VERF from VISA systems to build daily account transaction monitoring for any suspicious activity, creating alerts for the Compliance team to monitor regularly and report to FinCEN.

KYC/CDD (Know Your Customer/Customer Due Diligence): CDD information comprises the facts about a customer that enable an organization to assess the extent to which the customer exposes it to a range of risks, including money laundering, among other activities.

Building CDD for the entire client onboarding process for ADP; the point of entry is SFDC for potential prospects and existing clients.

Monitoring their activity, creating alerts, and risk-rating prospects and clients based on a third-party checklist provided by BvD (Bureau van Dijk) on a daily basis via API integration.

WLF (Watch List Filtering): building real-time and batch sanctions screening and watchlist filtering in seconds.

Feeding the SFDC data daily, in real-time and batch mode, into the Actimize system and integrating it with the Dow Jones list; creating alerts monthly based on the score provided by the DJ list and sending the reports back to SFDC and the Compliance team.

Building compliance handling for disputes/convenience checks/complaints and automating the workflow management process for ease of operation.

Responsibilities:

Working as the lead senior consultant/developer.

Integrated different source systems for compliance and suspicious activity monitoring, including Aline Pay Card, Disputes, Convenience Check, SFDC client onboarding, Oracle-NA, CSVP, EVO, CSVP-ODS, and Work Market.

Responsible for understanding the functional specs and involved in preparing the high-level design documents.

Involved in preparing the low-level design documents and ETL mapping documents (detailed design documents).

AD Scope includes estimating, work planning.

Supporting the project throughout the life cycle for overall development and end-to-end delivery.

Developing Informatica ETL mappings to load data from flat files to stage and from stage to the star schema, plus mappings that integrate attributes from different source systems, combine the data, and push it to the Actimize system.

Development of ETL strategy to combine data from existing different source systems and integrate with ACTIMIZE data model.

Creating UNIX shell scripts to automate the source file load from SharePoint and push it to the SFT profile for the Informatica ETL load.

Used Control-M as a scheduling tool for scheduling the jobs.

Providing support during the SIT/UAT testing phases, resolving defects on time and obtaining business sign-off.

Supporting the project during pre-production and providing production support for any issues after go-live.

Bank of Tokyo Mitsubishi

Industry : Financial Banking

Location : Jersey City/NJ

Position : Informatica Lead Consultant

Duration : May 2016 to January 2017

Environment : Informatica 9.5, Oracle 10g/11g

Project Scope

OFSAA Profitability: as part of this project, migrating the existing OFSAA 4.5 system to the new OFSAA 6.1 system and building a new, enhanced data model to load the merged Union Bank West Coast data along with the Bank of Tokyo East data.

Using the new data from the 6.1 data model for OFSAA reporting, including OBIEE cubes, and for other downstream systems.

Responsibilities:

Led a team of five offshore team members, assisting them with overall delivery.

Coordinating with the offshore development team and the onshore business team for development and requirements gathering.

Working with the onshore solution engineering team on business solutions.

Involved in all phases of project life cycle for overall project delivery end to end.

Helping the team in resolving technical/non-technical issues.

Providing support during the SIT/UAT testing phases, resolving defects on time and obtaining business sign-off.

Prepare for Production go-live.

Disney Magical Express Resorts and Parks

Industry : Disney Magical Express Resorts and Parks

Location : Orlando/FL

Position : Informatica Lead Consultant

Duration : July 2015 to May 2016

Environment : Informatica 9.5, Oracle 10g/11g, SQL Server 2008

Project Scope

The Walt Disney Company, commonly known as Disney, is an American diversified multinational mass media and entertainment conglomerate. The current DME project migrates the existing online reservation application, built on SQL Server with a .NET front end, to a new environment with Oracle and a Java-based front-end application for online reservation booking for Disney resorts and theme parks.

Responsibilities:

Led a team of six offshore team members, assisting them with overall delivery.

Responsible for understanding the functional specs and involved in preparing the high-level design documents.

Involved in preparing the low-level design documents and ETL mapping documents (detailed design documents).

AD Scope includes estimating, work planning and assignments.

Used QlikView to develop, design and support reports, dashboards and scorecards for a variety of business users.

Designed QlikView document/user settings and layouts to give clients a consistent, professional, optimized look.

Used the QlikView Management Console (QMC) and managed schedule dependencies to ensure the data is refreshed in a timely and accurate fashion.

Maintained Row, Column and Sheet level Securities for all Major Dashboards using Section Access.

Used QlikView Functions (Date and Time, Keep, Join, Mapping, String & input Fields etc), Loops, Conditional Statements and Variables.

Used Containers, Fast changes and created some fake containers with Inline data and used different types of charts & tables to meet user requirements.

Implemented security and deployed QlikView Applications.

Developed QlikView Dashboards using Chart Box (Drill Down, Drill up & Cyclic Grouping), List, Input Field, Table Box and Calendar.

Created Publisher Tasks in QMC for the QlikView Application to Reload and Distribute the QlikView Application on to the Access Point.

Scheduled the Publisher Tasks Daily/Weekly/Monthly on various Dashboards based on the Database updates.

Extensive Exposure to QlikView Server providing User access and Creating Tasks.

Implemented Section Access based on user requirements to restrict data by user profile.

Created Dashboard of reports using QlikView components like List Box, Slider, calendar & Macros.

Resolved issues resulting from User Acceptance Testing (UAT)

Built reusable macros to control QlikView objects for ease of use by the users.

Involved in design and development of QlikView Platform integrated with Expandable Databases.

State Street Bank

Industry : Banking

Location : Boston/MA

Position : Informatica MDM Consultant

Duration : October 2014 to July 2015

Environment : Informatica Multi Domain MDM 9.5/9.1, Informatica Data Director, Informatica Power Center 9.1, Oracle 10/11g, Windows Server 2008, Address Doctor, Toad 10.1, SQL*Loader.

Project Scope

CMS has embarked on a real-time operational MDM hub integrating data from multiple internal and external source systems to master the Counterparty Management System, rate counterparties, and better understand the lending risk management system.

Responsibilities:

Configured and installed Informatica MDM Hub, MRM Server, MRM Cleanse, MRM Resource Kit and Address Doctor 5 in the Conversion Island, Development, QA, Pre-Prod and Production Windows environments.

Led effort for gathering business requirements for the data-warehouse as well as business-intelligence reports to be used by the management.

Evaluated and reported the progress of data quality improvement to the management and also provided data quality training and consulting services to Business and IT members in the organization

Good development experience in Informatica (9.5) Power Center- ETL tool and Informatica Data Quality (IDQ) tool.

Involved in implementing the Land Process of loading the customer/product data set into Informatica MDM Hub that was coming from various source systems.

Took development decisions by working with business users and enterprise architects to understand enterprise vision and direction.

Hands on executing MDM hub Batch Jobs (Stage, Load, Tokenize and Match Processes) and Batch Groups.

Designed SQL queries to pull the data to be matched and merged from all the source systems according to the match-and-merge rule setup (to verify the consolidation process was configured correctly).
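As a hedged illustration of the match-and-merge consolidation described above (not the project's actual schema: table, column and source names here are invented), the sketch below matches staged party records on a shared identifier and picks the surviving "golden" record by a source trust ranking, a crude stand-in for MDM trust scores:

```python
import sqlite3

# In-memory staging table with records from two hypothetical source systems.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_party (
    src_system TEXT, src_key TEXT, full_name TEXT, tax_id TEXT, trust_rank INTEGER
);
INSERT INTO stg_party VALUES
    ('CRM',  'C-1', 'ACME Corp ', '77-123', 2),
    ('LOAN', 'L-9', 'Acme Corp',  '77-123', 1),
    ('CRM',  'C-2', 'Beta LLC',   '88-456', 2);
""")

# Match rule: same tax_id. Merge rule: keep the record from the most
# trusted source (lowest trust_rank) as the golden record.
golden = conn.execute("""
    SELECT tax_id, full_name FROM (
        SELECT tax_id, TRIM(full_name) AS full_name,
               ROW_NUMBER() OVER (PARTITION BY tax_id ORDER BY trust_rank) AS rn
        FROM stg_party
    ) WHERE rn = 1
    ORDER BY tax_id
""").fetchall()
print(golden)  # [('77-123', 'Acme Corp'), ('88-456', 'Beta LLC')]
```

Queries of this shape also make over- and under-matching visible: grouping by the match key and counting members shows which keys collapse too many or too few source records.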

Configured Match and Merge Rules in improving the data quality by match analysis, analysis of the MDM system performance and tuning the performance by changing the existing cleanse function, match rules etc.

Involved in developing and configuring the Hierarchies using Hierarchy Manager to identify hierarchical relationships.

Designed, configured and tested Informatica Data Director (IDD) interfaces.

Provided weekly status updates on progress, status and fatal errors in IDD.

Led ETL team towards designing and development of extraction mappings for inbound source to MDM and then outbound MDM to consuming systems.

Worked on the IDD UI, configuring subject areas for the Customer domain to support searching and editing master records, enabling the client's sales process transformation.

Enabled Message Queue Triggers for Hub State Indicator Changes.

Assurant Insurance

Industry : Insurance

Location : Santa Ana, CA

Position : ETL Informatica Tech Lead

Duration : August 2013 to September 2014

Environment : Talend Integration Services (5.0), Netezza DB, MySQL DB, Unix, Linux, Wiki, SVN, Talend Administrative Console (TAC), Java, InfoSphere DataStage (9.1) Designer/Director, Erwin Data Modeler r7, Informatica PowerCenter 8.6.1, Oracle 10g, PL/SQL, Toad, Teradata 13, Teradata SQL Assistant.

Project Scope

Assurant is a leading provider of renters/apartment insurance, flood insurance and manufactured-housing lender-placed insurance, with a diverse set of specialty and niche-market insurance products from regional agencies, broker networks, and insurance carriers to help customers find the coverage options that best suit their needs.

Responsibilities

Worked with Talend components such as tFileInputDelimited, tFileInputPositional, tFileOutputDelimited, tFileInputExcel, tFileInputXML, tFolder, tFileExists, tMySqlConnection, tMySqlInput, tSalesForceConnection, tSalesForceInput, tOracle.

Installed and configured the MDM Hub, cleanse server, resource kit, and Address Doctor 5 on the Dev, Test and Prod servers.

Gathered requirements from business users

Created the base objects, staging tables, landing tables, foreign-key relationships, static lookups, dynamic lookups, queries, packages and query groups.

Created Mappings to get the data loaded into the Staging tables during the Stage Process.

Defined Trust and validation rules for the base tables.

Coordinated with the business team, helped them understand match & merge, and incorporated their requirements.

Created match rule sets for the base objects by defining the match path components, match columns and rules.

Created the IDD application, subject areas and subject area groups; deployed and tested the IDD application, cleanse functions, timeline usage, and data export/import.

Configured Address Doctor and fixed the AD issues by doing the changes in the config file.

Worked on BDD config file to get the changes reflected on IDD.

Developed queries that can be used to detect over- and under-matching.

Analyzed the data by running the queries and provided the stats after Initial data and incremental Loads.

SAM: discussed the use of roles, creation of users, and assignment of users to roles.

Defined Roles and privileges for each environment according to requirements.

Defined the security such that schema will be secured with access only granted for specific downstream integration uses, using users created for those specific integrations

Scheduled MDM Stage Jobs, Load jobs using Utilities workbench.

American Express

Industry : Banking

Location : Phoenix, AZ

Position : ETL Informatica Tech Lead

Duration : February 2013 to August 2013

Environment : Oracle, Clarity, Informatica 9.5, Talend Integration Services (5.0), Talend Administrative Console (TAC); O/S: Windows, Unix.

Project Scope

Work Force Analytics: this module was developed for reporting across all divisions of American Express, showing the expenditure incurred on resources (American Express permanent employees and contractors) across all segments and locations.

Spend Analytics: this module was developed for reports that help the business better understand spend amounts across all segments and divisions of American Express. These reports give a better understanding of the amounts spent over a period.

Responsibilities

Requirement gathering from the business and leading the project.

Designing the overall solution to integrate employee attributes from HR Analytics tables with the timesheet data feed from the Clarity tool, creating a comprehensive data set that can be analyzed through Tableau.

Designing the solution to integrate the existing Spend data warehouse data into the GBS data warehouse, creating a comprehensive data set that can be analyzed through Tableau.

Developing Informatica ETL mappings to load data from flat files to stage and from stage to the star schema, plus mappings to integrate attributes from the HR data warehouse and load the combined data into 3 different summary tables (and similarly for the Spend data warehouse).

Developing the ETL strategy to combine data from the existing HR data warehouse with timesheet data from the Clarity tool using Informatica Joiner, Expression, Lookup, Router, Normalizer and Sequence Generator transformations.

Creating UNIX shell scripts to automate the source file load from SharePoint and push it to the SFT profile for the Informatica ETL load.

Creating a currency conversion table and loading currency conversion data from an RSS feed; also loading fact data after converting it into US dollars using the currency conversion table.
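A minimal sketch of the currency-conversion step described above: local-currency spend facts are joined to a conversion-rate table and restated in USD before loading. The rates, cost centers and column layout are invented for illustration, not the project's actual data:

```python
# currency -> USD per unit, as would be loaded from the rate feed (illustrative values)
usd_rate = {
    "USD": 1.0,
    "EUR": 1.10,
    "INR": 0.012,
}

# (cost_center, currency, local_amount) fact rows - hypothetical sample data
spend_facts = [
    ("GBS-01", "USD", 1000.0),
    ("GBS-02", "EUR", 500.0),
    ("GBS-03", "INR", 250000.0),
]

def to_usd(rows, rates):
    """Return each fact row extended with the USD-converted amount."""
    return [(cc, cur, amt, round(amt * rates[cur], 2)) for cc, cur, amt in rows]

converted = to_usd(spend_facts, usd_rate)
print(converted[1])  # ('GBS-02', 'EUR', 500.0, 550.0)
```

In the actual ETL this join would be done in an Informatica Lookup transformation against the conversion table, keyed on currency code (and typically an effective date).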

DAC configuration for full and incremental loads; created DAC tasks and the execution plan.

Creation of Design and Technical documents for development purpose.

Successfully deployed the Talend Admin Center WAR file on a WebLogic 10.3 server and trained the client on Talend Admin Center functionality.

Extracted, cleansed, loaded, tested and maintained identified conformed dimensional data using a complex mix of Talend ETL components, Java and SQL.

Scheduled Talend jobs with the Talend Admin Console and set up best practices and the migration strategy.

Did a POC on the ELT components and improved the performance of one job.

Set up the tool and the client/server model.

Worked on TIS(EE), TOS (4.1) and TIS Mpx.

Coordinating with offshore development team and on shore business team for development and requirements gathering.

Understanding of the business, which helps in resolving production defects.

Resolving production defects in OBIEE and Informatica ETL.

For the current project, studying the OBIEE reports to convert them into Tableau reports and designing the data model to suit Tableau reporting.

Creating required views for reporting in Tableau reports.

Devon Energy [Accenture/India]

Industry : Resources

Location : Bangalore, India

Position : ETL Informatica Tech Lead

Duration : December 2011 to February 2013

Environment : Oracle, SAP BW, SAP ECC 6.0, SQL Server 2008, Informatica 9.5; O/S: Windows, Unix.

Project Scope

Devon Energy Corporation is among the largest U.S.-based independent natural gas and oil producers. Based in Oklahoma City, the company's operations are focused on exploration and production in North America, including Canada.

The existing BIDW is built on the PPDM 3.7 model to support the BI analytics team in making day-to-day decisions based on BI cubes developed in SSRS and SSAS.

Responsibilities

Leading a team of four offshore team members and assisting them with overall delivery.

Extraction of data from SAP R/3, SAP ECC and BW source systems.

Working on SAP General Ledger tables to pull the data.

Helping the team with the SAP BW database migration from Oracle to HANA DB.

Analysis of existing data model to support ongoing changes to data model and BI analytics team.

Actively involved in 24x7 support, making enhancements wherever required.

Analyzing and fixing the code in production instantly after doing quick testing in production-like environment whenever there is a job failure due to the code issue.

AD Scope includes estimating, work planning and assignments.

AM includes manage Informatica support team and production support SLA's.

Identifying the existing performance issues and tuning the Informatica workflows for optimal performance.

Implementing changes to the existing data model (PPDM 3.7 to 3.8) and supporting the ongoing changes.

Analyzing performance bottlenecks and improving performance at the Informatica and database levels.

Preparing the documentation for easy maintenance of support project.

Automated the jobs, accounting for their internal and upstream dependencies, using Tidal as the scheduling tool.

Working closely with business users during month-end, quarter-end and year-end support activities, providing timely updates for financial closure, which are critical to the business for reporting purposes.

Implemented the JAWS analytics tool for workload balancing.

Creating OBIEE reports based on ad hoc requirements from the business.

Providing necessary support for OBIEE queries whenever required.

Working on incidents/tickets created by different business users, prioritizing them based on issue criticality.

Actively participating in the DR (disaster recovery) events that take place over a weekend each quarter.

Updating the business whenever there is any delay in the batches (for any reason) and planning accordingly so that there is minimal impact to the business.

Working along with the team as a part of the regular activities in day-to-day work along with the other responsibilities.

Assigning work to team members based on their competencies.

Created partitions to handle huge data at database level.

Tracking the status of work from time to time and providing timely updates to the team lead.

Providing prompt responses to queries posted by business users and supporting teams.

Handling escalations in the best way whenever they arise.

Scheduling the calls with Business users to understand the issues clearly and solve the issue accordingly.

Prepared documents such as the technical design doc, production support doc and deployment doc.

Leading and encouraging the team to participate in events conducted by the organization.

Level 3 Communications [Accenture/India]

Industry : Communications

Location : Hyderabad, India

Position : Informatica Tech Lead

Duration : March 2011 to October 2011

Environment : Oracle Exadata, SQL Server 2008, Informatica 8 and 9; O/S: Windows, Unix.

Project Scope

Level 3 Communications is an American multinational telecommunications and Internet service provider company headquartered in Broomfield, Colorado.

It operates a Tier 1 network. The company provides core transport, IP, voice, video, and content delivery for most of the medium to large Internet carriers in North America, Latin America, Europe, and selected cities in Asia.

Responsibilities

Participated in discussions with the senior management team to better understand the business requirements and to develop mapping documents for already-built jobs being migrated from Ab Initio to Informatica.

Leading the team of three for project delivery.

Involved in Analysis phase of business requirement and design of the Informatica mappings using low level design documents.

Designed the project in two phases, SDE and SIL, per the BI standards, at the Informatica (9.1) level.

STAGE (source-dependent extraction): data is extracted from the different source systems into staging tables; this is the staging area.

ODS (source-independent loading): the transformed data is loaded into the final target tables.

Helping the team in resolving technical/non-technical issues.

Involved in the migration from Informatica 8 to Informatica 9.

Involved in the DB migration from Oracle to Oracle Exadata and in implementing NRT (near-real-time) updates to the ODS from the front-end UI.

Participated in the DAC implementation.

Set up the Informatica migration from Oracle 10g to Oracle Exadata.

The EPs were scheduled at the DAC level per client specifications after creating the tasks, assembling subject areas and building the execution plans.

Prepared the road map for completing the tasks and planned the work distribution across the team per each individual's capability.

Involved in Performance tuning by determining bottlenecks in sources, mappings and sessions.

Created effective Test Cases and did Unit and Integration Testing to ensure the successful execution of data loading process.

Troubleshooting the loading failure cases, including database problems

Analyzed session logs when sessions failed, to resolve errors in the mapping or session configuration.

Used TOAD to run SQL queries and validate the data in warehouse and mart.

Involved in designing the ETL testing strategies for functional, integration and system testing for Data warehouse implementation.

Expertise with the Informatica Data Quality (IDQ) toolkit for analysis and data cleansing.

Used IDQ for data matching and data conversion.

Used IDQ to perform data profiling and create mappings.

Involved in the migration of code from IDQ to PowerCenter.

Experience creating mapplets in IDQ/Informatica Developer for data standardization.

Created materialized views at the database level and refreshed them at the ETL level for easy data access from OBIEE.

Time Warner [Accenture/India]

Industry : Communications

Location : Bangalore, India

Position : Senior Software Engineer

Duration : June 2010 to February 2011

Environment : Database: Netezza, mainframe DB2, Workday; Tools: Informatica 9.1.1 and VMware; O/S: Windows, Unix.

Project Scope

Time Warner Inc., a global leader in media and entertainment with businesses in television networks, film and TV entertainment and publishing, uses its industry-leading operating scale and brands to create, package and deliver high-quality content worldwide through multiple distribution outlets.

This project integrates the data across all Time Warner divisions, populates a single EDW for the Workday HR payroll system, and uses the same data for various reporting purposes.

Responsibilities

Involved in preparing BRDs (business requirements documents), functional specs and high-level design documents.

Involved in preparing technical specification documents and ETL mapping documents (detailed design documents).

Worked with relational, flat-file, XML and web-service sources and targets.

Worked on the migration of data from PeopleSoft to Oracle and mainframe DB2.

Created PowerExchange registrations for CDC on DB2 mainframe tables and created DataMaps to extract data from files on the mainframe.

Created FTP scripts to transfer fixed-width flat files from Oracle to the mainframe or to downstream systems that print checks for employees.

Created FTP jobs to transfer files to the TIBCO system and to other third-party systems such as OpenText.

Used web services and the Web Services Consumer transformation to pull and push data into third-party downstream systems such as OpenText and mainframe systems.

Used Informatica to extract data from Workday into staging, then loaded the data into the EDW data warehouse and data mart; the EDW is built on a Netezza database against the Workday HR payroll system.

Developed mappings using transformations such as Source Qualifier, Filter, Joiner, Expression, Aggregator, Sequence Generator, Lookup, Union, Update Strategy and XML Parser.

Developed mapplets and reusable transformations.

Cleansed data at the point of entry into the warehouse and ODS to ensure the warehouse provides consistent and accurate data for business decision making.

Used Aginity Workbench for Netezza for faster application design and development.

Performed target load plan to load data into the various targets.

Responsible for Developing Transformation rules from source system to Data Warehouse.

Involved in handling slowly changing dimensions.
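For context, a common way to handle slowly changing dimensions is Type 2: when an attribute changes, the current dimension row is end-dated and a new current row is inserted, preserving history. The sketch below is a hedged illustration of that pattern with invented field names (`natural_key`, `attr`, `eff_from`, `eff_to`, `current`), not the project's actual warehouse schema:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """Apply Type 2 SCD logic: close the old current row on change, insert a new one.

    dim_rows: list of dicts with natural_key, attr, eff_from, eff_to, current.
    incoming: list of dicts with natural_key, attr (today's source snapshot).
    """
    for rec in incoming:
        cur = next((r for r in dim_rows
                    if r["natural_key"] == rec["natural_key"] and r["current"]), None)
        if cur is None:
            # Brand-new dimension member: insert as the current row.
            dim_rows.append({**rec, "eff_from": today, "eff_to": None, "current": True})
        elif cur["attr"] != rec["attr"]:
            # Attribute changed: end-date the old row, open a new current row.
            cur["eff_to"], cur["current"] = today, False
            dim_rows.append({**rec, "eff_from": today, "eff_to": None, "current": True})
    return dim_rows

# Hypothetical example: an employee dimension row whose location changes.
dim = [{"natural_key": "E1", "attr": "Boston", "eff_from": date(2010, 1, 1),
        "eff_to": None, "current": True}]
apply_scd2(dim, [{"natural_key": "E1", "attr": "Orlando"}], date(2010, 6, 1))
print([r["attr"] for r in dim if r["current"]])  # ['Orlando']
```

In Informatica the same behavior is typically built with a Lookup on the dimension plus an Update Strategy transformation that routes rows to update (end-date) or insert (new version).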

Performed mapping performance tuning; interacted with the client team and business users to analyze business requirements and interpret transformation rules for all target data objects.

Responsible for preparing the unit test plan and unit test scripts for all phases of testing (unit, integration and system) and for bug fixing after user acceptance testing.

Nokia Siemens Networks [Accenture/India]

Industry : Communications

Location : Chennai, India

Position : Senior Software Engineer

Duration : August 2008 to April 2010

Environment : Database: Oracle 10g, Oracle Exadata, SQL Developer; Tools: Informatica 8.1.1 and 8.6.1; O/S: Windows Server 2003, Unix and Linux.

Project Scope

NSN Informatica: these systems take care of retail and sales analysis and distribution analysis of the products. The company seeks business enhancement and a competitive edge, using information technology to make better business decisions. NSN H5 has branches across Finland, Germany, Singapore and Sweden. It has some 17 clients and approximately 200 end users, all of whom use one common system from the NSN warehouse. This gives them intelligent information reports on their existing business situation for targeted analysis.

ISC Opera Reporting:

This project is developed for


