
Data & Digitalization Engineer / DBA Consultant

Location:
Chennai, Tamil Nadu, India
Salary:
Optional
Posted:
March 07, 2026


Resume:

S. Sreeram (alias of S. Sriram Vassuthev) E-Mail: **********@*****.***

Mobile/WhatsApp: 637-***-****

Database Design Analyst Consultant | Data Engineering Design & Analysis Consultant | Data Support Engineer | Data & Digitalization

Activity Summary

ACADEMIC DETAILS

2023 M.Sc Counseling & Spiritual Health, Annamalai University.

2019 LL.B (Bachelor of Law), Dr. Ambedkar Global Law Institute

2010 B.E (Computer Science & Engineering) from Anna University (Sams College) 70%

2006 12th (Science, Maths, Computer), Matriculation Board (ICF School), 55%

2004 10th (Science + Maths), Matriculation Board (ICF School), 50%

CERTIFICATES

2025 Linux for Cloud Technicians Essentials

2024 Advanced Diploma in Cloud Computing for Amazon Web Service

2023 Diploma in Amazon Web Services

2022 CS50x Introduction to Computer Science - Harvard

2020 MRD (Medical Record Department) Certificate

2019 Oracle MySQL 5.6 Certified Professional.

2018 PG Diploma Program in Data Science

Oracle Database 12C Certified Professional.

2012 Oracle Database 11g Certified Professional.

2011 Oracle Database 10g Certified Professional.

Oracle 10g performance tuning.

2010 Red Hat Certified Engineer

PROFILE SUMMARY

An articulate computer science engineer with 10.4 years of experience in the IT industry.

Five years of research and study across a diverse range of subjects has added value to this experience and enhanced my career path.

Core AI development at the infrastructure level, building layered data services from data platforms through to applications per industrial specifications.

Experience across industry sectors including BFS, financial markets, pharma, hotel & tourism, and core IT infrastructure.

IT infrastructure organization support - modern public & private cloud, application, and database environments.

IT infrastructure organization support - traditional application and database environments.

Professional experience in database development and database analysis on Dev and SIT.

Professional experience in database maintenance activities on Dev, SIT, UAT, and PROD.

Professional experience in Oracle, MySQL, MSSQL, Sybase, and PostgreSQL databases.

Professional experience in cloud databases: AWS DynamoDB, AWS Aurora, and Google BigQuery.

Professional experience in SQL and SQL scripting, with tools for integration, service, reporting, and testing.

Professional experience in DBMS/RDBMS programming: procedures, functions, packages, and library utilities.

Professional experience in shell and Bash scripting.

Experience in report development on JasperSoft, iReport, SAP BusinessObjects (BI), Tableau, and Power BI.

Experience in system development with Python, R, scripting, Selenium-based testing, and HTML/JS to validate data.

Experience in data integration tools: Talend, SSIS, and the Oracle Retail store ETL tool.

Experience in cloud computing: AWS, Google, and Azure cloud environments.

Experience in automation with Python (RPI).

Experience in SPARQL and the RDF query language on triple-store databases.

Experience in document structures such as XML and JSON.

Experience in capacity planning of database systems for storage, memory, and computation.

Experience in logical and physical data modeling with relational models and ER diagrams.

Experience in understanding business data at the logical, physical, and conceptual levels for analysis.

Experience in service orientation with containers and Kubernetes, in the cloud and on-premise.

Research & learning in supporting scalable services in virtualized environments (VMware, Hyper-V).

Research & learning in supporting scalable microservices with Docker and Podman using Kubernetes.

Research & learning in supporting scalable database microservices with Docker and Podman using Kubernetes.

Research & learning in infrastructure scaling (horizontal vs. vertical) with capacity calibration and testing methodology.

Organizational Experience

Career Break [Jan 2023 – Present] IT Industry

Aspire Systems [Jul 2022 – Oct 2022] IT Industry Role: Module Lead

BlazeClan Technologies [Feb 2022 – Jul 2022] IT Industry Role: Database Designer

Infrasoft Technologies [Jun 2019 – Sep 2021] IT Industry Role: Support Consultant

AstraZeneca Pvt Ltd [Nov 2017 – Jun 2019] Pharmaceutical Industry, IT Department Team Size: 3 Role: Senior Consultant

TATA CONSULTANCY SERVICES [Jun 2016 – Nov 2017] Computer Department Role: System Engineer

Treselle Systems Pvt. Ltd [Jul 2011 – Dec 2011] & [Jan 2014 – Jun 2016] Software Services Role: Data Engineer / Database Analyst

Organization Summary

Aspire Systems [Jul 2022 – Oct 2022] IT Industry Role: Module Support

Team : IFF Team Size : 5

Description: Creative Application

Understanding data variants.

Remodeling index structures for SQL queries.

Debugging and mining data; evaluating time taken.

Profiling data and SQL query fetch methodology according to the optimizer.

Reviewing explain plans for all SQL queries.

Understanding data behavior characteristics when mining and fetching data from the database.

Reviewing indexation for batch processing.

Monitoring batch processing and finding behavior-level improvements as data grows.

Public Cloud : AWS (Amazon Web Service)

Operating system: Amazon Linux (Centos RHEL Fedora)

Database: MySQL 5.7

Tool Data : MySQL Workbench

Monitoring tools for the database system.

Performance monitoring.

Validating indexes for SQL queries.

Tuning SQL queries using indexes.
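
An index-tuning pass of this kind can be sketched in MySQL 5.7 syntax (the table, columns, and index name below are illustrative, not from the actual project):

```sql
-- Inspect the optimizer's plan for a slow query.
EXPLAIN
SELECT order_id, total
FROM   orders
WHERE  customer_id = 42 AND created_at >= '2022-01-01';

-- If the plan shows a full table scan, add a composite index
-- covering the filter columns (hypothetical names).
ALTER TABLE orders ADD INDEX idx_customer_created (customer_id, created_at);

-- Or force the index explicitly if the optimizer still avoids it:
SELECT order_id, total
FROM   orders FORCE INDEX (idx_customer_created)
WHERE  customer_id = 42 AND created_at >= '2022-01-01';
```

Re-running EXPLAIN after the change should show ref/range access on the new index instead of a full scan.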

BlazeClan Technology [Feb 2022 – July 2022] IT Industry Role : Database Designer Remote Work

Team : WZRDRY Team Size : 15

Description: Cryptocurrency-based project on a digital options-trading platform.

Designing schemas and roles.

Defining schemas and data modeling (entity-relationship) per sprint requirements.

User creation for the application on Dev and UAT.

Dumping data files from Dev to UAT per release.

Forcing a query to use an index

Determining disk usage

Finding out what queries users are currently running

Getting the execution plan for a statement

Logging slow statements

Collecting statistics

Monitoring database load

Finding blocking sessions

Table access statistics

Finding unused indexes

Checking active sessions
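
Most of the checks above map directly onto PostgreSQL's standard catalog and statistics views; a few representative queries (a sketch against documented pg_catalog/pg_stat views, not project-specific code):

```sql
-- Queries users are currently running / active sessions
SELECT pid, usename, state, query
FROM   pg_stat_activity
WHERE  state = 'active';

-- Blocking sessions: which pids is each waiting pid blocked on?
SELECT pid, pg_blocking_pids(pid) AS blocked_by, query
FROM   pg_stat_activity
WHERE  cardinality(pg_blocking_pids(pid)) > 0;

-- Disk usage: current database, then its ten largest tables
SELECT pg_size_pretty(pg_database_size(current_database()));
SELECT relname, pg_size_pretty(pg_total_relation_size(oid))
FROM   pg_class
WHERE  relkind = 'r'
ORDER  BY pg_total_relation_size(oid) DESC
LIMIT  10;

-- Indexes that have never been scanned (unused-index candidates)
SELECT schemaname, relname, indexrelname
FROM   pg_stat_user_indexes
WHERE  idx_scan = 0;

-- Execution plan for a statement
EXPLAIN (ANALYZE, BUFFERS) SELECT count(*) FROM pg_class;

-- Log statements slower than 500 ms; refresh planner statistics
ALTER SYSTEM SET log_min_duration_statement = '500ms';
ANALYZE;
```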

Public Cloud : AWS (Amazon Web Service)

Operating system: Amazon Linux (Centos RHEL Fedora)

Database: PostgreSQL 13.4

Script / Language: Power Shell & Batch Program

Tool Data : pgAdmin 4, PgBouncer, pg_dump, draw.io (data modeling)

Backend Tool : Java - Hibernate, microservices

FrontEnd: React JS

Identities Method: Amazon Cognito

Involved in day-to-day Scrum calls.

Designed tables and structures according to each sprint release.

Infrasoft Technology [Jun 2019 – Sep 2021] IT Industry Role : Consultant Support

Client : BNP Paribas

Team : Profit and Loss (PNL) Team Size : 7

Description: The Profit and Loss (PNL) team performs day-to-day regulatory calculations on the financial platform on a daily basis. Plato is the application for PNL: a global Profit and Loss application that calculates the official PnL for Middle Office users. Feeds are the key input to PNL; the Profit and Loss is calibrated, computed, and stored, and reporting services are provided to Middle Office users and other dependents.

• Involved in day-to-day Plato job monitoring.

• Handled missing feeds and feed-related enquiries with the upstream team.

• Created incidents for missed files and followed up with the team (feeds are the key input to PNL). Updated job status regularly. Handled user requests for account unlocks and password resets for the Plato application.

• Checked job performance and processing time in the system under heavy data consumption from XML feeds, and resolved it using a clipping method.

• Wrote Sybase SQL queries to validate data per user enquiries (e.g. missing data) and responded to users with the result set, as it exists in the system, in an Excel sheet.

• Published daily data from Europe to the U.S. as a physical data move using the Plato Position Display tool.

• Published data between servers using internal data-transmission tools across the Message Broker, Report Minor, Report Major, BCP interface, OS interface, and Report Extract services.

• Checked failed job logs, identified the issues, updated team members, resolved the jobs, and reran them.

• Reverse-checked given PV or cash values against the respective file name for a given COB date as requested by users, cross-validated between the database and the file feed, and informed the upstream team or the respective user.
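
A data-validation query of the kind described might look like this in Sybase T-SQL (the table and column names are illustrative, not the actual Plato schema):

```sql
-- Check whether positions exist for a given COB date, per feed,
-- before responding to a missing-data enquiry.
SELECT cob_date,
       feed_name,
       COUNT(*)      AS row_cnt,
       SUM(pv_value) AS total_pv
FROM   pnl_positions
WHERE  cob_date = '2021-03-15'
GROUP  BY cob_date, feed_name
ORDER  BY feed_name
```

The result set would then be exported to Excel and cross-checked against the file feed, as described above.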

Operating system: Windows Server 2008

Database: Sybase ASE 16.0 + Sybase IQ 16.0

Script / Language: PowerShell & batch programs, Perl, .NET Framework 4.7, Visual Basic

Tool Data : ASE ISQL, DBviewer 6.0, Plato Monitor, Plato Position Display

ETL Server : Informatica Powercenter 10

Reporting Tool : SAP Business Object (BI)

Development: T-SQL (Sybase)

Middleware : IBM MQ 7.5

Infrasoft Technology [Jun 2019 – Sep 2021] IT Industry Role : Consultant Support

Client : BNP Paribas

Team : Wealth management Team Team Size : 18

Description: Wealth management Team is an investment-advisory discipline which incorporates financial planning, investment portfolio management and a number of aggregated financial services offered by a complex mix of asset managers, custodial banks, retail banks, financial planners and others. There is no equivalent of a stock exchange to consolidate the allocation of investments and promulgate fund pricing and as such it is considered a fragmented and decentralised industry.

• Migrating Application from Sybase ASE 15.7 to Oracle 12C Database as PDB

Environment SIT & UAT.

• Export and Import sharing using Lanceur tool.

• Validating manual deployment

• Managing the service related activity on the Application support.

SQL script and K-Shell Scripting deployment of Application.

• Checking Validation step of Application migration from Sybase ASE 15.7 to Oracle 12C.

• Validating the Schema Check and Evaluate Similarity checklist.

• Design the Component/Process based on Deployment of Serena Tool.

• Understanding the Manual deployment and Designs the Process & Component of Serena tool on SIT and UAT environment.

• Understanding the Machine architecture, Source Code Language with Compiler, Interpreter.

• Deployed compiled and interpreted source-code packages to the application server via Jenkins and Kubernetes.

Operating system: AIX,RedHat

Database: Oracle 11g, Mysql 5.0,Sybase ASE15.7

Script / Language: Shell Script Programing

Tool Data : SQL Developer, Serena, Lanceur

Development: SQL, PL/SQL (Oracle), Lanceur

Middleware : Tomcat, WebSphere

Involving in Deployment using Automation Tool Serena.

• Handling Deployment of Application based on Requirement

• Manual Deployment on Database on Oracle 12C.

• Manual Deployment without Tool

• Database data migration using Oracle utility tools to different environments.

Astra Zeneca Pvt Ltd [Nov 2017 – Jun 2019] Pharmaceutical Industry IT Department Team Size :3 Role : Senior Consultant

Description: Enabling Units, Global Product & Portfolio Strategy, and Global Medical Affairs (EUGG) IT actively partners with its business stakeholders to deliver cutting-edge IT capabilities to deliver on AstraZeneca’s bold ambition, delivering life changing therapies to its patients. The ambition of EUGG IT is to provide world-class expertise and leadership in technological solutions, BI Analytics, and Information Delivery to our Finance, HR, Legal, Compliance, Corporate Affairs, GMA and GPPS business functions. The EUGG IT Team plays a critical role in bringing AstraZeneca’s exciting product pipeline to patients across the globe. This team engages with some of the most exciting business areas within AstraZeneca and is faced with solving for a variety of complex business and IT challenges.

Operating system: Linux, Windows

Database : Oracle 12c, MySQL 5.7, Postgres, Redshift, AllegroGraph; SPARQL, SQL, PL/SQL

Script : Unix Bash/shell script, Windows PowerShell

Cloud System : AWS - EC2 (Linux & Windows)

Integration Tool : Talend Data Integration

Application : ODS, HRA, CI Portal

DBA Activity :

•Provision the user and granting the privilege.

•Create database as per requirement.

•Create table and rebuilding the tables and manage the structure in git repository.

•Find the locking session.

•Adding the index and tuning the performance query.

•Performance troubleshooting checklist.

•Access request forms.

•Database Inventory.

•Documentation.

•Procedures for database changes and code promotion from one environment to another.

•Understand the data modeling of the ODS, HRA, and CI systems.

•Evaluate the existing systems to understand, develop, and enhance them.
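
The provisioning and lock-hunting tasks above correspond to standard Oracle SQL; for example (a sketch using Oracle's documented dictionary views, with hypothetical user and table names):

```sql
-- Provision a user and grant privileges (names are illustrative).
CREATE USER report_user IDENTIFIED BY "ChangeMe#123";
GRANT CREATE SESSION, CREATE TABLE TO report_user;
GRANT SELECT ON ods.customer TO report_user;

-- Find sessions that are blocked, and who is blocking them.
SELECT s.sid, s.serial#, s.username, s.blocking_session
FROM   v$session s
WHERE  s.blocking_session IS NOT NULL;
```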

Talend Data Integration:

•Experience designing jobs using Talend components.

•Experience understanding and implementing components.

•Experience understanding the database components.

•Experience designing SQL queries against different end sources.

•Experience with mapping and processing.

•Experience with context parameters in job design.

•Experience with components for XML, JSON, and other sources.

•Experience configuring and testing jobs on the TAC server.

CI Portal Application Development:

• Worked on the Competitive Intelligence database and understood the dataset of financial terms.

• Gathered and analyzed requirements.

• Maintained source-to-target table mappings. Understood pharma-related products and related the data to financials.

• Designed jobs and data-loading processes from different source systems to the destination system.

• Validated the pharma data using SPARQL as per business analysts.

• Understood the application fields and the related data-table fields.

• Reverse-engineered the database data model. Wrote SPARQL queries per the tags defined in the system and analyzed the data.

• Understood the PL/SQL code behind the CI Portal data refresh.

TATA CONSULTANCY SERVICE [Jun 2016 – Nov 2017] Computer Department Role: System Engineer

Client: Ontario Teachers' Pension Plan (OTPP), Canada [Jan 2017 – Nov 2017] Banking & Financial Service Team Size : 12

Description: OTPP is an independent organization responsible for administering defined-benefit pensions for school teachers of the Canadian province of Ontario. Ontario Teachers' also invests the plan's pension fund, and is one of the world’s largest institutional investors. The plan is a multi-employer pension plan, jointly sponsored by the Government of Ontario and the Ontario Teachers' Federation.

Operating System: Red Hat Linux, Windows Server, IBM z/OS, HP-UX

Database: Oracle 11g (5-node RAC), MySQL 5.7, MSSQL, Postgres, Redis, IBM DB2 UDB, COBOL

Notes: Automic Appworx (an AutoSys-style scheduler) was used instead of JCL

Script: Unix Bash/shell script, Windows PowerShell

Tool: Oracle 12c OEM, MySQL Enterprise Monitor (MEM)

Integration Tool: Oracle 11g Streams (heterogeneous, ODBC to MSSQL), Delphix (automated database refresh), Oracle Advanced Queuing, SSIS

Application: Enterprise Database (EDB), Enterprise Data Warehouse (EDW), Liferay Application

MySQL Activities:

•Installing & configuring MySQL 5.7 server & MySQL Router with InnoDB Cluster.

•Master-slave replication setup.

•Master-to-master replication setup.

•Checking user-creation enrollment issues and ensuring the security of tables, views, and data.

•Monitoring SQL statement performance, interpreting queries, and tuning them.

•Dropping users from the enterprise system.

•Configuring external authentication for MySQL.

•Installing & configuring MySQL Enterprise Monitor (MEM).

•Installing & configuring the Query Analyzer in MySQL Enterprise Monitor (MEM).

•Configuring alerts and notifications in MySQL Enterprise Monitor (MEM).

•Deploy the changes in the SQL script on Production/UAT/SIT/TEST/DEV.

•User provision on MySQL for Production/UAT/SIT/TEST/DEV.

•Granting the privilege as per the business requirement.

•Automating the Service start Stop during OS patch.
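
A master-slave setup of this kind follows MySQL 5.7's documented replication commands; a minimal sketch (the host, account, and binlog coordinates below are placeholders):

```sql
-- On the master: create a dedicated replication account.
CREATE USER 'repl'@'%' IDENTIFIED BY 'ReplPass#1';
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%';

-- On the slave: point at the master and start replicating.
CHANGE MASTER TO
  MASTER_HOST     = 'master.example.com',
  MASTER_USER     = 'repl',
  MASTER_PASSWORD = 'ReplPass#1',
  MASTER_LOG_FILE = 'mysql-bin.000001',
  MASTER_LOG_POS  = 4;
START SLAVE;

-- Verify: Slave_IO_Running and Slave_SQL_Running should both be Yes.
SHOW SLAVE STATUS\G
```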

ORACLE Activities:

•User provision Oracle EDB EDW Production/UAT/SIT/TEST/DEV.

•Database replication in a distributed database system (Oracle 11g Streams setup).

•Checking user-creation enrollment issues and ensuring the security of tables, views, and data.

•Monitoring SQL statement performance, interpreting queries, and tuning them.

•Monitoring and optimizing the performance of the database

•Drop User from Enterprise System.

•Configure Oracle 11g/12c for Enterprise database.

•Granting the privilege as per the business requirement.

•Deploy the changes in the SQL script on Production/UAT/SIT/TEST/DEV.

•Experience in Oracle 11g RAC environment

Client: Masters Home Improvement - Woolworths, Australia [Jun 2016 – Dec 2016] Retail Team Size : 3

Description: Masters was the trading name of an Australian home-improvement chain operated by the retailer Woolworths Limited, created as a way for Woolworths Limited to enter the hardware retail space. A total of 62 Masters stores across Australia were supported, supplying hardware and home timber materials to customers.

•Responsible for writing and optimizing in-application SQL statements

•Monitoring SQL statement performance, interpreting queries, and tuning them.

•Accountable for using SQL to cleanse the data, profile the data, and find its cardinality.

•Efficiently understanding degraded SQL statement performance, modifying SQL statements for performance, and testing them in the application.

•Checking user-creation enrollment issues and ensuring the security of tables, views, and data.

•Monitoring and optimizing database performance.

•Analyzing the data and removing duplicate records on a daily basis.

•Verifying code deployment and implementing service changes on Development & Production.

•Writing & maintaining procedure-oriented programs using procedures, functions & triggers with control statements.

•Writing & maintaining shell and Perl scripts for automating the system.

•Working on Retail Phase Jobs from Phase 1 to Phase 6
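
A stored-procedure pattern of the kind maintained here, sketched in Oracle PL/SQL (the table, columns, and de-duplication logic are illustrative only, not the actual retail schema):

```sql
-- Hypothetical example: remove duplicate daily item records,
-- keeping the earliest row (lowest rowid) per item/store pair.
CREATE OR REPLACE PROCEDURE purge_duplicate_items AS
BEGIN
  DELETE FROM store_items a
  WHERE  a.rowid > (SELECT MIN(b.rowid)
                    FROM   store_items b
                    WHERE  b.item_id  = a.item_id
                    AND    b.store_id = a.store_id);
  COMMIT;
END purge_duplicate_items;
/
```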

Operating system: HP-UNIX

Database: Oracle 11g, SQL, PL/SQL

Script: Unix Bash/shell script

Tool Data : SQL Developer

Integration Tool: Automic Appworx

Application: Oracle Retail 13.1, OBIEE 11g, IBM MQ

Treselle System Pvt.Ltd [ July 2011 – Dec 2011 ]~ [Jan 2014 – Jun 2016] Software Service Role: DataEngineer/Database Analysts

Client: DISCERN, US [Jan 2014 – Jun 2016] BFS (Banking & Financial Service)

Description: DISCERN Investment Analytics, Inc. is an equity research firm. The firm seeks to cover specialty pharmaceuticals, financial services, energy, and real estate investment trust sectors. It offers investment research blended with contextual foresight to identify emerging markets, key trends and opportunity areas over the long-term horizon.The firm powerful information analytics tools and makes use of proprietary data streams to conduct its research

Cloud System: Amazon RDS Instance

Database: Oracle, MySQL, MongoDB and Redshift

Integration Tool: Talend Data Integration (ETL)

Script / Language: (Linux/Unix) Bash/shell script, Perl, COBOL, Pig, Java; EBCDIC/copybook, mainframe IBM z/OS

Tool Data: DbVisualizer

Automata/Automation:

Wrote shell/Perl scripts to create daemons for automating level-wise data loading from different datasets and source systems to the destination.

Wrote shell/Perl scripts to read files from FTP, S3, HTTP, and other servers and load the data into the database.

Analyzed the data and understood the conceptual, logical & physical data models of the dataset sources.

Gathered data from different end sources using curl, wget, and xmlstarlet.

Understood the URL and UML structure.

Analyzed text on the Linux command line.

Database Management System:

Analyzed requirements for the given datasets across different data sources of structured, unstructured, and semi-structured data with defined and undefined schemas.

Reviewed dataset accounting and processing methods, ensuring that applications and data assets ran efficiently and effectively.

Determined keys, identified undefined schema structures, defined the structure, and processed/computed the data.

Organized data integration and migration activities to enhance operating efficiency.

Designed tables, indexes, views, primary keys, reference keys, and other logical DBMS/RDBMS components to implement database designs for business requirements.

Wrote SQL statements to validate loaded datasets.

Developed queries per business requirements.

Analyzed query performance, checked the data, and tuned queries to improve data retrieval from the DBMS.

Wrote & maintained procedure-oriented programs using procedures, functions & triggers with control statements per the business logic.

Identified performance issues and converted character-based computation to numeric-based computation to speed up query processing.

Understood Pig script processing/computation and designed/wrote stored procedures achieving the same functionality in the RDBMS.

Compared case studies of script-based versus database computation.
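
The table, key, and index design work above can be illustrated with a minimal DDL sketch (generic ANSI-style SQL; all names are hypothetical, not the client's schema):

```sql
CREATE TABLE company (
  company_id INT         PRIMARY KEY,
  ticker     VARCHAR(10) NOT NULL UNIQUE
);

CREATE TABLE filing (
  filing_id  INT  PRIMARY KEY,
  company_id INT  NOT NULL REFERENCES company (company_id),
  filed_on   DATE NOT NULL
);

-- Index the foreign key plus the common date filter.
CREATE INDEX idx_filing_company_date ON filing (company_id, filed_on);

-- Post-load validation: row counts per company per year.
SELECT company_id,
       EXTRACT(YEAR FROM filed_on) AS yr,
       COUNT(*)                    AS row_cnt
FROM   filing
GROUP  BY company_id, EXTRACT(YEAR FROM filed_on);
```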

Public Cloud Computing:

Create and maintain Amazon RDS instances (Oracle & MySQL) using the AWS console, Amazon API, or CLI.

Create and maintain Amazon EC2 instances using the AWS console, Amazon API, or CLI.

Work with Amazon SQS (Simple Queue Service) and manage workflow processes using shell scripts.

Create and define Amazon VPC networks using the AWS console, Amazon API, or CLI.

Create access keys for EC2 instances and access the systems.

Type of data sources handled: JSON, XML, CSV, tab-delimited and flat files.

Type of data source systems handled: Oracle, MySQL, Sybase, Postgres, MSSQL.

Type of data source systems handled (NoSQL): MongoDB.

Data Conversion & computation:

Understood COBOL EBCDIC data files and copybooks and converted EBCDIC data to ASCII/CSV/tab-delimited formats.

Converted COBOL EBCDIC files to ASCII and loaded the data into the Oracle database.

Analyzed text data using Linux shell scripts with text-processing tools.

Data Integration:

Worked on Talend Data Integration processing, transforming data between Oracle, MySQL, and MSSQL.

Developed jobs in the Talend Data Integration tool.

Loaded data from different source ends (FTP server, web service, Excel files, source databases) to destination ends (databases and service endpoints).

Integrated different source ends with destination ends using the Talend tool.

Wrote the necessary stored routines, PL/SQL, SQL, and (Linux/Unix) shell scripts.

Wrote Talend jobs for data processing from the Oracle target DB to the MySQL database.

Configure system:

Configured the control DB and target DB using the Xpressfeed application server on a Linux engine.

Configured the control DB using Oracle Advanced Queuing in the Xpressfeed application server.

Configured the target DB using the Xpressfeed application server.

Data Validate :

Wrote Python automation to validate level-wise data loading from different datasets and source systems.

Compared and tested data values using shell/Perl/Python.

Computation Processes:

Determined and interpreted the computation process to compare the performance of character-based versus numeric-based data computation.

Compared numeric and character computation; analyzed the computation and checked the performance of symbolic data processing during loading.

Analyzed and understood the system's core computation patterns at the data-processing end.

Capacity Planning :

Understood the business lifetime and planned capacity sizing for storage, memory, and computation (CPU).

Understood the business requirements and planned database size and computational capacity (in hertz) based on the data.

Client : American Tours International, US [ Jan 2012 – Nov 2012 ] Travel and Tourism Database Analysts

Description: AmericanTours International takes the concept of destination management to a whole new level. As the only American-owned inbound tour operator in North America, ATI is not just handling travel arrangements but welcoming guests to its home, opening North America to first-time visitors and returning guests alike.

•Modify column design using Erwin.

•Reverse Engineering Physical and Logical of the data model using Erwin.

•Review and model the data as per the input given by analysts with spec using Erwin tool.

Report Design:

Design the Variable, Parameter, Field, and Expression on The Report.

Design the Datasets and Sub datasets

Working with main and sub reports using SQL Query with parent and child query.

Creating the queries and define the parameter as input to report.

Configure and design the report according to Content width and height with representation of data arrangement.

Working with tables and charts to design report charts such as pie, XY line, bar, and linear charts.

Publish report to BI Server and testing the report.

Configure Input parameter on report.

Deploy the report to production server.

Upgrading Database:

Upgraded the database from Oracle 8i to 11g using the traditional exp/imp method.

Updated the metadata from the 8i system to the 11g system.

Operating system: Windows 2008 & Linux (Red Hat)

Database: Oracle, Sybase, T-sql programing

Script / Language: Shell /Perl Script Programing

Tool Data: DbVisualizer Erwin (CA technology)

Reporting Tool: iReport

Service Server: JasperReports Server (BI server - Apache Tomcat)

Query Design :

•Migrated SQR reports to JasperReports.

oAnalyzed and understood the developed SQR reports.

oDesigned SQL queries per each SQR report.

oDeveloped the Jasper reports per the SQR reports.

oCross-checked between the SQR reports and Jasper reports, validating and verifying the data.

•Modified the database structure, as necessary, from information given by application developers.

•Developed queries, checked their performance, and modified existing queries in reports.

•Wrote & maintained procedure-oriented programs using procedures, functions & triggers with control statements.

•Managed and maintained the Sybase database for travel-and-tourism package management.

Client : San Diego Visitors Bureau, US [ July 2011 – Dec 2011 ] Travel and Tourism Database Development

Description: The San Diego Tourism Authority is a private, non-profit corporation composed of approximately 1,100 member organizations, businesses, local governments, and individuals seeking a better community through the visitor industry. SDTA members include tourism-related entities in such categories as lodging, dining, arts & attractions, shopping, and transportation, among others, as well as other companies indirectly involved in the visitor industry.

Operating system:Windows, Linux

Database: Oracle 11g, Mysql 5.0, PostgreSQL 8.0

Script / Language: Shell Script Programing

Tool Data : Navigator

Development: SQL, PL/SQL (Oracle); SQL, PL/pgSQL (PostgreSQL)

•Database replication in Distribution Database system (ORACLE 11g STREAM Setup).

•Enrolling users and maintaining system security.

•Controlling and monitoring user access to the database.

•Data conversion, Data migration and Data transformation from PostgreSQL Database to ORACLE Database.

Personal Detail:

Name : Sriram Vassuthev.S

Father’s Name : SUBRAMANI.K

Date of Birth : 24-09-1988.

Sex : Male.

Marital Status : Single.

Nationality : Indian.

Language Known : English, Tamil

Place: Chennai,India

Date:

Sriram Vassuthev.S


