
Data Developer

Location:
Phoenix, AZ
Posted:
April 04, 2020


Ushadevi Chandrasekar    Mob: 480-***-****

****, **** ***** **** ****** Road, Chandler, AZ 85248, United States

E-mail: adcncd@r.postjobfree.com

Work Authorization: Authorized to work for any US employer

PROFESSIONAL SNAPSHOTS

11+ years of experience in Oracle SQL and PL/SQL development, data modeling in Power Designer, and web services testing via SoapUI in the Investment Banking, BFSI and Oil & Gas domains. I worked as an Associate at Deutsche Bank Operations International (DBOI), Bangalore, India; prior to that, I held the post of Module Lead at Wipro Technologies.

Extensively worked on Oracle SQL, PL/SQL, data modeling and web services.

Performed SQL tuning, indexing, partitioning and XML file processing.

Developed database objects including tables, indexes, views, sequences, packages, synonyms, triggers, procedures, partitions, functions and collections.

Partitioning and purging of existing tables, BLOB cleanup and space reclamation (see the sketch after this list).

Shell scripting, Control-M scheduling, etc.

Handled a technical team.
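
As an illustration of the BLOB cleanup and space reclamation mentioned above, a minimal sketch in Oracle SQL, assuming a hypothetical DOCUMENTS table with a PAYLOAD BLOB column and a 90-day retention period:

-- Hypothetical table, column and retention period.
DELETE FROM documents WHERE created_dt < SYSDATE - 90;
COMMIT;

-- Reclaim the space freed in the LOB segment.
ALTER TABLE documents MODIFY LOB (payload) (SHRINK SPACE);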

PROFESSIONAL SUMMARY

Working knowledge of Agile methodology, the software development life cycle, system study, design, documentation, Oracle database architecture design, development and implementation.

Areas of expertise include Oracle SQL and PL/SQL performance tuning, along with good UNIX shell scripting.

Worked in the Oil and Gas domain (Consumer Goods).

Worked in Banking (Securities and Capital Markets - Investment).

Performed performance tuning at the DBA level and at the SQL level as per requirements.

Built the entire database setup for a new application (Oracle 12c) and created the Development, Test and QA environments for the application.

Hands-on knowledge of data modeling and data architecture.

Developed scripts for anticipated issues (such as sequence generation for a migration project).

Developed SQL scripts to verify data validations, NULL values and master data (a sketch follows this list).

Developed scripts to check the integrity of the database schema against the requirements document.

Automated data purging as per the rules configured in the system.

Deployed backend code in non-production environments and reduced deployment time through statistics refinement, tuning, etc.

Data loading and report generation as and when required.

Wrote remediation scripts for cleanup, a scheduled job to clean BLOB data as per the defined retention period, and downstream XML file generation.

Scheduled file processing as per the defined maximum row limit, etc.

New field integration, including design approval via Power Designer, web service changes, field value storage and retrieval, replication logic, and static data approval via Autobahn, etc.

Implemented bulk file logic to process bulk data.

Data validations on input/output data.
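
A minimal sketch of the data-validation scripts described above, assuming a hypothetical CUSTOMER table and a COUNTRY master table:

-- Rows with missing mandatory values (hypothetical columns).
SELECT customer_id
  FROM customer
 WHERE first_name IS NULL OR country_cd IS NULL;

-- Rows whose country code is not present in the master data.
SELECT c.customer_id, c.country_cd
  FROM customer c
 WHERE NOT EXISTS (SELECT 1 FROM country m WHERE m.country_cd = c.country_cd);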

ROLES & RESPONSIBILITIES

Worked as a database architect, DBA and senior developer.

Handled the technical team.

Understood requirements and estimated effort for CRs and issues.

Performed impact analysis of new changes or amendments to functionality.

Code review as per coding standards.

Responsible for creating Test Data for application.

Performance tuning via explain plans, statistics and trace files (see the sketch after this list).

JSON file generation using an index.

Scheduled jobs to process bulk files and reclaim LOB space.

Integration of new fields and bulk files.
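
As an illustration of the explain-plan based tuning mentioned above, a minimal sketch assuming a hypothetical CLIENT table in an APP schema:

-- Generate and read an execution plan for a problem query.
EXPLAIN PLAN FOR
  SELECT * FROM client WHERE client_ref = :ref;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Refresh optimizer statistics when the plan looks stale.
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'APP', tabname => 'CLIENT');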

SPECIALITY

Oracle PL/SQL programming, design, development, modeling and web services.

TECHNICAL SKILLS

Programming Languages: Oracle SQL and PL/SQL

Database: Oracle 9i, Oracle 10g, Oracle 11g, Oracle 12c

Database (Oracle) Utilities: Export and Import

Operating Systems: Windows 2000/XP, Linux (CentOS 5)

Tools Used: Toad, PL/SQL Developer, SQL Developer

Domain Knowledge: Banking operations, Oil and Gas, Investment banking

ORGANISATIONAL EXPERIENCE

Company | Position | Period | Location
DBOI Global Services Private Limited | Associate | Dec-2017 to Nov-2019 | Bangalore
Wipro Technologies | Module Lead | Dec-2012 to Dec-2017 | Bangalore
HCL Technologies | Sr. Software Engineer | Oct-2007 to Nov-2012 | Bangalore

QUALIFICATION

Qualification/Certification | Year-Percentage | University/Institute | Location
B.Tech (Information Technology) | 2004, passed out with 84.33% | Adhi Parasakthi Engineering College, Madras University | Tamil Nadu, India
M.E (CSE) | 2007, passed out with 85% (University Rank Holder and College Gold Medal) | SSN College of Engineering, Anna University | Tamil Nadu, India

PROJECT DETAILS

DBOI GLOBAL SERVICES PRIVATE LIMITED

DBCAR-CRDS Application (Dec-17 to Nov-19):

The CRDS database is the golden source of client reference data. It provides a consistent, integral and unified view of the reference (master) data across Deutsche Bank's internal applications and systems, and keeps the reference data in internal systems up to date and intact by providing regular updates on changes. Its purpose is to create a single view of clients across the DB group.

CRDS being the golden source means:

CRDS is the one source for party data for all DB systems.

Business relationship data and account data are stored in CRDS along with party data. They are joined in a flexible data model, which allows uniform representation of different data patterns from multiple DB systems.

Only the verified part of the party data is stored in CRDS via its input interfaces. Several verification methods are used (e.g. four-eye checks, validation rules execution).

External customer systems use CRDS data via output interfaces, which ensures data consistency across DB systems.

Technologies: Oracle 12c – Oracle SQL, PL/SQL and Power Designer

Roles and Responsibilities:

Data modeling and PDM changes using Power Designer.

Development using PL/SQL Developer and testing as per agile methodology.

Web service changes using Java and testing via SoapUI.

Bulk file processing implementation/remediation, if any.

Bug fixes, if any, in the SIT/UAT phase.

Non-production deployment activities.

WIPRO TECHNOLOGIES

I) CBEC-GST (Sep-2016 to Dec-2017)

India is aiming for a major reform of its indirect tax system, aimed at reducing the complexity of the prevailing indirect taxes and removing the cascading effect of the federal taxes by introducing the Goods and Services Tax (GST), which will subsume all major prevailing indirect taxes including VAT, Central Excise, Service Tax and Central Sales Tax (CST). GST will be a dual tax levied by both Central and State administrations on the same base/transaction for the supply of all goods and services (except for the ones that will be outside the purview of GST) and will allow Input Tax Credit (ITC) across the supply chain.

Technologies: Oracle 12c – Oracle SQL & PL/SQL.

Roles and Responsibilities:

Data Modeling

Working on logical and relational modeling using the SQL Developer tool.

Generating DDL scripts from the relational model and deploying them in the database.

Data Migration from different sources.

DBA and PL/SQL developer tasks.

Bug fixes.

Deployment Tasks

II) GMAS (Jan-15 to Sep-16)

GMAS is a business process and IT solution to facilitate effective, consistent assessment and assurance of the risks associated with shipping and maritime business activities, and assists Shell businesses in meeting the requirements of the Control Framework - Transport Manual, Maritime Safety (TM-MS). This allows a business requestor to check asset suitability before use. Checks are risk-based and include the vessel, contract type, type of activity, port, berth and supply bases as appropriate. Suitability for use has been automated as much as possible, and the system provides for submission of requests to assessors (maritime professionals) for an assurance decision as needed. Importantly, the system records and documents the decision for use.

SOR: As data in the system grows year after year, there is a need to implement a disposal/purge plan at the record level based on country rules and regulations. For this purpose we partitioned (range and reference) the GMAS tables (80+) into different partitions so that the data resides in them and a partition can be dropped when required.
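
A minimal sketch of the range/reference partitioning and partition-drop approach described above, using hypothetical table, column and partition names:

-- Parent table partitioned by range on the creation date.
CREATE TABLE assessment (
  assessment_id  NUMBER PRIMARY KEY,
  country_cd     VARCHAR2(3),
  created_dt     DATE NOT NULL
)
PARTITION BY RANGE (created_dt) (
  PARTITION p2015 VALUES LESS THAN (DATE '2016-01-01'),
  PARTITION p2016 VALUES LESS THAN (DATE '2017-01-01')
);

-- Child table partitioned by reference, so its rows are purged together with the parent.
CREATE TABLE assessment_detail (
  detail_id      NUMBER PRIMARY KEY,
  assessment_id  NUMBER NOT NULL,
  CONSTRAINT fk_assessment FOREIGN KEY (assessment_id) REFERENCES assessment (assessment_id)
)
PARTITION BY REFERENCE (fk_assessment);

-- Dropping a partition disposes of the records once retention rules allow it.
ALTER TABLE assessment DROP PARTITION p2015 UPDATE GLOBAL INDEXES;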

Technologies: Oracle 11g – Oracle SQL & PL/SQL.

Roles and Responsibilities:

Work on sustain/enhancement items, which are tracked through the SM9 tool.

Supporting production releases.

Make sure that DDL changes are reflected in the data model (SA).

For any new sustain item, implement the System of Record (SOR) logic.

Performance tuning, including compression, de-normalization, modifying existing joins, etc.

Identify repeated issues, perform operations-based root cause analysis (RCA) and complete documentation of problems and solutions.

Preparing document for all enhancement/Sustain items and uploading in SharePoint once UAT sign-off is done.

Give KT to support team/new joiners before production deployment.

Emergency releases, if any.

Give KT on GMAS database objects and SOR end to end logic to new joiners.

Guide database team members to achieve their goals and objectives, and support the team in delivering the desired results.

Conduct weekly offshore meetings/activities on a round-robin basis.

Update the process document whenever there is any DB-related change.

III) CORNING (Apr-14 to Sep-14)

Installation of different components of the CyberArk product in test, pre-production and production environments, and analysis and troubleshooting of issues. Implementation and configuration of the solution in the client's test, development and production environments. Active Directory integration with CyberArk. Password management of privileged accounts (Change, Verify, Reconcile). Creation of all necessary project documents required for implementation or any change request, such as the installation guide, functional requirements, test scenarios, SOPs, etc. Involved in product monitoring, training, reporting and support.

Analysis of multiple privileged account access within the organization and onboarding of the corresponding privileged accounts in CyberArk.

Technologies: Privileged Password Management (PPM) - CyberArk.

Roles and Responsibilities:

Implemented the CyberArk PIM 8.6 solution for a leading US manufacturing organization, managing privileged accounts in their environment for different servers such as Windows and UNIX, and integrated it with tools such as WinSCP, vSphere and VNC.

IV) Credit Initiation - Singapore: (Dec-12 to Sep-13)

The process of collecting credit card related information from source systems, which process raw data from different regions (Poland, Hungary, Australia, Singapore, AE, etc.), validating the status of issued cards, tracking spouse and dependent card details, and sending the data to the CI front-end team for functional-logic validation and report generation. Included migration from COBOL to PL/SQL for region-based upload and download report generation.

Technologies: Oracle 11g - PL/SQL, Shell Scripting, SQL Loader, COBOL.

Roles and Responsibilities:

Integration of new flat files using SQL*Loader and output report generation as per the business requirements (a SQL*Loader sketch follows this project section).

Migration of COBOL to PL/SQL for upload and download report generation based on regions.

Handled ad-hoc requests (report history generation) and helped the production team resolve critical fixes.

Performance tuning when a job takes longer than expected.

Interacted with customers to understand the business requirements, gave updates on all open items being worked on, and prioritized tasks.

Led and helped team members whenever needed, and made sure that all changes were on track and delivered on or before the expected timelines.

Tools used: UMR & Quality Center
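
A minimal sketch of the flat-file integration via SQL*Loader mentioned above, with a hypothetical feed file, staging table and column layout:

-- cards.ctl (hypothetical control file; invoked e.g. as: sqlldr userid=app control=cards.ctl log=cards.log)
LOAD DATA
INFILE 'cards_sg.dat'
APPEND INTO TABLE stg_card_feed
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  card_no,
  holder_name,
  card_status,
  issue_dt DATE "DD-MON-YYYY"
)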

HCL TECHNOLOGIES

I) RMIT Project - CTB: (Jan-12 to Nov-12)

The process of collecting holdings information from the Global Data Hub (GDH) application, which processes raw data from different regions (Asia Pacific, United States, Europe) and sends it to Risk Metrics for calculation of risk by portfolio managers for decision making and report generation.

It calculates average rating from different rating systems (Moody, Fitch and S&P ratings) etc.

It involves four stages:

1) Data preparation, 2) Data push, 3) Feed generation, 4) Receive reports from Risk Metrics and feed them into the RDM database.

Stage 1: It extracts the required global holdings data from GDH into the Risk Metrics Group Frankfurt (RMGF) schema on the GDH side.

Stage 2: Using the Informatica workflow concept, it pushes data from the GDH schema to the risk data mart database for risk calculation.

Stage 3: Once the calculation is done, it generates a feed file in XML format (since this data is viewed in a web-based application) and sends it to systems such as RM, BARRA and RC Banken (a sketch of XML feed generation follows the stage list).

Stage 4: The downstream generates 3-4 thousand reports based on a metadata file and sends them back into the RDM database for history maintenance. Users rely on this report data to make decisions based on previous transactions/market movement.
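
A minimal sketch of generating an XML feed from holdings data in Oracle, assuming a hypothetical HOLDINGS table and element names:

SELECT XMLSERIALIZE(DOCUMENT
         XMLELEMENT("holdings",
           XMLAGG(
             XMLELEMENT("holding",
               XMLFOREST(h.portfolio_id AS "portfolioId",
                         h.asset_id     AS "assetId",
                         h.market_value AS "marketValue"))))
         AS CLOB INDENT) AS feed_xml
  FROM holdings h;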

Technologies: Oracle 11g - PL/SQL, Shell Scripting, Control-M and Informatica.

Roles and Responsibilities:

Taking ownership of all major development, such as building a new interface for Liquidity, creating new instrument feeds, and building extract feeds that meet the requirements.

Interacting with European customers to understand the business requirements, giving updates on all open items being worked on, and prioritizing tasks.

Handling requirements such as price formula changes and source table changes, i.e. executing change requests, preparing test cases and raising tickets for release management.

Extraction of Stock, Cash and FX contracts for Booked and Disposed holdings from source systems by applying filter logic changes (very rare), adding new instrument-type business logic in the view and feed file companion, and getting user sign-off for the same.

Migration of the source system, portfolio handling, and currency conversion with the help of FX rates and weight calculation at share-class level.

Handling of ad-hoc requests (Seed cap weekly and monthly loads), EC holdings, index history loads, etc.

Leading and helping team members whenever needed, and making sure that all changes are on track and delivered on or before the expected timelines.

Performance tuning when a job takes more than the expected time (around a week).

II) IDMS Application (AMAS - Asset Management Application Systems): (Sep-08 to Dec-11)

The Index Data Maintenance System (IDMS) is used to collect index data from many suppliers. The data is loaded into a common structure and validated against the GDH. The Data Management team has a web-based application to view and clean the data if any discrepancies are found. Once the validation is successful, the information can be distributed to several downstream business systems.

IDMS has one central index database, as the data is identical across regions and countries. IDMS loads most data provided by a vendor. Vendor files are broken down and inserted into a common structure. This allows generic validation and extraction processes to run against the core data. All stages of a vendor data load are written back to a central log table.

Output: IDMS provides the index & constituents data to the target systems in the form of flat files.

Technologies: Oracle 11g (9i) - PL/SQL, Shell Scripting, Control-M.

Roles and Responsibilities:

Integrated new downstream/upstream system to send/receive new Benchmark data in single/multi-index format.

Integration of new series, indices, and files under existing series in the oracle application database called IDMS (Index Data Maintenance Systems).

Coding, Testing and rework in non-production environment and release it into production through Remedy tool.

Writing/modifying shell scripts for performance tuning and for splitting/merging vendor files, and writing a few new shell scripts to build files according to new downstream specifications.

Created a few Control-M scheduling tables and sets of jobs under them to extract and deliver files to downstream systems within the specified time.

Developed a piece of code to generate the daily incident reports and send them to the managers and all team members to notify them of the current status of all open issues in Remedy.

I was part of the database migration testing (from Oracle 9i to Oracle 11g), the UNIX-to-Solaris server migration, and the Autosys-to-Control-M job migration for the IDMS application from May 2010 to Sep 2010.

Performance tuning during the migration from Oracle 9i to Oracle 11g (around six months) and as and when required.

Once a quarter, we perform partition/purging activities for a set of IDMS core tables.

Worked on the below value-adds in the Index Data Maintenance System to stabilize the application:

Wrote a new procedure to sync the production sequence value with that of non-production if there is any mismatch due to manual entry (see the sketch after this list).

Wrote a generic procedure to map assets that come via MAP files automatically (to reduce the manual effort involved in asset mapping and release management).

Wrote a new procedure to set up any existing single-index downstream for new series integration.

Bloomberg change: Implemented three pieces of logic to automate index addition/removal in the request file based on a master table, to remove the hard-coded series_cd from the put-file shell script, and to fetch the vendor file availability time from an Oracle table instead of from the request file.
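
A minimal sketch of the sequence-sync idea described above (hypothetical procedure and parameter names; the target value would be read from the reference environment):

-- Bump a sequence up to a target value by temporarily widening its increment.
CREATE OR REPLACE PROCEDURE sync_sequence(p_seq_name IN VARCHAR2,
                                          p_target   IN NUMBER) IS
  v_current NUMBER;
BEGIN
  EXECUTE IMMEDIATE 'SELECT ' || p_seq_name || '.NEXTVAL FROM dual' INTO v_current;
  IF v_current < p_target THEN
    EXECUTE IMMEDIATE 'ALTER SEQUENCE ' || p_seq_name ||
                      ' INCREMENT BY ' || (p_target - v_current);
    EXECUTE IMMEDIATE 'SELECT ' || p_seq_name || '.NEXTVAL FROM dual' INTO v_current;
    EXECUTE IMMEDIATE 'ALTER SEQUENCE ' || p_seq_name || ' INCREMENT BY 1';
  END IF;
END sync_sequence;
/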

III) CEMS - TPRAM Applications (AMAS Project): (Jan-08 to Aug-08)

The Credit Exposure Monitoring System (CEMS) and TPRAM are used to monitor and report credit exposure by brokers, issuers and settlement date.

Both applications receive data from external feeds, GDH and some other internal feeds.

They produce daily, monthly and ad-hoc reports. Actuate reports can be viewed and run through the Actuate portal.

Technical Overview: Database: Oracle; Reporting Tool: Actuate Report Designer Professional; Scheduler: Control-M and VB script.

Technologies: Oracle 10g - PL/SQL, Shell Scripting, Control-M and Actuate.

Roles and Responsibilities:

Gathering requirements from users for new report development and preparing Technical/Functional Specification document to get initial sign off from requester.

Designing the Actuate report front end using sample screens from the BRD document, and linking output parameters to database tables and procedures.

Writing new and modifying existing procedures, functions and packages to fetch data from the GDH system and dump all required data into INTRPT schema temporary tables to increase report retrieval speed.

Resolving daily production incident report issues and sending the report to the managers and all team members to notify them of the current status of all open issues in CEMS and TPRAM.

Testing developed reports in the UAT portal to check all functionality, reworking them if required, and handling release management after UAT sign-off.

IV) HCL Training Detail: (Oct 07- Dec 07)

I was trained in Oracle (SQL, PL/SQL, and Forms and Reports) from 09-Oct-2007 to 09-Dec-2007 at HCL Technologies.

Developed “Customer Support” project at the end of HCL training.

Roles and Responsibilities:

Database back-end design.

Table creations, Coding in PL/SQL to record and process customer complaints, and update the status of the recorded incidents.

V) Work exp & Java Training Detail: (Aug 04- Aug 05)

From August 2004 to May 2005 (10 months), I worked as a Lecturer at Adhi Parasakthi Engineering College, and from June to August 2005 I joined NIIT to take a Java course and completed the Java examination successfully.

Personal Skills and Strength:

A professional drive to break into new technologies, with self-motivation

A focus on working with the latest market trends

Excellent interpersonal and communication skills

Excellent analytic ability, strong sense of urgency & logical thinking

Professional Trainings & Certifications:

Attended trainings on SQL and PL/SQL programming.

Completed Java and J2EE training and exams at NIIT, Nungambakkam, Chennai.

Completed InfoSphere Guardium training and certification successfully, Bangalore.

Completed CyberArk training and certification successfully, Bangalore.

Agile/BDD Training in DBOI.

SCJP Certified.

OCA Certified.

Paper Publications

“Integrated System for Image Retrieval”, presented at Periyar Maniammai College of Engineering, Tanjore, Tamil Nadu, April 2007.

“Content Based Image Retrieval”, presented at Amirtha College of Engineering, Coimbatore, Tamil Nadu, May 2007.

PERSONAL DETAIL

Gender : Female

Nationality : Indian

Passport Details : No: Z3229994 valid till 10/05/2025

Visa : L2 with EAD

Declaration:

I hereby declare that the information provided above is true to the best of my knowledge and belief.

[Ushadevi Chandrasekar]



