SENIOR INFORMATICA ENGINEER
Summary:
* Around Thirteen years of experience in the IT industry with Eleven years
of progressive hands-on experience in Data warehousing and ETL processes
using Informatica.
* Exposure to Informatica B2B Data Exchange, which supports an expanding
  diversity of customers and partners and their data with capabilities that
  surpass the usual B2B solutions
* Worked with the B2B Operation Console to create partners and configure
  partner management, event monitors, and events
* Exposure to Informatica B2B Data Transformation that supports
transformation of structured, unstructured, and semi-structured data
types while complying with the multiple standards which govern the data
formats.
* Used Data Transformation Studio 4.0 to create scripts that convert complex
  data formats into standard ones, then integrated them with Informatica's
  PowerCenter platform using the UDO transformation.
* Defined JMS queues managed by ActiveMQ, which serve as the pipeline for
  the data between DX and PowerCenter.
* Defined the new queues in QueueAdmin.properties in DX, which are then
updated in the JNDI bindings file.
* Designed and developed efficient error-handling methods and implemented
  them throughout the mappings in various projects.
* Analytical and Technical aptitude with ability to work in a fast paced,
challenging environment and keen to learn new technologies.
* Proficient in understanding business processes/requirements and
translating them into technical requirement specifications.
* Excellent interpersonal and communication skills, technically competent
and result-oriented with problem solving skills and ability to work
effectively as a team member as well as independently.
* Developed effective working relationships with client team to understand
support requirements, develop tactical and strategic plans to implement
technology solutions and effectively manage client expectations.
* Very strong mix of academic and professional experience in the related
  areas, demonstrated through the implementation of successful projects and
  diverse roles; copes well in a high-intensity work environment.
Technical Skills:
Databases           Teradata, Oracle 11g/10g/9i/8i, MS Access 2000, MS SQL
                    Server 7.0, DB2
ETL Tools           Informatica B2B Data Exchange, B2B Data Transformation,
                    PowerCenter 8.x.x/7.1.3/6.2
Data Modeling       Star-schema modeling, Snowflake modeling
Languages           PL/SQL, HTML, DHTML, XML, Java
Tools and Servers   Toad 9.6.1.1, Data Transformation Studio 4.0, MS Access
                    Reports, Microsoft Visual InterDev 6.0/1.0, MS FrontPage
                    98, SQL*Plus, SQL*Loader, FTP
Operating Systems   Windows 95/98/NT/2000/XP, Linux, UNIX
Cisco Systems Inc, CA
August '13 - Till Date
EDW: ELT to ETL Conversion Track
This is an architectural change to utilize the ETL capability of Informatica,
thereby reducing the load on the Teradata servers and saving CPU time. The
existing jobs are modified to use an ETL approach instead of pushdown
optimization (PDO), utilizing the Informatica server to perform the
transformations.
Responsibilities:
. Analyzed and identified the tables across the EDW, based on complexity and
  data volume, to shortlist the ETLs eligible for conversion from ELT to ETL
  logic (using the complexity calculator document).
. Analyzed the current code to find where the ELT logic (Teradata queries)
  and pushdown optimization are used in the Informatica mappings and
  workflows.
. Created the design document, including the high level design flow
diagrams.
. Implemented the new ETL logic to convert the existing ELT logic; made use
  of the dynamic and persistent properties of the Lookup transformation,
  wherever applicable, for building the Informatica caches.
. Implemented the native MERGE INTO feature of Informatica 9.5.1 for the SCD
  Type-1 logic for better performance (see the sketch after this list).
. Parameterized all the session-level relational and loader connections as
  well as the file names and file paths. Defined the entries for these in
  the job control table; they are used by the script that invokes the
  Informatica workflow.
. Enabled concurrent execution (with the unique instance name property at
  the workflow level) for the reporting workflows to load the reporting
  tables concurrently into two different databases (PROD1 and PROD2).
. The scripts that invoke the Informatica workflows are defined in the $U
  (scheduling tool) jobs.
. Extended support for QA testing, created the deployment documents, and
  ensured successful code deployment in the Prod environment.
. Extended QA and PROD support for the daily cycles to fix issues, if any.
. Identified bugs in the existing logic during analysis, coordinated with
  the SME and Operations teams, and raised CRs accordingly in the defect
  tracking tool.
. Tracked all issues encountered in HP Quality Center for each phase/release
  and identified lessons learned to improve code quality in the next
  phases/releases.
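The following is a minimal sketch of the SCD Type-1 merge pattern referenced
above, expressed as a standalone Teradata MERGE run through BTEQ from a shell
script. The logon details and the table and column names (EDW_TGT.CUSTOMER_D,
EDW_STG.CUSTOMER_STG, CUST_ID, and so on) are hypothetical; in the actual
workflows the merge was performed by the Informatica 9.5.1 session, not by a
script.

    #!/bin/ksh
    # Hypothetical illustration: SCD Type-1 upsert expressed as a Teradata
    # MERGE, run via BTEQ. In the project this logic ran inside the
    # PowerCenter session.
    bteq <<'EOF'
    .LOGON tdserver/etl_user,etl_password;

    MERGE INTO EDW_TGT.CUSTOMER_D AS tgt
    USING EDW_STG.CUSTOMER_STG AS src
      ON (tgt.CUST_ID = src.CUST_ID)
    WHEN MATCHED THEN UPDATE SET
         CUST_NAME   = src.CUST_NAME,
         CUST_STATUS = src.CUST_STATUS,
         UPD_TS      = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
         (CUST_ID, CUST_NAME, CUST_STATUS, INS_TS, UPD_TS)
    VALUES
         (src.CUST_ID, src.CUST_NAME, src.CUST_STATUS,
          CURRENT_TIMESTAMP, CURRENT_TIMESTAMP);

    .LOGOFF;
    .QUIT;
    EOF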
Environment: Informatica Power Center v9.5.1, Oracle 11g, Teradata SQL
Assistant 14.0.1, SQL, PL-SQL, Microsoft Visio, Dollar U, Toad, Unix, Perl,
HP Quality Center, PVCS, Kintana deployment tool.
Kaiser Permanente, Pleasanton, CA
August' 11 - July' 13
Claims Data Warehouse
To provide a consistent industry standard view of claims data to all
regions for reporting, analytical and non-real-time operational functions.
Responsibilities:
. Analyzed requirements and converted the specifications into the technical
  design documentation.
. Created the data flow diagrams for the high-level design; analyzed the
  source data and identified the lineage of the data elements (as part of
  the detailed design) to load the staging and reporting tables.
. Worked on the Change Requests for the scheduled release time frames.
. Involved in design decision reviews for the implementation of the ETL
  process for the CA region from different versions of the Xcelys (Oracle
  DB) and GA Tapestry: Clarity (Teradata DB) source systems.
. Implemented regionalization logic in the same ETL code base even though
  the business rules differ for ROC (regions other than CA) versus the CA
  regions. Created and used a persistent lookup cache on a table
  (XL_CDW_RGN_IDENTIFIER) that must be looked up across the various sessions
  that load the staging tables, to determine the REGION_CODE (NC/SC) based
  on the SOURCE_SYS_CODE (CA/ROC).
. Implemented conditional logic at the workflow level, based on the business
  rules, to look up the persistent cache when processing multiple cycles per
  day in an incremental fashion, and to process the next day's cycle by
  dropping and recreating the persistent cache. (The table is truncated
  before starting the next day's first cycle.)
. Responsible for handling all the financial data extracts (General
Ledger, Accounts Payable, and so on) from source to stage and stage to
reporting tables.
. Implemented changes to fix the workflows per the CRs, e.g.: (a) converting
  from regionalization logic to shared logic (e.g., certain finance VENDOR
  tables); (b) converting from Type 1 to Type 2, followed by a historical
  fix to the existing data using either SQL scripts or one-time mappings
  (ETL).
. Example: While the source system has Vendor data for the CO and HI
  regions, the CDW has only the CO region.
  CR: A certain set of Vendor tables was identified to need shared logic
  across all regions; this requires historical data extraction exclusively
  for the HI region from the source up to the CR release date.
  Solution in Informatica: Created one-time mappings for all such applicable
  vendor tables to (a) remove the smart-code logic per the business rules,
  (b) use a Joiner transformation with a Master Outer Join on the Source
  (detail) and Stage (master) tables and insert into the stage table through
  a Filter transformation based on ISNULL(STG_TABLE_PK), and (c) flag such
  records so they can be identified easily.
. Identified and fixed the lookup transformations in certain mappings to
  look up only the incremental dataset, achieving better performance and
  reducing workflow run time.
. Upgraded and maintained the consistency of the CDW table structures and
  the ETL code for every version and release upgrade to the tables on the
  Xcelys source system side, and released to Prod on the planned dates.
. Created Informatica deployment groups and Service Requests (Remedy tool)
  to migrate the code to the QA/UAT/PROD environments.
. Created the parameter file based on the configurations
  (Regional_Subject_Area, Line_Nbr, Param_Line) set in the
  ETL_SESSION_PARAMETER table and defined all parameters at the workflow
  level (see the sketch after this list).
. Parameterized the mappings across the regions for loading the stage, dim,
  and fact tables.
. Enabled concurrent execution of the workflows to allow the parallel
execution of the same set of sessions pointing to different parameter
files to run cycles for each of the regions.
. Worked with the reusable transformations, mapplets, mappings, sessions,
worklets, workflows.
. Creating the Unit test case document with the test results and
screenshots.
. Managed the cycle loads in the QA and UAT environments on a daily basis to
  ensure that the data reconciles for counts and amounts, from source to
  staging and staging to reporting, for all the claims, finance, and shared
  (across all regions) tables.
. Worked on fixing emergency RFCs by identifying the issues and testing
  before promoting to Prod.
. Worked with HP Quality center tool to identify the assigned defects,
and update the progress.
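Below is a minimal sketch of how such a workflow parameter file can be
generated from the ETL_SESSION_PARAMETER table. The SQL*Plus connect string,
the exact column usage, and the KP_CDW folder and wf_LOAD_STG_CLAIMS workflow
names are hypothetical; they only illustrate the PowerCenter parameter-file
format that the configurations feed.

    #!/bin/ksh
    # Hypothetical sketch: build a region-specific Informatica parameter
    # file from the ETL_SESSION_PARAMETER table (Regional_Subject_Area /
    # Line_Nbr / Param_Line) before the workflow is started.
    REGION=$1                                   # e.g. NC or SC
    PARMFILE=/infa/parms/wf_LOAD_STG_CLAIMS_${REGION}.parm

    sqlplus -s etl_user/etl_password@cdwdb <<EOF > ${PARMFILE}
    set pagesize 0 feedback off heading off trimspool on
    select PARAM_LINE
      from ETL_SESSION_PARAMETER
     where REGIONAL_SUBJECT_AREA = 'CLAIMS_${REGION}'
     order by LINE_NBR;
    EOF

    # Each PARAM_LINE row is expected to already hold parameter-file
    # syntax, for example:
    #   [KP_CDW.WF:wf_LOAD_STG_CLAIMS]
    #   $DBConnection_SRC=XCELYS_NC
    #   $$REGION_CODE=NC
    echo "Generated ${PARMFILE}"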
Environment: Informatica Power Center v8.6.1, Oracle 11g, Teradata SQL
Assistant 14.0.1, SQL, PL-SQL, Microsoft Visio, Tivoli, SQL Developer,
Remedy tool (for creating SRs).
Bank of the West (BOW), San Ramon, CA
March 11 - July'11
Projects: RMS, TSYS Consumer
BOW developed an initiative to create a centralized DWH hub and the
necessary environment to support it (for regulatory compliance of Basel II
accord). A single repository of data to accommodate modeling, analytical,
reporting and management needs to satisfy BOW's Basel II regulatory
requirements.
Recovery Management System (RMS) manages the account recovery transactions.
TSYS Consumer Credit Card is maintained by Total System Services in
Columbus, GA. A consumer credit card account is an unsecured, revolving
line of credit with a pre-authorized open-ended dollar limit granted to
creditworthy customers.
Responsibilities:
. Derived the Technical specifications document, based on the Functional
specifications.
. Worked with B2B Data Transformation to split the COBOL source files
  (Account Master, Account History, Account Transactions, Customer Master,
  Score Master), and deployed the projects to the ServiceDB on the server.
. Used CM_console command to split the original file into multiple files,
using the DT service name.
. Worked with Informatica Power Center to load the data into various stages
till it gets loaded into the core db.
. Loaded the data into delimited flat files for providing the TSYS data to
the Financing team
. Wrote views in the Teradata database to handle the change data capture
  mechanism (see the sketch after this list).
. Used Pushdown Optimization technique in Informatica that enables faster
processing of the huge data loads
. Used MLOAD utility for the loader connections to load the cobol source
data into the Teradata database.
. Used Indirect file loading method to process the data catch ups for the
Recovery Management System
. Used parameter files widely at every layer of the data loads (SRC -> B2B
  -> STG -> WRK -> PRE-CORE -> CORE) and avoided any hardcoding in the
  PowerCenter workflows.
. Performed unit testing in DEV, prepared the migration checklist
  documentation, and provided QA/UAT and production support.
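Below is a minimal sketch of the kind of change-data-capture view referenced
above, created through BTEQ. The database, table, and column names
(BOW_STG/BOW_CORE.ACCT_MASTER, ACCT_NBR, ROW_CHECKSUM) and the logon details
are hypothetical and only illustrate the compare-staging-against-core
pattern.

    #!/bin/ksh
    # Hypothetical sketch: a CDC view that exposes only the new or changed
    # rows in staging relative to what has already been loaded into core.
    bteq <<'EOF'
    .LOGON tdserver/etl_user,etl_password;

    REPLACE VIEW BOW_WRK.V_ACCT_MASTER_CDC AS
    SELECT stg.*
    FROM   BOW_STG.ACCT_MASTER stg
    LEFT OUTER JOIN BOW_CORE.ACCT_MASTER core
           ON core.ACCT_NBR = stg.ACCT_NBR
    WHERE  core.ACCT_NBR IS NULL                      /* new accounts     */
       OR  core.ROW_CHECKSUM <> stg.ROW_CHECKSUM;     /* changed accounts */

    .LOGOFF;
    .QUIT;
    EOF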
Environment: B2B Data Transformation Studio 8.6.2, Informatica Power Center
v8.6.1, Teradata SQL Assistant 12, Windows 2000, ANSI SQL, UNIX Scripting,
PVCS
Cisco Systems Inc, San Jose, CA
August '10 - Feb '11
Project: TL-9000 Certification (re-joined the same project worked on
earlier)
As per the new version of the TL Handbook, the counting rules for one of the
metrics, SFQ (Software Fix Quality), changed to the extent that the whole
design and development had to be rebuilt.
Developer Role:
. Derived the SDS document based on the Functional specifications.
. Developing, maintaining, and promoting the database tables and index
creation scripts.
. Developing mapping documents from QDI source to different data warehouses
based on the requirements.
. Worked as an Informatica administrator to create the development folders
  and users, back up the repository, etc.
. Developing Informatica workflows and perform the unit testing for the
developed mappings.
. Filling release management templates.
. Integrating all the mappings, sessions and created workflows in the
application folder and used $U scheduling tool to set the dependencies
between the workflows.
. Migrating Informatica mappings, sessions and WFs from development
environment to beta and production.
Technical and Application Production Support:
. Monitoring, working on problem tickets resolution, being on call for
production support on a rotational basis.
. Providing ongoing support for service view applications at agreed service
levels.
. Analyzing source system data, and working with source team to resolve any
issues.
. Interacting with system administrators, DBAs and other teams to
troubleshoot application performance issues.
. Analyzing long-running SQL statements and tuning them using explain plans,
  required indexes, hints, etc. (see the sketch after this list).
. Gathering requirements and coordinating implementation of requested
  enhancements for the applications.
. Developing test plans and scripts, conducting testing, and working with
  business partners to conduct end-user acceptance testing.
. Ensuring that all production changes are processed according to release
management policies and procedures.
. Ensuring that appropriate levels of quality assurance have been met for
all new and existing applications / CRs.
. Ensuring that application changes are fully documented, supportable.
. Proactively identifying opportunities for change within the production
environment.
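Below is a minimal sketch of the explain-plan step used when analyzing a
long-running SQL statement, run through SQL*Plus from a shell script. The
connect string and the QDI_SHIPMENTS/SHIP_DT names are hypothetical
placeholders for whichever statement is being tuned.

    #!/bin/ksh
    # Hypothetical sketch: capture the Oracle execution plan for a slow
    # query, then review it for full scans, missing indexes, or poor join
    # orders.
    sqlplus -s support_user/support_password@qdidb <<'EOF'
    set linesize 200 pagesize 200

    explain plan for
    select *
      from QDI_SHIPMENTS
     where SHIP_DT >= trunc(sysdate) - 7;

    select * from table(dbms_xplan.display);
    EOF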
Environment: Informatica Power Center v8.6.1, Oracle 10g, Windows 2000,
Oracle PL/SQL, SQL Loader, UNIX Scripting, $U, Kintana package for the
production deployment, PVCS
VMware Inc, Palo Alto, CA
Jan '10 - July '10
Project: Everest BI
The Salesforce.com system is used by VMware's entire sales organization to
manage customers and potential customers and to track opportunities,
contacts, and activities. This project extracts the data from salesforce.com
into the EDW system (Keystone). The data is then loaded into the Sales BI
and Marketing data marts, which are the basis for the reporting.
Responsibilities
. Involved in the design and development of the Keystone Phase III to
extract the data from SFDC source system into the EDW database and then
to the Sales BI and the Marketing data marts.
. Developing the mapping documents and the ETL logic for loading into the
global data warehouse
. Worked with Salesforce UI, Apex explorer and Apex data loader.
. Creating Informatica mappings to populate the data into dimensions and
facts using various transformations.
. Handled the historical data loads into the Type 2 dimensions and facts,
  with the Type 1 dimensions and facts as the sources.
. Handled the logical deletes in the Type 1 and Type 2 dimension and fact
  tables.
. Implemented restartability in the Type 2 ETL logic by including a
  PROCESS_FLAG attribute in the Type 1 dim/fact table that is updated to 'Y'
  whenever the record is processed, and by choosing source-based commit at
  the session level. This attribute is reset to null in the Type 1 mapping
  whenever a source row passes through the UPDATE flow.
. Avoided hard coding at the session/workflow level; parameterized and
  defined the variables in a database table so that minimal or no changes
  are needed while promoting the code across environments such as
  QA/UAT/Production. An Informatica mapping then uses these values to
  generate the parameter file.
. Worked on the condensation of the historical data in the Siebel system
  (data that is not handled by the conversion process into the
  salesforce.com system) by identifying the proposed set of critical Siebel
  columns and then migrating them to the EDW database directly.
. Documented Informatica ETL design & Mappings and Prepared the Unit Test
cases documents.
. Created and monitored workflow tasks and sessions using Informatica power
center.
. Used the workflow tasks such as sessions, email, command, decision, event
wait.
. Identified the tracks where the performance was poor and worked to tune
such mappings and workflows.
. Used Informatica scheduler, to set the dependencies between the jobs.
. Involved in creating Unix Shell Scripts for tasks like ftp, moving and
archiving files.
. Created Unix shell scripts to call Informatica workflows using the pmcmd
  command (see the sketch after this list).
. Maintained an exclusive folder called Workflow Schedule in the
Informatica repository for scheduling the dependencies of the workflows,
and invoked each of the workflows from the command tasks.
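Below is a minimal sketch of the pmcmd wrapper pattern mentioned above. The
integration service, domain, folder, workflow, and parameter file names (and
the plain-text credentials) are hypothetical placeholders.

    #!/bin/ksh
    # Hypothetical sketch: start a PowerCenter workflow from a shell script
    # via pmcmd and fail the calling job if the workflow fails.
    INFA_SERVICE=IS_EDW_PROD
    INFA_DOMAIN=Domain_EDW
    INFA_FOLDER=EVEREST_BI
    WF_NAME=wf_LOAD_SALES_DM
    PARMFILE=/infa/parms/${WF_NAME}.parm

    pmcmd startworkflow \
      -sv ${INFA_SERVICE} -d ${INFA_DOMAIN} \
      -u etl_user -p etl_password \
      -f ${INFA_FOLDER} -paramfile ${PARMFILE} \
      -wait ${WF_NAME}

    RC=$?
    if [ ${RC} -ne 0 ]; then
      echo "Workflow ${WF_NAME} failed with return code ${RC}" >&2
      exit ${RC}
    fi
    echo "Workflow ${WF_NAME} completed successfully"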
Environment: Informatica Power Center v8.6.1,Salesforce.com UI, Apex
Explorer, Apex Data loader, Oracle 10g, Erwin, PVCS, Windows 2000, Oracle
PL/SQL, SQL Server 2000, UNIX Scripting
Aspen Marketing Services, San Diego, CA
April 09 - Dec 09
Aspen Marketing Services is the fifth-largest marketing services firm in the
United States and the nation's largest privately held marketing services
agency. The Aspen Portal is available to thousands of dealerships at
www.aspenmsportal.com.
Responsibilities
. Participated in the design and prepared the Technical documents based on
the Business Requirements Specifications.
. Worked on several applications in this automotive domain such as E-
Strike, Welcome Point, Smart touch 2.0., Enterprise reporting for ST2.0
customers.
. Handled the data feeds for General Motors (such as Integralink) and the
  Maritz feeds for Nissan and Infiniti of North America, to load the staging
  tables related to vehicle inventory, parts inventory, vehicle sales,
  service repair orders, customer layout, sales transactions, customer and
  email suppressions, and hand raisers.
. Worked as the B2B DX Operator to define and configure the Partners and
create the Profiles
. Responsible for the DX administrator tasks such as creating the
  application and template, defining JMS and JNDI connections, creating
  queues, and defining endpoints in the composite (.scdl) file.
. Used Data Transformation Studio 4.0 to simplify the complex data
transformations by visualizing the data and converting them into standard
ones.
. Worked with the new transformations of the Informatica 8.x such as
Unstructured data transformation, Java transformation, SQL transformation
. As a Flow Designer, developed the ETL mappings & Workflows using
Informatica Power Center
. Worked as the System Administrator to configure and administer Data
Exchange (DX) and PowerCenter
. As the PowerCenter administrator, participated in creating users and
  granting privileges.
. Responsible for creating ordinary and reusable Informatica folders and
  managing privileges for the users.
. Created users and groups in the Informatica Administration Console for the
  Informatica domain and repository.
. Defined the JMS queues managed by ActiveMQ which acts as the means for
the data that passes between DX and PowerCenter
. Responsible for bouncing and restarting the DX server, ActiveMQ, and
  PowerCenter as and when required.
. Configured the xeruntime.composite file to set the services (input
landing zone) for the flow templates.
. Extensive use of Command tasks, Decision tasks, email tasks in the
workflow design.
. Responsible in Migrating the Developed Components from Dev Environment to
QA and Prod Environment
. Created the unit test plan document and Involved in unit testing and load
testing
. Created ETL Mapping Specification templates and Unit testing templates.
. Used mapping variables and parameter files at the mapping level and
  workflow variables at the workflow level. When the requirements called for
  it, assigned the mapping variable to the workflow variable, or the other
  way around.
. Implemented HTML tags in the Unix shell script that sends mail and that is
  in turn invoked from the Informatica workflow (see the sketch after this
  list).
. Filled in templates for ETL object migration, Unix script migration, and
  database object migration, as well as scheduling request forms.
. Involved in identifying and creating reusable objects and in improving the
  efficiency, business logic, and standardization of the ETL code. Developed
  mapplets, used across various modules, that generate the Batch ID and
  implement the data validation process.
. Developed UNIX shell scripts to create indexes and truncate tables,
  invoked through pre- and post-session command tasks.
. Created and maintained the shell scripts and Parameter files in UNIX for
the proper execution of Informatica workflows in different environments.
. Created and maintained the Informatica Best Practices document as per the
Informatica Velocity rules.
. In the lead role, took responsibility for reviewing the work done by the
  team, ensuring the best performance of the mappings and workflows and that
  best practices were followed.
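Below is a minimal sketch of the HTML-mail technique referenced above, using
sendmail with MIME headers from the shell script that the workflow's command
task invokes. The recipient list, subject, and table contents are
hypothetical placeholders.

    #!/bin/ksh
    # Hypothetical sketch: send an HTML-formatted status mail from the shell
    # script that the Informatica workflow invokes through a command task.
    TO_LIST="etl-support@example.com"
    SUBJECT="Smart Touch 2.0 daily load status"
    FEED_NAME="Vehicle Sales"
    ROW_COUNT=${1:-0}

    /usr/lib/sendmail -t <<EOF
    To: ${TO_LIST}
    Subject: ${SUBJECT}
    MIME-Version: 1.0
    Content-Type: text/html

    <html>
      <body>
        <h3>Daily Load Status</h3>
        <table border="1">
          <tr><th>Feed</th><th>Rows Loaded</th><th>Status</th></tr>
          <tr><td>${FEED_NAME}</td><td>${ROW_COUNT}</td>
              <td><font color="green">SUCCESS</font></td></tr>
        </table>
      </body>
    </html>
    EOF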
Environment: Informatica B2B Data Exchange and Data Transformation (B2B DX-
DT), Informatica Power Center v8.6.1, Oracle 9i, System Architect,
Perforce, Unix scripting, Windows 2000, Oracle PL/SQL, SQL Server 2005
Cisco Systems Inc, San Jose, CA
Jan '07 - March '09
TL-9000 Project [DAVA (Deep Application Vulnerability Assessment) certified;
External Audit (Sept 2008) finding categories: Strengths - Positive comments
& Best Practice]
(The "Track Internal Assessments, Results & Actions" (TIARA) tool is used to
manage, report, and track the results of the audit events.)
TL 9000 is a globally recognized quality standard designed to improve
telecommunications products: hardware, software, and services. It improves
the Supplier (to Cisco) - Manufacturer (Cisco itself) - Customer
relationships and their co-dependent processes, benchmarks how they work,
and provides a continuous improvement feedback loop. TL 9000 comprises
Requirements and Measurements, which quantify product quality and assurance
and in turn project the company's growth. Worked on Product Maintenance, NPO
& Download to Excel, the TL9000 MRS system and its metrics, and the Customer
definitions and metrics.
Responsibilities:
. Involved in the analysis and design of the MRS system, which includes
  various metrics such as SHIPMENTS (SHP), INSTALL BASE (IB), ON-TIME
  DELIVERY (OTI), and PROBLEM REPORTS (NPR).
. Involved in Design and Development for the Project CUSTOMER in the
metrics: OTI
. Involved in the production support work and responsible for running the
  monthly loads for all the MRS and Customer metrics (Shipments, Install
  Base, Returns, On-time Delivery, Number of Problem Reports, Software
  Problem Reports, Software Fix Quality, Cumulative Software Fix, Fix
  Response Time, Outages), and for all these metrics for the customers, to
  generate monthly, quarterly, release-specific, and release-specific
  quarterly metrics.
. Implemented the Slowly Changing Dimension methodology for accessing the
  full history (see the sketch after this list).
. Performance tuning for high-volume data load mappings using SQL overrides
  in Lookups as well as source filters in Source Qualifiers.
. Involved in requirements gathering, verifying thoroughly for any
  miscalculations and loopholes from the business functionality point of
  view, and recommending the better approach to the client.
. Extensively worked on $U scheduling tool to set up the dependency of the
Informatica workflows and the Unix Shell Scripts which are used to send
the e-mails after processing the ETLs.
. Involved in the Unit Testing, supported the QA and UAT Testing
. Responsible for deploying the code to the Production, and then supported
the post production issues.
. Worked on designing the standard error handling in the projects to
  effectively report IT or business errors on abnormalities, with
  auto-generated e-mails.
. Worked extensively on different types of transformations like Lookup
(connected and unconnected), Router, Expression, Filter, Aggregator,
Update Strategy, Source Qualifiers, and Joiner.
. Extensively worked on worklet, mapplets, tasks such as link, email,
command; Pre SQL and Post SQL statements at mapping level, pre session
and post session commands at workflow level, while developing the
mappings and the workflows.
. Implemented the HTML tags in the Unix Shell scripting while sending a
mail from the script that is in turn invoked from the Dollar-U's tailer
task.
. Developed the technical documents for Solution design, Code migration,
Naming conventions and the Mapping documents from source to target, for
the various subject areas that I worked on.
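Below is a minimal sketch of the Slowly Changing Dimension (Type 2)
expire-and-insert pattern noted above, expressed as plain SQL run through
SQL*Plus. In the project this was implemented with Lookup and Update
Strategy transformations inside the PowerCenter mappings; the
PRODUCT_DIM/PRODUCT_STG names, columns, and sequence are hypothetical.

    #!/bin/ksh
    # Hypothetical sketch: the SCD Type-2 expire-and-insert pattern as SQL.
    # Names here are placeholders, not the actual TL-9000 objects.
    sqlplus -s etl_user/etl_password@tl9000db <<'EOF'
    -- Expire the current row when a tracked attribute changes
    update PRODUCT_DIM d
       set d.CURRENT_FLAG = 'N',
           d.EFF_END_DT   = sysdate
     where d.CURRENT_FLAG = 'Y'
       and exists (select 1
                     from PRODUCT_STG s
                    where s.PRODUCT_ID = d.PRODUCT_ID
                      and s.PRODUCT_FAMILY <> d.PRODUCT_FAMILY);

    -- Insert the new version of the changed (or brand new) product
    insert into PRODUCT_DIM
          (PRODUCT_KEY, PRODUCT_ID, PRODUCT_FAMILY,
           EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
    select PRODUCT_DIM_SEQ.nextval, s.PRODUCT_ID, s.PRODUCT_FAMILY,
           sysdate, to_date('31-DEC-9999', 'DD-MON-YYYY'), 'Y'
      from PRODUCT_STG s
     where not exists (select 1
                         from PRODUCT_DIM d
                        where d.PRODUCT_ID   = s.PRODUCT_ID
                          and d.CURRENT_FLAG = 'Y');

    commit;
    EOF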
Environment: Informatica Power Center v8.1.1, Oracle 9i, System Architect,
PVCS, Dollar-U, Windows 2000, Oracle PL/SQL, SQL Loader, UNIX Scripting
Symantec Corporation, Sunnyvale, CA
Sept 05 - Dec 06
OASIS Interfaces, Symantec Corporation
Symantec Corporation undertook a major initiative, code-named OASIS (Oracle
Applications for Security Integrated with Storage), which involves
integrating the separate Symantec and VERITAS ERP (Oracle 11.5.9) instances
into a single ERP instance. This program consists of 11 sub-tracks and
delivers state-of-the-art, real-time IT operational systems for the merged
company.
SABA: Learning Management System
Learning Management System is the project for integration of the two
interfaces: Pivotal Sales Order System to Saba Education system and Saba to
the Virtual Academy Portal. Pivotal is the ordering system for licenses of
software, CDs, Education offerings, etc. The ISR (Inside Sales
Representative) enters the order in Pivotal to pass the new Education
orders to the SLMS to automate customer training registration.
Responsibilities:
. Involved in the analysis, design, development, and testing of two of the
  OASIS sub-tracks (product and pricing, licensing) with respect to the
  Saba 3.4 and Saba 5.3 systems, and of the Pivotal-to-Saba and Saba-to-VA
  Portal interfaces.
. Understanding the Requirement Specifications, reviewing the Functional
Specifications document and developing the Design document.
. Developed the source definitions, target definitions to extract data
from flat files and relational sources to Data warehouse.
. Used the ETL tool, Informatica to extract the ERP source data and load
into the Saba system
. Developed the interfaces using PL/SQL stored procedures, implementing
cursors.
. Included error handling and exception handling by logging to the error
  tables and sending an alert message via e-mail to the concerned
  distributors (see the sketch after this list).
. Created Unit test cases and Responsible for moving the code to QA
environment, and extending the support for the QA and UAT testing for
fixing the bugs.
. Wrote documentation to describe program development, logic, coding,
  testing, changes, and corrections; optimized the mappings by changing the
  logic and reduced the running time.
. Involved in bug fixing of the current production issues, and delivery
in a short time frame.
. Extensively used Debugger to identify data transfer problems in
mappings.
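Below is a minimal sketch of the error-handling pattern referenced above:
the interface procedure logs the failure to an error table and raises an
e-mail alert. The table, procedure, and mail-helper names are hypothetical;
the mail helper stands in for a UTL_SMTP-based routine.

    #!/bin/ksh
    # Hypothetical sketch: PL/SQL error handling that logs to an error table
    # and alerts by e-mail. All object names are placeholders.
    sqlplus -s oasis_user/oasis_password@oasisdb <<'EOF'
    create or replace procedure load_pivotal_orders is
    begin
      -- main cursor/insert logic for the Pivotal-to-Saba interface goes here
      null;
    exception
      when others then
        insert into interface_error_log
              (error_ts, interface_name, error_code, error_msg)
        values (sysdate, 'PIVOTAL_TO_SABA', sqlcode,
                substr(sqlerrm, 1, 4000));
        commit;
        -- alert the support mailbox (hypothetical UTL_SMTP-based helper)
        send_alert_mail('PIVOTAL_TO_SABA failed: ' || sqlerrm);
        raise;
    end load_pivotal_orders;
    /
    show errors
    EOF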
Environment: Informatica Power Center 7.1.3, Perforce, Oracle 9i, TOAD, Sun
Solaris (UNIX), UNIX Shell Scripting, Windows 2000
Tata Consultancy Services, Hyderabad, India
May 04 - Aug 05
Client: Tata Tele-services Limited (http://www.tataindicom.com/)
Assistant Systems Engineer
TATA Teleservices Limited (TTL) promoted by TATA Group is a licensed
private operator for providing basic telephony services in the state of AP,
Delhi, Karnataka, Tamil Nadu and Maharashtra. Worked on the TTL OSS
Provisioning project.
Responsibilities:
. Involved in 24x7 support, maintenance, and enhancements (based on the
  change requests raised by the client). The complete change request cycle
  consisted of feasibility check, impact analysis, development, testing,
  and implementation.
. Created new database objects like tables, views, sequences, functions,
synonyms, indexes, and roles.
. Worked on CRs (Eg: Millennium Edition (ME), Paluke Bangaram scheme,
SMS-CRM integration and so on.)
. Used various informatica transformations like lookup, filter, router,
sorter, joiner.
. Worked with PL/SQL Packages, Procedures Triggers
. Used Oracle UTL_FILE package for the data extraction.
. Developed SQL*Loader scripts to load the data from flat files into the
  Oracle tables (see the sketch after this list).
. Involved in integration testing with other interface modules.
. Performed SQL Tuning, code reviews and test case reviews.
. Writing shell scripts to automate the process of monitoring the
engines and adapters of TIBCO EAI.
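Below is a minimal sketch of the SQL*Loader approach mentioned above: a
control file for a delimited feed plus the sqlldr invocation. The file,
table, and column names and the connect string are hypothetical
placeholders.

    #!/bin/ksh
    # Hypothetical sketch: load a delimited flat file into an Oracle staging
    # table with SQL*Loader.
    cat > /tmp/load_subscriber.ctl <<'EOF'
    LOAD DATA
    INFILE '/data/incoming/subscriber_feed.dat'
    BADFILE '/data/incoming/subscriber_feed.bad'
    APPEND
    INTO TABLE SUBSCRIBER_STG
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( SUBSCRIBER_ID,
      PHONE_NBR,
      PLAN_CODE,
      ACTIVATION_DT  DATE "DD-MON-YYYY"
    )
    EOF

    sqlldr userid=ttl_user/ttl_password@ttldb \
           control=/tmp/load_subscriber.ctl \
           log=/tmp/load_subscriber.log errors=50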
Environment: Informatica Power Center 7.1.3, Oracle 8i(8.1.7), SQL, PL/SQL
(Packages, Functions, Procedures, Triggers), TOAD, UNIX (Solaris 8.0),
Tibco IM, Exceed, Windows NT, SQL*Plus, SQL * Loader
Techno Design Group, Hyderabad, India.
Nov'01 - April '04
Java and PL/SQL Developer
Responsibilities:
. Interacted with the business users and collected the requirements.
. Programming in Java (Servlets, JSP), Oracle.
. Developing unix shell scripts to run batch jobs for loading database
tables from flat files using SQL*Loader.
. Created SQL*Loader control files for moving the data from flat files
to staging area tables.
. Involved in database development by creating PL/SQL functions,
procedures, triggers, packages, cursors, Error handling.
. Involved in Unit testing and integrated system testing.
Environment:
Java, Oracle 8i, SQL, PL/SQL, TOAD, UNIX (Solaris 8.0), MS Windows 2000,
VSS
Faculty at TRENDZ TECHNOLOGIES
May' 01 - Oct' 01
Project Trainee Anurag Infotech 09/00 - 03/01
Project work: Hyderabad Urban Development Agency - Java(Servlets) and
Oracle- Title: Allotment of Plots. (www.hudahyd.com)
Project Trainee Planet Online 09/00 - 03/01
Web site for purchase of T-shirts - Java & oracle (project for U.S)
(www.mygarb.com)
Education: Master of Science (Computers)
Seminars, Awards and Paper presentations
. Topper in St. Pious X P.G College (M.Sc Computer Science)
. Paper Presentation on "OOPS" at Osmania University.
. 1st Prize in Essay Writing Competition held by VIVEK '99
. Proficiency Prize in Queen Mary's Junior College.