|*** ****** **, ****** *******, Edison, NJ. ***** **********@*******.***, ******@*****.***
SUMMARY: More than 14 years of hands-on IT industry experience in competitive, fast-paced environments in the USA and India.
• Extensive knowledge in developing stored procedures, functions, packages and triggers using Oracle PL/SQL on Oracle 10g/9i/8i/8.0/7.3 and SQL Server 2000 (T-SQL), with TOAD 8.1.6 as a front-end tool.
• Worked with external tables and views in Oracle.
• Performance tuning of views and SQL/PL-SQL code.
• Working with PL/SQL transaction control statements: SET TRANSACTION, SAVEPOINT, COMMIT and ROLLBACK.
• Extensive knowledge in developing batch programs using SQL*Loader, DTS and UNIX shell scripts.
• DTS (creating DTS packages, ActiveX scripts, scheduling and monitoring).
• Hands on experience in UNIX shell scripting for Job Scheduling, SQL Script execution.
• Developing, executing and troubleshooting Autosys jobs.
• Extensive experience writing UNIX shell scripts for data processing, data loading and scheduling jobs. Worked on HP-UX, AIX, Solaris and SCO UnixWare.
• Strong in data warehousing concepts, including dimensional modeling using Star and Snowflake schemas.
• Experience in architecture, database modeling, analysis, design, development, and data conversion.
• Extensive experience in implementation of Data Cleanup procedures, transformations, Scripts, Triggers, Stored Procedures and execution of test plans for loading the data successfully into the targets.
• Technically proficient in Identifying and Resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
• Comprehensive working knowledge of Client/Server Architecture.
• Proven ability to effectively manage time and resources and prioritize tasks in order to consistently complete projects according to specifications and on time.
• Responsible for interacting with business partners to identify information needs and business requirements.
• Creative problem solving capabilities, analytical approach and exceptional technical writing skills.
• Trained on Cognos EP Series 7.0 and BusinessObjects 5.1.
• Strong working experience in the Requirements Analysis, Data Analysis, Design, Implementation, Testing of Data warehouses, ODS and Business Data marts
SKILLS:
Data Warehousing: Metadata, Data Marts, Data Cleansing, Erwin, OLAP, OLTP, SQL*Plus.
Data Modeling: ERwin 4.x/3.5.2/3.x, Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling.
Databases: Oracle 10g/9i/8i/8.0/7.3, DB2/AIX64 8.2.0, MS SQL Server 6.5/7.0/2000, MS Access 7.0/97/2000, Sybase; TOAD 7.5/8.1.6.
Testing Tools: Mercury TestDirector 8.0.
Job Scheduling: Autosys, for scheduling Informatica workflows.
Version Control: PVCS.
Languages: SQL, PL/SQL, Transact-SQL, UNIX Shell Scripting; familiar with Perl.
Burno's, Birmingham, AL (Mar '08 – May '08)
This was a two-month project involving Unix shell scripting and Autosys job scheduling of the ETL process.
• Working on a conversion/decoupling project.
• Analysis/gap analysis and designing new interfaces in Unix shell scripts.
• Writing scripts for encrypting/decrypting data using GPG/PGP encryption software; configuring files for encryption and for multiple keys.
• Configuring SFTP, SCP and SSH for password-less (key-based) SFTP.
• Creating public/private key pairs, sharing the public keys across servers and configuring the authorized_keys file for password-less file transfer, which enables fully automated batch scripts.
• Transferring files using sftp and ftp, and automating the whole process of encryption, file transfer and auditing of file moves (push and pull).
• Writing new shell scripts and modifying existing shell scripts.
• Automating the shell scripts using the Autosys job scheduler.
• Autosys 4.0 was used to develop jobs; jobs were scheduled individually and in boxes.
• Responsible for writing file watcher jobs, box jobs, send event jobs.
• Scheduling jobs using calendars, time parameters and job dependencies.
• Moving code from DEV and UAT to PROD, and unit testing the scripts and jobs.
Environment: Unix Shell Scripting, SSH, SCP, SFTP, AIX UNIX V5.3, Autosys 4.0.
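The encrypt-then-transfer automation above can be sketched roughly as follows. All names (recipient key, host, file) are hypothetical placeholders, and the gpg and sftp commands are echoed rather than executed so the sketch stays self-contained; in production the key-based authentication set up via authorized_keys is what lets the sftp step run unattended.

```shell
#!/bin/sh
# Sketch of the encrypt + SFTP batch step (hypothetical names throughout).
RECIPIENT="partner-key@example.com"   # assumed GPG key ID of the partner
REMOTE_HOST="partner.example.com"     # assumed SFTP endpoint
DATA_FILE="daily_extract.dat"

# 1. Encrypt for the partner's public key. The command is echoed here
#    as a sketch; a real run would execute gpg directly.
GPG_CMD="gpg --batch --yes --recipient $RECIPIENT --output $DATA_FILE.gpg --encrypt $DATA_FILE"
echo "$GPG_CMD"

# 2. Build an sftp batch file so the transfer runs non-interactively;
#    key-based auth (authorized_keys on the remote side) removes the
#    password prompt, enabling fully automated batch jobs.
cat > sftp_batch.txt <<EOF
cd /inbound
put $DATA_FILE.gpg
bye
EOF
echo "sftp -b sftp_batch.txt etluser@$REMOTE_HOST"
```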
EXPERIENCE: PFIZER INC., MORRIS PLAINS, NJ (Nov '06 – Sep '07)
Senior Oracle Developer – J&J Integration (Oracle 10g, PL/SQL)
Integration project – data was integrated from J&J and Pfizer sources. Work involved developing the ETL process using Oracle PL/SQL and the Informatica ETL tool.
• Working on the Open Orders and Item dimensions; Unix shell scripting and Oracle PL/SQL were used for the ETL process, with Maestro as the job-scheduling tool.
• Performance tuning of views, SQL queries, stored procedures and functions.
• Developed stored procedures for transforming and loading of data.
• Creating views, external tables, database links, public/private synonyms, configuring .ora files for connectivity.
• Using new Oracle data warehousing features (the MERGE statement).
• Documenting gap analysis for Pfizer and J&J source and target tables.
• Documenting table creation scripts and deploying them in the dev/QA/prod environments.
• Analyzing tables and indexes for performance tuning.
• Working on an ad-hoc request to create an ETL process to load data for the Zantac vendor, supplying information on closed (sales) orders and open orders. Data resides in an Oracle ODS database and the target is flat files.
• Designed the ETL process to get data from denormalized customer, invoiced order, open order and lot tables.
• UNIX shell scripting to run the Informatica workflows and store header and footer information in flat files. Writing UNIX functions to run different steps of jobs to load and validate data.
• Writing a wrapper script in Unix shell to run the whole process, renaming/moving flat files for archiving and emailing the flat files to clients.
• Validating the data with end users.
• Documenting the ETL process, mapping document, file formats.
• Pulling data from archive tables for history reports.
• Creating ad-hoc SQL queries and modifying existing PL/SQL packages.
• Debugging production jobs, fixing Oracle PL/SQL procedures and enhancing existing PL/SQL code.
• Supported four applications: CPFR, AWMS, DRC and ODS.
• Researching new analytic functions in Oracle (RANK, DENSE_RANK, OVER (PARTITION BY)).
Environment: Informatica 7.3, DB2 source, Oracle 10g target, Unix shell scripting, PL/SQL programming for transformation, TOAD 8.1.6 and Maestro for job scheduling.
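The header/footer handling in the flat-file extracts above typically follows a validate-then-archive pattern: the trailer record carries the detail-record count, and the wrapper script verifies it before moving the file. A minimal self-contained sketch (file layout and names are hypothetical):

```shell
#!/bin/sh
# Sketch: validate a flat file's trailer count against its detail rows,
# then archive it with a date suffix. Names and layout are hypothetical.
set -e
FILE="open_orders.dat"
ARCHIVE_DIR="archive"

# Build a tiny sample extract: header, two detail rows, trailer with count.
cat > "$FILE" <<'EOF'
HDR|OPEN_ORDERS|20070901
D|1001|WIDGET|5
D|1002|GADGET|3
TRL|2
EOF

expected=$(grep '^TRL|' "$FILE" | cut -d'|' -f2)   # count claimed by trailer
actual=$(grep -c '^D|' "$FILE")                     # detail rows actually present

if [ "$expected" -eq "$actual" ]; then
    mkdir -p "$ARCHIVE_DIR"
    mv "$FILE" "$ARCHIVE_DIR/$FILE.$(date +%Y%m%d)"
    echo "validated: $actual detail records"
else
    echo "count mismatch: trailer=$expected details=$actual" >&2
    exit 1
fi
```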
THE HARTFORD, HARTFORD, CT (Mar ‘06 to Oct ‘06)
Senior Oracle Developer (Oracle PL/SQL).
Development and Enhancement of Personal Lines Enterprise Datawarehouse.
Avaya Phone Load Datawarehouse:
• Interacting with customers, users and business analysts to gather requirements and prepare functional and technical documents.
• Updating ERWIN models with changes, designing database tables.
• Developing PL/SQL code (packages, stored procedures and functions) for data translations.
• Scheduling PL/SQL packages using Autosys Jobs.
• Shell Scripts for load balancing, table alter scripts.
• Data Analysis using Toad 7.5 for Oracle Target and Source Tables.
• Writing ad-hoc update scripts (anonymous PL/SQL blocks).
• Assisting Integration Testing, Unit Testing, documenting test logs/issues.
• Assisting migration of Oracle PL/SQL code from QA to the PROD environment.
• Post Production support & performance tuning of PL/SQL code.
• Defined and implemented naming standards.
• Implemented balancing packages in all stages (Stage, Load, DMstage, Mart).
• Implemented Autosys scripts to automate the Processes.
• Interacted with DBA team regarding the database Process.
• Prepared Workflow process to run Sessions based on Load dependency.
• Performed Bulk and Incremental Load.
• Prepared test plans/test scripts as part of Post/Pre Production testing.
• Involved in Post Production support & Moving entire system from UAT environment to PRODUCTION environment.
• Identified and analyzed all data elements required for the Scorecard project.
• Defined the data model and analyzed the source data, spanning approximately 9 tables and 300+ elements.
• Defined business rules to derive fields such as call duration and hold time.
• Set up an incremental process to retrieve data from twelve CMS servers.
• Verified the data (unit testing, user acceptance testing, etc.).
• Built a linking process to link Claim CMS data to adjuster detail in the WCR and APG data marts.
• Loaded historical data from Claim CMS where available.
Environment: Oracle 10g target, Unix shell scripting, PL/SQL programming for transformation, Informatica 7.2, TOAD 8.1.6 and Autosys for job scheduling.
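Incremental loads of the kind performed above are commonly implemented with Oracle's MERGE statement: staged rows are upserted into the mart table in one pass. A rough sketch that generates such a script for SQL*Plus (all schema, table and column names are hypothetical, and the script is only generated here, not executed against a database):

```shell
#!/bin/sh
# Sketch: generate a MERGE (upsert) script for an incremental mart load.
# Schema/table/column names below are hypothetical illustrations.
cat > merge_load.sql <<'EOF'
MERGE INTO mart.call_fact t
USING stage.call_stage s
   ON (t.call_id = s.call_id)
WHEN MATCHED THEN
    UPDATE SET t.duration  = s.duration,
               t.hold_time = s.hold_time
WHEN NOT MATCHED THEN
    INSERT (call_id, duration, hold_time)
    VALUES (s.call_id, s.duration, s.hold_time);
EOF
echo "Generated merge_load.sql"
```

In practice the Autosys-scheduled wrapper would feed merge_load.sql to SQL*Plus as one step of the load stream.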
CIRCUIT CITY, RICHMOND, VA (May ‘05 - Mar ’06)
Enterprise Datawarehouse- Dimensions and Inventory Track (Development & Support)
• Documentation of Technical specification (Maps, Transformations, Mapplets, Sessions, Workflows, Autosys Jobs, Parameter files, Source and Target Tables.) and the ETL process to load data from various sources to different target tables.
• Documentation of Business Rules/Transformation involved when moving data from source to Targets.
• Documentation of the implementation plan to migrate Informatica maps, sessions, workflows, sources, shortcuts, shell scripts and Autosys jobs from the Development to the Test to the Production environment.
• Involved in logical and physical design of tables.
• Design & Development of various Maps, Unit Testing of Maps. Developing Mapplets, Reusable Transformation.
• Designing Mapping flow with Performance Tuning of Maps. Creating Sessions and Workflows. Using Debugger to Debug Maps. Documenting Unit Test Cases and Unit Testing Maps.
• Creating Sessions and Workflows and Configuring Parameter Files. System Testing Informatica Maps/Sessions/Workflow using Autosys jobs.
• UNIX shell scripting and Autosys for job scheduling, using commands such as insert_job, update_job and delete_job; defining jobs with parameters such as start_times and days_of_week; and troubleshooting Autosys jobs in the production, dev and test environments.
• Developing Autosys jobs with dependencies using the condition parameter, and defining various dependencies between Autosys jobs.
• Writing shell script to set the environment for running autosys jobs and SQL shell scripts.
• Defining different job streams to run workflows in parallel for different dimensions tracks (Product, Location, Code tables).
• Monitoring jobs in development, test and production using autorep, checking job status over the week with its various options; force-starting jobs and setting or changing job status with sendevent while testing; capturing all job definitions in JIL and organizing them in a structured way.
Environment: Informatica 7.1.1 with repository on Oracle 8.1.7; source databases AS400, flat files and DB2; target database DB2/AIX64 8.2.0; AutoSys 3.0/4.5.
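The Autosys job streams described above are defined in JIL. A minimal sketch of a box holding a file watcher and a dependent command job (job names, paths and schedule are hypothetical; in production the generated text would be piped to the jil utility rather than just written to a file):

```shell
#!/bin/sh
# Sketch: generate Autosys JIL for a box with a file-watcher job and a
# command job that runs only after the watcher succeeds (s() condition).
# All names and paths are hypothetical illustrations.
cat > dim_load.jil <<'EOF'
insert_job: DIM_LOAD_BOX
job_type: b
start_times: "02:00"
days_of_week: mo,tu,we,th,fr

insert_job: DIM_FILE_WATCH
job_type: f
box_name: DIM_LOAD_BOX
watch_file: /data/inbound/product_dim.dat
watch_interval: 60

insert_job: DIM_LOAD_WF
job_type: c
box_name: DIM_LOAD_BOX
command: /etl/bin/run_workflow.sh wf_product_dim
condition: s(DIM_FILE_WATCH)
EOF
echo "jil < dim_load.jil   # would load the definitions into Autosys"
```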
BERLITZ, PRINCETON, NEW JERSEY (Feb ‘05 - May ’05)
SQLSERVER/ETL Consultant (Team Lead and Offshore Coordinator)
Data Migration and ETL
This was an internal project for Berlitz worldwide. Data had to be migrated from SQL Server 7.0 to SQL Server 2000 using DTS packages.
• Business Analysis, Design ETL Process, interview Business users, get requirements. Document all the DTS packages implemented to move the data from source to target tables.
Environment: Microsoft NT Server 4.0, MS SQL Server 2000/7.0, DTS, Sybase, MS Excel, MS Word, MS
AMERICAN MEDICAL DEPOT, MIAMI, FL (Aug ‘03 – Mar ’04)
Medical products Datawarehouse
• Imported various Sources, Targets, and developed Transformations using Informatica Power Center Designer.
• Developed various Mappings with the collection of all Sources, Targets, and Transformations.
• Created Mapplets with the help of Mapplet Designer and used those Mapplets in the Mappings.
• Used Type 2 versioning to update slowly changing dimension tables and keep full history.
• Involved in performance tuning by optimizing sources, targets and sessions.
• Loaded target data warehouse in Oracle 9i
• Worked with Oracle external tables, a new feature of Oracle 9i that lets Oracle query data stored outside the database in flat files via access drivers; they are useful in the data warehouse ETL process and can be queried in parallel.
• Used Oracle's multi-table INSERT, a new feature of Oracle 9i, to load data into the new application from source systems; it reduces the table scans and PL/SQL code needed for multiple conditional inserts. Its main use is in the data warehouse ETL process, where it can be parallelized and/or convert non-relational data into relational format.
• Extensively used external loader utilities.
• Created Workflows in workflow designer and executed tasks such as sessions, commands using Informatica Workflow manager.
• Monitored transformation processes using Informatica PowerCenter 6.1 Workflow monitor.
• Worked extensively on almost all the transformations, such as Aggregator, Joiner, cached and uncached Lookups, connected and unconnected Lookups, Filter, Router, Expression, Normalizer and Sequence Generator.
• Implemented complex logic using Expression transformations. Involved in data certification using complex SQL queries on the warehouse. Identified the dimensions and measures required for the reports.
• Prepared prototypes of all the reports; approval of the requirements and prototypes was obtained from the business owners.
Environment: Informatica 6.1 (Source Analyzer, Warehouse Designer, Mapping Designer, Transformation Developer), Oracle 9i, Unix, flat files, MS Access, SQL Server 7.0, ERwin, Windows NT, SQL, PL/SQL, SQL*Loader, Korn shell scripting.
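The Oracle 9i external-table feature mentioned above maps a flat file to a queryable table via the ORACLE_LOADER driver. A sketch that generates such DDL (table, directory and file names are hypothetical; the .sql file would normally be run through SQL*Plus against a database, which this sketch does not do):

```shell
#!/bin/sh
# Sketch: generate DDL for an Oracle 9i external table over a pipe-delimited
# flat file. Table, directory object and file names are hypothetical.
cat > ext_orders.sql <<'EOF'
CREATE TABLE ext_orders (
    order_id   NUMBER,
    item_code  VARCHAR2(20),
    qty        NUMBER
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY etl_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY '|'
    )
    LOCATION ('orders.dat')
)
REJECT LIMIT UNLIMITED;
EOF
echo "Generated ext_orders.sql"
```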
KONSYL PHARMACEUTICALS INC, NJ (Feb ‘02 - Jul ’03)
• Extraction of data from varying origins including MS Excel spreadsheets, text files, MS Access database files and SQL Server databases.
• Transformation of data including restructuring, reformatting, deriving new values, substitution (mapping to new values), and validation using Transact-SQL and SQL Server DTS
Environment: SQL Server 7.0/2000, DTS, BCP, ERWin, TSQL, Windows NT.
SCHEDULEEARTH.COM, MIAMI, FL (Feb ‘01 - Jan’02)
Database / Web Developer
Web Based Datawarehousing Project
• As the lead developer, gathered client requirements and scope, and designed and developed the site.
• The site was developed using ASP (VBScript) and HTML; the backend database was MS SQL Server 7.0.
ALLMARINEPARTS.COM, MIAMI, FL (Jul ‘00 – Jan ’01)
Database / Web Developer
• The exchange was designed and built as a virtual hub or marketplace where cargo and cruise ship owners and part buyers could invite quotes from ship part manufacturers.
• Developed using ASP (VBScript) and HTML; the backend database was MS SQL Server 7.0.
DUEDILIGENCEDONE.COM, MIAMI, FL (Jan ’00 – Jun ’00)
Database / Web Developer
• This project required coding in ASP (VBScript) and HTML, and interaction with the backend Microsoft Access database using ADO over an ODBC connection. A major responsibility was online credit card authorization using CyberCash.
ECOMSERVER INC, LAWRENCEVILLE, NJ (Apr. 98 – Jan. 00)
(The Call Center group of WebSci Technologies formed this company.)
• System Installation: As this was a newly formed startup, the initial phases involved LAN setup, including hardware and software installation. The role involved setting up a network of desktops running a mixture of MS Windows 95 and NT, with Microsoft Windows NT servers running as primary, secondary and backup domain controllers.
• VoiceStream Wireless: The project involved the design, setup, administration and maintenance of the database environment for remotely loading data into the Oracle RDBMS. The data resided on an Oracle RDBMS running over the UnixWare operating system and was uploaded daily, via batch jobs, onto an Oracle 7.x database residing on a Windows NT 4.0 system. Remote database connectivity was accomplished using SQL*Net V2 client software, which involved configuring the LISTENER.ORA and TNSNAMES.ORA files, starting the listener process and configuring the files required for remote database connection. As the sole team member on this project, was involved end to end in the design and development of the system. Oracle system administration and maintenance responsibilities included installing Oracle 7.3.x on Windows NT; creating users, roles and tablespaces; granting roles to users; and configuring the SQL*Net V2 files required for remote connectivity. The connection to the remote database was accomplished using ODBC and SQL*Net V2. Also responsible for generating reports using Crystal Reports 7.0 on Windows NT 4.0. This further involved creating database links and views, eliminating duplicate records, deleting records after a certain time frame, backing up data and automating the full process, making use of SQL*Loader and the IMP/EXP utilities.
• Oracle 8.0.x installation (Lucent Technologies): Involved in automating the process of installing Oracle 8.0.x onto the Intuity Conversant, which required creating a package to automate the installation. Intuity Conversant (8.0) is the voice response unit developed by Lucent and was anticipated for release at the beginning of the following year. This was an ongoing project. The role involved researching the files that must be executed to install the Oracle software, create the database and create the data dictionary tables.
Environment: UnixWare 2.0, Microsoft Windows 95 and NT, Oracle 7.x, Oracle 8.0.x and Oracle 8i, SQL*Net V1/V2, Net8, SQL*Loader, Crystal Reports 7.0
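The SQL*Loader-based daily loads described above are driven by a control file. A sketch that generates one and shows the invocation (table, column and file names are hypothetical, and the sqlldr command is echoed rather than executed since it needs a live database):

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a daily append load.
# Table, column and file names are hypothetical illustrations.
cat > daily_load.ctl <<'EOF'
LOAD DATA
INFILE 'daily_feed.dat'
APPEND INTO TABLE call_detail
FIELDS TERMINATED BY ','
(call_id, caller, duration, call_date DATE "YYYYMMDD")
EOF
echo "sqlldr userid=etl_user control=daily_load.ctl log=daily_load.log"
```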
WEBSCI TECHNOLOGIES, MONMOUTH JUNCTION, NJ (May 97 – Apr. 98)
WebSci Technologies is an independent service vendor for Lucent. During this period, worked on multiple client projects.
• Design, development, administration and maintenance of applications running over the Oracle database.
• Designed and developed DB tables and the DB structure.
• Regular DB administration and maintenance.
• Developed stored procedures using PL/SQL.
• Remote DB linking, connection and configuration using SQL*Net V2.0.
• Report generation using SQL*Plus.
• Execution and generation of data loading scripts using SQL*Loader.
• Automated data loading using Unix wrapper shell scripts.
• Extensive client interaction with executives from corporations such as Chase Mellon Shareholder Services, Prudential, Samsung, Lucent, ADP and AT&T.
• Worked as an in-house consultant for AT&T and Chase Mellon Shareholder Services.
Environment: ORACLE 7.0 and above, SQL*PLUS, SQL*LOADER, SQL*NET V2.0, UnixWare 2.1, Intuity
CENTURY ENKA LIMITED, PUNE, INDIA (May ‘96- Mar ’97)
• Involved in the system design and implementation.
Environment: ORACLE 7.0, SQL*FORMS, SQL*MENU and Report Writer, HP-UX 9.0, HP G-50 RISC
SMART INFORMATION SYSTEMS, GOA, INDIA (Aug ‘95 - Mar ’96)
This was a sales and support position that required a thorough knowledge of the Oracle product suite.
WINSOFT CONSULTANTS, BOMBAY, INDIA (Oct ‘94 – Jun ’95)
• This was a sales and support position that required thorough knowledge of Oracle, MS Visual Basic and Unix systems.
EDUCATION: Bachelor's Degree in Computer Engineering, Goa Engineering College, Ponda, Goa (Goa University), India.
First Class with Honors, 1994.