
Sql Data

Location:
Herndon, VA, 20170
Posted:
October 05, 2010


Summary:

. * years of professional software/IT experience in analysis, design, development, testing and implementation of various data warehousing projects.
. Over 4 years of experience with the Ab Initio ETL tool (GDE, Co>Operating System).
. Design and develop applications using Ab Initio, UNIX shell scripting, PL/SQL and Autosys.
. Excellent understanding of the System Development Life Cycle (SDLC); involved in analysis, design, development, testing, implementation and maintenance of various applications.
. Experience in designing, developing and testing large-scale applications per functional requirements using Ab Initio, UNIX, UNIX shell scripting and PL/SQL.
. Extensively used ETL methodologies for data extraction, transformation and load processing using Ab Initio.
. Good exposure to SQL and PL/SQL stored procedures, triggers and packages.
. Expertise with most Ab Initio components, including Database, Dataset, Partition, De-partition, Sort and Transform components.
. Good experience working with heterogeneous source systems such as Oracle, DB2, Teradata, MS SQL Server, flat files and legacy systems.
. Strong knowledge of dimensional modeling, including Star and Snowflake schemas.
. Designed and created database objects: indexes, views, procedures, packages and triggers.
. Developed UNIX Korn shell wrapper scripts to accept parameters and scheduled the processes using Autosys.
. Designed scalable architectures addressing parallelism, data integration, ETL, data repositories and analytics using the Ab Initio suite.
. Experience with the Ab Initio Co>Operating System and with application tuning and debugging strategies; proficient with Ab Initio parallelism and multifile system techniques.
. Expert in coding SQL and PL/SQL stored procedures, triggers and packages.
. Provided 24x7 support for production and testing of daily, weekly and monthly data refreshes, and worked on fixing complex/critical production problems.
. Knowledge of the version control tool ClearCase and of defect tracking tools such as Clarify and ClearQuest.
. Experience with DBMS utilities such as SQL, PL/SQL, TOAD, SQL Server, SQL*Loader and Teradata SQL Assistant.
. Able to interact effectively with Business, Quality Assurance, Users and other teams involved in the System Development Life Cycle.
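The wrapper scripts mentioned above (Korn shell scripts that accept parameters and are invoked by Autosys) can be illustrated with a minimal, POSIX-compatible sketch. All job names, paths and the final invocation are hypothetical, not the actual production code.

```shell
#!/bin/sh
# Illustrative sketch of a parameterized wrapper script of the kind an
# Autosys job might invoke: validate arguments, derive a per-run log
# file name, and hand off to the target job script.

usage() {
    echo "Usage: $0 <job_name> <run_date YYYYMMDD>" >&2
    return 1
}

# Derive a per-run log file name from the job name and run date
# (the log directory is a placeholder).
log_file_for() {
    job=$1; run_date=$2
    echo "/tmp/logs/${job}_${run_date}.log"
}

# Check that the run date looks like YYYYMMDD.
valid_run_date() {
    case $1 in
        [0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]) return 0 ;;
        *) return 1 ;;
    esac
}

main() {
    [ $# -eq 2 ] || { usage; return 1; }
    job=$1; run_date=$2
    valid_run_date "$run_date" || { echo "bad run date: $run_date" >&2; return 1; }
    log=$(log_file_for "$job" "$run_date")
    # A real wrapper would exec the job script here and append to $log.
    echo "would run: ${job}.ksh $run_date >> $log"
}
```

An Autosys command job would then call this wrapper with the job name and run date as its two parameters.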

Technical Skills:

ETL Tools: Ab Initio GDE 1.15, Co>Op 2.15, Informatica PowerCenter, PowerMart, Erwin 4.1/3.7, Microsoft Visio 2000
Databases: Oracle 10g/9i/8i, Teradata V2R4/R5, DB2, MS Access, SQL Server 2005
Platforms: Sun Solaris 7.0, HP-UX, Windows NT/XP, MS-DOS, Mainframe TSO
Languages: C, C++, SQL, PL/SQL, XML, UNIX Shell Scripting
GUI: Visual Basic 5.0/6.0, Visual C++ 6.0
Scheduling Tools: AutoSys, Control-M, TWS and Maestro
Scripting: UNIX Shell (ksh/csh)
Other Tools: TOAD, SQL Tools 1.5, IBM Rational ClearCase, Teradata SQL Assistant, ClearQuest, I-Cart, PuTTY, Erwin 4.0

Professional Experience:

Fannie Mae, Herndon, VA
Feb 2009 - Present
Ab Initio Developer

Fannie Mae, a government-sponsored enterprise, provides liquidity, stability and affordability to the U.S. housing and mortgage markets. ADW is a certified source of data for most consumers in the Mortgage Securities and Loans (MSL) Get Current work streams. Data stored in the ADW is extracted from various sources and transformed per business requirements. Once the data has been loaded into ADW, it is certified and approved by the Get Current business team and published to downstream users during the production rollout.

Responsibilities:

. Develop/modify subject area graphs based on business requirements using various Ab Initio components such as Filter by Expression, Partition by Expression, Reformat, Join, Gather, Merge, Rollup, Normalize, Denormalize, Scan and Replicate.
. Modified Ab Initio graphs to utilize data parallelism, improving overall performance and fine-tuning execution times.
. Understanding of the System Development Life Cycle (SDLC); involved in analysis, design, development, testing, implementation and maintenance of various applications.
. Created Ab Initio graphs to extract data from different external systems, then validate, transform and load the data into Oracle.
. Developed and executed PL/SQL queries in TOAD to ensure the application receives valid data from the database.
. Implemented lookups, lookup local and in-memory joins to speed up various Ab Initio graphs.
. Involved in unit testing and assisted with system testing, integration testing and user acceptance testing.
. Worked with the business to translate business requirements into high-level design, detailed design and functional code.
. Developed complex Ab Initio XFRs to derive new fields and satisfy various business requirements.
. Extensively used Ab Initio built-in string, math and date functions.
. Used the Run SQL component to execute procedures, packages and SQL statements.
. Designed and developed process scheduling using Autosys.
. Extensively used the Enterprise Meta Environment (EME) and ClearCase for version control.
. Used ClearQuest for defect tracking and to generate tickets.
. Used I-Cart to generate baselines for migrating code to the QA and UAT testing teams.
. Developed Korn shell scripts to run job streams.
. Involved in writing stored procedures, packages, functions and triggers in PL/SQL.
. Provided 24x7 extended support during production rollouts.
. Interacted with all teams, including Business Analysts, Development, Testing, Configuration Management and Project Management.
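The Korn shell job-stream scripts mentioned above can be sketched in a portable form: steps run in order, and a "done" marker per step allows a failed stream to restart where it stopped. The marker directory and step names are illustrative, not the actual production layout.

```shell
#!/bin/sh
# Hypothetical job-stream runner: each step is skipped if its "done"
# marker exists (i.e. it completed on a previous attempt), so a stream
# that fails mid-way can be rerun from the point of failure.

MARKER_DIR=${MARKER_DIR:-/tmp/jobstream_markers}

run_step() {
    # $1 = step name, remaining args = command for that step
    step=$1; shift
    marker="$MARKER_DIR/$step.done"
    if [ -f "$marker" ]; then
        echo "skip $step (already done)"
        return 0
    fi
    if "$@"; then
        mkdir -p "$MARKER_DIR" && : > "$marker"
        echo "ok $step"
    else
        echo "FAILED $step" >&2
        return 1
    fi
}
```

A stream is then a sequence of calls such as `run_step extract ./extract.ksh && run_step load ./load.ksh`, with the `&&` chain stopping at the first failure.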

Environment: Ab Initio (GDE 1.15, Co>Operating System 2.15), Oracle 10g, SQL, PL/SQL, shell scripts, TOAD, SQL Tools 1.5, Autosys, IBM Rational ClearCase, ClearQuest, I-Cart.

VISA, Foster City, CA
July 2007 - Jan 2009
Ab Initio Developer

The purpose of this project was to make the changes to the Loyalty systems needed to support the migration of Visa Extras from the current program administrator, Carlson Marketing Worldwide (CMW), to the future program administrator, Epsilon.

Responsibilities:

. Involved in creating physical models from business requirements.
. Used Ab Initio GDE to build complex graphs for transforming and loading data into the staging and target database areas.
. Involved in designing load graphs using Ab Initio and tuned query performance to make the load process run faster.
. Involved in Software Development Life Cycle (SDLC) phases such as database design, system study and structural design.
. Used DB2 as the source database: extracted data, created staging tables, then moved the data from the staging tables to the target tables.
. Clear understanding of Business Intelligence and Data Warehousing concepts, with emphasis on ETL and the System Development Life Cycle (SDLC).
. Extensively used Partition, De-partition, Dataset, Transform and Sort components.
. Implemented lookups, lookup local and in-memory joins to speed up various Ab Initio graphs.
. Worked on IBM AIX and OS/390, scheduling Ab Initio jobs using MVS (Multiple Virtual Storage).
. Worked extensively on DB2/UDB, writing SQL queries to retrieve data from the database.
. Wrote complex queries using commit and rollback in DB2/UDB.
. Improved Ab Initio graph performance using techniques such as lookups (instead of joins), in-memory joins and rollups.
. Involved in unit testing and assisted with system testing, integration testing and user acceptance testing.
. Followed best design principles, efficiency guidelines and naming standards in designing graphs.
. Used the Ab Initio Web Interface to navigate the EME to view graphs, files and datasets and to examine dependencies among objects.
. Experience using the FCS tool to build Ab Initio code.
. Designed and built generic Ab Initio graphs for unloading data from source systems; validated the unload by comparing the row count of the landed file with the source table row count.
. Involved in writing indexes, views, stored procedures, packages and triggers.
. Incorporated exception handling in the graphs to track anomalies in the incoming data and raise an alarm in such situations.
. Developed shell scripts for archiving, data loading procedures and validation.
. Experience using ClearQuest, ClearCase and Quality Center.
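The unload validation described above (landed-file row count vs. source-table row count) amounts to a small check that a shell script can perform once the counts are known. A minimal sketch, where the expected count stands in for the result of a source-table count query:

```shell
#!/bin/sh
# Hypothetical row-count validation: compare the record count of a
# landed flat file against the count reported by the source system
# (passed in here as a plain number).

validate_row_count() {
    file=$1; expected=$2
    # wc -l may pad with spaces on some systems; strip them.
    actual=$(wc -l < "$file" | tr -d ' ')
    if [ "$actual" -eq "$expected" ]; then
        echo "OK: $file has $actual rows"
    else
        echo "MISMATCH: $file has $actual rows, expected $expected" >&2
        return 1
    fi
}
```

In a real graph wrapper, a non-zero return from this check would abort the load and alert the on-call team.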

Environment: Ab Initio (GDE 1.13, Co>Op 2.13), EME, DB2/UDB 7, Oracle, TOAD, COGNOS 7/8, UNIX, IBM 390, Teradata, mainframe DB2, JCL, SQL, MicroStrategy.

Allstate, Chicago, IL
Oct 2006 - June 2007
Ab Initio Developer

The main aim of the AF-ADW project was to process feeds from all the source systems by transforming them according to business rules, load them into the ADW data warehouse (Oracle), and consolidate four ETL cycles into one.

Responsibilities:

. Involved in the functional design of the project.
. Worked on loading the ADW by extracting, parsing and transforming the feeds according to the specific business rules.
. Built a mandatory feed check script that checks for the arrival of feeds and emails the team about missing feeds.
. Experienced in using EME (check-in, check-out and version control).
. Worked on production support, monitoring jobs and taking corrective action based on job status.
. Wrote stored procedures, functions, triggers, etc. in PL/SQL to enhance performance.
. Worked extensively in the UNIX environment, accessing files, changing their permissions and building scripts.
. Created Oracle views to improve report performance.
. Developed complex Ab Initio XFRs to derive new fields and satisfy various business requirements.
. Developed shell scripts to automate file manipulation and data loading procedures.
. Worked on database connections, SQL joins, loops, materialized views, indexes, aggregate conditions and parsing of objects, and wrote PL/SQL procedures and functions to process business logic in the database.
. Developed wrapper scripts for loading fact tables and load status tables.
. Strong experience loading data into target systems such as mainframe DB2 and Oracle.
. Worked with Partition components such as Partition by Key, Partition by Expression and Partition by Round-robin to partition data from serial files into multifile systems.
. Worked with De-partition components such as Concatenate, Gather, Interleave and Merge to departition and repartition data from multifiles as needed.
. Performed transformations of source data with Transform components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression and Rollup.
. Created summary tables using Rollup, Scan and Aggregate.
. Tuned Ab Initio graphs for better performance.
. Used the Autosys scheduler for job scheduling.
. Wrote many user-defined functions that are used within Allstate.
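The mandatory feed check mentioned above can be sketched as a small shell script: given a landing directory and the list of expected feed files, report any that have not arrived. Feed names and paths are hypothetical, and the mail step is stubbed out.

```shell
#!/bin/sh
# Illustrative mandatory feed check: list expected feeds that have not
# landed yet, and raise an alert if any are missing.

missing_feeds() {
    # $1 = landing directory, remaining args = expected feed file names
    dir=$1; shift
    for feed in "$@"; do
        [ -f "$dir/$feed" ] || echo "$feed"
    done
}

check_feeds() {
    dir=$1; shift
    missing=$(missing_feeds "$dir" "$@")
    if [ -n "$missing" ]; then
        # A real script would pipe this to mailx -s "missing feeds" <recipients>.
        echo "ALERT: missing feeds: $missing" >&2
        return 1
    fi
    echo "all feeds arrived"
}
```

Scheduled shortly before the ETL cycle starts, a check like this turns a silent missing-feed failure into an early, actionable alert.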

Environment: Ab Initio (Co>Op 2.13, GDE 1.13), Solaris UNIX, Oracle, Autosys, SQL, PL/SQL, Business Objects, Erwin, shell scripting, EME, SQL*Loader and Windows NT.

SCL Technologies, India
Jan 2005 - Sep 2006
Database Developer

. Created new database objects such as procedures, functions, packages, triggers, indexes and views using T-SQL in development and production environments for SQL Server.
. Developed database triggers to enforce data integrity and additional referential integrity.
. Developed SQL queries to fetch complex data from different tables in remote databases using joins and database links, formatted the results into reports and kept logs.
. Involved in performance tuning and monitoring of T-SQL.
. Used SQL Profiler and Query Analyzer to optimize DTS package queries and stored procedures.
. Wrote T-SQL procedures to generate DML scripts that modified database objects dynamically based on user input.
. Created stored procedures to transform data and worked extensively in T-SQL on the transformations needed while loading the data.
. Involved in performance tuning of tables using clustered and non-clustered indexes.


