
Sales Pvt Ltd

Location:
Bangalore, KA, 560071, India
Posted:
August 23, 2012

Contact this candidate

Resume:

K. SUBAHAN SHARIF

E-mail: *************@*****.***

Phone: +918*********

Objective:

Dynamic professional with 30 months of experience in software development. Seeking a challenging role in Business Intelligence / Data Warehouse implementation that would best utilize my IT experience in data warehousing and in product design, development, maintenance and enhancement projects, including project management and the various phases of the project life cycle.

Core Competencies:

Software Development:

ETL Design for Data Warehousing Projects.

Mainframe Application Development.

Development and Testing.

UAT.

Deployment and Implementation.

Technical Skills:

BI Tools, Languages and Software:

Informatica PowerCenter 9.0.1

Unix Shell Scripting (KSH)

Teradata SQL and BTEQ scripts (tool used: Teradata SQL Assistant)

Teradata Utilities (MultiLoad, FastLoad, FastExport, ARCMAIN, BTEQ)

COBOL, JCL, Easytrieve, REXX

RDBMS: Teradata 13.0, Oracle 10g

Operating Systems:

Windows 2000/2003, XP.

Unix AIX.

MVS, z/OS

Business Areas: Telecom

Qualifications:

Degree and Year: Bachelor of Engineering, 2009

Institute: Anjalai Ammal Mahalingam Engineering College, affiliated to Anna University, Chennai

Major and Specialization: Electronics and Communication

Experience Summary:

Development and maintenance of SPDR (Sales Processing for Delivering Revenue) for the AT&T account at IBM

Knowledge in Full Life Cycle development of Data Warehousing

Extensively worked on developing ETL programs for data extraction, transformation and loading using Informatica PowerCenter

Understand business rules fully from high-level design specifications and implement the corresponding data transformation methodologies

Expertise in BI tools such as Informatica PowerCenter 8.5 and 9.0.1, along with Oracle, Unix shell scripting, Teradata BTEQ scripts and Teradata utilities.

Experience with advanced Informatica techniques (dynamic caching, memory management, parallel processing) to increase performance throughput

Experience in creating fact, dimension (SCD Types 1, 2 and 3) and staging mappings.

Worked in performance tuning of ETL after identifying bottlenecks in mappings, source database, target database and session performance.

Worked on creating Korn shell scripts for file transfers via SFTP, SSH and SCP and for back-end handling of flat files; experienced in scheduling ETL jobs using crontab

Worked on creating Teradata BTEQ scripts and used FastLoad and MultiLoad for better performance of sessions involving Teradata targets.

Experience in working in an onsite-offshore structure and effectively coordinated tasks between onsite and offshore teams.

Experience in the full-cycle software development including requirements gathering, prototyping, and proof of concept, design, documentation, implementation, testing, maintenance and production support.
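The flat-file handling and crontab scheduling mentioned above can be sketched in Korn shell. This is a minimal illustration, not the actual project script: all paths and file names are made up, and the SFTP push and crontab entry appear only as comments since they need a live remote host.

```shell
#!/bin/ksh
# Minimal sketch: back-end handling of flat files -- move each processed
# *.dat file into a date-stamped archive directory. Paths are illustrative.
# The real job would also push each file over SFTP in batch mode, e.g.:
#   printf 'put %s\n' "$f" | sftp -b - etl_user@russ-host

SRC_DIR=${SRC_DIR:-/tmp/spdr_out}    # where the ETL drops flat files
ARC_DIR=${ARC_DIR:-/tmp/spdr_arc}    # local archive after transfer
STAMP=$(date +%Y%m%d)

mkdir -p "$SRC_DIR" "$ARC_DIR"
echo "sample" > "$SRC_DIR/biller1.dat"   # demo input file

for f in "$SRC_DIR"/*.dat; do
    [ -f "$f" ] || continue              # no matches: glob stays literal
    mv "$f" "$ARC_DIR/$(basename "$f").$STAMP"
done

# Scheduling via crontab, nightly at 02:00, might look like:
# 0 2 * * * /home/etl/bin/archive_files.ksh >> /home/etl/logs/arc.log 2>&1
```

The `[ -f "$f" ] || continue` guard keeps the loop safe when no files match the glob, which matters for an unattended cron job.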

Career Profile:

S.No | Organization | From | To | Role

1 | IBM India Pvt Ltd | Feb 2010 | Till date | Systems Engineer

Projects:

1. IBM India Pvt Ltd - GBS (Aug 2010 to date)

Project: SPDR (Sales Processing for Delivering Revenue)

Customer: AT&T

Period: Aug 2010 to date

Description:

SPDR (Sales Process for Delivering Revenue) pulls billed Classic SBC and BellSouth customer and revenue information from source systems and feeds it into the RUSS/SAART applications for sales compensation.

SPDR collects customer, revenue and service-order data from various flat-file and Teradata sources and creates daily and monthly flat files for the RUSS application.

SPDR processes the data in three stages: Extraction, Staging, and Transform & Loading. In the Extraction stage, data is extracted from each source separately for Region and Non-Region, and separate XT files are created for each. In the Staging stage, all the Region files for a biller from the different sources are combined into a Biller-Region file, and all the Non-Region files for a biller are combined into a Biller-Non-Region file. In the Transform & Loading stage, all the stage files are transformed into RUSS-format files.

Statistics on each biller's revenue are stored in Oracle tables.
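The Staging step described above, combining all Region extract files for one biller into a single Biller-Region file, can be sketched in shell. Directory layout, file names and record format are hypothetical, chosen only to show the combine step.

```shell
#!/bin/ksh
# Sketch of the Staging step: merge every Region XT file for one biller
# (one per source system) into a combined Biller-Region file.
# Names and layout are made up for illustration.

BILLER=${1:-sbc_classic}
XT_DIR=${XT_DIR:-/tmp/spdr_xt}       # per-source extract (XT) files
STG_DIR=${STG_DIR:-/tmp/spdr_stg}    # combined staging files

mkdir -p "$XT_DIR" "$STG_DIR"

# Demo inputs: one Region XT file per source for this biller
echo "cust1|100.00" > "$XT_DIR/${BILLER}_src1_region.xt"
echo "cust2|250.50" > "$XT_DIR/${BILLER}_src2_region.xt"

# Staging: concatenate every Region file for the biller, sorted for a
# deterministic record order, into the combined Biller-Region file.
cat "$XT_DIR/${BILLER}"_*_region.xt | sort > "$STG_DIR/${BILLER}_region.stg"
```

The same pattern, run over the Non-Region files, would produce the Biller-Non-Region file.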

Hardware/Operating System: AIX

Languages/Tools: Informatica PowerCenter 9.0.1, Oracle 10g SQL, Teradata BTEQ, UNIX shell scripting, Teradata SQL Assistant.

Roles & Responsibilities:

Involved in design and development of SPDR mappings, sessions and workflows across the three stages: Extraction, Staging, and Transform & Loading.

Involved in performance tuning of ETL after identifying bottlenecks in mappings, source database, target database and session performance.

Involved in creation of the Teradata BTEQ scripts for complex DML statements, Oracle SQL queries and used FastLoad, MultiLoad for better performance of sessions involving Teradata targets

Wrote shell scripts (KSH) for file handling, scheduling jobs and transferring files across the network via SFTP, SSH and SCP.

Developed ETL programs using Informatica to implement the business requirements.

Communicated with business customers to discuss the issues and requirements.

Created shell scripts to fine tune the ETL flow of the Informatica workflows.

Used Informatica file-watch events to poll the FTP sites for the external mainframe files.

Provided production support to resolve ongoing issues and troubleshoot problems.

Performed performance tuning at the functional and mapping levels; used relational SQL wherever possible to minimize data transfer over the network.

Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.

Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.

Worked effectively in a versioned Informatica environment and used deployment groups to migrate objects.

Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.

Worked effectively in an onsite-offshore model.

Used pre- and post-session assignment variables to pass values from one session to another.

Designed workflows with many sessions using decision, assignment, event-wait and event-raise tasks; used the Informatica scheduler to schedule jobs.

Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Identified problems in existing production data and developed one time scripts to correct them.

Fixed invalid mappings and troubleshot technical problems in the database.
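The parameter-file usage mentioned above (mapping variables, relational and FTP connections defined per workflow) can be sketched as a small shell step that generates the file before a run. The folder, workflow and connection names here are hypothetical, and the pmcmd launch is shown only as a comment since it needs a PowerCenter installation.

```shell
#!/bin/ksh
# Sketch: generate an Informatica PowerCenter parameter file before a
# workflow run. Section headers take the form [Folder.WF:workflow];
# $$NAME entries are mapping parameters, $NAME entries are built-in or
# connection variables. All names below are made up for illustration.

PARAM_FILE=${PARAM_FILE:-/tmp/wf_spdr_load.par}
LOAD_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[Global]
\$PMSessionLogCount=5

[SPDR.WF:wf_spdr_load]
\$\$LOAD_DATE=$LOAD_DATE
\$DBConnection_SRC=TD_SPDR_SRC
\$InputFile_Region=/data/spdr/in/region_${LOAD_DATE}.dat
EOF

# The workflow would then be launched with pmcmd, e.g.:
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u user -p pass \
#     -f SPDR -paramfile "$PARAM_FILE" wf_spdr_load
```

Regenerating the file on each run lets the same workflow pick up a fresh load date and input file without editing any mapping.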

2. IBM India Pvt Ltd-GBS (Feb-2010 to Aug-2010)

Project: RUSS (Revenue and Usage Sourcing system)

Customer: AT&T

Period: Feb 2010- Aug 2010

Description:

• RUSS is the Revenue and Usage Sourcing System; its business function is Sales and Marketing needs analysis. Utilizing a RUSS Tracking Database for bill-cycle validation and a RUSS month-end reconciliation process, the system is responsible for collecting all billed revenue and minutes-of-use data for ABS (AT&T Business Services).

• It directly interfaces with, and depends on information from, a number of ABS billing systems, including the Data Warehouse (DW) system, which receives information from other ABS billing systems, and the Tailored Journals system.

• Files are received almost daily from 34 billing systems, each with one or more billing cycles in a month.

Hardware/Operating System: Mainframe, MVS, z/OS

Languages/Tools: COBOL, JCL, Easytrieve, REXX, Teradata SQL Assistant.

Roles & Responsibilities:

As a team member, responsible for analysis, design, development, testing and installation of projects for new implementations and upgrades for IBM's US customer AT&T, a leading telecom company.

Deliver new and complex high quality solutions to clients in response to varying business requirements.

Translate customer requirements into formal requirements and design documents; establish specific solutions through programming and testing that culminate in client acceptance of the results.

Communicate with SMEs and interfacing application owners to perform extensive impact analysis.

Created and modified existing BTEQs to access data from the Teradata system.

Created MultiLoad, FastExport and FastLoad scripts to update existing subject areas.

Used mainframe z/OS to execute BTEQ, MultiLoad, FastExport and FastLoad scripts.

Developed Mainframe datasets to automate pre-session and post-session tasks and BTEQ scripts.

Created views, macros and Teradata scripts for creating new tables and modifying existing table definitions to incorporate the functional requirements.

Created scripts for producing daily and monthly reports for the client.
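A BTEQ deck of the kind described above (logon, DML, error-code check, logoff) might look like the following. The wrapper only writes the script to a file here; the logon credentials, database and table names are placeholders, and the actual submission to `bteq` is shown as a comment since it needs Teradata utilities installed.

```shell
#!/bin/ksh
# Sketch of a BTEQ deck: logon, run a DML statement, abort with a
# non-zero return code on error, then log off. All identifiers and
# credentials below are placeholders.

BTEQ_SCRIPT=${BTEQ_SCRIPT:-/tmp/upd_subject_area.bteq}

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;

UPDATE revenue_db.biller_stats
SET    load_status = 'C'
WHERE  load_date = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
EOF

# On a system with Teradata tools, the deck would be submitted with:
# bteq < "$BTEQ_SCRIPT" > /tmp/upd_subject_area.log 2>&1
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets a calling job scheduler detect a failed step by its return code.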

Trainings Undergone:

Informatica PowerCenter 9.0 Level 2 Developer (Apr 2012) - hands-on training from Informatica.

Data warehousing and BI concepts (Sep 2010).

Teradata Basics (Apr 2010) - hands-on training at IBM.

Mainframe Application Development (Nov 2009 - Jan 2010).

Awards/Achievements:

1. Bravo award, January 2011.

2. Service Excellence award, March 2011.

Personal Details:

Date of Birth: 6th April 1987

Nationality: Indian

Present Address: No. D3, 3rd Floor, Plot No. 27/3, 2nd Main, Chikka Adugodi Extension, Tavarekere, opposite Venkateshwara College, Bangalore - 560029

Phone no: +918*********

Passport Details:

Name as in Passport: K. Subahan Sharif

Passport Number: H8171777

Place of Issue: Thiruchirappalli

Date of Issue: 30-03-2010

Date of Expiry: 29-03-2020


