
Data Manager

Location:
Latham, NY, 12110
Posted:
October 31, 2017


Mohammad Irfan Ahamad Dandiya

813-***-****

********@*****.***

QUALIFICATION SUMMARY

8+ years of IT experience in analysis, design, development and implementation of software applications in Data Warehousing/Business Intelligence.

Experience in all phases of the data warehouse life cycle and the project development life cycle, including planning, analysis, design, coding, testing, deployment and support.

Well versed in data warehousing concepts using Kimball’s methodology: dimensional modeling, slowly changing dimensions (Type 1, Type 2, and Type 3), surrogate keys, and star and snowflake schemas in implementing Decision Support Systems (DSS).
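
As an illustration of the Type 2 pattern named above (table, column, and sequence names are hypothetical, not from any specific project), the expire-and-insert step can be sketched in SQL as:

```sql
-- Hypothetical SCD Type 2 maintenance for a customer dimension.
-- Step 1: logically expire the current row when a tracked attribute changes.
UPDATE dim_customer d
SET    d.current_flag        = 'N',
       d.effective_end_date  = SYSDATE
WHERE  d.customer_id  = :src_customer_id
AND    d.current_flag = 'Y'
AND    d.address     <> :src_address;

-- Step 2: insert the new version with a fresh surrogate key.
-- (In practice the ETL flow guards this so it fires only when a change was detected.)
INSERT INTO dim_customer
       (customer_sk, customer_id, address,
        effective_start_date, effective_end_date, current_flag)
VALUES (dim_customer_seq.NEXTVAL, :src_customer_id, :src_address,
        SYSDATE, DATE '9999-12-31', 'Y');
```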

Extensively worked on technologies including Informatica PowerCenter 9.1/9.6 (Designer, Repository Manager, and Workflow Manager), Oracle 12c/11g/10g/9i/8i/8.0/7.x, MS SQL Server, Business Intelligence (implementing XI R4), and Business Objects (creating reports by linking data from multiple sources; BO Enterprise products: Designer, WebI, DeskI, Dashboards, scheduling).

Experience in designing and developing data conversion/data integration modules to support large-scale system migrations and implementations.

Expertise in data warehouse development, working with extraction/transformation/loading using Informatica PowerCenter with Oracle, SQL Server, Teradata, DB2 and heterogeneous sources.

Experience in data masking using SQL scripts.

Extensive experience in developing mappings for Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouse/Data Marts.

Experience in creating Reusable Transformations (Joiner, Lookup, Sorter, Aggregator, Expression, Update Strategy, Router, Filter, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.

Extensively worked on developing and debugging Informatica Mappings, Mapplets, Sessions and Workflows.

Worked on performance tuning, identifying and resolving performance bottlenecks at various levels, such as sources, targets, mappings and sessions.

Very efficient in tuning complex SQL statements.

Involved in data conversion and migration from DB2 to Oracle systems.

Extensively used SQL and PL/SQL in creation of Triggers, Functions, Indexes, Views, Cursors and Stored Procedures.
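
A minimal sketch of the kind of PL/SQL object described here (table, sequence, and column names are illustrative, not from any specific project):

```sql
-- Illustrative PL/SQL: trigger that assigns a surrogate key from a
-- sequence and stamps an audit column on insert.
CREATE OR REPLACE TRIGGER trg_orders_bi
BEFORE INSERT ON orders
FOR EACH ROW
BEGIN
  IF :NEW.order_id IS NULL THEN
    :NEW.order_id := orders_seq.NEXTVAL;  -- surrogate key from a sequence
  END IF;
  :NEW.created_dt := SYSDATE;             -- audit timestamp
END;
/
```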

Extensive experience with Business Objects 4.0/4.1/4.2 and XI R3: Information Design Tool, Universe Design Tool, universe design (UNV and UNX), WebI, DeskI, Dashboards, Crystal Reports creation and scheduling, and publications; created reports by linking data from multiple sources.

Experienced in UNIX shell scripting for file manipulation, scheduling and text processing.

TECHNICAL SKILLS

ETL Tools : Informatica PowerCenter 9.6/9.1/8.x/7.x/6.x

Reporting tools : Business Objects XI 4.0/XI 3.1, InfoBurst, Crystal Reports, and OBIEE

Databases : Oracle 7.x/8i/9i/10g/11g/12c, MS SQL Server 2008, DB2 and Teradata 12

Modeling Tools : Star-Schema Modeling, Snowflake Modeling, Dimension and Fact Tables, and Erwin

Scheduling Tools : UC4, Control M

Languages : C, C++, Java, Visual Basic, SQL, PL/SQL, UNIX Shell Scripting

Operating System : Windows 9x/NT/2000/XP, UNIX

Office Applications : MS-Office 97/2000/XP/2007

Other Tools : SQL*Loader, TOAD 7.6, SQL*Plus, PuTTY

PROFESSIONAL EXPERIENCE

Data Warehouse Lead Consultant, Department of Transportation, WI Feb 2014 – Present

Responsibilities

Involved in Data Warehouse & BI projects: gathering and documenting functional and technical requirements, designing solutions, and developing ETL.

Involved in interacting with business finance users, understanding the business process, collecting the information and preparing business requirements document and functional requirements document.

Extensively used Informatica PowerCenter for developing mappings and scheduling jobs.

Strong hands-on experience in Oracle, DB2, and Teradata.

Extensively used SQL, PL/SQL and Teradata in creation of Triggers, Functions, Indexes, Views, Cursors and Stored Procedures.

Used Teradata as Source database and Target database

Strong experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support and UNIX shell scripting.
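
A minimal BTEQ batch script of the sort implied above (logon details, database, and table names are placeholders):

```sql
-- Hypothetical BTEQ script: run a load query, abort with a nonzero
-- return code on error so a scheduler can detect the failure.
.LOGON tdprod/etl_user,password

INSERT INTO dw.daily_sales
SELECT sale_dt, store_id, SUM(amount)
FROM   stg.sales
GROUP  BY sale_dt, store_id;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
```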

Strong experience in Creating Database Objects such as Tables, Views, Functions, Stored Procedures, Indexes, Triggers, Cursors in Teradata.

Closely worked with data modelers to build dimensional models using star and snowflake schemas, identifying facts and dimensions, and performing physical and logical data modeling using Erwin and ER/Studio.

Worked on Database architecture of OLTP and OLAP applications.

Worked on various Informatica concepts like partitioning, Performance tuning, identifying bottlenecks, deployment groups, Informatica scheduler and more.

Involved in doing Unit Testing, Integration Testing, System Testing and Data Validation for Developed Informatica Mappings.

Designed ETL loading process and data flow.

Developed ETL scripts to extract and cleanse data from SQL databases.

Generated the required metadata on the source side in the Informatica mappings.

Involved in Data conversion of financial system and loaded into data warehouse

Prepared the impact analysis document and high-level design for the requirements.

Involved in performance tuning by collecting statistics and reviewing explain plans.

Involved in the complete life cycle implementation of Business Objects.

Developed universes and reports using Business Objects

Used InfoView to create Web Intelligence, Desk Intelligence Reports over the Universes

Created combination queries and built WebI reports over multiple queries.

Gathered requirements for financial and management reports from financial and managerial users by conducting workshops and providing questionnaires, and delivered POC demos on BO WebI, Xcelsius dashboarding and Live Office to confirm the collected requirements were accurate.

Involved in complete SDLC right from Use case document discussions, writing functional and technical specifications for report development.

Good experience in interacting with finance users and understanding financial systems like FDW

Translated the business rules into technical specifications to design the universe.

Analyzed the star schema to design the universe using Business Objects Designer Module.

Developed table joins, alerts and queries; created prompts; parsed the universes; and applied contexts and alias tables to remove cyclic dependencies. Resolved loops, checked cardinalities and verified the integrity of the universes.

Created different types of reports such as Master/Detail, Cross Tab and Charts.

Created the reports in Web Intelligence using the universe as the main data providers and writing the complex queries including sub-queries, unions, intersect and minus.

Formatted the WebI reports per users' requirements using the full range of Business Objects facilities, allowing users to analyze the data. Developed reports for all modules, including several highly complex reports.

In Web Intelligence (WebI), developed a wide variety of advanced and complex reports using functionality such as cross tabs, forms, tables, sections, breaks, sorts, ranking, prompts, drill-up, drill-down, drill-across, drill-by, charts, formatting of cells and blocks, conditional formatting using alerts, formulas and variables, functions, report filters, combined queries, sub-queries, and merged dimensions.

Created complex Crystal Reports and enhanced their performance.

Environment: Informatica 9.1/9.6, Business Objects 3.4.1/4.2, InfoBurst, Crystal Reports, DVO, OBIEE, Teradata 12, Oracle 11g/12c, UNIX, Windows XP, UC4, Control M, Oracle APPS, SQL Server 2012, DB2

Senior ETL Developer, American Girl Inc., WI (Cognizant - Implementation Partner) July 13 – Feb 14

Responsibilities

Extensively worked on understanding the business requirements, Data modeling and ETL structure.

Designed ETL loading process and data flow.

Developed ETL scripts to extract and cleanse data from SQL databases.

Generated the required metadata on the source side in the Informatica mappings.

Prepared the impact analysis document and high-level design for the requirements.

Involved in performance tuning by collecting statistics and reviewing explain plans.

Carried out performance tuning both on Informatica side and database side.

Worked on slowly changing Dimensions and Operational Schemas.

Involved in Analysis, Requirements Gathering and documentation of Functional & Technical specifications.

Worked on data analysis to find data duplication and existing data patterns.

Developed High Level Technical Design specifications and Low Level specifications based on business requirements.

Worked on preparing unit test plans and functional validation.

Designed, developed and tested data conversion/data integration modules to support large scale system migrations and implementations.

Worked closely with ETL analysts and project subject matter experts to create technical designs; developed ETL modules using Informatica and stored procedures; thoroughly tested the software; integrated the code into workflows and scheduling solutions; and executed the code modules as necessary for final conversion.

Analyzed the source data coming from different sources (Oracle, XML, Flat files, Excels) and worked on developing ETL mappings.

Developed mappings to load data from multiple data sources to targets.

Developed complex mappings using Lookups both connected and unconnected, Rank, Sorter, Joiner, Aggregator, Filter, Router, SQL transformations to transform the data as per the requirements.

Extensively used SQL overrides and filter conditions in the Source Qualifier, thereby improving mapping performance; involved in performance tuning in Informatica at all levels.

Configured ODBC connectivity to various source/target databases.

Defined production release requirements and sustainment architecture.

Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the Workflow Manager.

Used Workflow Monitor to monitor the jobs, review error logs that were generated for each session, and resolved them.

Involved in administration tasks, including upgrading Informatica and importing/exporting mappings.

Optimized the performance of the mappings by various tests on sources, targets and transformations.

Identified the Bottlenecks, removed them and implemented performance tuning logic on targets, sources, mapping, sessions to provide maximum efficiency and performance.

Used pushdown optimization extensively to improve the performance of the ETL processes.

Tuned performance of Informatica session for large data files by using partitioning, increasing block size, data cache size, sequence buffer length and target based commit interval.

Used SQL Assistant to query Teradata tables.

Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another.

Involved in Analyzing / building Teradata EDW using Teradata ETL utilities and Informatica.

Worked closely with the Managed Services team to provide high level design document, monitor progress, provide guidance, review and sign off on all documentation and testing.

Environment: Informatica 9.1, Cognos, Teradata 12, Oracle 10g, UNIX, Windows XP, PVCS, UC4, Oracle APPS

Senior ETL Developer, American Girl Inc., WI (Cognizant - Implementation Partner) Jan 13 – June 13

Responsibilities

Developed functional and technical specification documents, mappings and exception handler for ETL processes

Tuned performance of Informatica session for large data files by using partitioning, increasing block size, data cache size, sequence buffer length and target based commit interval.

Optimized the performance of the mappings by various tests on sources, targets and transformations.

Developed robust Informatica mappings and fine-tuned them to process large volume of input records with estimated throughput.

Created mapping design and mapping document

Developed mappings to load data from multiple data sources to targets.

Prepared the impact analysis document and high-level design for the requirements.

Involved in performance tuning by collecting statistics and reviewing explain plans.

Carried out performance tuning both on Informatica side and database side.

Designed ETL loading process and data flow.

Used Workflow Monitor to monitor the jobs, review error logs that were generated for each session, and resolved them.

Optimized the performance on database by disabling and enabling the indexes.

Understood the functional team's requirements and specifications.

Developed complex mappings using Lookups both connected and unconnected, Rank, Sorter, Joiner, Aggregator, Filter, Router, SQL transformations to transform the data as per the requirements.

Documented and reviewed test cases.

Worked on slowly changing dimensions and operational schemas.

Performed extraction, transformation and loading (ETL) of data using Informatica PowerCenter.

Extracted source data from flat files and Oracle and loaded it into Oracle.

Created mappings using the transformations such as the Source qualifier, Aggregator, Expression, Router, Filter, Sequence Generator, and Update Strategy.

Created and Monitored Informatica sessions.

Checked and tuned the performance of Informatica Mappings.

Involved in creating sessions and workflows using Workflow Manager.

Involved in testing the mappings (UNIT testing).

Worked closely with the Managed Services team to provide high level design document, monitor progress, provide guidance, review and sign off on all documentation and testing.

Developed High Level Technical Design specifications and Low Level specifications based on business requirements.

Environment: Informatica 9.x, Oracle 10g, UNIX, PVCS, Windows XP, UC4

Senior Lead Developer, Mercury Insurance, CA Feb 11 – Dec 12

Responsibilities

Analyzed business requirements, prepared technical documentation, documented application processes, and wrote test cases.

Designed and created tables for the new setup database and wrote subroutines per business requirements.

Analyzed and evaluated data source fields to match target fields for moving data from the denormalized legacy system to the normalized new setup database.

Developed mappings to load data from different sources such as Oracle and flat files.

Gathered and clarified requirements directly from users or the PM and developed new Informatica processes for automating various ETL activities.

Wrote UNIX scripts to FTP files, clean the data, perform other required functions, and place files in the proper source/target directories for processing.

Enhanced existing complex ETL processes; designed, developed and deployed new and enhanced ETL processes to production; and supported the new load strategy.

Created variables in the Expression transformation to compare the current record with the previous record and combine the data for these records based on the same claim number.

Responsible for creating shared and reusable objects in the Informatica shared folder and updating the objects with new requirements and changes.

Responsible for providing consolidated daily and weekly status reports to the management and team.

Responsible for coordinating the Offshore and Onsite team and resolve all the issues faced by the team.

Used FileZilla and WinSCP to create and view the parameter files and other source files in UNIX in the DEV, Testing and Production environments.

Extensively used pmcmd commands on command prompt and executed Unix Shell scripts to automate workflows and to populate parameter files

Worked on Change Data Capture data sources.

Prepared ETL low level and high level design documents.

Debugging invalid mappings using break points, testing of stored procedures and functions, testing of Informatica sessions, batches and the target Data.

Wrote SQL and PL/SQL code, stored procedures and packages: for dropping and re-creating indexes; for generating Oracle sequences; procedures that automatically clean up the base tables if the mapping has already run; a procedure that logically expires existing records and updates specific codes; and packages that incorporate business rules to eliminate transformation dependencies.

Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables
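
Source-to-target validation queries of the kind described often compare row counts or look for rows dropped by the load; a generic sketch (view and table names are hypothetical):

```sql
-- Hypothetical reconciliation: rows present in the source view
-- but missing from the target table after the load.
SELECT src.order_id, src.order_amt
FROM   stg.orders_v src
LEFT JOIN dw.fact_orders tgt
       ON tgt.order_id = src.order_id
WHERE  tgt.order_id IS NULL;

-- Quick count check between source view and target table.
SELECT (SELECT COUNT(*) FROM stg.orders_v)   AS src_rows,
       (SELECT COUNT(*) FROM dw.fact_orders) AS tgt_rows;
```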

Performed tuning and optimization of complex SQL queries using Teradata Explain.

Responsible for collecting statistics on fact tables.

Used Teradata Data Mover for overall data management capabilities for copying indexes, global temporary tables.

Created proper primary indexes, taking into consideration both planned access of data and even distribution of data across all available AMPs.

Improving the SQL query performance, and updating the technical documents.

Involved in the complete life cycle implementation of Business Objects.

Extensively worked in creating and designing Xcelsius dashboards.

Extensively worked in InfoView to create Web Intelligence and Desk Intelligence reports over the universes created; worked on combination queries and created WebI reports over multiple queries.

Retrieved data using Business Objects universes, personal data files and free-hand SQL methods.

Created and formatted reports per user requirements using Business Objects features such as master/detail, charts, cross tabs, slice and dice, drill mode, and formulas to analyze the data; ranking was used for multidimensional formatting. Created and distributed report templates to users.

Used @Functions like @Prompt (for user defined queries), @Where (for creating conditional filters), and @Select to create duplicate objects in the process of building the Universes.

Environment: Informatica PowerCenter v8.6, SAP Business Objects, Teradata 12, Oracle 9i/10g, CSV files, Excel files, SQL, PL/SQL, UNIX shell scripting, Windows XP, SQL Developer, Toad, Microsoft Visio.

Data Warehouse Developer, GE Capital Treasury, CT Jul 10 – Dec 10

Responsibilities

Analyzed, designed, created and manipulated Oracle tables.

Resolved moderate problems associated with the designed programs and provided technical guidance on complex programming.

Experience with Data modeling and MDM methodologies.

Developed mappings to load data from multiple data sources to targets.

Created Data Marts for new modules.

Understood the business needs, designed programs and systems that match the complex business requirements, and recorded all specifications involved in the development and coding process.

Assisted the project manager by compiling information from the current systems, analyzing the program requirements and ensuring that it meets the specified time requirements

Researched manual and automated systems, developed recommendations and prepared documentation. The recommendations identified unnecessary steps.

Worked with data warehouse staff to incorporate best practices from Informatica.

Achieved data masking in the database using SQL scripts to protect the security and integrity of data.
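
Script-based static masking of this kind typically overwrites direct identifiers while preserving format; a sketch (table and column names are hypothetical):

```sql
-- Illustrative data masking for a non-production copy:
-- obscure identifiers while keeping field length and format.
UPDATE customers
SET    ssn       = '***-**-' || SUBSTR(ssn, -4),                  -- keep last 4 digits
       email     = 'user' || customer_id || '@example.com',       -- synthetic address
       phone_num = TRANSLATE(phone_num, '0123456789', '9999999999'); -- digits -> 9s
```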

Worked with analysts, in work sessions, to translate business requirements into technical user specifications, including data, process and interface specifications.

Developed new methods and procedures.

Developed implementation plans and schedules for Informatica enhancements.

Developed new enhancement training sessions.

Experience with COBOL transformations for sourcing data from COBOL/JCL sources.

Assisted in the development and training related to best practices for the Data Warehouse group.

Researched and developed methods related to information processing metrics.

Involved with moving data from flat files, including Excel files, to an Oracle Environment.

Developed documentation, for both business and technical designs.

Worked closely with the Managed Services team to provide high level design document, monitor progress, provide guidance, review and sign off on all documentation and testing.

Worked on performance tuning by collecting statistics and reviewing explain plans.

Prepared unit test plans and functional validation.

Defined production release requirements and sustainment architecture.

Environment: Informatica 8.x, Oracle 10g, UNIX, PVCS, Windows XP, VISIO

Academic Experience June’ 09 to May’ 10

Projects Executed:

Student Evaluation & Feedback System, B.Tech 4th Year

Description - Software providing lists of students/faculty according to their performance.

Language – C++ / MS Access, Perl

Environment - Windows

EDUCATION

Bachelor of Technology, G. Pulla Reddy Engineering College, India


