Post Job Free

Sr Developer & Analyst

Location:
Indianapolis, IN, 46032
Salary:
75000
Posted:
December 09, 2011

Contact this candidate

Resume:

SONIA FRANCIS

Summary:

*+ years of experience in the IT industry with an in-depth understanding of the software development life cycle: requirement gathering and analysis, system design, software development, system testing, bug fixing, documentation, and implementation.

3+ years of experience in Informatica, including the development of ETL processes for data transformation using Informatica PowerCenter 8.x.

Design and development of business intelligence solutions using Informatica PowerCenter, and using SSIS (SQL Server Integration Services) in Business Intelligence Development Studio.

Extensive knowledge of the PowerCenter components: PowerCenter Designer, PowerCenter Repository Manager, Workflow Manager, and Workflow Monitor.

Thorough knowledge of creating ETL processes to load data from different data sources, and a good understanding of Informatica installation.

Experienced in developing mappings and transformations using PowerCenter Designer, and in executing the mappings from multiple sources to multiple targets through Workflow Manager.

Excellent knowledge of and experience in creating source-to-target mappings, edit rules and validations, transformations, and business rules.

Good knowledge of data migration from SQL Server to DB2 using SSIS, addressing challenges such as data integrity issues, data inconsistencies, and high data volumes.

Extensive knowledge of data analysis, data requirement analysis, and data mapping for ETL processes, and of the scheduler tool Autosys.

Involved in implementation activities and in post-implementation support activities.

Experienced in creating effective test data and developing thorough unit test cases to ensure successful execution.

Strong analytical, problem-solving, communication, learning and team skills.

TECHNICAL SKILLS:

ETL Tools: MS SSIS, Informatica PowerCenter 8.1.1/9.0.1

Databases: MS SQL Server, IBM DB2 v8.0, Sybase

Methodologies: ETL, Software Development Life Cycle (SDLC)

Business Tools: Visio 2000

Environments: Windows 97/98/XP, UNIX

Packages: MS Office

Scheduling Tools: Autosys

TRAININGS:

Underwent Infosys general training, where I acquired and enhanced my skills in programming concepts and structured analysis.

Received an overview of Infosys's organizational structure, administration, and general development methods.

Underwent extensive hands-on training on various languages and operating systems, including a number of training projects.

Underwent on-the-job training, placed in a project team working on the stream I was trained in.

Learned to develop software using Infosys methodologies and techniques for developing, testing, and debugging software and systems.

Professional Experience:

Client – NorthWestern Mutual, Milwaukee May 2008 – Present

Role: Informatica Developer

Environment: Informatica Power Center 8.1.1/9.0.1, Sybase, UNIX, Windows 2007.

This project involves the timely resolution of production issues and the development and implementation of enhancements and change requests in EBIS (Enterprise Business Intelligence Services). EBIS is designed around the idea of integrating applications and sources in one common area and transforming them into different data marts for various business reporting needs. The data in the EBIS database is used for generating various reports, so data accuracy is of high importance. The EBIS team is responsible for the maintenance of various databases and supports the systems when they are affected by data volume, job failures, system failures, etc.

Responsibilities:

Correlate the business requirements with the technical aspects and come up with a high-level design for creating the mappings.

Translated the high-level design spec into simple ETL coding and mapping standards. Worked on the Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer.

Created Informatica workflows and sessions, ran the sessions to load the targets, and performed debugging and unit testing of the mappings.

Integrated the workflows with Autosys jobs to schedule the mapping runs.

Perform Informatica code migrations to different Informatica repositories.

Good understanding of data warehouse concepts.

Designed and developed Informatica Mappings to load data from Source systems to ODS.

Extensively used Power Center to design multiple mappings with embedded business logic.

Created mappings with transformations such as Lookup, Joiner, Rank, and Source Qualifier in the Informatica Designer, per the business need.

Created mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations for populating target tables efficiently.

Shared knowledge with the end users and clients, and documented the design, the development process, the process flow, and the schedule of each mapping/job.

Created UNIX shell scripts for triggering and automating the execution of the Informatica mappings.

Designed and developed the UNIX scripts and deployed them to production using the APLUS tool.

Create the schema for the new mappings; define the data types, constraints, and indexes in the database.

Designed and developed UNIX scripts for creating and dropping the tables used in scheduling the jobs.

Maintained metadata, naming standards and warehouse standards for future application development

Used the scheduler tool Autosys to trigger the UNIX scripts, which in turn execute the Informatica sessions.

Created various UNIX shell scripts for scheduling the loading process.

Created database objects such as stored procedures, views, and stored functions.

Involved in writing the test plan, unit testing, system integration testing, user testing, and implementation of the system.

Perform thorough end-to-end system testing of the functionality.

Perform user acceptance testing and ensure that all issues are resolved and an official sign-off is obtained from the users.

Monitor the scheduled batch jobs that execute the workflows for any issues, failures, or unusual behaviour.

Perform root cause analysis for any failures, and design and implement solutions to prevent such failures in the future.

Conduct a thorough code review and ensure that the outcome is in line with the objective and all the processes and standards are followed.

Review the detail level design and ensure that the IT standards are followed.

Implementation of Code Fixes and Changes in Production.

Prepare a detailed implementation plan and checklist.

Ensure that all steps in implementation checklist have been verified.

Perform checkouts of the components after the production implementation.

Facilitate and control the implementation of approved changes efficiently and with acceptable risk to existing and new IT services and the vital business functions they support.

Create change management records for components implemented in production. Follow all the processes and standards for change control as specified by the client's change management policy.
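The trigger scripts described above (UNIX shell scripts that run Informatica sessions, scheduled through Autosys) can be sketched roughly as follows. pmcmd is PowerCenter's actual command-line client, but the service, domain, folder, and workflow names here are hypothetical placeholders, not taken from the project:

```shell
#!/bin/sh
# Minimal sketch of an Autosys-triggered wrapper for an Informatica
# workflow. All names below (IS_EBIS, Domain_EBIS, EBIS_ODS, wf_load_ods)
# are hypothetical placeholders.

PMCMD=${PMCMD:-pmcmd}                # path to the pmcmd binary
INT_SERVICE=${INT_SERVICE:-IS_EBIS}  # hypothetical Integration Service name
DOMAIN=${DOMAIN:-Domain_EBIS}        # hypothetical domain name
FOLDER=${FOLDER:-EBIS_ODS}           # hypothetical repository folder

run_workflow() {
    wf_name=$1
    # -wait makes pmcmd block until the workflow finishes, so the exit
    # code below reflects the workflow result, not just the submission.
    # -uv/-pv name environment variables holding the credentials.
    "$PMCMD" startworkflow -sv "$INT_SERVICE" -d "$DOMAIN" \
        -uv PM_USER -pv PM_PASS -f "$FOLDER" -wait "$wf_name"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "workflow $wf_name failed with rc=$rc" >&2
        return "$rc"    # non-zero exit tells Autosys the job failed
    fi
    echo "workflow $wf_name completed"
    return 0
}

# Example invocation (commented out so the script can be sourced safely):
# run_workflow wf_load_ods
```

Autosys treats a non-zero exit code as a job failure, which is what lets the scheduler detect and alert on failed workflow runs.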

Client – NorthWestern Mutual, Milwaukee August 2007 – April 2008

Role: Data Retention Developer

PeopleSoft General Ledger holds the financial accounting of Northwestern Mutual. The core tables of PeopleSoft GL contain millions of records, a volume that was growing dramatically by the day. The historic data kept in the databases was no longer accessed and was causing performance issues. Data Retention was a re-engineering project in which I developed a new automated process for purging the historic data from the General Ledger system. I was also later involved in General Ledger production support of the components developed.

Responsibilities:

Identify the PeopleSoft General Ledger system tables for the data retention activities.

Analysis and understanding of the core business logic behind the tables identified for retention process.

Create a detailed design document on the Retention/purge process for the GL tables.

Analyze and understand the business requirements, and design the database for the migration of the data from the SQL database.

Develop the Sybase stored procedures containing the business logic, and the UNIX scripts for executing the procedures.

Create test cases to test the Retention/Purge process for the tables identified.

Execute the test cases and validate the results to make sure that the intended records were purged and all the business logic was applied when the data was purged.

Perform the testing in multiple test databases to ensure the smooth flow of the process.

Implementation activities of the components created and validate the results after the first run.

Production support of the components developed for the General Ledger purge.

Hardware: IBM PC. Software: Sybase database, UNIX scripts, Sybase stored procedures.
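A batched purge of the kind this project's stored procedures and UNIX scripts implemented can be sketched as below. isql is Sybase's actual command-line client and `set rowcount` is standard Transact-SQL, but the server, table, column, cutoff, and credential values are hypothetical placeholders:

```shell
#!/bin/sh
# Minimal sketch of a batched historic-data purge driven from a UNIX
# script via Sybase isql. Server, column, and cutoff names are
# hypothetical; the password default is a placeholder (in practice it
# would come from a protected file, not the script).

ISQL=${ISQL:-isql}
SERVER=${SERVER:-SYB_GL}          # hypothetical Sybase server name
BATCH=${BATCH:-5000}              # rows deleted per batch
CUTOFF=${CUTOFF:-Jan 1 2005}      # hypothetical retention cutoff date
GL_PASS=${GL_PASS:-changeme}      # placeholder credential

purge_history() {
    table=$1
    while : ; do
        # set rowcount caps the delete at one batch; select @@rowcount
        # reports how many rows that batch actually removed.
        raw=$("$ISQL" -S "$SERVER" -U gl_user -P "$GL_PASS" -b <<EOF
set nocount on
go
set rowcount $BATCH
delete from $table where posted_date < '$CUTOFF'
select @@rowcount
go
EOF
)
        rows=$(printf '%s' "$raw" | tr -cd '0-9')
        [ -z "$rows" ] && rows=0
        echo "purged $rows rows from $table"
        # a partial batch means no historic rows remain
        [ "$rows" -lt "$BATCH" ] && break
    done
}

# Example (commented out): purge_history PS_JRNL_LN  -- hypothetical table
```

Deleting in fixed-size batches keeps each transaction small, so the purge does not exhaust the transaction log or hold long locks on live GL tables.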

Client – PMI Investigation, California August 2006 – July 2007

Role: SSIS Developer

Investigation Data Migration is a complete restructuring of the current SQL databases. The Investigations MS Access application is being revamped into a new web-based application, and along with this a more reliable and stable database was designed in DB2. Investigation Data Migration is an end-to-end project that focuses mainly on migrating the existing data to the DB2 database, enabling the smooth running of the new web-based Investigations application. The project involves solution modelling, design, build, testing of the built code, and implementation of the solution. The project employs an iterative model to implement the solution, so it is delivered through multiple releases.

Responsibilities:

Identify the tables to be migrated to provide the data for the new web-based application.

Analyze and understand the business requirements, and design the database for the migration of the data from the SQL database.

Understand the existing SQL database structure and get the bad data corrected.

Create the SSIS data mappings to perform the data migration from the SQL database to the new database designed in DB2, as a one-time migration process.

Create test cases to ensure the Data integrity, data correctness and the record count of the data migrated.

Assist the QA team to perform the testing of the migrated data.

Performance tuning of the SSIS mappings to improve the throughput.

Production support, which includes correcting the migrated data according to the application's needs.

Hardware: IBM PC. Software: SSIS, DB2, SQL database.
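The record-count checks mentioned above can be sketched as a small reconciliation script. The two count helpers are hypothetical wrappers (in practice they would call the SQL Server and DB2 command-line clients with real connection details); the comparison logic is the point of the sketch:

```shell
#!/bin/sh
# Minimal sketch of a migration record-count reconciliation: compare the
# row count of a table in the SQL Server source with the DB2 target.
# SRC_SERVER, the client invocations, and the table names are
# hypothetical placeholders.

SRC_SERVER=${SRC_SERVER:-SQLSRV01}   # hypothetical source server name

count_source() {   # hypothetical wrapper around the SQL Server client
    osql -S "$SRC_SERVER" -E -h-1 -Q "SELECT COUNT(*) FROM $1"
}

count_target() {   # hypothetical wrapper around the DB2 command line
    db2 -x "SELECT COUNT(*) FROM $1"
}

check_table() {
    table=$1
    # strip everything but digits so banner/whitespace noise is ignored
    src=$(count_source "$table" | tr -cd '0-9')
    tgt=$(count_target "$table" | tr -cd '0-9')
    if [ "$src" = "$tgt" ]; then
        echo "OK: $table src=$src tgt=$tgt"
        return 0
    else
        echo "MISMATCH: $table src=$src tgt=$tgt" >&2
        return 1    # non-zero exit flags the table for investigation
    fi
}

# Example (commented out): check_table INVESTIGATIONS  -- hypothetical table
```

Running such a check per migrated table gives a quick data-correctness gate before handing the migrated data to QA.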

EDUCATION:

BE Electronics & Communication Engineering


