
Data SQL Server

Jersey City, New Jersey, United States
January 09, 2018





** ***** **, ****** ****, NJ 07310


7+ years of software development experience, including requirement analysis, design, development, testing, infrastructure migration, and release coordination for data warehousing applications.

Hands-on experience in all aspects of the Software Development Life Cycle (SDLC).

Strong knowledge of SQL, including data analysis and data validation.

Strong experience in extracting, transforming, and loading data from various sources into data warehouses using Informatica, UNIX, and Oracle, along with OLTP, OLAP, and data mining concepts.

Experience in Oracle 10g/11g, SQL and PL/SQL, Informatica, MS SQL Server, UNIX shell scripting, and Perl scripting.

Extensive experience in developing stored procedures, functions, views, triggers, SQL queries, and Oracle PL/SQL.

Experience in dimensional data modeling: Star Schema, Snowflake Schema, fact and dimension tables, physical and logical data modeling, and de-normalization.

Experience in data conversion of various data sources like SQL Server, Oracle, Fixed Width and Delimited Flat Files.

Hands-on experience building mappings with a variety of transformations, such as Lookup (connected and unconnected), Router, Aggregator, Sorter, Filter, Update Strategy, Normalizer, Sequence Generator, and Joiner.

Integration with SAP ECC and APO modules using SAP Transports and PowerExchange.

Expertise in the design and implementation of mapplets and Slowly Changing Dimensions (SCD Types 1 and 2).

Experience working with mapping variables/parameters and creating parameter files to enable flexible workflow runs based on changing variable values.

Used the Oracle EXPLAIN PLAN feature to identify query execution cost, tuned queries based on the optimizer plan, and created materialized views on dimension tables to improve query performance.

Experience in Scheduling, Failure Reporting, Audit Trails, Data Cleansing, backup and purge strategies and Performance Testing Benchmarks.

Experience in UNIX shell scripting and Perl scripting, SVN, and scheduling jobs through AutoSys.

Extensive knowledge of dimensional data modeling, star/snowflake schemas, and fact and dimension tables.

Experience in installing, upgrading, and configuring Informatica PowerCenter across multiple environments; upgraded the Informatica PowerCenter repository from 8.x to 9.0.1, along with AutoSys and Oracle 11g.

Expertise in implementing SFTP and FTP services to retrieve flat files from external sources.

Involved in designing mappings with multiple files, using the indirect file loading process.
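With indirect file loading, an Informatica session whose source filetype is set to "Indirect" reads a list file containing one flat-file path per line instead of a single source file. A minimal sketch of the kind of shell step that builds such a list is below; the directory and file names are hypothetical, and it creates its own sample files so it runs standalone.

```shell
#!/bin/sh
# Illustrative sketch of indirect file loading prep. An Informatica session
# configured with source filetype "Indirect" reads this list file and
# processes every flat file named in it. Names here are hypothetical.
SRC_DIR=$(mktemp -d)                       # stand-in for the inbound landing directory
LIST_FILE="$SRC_DIR/cust_filelist.txt"

# Simulate two flat files arriving from an upstream feed.
touch "$SRC_DIR/cust_east.dat" "$SRC_DIR/cust_west.dat"

# Build the indirect list: the full path of each matching file, one per line.
ls "$SRC_DIR"/cust_*.dat > "$LIST_FILE"

echo "files listed: $(wc -l < "$LIST_FILE" | tr -d ' ')"
```

In practice the session's source file name points at the list file, so new daily files are picked up just by regenerating the list before the workflow runs.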

Managed an offshore team on a daily basis, coordinating development activities and overseeing delivery.

Excellent interpersonal and communication skills; technically competent and results-oriented, with strong problem-solving skills and the ability to work effectively both as a team member and independently.

Developed effective working relationships with client team to understand support requirements, develop tactical and strategic plans to implement technology solutions, and effectively manage client expectations.

Assisted with the development and execution of detailed system and user acceptance test plans, test cases, and test scripts for manual and automated testing.


Languages : C, Perl, SQL, PL/SQL

Operating System : RHEL, AIX, Solaris

Scripting : UNIX Shell scripting

DBMS/RDBMS : Oracle 10g, 11g, MS SQL Server

Tools : Informatica PowerCenter, Toad, SQL Developer, AutoSys, PuTTY, HP ALM, Remedy




ENVIRONMENT: Informatica 9.x/10.x, Oracle 11g, UNIX shell scripting, Perl, AutoSys

MARRS (Argus Safety): Argus Safety is a complete pharmacovigilance software system designed to solve the pharmaceutical industry’s toughest regulatory challenges. It provides the most comprehensive global Adverse Event (AE) case data management and regulatory reporting in the pharmaceutical industry. Argus Safety Japan extends the single global database to the Japanese market, with localization and regulation support.

STAR: STAR stands for Surveillance and Trend Analysis Reporting system; it is also known as Empirica Signal. A safety data mart is maintained on the database server to provide data to an offline signal-management preparation (“signal prep”) process, which is run periodically to prepare data for Empirica Signal’s Signal Management module. This process uses Empirica Signal to generate statistics on the safety data and aggregates the statistics to detect potential signals (automated observations concerning the co-occurrence of drugs and adverse events) that may warrant further investigation by medical professionals.

MARRS to STAR ETL: Argus Safety (MARRS) is the main repository for Merck safety data. STAR contains a data mart that is sourced from the MARRS data. The data mart is periodically refreshed through an ETL process that transfers data from MARRS to STAR and timestamps it. The transformation takes place in several stages.

Environments: ETL, Informatica PowerCenter 8.x/9.x, Oracle 11g, PL/SQL, UNIX shell scripting, Perl, AutoSys, and HP ALM


Involved in requirements gathering, analysis, design, implementation, and deployment of applications using Informatica PowerCenter, Oracle, and UNIX shell scripting.

Prepared estimates based on complexity and object details; scheduled and planned resources and tracked tasks using the utilization tracker.

Responsible for development, support, and maintenance of ETL processes using Informatica PowerCenter 9.x and the Informatica client tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager, and Workflow Manager.

Integrated heterogeneous data sources such as Oracle, SQL Server, and flat files (fixed-width and delimited) into the staging area using Informatica mappings, workflows, and session objects.

Implemented mapplets and transformations responsible for validating and fine-tuning the ETL logic coded into mappings.

Extensively worked on different types of transformations, including expression, filter, aggregator, lookup, stored procedure, sequence generator, and joiner.

Extensively worked on performance tuning at all levels of the data warehouse: used lookup caches, gathered statistics, dropped indexes and re-created them after loading data to targets, and increased the commit interval.

Created database objects such as tables, views, procedures, triggers, and packages for implementing functionalities.

Integration with SAP ECC and APO modules using SAP Transports and PowerExchange.

Extracting data from various sources involving flat files and relational tables.

Involved in support and maintenance of Oracle Import, Export and SQL*Loader jobs.

Created shell scripts to schedule jobs through UNIX and the Informatica scheduler.
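Such scheduled shell scripts typically wrap the Informatica `pmcmd` command-line utility so the scheduler can launch a workflow and act on its exit code. The sketch below is illustrative only: the service, domain, folder, and workflow names are hypothetical, and it logs the command as a dry run rather than executing it (credentials would normally come from a secured file, not the script).

```shell
#!/bin/sh
# Illustrative wrapper of the kind scheduled through cron or AutoSys to kick
# off an Informatica workflow via pmcmd. All names below are hypothetical.
INFA_SERVICE="IS_DEV"
INFA_DOMAIN="Domain_DEV"
FOLDER="SALES_DW"
WORKFLOW="wf_load_sales_daily"

LOG_DIR=$(mktemp -d)                       # stand-in for the job log directory
LOG_FILE="$LOG_DIR/${WORKFLOW}_$(date +%Y%m%d).log"

# -wait blocks until the workflow completes, so the scheduler sees a
# meaningful exit status from pmcmd.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -f $FOLDER -wait $WORKFLOW"

# Dry run here: record the command with a timestamp instead of executing it.
echo "$(date '+%Y-%m-%d %H:%M:%S') launching: $CMD" >> "$LOG_FILE"
tail -1 "$LOG_FILE"
```

In a real deployment the script would execute `$CMD`, check `$?`, and notify on failure.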

Worked on the testing strategy for unit testing and system integration testing.

Responsible for developing new ad hoc change requests and executing data change requests.

Helped create a data-mapping best-practices document, including visual processes, and trained team members on the data-mapping process and tools.

Performed Data Analysis and Data validation by writing basic queries against the database.

Created test cases and participated in unit testing, system testing, and user acceptance testing, followed by regression testing, to verify the accuracy and completeness of the ETL process.

Involved in monitoring workflows and optimizing load times using session partitioning and concurrent workflow concepts.

Created SOPs and run books for the support team and scheduled meetings to explain the approach and troubleshooting techniques.

Worked with other tools such as WinSCP for transferring files between Windows and UNIX, and PuTTY for connecting to UNIX and running scripts.



IGSR: The IGSR application is a comprehensive international, multi-entity, multi-currency, multi-time-zone stock record. It offers a single source of real-time position and trade information.

IOWA: IOWA is a back-office trade-system application that acts as the main processing engine and is used globally to provide direct user access and trade instructions for trade settlements. IOWA serves two main purposes: settlement of trades and sharing of instructions/messages.

The IOWA system connects with many upstream (front-office) and downstream (back-office) applications.

Environments: ETL, Informatica PowerCenter 8.x, Oracle 11g, PL/SQL, UNIX shell scripting, Perl, and AutoSys


●Performed requirements gathering, analysis, design, implementation, testing, and deployment of applications using Informatica PowerCenter, Oracle, and UNIX shell scripting.

●Responsible for development, support, and maintenance of ETL processes using Informatica.

●Created database objects like Tables, Views, Procedures, Triggers and Packages for implementing functionalities.

●Worked on SQL*Loader to load data from flat files obtained from various facilities every day.
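A SQL*Loader run is driven by a control file that names the input file, the target table, and the field layout. The fragment below is a hypothetical example in that format (table and column names are illustrative, not taken from the project):

```
-- Hypothetical SQL*Loader control file (facility_feed.ctl).
LOAD DATA
INFILE '/data/inbound/facility_feed.dat'
APPEND
INTO TABLE stg_facility_feed
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  facility_id,
  feed_date    DATE "YYYY-MM-DD",
  record_count INTEGER EXTERNAL
)
```

It would be invoked along the lines of `sqlldr userid=user/pass control=facility_feed.ctl log=facility_feed.log`, with the log and bad files checked afterward for rejected rows.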

●Created new mappings and modified existing mappings in Informatica based on requirements.

●Worked on production deployment and change-plan documents.

●Worked on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, sequence generator and joiner.

●Created and modified several UNIX shell scripts based on the needs of the project and client requirements.

●Created a customized script to monitor a file from an upstream application and process it.
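A common shape for such a monitor is a polling loop that waits for the upstream file to appear and for its size to stop changing (so a half-transferred file is not processed). The sketch below is illustrative: the path, intervals, and the "processing" step are hypothetical, and it simulates the upstream drop itself so it runs end to end.

```shell
#!/bin/sh
# Illustrative sketch of an upstream-file monitor: poll for a trigger file,
# wait until its size is stable (transfer complete), then mark it processed.
# Paths, retry counts, and intervals are hypothetical.
WATCH_DIR=$(mktemp -d)
TRIGGER="$WATCH_DIR/upstream_feed.dat"

# Simulate the upstream application dropping the file.
echo "sample record" > "$TRIGGER"

RETRIES=5
while [ $RETRIES -gt 0 ]; do
    if [ -f "$TRIGGER" ]; then
        size1=$(wc -c < "$TRIGGER")
        sleep 1
        size2=$(wc -c < "$TRIGGER")
        # Size unchanged across the interval: assume the write has finished.
        if [ "$size1" = "$size2" ]; then
            STATUS="processed"       # real script would kick off the load here
            break
        fi
    fi
    RETRIES=$((RETRIES - 1))
    sleep 1
done
echo "file status: ${STATUS:-missed}"
```

In production the "processed" branch would launch the downstream job and the script itself would run under cron or AutoSys.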

●Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.

●Experience with crontab scheduling, awk scripts, and the getopts command.
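A small sketch combining getopts and awk, the kind of utility pattern referred to above: parse a delimiter and column option, then extract that column from delimited data. The option letters and sample data are hypothetical; `set --` supplies sample arguments so the sketch is self-contained.

```shell
#!/bin/sh
# Illustrative sketch: getopts parses -d (delimiter) and -c (column number),
# then awk extracts the requested column. Sample arguments and data below
# are hypothetical.
DELIM=","
COL=1

# Supply sample arguments so the sketch runs standalone.
set -- -d "|" -c 2
while getopts "d:c:" opt; do
    case "$opt" in
        d) DELIM=$OPTARG ;;
        c) COL=$OPTARG ;;
    esac
done

# Use the parsed options to pull one column out of a delimited record.
RESULT=$(printf '%s\n' "alpha|beta|gamma" | awk -F"$DELIM" -v c="$COL" '{print $c}')
echo "column $COL: $RESULT"
```

The same pattern scales to cron-driven utilities: the crontab entry passes the options, and getopts keeps the script's interface explicit.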

●Experience with AutoSys for job scheduling and monitoring.

●Collaborated with the QA team to ensure adequate testing of software both before and after completion, maintained quality procedures, and ensured appropriate documentation is in place.



Cloverleaf® Integration Services - Quovadx/Cloverleaf is an interface engine used primarily in the healthcare industry for interaction between different applications. The Cloverleaf® Integration Services runtime and delivery environment includes extensive user support and diagnostic tools.

EDCTM (Electronic Data Capture Trial Management) - EDCTM is an EDC trial-management application designed to manage information about protocols, centers, people, assets, and contact details of the trial centers, as well as shipment of equipment to the study center and equipment tracking. EDCTM is also used for creating and managing OC RDC and IMPALA application accounts.

Environments: ETL, Informatica PowerCenter 8.x, Oracle 11g, PL/SQL, UNIX shell scripting, Perl, and AutoSys


●Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.

●Extensively used Informatica power center for extraction, transformation and loading process.

●Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data warehouse.

●Used various transformations like Lookup, Filter, Expression, Sequence Generator and Router to develop robust mappings in the Informatica Designer.

●Created database objects like Tables, Views, Procedures, Triggers and Packages for implementing functionalities.

●Monitored AutoSys and cron batch jobs; provided application support and maintenance.

●Worked on optimization of batch cycle job duration.

●Extracted data from Oracle, CSV, and flat files.

●Created a Perl script to generate Phase 1 management (clinical) data.

●Prepared customized utilities for applications using UNIX shell scripting and Perl.

●Interacted constantly with the on-site team and designers to discuss requirements.

●Worked on Unit testing and System Integration Testing

●Set up definitions and processes for test phases including Product test, Integration test, System test and User Acceptance Test (UAT)


Bachelor of Computer Application (2005-2008), Alagappa University, Chennai

Diploma in Information Technology (2004-2007), NTTF, Chennai
