
United States
August 11, 2017




ETL Developer

Email Id:

Phone No: 508-***-****


6+ years of ETL Developer experience in the design, development, implementation, support, and maintenance of Data Warehouses and Data Marts using Informatica.

Extensive experience in data warehousing technologies using Informatica Power Center 6.x/7.x/8.1.1 and strong expertise in Oracle 7.x/8.x/9i/10g and SQL Server.

Extensive experience in full life cycle project design and development.

Knowledge of Data Warehousing concepts and dimensional modeling techniques such as Star and Snowflake schemas; worked with technologies including Oracle and Oracle Data Integrator.

Hands-on working experience in Dimensional Data Modeling, Data Cleansing, Standardization and Migration, and Data Staging of operational sources using ETL processes for data warehouses.

Excellent working experience in Data Warehouse applications, directly responsible for the Extraction, Transformation & Loading of data from multiple sources into Data Warehouse.

Experience in implementing complex business rules by creating re-usable transformations (Expression, Aggregate, Lookup, Router, Rank, Update Strategy) using Power center.

Experience in developing ELT and ETL mappings in Oracle Data Integrator (ODI).

Knowledge in Electronic Health Records (EHR) systems and Healthcare Information Exchanges.

Experienced in using the IDQ tool for profiling, applying rules, and developing mappings to move data from source to target systems.

Good understanding in database and data warehousing concepts (OLTP & OLAP).

Extracted data from Oracle and SQL Server then used Teradata for data warehousing.

Hands-on experience in tuning mappings and in identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, sessions, and the database.

Extensive experience with data modeling techniques, logical and physical database design.

Experience in Extracting, Transforming and Loading (ETL) data from Excel, Flat file, Oracle to MS SQL Server.

Experience with all phases of the software development life cycle (SDLC) under both Waterfall and Agile methodologies, including batch processing.

Used SSIS to create ETL packages to validate, extract, transform and load data to data warehouse and data marts.

Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using Multidimensional Models (Kimball and Inmon) and Star and Snowflake schema designs addressing Slowly Changing Dimensions (SCDs).

Built Extraction, Transformation and Load (ETL) processes using SAS DI Studio to construct a data warehouse and OLAP cubes for reporting, and for various data integration projects building data marts from enterprise data sources, fully integrating the SAS Business Intelligence suite.

Performed database and log backups, and scheduled backups by creating and deploying SSIS packages.

Modified and maintained SQL Server stored procedures, views, ad-hoc queries, and SSIS packages.

Experienced in importing and exporting Informatica code via XML files; experience in SAS ETL development.

Knowledge of SAS Information Map Studio, SAS/ACCESS, and SAS Enterprise Guide.

Extensively worked on database objects like tables, indexes, views.

Proficient in using Informatica Workflow Manager to create and schedule workflows.

Well acquainted with Performance Tuning of sources, targets, mappings and sessions to overcome the bottlenecks in transformation and Loading.

Highly proficient in SQL, PL/SQL, Stored Procedures, Triggers.

Expertise in UNIX Shell Scripts.

Knowledge of archiving and purging the databases.

Gained working knowledge of reporting tools such as Siebel Analytics, Business Objects, and Cognos.

Excellent Analytical, Communication and Team Working Skills.


Data Warehousing

Informatica Power Center 8.6.1/8.1.1/7.1.1/7.0/6.2, Informatica Power Exchange, Informatica Data Quality Suite.


Databases

Oracle 11g/10g/9i, DB2, SQL Server 2008 R2/2012, MS Access, and Teradata.


Programming Languages

Transact-SQL, PL/SQL, HTML, UNIX shell scripting, C, Java.

Dimensional Data Modeling

Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, ER Diagrams.

SQL Server Tools

SQL Server Management Studio, SQL Server Query Analyzer, SQL Server Mail Service, SQL Server Profiler, PuTTY.

Web Technologies

MS FrontPage, MS Outlook Express, FTP, TCP/IP, LAN, PHP.

Other Tools

Microsoft Office, MS Visio, Visual Basic 6, Perl

Operating Systems

Linux, Unix, MS-DOS, Sun Solaris.


Client: BBVA COMPASS, Chicago, IL April 2016 to Present

Role: Sr. ETL Developer


Designed, developed, and documented the ETL (Extract, Transform & Load) strategy to populate the Data Warehouse from the various source systems.

Involved in design and development of ETL processes to support the Contract Sphere Application.

Created different source definitions to extract data from Flat Files and Relational Tables for Power Center.

Used the ODI (Oracle Data Integrator) ELT tool to create the data warehouse OLAP model.

Used ODI tools such as OdiFileMove, OdiFileAppend, and OdiFileCopy.

Resolved ODI related issues faced by the team members.

Experience in ODI designer, Topology, Operator and ODI Tools.

Worked with Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ) using transformations such as Parser, Labeler, and Address Doctor.

Experience with the Informatica IDQ application for profiling data, cleansing, and managing data quality rules.

Assist infrastructure personnel to define and document the technical architecture that includes ODI, DRM.

Worked on Slowly Changing Dimensions (SCD) Type 1, 2 and 3 to keep track of historical data.
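
The expire-and-insert pattern behind SCD Type 2 can be sketched in portable SQL; the snippet below uses Python's sqlite3 as a stand-in for the warehouse database, and the table and column names (dim_customer, customer_sk, current_flag) are hypothetical, not taken from any client project.

```python
import sqlite3

# Hypothetical customer dimension; names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (
    customer_sk   INTEGER PRIMARY KEY,   -- surrogate key
    customer_id   TEXT,                  -- natural key from the source
    city          TEXT,
    eff_date      TEXT,
    end_date      TEXT,                  -- NULL while the row is current
    current_flag  INTEGER
);
INSERT INTO dim_customer (customer_id, city, eff_date, end_date, current_flag)
VALUES ('C100', 'Austin', '2016-01-01', NULL, 1);
""")

def apply_scd2(con, customer_id, new_city, load_date):
    """Expire the current row and insert a new version (SCD Type 2)."""
    cur = con.execute(
        "SELECT customer_sk, city FROM dim_customer "
        "WHERE customer_id = ? AND current_flag = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] != new_city:           # attribute actually changed
        con.execute(
            "UPDATE dim_customer SET end_date = ?, current_flag = 0 "
            "WHERE customer_sk = ?", (load_date, row[0]))
        con.execute(
            "INSERT INTO dim_customer "
            "(customer_id, city, eff_date, end_date, current_flag) "
            "VALUES (?, ?, ?, NULL, 1)", (customer_id, new_city, load_date))
    con.commit()

apply_scd2(con, "C100", "Dallas", "2016-06-15")
rows = con.execute(
    "SELECT city, current_flag FROM dim_customer ORDER BY customer_sk").fetchall()
print(rows)  # both versions kept; only the latest is flagged current
```

Type 1 would instead overwrite the city in place, and Type 3 would keep the prior value in a dedicated previous-value column on the same row.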

Set up sequential and concurrent batches and sessions to schedule loads at the required frequency using the Power Center Workflow Manager.

Integrated ODI mappings and translations with Oracle Data Relationship Management (DRM) web services to support meta-data management approach to maintaining account and entity hierarchies.

Involved in design and development of various interfaces using Informatica to load the data from SAP to Contract sphere and contract sphere to SAP based on the business requirements.

Experience with Slowly Changing Dimensions (SCD) for tracking changes.

Created jobs using various transformations to load data from vendor sources into the target data marts using SAS DI Studio.

Involved in writing and executing the test cases for system integration testing.

Documented the test results and obtained business approvals for the same.

Responsible for code promotion activities.

Developed reusable Mapplets and Transformations.

Extensively used Dynamic Lookups for incremental loads.
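
A common pattern underlying incremental loads is watermark-based extraction: only rows modified since the last successful load are pulled. A minimal sketch under assumed table names (src_orders, etl_watermark), again using sqlite3 purely for illustration:

```python
import sqlite3

# Hypothetical source table and watermark control table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL, updated_at TEXT);
CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_loaded TEXT);
INSERT INTO src_orders VALUES
  (1, 10.0, '2016-05-01'), (2, 20.0, '2016-05-10'), (3, 30.0, '2016-05-20');
INSERT INTO etl_watermark VALUES ('src_orders', '2016-05-05');
""")

def incremental_extract(con, table):
    """Return only rows changed since the last load, then advance the watermark."""
    wm = con.execute("SELECT last_loaded FROM etl_watermark WHERE table_name = ?",
                     (table,)).fetchone()[0]
    rows = con.execute(
        "SELECT order_id, amount, updated_at FROM src_orders "
        "WHERE updated_at > ? ORDER BY order_id", (wm,)).fetchall()
    if rows:
        # Advance to the newest change just picked up.
        con.execute("UPDATE etl_watermark SET last_loaded = ? WHERE table_name = ?",
                    (max(r[2] for r in rows), table))
        con.commit()
    return rows

changed = incremental_extract(con, "src_orders")
print(changed)  # only the orders updated after the stored watermark
```

In Informatica terms, a dynamic lookup cache plays a complementary role on the load side, deciding insert vs. update for each incoming row against the target.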

Involved in error checking and testing of ETL procedures and programs using Informatica session log.

Provided best Technical and Process Oriented Solutions for the Requirements.

Involved in creating the knowledge transfer documents for the client.

Involved in scheduling the Informatica jobs in control M and communicating the same to Revitas team to schedule respective Revitas jobs.

Involved in the data verification activities through front end Revitas application such as verifying the settlements are properly settled.

Environment: Informatica Power Center, Oracle 11g, SAS DI Studio 3.3, Erwin 4.0, MicroStrategy, SQL, PL/SQL, Revitas Contract Sphere, Linux.

Client: T Rowe Price, Baltimore, MD Oct 2015 to Mar 2016

Role: Sr. ETL Developer


Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.

Interpreted logical and physical data models for Business users to determine common data definitions.

Hands-on experience in developing QlikView dashboards using Chart Box (drill down, drill up & cyclic grouping), List Box, Input Field, Table Box, Calendar, etc. Created SDLC documentation for existing applications.

Developed the core ETL framework supporting the rollout of Model N Global Pricing.

Produced comprehensive documentation, including ETL architecture, design, logical and physical data models, data auditing, error handling, and reconciliation reports.

Used the Model N ERP tool with Informatica to create a data mart.

Installation, Configuration and Deployment of Informatica with Informatica GRID option.

Designed and developed various Informatica mappings using transformations such as Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy, Java, and XML. Developed various ad-hoc mappings for business needs.

Designed, tested, and deployed plans using Informatica Data Quality Suite 8.5 (IDQ).

Developed and tested all the backend programs, Error Handling Strategies and update processes.

Worked on the Normalizer transformation for normalizing XML source data.

Extensively used XML transformation to generate target XML files.

Performed load-balance tests by tuning SAS DI Studio jobs.

Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.

Developed Scripts to automate the Data Load processes to target Data warehouse.

Worked extensively with Oracle Data Integrator (ODI) and Oracle Warehouse Builder (OWB).

Responsible for tuning ETL procedures and schemas to optimize load and query performance.

Involved with the Architecture group to develop ETL metadata strategies and Informatica objects reuse policies. Developed reusable Informatica Mapplets and Transformations.

Developed ETL technical specs, Visio for ETL process flow and ETL load plan, ETL execution plan, Test cases, Test scripts etc.

Experience in implementing complex business rules by creating re-usable transformations (Expression, Aggregate, Lookup, Router, Rank, Update Strategy) using Power center.

Involved in the analysis, design, development, and testing phases of the Software Development Life Cycle (SDLC).

Involved in production support activities with Installation and Configuration of Informatica Power Center 8.6.

Analyzed, Designed and Implemented the ETL architecture and generated OLAP reports.

Used Informatica Workflow Monitor to monitor and control jobs.

Developed stored procedures using PL/SQL and driving scripts using Unix Shell Scripts.

Worked on BTEQ and multi load scripts to load Teradata tables.

Worked extensively with Teradata utilities (MultiLoad, TPump, and FastLoad) to load data.

Wrote stored procedures, functions, and database triggers. Created database triggers on tables to generate surrogate keys.
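
Surrogate-key triggers on Oracle are typically BEFORE INSERT triggers reading a sequence; the sketch below mimics that idea in SQLite (via Python) as a portable stand-in, since SQLite lacks sequences and cannot modify NEW in a BEFORE trigger. Table and trigger names are hypothetical.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (
    product_sk   INTEGER,   -- surrogate key, filled in by the trigger
    product_code TEXT       -- natural key from the source
);
-- Assign the next surrogate key whenever a row arrives without one.
CREATE TRIGGER trg_product_sk AFTER INSERT ON dim_product
WHEN NEW.product_sk IS NULL
BEGIN
    UPDATE dim_product
       SET product_sk = (SELECT IFNULL(MAX(product_sk), 0) + 1 FROM dim_product)
     WHERE rowid = NEW.rowid;
END;
""")

con.execute("INSERT INTO dim_product (product_code) VALUES ('A')")
con.execute("INSERT INTO dim_product (product_code) VALUES ('B')")
rows = con.execute(
    "SELECT product_sk, product_code FROM dim_product ORDER BY product_sk"
).fetchall()
print(rows)  # each row received the next surrogate key
```

On Oracle the MAX()+1 lookup would be replaced by sequence_name.NEXTVAL, which avoids contention under concurrent inserts.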

Assisted in documenting business requirements for data warehousing and integration needs.

Environment: Informatica Power Center, Informatica Data Quality Suite 8.5 (IDQ), Teradata V2R6, Oracle 11g, DB2, TOAD, Erwin 3.5.2, PL/SQL, SAS DI Studio 3.3, WINCVS, Windows XP, UNIX, Sun Solaris

Client: Citi Bank, Newark, NJ Oct 2014 - Aug 2015

Role: Informatica ETL Developer


Generated the data per the client's specifications using mappings and transformations.

Configured Power Exchange to capture the changes to be processed on the source AS400 (DB2) & Oracle.

Configured Real Time Power Exchange mappings to replicate source data into Stage in Teradata, Oracle and SQL Server.

Interacted with Informatica Global Support about the issues that were raised as part of our daily CDC jobs.

Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.

Involved in Debugging and Performance Tuning at various levels from sources to the mappings and to the Targets, session level and at workflow levels.

Involved in a POC study and implementation to migrate CDC from Teradata to SQL Server.

Involved in building data marts from enterprise data sources spread across platforms using SAS DI Studio.

Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.

Experience in Software Development Life Cycle (SDLC).

Built ETL Code to maintain history for service orders that have been posted and to capture the changes from service order origination to its final status i.e. billing and product level changes at WTN level.

Involved in running and scheduling UNIX Shell scripts, Informatica jobs.

Met the SLA at all times and improved the performance of loading data from key data sources.

Maintained facts and Dimensions using Star Schema and Snowflake Schema.

Prepared documentation on all aspects of ETL processes, definitions, and mappings.

Coordinated with all technical resources to deliver quality solutions in a timely manner, in line with overall project objectives.

Environment: Informatica Power Center 9.5/9.1, Informatica Power Exchange 9.5/9.1, Teradata v14/v13, Flat files, Oracle 11, CDC, DB2, AS400, SQL Server 2008/2012.

Client: Vision Tech, Hyderabad, India. Jun 2013 – May 2014

Role: ETL Developer


Worked on standardization, such as updating existing workflows with parameters, new rules, and methods.

Studied business requirements and ETL requirements documents.

Developed mappings in Informatica Power center designer.

Loaded flat files and VSAM files into Informatica sources.

Designed and developed stored procedures, configuration files, tables, views, and functions.

Handled large table loads and reporting from weblog files using ad-hoc SQL queries.

Performed migration and data transformation from existing database to the new implemented database structure using Informatica.

Created transformations such as Sequence Generator, Lookup, Joiner, and Source Qualifier in Informatica Designer.

Used many transformations such as Expression, Lookup, Normalizer, Update Strategy, Joiner, etc.

Developed workflows and sessions in Informatica Workflow manager.

Extensively used session level parameters to represent various values.

Loaded parameter values in various CTRL (control) tables, which dictate what a specific workflow should do.

Experience in the Software Development Life Cycle (SDLC) [including system study, analysis, design, development, unit and integration testing, acceptance testing, and implementation] under Agile and Waterfall methodologies.

Wrote SQL control scripts to create parameters and enter various values in the CTRL tables, which monitor the workflow.
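
The control-table pattern described above can be sketched briefly: parameter rows in a CTRL table drive what a workflow run does. Table and parameter names below (ctrl_workflow, FULL_REFRESH, etc.) are invented for illustration, with sqlite3 standing in for the workflow repository database.

```python
import sqlite3

# Hypothetical control table holding per-workflow parameters.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE ctrl_workflow (wf_name TEXT, param_name TEXT, param_value TEXT);
INSERT INTO ctrl_workflow VALUES
  ('wf_daily_load', 'SOURCE_SCHEMA', 'STG'),
  ('wf_daily_load', 'TARGET_TABLE',  'FACT_SALES'),
  ('wf_daily_load', 'FULL_REFRESH',  'N');
""")

def load_params(con, wf_name):
    """Read the control-table parameters that drive one workflow run."""
    return dict(con.execute(
        "SELECT param_name, param_value FROM ctrl_workflow WHERE wf_name = ?",
        (wf_name,)).fetchall())

params = load_params(con, "wf_daily_load")
# The workflow branches on the parameter values it just read.
mode = "truncate-and-load" if params["FULL_REFRESH"] == "Y" else "incremental"
print(params["TARGET_TABLE"], mode)
```

Changing a row in the CTRL table (e.g. flipping FULL_REFRESH to 'Y') redirects the next run without touching or redeploying the mapping itself, which is the main appeal of the approach.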

Used SQL developer to write and execute SQL queries.

Used Informatica Workflow Monitor to monitor the running workflows, and check session log files to monitor sessions.

Used Informatica Repository manager to migrate workflows, sessions, and mappings from Test environment to QA.

Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.

Perform the Load balance tests by tuning the SAS DI studio jobs.

Used Harvest/CA CSM Workbench tool to migrate sources, targets, mappings, reusable transformations, sessions, workflows from QA environment to Production.

Used pushdown optimization (to source) in almost all the sessions.

Used Parameters in Pre-SQL, Post-SQL, Source Filter, Target-schema, Source-schema, source table name, target table name, SQL override etc.

Interacted with business analysts for clarification of the code and attended meeting to resolve many issues.

Communicated the success or failure of assigned tasks via email.

Used FileZilla to look up files and navigate through the Unix environment.

Environment: Informatica Power Center, Oracle, and MS Office.

Client: Vysya Bank, Hyderabad, India. Jun 2012 - May 2013

Role: SSRS Developer


Reviewed, evaluated, designed, implemented and maintained company databases.

Monitored and tuned database resources and activities for SQL Server databases.

Performed and automated SQL Server version upgrades and patch installs.

Designed, developed and maintained relational databases.

Developed, implemented and maintained enterprise business information systems.

Developed SSRS reports and configured SSRS subscriptions per specifications provided by internal and external clients.

Worked on support tickets, maintenance tasks, and assigned projects at the discretion of the Reporting Coordinator.

Maintained existing Crystal Reports against the legacy system and rewrote these reports in SSRS against the new data warehouse.

Worked as a SQL Developer responsible for creating SQL stored procedures, functions and triggers.

Provided support and enhancement of national rental property listing database applications

Performed production system software installations

Created and scheduled SQL Agent jobs and maintenance plans

Gathered software requirements from clients and end users

Derived and documented process application business rules

Modified and maintained SQL Server stored procedures, views, ad-hoc queries, and SSIS packages used in the search engine optimization (SEO) process.

Developed and Optimized Stored Procedures and Functions using T-SQL

Participated in several projects and interacted with peers.

Gathered business requirements and converted them into SQL stored procedures for database-specific projects.

Environment: SQL Server 2000/2005/2008, Integration Services (SSIS), Reporting Services (SSRS), Analysis Services (SSAS), DTS, C#, .NET, T-SQL, Windows 95/NT/2000/2003, XML, MS Excel, MS Access, MS Visual Studio.

Education: Bachelor of Technology, GITAM University.
