Chandana
Email Id: ********.********@*****.***
Mobile: 615-***-****
Deer Park, NY
Career Objective:
Aiming for an innovative, challenging, growth-oriented and responsible career in the field of Business Intelligence.
Professional Experience and Summary
Over 8 years of experience in the Information Technology field, with strong emphasis on Business Intelligence, ETL solutions and Data Warehousing.
Worked in an onsite/offshore model for around 6 years
Worked as an Associate Lead for around 4 years
Extensive experience in the Healthcare, Insurance, Banking and Financial Services domains.
Excellent experience in Informatica PowerCenter 9.x/8.x, Teradata, Oracle, ESP and UNIX scripting
Exposure to working with Informatica Data Quality (IDQ)
Actively participated in the upgrade from Informatica PowerCenter 9.6 to Informatica PowerCenter 10.1
Profound knowledge of complete life-cycle Data Warehouse development: Data Analysis, Design, Development, Implementation and Testing using Data Extraction, Transformation and Loading (ETL)
Experience in all aspects of analytics/data warehousing solutions (database issues, data modelling, data mapping, ETL development, metadata management, data migration and reporting solutions)
Design and development of various ETL components (Informatica mappings, UNIX shell scripts, SQL scripts, etc.), scheduling techniques and innovative add-on components
Expertise in design and implementation of Slowly Changing Dimensions (SCD) Types 1, 2 and 3
Expertise in Master Data Management (MDM) concepts and methodologies, and the ability to apply this knowledge in building MDM solutions
Expertise in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages
Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, Match rules, Merge properties and Batch Group creation.
Knowledge of implementing hierarchies, relationship types, packages and profiles for hierarchy management in MDM Hub implementations
Worked on creation of Tables, Views, Constraints and normalization techniques.
Effectively made use of Table Functions, Indexes, Analytical Functions and Query Rewrite.
Exposure to database objects such as Stored Procedures, Cursors, Functions and Triggers using SQL and PL/SQL
Experience with Performance Tuning for Oracle RDBMS using Explain Plan and HINTS.
Improved the performance of the application by rewriting the SQL queries.
Coordinated with Business Analysts and Subject Matter Experts to understand the scope of work and the business
Experience in coordinating with cross-functional teams and in project management.
Analyzed source data and raised any concerns with the appropriate Subject Matter Expert via the Business Analyst
Experience preparing mapping documents for PowerCenter mapping design and development
Experience documenting Detailed Technical Designs, migration processes and test cases
Gathered requirements for Change Requests and various other modules and sub-modules internal to the project
Worked with different scheduling tools like ESP and Control-M
Attended Scrum calls and other client meetings to report team status, obtain necessary clarifications, and present and discuss solution approaches for business cases
Performed quality assurance reviews, including internal and external quality analysis, before any deliverable
Worked extensively on performance improvement of ETL components.
Worked on performance tuning of SQL and UNIX scripts.
Good analytical, reasoning and problem-solving skills
Good Communication and presentation skills
Exposure to PowerShell scripting
Guided and trained new recruits on ETL concepts, PowerCenter, UNIX, database concepts and scheduling tools, and mentored them across various projects and phases.
Responsible for team members' work assignments and tracking
Took part in code reviews of PowerCenter Designer and Workflow Manager objects, shell scripts, SQL queries and scheduling designs
Certifications:
Certified in ITIL V3 (2011) Foundation of Service Management
Certified in the ETL tool Informatica PowerCenter 8.x
Teradata V12 Certified Professional
Academics:
Bachelor of Technology in Computer Science and Information Technology – India
Technical Skills:
Operating System : Windows NT/2000/XP, Vista, Windows 7/10, Linux
Software Skills : Informatica PowerCenter 10.1/9.x/8.6, Informatica Developer 9.1,
Informatica MDM Hub 8.6
Tools Used : ESP Workstation, Control-M, WinSCP, PuTTY, SQL Developer,
Toad, SQL Squirrel, UltraEdit, Beyond Compare, TextPad, Visio,
MS Office, WinMerge, Rational Rose, PowerShell
Languages : C, Java, C++, XML
Databases : Teradata, Oracle 11g/10g/9i, MySQL, DB2 UDB 9.x, MS SQL Server 2000.
Professional Experience:
Key Assignments:
CVS Healthcare, Rhode Island, USA July 10, 2017 – Current
ETL Developer
Enterprise Data Warehouse (EDW)
LTC pharmacy locations need a perpetual inventory system to track and optimize inventory. The project implements an automated perpetual inventory system at all LTC pharmacy locations to reduce inventory carrying costs, inefficient labor practices, returns/waste and third-party costs, to streamline replenishment processes, and to optimize inventory days of supply by NDC and package type.
Responsibilities:
Design & Development of various ETL components
Performed ETL code reviews and Migration of ETL Objects across repositories.
Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs in production.
Analyzed and resolved defects
Analyzed existing complex data flow of multiple layers to find impact because of changes in source data
Prepared unit test plans, executed them on all developed components and prepared unit test reports as part of the quality process
Analyzed and resolved system/production failures and data quality defects.
Created and modified various UNIX shell scripts and control scripts in Korn shell (ksh).
Worked extensively on UNIX with SFTP connections for fetching files and for extracting data from databases
Used UNIX commands to handle success/failure checks for file existence, balance-and-control logic, and team email notifications about job run status (sketched below).
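
A minimal Korn shell sketch of this kind of file-existence and balance-and-control check with mail notification; the file paths, control-file layout and distribution list below are hypothetical placeholders, not the actual project components:

#!/bin/ksh
# Sketch: verify that an expected feed file arrived, check its record count
# against a control file, and notify the team by mail. All names are
# hypothetical placeholders.
FEED_FILE=/data/inbound/ltc_inventory.dat
CTL_FILE=/data/inbound/ltc_inventory.ctl      # holds the expected row count
DIST_LIST="etl_support@example.com"

if [ ! -s "$FEED_FILE" ]; then
    echo "Feed file missing or empty: $FEED_FILE" | mailx -s "LTC load FAILED" "$DIST_LIST"
    exit 1
fi

actual_cnt=$(wc -l < "$FEED_FILE")
expected_cnt=$(cat "$CTL_FILE")

if [ "$actual_cnt" -ne "$expected_cnt" ]; then
    echo "Count mismatch: expected $expected_cnt, got $actual_cnt" | mailx -s "LTC load FAILED" "$DIST_LIST"
    exit 1
fi

echo "Feed validated: $actual_cnt records" | mailx -s "LTC load OK" "$DIST_LIST"
exit 0
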
Environment: Informatica PowerCenter 9.6, UNIX, Oracle, Control-M, Tortoise SVN
Athene USA, West Des Moines, USA Mar 2014 – Apr 2017
Senior BI/ETL Developer
Enterprise Data Warehouse (EDW)
The Enterprise Data Warehouse (formerly FDS – Financial Data Store) is designed to be a database that integrates and aggregates many disparate sources of data across Athene USA into a consistent, consolidated view with extensive history available. EDW harmonizes the differences between data sourced from different systems into a view that allows those who use the data to know only the data structure of EDW, not the particular structure of each independent system. The EDW joins data from many different business functions that do not directly exchange data, to provide correlation of that data and the impacts of activity and performance across the enterprise. As a result, EDW speeds delivery of reporting based on very broad aggregate data and historical trends. The EDW is not meant to support transactions; it is intended to support analysis of performance, management reporting for decision making, and trend analysis over time. Business actions and decisions can be informed or triggered based on analysis and predictive/data-correlation models developed on top of the EDW. EDW retains rich historical data and serves as a central repository of robust, broad-scoped data.
Responsibilities:
Coordinated between the offshore team and the Client
This coordination involved obtaining clarifications from the client for the offshore team
Acted as an interface between the client and the offshore team for requirements and clarifications that arose during code development
Design & Development of various ETL components
Implemented the developed components across various environments such as Integration, System, Production Support and Production
Implemented Teradata Utilities such as MultiLoad and Teradata Parallel Transporter connections for loading data to enhance runtime and performance
Involved in the creation of new objects (Tables/Views, Indexes) in Teradata
Performed ETL code reviews and Migration of ETL Objects across repositories.
Developed ETLs for masking data
Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production in order to meet the SLAs
Coordinated and interacted with other teams to maintain integrity in performing the tasks and execution of the loads appropriately in a timely manner.
Analyzed and resolved defects
Prepared unit test plans, executed them on all developed components and prepared unit test reports as part of the quality process
Provided downstream impact analysis
Involved in deployment management activities and performed final checks on all deliverables to meet the client's quality standards.
Attended SIT and UAT defects discussion meetings as part of testing support in higher environments.
Analyzed and resolved system/production failures and data quality defects.
Shared weekly status reports with the customer and with TCS managers.
Followed up with business users on any functional clarifications needed by the offshore team
Handled the Incident/Problem and Service requests
Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
Used pre- and post-session variable assignments to pass variable values from one session to another.
Developed mapping parameters and variables to support SQL overrides.
Identified problems in existing production data and developed one-time scripts to correct them
Created indexes on tables to improve performance by eliminating full table scans, and created views to hide the underlying tables and reduce the complexity of large queries.
Worked on creating temporary tables and cloning tables
Experience with sequences and with handling duplicates (a de-duplication sketch follows this list)
Used Informatica Developer 9.1 for an email validation task
Participated in the upgrade of Informatica PowerCenter 9.6 to Informatica PowerCenter 10.1
Created and administered Oracle 10.2.0.3 databases for PowerCenter repositories
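
A minimal sketch of the duplicate handling mentioned above, using BTEQ from a shell script and keeping the latest row per business key with QUALIFY ROW_NUMBER(); the logon string, database and table/column names are hypothetical placeholders:

#!/bin/ksh
# Sketch: de-duplicate a raw Teradata table into a clean staging table,
# keeping the most recent record per policy_id. All names are hypothetical.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

INSERT INTO edw_stg.policy_dedup
SELECT policy_id, customer_id, load_dt
FROM   edw_stg.policy_raw
QUALIFY ROW_NUMBER() OVER (PARTITION BY policy_id ORDER BY load_dt DESC) = 1;

.LOGOFF;
.QUIT;
EOF
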
Environment: Informatica PowerCenter 9.6, Informatica Developer 9.6, Teradata, Oracle, ESP, Control-M, UNIX
Aviva USA, Iowa Feb 2013 – Feb 2014
Senior BI/ETL Developer
Customer Retention Project (CRP)
The project comprised making use of the client's call centre information to derive reports for improving customer retention. Call frequencies and relevant data were transferred to the data warehouse in order to generate reports for performance management and customer retention.
Responsibilities:
Converted business requirements into high-level and low-level designs
Worked closely with the Business analysts and attended meetings to gather and discuss the requirements for various projects.
Prepared various documents like ETL Specification doc, Technical Design doc, Run Book etc.
Documented Mapping and Transformation details, user requirements, implementation plan and schedule
Involved in writing BTEQ scripts for creating staging tables & inserting test data as per requirement
Designed ETL components and Scheduling design and techniques
Development of various ETL components, Scheduling components
Worked on complex mappings, mapplets and workflows to meet business needs, and ensured transformations were reusable to avoid duplication.
Extensively used ETL to transfer and extract data from source files (Flat files and DB2) and load the data into the target database.
Written UNIX shell scripts to automate the load
Extensively used ESP (Enterprise Scheduler and Planning) for Scheduling and monitoring
Handled scripts for pre-validating source file structures before loading into staging by comparing the source file headers against the baselined header (a header-check sketch follows this list)
Wrote complex SQL queries using joins, subqueries and correlated subqueries to retrieve data from the database
Extensively worked on Oracle SQL for data analysis and debugging.
Performed performance tuning to increase throughput at both the mapping and session levels, along with SQL query optimization
Provided support and quality validation through test cases for all stages of Unit and Integration testing
Involved in migrating objects to different environments using Deployment groups and Labels
Offered production support services to resolve ongoing issues and troubleshoot them to identify the bugs
Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level and the Target Level for Slowly Changing Dimensions Type1, Type2 for Data Loads
Prepared ETL standards, Naming conventions and wrote ETL flow documentation
Extensively used the Add Currently Processed Flat File Name port to load the flat file
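
A minimal shell sketch of the header pre-validation described above, comparing the incoming file's header row against a baselined header before the staging load; the paths are hypothetical placeholders:

#!/bin/ksh
# Sketch: abort the staging load if the source file header does not match
# the baselined header. Paths are hypothetical placeholders.
SRC_FILE=/data/inbound/call_center_feed.csv
BASE_HDR=/data/baseline/call_center_header.txt

actual_hdr=$(head -1 "$SRC_FILE")
expected_hdr=$(cat "$BASE_HDR")

if [ "$actual_hdr" != "$expected_hdr" ]; then
    echo "Header mismatch for $SRC_FILE - aborting staging load" >&2
    exit 1
fi

echo "Header validated - proceeding with staging load"
exit 0
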
Environment: Informatica PowerCenter 8.6, Informatica MDM hub 8.6, Teradata, Oracle, ESP, UNIX
Client: Wells Fargo, Charlotte, NC June 2012 – Jan 2013
ETL Designer
Wells Fargo Home Mortgage (WFHM) supports Wells Fargo representatives in all branches in distributing home mortgages to customers across the United States. The CORE ODS (Operational Data Store) is used to acquire, prepare and deliver data generated within the CORE on-line applications. The COD database (Consolidated Originations Database) of the OLTP system is the primary data source for the ODS. The ODS is divided into schemas such as raw, staging and static to ensure proper cleansing and transformation of data for various downstream systems.
The CORE ODS delivers data primarily to the LIS (Loan Information System), from which the data is delivered to various downstream systems, including operational, reporting and analytical systems. We analyzed and performed ETL (Extraction, Transformation and Load) operations on the ODS data of various subject areas of the borrower, such as RESPA Fee, Escrowing, Closing, Property, Landlord, Income, Asset, Employment and Underwriter, into the LIS system.
Responsibilities:
Involved in requirement Gathering and analysis of source data as data comes in from different Source systems.
Involved in eight releases, successfully leading the deployment process.
Performed record-count verification using DWH backend/reporting queries against source and target as an initial check (a reconciliation sketch follows this list).
Performed validation after isolating the driving sources.
Checked data integrity between the various source tables and relationships.
Checked for missing data, negatives and consistency; performed field-by-field data verification to confirm consistency between source and target data.
Developed ETL using Informatica 8.6.
Applied Slowly Changing Dimensions (Types 1, 2 and 3) effectively to handle delta loads
Prepared various mappings to load the data into different Staging and Target tables.
Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings.
Tuned the performance of mappings by following Informatica best practices and applied several methods to reduce workflow run times.
Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Oracle for optimal performance.
Prepared the error handling document to maintain the Error handling process.
Created test cases for the mappings developed and then created integration Testing Document.
Involved in the Defect analysis call for UAT environment along with users to understand the data and to make any modifications if suggested by the user.
Developed workflows using the Task Developer, Worklet Designer and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor.
Developed reconciliation scripts to validate the data loaded in the tables as part of unit testing.
Prepared SQL Queries to validate the data in both source and target databases.
Prepared scripts to email records that did not satisfy the business rules (error records) to the respective business users.
Prepared the UNIX Shell Scripts to process the file uploads
Extensively worked on the ESP scheduling tool.
Worked on Teradata and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
Supported the applications in Production on rotation basis and provided solutions for failed jobs.
Actively attended the production support calls at all times and answered the business questions.
Prepared game plans for the deployment of Informatica code.
Prepared Change Requests, Work Orders, Problem Tickets and Work Requests for the deployment process.
Prepared the recovery process in case of workflow failure due to Database issues or Network issues.
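
A minimal shell sketch of the record-count reconciliation used as an initial source-to-target check, running counts through SQL*Plus; the connect strings, schemas and table names are hypothetical placeholders:

#!/bin/ksh
# Sketch: compare today's source and target record counts and fail the job
# on a mismatch. Connect strings and table names are hypothetical.
src_cnt=$(sqlplus -s ods_user/ods_pwd@CODDB <<'EOF'
set heading off feedback off pagesize 0
SELECT COUNT(*) FROM cod.loan_detail WHERE load_dt = TRUNC(SYSDATE);
EOF
)
tgt_cnt=$(sqlplus -s lis_user/lis_pwd@LISDB <<'EOF'
set heading off feedback off pagesize 0
SELECT COUNT(*) FROM lis.loan_detail WHERE load_dt = TRUNC(SYSDATE);
EOF
)

# strip any stray whitespace returned by SQL*Plus
src_cnt=$(echo "$src_cnt" | tr -d '[:space:]')
tgt_cnt=$(echo "$tgt_cnt" | tr -d '[:space:]')

if [ "$src_cnt" -ne "$tgt_cnt" ]; then
    echo "Reconciliation FAILED: source=$src_cnt target=$tgt_cnt" >&2
    exit 1
fi
echo "Reconciliation OK: $src_cnt records"
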
Environment: Informatica 8.6.1, Oracle, Teradata, UNIX, Shell Programming
Client: State Street Corporation, MA Jan 2012 – May 2012
ETL Developer
State Street Corporation is the world's leading provider of financial services to institutional investors, including investment servicing, investment management, and investment research and trading. State Street operates globally in more than 100 geographic markets and employs 29,230 people worldwide. The PRIME application is an anti-money-laundering application used to screen State Street customers against the sanctioned list and send matched records to the OFAC team. We receive hundreds of feeds regularly. The files were FTP'd to the Informatica server, business rules were implemented using Informatica, and the results were sent to another server for batch filtering.
Responsibilities:
Responsible for translating business-reporting requirements into data warehouse architectural design
Involved in designing jobs for incremental loads as per given specs
Involved in the ETL development using Informatica 9.1
Performed ETL Optimization and tuned some key Informatica mappings for better performance
Used transformations such as Lookup, Joiner, Sequence Generator, Normalizer, Rank, Router and XML.
Worked on mapping parameters and variables, and workflow parameters and variables, in Informatica (a parameter-file sketch follows this list).
Created mapplets and worklets and used them throughout the project.
Created naming and system standards for lookup transformation and target tables
Designed, developed and tested processes for validating and conditioning data
Participated in performance, integration and system testing.
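
A minimal shell sketch of generating a PowerCenter parameter file at run time, using the standard [Folder.WF:workflow.ST:session] section layout; the folder, workflow, session, variable and connection names are hypothetical placeholders:

#!/bin/ksh
# Sketch: write a workflow parameter file before the run. The folder,
# workflow, session, variable and connection names are hypothetical.
PARAM_FILE=/infa/param/wf_prime_screen.par
RUN_DATE=$(date +%Y-%m-%d)

cat > "$PARAM_FILE" <<EOF
[PRIME.WF:wf_prime_screen.ST:s_m_load_sanction_match]
\$\$RUN_DATE=$RUN_DATE
\$DBConnection_SRC=ORA_PRIME_SRC
\$DBConnection_TGT=ORA_PRIME_TGT
EOF

echo "Parameter file written to $PARAM_FILE"
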
Environment: Informatica PowerCenter, Teradata, Oracle, ESP, UNIX
Client: Cigna Health Care, CT July 2011 – Dec 2011
Senior BI/ETL Developer
Cigna is a leader in health insurance and retirement investment schemes. The project involved writing ETL to pull campaign metadata, contacts data and snapshot data from DB2 system tables and flat files into the ODS. This data is stored at the most granular detail level to provide flexibility for segmentation, response attribution and aggregation to history, and to accept a standard disposition layout from outside deployment vendors.
Responsibilities:
Responsible for translating business-reporting requirements into data warehouse architectural design
Involved in designing various jobs in ESP as per given specs.
Used Informatica PowerCenter tools to extract data from source systems such as Oracle and flat files into the target system on Oracle.
Performed ETL Optimization and tuned some key mappings for better performance.
Involved in developing and deploying new data marts along with modifying existing marts to support additional business requirements.
Used transformations such as Lookup, Joiner, Sequence Generator, Rank, Source Qualifier and Aggregator.
Created naming and system standards for lookup, transformation and target tables.
Worked on star schemas and built fact tables using Informatica PowerCenter.
Created ESP designs and jobs to execute Informatica workflows (a pmcmd wrapper sketch follows this list).
Designed, developed and tested processes for validating and conditioning data.
Participated in performance, integration and system testing.
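
A minimal shell wrapper sketch of the kind a scheduler job (such as ESP) can call to start an Informatica workflow with pmcmd and wait for completion; the service, domain, credentials, folder and workflow names are hypothetical placeholders:

#!/bin/ksh
# Sketch: start a workflow via pmcmd and propagate its return code to the
# scheduler. Service, domain, folder and workflow names are hypothetical.
pmcmd startworkflow \
    -sv INT_SVC_DEV -d Domain_Dev \
    -u etl_user -p etl_password \
    -f CAMPAIGN_ODS -wait wf_load_campaign_ods
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow wf_load_campaign_ods failed with return code $rc" >&2
    exit $rc
fi
echo "Workflow wf_load_campaign_ods completed successfully"
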
Environment: Informatica PowerCenter, Teradata, Oracle, DB2, ESP, UNIX
Aviva USA, India Aug 2010 – June 2011
Senior BI/ETL Developer
Trial Insurance to EDW:
The project comprised making use of the information given by customers before insurance policies are issued. Trial policies are issued during the underwriting process, before the actual policies are issued to customers. All of this information is collected from different source systems and stored in the EDW to generate reports for customer management.
Responsibilities:
Design & Development of various ETL components (Informatica mappings)
Converted business requirement into high level and low level design
Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
Defined the Base objects, Staging tables, foreign key relationships, static lookups, dynamic lookups, queries, packages and query groups.
Configured Address Doctor, which can cleanse worldwide address data, and enhanced it by making some modifications during installation
Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
Defined the Trust and Validation rules and set up the match/merge rule sets to get the right master records
Configured match rule set property by enabling search by rules in MDM according to Business Rules
Performed match/merge and ran match rules to check the effectiveness of MDM process on data
Design and Development of Scheduling techniques and components
Wrote database scripts and UNIX shell scripts to automate the loads
Worked on complex mappings, mapplets and workflows to meet business needs, and ensured transformations were reusable to avoid duplication.
Created indexes on tables to improve performance by eliminating full table scans, and created views to hide the underlying tables and reduce the complexity of large queries.
Wrote complex SQL queries using joins, subqueries and correlated subqueries to retrieve data from the database
Extensively used ETL to transfer and extract data from source files (Flat files and DB2) and load the data into the target database.
Documented Mapping and Transformation details, user requirements, implementation plan and schedule
Extensively worked on Oracle SQL for data analysis and debugging.
Handled scripts for pre-validating source file structures before loading into staging by comparing the source file headers against the baselined header
Involved in writing BTEQ scripts for creating staging tables and inserting test data as per requirements (a BTEQ sketch follows this list)
Performed performance tuning to increase throughput at both the mapping and session levels, along with SQL query optimization
Provided support and quality validation through test cases for all stages of Unit and Integration testing
Involved in migrating objects to different environments using Deployment groups and Labels
Worked closely with the Business analysts and attended meetings to gather and discuss the requirements for various projects.
Prepared various documents like ETL Specification doc, Technical Design doc, Run Book etc.
Offered production support services to resolve ongoing issues and troubleshoot them to identify the bugs
Extensively used ESP (Enterprise Scheduler and Planning) for Scheduling and monitoring
Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level and the Target Level for Slowly Changing Dimensions Type1, Type2 for Data Loads
Prepared ETL standards, Naming conventions and wrote ETL flow documentation
Extensively used the Add Currently Processed Flat File Name port to load the flat file
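
A minimal BTEQ-in-shell sketch of creating a staging table and inserting a test row, as referenced above; the logon string, database, table and column definitions are hypothetical placeholders:

#!/bin/ksh
# Sketch: create a staging table for the trial-policy feed and insert one
# test row. Logon string and object names are hypothetical.
bteq <<'EOF'
.LOGON tddev/etl_user,etl_password;

CREATE TABLE edw_stg.trial_policy_stg
(
    policy_id    INTEGER NOT NULL,
    customer_id  INTEGER,
    policy_dt    DATE FORMAT 'YYYY-MM-DD',
    load_dt      DATE FORMAT 'YYYY-MM-DD'
)
PRIMARY INDEX (policy_id);

INSERT INTO edw_stg.trial_policy_stg
VALUES (1001, 5001, DATE '2011-01-15', DATE '2011-01-16');

.LOGOFF;
.QUIT;
EOF
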
Environment: Informatica PowerCenter 8.6, Informatica MDM hub 8.6, Teradata, Oracle, ESP, UNIX
Yog Info Tech Pvt Ltd, India Aug 2009 – July 2010
Assistant Systems Engineer
Sales Reporting:
Sales Reporting mainly involved creating a data mart in Teradata that stores policy sales from agents. Once created, the mart gave a consolidated reporting view of the agent structure, commission information for writing and servicing agents, and top-performing agents.
Responsibilities:
Developed SQL against various relational databases such as Oracle and Teradata
Worked extensively on developing ETL programs supporting data extraction, transformation and loading using Informatica PowerCenter.
Created Tasks, Workflows, Sessions to move the data at specific intervals on demand using Workflow Manager and Workflow Monitor.
Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
Wrote documentation describing program development, logic, coding, testing, changes and corrections.
Involved in writing test cases, assisting Testing team in testing.
Involved in supporting and maintaining UNIX shell script jobs
Developed the mapping specifications and documented from source to target.
Experience handling duplicates
Responsibilities included designing and developing complex Informatica mappings, including Type 2 Slowly Changing Dimensions (an SCD Type 2 sketch follows this list).
Performed performance tuning at both the mapping level and the database level to increase data throughput
Followed Informatica best practices, such as creating shared objects in a shared folder for reusability and using standard naming conventions for ETL objects
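
A minimal SQL*Plus-in-shell sketch of the Type 2 SCD pattern referenced above, expiring the current dimension row when an attribute changes and inserting a new version; the connect string, sequence, tables and columns are hypothetical placeholders:

#!/bin/ksh
# Sketch: Type 2 SCD load for an agent dimension. The connect string,
# dim_agent_seq sequence and table/column names are hypothetical.
sqlplus -s dm_user/dm_pwd@SALESDM <<'EOF'
-- Expire the current row when the incoming attribute differs
UPDATE dim_agent d
SET    d.eff_end_dt  = TRUNC(SYSDATE) - 1,
       d.current_flg = 'N'
WHERE  d.current_flg = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_agent s
               WHERE  s.agent_id       = d.agent_id
               AND    s.commission_pct <> d.commission_pct);

-- Insert a new current version for changed or brand-new agents
INSERT INTO dim_agent (agent_key, agent_id, commission_pct,
                       eff_start_dt, eff_end_dt, current_flg)
SELECT dim_agent_seq.NEXTVAL, s.agent_id, s.commission_pct,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_agent s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_agent d
                   WHERE  d.agent_id       = s.agent_id
                   AND    d.current_flg    = 'Y'
                   AND    d.commission_pct = s.commission_pct);

COMMIT;
EOF
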
Environment: Informatica PowerCenter, Teradata, Oracle, ESP, UNIX