Preeti Tyagi
(BI Qlikview Developer/ SR. ETL Informatica Developer)
Email: acdqja@r.postjobfree.com
SUMMARY
● Seven (7) years of Data Warehousing experience spanning business requirements analysis, application
design, data modeling, development, testing, documentation, and implementation of warehousing and
database business systems.
● 5+ years of experience in designing and developing business intelligence applications with
QlikView 11.0/10.0/9.0/8.0/7.5 as the BI tool.
● Expertise in designing and modeling in QlikView with complex multiple data sources relational
designs.
● Experience in optimizing applications by using QVDs, removing non-key fields, and using script variables
(loading phase).
● Experience with QlikView sheet objects, including multiple chart types, Trends, KPIs, custom
requests for Excel export, Fast Change, and objects for Management Dashboard reporting.
● Experience in creating ETL applications for QVDs, and in QlikView scripting with variables and File, Date,
System, Mapping, and Table functions.
● Experience in designing QlikView Document/User Settings and Layouts to deliver a consistent,
professional, optimized look to clients.
● Seven (7) Years of ETL/DW/BI experience using Informatica PowerCenter
9.0/8.6/8.5/8.1/7.1/7.0/6.2/6.1/5.1.2 (Workflow Manager, workflow Monitor, Source Analyzer, Data
Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation developer), Informatica
PowerMart 6.2/6.1/5.1.2/5.1.1/4.7, Datamart, ETL, OLAP, ROLAP, MOLAP, OLTP, IMS Data, Star
Schema, Snowflake Schema, Trillium 7.0, FirstLogic, IBM DB2 8.2, DB2 Toad 4.0, FTP PRO client 32,
Windows XP, UNIX, Mainframes, AIX. Well versed with Informatica team-based development (versioning).
● Seven Plus (7+) years of experience using Oracle 11g/10g/9i/8i/8.x/7.x, DB2 8.2/7.0/6.0,
Teradata V2R5/V2R4, SQL Server 12.0/11.x, MS SQL Server 2005/2000/7.0/6.5, MS Access
7.0/’97/2000, Oracle Report Writer, SQL, XML, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000, Win
3.x/95/98/2000, Win NT 4.0 and Sun Solaris 2.x.
● Five Plus (5+) years of Dimensional Data Modeling experience using Star Schema/Snowflake
modeling, FACT & Dimensions tables, Physical & logical data modeling, ERWIN 7.2/4.5/4.0/3.5/3.x.
● Experience with TOAD to test, modify and analyze data, create indexes, and compare data from different
schemas. Experienced in Shell scripting and PL/SQL procedures.
EDUCATION & CERTIFICATIONS
● Masters in Science, India
● Certified in Data Warehousing Concepts
● Certified in QlikView
TECHNICAL SKILLS
Data Warehousing & ETL: Informatica PowerCenter 9.0/8.6/8.5/8.1/7.1/7.0/6.2/6.1 (Workflow Manager,
Workflow Monitor, Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet Designer,
Transformation Developer), Informatica PowerMart 6.2/6.1/5.1.2, Datamart, ETL, OLAP, ROLAP, MOLAP,
OLTP, IMS Data, Star Schema, Snowflake Schema, IBM DB2 8.2, DB2 Toad 4.0, Teradata, Windows XP,
UNIX, Mainframes, AIX.
BI & Reporting: QlikView 11.0/10.0/9.0/8.0/7.5 suite (Designer and Developer), Business Objects XI/6i/6.0/5.1
Data Modeling: Erwin 7.2/4.5/4.0/3.5.2/2.x, Visio, Physical Modeling, Logical Modeling, Relational
Modeling, Dimensional Modeling (Star Schema, Snowflake, FACT, Dimensions), Entities, Attributes,
Cardinality, ER Diagrams.
Databases: Oracle 11g/10g/9i/8i/8.x/7.x, DB2 8.0/7.0/6.0, Teradata V2R5/V2R4, SQL Server 12.0/11.x,
MS SQL Server 2005/2000, PL/SQL, SQL*Loader
Job Scheduling & Other Tools: CA Autosys, BMC
Environment: UNIX (AIX), Windows XP/2003/2000/98, Mainframes
Others: HTML, C, C++, Java, Visual C++, CSS, SQL
PROFESSIONAL EXPERIENCE
DEC 13– PRESENT
BLOOMBERG LP, NYC
BI Qlikview Developer
Campaign Reporting:
Campaign Marketing Dashboards were built to monitor and improve campaign performance.
This project was for the Bloomberg marketing team. Bloomberg executes multiple campaigns globally;
marketing analysts send invites to target audiences and monitor responses based on multiple criteria.
The project catered reports to Executive leadership, Program managers, Marketing analysts, etc.
● As a Sr. developer on this project I was involved in end-to-end development, from data integration to
developing dashboards. The project has multiple KPIs such as Messages Received, Hard Bounces, Sent,
Open, Click, and Downloaded. I was also involved in conceptualizing the application with business and IT,
and created multiple templates with different visualizations. There were multiple reports, such as a
Domain report (showing Bloomberg's performance compared with other domains), an Audience time-series
report, and Quarterly and YTD reports. As the project involves multiple source data sets with daily loads,
I used the incremental load functionality of QlikView.
● Developed Waterfall reports, cumulative reports, etc. Worked extensively with QlikView functions such as
Dual, Match, IntervalMatch, Pick, Peek, and date, mapping, and string functions.
● Used alternate states, conditional enablement of dimensions and expressions, calculated dimensions, etc.
● Created objects such as containers and multi boxes to make better use of screen space, as well as
bookmark objects.
● Extensively used set analysis to create expressions at the UI layer.
● Created a Benchmark report using a linear gauge, with thresholds set up dynamically.
● Created QVDs to implement incremental loads, and then QVWs.
● Implemented security through section access.
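The incremental-load pattern used above can be sketched in QlikView load script. This is an illustrative sketch only: the table name CampaignEvents, the fields EventID/EventDate, and the source connection are hypothetical, not the actual Bloomberg schema, and first-run bootstrapping (when no QVD exists yet) is omitted.

```qlikview
// Incremental load sketch (assumed names: CampaignEvents, EventID, EventDate).

// 1. Find the high-water mark from the existing QVD.
MaxDate:
LOAD Max(EventDate) AS MaxEventDate
FROM CampaignEvents.qvd (qvd);

LET vMaxDate = Peek('MaxEventDate', 0, 'MaxDate');
DROP TABLE MaxDate;

// 2. Pull only rows newer than the high-water mark from the source.
CampaignEvents:
SQL SELECT EventID, EventDate, CampaignID, ResponseType
FROM dbo.CampaignEvents
WHERE EventDate > '$(vMaxDate)';

// 3. Append the untouched history back from the QVD; the Exists() clause
//    skips any record already fetched, keeping keys unique.
CONCATENATE (CampaignEvents)
LOAD * FROM CampaignEvents.qvd (qvd)
WHERE NOT Exists(EventID);

// 4. Rewrite the QVD with old + new rows for the next daily run.
STORE CampaignEvents INTO CampaignEvents.qvd (qvd);
```

With a daily load this keeps the expensive database pull limited to one day of new rows while the bulk of history comes from the fast QVD read.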
Marketing Support:
Marketing Support Dashboards were built for Business views as well as Function views, showing the
performance of each Bloomberg business under each function and vice versa. This project is for Bloomberg
CEOs, CFOs, and Managers to view the hours and cost related to each Business Unit of Bloomberg for multiple
functions such as CMI, IA, etc. A drill-down Business group was created showing three levels of business.
As a Sr. developer on this project I was involved in end-to-end development, from data integration to
developing dashboards. The project has multiple KPIs such as Hours, Costs, and Volume. I was also involved
in conceptualizing the application with business and IT, and created multiple templates with different
visualizations. There were multiple reports, such as Dashboard by Business Unit and Dashboard by Function,
showing comparisons of KPIs with the previous quarter and the quarter-on-quarter delta on the same page.
As the project involves multiple source data sets with daily loads, I used the incremental load
functionality of QlikView.
● Created hierarchies using drill-down groups, and cyclic groups for expressions to accommodate many
expressions on the same chart.
● Worked extensively with QlikView functions such as Dual, Match, IntervalMatch, Pick, Peek, ApplyMap,
and date and string functions.
● At the UI layer, created charts such as bar graphs, straight tables showing current-year and
previous-year KPIs using set analysis, pivot tables, and pie charts.
● Created objects such as Current Selections, bookmarks, input boxes, sliders, etc.
● Created different buttons and set up triggers.
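Set-analysis expressions of the kind used for the current- vs. previous-quarter comparison might look like the following. The field names Quarter and Hours are assumptions for illustration, and the sketch assumes Quarter carries a numeric sequential value so that subtracting 1 yields the prior quarter.

```qlikview
// Current-quarter hours, overriding any user selection on Quarter:
Sum({<Quarter = {'$(=Max(Quarter))'}>} Hours)

// Previous-quarter hours, for the comparison column on the same chart:
Sum({<Quarter = {'$(=Max(Quarter) - 1)'}>} Hours)

// Quarter-on-quarter delta shown alongside both values:
Sum({<Quarter = {'$(=Max(Quarter))'}>} Hours)
  - Sum({<Quarter = {'$(=Max(Quarter) - 1)'}>} Hours)
```

Because the set modifier replaces the user's Quarter selection, both columns render side by side on one chart regardless of what the user has clicked.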
Market Sizing:
The Market Sizing dashboard was built for analyzing data on RIA firms, BD firms, RIA reps, and RRs.
● Developed load scripts to work with multiple data sources (SQL Server 2005, Excel files).
● Created Trellis charts for each dimension value, plus pie charts and bar charts showing totals.
● Worked extensively with QlikView functions at the script level, such as IntervalMatch for creating bins,
date and time functions, Class functions, string functions, mapping functions, WildMatch, etc.
● Created QVDs and then QVWs. Implemented binary load by creating a separate QlikView document.
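The IntervalMatch-based binning mentioned above follows a standard pattern. The sketch below is illustrative: the measure field AUM, the Firms table, and the bin boundaries are assumed names, not the actual project data.

```qlikview
// Bin definitions: each row is a [BinLow, BinHigh] interval with a label.
Bins:
LOAD * INLINE [
BinLow, BinHigh, BinLabel
0, 100, Under 100M
100, 500, 100M-500M
500, 1000000, Over 500M
];

// Firms with a numeric measure (AUM, hypothetical) to be bucketed.
Firms:
LOAD FirmID, AUM
FROM Firms.xlsx (ooxml, embedded labels);

// IntervalMatch links each AUM value to the interval containing it,
// so charts can use BinLabel as a dimension.
IntervalMatch (AUM)
LOAD BinLow, BinHigh
RESIDENT Bins;
```

In practice the resulting link table is often joined into the interval table afterwards to avoid a synthetic key in the data model.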
Environment: SQL Server 2010, QlikView 11.0, SQL, Windows 2007
JUN’ 11– NOV 13
BRISTOL MYERS SQUIBB, PRINCETON, NJ
BI Qlikview Report Developer/SR. ETL INFORMATICA DEVELOPER
The S&M DW was built to compile data from various legacy systems and aggregate them to produce reports to
monitor Sales Analysis and Market Share.
ETL / DW RESPONSIBILITIES:
● Designed and Documented the Logical and Technical architecture to extract and integrate data from 8
Source Systems using a Common data Model Approach.
● Created new Informatica jobs, Sequences to extract and integrate data to maintain complete history using
Slowly Changing Dimension (SCD2) approach.
● Used IDQ/IDE to analyze the quality of data and data integration points from different source systems
● Documented the enhancements to be implemented at Source systems to implement better data quality for
Customer Matching.
● Created Linux Scripts to validate the data for format and quality from Source Systems.
● Created sequencers for seamless integration for End to End Extract and Load processes.
● Worked with Informatica 9.0/8.6 for ETL purposes.
● Extensively used Informatica Client tools (Source Analyzer, Warehouse Designer, Mapping Designer,
Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager)
● Translated business rules and functionality requirements into ETL procedures.
● Extracted data from sources including flat files, MS Access, Oracle, and MS SQL Server relational tables
using Informatica, and loaded it into the Oracle target database.
● Performance-tuned Informatica mappings that load data from various sources, using transformations such
as Source Qualifier, Lookup, Expression, Aggregator, Joiner, Filter, and Router for optimum performance.
● Worked with IMS Data (Xponent, Xponent Plantrak, DDD) for sales and marketing reporting.
● Responsible for conversion of Business requirements into ETL Informatica technical specifications
● Created Informatica mappings to load data into stage tables.
● Migrated Informatica mappings from the Development instance to the Production instance.
● Created workflows and error handling in Informatica.
● Worked with Informatica Workflow Manager.
● Used Autosys along with UNIX shell scripts to schedule Informatica jobs.
● Imported Job schedules and monitored ETL batches using Workflow Monitor
● Coordinated with source system owners and monitored day-to-day ETL progress.
● Designed triggers to notify errors from workflow executions. Created database to log all batch workflow
failures and generated cumulative reports.
● Performed Unit/integration Testing and assisted in UAT. Responsible for bug fixing in UAT phase.
● Involved in troubleshooting load failure cases, including database problems.
Reporting using QlikView:
● Developed load scripts to work with multiple data sources (SQL Server 2005, flat files, Excel, and MS
Access).
● Developed Executive QlikView reports and an Executive dashboard to meet business requirements in a
timely manner.
● Implemented security (Document Level, Sheet Level, Object Level) using QlikView's section access
technology, which lets users access sections of a QlikView document depending upon their access
privileges.
● Optimized QlikView performance by creating QVDs.
● Worked with business clients to confirm requirements and business logic.
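Section access of the kind described above is typically set up in the load script along these lines. Everything below is an illustrative sketch: the user IDs, domain, and the REGION reduction field are hypothetical, not the actual BMS configuration.

```qlikview
// Section access sketch: document-level access plus row-level reduction.
// REGION is a hypothetical field that must also exist in the data model.
SECTION ACCESS;
LOAD * INLINE [
ACCESS, NTNAME, REGION
ADMIN, DOMAIN\BIADMIN, *
USER, DOMAIN\ANALYST1, US
USER, DOMAIN\ANALYST2, EU
];
SECTION APPLICATION;

// With "Initial Data Reduction Based on Section Access" enabled in
// Document Properties, each USER sees only rows whose REGION matches
// their entry; ADMIN (*) sees all listed values.
```

Field values in the access table must be uppercase for the reduction match to work, which is a common pitfall with this feature.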
Environment: Informatica 9.0/8.6, IDQ, IDE, Autosys, Windows XP, Linux, Oracle 11g R2, SQL,
PL/SQL, MS SQL Server 2000/2005, Erwin 7.1, IMS Data (Xponent, Xponent Plantrak, DDD),
QlikView 11.0/10.0
AUG' 09 – MAY' 11
BANK OF AMERICA, PENNINGTON, NJ
BI QLIKVIEW DEVELOPER/SR. DW Informatica Developer
The loans Datamart integrated information about Customers, services offered, loan products, Credit information
about customers, customer financial history, new offers, Loans on yearly, quarterly, monthly and weekly basis.
The integrated data was analyzed for reporting purposes. These reports were used for effective decision
making on Loan approvals, Payments, Customer service, New Offers, etc.
Responsibilities:
● Requirements gathering, Source data analysis and design.
● Developed Logical and Physical data models that capture current-state/future-state data elements and
data flows using Erwin.
● Extensively used Informatica PowerCenter as an ETL tool to extract, transform and load data from
remote sources to EDW.
● Wrote shell scripts and stored procedures to automate the aggregate building process.
● Worked extensively on transformations like Filter, Router, Aggregator, Lookup, Update Strategy,
Stored Procedure, Sequence generator and joiner transformations.
● Created, launched & scheduled sessions and batches using PowerCenter Server Manager.
● Developed a prototype for the designed interface and estimated the performance criteria.
● Extensively worked on Database Triggers, Stored Procedures, Functions and Database Constraints.
● Used connected and unconnected lookups in various complex mappings.
● Reviewed and documented existing SQL*Loader ETL scripts.
● Worked on debugging, troubleshooting and documentation of the Data Warehouse.
● Identified the underlying tables and business rules by forward and reverse engineering using ERWIN.
● Responsible for development, test and production mapping migration using Repository Manager. Also used
Repository Manager to maintain metadata, security and reporting. Tuned the Informatica Repository and
mappings for optimum performance.
Reporting using QlikView:
● Developed a POC (Proof of Concept) QVW for consolidation.
● Enhanced, re-engineered, maintained and supported existing QlikView applications.
● Restricted user privileges using the section access feature of QlikView.
Environment: Informatica PowerCenter 8.6/7.1, Qlikview 10.0, Erwin 3.5, SQL, PL/SQL, SQL Loader,
Oracle 9i/10g, DB2 8.0, SQL Server 2008/2010, Windows 2000, AIX 4.3.3, Shell Scripting, Autosys.
DECEMBER 2007 – JULY 2009
Abbott Diabetes Care, CA
DW INFORMATICA DEVELOPER
Responsibilities:
● Worked closely with the client to understand business requirements, analyze data, and deliver on client
expectations.
● Participated in gathering Business requirements from the Analyst and translating them into Technical
requirements.
● Analyzed the source data and identified business rules for data migration for developing data warehouse and
data marts.
● Converted the data mart from Logical design to Physical design, defined data types, Constraints,
Indexes, generated Schema in the Database, created Automated scripts, defined storage parameters for the
objects in the Database.
● Used ERWIN for Logical and Physical data modeling, and designed Star Schemas.
● Used Informatica for Data Integration and ETL Purposes. Prepared design documents, ETL
specifications and migration documents.
● Involved in Informatica Administration Tasks including Importing/Exporting mappings, copying folders over
the DEV/QA/PRD environments.
● Created different Transformations for loading the data into target like Source Qualifier, Joiner, Update
Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter,
Aggregator and Sequence Generator transformations.
● Created and monitored workflows/sessions using Informatica Server Manager/Workflow Monitor to
load data into the Target Oracle database.
● Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum
efficiency and performance.
● Produced low-level designs, developed various jobs, and performed data loads using different
transformations as per the requirements.
● Involved in initial loads, incremental loads, and daily loads to ensure that data is loaded into the
tables in a timely and appropriate manner.
● Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic, Persistence,
Shared and Recache from Database), and Join cache while developing the Mappings.
● Handled slowly changing dimension (Type I, Type II and Type III) based on the business requirements.
● Involved in performance tuning by optimizing the sources targets mappings and sessions.
● Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer,
and Worklet Designer. Extracted data from various sources and transferred the data into the required target
tables after appropriate transformation.
● Responsible for strictly maintaining naming standards and warehouse standards for future development.
● Debugged data during the mapping sessions, to validate data transfer from source to target using the
performance analyzer tool.
● Performed unit testing on the Informatica code by running it in the Debugger and writing simple test
scripts in the database thereby tuning it by identifying and eliminating the bottlenecks for optimum
performance.
● Created mapping variables and parameters for incremental loading.
● Created and monitored sessions and workflows for daily extract jobs using Informatica Power Center
Workflow Manager and Workflow Monitor.
● Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger
and Workflow Monitor.
● Monitored workflows Using Workflow monitor.
● Managed the Scheduling of Tasks to run any time without any operator intervention.
● Assisted in developing different kinds of reports from existing Universes using Business Objects.
● Created Command Tasks for better control of workflows.
● Wrote SQL and PL/SQL code, stored procedures and packages, and UNIX scripts.
● Optimized query performance, session performance and reliability.
● Used Business Objects to test the data gathered from individual mappings.
● Migrated mappings from folder to folder and synchronized them.
● Wrote documentation describing program development, logic, coding, testing, changes and corrections.
● Performed Unit Testing, Integration Testing and User Acceptance Testing to proactively identify data
discrepancies and inaccuracies.
Environment: Informatica PowerCenter 6.2/7.1, Cognos 7.0, ERwin 3.5, SQL, PL/SQL, SQL Loader,
Oracle 9i, DB2 8.0, SQL Server 2008/2005, Windows 2000, Sun Solaris, Shell Scripting, Autosys.
JAN 2006 – OCTOBER 2007
Tata Consultancy Services, India
DW DATABASE DEVELOPER
Responsibilities:
● Migrated large volumes of data from legacy systems to Oracle database.
● Extensively used SQL Loader for Data loading
● Enhancements and Functional Specifications.
● Consolidation, Cleansing, Integration, and customization of data.
● Optimized Query Performance, Session Performance and Reliability.
● Created complex procedures.
● Database connectivity was done using ODBC.
● Preparation of Unit Test Plans.
● Verification of functional specifications and review of deliverables.
● Complex SQL queries were used for data retrieval.
● Involved in Data Modeling using Erwin.
● Developed Packages, Procedures and function to maintain the Business logic using PL/SQL.
● Creating Database objects including Tables, Indexes, Views, Sequences, Synonyms and granting Roles, and
Privileges to the system users.
● Created SQL scripts for deployment of database objects to Production.
● Fine-tuned SQL queries to achieve good performance.
Environment: Oracle 8.x, PL/SQL, SQL, Unix scripting, MS Project