SREEDHAR KANDE
Email: adisg9@r.postjobfree.com
Phone number: 470-***-****
SUMMARY:
15+ years of IT experience with a prime focus on Analysis, Design, Development, Customization, Data Modelling, Gap Analysis, Testing, Administration, Documentation, and Maintenance of Data Warehouses/Data Marts, Business Intelligence, CRM, AWS, and Snowflake.
Strong background in Analysis, Design, Development, and Implementation of Data Warehousing in ELT/ETL & BI, including 2 years of experience with the Snowflake cloud data warehouse.
Good exposure to business domains such as telecommunications, manufacturing, sales, consumer goods, financial services, and health insurance; experienced in training users.
Worked extensively on developing ETL to support Data Extraction, Transformation, and Loading using Oracle Data Integrator 12.2.1.2.6/12.1.2.0.0/11.1.1.x/10.1.3.5
Extensively used ODI (Oracle Data Integrator) to perform ELT from heterogeneous sources using the ODI Designer and Operator tools; also good knowledge of the Security Manager and Topology Manager
Experience working with ODI Knowledge Modules such as LKM, IKM, CKM, JKM, RKM.
Working experience in creating ODI Packages, Load Plans, and Scenarios using mappings/interfaces, variables, and procedures, and in Topology setup including physical and logical architecture.
Experience working with ODI in the Oracle IaaS environment
Excellent Data Analysis Skills and ability to translate Business logic into mappings using complex Transformation logics for ELT/ETL processes.
Worked on Oracle GoldenGate for data integration and replication, and on Journalizing interfaces to populate data from OLTP systems into Data Marts.
Experience migrating Load Plans/Scenarios from one environment to another and setting up the required topology.
Efficient in troubleshooting, performance tuning and optimization of ELT/ETL and reporting analysis.
Experience in Data Warehousing development using ETL tools such as Informatica 9.x/8.x/7.x
Worked on the design, development, and configuration of RPDs, Dashboards, Reports, Analysis & Interactive Reporting, and OBIEE 11g new functionality; implemented Filters, Scheduled Reports, and Prompting, and developed ad-hoc queries, reports, and dashboards.
Experience in configuring interactive Dashboards with drill-down, drill across capabilities using Filters and Dashboard Prompts, Setup groups, access privileges, Metadata Objects and Web Catalog Objects (Dashboard, Pages, Folders, Reports) and guided navigation links.
Experience in creating entity-relational and dimensional data models with Dr. Ralph Kimball's methodology, i.e., Star schema and Snowflake schema
Experience in pulling data from BIApps (Sales, Marketing, Financial, Procurement & spend, Supply Chain).
Experience in Tableau Desktop/Server, creating highly interactive Data Visualization Reports and Dashboards using features such as Data Blending, Filters, Actions, Parameters, Maps, Extracts, Context Filters, Sets, Aggregate measures, etc.
Business Analysis:
Requirement analysis and Design, documentation, Flow chart & use case preparation.
Gap Analysis, Test case preparation & test execution.
Business Analysis, Project plan preparation & execution (Project initiation to closure).
Good exposure to business domains like telecommunication, manufacturing, sales, consumer goods, financial services, Education (K12) and health insurance.
Followed the Agile methodology for elicitation and representation of requirements based on interaction with the business process owners, SMEs, and the development team. Participated in daily scrums.
Facilitated requirement sessions, including the creation of agendas and recording of action items from meetings and writing meeting minutes.
Analysed processes and identified areas needing improvement.
Documented the requirements in the form of detailed use cases and sent out the same for inspection to the team.
Test case preparation and Test execution, UAT execution & sign-off.
Big Data:
Excellent understanding/knowledge of Hadoop architecture and various components such as HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and Map Reduce programming paradigm.
Knowledge of Hadoop framework, Hadoop Distributed file system, and Parallel processing implementation
Knowledge in Hadoop Ecosystems HDFS, Map Reduce, Hive, Pig, HBase, Sqoop.
Knowledge in Import/Export of data using Hadoop Data Management tool SQOOP.
Hands on experience on writing Queries, Stored procedures, Functions and Triggers by using SQL.
AWS:
Exposure to AWS Cloud with experience in EC2, S3, EFS, Glacier, VPC
AWS administration: creating EC2 instances, IAM, CloudWatch.
Experience in cloud migration of on-premises legacy applications to AWS.
Sharing resources across the network (NFS share).
Utilizing almost all of the AWS stack including EC2, S3, ELB, Auto Scaling, IAM, VPC, AMI and messaging services focusing on secure, fault tolerant, highly scalable and available systems that handle expected and unexpected load bursts.
Increase and/or attach EBS volumes and EBS Snapshots to EC2 instances and configure IOPS.
Snowflake:
Administration
Zero Copy cloning – Cloning Objects in the account
Time Travel, Data Retention Settings for Storage
User Management, Service Account Settings
Data Sharing from Prod to Stage and Dev Environments
Job Scheduling using Tidal on Snowflake Edge Server
Performance Tuning - Cluster Design for Tables and Query Tuning
Roles and authorizations
Setting up of Multi-Cluster Warehouses
Storage Considerations
Resource Monitors setup on warehouses
Snowflake Cost Accounting, Credit Management
Custom Monitoring on Long Running Jobs, Data Storage
AWS Cloud Integration – Data Loading, External Tables
Data monitoring for data ingestion and L1 support setup
Create Snowpipes, stages, and SQS events on S3 for continuous data loads.
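The stage/Snowpipe/SQS setup above can be sketched in Snowflake SQL; all object, integration, and bucket names here are hypothetical placeholders, not the actual production objects:

```sql
-- Hypothetical names throughout; adjust to your account.
-- External stage pointing at an S3 bucket (a storage integration is assumed).
CREATE OR REPLACE STAGE raw_db.public.s3_landing_stage
  URL = 's3://example-landing-bucket/incoming/'
  STORAGE_INTEGRATION = s3_int
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Snowpipe with AUTO_INGEST: Snowflake exposes an SQS ARN for the pipe
-- (visible via SHOW PIPES), which the S3 bucket's event notification
-- targets so new files are loaded as soon as they land.
CREATE OR REPLACE PIPE raw_db.public.orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_db.public.orders_raw
  FROM @raw_db.public.s3_landing_stage
  ON_ERROR = 'SKIP_FILE';
```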
Data Ingestion
Setting up data pipelines from Netezza, PostgreSQL, MS SQL, Oracle Systems to Snowflake
ETL Pipeline with Python frameworks into Snowflake
Custom File Formats, setting up Ingestion Monitoring
Created UDFs and applied masking on required columns.
Wrote Python scripts to connect to sources and load data into the Snowflake stage.
Wrote complex procedures in Snowflake using a combination of JavaScript and SQL.
Designed an auto-splitting framework that breaks full-table loads of multimillion-row tables into parallel loads for better performance.
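The auto-splitting idea above can be sketched in Python; the function name and key-range approach are illustrative assumptions, not the production framework:

```python
def split_ranges(min_id, max_id, n_chunks):
    """Split an inclusive [min_id, max_id] key range into at most
    n_chunks contiguous, non-overlapping sub-ranges so a full-table
    load can run as parallel range-bounded extracts."""
    total = max_id - min_id + 1
    base, extra = divmod(total, n_chunks)
    ranges, start = [], min_id
    for i in range(n_chunks):
        # Spread the remainder over the first `extra` chunks.
        size = base + (1 if i < extra else 0)
        if size == 0:
            break
        ranges.append((start, start + size - 1))
        start += size
    return ranges
```

Each `(lo, hi)` pair then drives one parallel extract, e.g. `SELECT ... WHERE id BETWEEN lo AND hi`, loaded into its own staged file.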
SKILLS/TOOLS:
Data Warehousing/ ELT/ETL : ODI 12c, ODI 11.1.1.x and ODI 10g (10.3.5.6), Informatica
PowerCenter 9.x/8.6.1/8.1.1/8.0/7.1.2/7.1.1, Talend, Matillion.
Cloud Data warehousing : Snowflake/SnowSQL
Dimensional Data Modelling : Star Schema Modelling, Snowflake Modelling, FACT and
Dimension Tables, Physical and Logical Data Modelling
Siebel Analytics/OBIEE : OBIEE 11.1.1.7.x/10.1.3.1/3.2/3.3/3.4, OBIApps
7.9.5/7.9.6/7.9.6.4/11.1.7, Tableau, QlikView
Siebel CRM : Configuration, EIM, EAI, Performance tuning,
Administration/Infrastructure (7.x &8.x)
Force.com : Visual Force Pages, Apex Standard, Custom Controllers and
Extension, Apex triggers, SOQL, SOSL
Operating Systems : MS-DOS, Windows family, UNIX (HP, Sun Solaris), Red Hat
Linux
Programming Languages : C, C++, Core Java
Databases, Database Tools : Oracle 11g/10g/9i/8i/8.0/7.x, MS SQL Server 2000/7.0/6.0,
MySQL IBM DB2 UDB 9.0/8.0/7.0
Web Servers : Microsoft IIS, Sun One
Scripting : eScript, VB script, Shell Script, SQL, PL/SQL
Testing Tools : Manual Testing, WinRunner
Data Modelling : Erwin
BA Tool : MS Visio, Snagit, ReqPro
Data Migration : SQL Loader, MS DTS
Defect Tracking : HP QC, BMC Remedy, IBM CQ, Service Now
CERTIFICATIONS & TRAININGS:
Certification in SnowPro
Certification in Oracle Data Integrator 12c
Certification in AWS Solution Architect Associate
Certification in Salesforce Administration (ADM201)
Certification in Siebel 7.7 CCC (Core Consultant Certified)
Undergone Informatica 8.6 training
Undergone Siebel Analytics 7.7 and OBIEE 10g /11g training
Undergone Big Data training
Undergone DevOps training
Undergone Tableau training
Undergone Snowflake training
EXPERIENCE:
Raymond James Financials, St. Petersburg, FL
March 2019 – Present
Sr. DW Consultant
Responsibilities:
Analyse business requirements of Raymond James projects, participate in technical discussions, and provide detailed estimates for the projects.
Design relational databases to support business enterprise applications and physical data modelling according to project requirements for data acquisition and security as well as customer-defined deliverables
Collaborated with internal and external personnel, including system architects, software developers, database administrators, design analysts and information modelling experts to determine project requirements and capabilities, and strategize development and implementation timelines
Develop architectural strategies for data modelling, design and implementation to meet stated requirements for metadata management, operational data stores and Extract Load Transform environments
Created the file formats within Snowflake required for the current data warehouse.
Created external stages in the Snowflake DB to pull data from S3 buckets; also enabled SQS to automate the Snowpipe process so that files are loaded into Snowflake stages as soon as they land in the S3 bucket.
Load data from different sources (CSV, JSON, Oracle, MySQL) into the Snowflake DW.
Created buckets, Snowpipes, stages, and SQS events on S3 for continuous data loads.
Zero-Copy cloning – cloning objects within the account.
Setting up data pipelines from MS SQL, Oracle Systems to Snowflake
Creating and Setting up the connection between Snowflake and other clouds (AWS)
Created UDFs/policies and applied masking on required columns.
Created required procedures and tasks and scheduled them to run as needed.
Identified the business scenarios for scaling up Snowflake warehouses for best cluster performance.
Strong understanding of the AWS data management product and service suite, primarily EC2, S3, and VPC.
Designed cost-effective Snowflake accounts for various environments.
Involved in data migration from on-premises systems to the Snowflake cloud data warehouse.
Prepare functional/technical specs to define reporting requirements and the ETL process.
Wrote Python scripts to connect to sources and load data into the Snowflake stage.
Wrote complex procedures in Snowflake using a combination of JavaScript and SQL.
Create and implement end to end solutions by using database objects and advanced PL-SQL/ODI architecture (Oracle Data Integrator) features.
Improve application performance using appropriate data structures, reducing disk I/O, increasing CPU utilization, partitioning, and selecting the best data access path.
Enhance/add application features by using various analysis techniques and engineering concepts.
Support the existing application, provide bug fixes, and write a test harness to deliver a better-quality product.
Manage code using TFS and GIT to deploy the code onto various environments.
Work on SCD Type-2 dimensions to maintain history along with the latest information, and on a near-real-time process using CDC through Snowflake streams.
Load data into dimensions and fact tables using the Star schema Data Warehouse Model.
Develop different complex mappings using components like Distinct, Pivot, Aggregate, Join, Subquery Filter, Dataset, and Expression.
Helping developers with the development process wherever required.
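The SCD Type-2 pattern with Snowflake streams mentioned in this section can be sketched as follows; table, column, and hash names are hypothetical placeholders:

```sql
-- Hypothetical objects: customer_src, its stream, and the SCD2 dimension.
CREATE OR REPLACE STREAM customer_src_stream ON TABLE customer_src;

BEGIN;  -- the transaction keeps both stream reads consistent;
        -- the stream offset advances once at COMMIT.

-- Step 1: close out the currently active rows whose attributes changed.
MERGE INTO dim_customer d
USING (SELECT * FROM customer_src_stream
       WHERE METADATA$ACTION = 'INSERT') s
  ON d.customer_id = s.customer_id AND d.is_current = TRUE
WHEN MATCHED AND d.cust_hash <> s.cust_hash THEN
  UPDATE SET d.is_current = FALSE, d.end_date = CURRENT_TIMESTAMP();

-- Step 2: insert the new current versions.
INSERT INTO dim_customer (customer_id, cust_hash, start_date, end_date, is_current)
SELECT customer_id, cust_hash, CURRENT_TIMESTAMP(), NULL, TRUE
FROM customer_src_stream
WHERE METADATA$ACTION = 'INSERT';

COMMIT;
```

A hash comparison (`cust_hash`) is one common way to detect attribute changes; comparing individual columns works equally well.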
Environment: SnowSQL, Snowflake WebUI, Python, SQL Loader, ServiceNow, PowerPoint, AWS, ODI 12.2.1.2.6, Matillion, Kafka, Informatica, WebLogic, Snagit Editor, Linux, MS Visio, SharePoint, Git Repository, Jenkins, MS Visual Studio, BMC Control-M, Netezza, PostgreSQL, MS SQL, Oracle, SSIS, JMSQ Unix Script, spider, Erwin, PL/SQL, IaaS environment
MA-Executive Office of Education, Malden, MA Sep 2017 – March 2019
Sr. ODI Developer/Architect
Responsibilities:
Development, Admin & Operations:
Requirements gathering, Analysis, gap analysis, Designing, Technical documents preparation
Work with different business unit teams (EEC, HEIRS, SIMS, EPIC, MCAS) to gather functional and operational requirements and prepare functional specifications and configuration documents.
Work with business users in defining and analysing a problem.
Provide analysis, design and coding support for business enhancements to the system.
Design, code, implement, support and maintain Information Technology applications.
Maintain adherence to company standards and procedures.
Worked as an individual performer and as a member of a project team.
Provide day-to-day IT support and support the daily operations of the business.
Use standard methodologies to carry out Business Intelligence projects and tasks.
Load data from heterogeneous data sources such as databases, flat files, and web services into the Data Warehouse; developed complex dimensions/facts using various Oracle Knowledge Modules for maintaining historical (SCD2) data and incremental updates; also implemented real-time integration using the Change Data Capture process on student data.
Model, Design, Development of data warehouse/ETL solutions using Oracle ODI, PL/SQL, AWS, Python.
Work in ODI Designer designing the interfaces, defining the data stores, interfaces, and packages; modify the ODI Knowledge Modules to create interfaces that load and transform data from sources to target databases; created mappings and configured multiple agents per specific project requirements.
Optimize solutions using various performance tuning methods (SQL tuning, ETL tuning (i.e. the optimal configuration of transformations and memory parameters), Database tuning using Hints, Indexes, partitioning, Materialized Views, Procedures, functions etc.)
Coordinate data extracts for testing efforts - unit testing and dry runs; Provide information for cutover planning with business to get sign-off and production move
Perform customization of knowledge modules wherever required; implement in-flow data quality and audit the data.
Perform ODI tasks such as establishing Oracle DI development standards; created ODI Repositories, Agents, Contexts, Data Servers, and both physical & logical schemas in Topology Manager; created source and target model folders, models, and data stores
Creation and maintenance of ODI users and installation of ODI client/server software, creating work and master repository.
Creation of ODI Standalone and Java Agents and maintenance of the agents on WebLogic server.
Automate Scenarios, Packages, Load plans as per business need (Hourly, Daily, Weekly, Monthly and Semi-Annual and Manual)
Create new data servers; reverse-engineer new data stores and promote them across all environments, including Dev, QA, and Prod.
Environment: ODI 12.2.1.2.6/11.1.1.x, IBM Data Manager, Unix Script, DB2, PL/SQL, Oracle 11g, MS SQL Server 2014, SQL Loader, ServiceNow, RTC, PowerPoint, AWS, WebLogic, Snagit Editor, Linux & MS Visio, SharePoint.
Florida Blue, Jacksonville, FL Sep 2016 – August 2017
Sr. ODI Analyst (ELT) and Developer
Responsibilities:
Development & Support:
Requirements gathering, Analysis, gap analysis, Designing, Technical documents preparation
Development of various data feeds from disparate sources such as (relational databases, flat files) using Oracle Data Integration (ODI) Tool to the DWH or to separate Data Mart
Using ODI, Developed, maintained various Packages, Interfaces, Variables, and Models, which populated the Data into the Staging tables and to DWH
Used ODI Designer for importing tables from the database, reverse engineering, to develop projects, and release scenarios
Worked on Check Knowledge Module (CKM), Loading Knowledge Module (LKM), Integration Knowledge Module (IKM), RKM & JKM
Assist business functions to coordinate on the extraction of data from the warehouse to the function specific data marts such as - Capitation, Health, TOS CAT and DataMart
Facilitate ODI ELT data mapping by gathering legacy technical information as needed and providing ODI technical information (table names, field names, Scenarios, Job Scheduling information, etc.)
Maintained data quality by implementing Flow Control.
Implemented CDC and Slowly Changing Dimensions Type-2 for customer and claims data
Worked on Oracle GoldenGate for data integration and replication.
Used GoldenGate and Journalizing interfaces to populate data from OLTP systems into Data Marts.
Optimized solutions using various performance tuning methods (SQL tuning, ETL tuning (i.e. the optimal configuration of transformations and memory parameters), Database tuning using Hints, Indexes, partitioning, Materialized Views, Procedures, functions etc.)
Coordinate data extracts for testing efforts - unit testing, cycle testing, and dry runs; Provide information for cutover planning with business to get sign-off and production move
Environment: ODI 12.2.1.2.6/11.1.1.x, Unix Script, IBM DB2, PL/SQL, Oracle 11g, MS SQL Server 2008, Tableau 9.x, ReqPro, Visio, Snagit Editor, PowerPoint, Flat files, Web services, WSDL, XML
AT&T, Alpharetta – GA Jun 2014 – Aug 2016
Sr. ODI Consultant/Architect
Responsibilities:
Development & Operations:
Requirements gathering, Analysis, gap analysis, Designing, Technical documents preparation
Implemented a data mart to support efficient rollup datasets that enabled the Americas Sales Force, Commercial Ops, and higher management to track real-time Orders, Opportunities, and Revenue, Orders & Revenue forecasting, and Install Base insight down to the lowest entities
Integration of various source systems from the EDW to capture Americas Sales data into the SQL Server Report Data Mart using ODI ELT S2T mappings
Implementing the extraction, loading, and transformation (EL-T) of data using Oracle Data Integrator; configuring the ODI master and work repositories, ODI models
Design and develop ODI project using interfaces, packages, procedures, variables and knowledge modules.
Developed Source to Target Mappings using ODI Designer to load data from source systems like SQL Server and flat files to the target database.
Worked on Check Knowledge Module (CKM), Loading Knowledge Module (LKM), Integration Knowledge Module (IKM), RKM and JKM.
Worked on Oracle GoldenGate for data integration and replication.
Used GoldenGate and Journalizing interfaces to populate data from OLTP systems into Data Marts.
Implemented Slowly Changing Dimensions Type-2 for customer data and Type-1 for Channels, Products, and Promotions.
Implemented reports and dashboards that enabled report users to analyse data (slicing and dicing) by Zones, Regions, AEs, AMs, and Fiscal Calendar parameters (FY, FM, FW, CYD, PYTD, PY2TD, etc.) down to the lowest granularity, i.e., individual customer BILL TO and SHIP TO sites
Developed UNIX scripts for scheduling jobs, sending alerts on failure, and file archiving
Involved in unit, integration, and performance testing, test case preparation, design reviews, and reconciliation
Environment: ODI 11.1.1.7.0 (Data Integrator), Oracle 11g, OBIEE 11g, BIACM, BIApps, MS SQL Server 2008
Florida Blue, Jacksonville, FL Mar 2013 – May 2014
Sr. ODI Analyst (ELT) and Developer
Responsibilities:
Development & Support:
Target functions – assignments involved creating the staging, dimension, and fact tables across various environments and loading data from different sources (DB2, Netezza, SQL Server, and files) to build a one-stop data hub for reporting and analysis, or into function-specific Data Marts, using the ODI ELT tool
Development of various data feeds from disparate sources such as (relational databases, flat files) using Oracle Data Integration (ODI) Tool to the DWH or to separate Data Mart
Using ODI, Developed, maintained various Packages, Interfaces, Variables, and Models, which populated the Data into the Staging tables and to DWH
Used ODI Designer for importing tables from the database, reverse engineering, to develop projects, and release scenarios
Worked on the Check Constraint Knowledge Module (CKM), Loading Knowledge Module (LKM), and Integration Knowledge Module (IKM)
Assist business functions to coordinate on the extraction of data from the warehouse to the function specific data marts such as - Capitation, Health, TOS CAT and DataMart
Facilitate ODI ELT data mapping by gathering legacy technical information as needed and providing ODI technical information (table names, field names, Scenarios, Job Scheduling information, etc.)
Maintained data quality by implementing Flow Control.
Implemented CDC and Slowly Changing Dimensions Type-2 for customer and claims data
Worked on various ongoing ODI assignments required to revamp the existing ODI process for new servers and for optimized throughput
Assisted the BCBSFL ODI ELT team in development/support for notification of erroneous data and production job failures
Accountable for financial analysis and planning in the Corporate Finance team, identifying metrics for sales growth and forecasting company performance using Tableau
Experience in Tableau Desktop/Server, creating highly interactive Data Visualization Reports and Dashboards using features such as Data Blending, Calculations, Filters, Actions, Parameters, Maps, Extracts, Context Filters, Sets, Aggregate measures, etc.
Involved in Tableau Server installation and configuration, creating Users, Groups, and Projects, and assigning Privileges, etc.
Coordinate data extracts for testing efforts - unit testing, cycle testing, and dry runs; Provide information for cutover planning with business to get sign-off and production move
Project Execution:
FL and SC State Mandates “All Payer’s Claims Data” Extracts
Behavioural Health from American Psychiatric Association & the AMA mandate requirements – for TOSCAT (Type of Service Category)
Federal mandate – ICD10 Remediation Process
Capitation Redesign
Monthly Claim and Enrolment Audit Check
Environment: ODI 11.1.1.6, Unix Script, IBM Netezza, DB2, Oracle 10g, MS SQL Server 2008, Tableau 7.x, OBIEE 11g
AT&T, St. Louis, MO/Alpharetta, GA Mar 2011 – Feb 2013
ODI/OBIEE Lead
Responsibilities:
Requirements gathering, Analysis, gap analysis, Designing, Technical documents preparation
Defined new data marts for Americas Sales Force data structures, functions, data warehouse fact & dimension tables and views
Modified and created the ODI Knowledge Modules such as LKM, IKM, CKM, and JKM.
Creating ODI packages, scenarios using interfaces, variables, procedures.
Creating incremental loads, facts and dimensions, designed new data elements and tables for data marts
Installing and Configuring ODI 11g on all the environments with standalone agents for custom Data Warehouse project.
Created Repositories, Agents, Contexts, and both physical & logical schemas in Topology Manager for all the source and target schemas.
Created ODI Packages, Jobs of various complexities and automated process data flow.
Involved in the installation, configuration, customization, and extension of OBIEE, OBIA, and ODI.
Worked with Enterprise Manager for deployment and control of repository, web catalog and configuring parameters.
Responsibilities included Building of RPD application design for Physical, Logical, and Presentation layers, the configuration of those layers, development of Reports, Dashboards, Publisher and testing, and deployment
Created proactive agents iBots and interactive Dashboards to alert the Administrative team about changes. Managed day-to-day Repository, Scheduler & iBots.
Implemented Object Level Security for the objects like Dashboards and Reports, and Data Level Security for Principal and Teacher Evaluation dashboard.
Enhanced Query performance, by analyzing & modifying SQL Query and Filters, for all the reports and dashboard prompts, to minimize the delay in Physical Query Execution
Responsible for migrating the web catalog and RPD across the DEV, SIT, UAT, and PROD environments.
Environments: ODI 11.1.1.5, OBIEE 11.x, BIApps (Financial Analytics), Oracle EBS 11i/R12, SQL Developer, Toad, Windows XP.
Nissan Motors, Dallas, TX Jan 2010 – Feb 2011
Lead Developer
Responsibilities:
Performed analysis to understand the data structure and data flow of the legacy system, and defined an implementation plan (data flow) according to the time constraints and the NMAC DWH infrastructure on the new NMAC technology stack
Defined the ODI architecture of the new application using Topology, set up the various database schemas, and set up the ODI projects
Defined a data model to hold Filter, Scale, and various external data feeds (Bid Wanteds, Bond Characteristics, Scale, Customer Information, Bid Wanted Matching, Pricing, TIPS, system notifications, etc.)
Implemented ODI Mappings to load data from and to Oracle, MS SQL Server, Flat Files, and XML Files etc. using ODI 11g.
Set up the scheduling option per business hours to turn the Bidding Engine on and off
Coordinate data extracts for testing efforts - unit testing, cycle testing, and dry runs; Provide information for cutover planning with business to get sign-off and production move
Environments: ODI 11.1.1.5, OBIEE 11.x, BIApps (Financial Analytics), DAC, Oracle EBS 11i/R12, SQL Developer, Toad, Windows XP.
3 Com, Singapore Jan 2009 – Dec 2009
BI Developer
Responsibilities:
Develop, implement and deploy data warehousing and business analytical solutions
Troubleshoot all issues and figure out the root cause/s of those issues
Standardize and improve data quality through periodic data cleansing and auditing processes
Maintain and monitor the data documentation for all data warehouse and reporting formats
Collaborate with end-users to check the efficiency level based on report requirements
Work with various teams across the organization, clients and other stakeholders to successfully carry out the enterprise level BI solutions
Worked extensively with the prebuilt OBI Marketing Applications & Custom repository.
Identified Facts and Dimensions (with levels & hierarchy) and developed OBIEE Repository (RPD) at 3 layers i.e., Physical Layer (connecting Data sources & import schema), Business Model / data mapping (create Logical tables & data source mapping) and Presentation Layer level (create presentation catalogs), create presentation folders, test and validate the model.
Implemented Security by creating roles & web groups.
Environment: OBIEE 10.1.3.4.1, Informatica 8.6, DAC, Oracle 10g, PL/SQL
British Telecom, U.K/India Apr 2006 – Dec 2008
Developer (Siebel Analytics/OBIEE)
Responsibilities:
Designing the architecture of the data warehouse
Extracted data from different source systems such as Siebel CRM and flat files
Structuring the nightly refresh of data load and scheduling it through DAC (Data warehouse administration Console)
Implementation of the data level and view level security, involved in developing the Incremental ETL
Iterative development of analytics dashboards, Answers requests, charts, pivot table layouts, dashboard prompts
Merged new RPD, dashboard and Answers requests with existing analytics content
Architected analytics solution to analyze multiple client reporting needs
Worked closely with Users, Developers, and Administrators to resolve ongoing Production Problems by reviewing design changes made to production systems and made corresponding changes to the Siebel Analytics Repository.
Developed ETLs for Data Extraction, Data Mapping and Data Conversion using Informatica Power Center.
Created Presentation catalogs, and Folders in Presentation Layer, and Performed Global Consistency Check in Siebel Analytics Administration Tool
Built Reports and Dashboards for Business Users, and Set up Usage Tracking database.
Implemented Dynamic Dashboard Prompts to zoom into different segments of the customers and Brokers, to achieve performance optimization and easy navigation of reports
Enhanced Query performance, by analyzing & modifying SQL Query and Filters, for all the reports and dashboard prompts, to minimize the delay in Physical Query Execution
Environment: OBIEE 10.1.3.2 (Siebel Analytics Applications), Administration Tool, Informatica, SQL Server 2005, Reporting Services, Integration Services
Manulife, Toronto, Canada/India Jan 2005 – Mar 2006
Siebel/Analytics Developer
Responsibilities:
Upgraded and migrated the Siebel Analytics repository (.rpd) database from Siebel Analytics 7.8.2 to Siebel Analytics 7.8.4
Designed and built Star schemas
Interaction with the end users for the requirements
Participated in the complete life cycle of the project from Project Initiation to Development and Maintenance.
Designed repository, with complex Time Series Calculations, for prior months/ quarters/ years.
Created Presentation catalogs, and Folders in Presentation Layer, and Performed Global Consistency Check in Siebel Analytics Administration Tool
Built Reports and Dashboards for Business Users, and Set up Usage Tracking database.
Implemented Dynamic Dashboard Prompts to zoom into different segments of the customers and Brokers, to achieve performance optimization and easy navigation of reports
Enhanced Query performance, by analyzing & modifying SQL Query and Filters, for all the reports and dashboard prompts, to minimize the delay in Physical Query Execution
Configured repository (RPD) to use the LDAP authentication
Define the mapping from various physical databases to Business Layer to create the various subject areas in Presentation Layer
Defined dimensions and facts in the subject areas
Implemented OBIEE security for users and groups in administrator tool
Created reports by applying views, filters, calculations and conditional formatting