
Data Developer

Visakhapatnam, Andhra Pradesh, India
October 25, 2019

Contact this candidate





Over 8 years of experience playing multiple roles on BI projects. I have participated in all phases of the project lifecycle and have extensive experience in Data Modeling and Oracle Business Intelligence Enterprise Edition (OBIEE).

Created and maintained Data Flow Diagrams and Process Diagrams, and the logical and physical data structures of databases, using Erwin and Toad Data Modeler for OLTP/OLAP environments.

Designed architectures for OLTP and OLAP (DW) environments, including creating database structures and loading/importing data from multiple vendors' databases for consolidation; applied business rules and transformations in ETL to move data from source systems to Star or Snowflake schemas. Designed and developed Dimension and Fact tables addressing SCD Type 2.
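The SCD Type 2 pattern mentioned above can be sketched in plain Python. This is an illustration only (the actual work was done in the ETL layer); the field names `cust_id`, `effective_from`, `effective_to`, and `is_current` are hypothetical:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended "effective to" sentinel

def apply_scd2(dimension, incoming, business_key, tracked, load_date):
    """In-memory sketch of SCD Type 2: expire the current version of a
    changed row and insert a new version with fresh effective dates."""
    current = {row[business_key]: row for row in dimension if row["is_current"]}
    for src in incoming:
        old = current.get(src[business_key])
        if old and all(old[c] == src[c] for c in tracked):
            continue  # tracked attributes unchanged: keep current version
        if old:
            old["effective_to"] = load_date  # close out the old version
            old["is_current"] = False
        dimension.append({
            business_key: src[business_key],
            **{c: src[c] for c in tracked},
            "effective_from": load_date,
            "effective_to": HIGH_DATE,
            "is_current": True,
        })
    return dimension
```

A Type 1 change, by contrast, would simply overwrite the tracked columns in place instead of appending a new version.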

Performed data profiling, which includes examining data types, length, discrete values and uniqueness, occurrence of null values, typical string patterns and abstract type recognition. The metadata was then used to discover problems such as illegal values, misspelling, missing values, varying value representation, and duplicates.
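The profiling steps above (null counts, distinct values, lengths, abstract string patterns) can be illustrated with a small Python helper. The `profile_column` function and its digit/letter abstraction are a sketch, not the original tooling:

```python
import re
from collections import Counter

def profile_column(values):
    """Profile a column: null count, distinct values, length range, and the
    most common abstract string patterns (digit -> 9, letter -> A)."""
    non_null = [v for v in values if v not in (None, "")]
    def pattern(v):
        return re.sub(r"[A-Za-z]", "A", re.sub(r"\d", "9", str(v)))
    lengths = [len(str(v)) for v in non_null]
    return {
        "null_count": len(values) - len(non_null),
        "distinct_count": len(set(non_null)),
        "min_len": min(lengths, default=0),
        "max_len": max(lengths, default=0),
        "top_patterns": Counter(pattern(v) for v in non_null).most_common(3),
    }
```

A value whose pattern diverges from the dominant one (say, "AA9" in a column that is overwhelmingly "A9") is a candidate illegal value or varying representation.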

Followed modeling standards, including naming and definition conventions, proper formats, and definitions of entities and attributes, along with maintaining entity and referential integrity, relationships, change management, etc.

Follow SDLC guidelines and standards; support multiple projects by providing training on performance-based SQL to business users and clients. Experience delivering data marts using Ralph Kimball's methodology (Star Schema).

Performed gap analysis between current and target data structures, documented new business and data rules along with data models, and obtained periodic signoff from key stakeholders.

Created the design of the Data Reception Area (DRA) using the Materialized View concept to load data from services and network inventory sources.

Created the design of the Real Time System (RTS) in a 3NF model and integrated heterogeneous sources to bring them onto one platform.

Created the Operational Data Store (ODS) for reporting purposes and to send batch feeds to downstream consumers.

Played different types of roles as BI Developer, Senior BI Developer, and BI Team Lead in various BI Projects.

Strong technical and functional skills along with integration testing and strong debugging, problem solving / analytical skills. Good communication and user interface skills, the ability to work independently and within a team.

Experience working in Tableau desktop to create dashboards, workbooks.

Oracle Business Intelligence Foundation Suite 11g Certified Implementation Specialist.

Experience in designing, developing, and providing production support for Dashboards, Union Reports, and Prompts in OBIEE, and in using the options available in Analytics.

Experience in implementing/Supporting Data security using GROUP and UI security using WEB GROUPS in OBIEE.

Experience in Data Extraction, Transformation and Loading in Data Warehouse Environment using ODI.

Experience in designing, developing, and supporting ODI and Informatica ETLs, views, and materialized views, and in modeling in OBIEE.

Experience in providing end-to-end business intelligence solution by configuring metadata and building OBI Repository.

Experience in development of BIAPPS with Oracle Business Intelligence (OBIEE) and Oracle Data Integrator (ODI) as an Analyst and Developer.

Experience in product development of BIAPPS.

Assist the current support team in all phases of the product life cycle, create documentation, conduct knowledge transfer sessions, communicate directly with users, and implement new modules as needed.

Working knowledge of the Oracle eBusiness Suite applications and the ability to develop analysis in OBIA.

Maintain the ODI Administration Console.

Created Informatica Mappings and Mapplets using active and passive transformations and have the experience in using Data Warehouse Administration Console (DAC) to schedule the Informatica jobs and metadata management.

Expertise in designing the OBIEE Analytics Repository with dimension hierarchies, level-based measures, time series functions, and initialization blocks and variables, depending on the needs of business reports, using the OBIEE Admin tool.

OLAP Tools: OBIEE 10g, 11g, 12c, OAC (Oracle Analytics Cloud)

Integration Tools: ODI 11g, ODI 12c, Informatica 9.0

Expertise in full life cycle Business Intelligence implementations and have an understanding of all aspects of an implementation project using OBIEE 11.1.1.x/OBIEE 10.1.3.x/Siebel Analytics 7.x.

Experienced in gathering reporting and analysis requirements, documenting the report specifications, implementing the metadata layers including Physical, Business Model and Mapping, and Presentation layer.

Highly skilled at configuring OBIEE Metadata Objects including repository, variables and reports.

Experienced in designing customized interactive dashboards in OBIEE using drill down, guided navigation, prompts, filters and variables.

Experience in installation, configuration and administration of OBIEE 10g/11g.

Experienced in setting up and working in a multi user development environment (MUDE).

In-depth knowledge of Fusion Middleware technology, concepts, and data warehouse architecture.

Deep understanding of ODI ETL architecture and OBIA pre-built data models and able to extend or develop new mappings.

Worked on OBIA Financial, CRM, and HRMS (R12) applications, managing performance, caching, and customizing reports.

Administrative roles such as security configuration for Users, Groups and Application Roles, system maintenance, performance tuning, monitoring system metrics, usage tracking, debugging, managing Catalog root directories, unit testing, version control and content migrations.

Implemented object and data level security.

Report development using OBIEE Analytics & BI Publisher to build interactive dashboards, ad-hoc reports, pivot tables, bar charts, pie charts, tickers, performance tiles, column selectors, drillable reports, guided navigation, union reports, presentation variables, and inline prompt reports.

Working knowledge of cloud computing: IaaS, PaaS and SaaS.

Working knowledge in Informatica Power Center.

Strong in OBIEE Metadata development including (Projects, MUD Environment, Migration, Deployment, Applying patches, Maintenance etc.).

OBIEE metadata (RPD) design and development, including design and definition of logical tables, physical sources, dimensions, columns and aggregation rules.

Responsible for the deployment and maintenance of the Physical Layer, Business Model and Presentation layers in OBIEE repository with security.

Experienced in working with Variable Manager to define session and repository variables and initialization blocks to streamline administrative tasks and modify metadata content dynamically.

Expertise in data analysis, time series analysis, gap analysis, data modeling and model designing, development, customization, testing, identification of dimensions, facts & aggregate tables, measures and hierarchies.

Well-developed ability to analyze and understand client problems and requirements, with excellent interpersonal and communication skills.

Experienced in creating interactive UI designs (OBIEE Answers, Delivers, and Dashboards), including user interface design using CSS, HTML, and JavaScript.

Understanding of DWH and DWH methodologies. Experience importing tables and creating logical tables, facts, metrics, attributes, transformations, and partitions. Schema design to support clients' reporting requirements; suggest changes to existing data models; recommend new data sources, tools, and analyses needed to proactively answer questions from the development team.

Involved in design, enhancement, and development of applications for OLTP, OLAP, and DSS using dimension modelling with RDBMS.

Developed Tableau Trending reports to perform data quality checks based on the trends and patterns.

Strong decision-making and interpersonal skills with result oriented dedication towards goals.

Worked in onsite-offshore model.


Master's in Computer Science from the Monterrey Institute of Technology and Higher Education.


Business Intelligence Reporting

OBIEE 11g, OBIEE 10.1.3.x, Siebel Analytics 7.x, BI Publisher, Tableau 9.x, ETL Tool - Oracle Data Integrator (ODI 11g)

Concepts and Methods

Business Intelligence, Data Warehousing, Data Modeling, Requirement Analysis

Other Tools and software

SQL*Plus, SQL*Loader, Toad, SQL Assistant



Operating Systems

UNIX, Windows


Databases

Oracle 7.x/8/8i/9.x/10g/11g, Teradata

Testing Tools

Quality Center

Tools & Utilities

Vi Editor, SCP, MS Office Suite

Data Modeling Tools

ERwin 9.5/8.2/7.3/7.0, Power Designer 15/12/11, Embarcadero ER Studio 6.6, IBM Rational Software Architect 7.1, MS Visio 2000/2007/2010

ETL Tools

Matillion (AWS Redshift), Alteryx, Informatica PowerCenter 8 (trained)

Big Data

Hadoop, HDFS 2, MapReduce, YARN, Hive, Spark-SQL, PIG, HBase, Sqoop, Kafka, Oozie


Sr. OBIEE Developer – Applied Materials – Santa Clara, CA Apr 2019 – Present

Applied Materials is the leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world.


Partner with the management and operations teams to design, build, and maintain a collective data repository and front-end dashboard development.

Create business intelligence tools and reports, such as physical data models and dimensional analyses.

Design, code, test, and aggregate results from SQL queries to provide information to users.

Schedule meetings with the business, work on production support issues, collect requirements for enhancements, change the underlying RPD, and build/modify dashboards.

Worked in creating ODI global objects like variables, user functions to use the reusable functionality of ODI.

Optimized the Knowledge modules by adding parallel hints for increasing the efficiency of ODI jobs.

Developed advanced PL/SQL packages, procedures, triggers, and functions, and added various indexes to RDBMS tables to enhance data query and retrieval performance.

Performed environment migrations between Development, QA and Production environments.

Designed and developed the OBIEE Analytics Metadata Repository (.rpd) using OBIEE Server Admin tool by importing the required objects (Dimensions, Facts & Aggregates) with Referential Integrity constraints into Physical Layer using connection pool, developing multiple dimensions (Drill-down Hierarchies) & Logical & Facts Measure objects in Business Model Layer, and creating Presentation Catalog in Presentation Layer.

Provide technical and process leadership for the technical team of ETL (ODI) and BI (OBIEE) developers in all phases of the delivery process: requirements, scoping, prototyping, design, development, testing, release and support.

Lead source system analysis, ETL design and development and system testing to create an integrated data warehouse based on custom and Oracle BI Applications mappings and reports.

Responsible for analysis, design, development, customizations, validations, deployment and post implementation support for OBIEE components including Oracle BI Admin, Answers, Interactive Dashboards, BI Publisher, Scheduler, and Interactive Reporting using OBIEE 11g/12c.

Participate in activities on cross-functional teams in order to achieve goals of BI and IT projects, including work plans, system design, development and testing of solutions.

Work with business users to determine the data source(s) required to deliver the solution.

Document data requirements (source & target) with the ETL team for Informatica package creation / modification.

Create / modify stored procedures to transform data and apply required business rules.

Work closely with the data integration team, the Insights Reporting Manager, and the OBIEE Dashboard Developer.

Key coordinator in the daily operations, configuration and maintenance of the Oracle Business Intelligence Enterprise Edition (OBIEE) repository.

Understand the basic business processes, data quality issues, data use, and any other areas of special focus for the data assigned. Responsible for the maintenance and development of existing and new repositories.

Plan and coordinate production system activities such as upgrades, updates, patching, and other maintenance with department users, IT team, and additional vendors associated with OBIEE. Also perform the quality software code creation, code compiling and testing, version control and release management activities.

Responsible for federating disparate data sources in the logical layer, implementing time series metrics and working with level based metrics for successful financial, clinical, and quality reporting.

Responsible for developing and running MapReduce jobs on YARN and Hadoop clusters to produce daily and monthly reports per users' needs.
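The map/shuffle/reduce phases behind such report jobs can be illustrated in pure Python. This is a sketch of the paradigm, not actual Hadoop code; the `event` field and log records are hypothetical:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit one (key, 1) pair per input record, e.g. per log event."""
    for rec in records:
        yield rec["event"], 1

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: aggregate each key's values into a count for the report."""
    return {key: sum(values) for key, values in groups.items()}

logs = [{"event": "login"}, {"event": "click"}, {"event": "login"}]
daily_counts = reduce_phase(shuffle(map_phase(logs)))  # {'login': 2, 'click': 1}
```

On a real cluster, map and reduce run as distributed tasks and the shuffle is handled by the framework; only the two user functions change per report.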

Responsible for scheduling and managing jobs on a Hadoop cluster using Oozie workflows.

Experienced with Scala and Spark, improving the performance and optimization of existing algorithms in Hadoop using Spark Context, Spark-SQL and Pair RDDs.

Offloading historical data to Redshift Spectrum using Matillion, Python and AWS Glue.

Working on AWS components like S3, DynamoDB and Lambda for troubleshooting issues.

Migration of existing OBIEE OOTB subject areas to a custom Redshift data warehouse.

Administration (including backup and recovery) of OBIEE 11g/12c and SAP BusinessObjects servers (based on Sun SPARC Solaris and Windows Servers).

Plan, design, develop and implement applications, scripts, procedures and metadata for relational databases.

Modify Business Objects Universe and Create OBI Subject Areas using OBIEE Admin tool to build new data models and to extend the capabilities of existing models.

Develop OBIEE reports and dashboards utilizing industry best practices and UI standards.

Monitor, respond to, and resolve tickets/issues submitted by users in OBIEE 12c and SAP BusinessObjects.

OBIEE implementations including all phases of the OBIEE RPD development lifecycle (requirements, design, logical/physical/presentation layer construction, and implementation).

Assist in development of initial and long-term solution architecture.

Develop all ETL routines to source data from client source systems and target the OBIA data warehouse.

Customize, develop, test, debug, deploy, administer and support ODI mappings and work flows within the pre-built Oracle Business Intelligence Application for various business areas.

Develop SQL statements and Oracle PL/SQL stored procedures, database packages and triggers.

Develop Oracle APEX screens and support backend configuration systems. Develop Oracle APEX Web Services and Oracle REST Data Services (ORDS).

Develop RPD models in OBIEE using a dimensional approach with appropriate layers of base facts, dimensions, snapshots, and aggregates.

Develop and maintain API integrations based in Java. Understand dimensional models to support the BI implementation.

Participate in the development of technical design documentation for various stages of the BI implementation.

Create Dashboards/Reports on OBIEE environment based on business requirements.

Install, upgrade, and patch Oracle BI EE 11g/12.x and Oracle APEX 5.x. Install, configure, upgrade, and support WebLogic 11g/12c on Linux and Windows.

Create technical documents to document current database contents, concepts, and mapping between databases.

Participate in the design, development, and analysis of data architecture and warehousing approaches.

Managing Clustering, Load Balancing and User Management with Tableau.

Working knowledge of cloud computing. Set up Star Schema/Snowflake Schema. Design fact and dimension tables; design the physical and logical data model mapping layers.

Design projects in a MUD environment; handle migration, deployment, patching, and maintenance; use ETL processes to extract data from various data sources.

OBIEE metadata (RPD) design and development - including design and definition of logical tables, physical sources, dimensions, columns and aggregation rules.

Responsible for the deployment and maintenance of the Physical Layer, Business Model and Presentation layers in OBIEE.

Proficient in PL/SQL stored procedures/functions, triggers and packages, and worked extensively with tools like SQL Developer and TOAD.

Developed many Reports with different Analytics Views (Drill-Down, Pivot, View Selector and Tabular with Filters) using OBIEE Answers.

Customized the OBIEE Repository (physical, BMM, and presentation layers). Created connection pools, imported physical tables and defined joins in the physical layer of the repository.

Sr. Business Intelligence Developer (Oracle) – Oracle Corporation – Mexico Jul 2015 – Oct 2018


Developing and supporting ETLs: extracting data from source tables and storing it in a warehouse; creating stored procedures and mappings in order to apply business rules and create functional reports.

Responsible for Oracle Business Intelligence implementation/upgrade (OBIEE 11g/12c), Oracle SQL and PL/SQL, APEX, JAVA, Web Services/JSON and HTML/Javascript, configuration, support, and performance tuning as well as for the design, development, administration and optimization to meet client business and technical requirements.

Proficient in SQL across a number of dialects (commonly MySQL, PostgreSQL, Redshift, SQL Server, and Oracle).

Worked on Amazon Redshift and AWS, architecting a solution to load data, create data models, and run BI on top of it.

Loaded data into Hive tables from the Hadoop Distributed File System (HDFS) to provide SQL access to Hadoop data.

Developing in the Oracle BI Repository: creating connections to databases, logical joins between fact and dimension tables, and snowflake and star schemas to satisfy business needs, and finally visualizing the results in a dashboard.

Created dimensional data model to develop operational and executive reports on day to day data quality checks performed in the EBI space.

Integrated the Jira defect data into the same model to report on data quality defects.

The Jira file processing was done using ODI; the flat files were loaded into the Oracle database.

Installed ODI and set up the ODI Master and Work repositories.

Created the aggregation layer to report on weekly and monthly level checks and defects.

Setup user groups and configured the use of LDAP via the WebLogic Console tool for OBIEE security.

Monitored user roles and policies, updating them as per business requirements via the Enterprise Manager tool.

Provided work estimates for report development and related activities.

Worked with end users to identify gaps in current report design.

Apart from the above responsibilities, led the data quality operations team performing the day-to-day data quality checks.

Created complex reports in Excel using macros and other advanced Excel functions (COUNTIFS, SUMIFS, etc.).
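For reference, the conditional aggregation behind COUNTIFS/SUMIFS can be expressed in a few lines of Python. The helper names and the `deals` sample data are hypothetical:

```python
def countifs(rows, **criteria):
    """Count rows meeting every column=value criterion (cf. Excel COUNTIFS)."""
    return sum(1 for r in rows if all(r[c] == v for c, v in criteria.items()))

def sumifs(rows, sum_col, **criteria):
    """Sum `sum_col` over rows meeting every criterion (cf. Excel SUMIFS)."""
    return sum(r[sum_col] for r in rows
               if all(r[c] == v for c, v in criteria.items()))

deals = [
    {"region": "East", "status": "open", "amount": 10},
    {"region": "East", "status": "closed", "amount": 5},
    {"region": "West", "status": "open", "amount": 7},
]
```

Each keyword argument plays the role of one criteria-range/criteria pair in the Excel formula.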

Created data quality operational reports using Tableau Desktop and published them to the EBI Tableau Server.

Created live reports and scheduled extract data source for various requirements for Tableau.

Big Data implementation: used MapReduce and Python (Anaconda) to retrieve specific data from different sources (CSV and text files) and stored the information in a database.

Created a pipeline to retrieve information from large CSV files, clean the datasets, and store them in our warehouse, using a MapReduce methodology to extract the information, load it into staging, and apply business rules.
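A simplified Python sketch of such a cleaning step before staging (the column names, the `required` rule, and the sample data are hypothetical):

```python
import csv
import io

def clean_rows(reader, required=("id", "amount")):
    """Normalize headers/values and drop records missing required fields
    before they reach the staging area."""
    for row in reader:
        row = {k.strip(): v.strip() for k, v in row.items() if k is not None}
        if any(not row.get(f) for f in required):
            continue  # reject incomplete records
        row["amount"] = float(row["amount"])  # type the measure column
        yield row

raw = "id, amount ,note\n1, 10.5 ,ok\n2,,missing\n3, 7 ,fine\n"
staged = list(clean_rows(csv.DictReader(io.StringIO(raw))))  # rows 1 and 3 survive
```

Because `clean_rows` is a generator, the same logic streams over arbitrarily large files without loading them into memory.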

Involved in improving the performance and optimization of the existing algorithms in Hadoop using Spark Context, Spark-SQL, and Data Frames.

From the ERP application (Oracle Enterprise Resource Planning), created an ETL to process financial and booking data; designed fact and dimension tables and used them to create reports in OBIEE.

Handle daily production issues such as missing or corrupted data: verifying the SQL behind reports (tables, unions, joins, filters), checking the ETL and PL/SQL packages and procedures that create fact and dimension tables, and finally the source data.

Provide support for HR, supply chain, financial and inventory ETLs and reports.

For booking reports, re-designed and tuned ETLs to run within a specific window (2 hours). Before this project, all ETLs were running between 6-8 hours, depending on the data; I achieved completion of this process in 1 hour and 30 minutes.

Tuning experience: reducing the time ETLs take to extract information and apply transformations.

Environment: PL/SQL, ODI, Informatica, OBIEE, Oracle 11g/12c, Linux, Python, Hadoop, MapReduce, Jira, Confluence, Jenkins.

ODI Developer - Banco Santander (Santander) – Mexico City, Mexico Nov 2013–Jul 2015


Implemented SCD Type 1 and Type 2 using ODI.

Developed specifications (Document) based on the requirements, such as Business Requirement Documents, High Level Document and Low Level Document.

Extensively used ODI Designer for importing tables from database, reverse engineering, to develop projects, and release scenarios.

Implemented Change Data Capture (CDC) feature of ODI to minimize the data load times.
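ODI implements CDC through journalizing knowledge modules; a simpler timestamp-watermark sketch in Python illustrates the same incremental-extract goal (the `updated_at` field and integer timestamps are hypothetical):

```python
def incremental_extract(source_rows, last_watermark):
    """Timestamp-watermark CDC sketch: pull only rows changed since the
    last load and advance the watermark for the next run."""
    changed = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark
```

Each load persists `new_watermark`, so the next run scans only rows stamped after it, which is what keeps the data load times down.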

Created Repositories for DEV and migrated it to QA environment.

Set up the Topology including physical architecture, logical architecture and contexts.

Implemented and enhanced different Knowledge Modules in mappings for loading and integrating the data from sources to targets.

Designed interfaces to load data from flat files and CSV files into the staging area (Oracle) and then into the Oracle data warehouse.

Performed reverse engineering for the data sources and targets.

Developed, supported and maintained ELT (Extract, Load and Transform) processes using Oracle Data Integrator (ODI).

Analyzed Session log files in operator navigator to resolve errors in mapping and managed session configuration.

Developed Triggers, Procedures, Functions using PL/SQL.

Used markers and memos to distinguish interfaces and projects for the ease of other developers.

Worked closely with the Data Architects in doing source data analysis, rectifying the requirement documents, creating source to target mappings.

Performed unit and system testing for ETL Packages.

Created PL/SQL procedures, functions and triggers for tasks outside of ODI.

Created ODI Interfaces using LKM, CKM & IKM Knowledge Modules.

Created load plans to run the scenarios in a hierarchy of sequential and parallel steps.

Environment: ODI, Oracle BI Applications HR and Financial analytics, Oracle 10g, PL/SQL, Toad, SQL Developer.

Java Developer – Sabritas – Mexico City, Mexico Aug 2011 - Nov 2013

Sabritas is the brand under which PepsiCo sells Frito-Lay products in Mexico, such as Cheetos, Fritos, Doritos and Ruffles. It is also the namesake for its own line of potato chips. Frito-Lay also sells variations of its products under the Sabritas brand in U.S. states bordering Mexico. In some seasons, bags of Sabritas contain unwrapped plastic Tazos (known as Pogs in the U.S.). It also has several local products such as Crujitos, Poffets, Rancheritos and Sabritones.


Developed user interfaces using JSP, HTML, CSS, JavaScript, jQuery and Ajax, and also developed SOAP-based Web Services using JAXB.

Wrote SQL statements to store and retrieve data from Oracle and developed web pages using HTML, CSS, JSP and used JDBC for database connectivity.

Added Maven support to existing projects.

Developed and Modified tables, views, Triggers, stored procedures, packages.

Created SOAP XML web services to perform validations using third party systems during the sales flow.

Involved in design, coding, unit and system testing, documentation, and assisting in training and implementation of projects, applications, workflows, etc.

Developed web Components using JSP, Servlet, Struts under J2EE Environment.

Developed web application for recovering missing customer orders using JSP, JPA, SQL, JQuery and Ajax.

Built more client-interactive web pages utilizing jQuery plugins for drag and drop and autocomplete, along with AJAX, JSON, ReactJS, NodeJS, JavaScript, and Bootstrap.

Responsible for writing Struts action classes, Hibernate POJO classes and integrating Struts and Hibernate with Spring for processing business needs.

Designed, developed, and analyzed the front-end and back-end using JSP, Servlets and Spring.

Worked in an Agile Scrum Development environment.

Involved in AGILE Methodology process which includes bi-weekly sprint and daily scrum to discuss the design and work progress.

WebLogic application server was used to host the Application.

Used GIT for version control and Eclipse as IDE.

Deployed the applications into LIVE and QA Environment.

Developed Java utility programs to retrieve data from Kafka.

Involved in configuration and deployment of the application on Apache Tomcat.

Developed J2EE applications (MQ Series) using WebSphere application server.

Environment: Java, J2EE, Spring, Hibernate, Servlets, JSP, jQuery, SQL, JUnit, XML, Eclipse, ANT, JBoss, SOAP, WSDL, OpenJPA, Web Services, HTML, CSS, JavaScript, MVC 3.5, SOAP UI, Google App Engine SDK, Apache JMeter, Tomcat, Jenkins, Quartz Scheduler, PMD, Jasmine, ActiveMQ.
