PRAVALIKA REDDI
****************@*****.***
Professional Summary:
Over 10 years of hands-on experience in Data Warehousing/Informatica ETL/OBIEE implementations and migrations across various industries.
Involved in complete SDLC – Requirement gathering, Analysis, Design, Development, Testing, Implementation, Production, Maintenance.
Worked across various business domains – Telecom, Financial, Retail and Logistics.
Expertise in Informatica PowerCenter and its components, including Mapping Designer, Workflow Manager, and Workflow Monitor.
Hands on Experience in data analysis on SQL Server and Oracle databases, identifying and resolving bottlenecks to optimize query performance.
Experienced with data analysis and data mapping for ETL processes, and with scheduling tools such as DAC, Informatica Scheduler, Airflow, and AutoSys.
Hands-on experience in dimensional data modeling, Star/Snowflake schemas, fact and dimension tables, and physical and logical data modeling.
Experienced in developing complex PL/SQL collections, exception handling, triggers, stored procedures, packages, and views in databases such as Oracle and SQL Server.
Proficient in Oracle Business Intelligence and Oracle data warehousing, building dynamic, interactive, and complex reports and dashboards using OBIEE/OAS/OAC.
Experienced with WebLogic Semantic Model (WSM/RPD) creation.
Experience with Atlassian JIRA & Confluence, and with the itrack and AYS ticketing systems.
Experienced in Agile/Scrum software development methodologies.
Worked on OBIEE Admin activities (Installation, Configuration, Security, LDAP, Testing).
Experience in coordinating development efforts between Onsite and Offshore teams.
Education:
B. Sc.: Bachelor of Science in Computer Science, Andhra University, India
Masters: Master of Computer Applications, Andhra University, India
Skills:
Languages
C, SQL, PL/SQL, Shell Scripting, T-SQL, core java
ETL Tools
Informatica PowerCenter 9.x, 10.2/10.4/10.5.2
Business Intelligence/Reporting
OBIEE 11g/12c, OAS 6.4, BI Administration, BI Publisher
Scheduling Tools
Informatica Scheduler, Data Warehouse Administration Console (DAC), OBIEE Agents, Airflow, AutoSys
Database
Oracle 19c/12c, Microsoft SQL Server
Operating System
Windows, Unix.
Experience:
Apr 2020 - Present
Client: Verizon
Role: Data Migration Developer
This application analyzes Verizon Fios Internet/TV data as it is migrated from legacy systems to the target system.
Responsibilities:
Collaborated with business analysts and data architects to gather requirements and implement data integration solutions using Informatica and SQL.
Proficient in optimizing ETL performance using Informatica tuning techniques such as partitioning and parallel processing, improving data processing throughput and efficiency.
Extensively used transformations: Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Router, Sequence Generator, Normalizer, Sorter, Update Strategy, and Union.
Worked on data integration using data files (e.g., XML, JSON), helping transform and standardize data as it flows between systems.
Created complex parameterized workflows and reusable transformations in Informatica to ensure modular and scalable ETL processes.
Developed mapping templates (DB and table object parameterization) for Stage, CDC, Dimension (SCD Type 1, Type 2, and Incremental Load), and Fact load processes.
Experienced with direct/indirect file loading methods in Informatica.
Developed complex PL/SQL procedures, functions, cursors, and packages to support end-to-end ETL processes in a data warehouse environment.
Used cursors, collections (BULK COLLECT, FORALL) and exception handling to efficiently manage and process large data volumes.
Designed custom error logging frameworks using PL/SQL to capture and log ETL errors with detailed audit information for troubleshooting.
Created ETL and data warehouse standards documents: naming standards, ETL methodologies and strategies, standard input file formats, and data cleansing and preprocessing strategies.
Created mapping documents with detailed source-to-target transformation logic, source column information, and target column information.
Built OAS/OAC dashboards and reports on top of the Oracle data warehouse, and supported end users.
Worked with cross-functional teams to keep data processing running smoothly and meet goals.
Automated data analysis and quality checks to identify and resolve issues proactively, maintaining high data accuracy and reliability.
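The bulk-processing and error-logging pattern described above can be sketched in PL/SQL; the table names (stg_orders, dw_orders, etl_error_log) are hypothetical, and this is a minimal illustration rather than the actual project code.

```sql
-- Hypothetical sketch: bulk-load rows from a staging table into a warehouse
-- table, capturing per-row failures in an audit table instead of aborting
-- the whole batch.
DECLARE
  CURSOR c_src IS SELECT * FROM stg_orders;          -- hypothetical staging table
  TYPE t_rows IS TABLE OF c_src%ROWTYPE;
  l_rows   t_rows;
  bulk_err EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_err, -24381);           -- ORA-24381: errors in array DML
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000; -- bounded memory per batch
    EXIT WHEN l_rows.COUNT = 0;
    BEGIN
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
        INSERT INTO dw_orders VALUES l_rows(i);      -- hypothetical target table
    EXCEPTION
      WHEN bulk_err THEN
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          INSERT INTO etl_error_log (err_index, err_code)  -- hypothetical audit table
          VALUES (SQL%BULK_EXCEPTIONS(j).ERROR_INDEX,
                  SQL%BULK_EXCEPTIONS(j).ERROR_CODE);
        END LOOP;
    END;
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```

The LIMIT clause bounds memory per fetch, and SAVE EXCEPTIONS lets the array insert continue past individual row failures, which are then logged for troubleshooting.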
Environment: Informatica PowerCenter 10.5.2, Oracle 19c, OBIEE/OAS/OAC, WinSCP, Toad, SQL, PL/SQL, UNIX
Employer: Infosys Oct 2019 – Apr 2020
Client: Bank of America
Project: OMNI
OMNI Financial migrates data from legacy systems to big data platforms, improving performance and maintainability.
Environment: Informatica PowerCenter 10.2, Oracle, Unix, Hadoop File System (HDFS), Hive, AutoSys, Teradata.
Responsibilities:
Responsible for extracting and loading the data from different data sources.
Responsible for creating transformations such as HTTP and SQL transformations to connect to external servers and write files.
Extracted, transformed, and loaded data by connecting to external servers, then loaded it into the Hadoop file system.
Created different JILs (Job Information Language scripts) to run wrappers, commands, file watchers, boxes, and Informatica jobs.
Used REST API calls over HTTP to connect to and load data into target systems.
Developed and maintained ETL processes using Informatica to integrate data from various sources into Hadoop.
Optimized data workflows for performance and efficiency, ensuring timely and accurate data delivery.
Implemented data profiling and cleansing processes to improve data quality and consistency.
Provided training and support to junior team members on Hadoop and Informatica best practices.
Validated scripts by running them and debugging the SQL.
Coordinated with the offshore team for handoffs on a daily basis.
Worked on performance tuning of Informatica mappings using partitions.
Communicated with the client, gathered requirements, and provided status updates.
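The JIL setup described above (boxes, file watchers, and Informatica jobs) might look roughly like this sketch; all job names, hosts, paths, and the pmcmd service/folder names are hypothetical.

```
/* Hypothetical JIL sketch: a box job containing a file watcher and an
   Informatica command job. Names, hosts, and paths are illustrative only. */
insert_job: omni_daily_box      job_type: BOX
machine: etl_host
date_conditions: 1
days_of_week: all
start_times: "02:00"

insert_job: omni_src_watcher    job_type: FW
box_name: omni_daily_box
machine: etl_host
watch_file: /data/inbound/omni_src.dat
watch_interval: 60

insert_job: omni_load_wf        job_type: CMD
box_name: omni_daily_box
machine: etl_host
condition: s(omni_src_watcher)
command: pmcmd startworkflow -sv INT_SVC -d DOMAIN -f OMNI wf_omni_load
```

The file watcher gates the Informatica job via a success condition, so the workflow starts only after the source file lands.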
Employer: Collabera
Client: AT&T, Dallas, TX Nov 2016 – Oct 2019
ETL/Informatica Developer
Project: Direct TV Conversion
Direct TV data conversion scope and strategy specifies the scope of converted customer and billing data. The conversion is a one-time transformation of customer data (OMS, CRM, Billing) from one system (“Source” or “Legacy”) to another (“Target”), for the purpose of system consolidation and platform upgrade.
Data Cleaning: developed cleaning procedures and scripts to clean data on the source platforms, as required by the new system.
Worked on generating Generic Loader Layout (GLL) files/flat files and forwarding the extracted data to the target conversion environment using Informatica PowerCenter.
Involved in GLL file processing, and the creation of converted accounts on Target staging platform, using translation tables to translate between legacy and target values.
Worked on OMS, CRM and Billing Customer data.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
Extensively used SQL*Loader to load data from flat files into Oracle database tables.
Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and backup of the repository and folders.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Developed mapping parameters and variables to support SQL overrides, and prepared migration documents to move mappings from development to testing and then to production repositories.
Worked on different workflow tasks such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer, and workflow scheduling.
Conducted performance tuning of Informatica mappings and sessions, analyzing session logs and optimizing SQL queries to improve data processing efficiency.
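A flat-file load like the GLL loads described above could be driven by a SQL*Loader control file along these lines; the file path, table, and column names are hypothetical.

```
-- Hypothetical SQL*Loader control-file sketch for loading a GLL-style
-- pipe-delimited flat file into a staging table. All names are
-- illustrative only.
LOAD DATA
INFILE '/data/conversion/gll_accounts.dat'
BADFILE '/data/conversion/gll_accounts.bad'
APPEND
INTO TABLE stg_accounts
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  account_id,
  legacy_system,
  created_dt   DATE "YYYY-MM-DD",
  balance_amt
)
```

Rejected rows go to the BADFILE for inspection, and TRAILING NULLCOLS tolerates short records, both useful when legacy extracts are uneven.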
Environment: Informatica 9.6.1, UNIX, Oracle, SQL, Toad, SQL*Loader
Employer: Dechen Enterprises Pvt. Ltd.
Client: KDD (Kuwaiti Danish Dairy) Nov 2013 – Feb 2016
ETL/Informatica/OBIEE Developer
Project: KDD
KDD is a leading manufacturer and distributor of food and beverage products in the Gulf. This project implements BI solutions across domains such as Oracle Financials, Human Resources, Sales, Manufacturing, Assets, etc.
Responsibilities:
Created mappings with transformations such as Lookup, Joiner, Rank, and Source Qualifier in the Informatica Designer, per business needs.
Created mappings using Unconnected Lookup, Sorter, Aggregator, and Router transformations to populate target tables efficiently.
Shared knowledge with end users and clients, and documented the design, the development process, the process flow, and the schedule of each mapping/job.
Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
Created sessions, extracted data from various sources, transformed it according to requirements, and loaded it into the data warehouse.
Experienced in using DAC to configure, monitor, and schedule ETL routines for full and incremental loads of the Oracle Business Analytics Warehouse.
Worked on time series conversion functions (AGO, TODATE, PERIODROLLING).
Configured LDAP Authentication.
Created users and groups and setup the Data Level Security as well as Object Level Security.
Created new mappings to load the SIS and EAM (security-related) data into the data warehouse.
Designed the tasks, subject areas, and execution plans (EP) in DAC.
Experienced in OBIEE (RPD and Catalog) migration from Dev to UAT and UAT to Production.
Performed Unit Testing for the various reports and dashboards and fixed issues.
Implemented Prompts to facilitate dynamic filter condition to End-users.
Configured DAC to run the Execution Plan from two containers, Maximo75 and Oracle 11.5.10.
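The time-series conversion functions mentioned above can be sketched in OBIEE logical SQL; the subject area ("Sales") and column names ("Time"."Month", "Base Facts"."Revenue") are hypothetical, assuming a time dimension with a "Year" level defined in the RPD.

```sql
-- Hypothetical OBIEE logical-SQL sketch of the AGO and TODATE
-- time-series functions; subject area and columns are illustrative only.
SELECT
  "Time"."Month",
  "Base Facts"."Revenue",
  -- revenue for the same period one year ago
  AGO("Base Facts"."Revenue", "Time"."Year", 1),
  -- year-to-date revenue
  TODATE("Base Facts"."Revenue", "Time"."Year")
FROM "Sales"
```

Both functions rely on the time-dimension hierarchy declared in the repository, so the "Year" level must exist there before these expressions will parse.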
Environment: Oracle BI Applications V 7.9.6.4, Informatica 9.1.0, DAC 11.1.1, Maximo 75, EBS 11.5.10.2, Toad, MS Excel.