
ETL Developer/Big Data

Location: Whitby, ON, Canada
Posted: July 19, 2018

Resume:

Ramana Yanala ac6b3w@r.postjobfree.com

** ****** ****, ******* 647-***-****

**+ years of IT professional experience in design, analysis, programming, and testing using Informatica Power Center, Informatica MDM, Informatica Big Data Edition, and Hadoop ecosystems.

Extensive experience in design, development and implementation of Data marts, Enterprise Data Warehouses and Business Intelligence Solutions.

Strong working experience in the analysis, design, development, implementation, and testing of Data Warehousing solutions, including Data Conversion, Data Extraction, Data Transformation, and Data Loading (ETL).

3+ years of hands-on experience designing and developing big data solutions using Hadoop and its ecosystem.

Good understanding of the different objects in Informatica Designer, Workflow Manager, and Workflow Monitor.

Created mappings using transformations such as Source Qualifier, Expression, Aggregator, Joiner, Router, Filter, Lookup, Update Strategy, Union, Rank, Sorter, Sequence Generator, and Stored Procedure; also created reusable transformations and mapplets.

Experience with Hadoop clusters on major Hadoop distributions such as Cloudera (CDH) and Hortonworks, as well as AWS servers.

Good knowledge and understanding of Hadoop architecture and the various components of the Hadoop ecosystem: HDFS, MapReduce, Hive, Pig, Scala, and Sqoop.

Knowledge of and experience with NoSQL databases such as HBase and Cassandra.

Hands-on experience using Sqoop scripts to import and export data between RDBMS and HDFS.

Hands-on experience writing Pig Latin scripts, Python scripts, and Java programs.

Hands-on experience scheduling jobs through Autosys and Control-M.

Proficient in writing complex Oracle SQL, PL/SQL, and views for database applications.

Very good exposure to Oracle, MS SQL Server, IBM DB2, Teradata and Microsoft APS databases.

Hands on experience working in LINUX, UNIX and Windows environments.

Excellent verbal, written, and interpersonal communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.

Education: Postgraduate Diploma in Computer Applications.

Master of Science.

Work Experience

FCL CO-OP LIMITED, Saskatoon Sep’15 – Present

ETL/Big Data Developer

Co-op is one of the largest retailers in Canada, operating chains of large discount department stores and gas bars. LIGHT HOUSE is an enterprise application for load management and trip management within Co-op's fleet operations. Product owners need to analyze transportation costs, with LIGHT HOUSE acting as the entry point, which helps business users reach important business decisions.

Responsibilities:

Analyzed and understood the business requirements and architecture of the Retail Data Warehouse (RDW).

Involved in implementing the ETL solutions to meet the project requirements.

Worked with Business Analyst to identify and understand source systems, target systems and source to target master data mapping.

Involved in building the ETL architecture and Source to Target mapping to load data in to Data warehouse.

Created mapping documents to outline data flow from sources to targets.

Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.

Worked with different database sources, flat files, and XML files loaded into the Netezza database, utilizing Informatica Big Data Edition (BDE).

Extracted data from Hadoop, modified it according to business requirements, and loaded it back into Hadoop.

Imported mappings and rules into Power Center for scheduling using Tidal.

Debugged errors using Hadoop logs when mappings ran in Hadoop mode, and Informatica logs when mappings ran in native mode.

Used Sqoop to import and export data between relational databases, HDFS, and Hive tables.
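A minimal sketch of the kind of Sqoop import used to land relational data in HDFS and in a Hive table; the connection string, credentials file, table, and HDFS paths below are illustrative placeholders, not the actual project values:

```
# Illustrative Sqoop import of a relational table into an HDFS landing directory.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user \
  --password-file /user/etl/.db_password \
  --table TRIP_DETAILS \
  --target-dir /data/raw/trip_details \
  --fields-terminated-by '\t' \
  --num-mappers 4

# The same table can instead be landed directly in a Hive staging table.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user \
  --password-file /user/etl/.db_password \
  --table TRIP_DETAILS \
  --hive-import \
  --hive-table staging.trip_details \
  --num-mappers 4
```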

Worked on Informatica Power Center tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

Involved in Performance Tuning of mappings in Informatica.

Involved in creating Hive tables, loading them with data, and writing Hive queries.
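A minimal sketch of this kind of Hive DDL and query work, run from a shell wrapper; the database, table, and column names are illustrative assumptions rather than the actual RDW objects:

```
#!/bin/bash
# Illustrative Hive table creation, load from the Sqoop landing directory,
# and a simple reporting query. Names and paths are placeholders.
cat > load_trips.hql <<'EOF'
CREATE DATABASE IF NOT EXISTS rdw;

CREATE TABLE IF NOT EXISTS rdw.trip_details (
  trip_id   BIGINT,
  carrier   STRING,
  cost      DECIMAL(12,2),
  trip_date STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;

LOAD DATA INPATH '/data/raw/trip_details' INTO TABLE rdw.trip_details;

-- Simple aggregate used for reporting
SELECT carrier, SUM(cost) AS total_cost
FROM rdw.trip_details
GROUP BY carrier;
EOF

hive -f load_trips.hql
```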

Implemented daily Oozie coordinator jobs that automate parallel tasks of loading data into HDFS and pre-processing it with Pig.
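A hedged sketch of how such a daily coordinator might be submitted from the shell; the Oozie URL, HDFS paths, and property values are placeholders, and the coordinator.xml that defines the daily frequency and the Sqoop/Pig workflow is not shown:

```
#!/bin/bash
# Illustrative submission of a daily Oozie coordinator job.
cat > daily_load.properties <<'EOF'
nameNode=hdfs://namenode:8020
jobTracker=resourcemanager:8032
oozie.coord.application.path=${nameNode}/apps/daily_load/coordinator.xml
start=2016-01-01T02:00Z
end=2017-01-01T02:00Z
EOF

# Submit and start the coordinator.
oozie job -oozie http://oozie-host:11000/oozie -config daily_load.properties -run
```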

Responsible for tuning Hive and Pig scripts to improve performance.

Implemented unit tests with MRUnit and PigUnit.

Good understanding of source to target mapping and business rules associated with the ETL processes.

Developed complex mappings in Informatica Power Center to load data from various sources using different transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Joiner, and Router.

Reviewed existing code and led efforts to tweak and tune the performance of existing Informatica processes.

Scheduled sessions to extract, transform, and load data into the warehouse database according to business requirements.

Environment: Informatica Power Center 9.6.1, Informatica MDM, Informatica BDE, Mainframes, Toad, SQL Developer, DB2, Netezza, Hadoop ecosystem, Sqoop, Pig, Hive, Python, Java, Oracle 11g, SQL*Loader, Control-M, Teradata, Cognos, and Linux.

Tangerine Bank, Toronto Mar’15 – Aug’15

ETL Developer

The APS Migration project migrates a data warehouse application from Oracle to the Microsoft® SQL Server® Parallel Data Warehouse (PDW) region within the Microsoft Analytics Platform System (APS) appliance. It is an end-to-end project focused mainly on migrating existing jobs to the APS database, providing benefits such as availability, scalability, and improved query performance.

Responsibilities:

Analyzed and understood the business requirements and designed the database for migrating the data from the SQL database.

Analyzed user requirements for the system and identified business rules for data migration.

Identified the tables, mappings, workflows, and folders required to provide data for the new database.

Created Sqoop scripts to import and export data to and from HDFS.
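A minimal sketch of the export direction, pushing processed HDFS data back to a relational target; the SQL Server connection string, table, and directory are illustrative placeholders:

```
# Illustrative Sqoop export of processed HDFS data back to the relational target.
sqoop export \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=STAGE" \
  --username etl_user \
  --password-file /user/etl/.db_password \
  --table DAILY_TXN_SUMMARY \
  --export-dir /data/processed/daily_txn_summary \
  --input-fields-terminated-by '\t' \
  --num-mappers 4
```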

Created MapReduce jobs to process the daily transaction data.

Wrote Pig scripts for data cleansing and pre-processing of the data.
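A small illustrative Pig cleansing script of the sort described here, wrapped in a shell runner; the field names and HDFS paths are assumptions, not the actual feed layout:

```
#!/bin/bash
# Illustrative Pig cleansing/pre-processing of a daily transaction feed.
cat > cleanse_txn.pig <<'EOF'
raw = LOAD '/data/raw/daily_txn' USING PigStorage('\t')
      AS (txn_id:chararray, account:chararray, amount:chararray, txn_date:chararray);

-- Drop records with missing keys and cast amounts to a numeric type
clean = FILTER raw BY txn_id IS NOT NULL AND account IS NOT NULL;
typed = FOREACH clean GENERATE txn_id, account, (double)amount AS amount, txn_date;

STORE typed INTO '/data/clean/daily_txn' USING PigStorage('\t');
EOF

pig -f cleanse_txn.pig
```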

Created MapReduce jobs and Pig scripts to transform and aggregate data for analysis.

Implemented workflow management of Sqoop, Hive, and Pig scripts using Oozie.

Involved in creating Hive tables, loading them with data, and writing Hive queries.

Responsible for building scalable distributed data solutions using Hadoop.

Analyzed large data sets to determine the optimal way to aggregate and report on them.

Developed MapReduce jobs and Pig scripts for data cleansing and pre-processing.

Developed JUnit tests for MapReduce jobs and performed testing using small sample data.

Developed mapping parameters and variables to support SQL override.

Created mapplets to use them in different mappings.

Developed mappings to load into staging tables and then to Dimensions and Facts.

Used existing ETL standards to develop these mappings.

Worked on different tasks in workflows such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.

Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.

Transformed the data by applying business logic so that it adheres to target requirements.

Provided production support, including correction of migrated data according to application needs.

Environment: Informatica Power Center 9.6.1, Informatica MDM, HDFS, Pig, Hive, Oozie, Sqoop, Oracle 11, Mainframes, Toad, Oracle SQL Developer, PL/SQL, SQL Server 2012, XML, Teradata, Tableau, and Unix shell scripts.

Quindell Solutions, Toronto Aug’13 – Feb’15

ETL Developer/BI

Telematics is an application used mainly for tracking vehicles and driving history. The project uses GPS tracking devices to accurately report vehicle locations and record driving history across North America. Driving history is calculated from data gathered by the telematics device and is tied to insurance benefits; it is stored in the data warehouse for future discounts and decision support.

Responsibilities:

Involved in designing, developing, and documenting the ETL (Extract, Transform, and Load) strategy to populate the Data Warehouse from various source systems.

Designed and developed complex mappings in Informatica to load data from various sources such as SQL Server, flat files, Oracle, and XML, using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank, and Router.

Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows, backups, and recovery.

Developed CDC mappings using Informatica Power Exchange 9.1.0

Extensively worked with database and Informatica for performance tuning.

Created and configured workflows, Worklets, and sessions to transport the data to target warehouse SQL Server tables using Informatica Workflow Manager.

Created Mapping Parameters and Variables.

Worked on Database level tuning and SQL Query tuning for the Data warehouse.

Developed PL/SQL stored procedures and UNIX shell scripts for pre- and post-session commands and batch jobs.
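A minimal sketch of a post-session shell command of this kind, calling a PL/SQL procedure through SQL*Plus; the credentials, connect string, and procedure name (etl_pkg.refresh_driving_summary) are hypothetical:

```
#!/bin/bash
# Illustrative post-session script: calls a PL/SQL procedure that refreshes
# summary tables after the load. Credentials and object names are placeholders.
sqlplus -s etl_user/"${ETL_DB_PASSWORD}"@ORCL <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  etl_pkg.refresh_driving_summary;   -- hypothetical procedure
END;
/
EXIT
EOF
```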

Executed sessions, both sequential and concurrent, for efficient execution of mappings, and used other tasks such as Event Wait, Event Raise, Email, Command, and pre/post SQL.

Performed extensive testing on the mappings and wrote queries in SQL to check if the data was loading to the dimension tables and fact tables properly.

Environment: Informatica Power Center 9.5.1, Informatica Power Exchange 9.5, Oracle 11, UNIX, Flat Files, SSIS, Mainframes, PL/SQL Developer, TOAD, XML, SQL Server, and TFS.

GE, Australia Nov’10 – Feb’13

Informatica Developer

Integrated Logistics Solution (ILS) is the standard Logistics and Transportation management solution of GE Energy that supports the Global Transportation requirements of the strategic business units. ILS is a web-based extranet application that enables the collaboration of GE Energy with its strategic Supply Chain partners by providing a standardized interface to access the information required from each partner to plan, execute and track the logistics and transportation needs of the business.

Responsibilities:

Worked closely with the architect in extracting data from source systems, and designed and developed the processes to extract, transform, and load data into the Oracle database.

Extracted data from Heterogeneous source systems like Oracle, Teradata, SQL Server and flat files.

Due to the inclusion of a new jurisdiction, the large volume of data, and the absence of an existing data warehouse, worked on preparing the dimensional model of the warehouse using Erwin.

Used Informatica Power Center Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.

Developed mappings to load data from flat files through various data cleansing and data validation processes.

Used almost all transformations, including Lookup, Aggregator, Expression, Router, Filter, Update Strategy, Stored Procedure, and Sequence Generator.

Designed the mapping document, which serves as a guideline for ETL coding, and implemented CDC, SCD Type I, and SCD Type II mappings.

Worked on various tickets related to Informatica mappings.

Automated UNIX shell scripts to verify the count of records added each day by the incremental data load for a few of the base tables, in order to check consistency.
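A hedged sketch of such a daily record-count check; the table name, connect strings, and support address are placeholders for illustration only:

```
#!/bin/bash
# Illustrative daily consistency check: compares today's incremental row
# count in the warehouse against the source count and alerts on a mismatch.
TABLE=ILS_SHIPMENTS

WAREHOUSE_COUNT=$(sqlplus -s etl_user/"${ETL_DB_PASSWORD}"@DWH <<EOF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT COUNT(*) FROM ${TABLE} WHERE load_date = TRUNC(SYSDATE);
EXIT
EOF
)

SOURCE_COUNT=$(sqlplus -s etl_user/"${ETL_DB_PASSWORD}"@SRC <<EOF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SELECT COUNT(*) FROM ${TABLE} WHERE updated_date >= TRUNC(SYSDATE);
EXIT
EOF
)

if [ "${WAREHOUSE_COUNT// /}" != "${SOURCE_COUNT// /}" ]; then
  echo "Count mismatch for ${TABLE}: source=${SOURCE_COUNT}, warehouse=${WAREHOUSE_COUNT}" \
    | mail -s "Incremental load check failed" etl-support@example.com
fi
```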

Successfully performed unit testing and integration testing while developing the data conversion programs.

Environment: Informatica Power Center 8.6, Informatica Power Exchange, Oracle 10, Toad, PL/SQL, SQL Server 2008, DB2, SSIS, SSRS, XML, T-SQL, UNIX shell scripts, and Teradata.

Australian Health and Hospital Association, Brisbane Apr’07 – Sep’10

Informatica Developer

This assignment was for a project at AHHA to create the corporate Data Warehouse from the operational OLTP system and staging area using the extraction capabilities of Informatica Power Center. Operational sources exist in an OLTP environment based on Oracle. This was a pilot project toward creating the corporate Data Warehouse in Oracle for AHHA using existing operational data for better decision-making.

Responsibilities:

Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, Sequence Generator, Update Strategy, Rank, Joiner and Stored procedure transformations.

Designed the mapping document, which is a guideline for ETL coding; standards for naming conventions and best practices were followed in mapping development.

Extensively used Informatica Decision, Command, Event Wait, Event Raise, and Email tasks.

Optimized the SQL override to filter unwanted data and improve session performance.

Used Debugger to track the path of the source data and to check the errors in mapping.

Prepared the unit testing document covering field-to-field validations and source-to-target counts.

Scheduled workflows comprising different sessions for their respective mappings in order to load data into the Oracle database.

Designed various mappings for extracting data from various sources, including flat files, Oracle, Sybase, SQL Server, IBM DB2, and XML files.

Created, Modified, and documented Oracle Packages, Procedures, Functions, and Indexes.

Assisted the other ETL Developer in resolving complex scenarios.

Migrated final code from Development to Test, Test to QA and finally to Production.

Environment: Informatica Power Center, Informatica Power Exchange, Oracle 11, UNIX, Flat Files, SSIS, Mainframes, PL/SQL Developer, TOAD, XML, and SQL Server.

References are available upon request


