
ETL Developer SQL

Location:
Johnstown, OH
Posted:
January 14, 2022


Resume:

Swetha V adpxbw@r.postjobfree.com

Ph# : 614-***-****

Professional Summary:

•6+ years of IT Experience in Software Analysis, Design and Development in Client/Server systems, Data warehousing and Business Intelligence applications.

•Developed and maintained ETL (data extraction, transformation and loading) applications and mappings to extract data from multiple source systems, including Oracle 10g and SQL Server databases, flat files and CSV files, into the staging area, the EDW and then the data marts.

•Strong experience with various migration projects in IBM InfoSphere DataStage.

•Designed, developed and tested ETL mappings, mapplets, workflows and worklets using Informatica PowerCenter 9.x.

•Experience with dimensional modeling using star schema and snowflake models.

•Extensive experience in designing Parallel Jobs using various stages like Transformer, Join, Merge, Lookup, Remove Duplicates, Filter, Sort, Copy, Funnel, Sequential File, etc.

•Worked on Sequence Jobs to control the execution of the job flow using various activities like Job Activity, Email Notification, Sequencer, Routine Activity and Execute Command.

•Debugged existing ETL processes and tuned job performance by applying proper partitioning methods and analyzing resource utilization with the Job Monitor.

•Experience in creating database objects using PL/SQL, such as tables, indexes, database triggers, cursors and views.

•Valuable hands-on experience with Oracle 11g, MySQL, DB2, SQL Server and UNIX commands.

•Good knowledge of data warehousing basics, relational database management systems and dimensional modeling (star schema, snowflake schema).

•Implemented Type 1 and Type 2 Slowly Changing Dimensions while building data warehouses (a SQL sketch of the Type 2 pattern follows this summary).

•Worked on PL/SQL stored procedures to cleanse and insert data into the data warehouse tables, firing triggers upon load completion to log every table loaded along with the job name and load date (a PL/SQL sketch follows this summary).

•Scheduled and monitored jobs using the Automic scheduler and Tivoli.

•Worked on various tools/software including TOAD, Teradata, PuTTY, WinSCP, FileZilla, JIRA and Oracle.

•Experience in Attunity Replicate, configuring endpoints and executing simple data replication jobs.

•Experience with the Alteryx data blending tool, transforming data by building workflows.

•Experience with Teradata utilities like MultiLoad, FastLoad and FastExport (a FastLoad sketch follows this summary).

•Some exposure to Dell Boomi, designing basic processes from Salesforce to Redshift and configuring Dell Boomi Atom Clouds.

•Communicated independently with business partners to resolve issues and negotiate project-related items such as priorities and completion deadlines.

•Experience applying Agile practices to solution delivery.

•Ability to learn new tools and technologies quickly and apply the knowledge correctly.
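
The Type 2 pattern mentioned above boils down to two statements: expire the current dimension row when a tracked attribute changes, then insert the new version (Type 1 is simply an in-place overwrite). Below is a minimal Oracle SQL sketch; all table, column and sequence names (dim_customer, stg_customer, dim_customer_seq) are hypothetical.

    -- Step 1: close out the current version when a tracked attribute changed.
    UPDATE dim_customer d
       SET d.effective_end_dt = SYSDATE,
           d.current_flag     = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a fresh current version for new or changed customers.
    INSERT INTO dim_customer
           (customer_sk, customer_id, address, segment,
            effective_start_dt, effective_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM dim_customer d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');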
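
For the load-and-log bullet, a minimal PL/SQL sketch of the pattern follows; here the audit insert is done at the end of the procedure itself rather than by a trigger, and every object name (etl_load_log, load_dim_product, stg_product, dim_product) is hypothetical.

    -- Hypothetical audit table capturing job name, table loaded and load date.
    CREATE TABLE etl_load_log (
      job_name   VARCHAR2(100),
      table_name VARCHAR2(100),
      row_count  NUMBER,
      load_date  DATE DEFAULT SYSDATE
    );

    CREATE OR REPLACE PROCEDURE load_dim_product (p_job_name IN VARCHAR2) AS
      v_rows NUMBER;
    BEGIN
      -- Cleanse staged rows and insert them into the warehouse table.
      INSERT INTO dim_product (product_id, product_name)
      SELECT product_id, TRIM(product_name)
        FROM stg_product
       WHERE product_id IS NOT NULL;
      v_rows := SQL%ROWCOUNT;

      -- Record which table was loaded, by which job, and when.
      INSERT INTO etl_load_log (job_name, table_name, row_count)
      VALUES (p_job_name, 'DIM_PRODUCT', v_rows);

      COMMIT;
    END load_dim_product;
    /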
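
For the Teradata utilities bullet, this is the general shape of a FastLoad control script; FastLoad requires an empty target table, and the TDPID, credentials, file path and table names below are placeholders.

    /* Minimal FastLoad sketch: load a pipe-delimited file into an empty
       staging table, with rejects routed to the two error tables. */
    SESSIONS 4;
    ERRLIMIT 25;
    LOGON tdpid/etl_user,password;

    SET RECORD VARTEXT "|";
    DEFINE order_id   (VARCHAR(10)),
           order_date (VARCHAR(10)),
           amount     (VARCHAR(12))
    FILE = /data/orders.dat;

    BEGIN LOADING stage_db.stg_orders
      ERRORFILES stage_db.stg_orders_e1, stage_db.stg_orders_e2;

    INSERT INTO stage_db.stg_orders
    VALUES (:order_id, :order_date, :amount);

    END LOADING;
    LOGOFF;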

Education Details:

•B.Tech in Computer Science Engineering from JNTU, Kakinada, India

Skill Summary:

ETL Tools

DataStage 11.7, 9.1, 8.7, MDM, Informatica 9.0, QualityStage, SSIS, Dell Boomi

Programming Languages

SQL, PL/SQL, DB2, MySQL, TOAD

Application Tools

Oracle SQL Developer, PuTTY, WinSCP, FileZilla, JIRA

Databases

Oracle 11g, 10g, Teradata, Redshift

Scheduling Tools

Automic, Tivoli

Testing Tools

HP ALM Quality Center

Operating Systems

Windows XP, Windows 7, UNIX, Red Hat Linux

Data blending Tools

Attunity Replicate, Alteryx

Project Experience:

Order Express, Columbus OH Aug 19 to present

Role : ETL Developer

Project Description:

The objective of the project is to extract the Shopping List data from the .com CMDS database and create an MQ message. MB will read the file and call the Commerce Order Service to create the Shopping List in the Commerce database under a generic user ID. The project details the ETL (Extraction, Transformation and Load) strategy for every target object to be loaded. Data from source systems is extracted based on business requirements, transformed per business rules and loaded into target systems using IBM InfoSphere DataStage.

Project Responsibilities:

•Interacted with Business Analyst to understand the requirement.

•Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle and Teradata databases.

•Used Dell Boomi AtomSphere for direct table mappings from Salesforce to the AWS Redshift database.

•Implemented data mapping from source to target in view of the physical data models.

•Used the DataStage Designer to develop processes for extracting, transforming, integrating and loading data into the data warehouse.

•Used Alteryx for data blending to transform the data into required format.

Environment: IBM InfoSphere DataStage 11.7, Dell Boomi, Alteryx, UNIX, Oracle 11g, SQL Developer, Teradata, FileZilla, HP ALM Quality Center, JIRA, WinSCP

FPA TV Hardware, New York NY Mar 18 to Feb 19

Role: ETL Developer

Project Description:

The objective of the project is to consolidate Bell TV hardware data into a Master Data Management (MDM) product catalogue, supporting a centralized data input process and on-demand automated data export from the centralized product catalogue into the downstream Bell TV systems whenever new TV hardware or TV hardware types are introduced or existing TV hardware data is changed.

Project Responsibilities:

•Interacted with Business Analysts to understand the requirements.

•Worked extensively on Informatica tools: Designer, Workflow Monitor and Workflow Manager.

•Developed various mappings using transformations like Joiner, Router, Source Qualifier, Lookup, Expression and Aggregator.

•Debugged and validated the mappings and was involved in code reviews.

•Extensively worked on correcting and reloading rejected files.

•Documented test cases and performed validations based on design specifications for unit testing.

Environment: Oracle 11g, UNIX, Windows XP, Informatica Designer 9.x, WinSCP, PuTTY, HP QC 9.0/11.0, SQL Developer, Teradata.

IFRS, BELLTV, CGI, India Aug 16 to Feb 18

Role: ETL Developer

Project Description:

The main objective of the Bell IFRS Revenue project is to plan and execute an orderly transition to the new IFRS 15: Revenue from Contracts with Customers standard under Bell's elected transition approach. Bell will apply the standard beginning on January 1, 2019. In addition, Bell needs to start the data capture needed for dual reporting early in 2017 and implement the dual (parallel) accounting and reporting process by January 1, 2018. Finally, targeting the July 2017 timeframe, the financial data will need to be accounted for under IFRS 15 for financial planning purposes (targeting one year of comparative revenue data when the planning process launches in the summer of 2018).

Project Responsibilities:

•Implemented data mapping from source to target in view of the physical data models.

•Used the DataStage Designer to develop processes for extracting, transforming, integrating and loading data into the data warehouse.

•Developed various Server and Parallel jobs using Oracle, FTP, Peek, Aggregator, Filter, Funnel, Copy, Hash File, Change Capture, Merge, Lookup, Join and Sort stages.

•Extensively worked with Sequential File, Data Set, File Set and Lookup File Set stages.

•Created job sequences.

•Performed unit testing and tuned for better performance with updates on data warehouse tables, using DataStage Director for job monitoring and troubleshooting.

•Tested all the developed jobs and sequences and integrated them effectively.

Environment: IBM InfoSphere DataStage 9.1, UNIX, Oracle 10g, TOAD, FileZilla, HP ALM Quality Center, JIRA, WinSCP

BELL TV BI 4K Phases, CGI, India Jan 16 to July 16

Role: ETL Developer

Project Description:

4K is an ultra-high-definition television technology that will revolutionize the customer experience. For Bell Canada to maintain its dominance in the IPTV industry, it should invest in provisioning 4K content on Fibe TV to Canadian customers and be the first in the country to deliver it. For the BRS40 4K Phase 3 project, BI will transfer Speed USOC information from WebCare and from a manual mass migration file to OneVu, so that Speed Change requests can be processed.

Project Responsibilities:

•Involved as ETL Developer during the analysis, planning, design, development, and implementation stages.

•Used DataStage as an ETL tool to extract data from source systems and load it into the Oracle database.

•Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.

•Created DataStage jobs using various stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Surrogate Key, Column Generator, Row Generator, etc.

•Extensively worked with Join, Look up (Normal and Sparse) and Merge stages.

•Maintained Data Warehouse by loading dimensions and facts as part of project.

•Created a shell script to run DataStage jobs from UNIX, and scheduled this script through a scheduling tool (see the sketch after this list).

•Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.
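
A minimal sketch of the wrapper script described above, using the standard dsjob command-line interface; the project and job names are hypothetical. With -jobstatus, dsjob waits for the job and its exit code reflects the job's finishing status, which lets the scheduler decide success or failure.

    #!/bin/sh
    # Hypothetical project and job names; this script is invoked by the scheduler.
    PROJECT=DW_PROJECT
    JOB=LOAD_DIM_CUSTOMER

    # Run the job; -jobstatus waits and returns the job status as the exit code.
    dsjob -run -jobstatus "$PROJECT" "$JOB"
    rc=$?

    # dsjob returns 1 for "finished OK" and 2 for "finished with warnings".
    if [ "$rc" -eq 1 ] || [ "$rc" -eq 2 ]; then
      echo "DataStage job $JOB completed (status $rc)"
      exit 0
    else
      echo "DataStage job $JOB failed (status $rc)" >&2
      exit 1
    fi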

Environment: Oracle 11g, UNIX, Windows XP, DataStage Designer, WinSCP, PuTTY, HP QC 9.0/11.0, SQL Developer.

OTTO, BELL TV, CGI, India Mar 14 to Dec 15

Role: ETL Developer

Project Description:

Bell introduced a new TV product, OTTO, which enables customers to consume live Bell TV streaming online on a mobile device (i.e. a cell phone or tablet running Apple iOS or Android).

Project Responsibilities:

•Understood the business functionality and analyzed business requirements through mapping sheets.

•Used DataStage as an ETL tool for developing the data warehouse. Created test cases and developed the Traceability Matrix and test coverage reports.

•Used the Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into data warehouse database.

•Developed all the necessary documents for QA testing, such as test strategies and a test plan covering all test items, test cases, user documents, etc.

•Ran the jobs in DataStage Designer and checked that job runs were successful.

•Tracked defects using the HP ALM tool and generated defect summary reports.

Environment: Windows XP, IBM DataStage v8.5, DataStage Designer, DataStage Director, MS SQL Server, DB2, Oracle 9i, JIRA, MS Visio, FileZilla, UNIX.


