
Data Management Software Development

Location:
Saint Cloud, FL
Posted:
November 29, 2024


Resume:

Eliza Ozbay

248-***-**** ************@*****.***

Synopsis:

Highly motivated, solutions-driven professional with over 9 years of data warehousing experience in ETL design and development and data management, with the ability to remain focused and self-assured in fast-paced, high-pressure environments. Involved in the complete software development life cycle (SDLC) of various projects, including requirements gathering, system design, data modeling, ETL design and development, production enhancements, and support and maintenance. Creative data management analyst dedicated to providing innovative and effective data solutions and assessments; proficient at analyzing data for business and market trends, validating documents and records, and performing extraction, transformation, and loading. Specializes in developing system tools to optimize data assessment and management. Excellent interpersonal and communication skills.

•Expertise in IBM DataStage (versions 8.5, 8.7, 9.1, 11.3, 11.5, and 11.7) to perform ETL and ELT operations on data.

•Create and maintain SSIS ETL processes that extract telematics data from CSV, Excel, or text files, then transform and validate the data so it loads successfully into a SQL Server database.

•Experience using the Snowflake cloud; utilize the Snowflake environment to load and transform data.

•Worked on DataStage tools such as DataStage Designer, DataStage Director, and DataStage Administrator.

•Provide software development with Visual Studio, Microsoft SQL Server, and SQL Server Management Studio.

•Extract and load data into the Snowflake cloud through SnowSQL.

•Create views and stored procedures in Snowflake to automate ETL processes (see the SQL sketch at the end of this summary).

•Experience in UNIX shell scripting for processing large volumes of data from varied sources and loading into Teradata

• Implemented complex business logic within SSIS packages using control flow, data flow tasks, scripts, and custom components.

•Configured SSIS packages to handle errors, perform logging, and ensure data integrity and completeness.

•Optimized SSIS performance through careful analysis and tuning of packages and SQL queries, significantly reducing processing times and resource utilization.

•Collaborated with data analysts and IT teams to identify business requirements, translate them into technical specifications, and ensure accurate data availability.

•Created dynamic, reusable SSIS packages by parameterizing connections, variables, and SQL tasks.

•Experience with Teradata tools and utilities (FastLoad, MultiLoad, and FastExport).

•Design and develop ETL processes based on industry standards and enhance loading of data from unfamiliar source systems located in the cloud or on premises.

•Strong understanding of the principles of Data Warehousing using fact tables, dimension tables and star/snowflake schema modeling.

•Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.

•Able to work with snowflake-schema data warehouses, designing for further expansion and normalization of dimension tables using ETL.

•Developed parallel jobs using different processing stages like Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, Hierarchy, CDC, Change Apply and Filter, FTP Enterprise.

•Used Enterprise Edition/Parallel stages like Datasets, Change Data Capture, Row Generator and many other stages in accomplishing the ETL Coding

•Familiar with highly scalable parallel processing infrastructure using parallel jobs and multiple-node configuration files.

•Able to pull data from multiple sources, including XML, web services, delimited flat files, and other data sources.

•Used the WebSphere MQ stage to read messages from queues, and worked with IBM WebSphere MQ Explorer to test queue depth, the dead-letter queue, and message counts.

•Experience with PVCS as a migration tool

•Worked with Control-M and Skybot as scheduler tools for scheduling DataStage batch jobs.

•Worked with both an InfoSphere Information Services Director input stage and a service output stage

•Worked with XML files for reading the claims payload files from web services.

•Used Web Services-Transformer, Web Services-Client Stages to make Web Service Calls.

•Imported WSDL file definitions using Import Web Service File Definitions utility in DataStage Designer.

•Captured all the Rejected Data from Web Service Calls and shared those with customers, which they used to debug the data issues.

•Received XML files from an upstream PIM system, parsed them, generated flat files and XML outputs, and sent them downstream.

•Used the WebSphere MQ stage to read messages from and write messages to queues.

•Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.

•Experience in troubleshooting jobs and addressing production issues such as data issues, environment issues, performance tuning, and enhancements.

•Knowledge of Erwin as a leading data modeling tool for logical (LDM) and physical (PDM) data models.

•Extensive experience in design and development of Decision Support Systems (DSS).

•Assisted in development efforts for Data marts and Reporting.

•Experience in all aspects of analytics and data warehousing solutions (database issues, data modeling, data mapping, ETL development, metadata management, data migration, and reporting solutions).

•Technical and analytical skills with a clear understanding of the design goals of ER modeling for OLTP and dimensional modeling for OLAP.

•Extensive experience in Unit Testing, Functional Testing, System Testing, Integration Testing, Regression Testing, User Acceptance Testing (UAT) and Performance Testing.

•Worked with various databases, including Oracle 19c, 12c, 11g, 10g, 9i, and 8i, Netezza, DB2, Teradata 14, and SQL Server.

•Experienced in various ETL and related tools: DBT, Rivery, SSIS, DataStage, Azure DevOps, and mainframe.
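
The Snowflake work listed above (SnowSQL loads, views, and stored procedures used to automate ETL) can be illustrated with a minimal sketch. All stage, table, and column names below are hypothetical, not drawn from any actual project:

    -- Load a delimited extract from a named stage into a staging table
    -- (runnable via SnowSQL; @stg.incoming and stg.telematics_raw are assumed names).
    COPY INTO stg.telematics_raw
      FROM @stg.incoming/telematics/
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
      ON_ERROR = 'ABORT_STATEMENT';

    -- A reporting view over the staged data.
    CREATE OR REPLACE VIEW rpt.v_telematics_daily AS
    SELECT device_id,
           DATE_TRUNC('day', reading_ts) AS reading_day,
           AVG(speed_mph)                AS avg_speed_mph,
           COUNT(*)                      AS reading_cnt
    FROM stg.telematics_raw
    GROUP BY device_id, DATE_TRUNC('day', reading_ts);

    -- A stored procedure wrapping the load so it can be scheduled.
    CREATE OR REPLACE PROCEDURE etl.load_telematics()
    RETURNS VARCHAR
    LANGUAGE SQL
    AS
    $$
    BEGIN
      -- Re-running the COPY is safe: Snowflake skips files it has already loaded.
      COPY INTO stg.telematics_raw
        FROM @stg.incoming/telematics/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
      RETURN 'load complete';
    END;
    $$;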

Work Experience

Application Production Support Bank of America, TX March 2022 – Present

Responsibilities:

•Manage incidents and effectively communicate with users, application owners and senior stakeholders across all areas.

•Involved in weekly production support: supporting all production applications; monitoring mainframe systems, Splunk alerts, and batch job failures; and managing scheduled batch jobs (Control-M/AutoSys) and Azure DevOps.

•Assist in defining software development project plans, including scoping, scheduling, providing time estimates for programming tasks and implementation plans and schedules.

•Identify alerts and processes that can be automated, then work with the Engineering team to automate them.

•Assist with ongoing implementation of the data warehouse; create data marts, star schema data models and analytics datasets as suitable for the solution

•Identify defects, discrepancies, and trends by way of code debugging or log analysis.

•Track availability and batch SLAs, taking proactive measures to avoid the risk of SLA breaches.

•Challenge existing application setup and processing, and suggest different ways to solve problems or improve stability.

•Strong expertise in Incident Management, Problem Management, Event Management, Change Management, and DR activities on the mainframe platform.

•Clearly communicate the problem and resolution process to management.

•Contribute to improving Business Intelligence/Data Warehouse team efficiency through implementation of IT methodologies, adherence to best practices, extensive documentation, code and solution reviews, mentoring, and cross-training.

•Performed root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

DataStage Developer/Data Analytics Engineer Rotech, FL Sept 2019 – Feb 2022

Responsibilities:

Apply data warehouse modeling methodologies to design and develop appropriate warehouse data staging, integration and reporting structures that would provide for scalable and efficient reporting and analytics solutions

Assist in defining software development project plans, including scoping, scheduling, providing time estimates for programming tasks and implementation plans and schedules

Assist with ongoing implementation of the data warehouse; create data marts, star schema data models and analytics datasets as suitable for the solution

Actively participate in the change management process with a view to managing risk in the production environment

Monitor daily production SSIS ETL runs and ensure they complete on time.

Migrating SSIS/DataStage ETL to Rivery and then to the Snowflake cloud

Create and design SSIS packages per business requirements to pull data from different sources.

Demonstrate an ability to implement data governance and quality policies, change data capture, API-based data sourcing, dimensional design and slowly changing dimensions

Demonstrate an ability to interpret advanced SQL, develop and implement SQL, stored procedures and functions, as necessary for the solution

Design, develop, test, implement, and document ETL solutions to incrementally load the data warehouse and other enterprise applications and systems (see the SQL sketch at the end of this role)

Develop technical documentation as needed to describe the environments, components and procedures relating to the ETL process

Follow established coding standards, perform code reviews, and improve standards to assure compliance

Help build positive relationships between business users and IT staff and between various teams in the IT function

Help produce "run books" to ensure operators can monitor and address any issues that arise in production

Maintain a current awareness of technological developments and trends as it relates to databases, development methodologies and tools, especially in the BI, data warehousing and ETL space

Possess strong analytical skills required to troubleshoot issues, identify resolutions and be available for off business hours implementation and support

Support implemented BI solutions by monitoring performance, tuning data loads and queries, addressing user questions concerning data consistency or integrity, performing root cause analysis of issues, and communicating functional and technical issues to management

Work closely and collaborate with various cross functional IT specialists and business subject matter experts in implementing solutions and validating results with business representatives

Work closely with DBA support, opening and working PMRs with IBM Support, or cases with other vendors, when necessary

Environment: DBT, SSIS, Visual Studio 2017, SQL Server 2019, IBM WebSphere DataStage 11.7/11.5, IBM AIX 5.2, Excel, Oracle 19c/12c/11g, AutoSys, XML files, JSON, Snowflake cloud, Rivery, MS SQL Server, Informix, DB2 BLU, sequential flat files, PuTTY, WinSCP, UNIX
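
As one illustration of the incremental warehouse loads described in this role, the following is a minimal T-SQL sketch of a high-water-mark upsert; the control table, dimension, and column names are assumed for illustration and do not come from the actual systems:

    -- Find the timestamp the last successful run loaded through
    -- (etl.load_control is an assumed bookkeeping table).
    DECLARE @last_load DATETIME2 =
        (SELECT MAX(loaded_through)
         FROM etl.load_control
         WHERE table_name = 'dim_patient');

    -- Upsert only source rows changed since the last run.
    MERGE dw.dim_patient AS tgt
    USING (
        SELECT patient_id, full_name, city, updated_at
        FROM src.patient
        WHERE updated_at > @last_load
    ) AS src
    ON tgt.patient_id = src.patient_id
    WHEN MATCHED THEN
        UPDATE SET tgt.full_name = src.full_name,
                   tgt.city      = src.city
    WHEN NOT MATCHED THEN
        INSERT (patient_id, full_name, city)
        VALUES (src.patient_id, src.full_name, src.city);

    -- Advance the high-water mark for the next run.
    UPDATE etl.load_control
    SET loaded_through = SYSUTCDATETIME()
    WHERE table_name = 'dim_patient';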

Analyst – Data Warehousing & Data Integration Services Post Holdings - Lakeville, MN Feb 2019 – Sep 2019

Responsibilities:

Assist in the design and maintenance of the corporate data warehouse and subsequent data marts

Completed various data management tasks to ensure record accuracy and workflow efficiency, including loading, transformation and extraction.

Developed ETL jobs per business requirements. Used Remedy to create CRs (change requests) and INs (incidents)

Worked with Skybot as a scheduling tool.

Profound knowledge of the architecture of the Teradata database and experience with Teradata unloading utilities such as FastExport.

Developed FastLoad jobs to load data from various data sources and legacy systems into Teradata staging

Involved in developing DataStage jobs using web services

Used Web Services-Transformer, Web Services-Client Stages to make Web Service Calls.

Maintain and support the data warehouse.

Coordinated with application development to ensure maximum efficiency and functionality.

Gathered reporting and analysis requirements and translated them into data models, including aggregate tables, pivoted tables, and relational and dimensional (star-schema) marts.

Worked with Business analysts and the DBAs for requirements gathering, analysis, and testing to implement fixes for identified data and customer issues, metrics and project coordination.

Monitored and evaluated daily data and customer processes to meet SLAs for the Customer Master Data Management environment.

Involved in Azure and Agile methodology on projects.

Debugged, tested, and fixed the transformation logic applied in ETL parallel jobs; extensively used DataStage Director to monitor job logs and resolve issues.

Developed jobs with unstructured data

Developed parallel jobs using different processing stages like Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply and Filter, FTP Enterprise.

Experienced in using SQL *Loader and import utility in TOAD to populate tables in the data warehouse.

Used Designer and Director to schedule and monitor jobs and collect performance statistics.

Extensively worked with database objects including tables, views, indexes, schemas, PL/SQL packages, stored procedures, functions, and triggers.

Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files (see the SQL sketch at the end of this role)

Troubleshoot and resolve issues in ETL job logic as well as performance.

Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing. Prepared test data for testing, error handling and analysis.

Environment: Clubhouse, IBM WebSphere DataStage 11.7/8.5, IBM AIX 5.2, Oracle 11g, XML files, PVCS, Boomi, Skybot, DB2, Teradata 14, sequential flat files, PuTTY, WinSCP, UNIX
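
The Type II slowly-changing-dimension work mentioned in this role follows the standard expire-and-insert pattern. Below is a minimal sketch in generic SQL (date arithmetic syntax varies by database); all table and column names are assumed for illustration:

    -- Step 1: close out the current version of any customer whose
    -- tracked (Type II) attributes changed in the staging feed.
    UPDATE dw.dim_customer
    SET end_date   = CURRENT_DATE - 1,
        is_current = 'N'
    WHERE is_current = 'Y'
      AND EXISTS (SELECT 1
                  FROM stg.customer s
                  WHERE s.customer_id = dw.dim_customer.customer_id
                    AND (s.city    <> dw.dim_customer.city
                      OR s.segment <> dw.dim_customer.segment));

    -- Step 2: insert a fresh open-ended version for every customer
    -- that now lacks a current row (new customers and changed ones).
    INSERT INTO dw.dim_customer
        (customer_id, customer_name, city, segment, start_date, end_date, is_current)
    SELECT s.customer_id, s.customer_name, s.city, s.segment,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg.customer s
    WHERE NOT EXISTS (SELECT 1
                      FROM dw.dim_customer d
                      WHERE d.customer_id = s.customer_id
                        AND d.is_current = 'Y');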

Technical Specialist Vistana, FL July 2018 – Jan 2019

Responsibilities:

Providing production support with Technology Operations teams to ensure operation and maintenance of the Marketing Data platform

Completed various data management tasks to ensure record accuracy and workflow efficiency, including loading, transformation and extraction.

Worked on migrating DataStage jobs from version 11.3 to 11.5. Primary tasks were gathering migration requirements, testing jobs in the 11.5 QA environment, and production implementation and support.

Monitored the status and security of data systems, established direct access files and other configurations

Coordinated with application development to ensure maximum efficiency and functionality

Worked with Business analysts and the DBAs for requirements gathering, analysis, and testing to implement fixes for identified data and customer issues, metrics and project coordination.

Experienced in PX file stages that include Complex Flat File stage, Datasets, Lookup File Stage, Sequential file stage.

Monitored and evaluated daily data and customer processes to meet SLAs for the Customer Master Data Management environment.

Involved in creating UNIX shell scripts for database connectivity and executing queries in parallel job execution.

Involved in development of Job Sequencing using the Sequencer.

Worked on Robot scheduler tool for scheduling batch jobs.

Experience with customer data processes such as standardization, merge/match, and data profiling.

Used Designer and Director to schedule and monitor jobs and collect performance statistics.

Extensively worked with database objects including tables, views, indexes, schemas, PL/SQL packages, stored procedures, functions, and triggers.

Creating local and shared containers to facilitate ease of use and reuse of jobs.

Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files

Executed pre- and post-session commands on source and target databases using shell scripting.

Worked with Developers to troubleshoot and resolve issues in job logic as well as performance.

Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing. Prepared test data for testing, error handling and analysis.

Environment: IBM WebSphere DataStage 11.3–11.5, IBM AIX 5.2, Oracle 11g, XML files, IBM DB2 10.1, MS SQL Server, sequential flat files

Sr. Programmer Analyst Universal Studios - Orlando, FL Jul 2016 to May 2018

Responsibilities:

Worked with the Business analysts and the DBAs for requirements gathering, analysis, testing, and metrics and project coordination.

Hands on experience developing ETL jobs using Talend.

Experienced in PX file stages that include Complex Flat File stage, Datasets stage, Lookup File Stage, Sequential file stage.

Supporting and maintaining SSIS packages in the production (PRD) environment

Extracted data from SQL Server databases, transformed it using Talend, and loaded it to targets. Involved in creating and maintaining sequencer and batch jobs.

Created ETL job flow designs.

Involved in creating UNIX shell scripts for database connectivity and executing queries in parallel job execution.

Created various standard/reusable jobs in DataStage using various active and passive stages like Sort, Lookup, Filter, Join, Transformer, aggregator, Change Capture Data, Sequential file, Datasets.

Involved in development of Job Sequencing using the Sequencer.

Used the Remove Duplicates stage to remove duplicate records from the data (see the SQL sketch at the end of this role).

Used Designer and Director to schedule and monitor jobs and to collect performance statistics.

Extensively worked with database objects including tables, views, indexes, schemas, PL/SQL packages, stored procedures, functions, and triggers.

Creating local and shared containers to facilitate ease and reuse of jobs.

Environment: IBM WebSphere DataStage 11.3, IBM AIX 5.2, Talend 6.3, SSIS, Visual Studio 2007, SQL Server 2019, Oracle 12c, XML files, MS SQL Server, sequential flat files
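
The Remove Duplicates stage noted in this role has a plain-SQL equivalent that is often useful for validating results. A minimal T-SQL sketch follows; the table and key columns are assumed for illustration:

    -- Keep the newest row per business key and delete the rest.
    WITH ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY guest_id, visit_date  -- assumed business key
                   ORDER BY updated_at DESC           -- newest record wins
               ) AS rn
        FROM stg.park_visits
    )
    DELETE FROM ranked
    WHERE rn > 1;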

ETL Datastage Developer Pyramid Systems - Fairfax, VA Nov 2015 to June 2016

Responsibilities:

Involved in understanding of business processes and coordinated with business analysts to get specific user requirements.

Helped in preparing the mapping document for source to target.

Extensively used DataStage Tools like Infosphere DataStage Designer, Infosphere DataStage Director for developing jobs and to view log files for execution errors.

Involved in design and development of DataStage batch jobs for loading data into Huntington's Customer Information System (CIS).

Experienced in developing parallel jobs using various Development/debug stages (Peek stage, Head & Tail Stage, Row generator stage, Column generator stage, Sample Stage) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, FTP, Remove Duplicate Stage)

Used DataStage as an ETL tool to extract data from source systems and load it into SQL Server, DB2, and Oracle databases.

Developed job sequencer with proper job dependencies, job control stages, triggers.

Used DataStage Director to schedule and run jobs, test and debug their components, and monitor performance statistics.

Controlled jobs execution using sequencer, used notification activity to send email alerts.

Imported table/file definitions into the Datastage repository.

Participated in Datastage Design and Code reviews.

Migrated InfoSphere DataStage from version 9.1 to version 11.3.

Worked in waterfall and Agile/Scrum environments with Jira, Jazz, and Azure

Involved in creating UNIX shell scripts for database connectivity and executing queries in parallel job execution

Worked on programs for scheduling data loading and transformations from DB2 to Oracle using DataStage, SQL*Loader, and PL/SQL.

Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.

Involved in performance tuning of the ETL process and performed the data warehouse testing.

Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing, prepared test data for testing, error handling and analysis.

Prepared documentation including requirement specification.

Participated in weekly status meetings.

Environment: IBM InfoSphere DataStage 9.1–11.3 (Parallel & Server), Oracle 10g, DB2, SQL Server 2008, PL/SQL, flat files, XML files, Zen Scheduler.

Sr. ETL DataStage Developer Kelly Services, MI June 2012 to Oct 2015

Responsibilities:

Analyzed, designed, developed, implemented, and maintained parallel jobs using IBM InfoSphere DataStage.

Involved in design of dimensional data models: star schema and snowflake schema (see the SQL sketch at the end of this role).

Generated DB scripts from the data modeling tool and created physical tables in the database.

Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files

Created some routines (Before-After, Transform function) used across the project.

Experienced in PX file stages that include Complex Flat File stage, Dataset stage, Lookup File Stage, Sequential file stage.

Implemented Shared container for multiple jobs and Local containers for same job as per requirements.

Adept knowledge and experience in mapping source-to-target data using IBM DataStage 8.x

Implemented multi-node declaration using configuration files (APT_Config_file) for performance enhancement.

Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets, Join, Lookup, Change Capture, Funnel, FTP, Peek, Row Generator stages in accomplishing the ETL Coding.

Debugged, tested, and fixed the transformation logic applied in parallel jobs; extensively used DataStage Director to monitor job logs and resolve issues.

Experienced in using SQL *Loader and import utility in TOAD to populate tables in the data warehouse.

Involved in performance tuning and optimization of DataStage mappings using features like Pipeline and Partition Parallelism to manage very large volume of data

Deployed different partitioning methods like Hash by column, Round Robin, Entire, Modulus, and Range for bulk data loading and for performance boost.

Repartitioned job flows based on the best available DataStage PX resource consumption.

Created universes and reports in BusinessObjects Designer.

Created, implemented, modified, and maintained simple to complex business reports using the BusinessObjects reporting module.

Environment: IBM InfoSphere DataStage 8.5, Oracle 11g, flat files, UNIX, Erwin, TOAD, MS SQL Server, mainframe COBOL, XML files, MS Access.
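
The star-schema design work in this role can be summarized with a minimal DDL sketch; the fact and dimension tables below are illustrative only, not from any actual model:

    -- Two dimension tables keyed by surrogate keys.
    CREATE TABLE dw.dim_date (
        date_key     INTEGER      PRIMARY KEY,  -- e.g. 20150630
        calendar_dt  DATE         NOT NULL,
        month_name   VARCHAR(9)   NOT NULL,
        year_num     SMALLINT     NOT NULL
    );

    CREATE TABLE dw.dim_product (
        product_key  INTEGER      PRIMARY KEY,  -- surrogate key
        product_code VARCHAR(20)  NOT NULL,     -- natural/business key
        product_name VARCHAR(100) NOT NULL,
        category     VARCHAR(50)
    );

    -- The fact table references each dimension, forming the star.
    CREATE TABLE dw.fact_sales (
        date_key     INTEGER       NOT NULL REFERENCES dw.dim_date (date_key),
        product_key  INTEGER       NOT NULL REFERENCES dw.dim_product (product_key),
        qty_sold     INTEGER       NOT NULL,
        sale_amount  DECIMAL(12,2) NOT NULL
    );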

Education

Bachelor's degree in Computer Science, completed 2008

Mzuzu University (Mzuni)

Skills

Databases, DataStage, ETL (Extract, Transform, and Load), Oracle, SSIS, DB2, Snowflake cloud, SQL Server, DBT, Azure DevOps, AutoSys, mainframe


