
Data Warehouse Cloud

Location: Charlotte, NC
Posted: April 15, 2024

PROFESSIONAL SUMMARY

**+ years of professional Data Warehousing experience across the complete Software Development Life Cycle (SDLC) of various projects, including requirements gathering, system design, data modeling, ETL design and development, documentation, implementation, unit testing, deployment, post-deployment maintenance, and production enhancements.

Extensive ETL tool experience with IBM InfoSphere/WebSphere DataStage and Ascential DataStage.

3+ years of experience with Snowflake Cloud Data Warehouse, AWS S3, DMS, and RDS, including data analysis and implementation of data warehouse solutions both on-premises and in the cloud.

Created stages and tasks for scheduling jobs, and worked with virtual warehouses and cloning.

Good exposure to data loading, transformations, metadata configuration, Snowpipe configuration and troubleshooting, and AWS-Snowflake integration.

Good experience with data sharing, zero-copy cloning, Time Travel, and security implementation/data masking at both the row and column level in Snowflake.
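
For illustration, a minimal SnowSQL sketch of the kind of column- and row-level protection described above; the table, column, role, and policy names here are hypothetical:

    -- Column-level security: mask SSNs for every role except a privileged one
    CREATE OR REPLACE MASKING POLICY mask_ssn AS (val STRING) RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END;
    ALTER TABLE customers MODIFY COLUMN ssn SET MASKING POLICY mask_ssn;

    -- Row-level security: non-privileged roles see only US rows
    CREATE OR REPLACE ROW ACCESS POLICY us_rows_only AS (region STRING) RETURNS BOOLEAN ->
      CURRENT_ROLE() = 'PII_ADMIN' OR region = 'US';
    ALTER TABLE customers ADD ROW ACCESS POLICY us_rows_only ON (region);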

Hands-on experience loading and unloading data in Snowflake Cloud Data Warehouse, using the COPY command for bulk loads and Snowpipe for continuous loads.

Used the COPY, INSERT, PUT, and GET commands to load data into Snowflake tables from internal and external stages.
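
A sketch of those commands against a hypothetical internal stage (stage, table, and file names are illustrative; note that PUT gzip-compresses uploads by default):

    -- Create an internal stage and upload a local file to it
    CREATE OR REPLACE STAGE my_int_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
    PUT file:///tmp/orders.csv @my_int_stage;

    -- Bulk-load the staged (compressed) file into the target table
    COPY INTO orders FROM @my_int_stage/orders.csv.gz;

    -- Unload in the other direction, then download the result
    COPY INTO @my_int_stage/export/ FROM orders;
    GET @my_int_stage/export/ file:///tmp/export/;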

Handled large and complex datasets, including JSON, XML, and CSV files, from various sources.

Built Python frameworks to load data from upstream systems such as SQL Server, Oracle, Teradata, PostgreSQL, MySQL, and AWS S3 into the Snowflake data warehouse.

Created task flows to automate data loading and unloading to and from Snowflake Cloud Data Warehouse and AWS S3.

Strong understanding of data warehousing principles, including fact tables, dimension tables, and star/snowflake schema modeling.
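
As a simple illustration of the star-schema idea, a fact table referencing dimension tables through surrogate keys (hypothetical retail names; Snowflake treats primary/foreign key constraints as informational only):

    CREATE TABLE dim_customer (
      customer_sk INTEGER AUTOINCREMENT PRIMARY KEY,  -- surrogate key
      customer_id STRING,                             -- natural/business key
      customer_name STRING
    );

    CREATE TABLE dim_date (
      date_sk INTEGER PRIMARY KEY,
      calendar_date DATE
    );

    CREATE TABLE fact_sales (
      customer_sk INTEGER REFERENCES dim_customer (customer_sk),
      date_sk INTEGER REFERENCES dim_date (date_sk),
      quantity NUMBER(10,0),
      amount NUMBER(12,2)
    );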

Worked extensively with dimensional modeling, data migration, data cleansing, and ETL processes for data warehouses.

Developed parallel jobs using different processing stages like Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply and Filter.

Familiar with highly scalable parallel processing using parallel jobs and multi-node configuration files.

Experienced in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts, and scheduling tools such as AutoSys, Control-M, and cron.

Implemented job sequences to orchestrate the end-to-end execution of DataStage job designs.

Worked extensively on change requests, pull requests, and Bitbucket workflows for new customer requirements.

Performed code migrations across Development, QA, and Production environments using GitHub and CI/CD pipelines (Jenkins).

Extensive experience in Unit Testing, Functional Testing, System Testing, Integration Testing, Regression Testing, User Acceptance Testing (UAT) and Performance Testing.

Good command of project management duties such as team handling, RCAs, delay communications, bug reports, and shift rosters.

Executed test cases, logged defects, verified defect fixes, and captured test and defect metrics.

Experienced in troubleshooting jobs and addressing production issues (data issues, environment issues, performance tuning, and enhancements) across Development, QA, UAT, Pre-Prod, and Production support.

Hands-on experience estimating test effort for user stories.

Hands-on experience creating test scenarios and test cases.

SKILLS SUMMARY

ETL Tools: IBM InfoSphere DataStage v11.7, Informatica, StreamSets

Cloud Technologies: Snowflake Cloud Data Warehouse, AWS S3, RDS, DMS, CloudWatch

Databases: Snowflake, Oracle, Toad, Hive, PostgreSQL

Programming Languages: SnowSQL, Python

Domain Knowledge: Insurance, Finance, Retail

Scheduling Tools: Control-M, AutoSys, cron

Version Control Tools: GitHub, GitLab

Scripting: Shell, Python

Other Tools: WinSCP, Bitbucket, Jira, ServiceNow, Jenkins, ElectricFlow

Operating Systems: Windows 8/10, Linux

PROFESSIONAL EXPERIENCE

Working as Sr ETL/Snowflake Developer at Infotek (Client: TIAA), Charlotte, NC.

Worked as Technical Lead at TCS (Client: Nordea Bank), India.

Worked as Associate Test Architect at UST Global (Client: Dell Technologies), India.

Worked as Sr. System Analyst at UST Global (Client: NetApp), India.

Worked as ETL Developer at Artech Infosystems Pvt Ltd (Client: NetApp), India.

EDUCATIONAL QUALIFICATION

MCA (Master of Computer Applications), JNTU Hyderabad, 2006

PROJECTS

Client: TIAA

Domain: Finance

Role: Sr ETL DataStage Developer and Cloud Engineer

Duration: Jan 2023 to present

Project Description –

The Teachers Insurance and Annuity Association of America-College Retirement Equities Fund (TIAA, formerly TIAA-CREF) is an American financial services organization and a private provider of financial retirement services in the academic, research, medical, cultural, and governmental fields. TIAA is listed on the Fortune 100, serves over 5 million active and retired employees participating at more than 15,000 institutions, and has $1 trillion in combined assets under management, with holdings in more than 50 countries.

Responsibilities:

Analyzed, designed, developed, implemented, and maintained parallel jobs using IBM InfoSphere DataStage.

Involved in the design of dimensional data models: star schema and snowflake schema.

Implemented Type 1 and Type 2 SCD (Slowly Changing Dimension) tables in DataStage and Snowflake.
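
A compact sketch of the Type 2 pattern in Snowflake SQL, assuming hypothetical dim_customer/stg_customer tables with is_current and end_date tracking columns: expire the current row when a tracked attribute changes, then insert the new version.

    -- Step 1: close out current rows whose tracked attribute changed
    MERGE INTO dim_customer d
    USING stg_customer s
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHEN MATCHED AND d.address <> s.address THEN
      UPDATE SET is_current = FALSE, end_date = CURRENT_DATE();

    -- Step 2: insert a fresh current row for new and changed keys
    INSERT INTO dim_customer (customer_id, address, start_date, end_date, is_current)
    SELECT s.customer_id, s.address, CURRENT_DATE(), NULL, TRUE
    FROM stg_customer s
    LEFT JOIN dim_customer d
      ON d.customer_id = s.customer_id AND d.is_current = TRUE
    WHERE d.customer_id IS NULL;  -- no current version exists (new key or just expired)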

Experienced with PX file stages, including Complex Flat File, Data Set, Lookup File Set, and Sequential File.

Implemented multi-node declarations using configuration files (APT_CONFIG_FILE) for performance enhancement.

Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.

Deployed different partitioning methods, such as Hash by column, Round Robin, Entire, Modulus, and Range, for bulk data loading and performance gains.

Extensively used DataStage for extracting, transforming and loading data from multiple sources like Oracle, Snowflake, Hive and Flat files.

Collaborated with the EDW team on low-level design documents, mapping files from source to target and implementing business logic.

Extensively worked on job sequences to control job-flow execution using various activities and triggers (conditional and unconditional), such as Job Activity, Wait For File, Email Notification, Sequencer, Exception Handler, and Execute Command.

Involved in design and development of Snowflake Database components.

Prepared data-load configurations and built scripts to perform new data loads.

Implemented a framework in Snowflake to process daily loads using Snowpipe and tasks.
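
Illustrative SnowSQL for that pattern (stage, pipe, task, table, and warehouse names are hypothetical): Snowpipe ingests files as they land, and a scheduled task applies the daily transformation.

    -- Continuous ingestion; raw_orders is assumed to hold a single VARIANT column v
    CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_orders FROM @s3_stage/orders/ FILE_FORMAT = (TYPE = JSON);

    -- Daily transformation at 02:00 UTC; tasks are created suspended
    CREATE OR REPLACE TASK load_daily_orders
      WAREHOUSE = etl_wh
      SCHEDULE = 'USING CRON 0 2 * * * UTC'
    AS
      INSERT INTO orders_curated
      SELECT v:order_id::STRING, v:amount::NUMBER(12,2) FROM raw_orders;

    ALTER TASK load_daily_orders RESUME;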

Implemented the AWS DMS service to stream data from MySQL to AWS S3.

Created multiple database and account objects in Snowflake, such as roles, tables, file formats, stages, data loads, and views.

Cloned testing environments from production to perform system testing.
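
A minimal sketch of that zero-copy cloning step, with a Time Travel variant (database and table names are hypothetical):

    -- Clone production into a test database without duplicating storage
    CREATE DATABASE test_db CLONE prod_db;

    -- Time Travel: clone a table as it existed one hour ago
    CREATE TABLE orders_rewind CLONE orders AT (OFFSET => -3600);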

Environment: SQL, IBM InfoSphere DataStage 11.7, Snowflake, GitLab, and AutoSys

Client: Nordea Bank

Domain: Banking

Role: Technical Lead

Duration: Jul 2021 to Dec 2022

Project Description –

Nordea is a leading Nordic bank with a history spanning over 200 years, serving more than 9.3 million private customers and 530,000 active corporate customers in the Nordics, Poland, and Estonia. The bank operates in four business areas: Personal Banking, Business Banking, Large Corporates & Institutions, and Asset & Wealth Management. Nordea has strong market positions in all Nordic countries and is the seventh largest Nordic company, ranking among the 10 largest European financial groups. The bank is committed to playing an active role in the financial sector and leading the transition to a sustainable future.

Roles & Responsibilities –

Gathered requirements from business users and product owners and translated them into technical requirements; once approved by the SME, developed the jobs in DataStage using different stages.

Analyzed incidents and jobs and provided solutions to the business.

Based on the issue and its criticality, logged the estimated effort and obtained business approval before starting work.

Responsible for code migration from Development to Testing and from Testing to Production environments.

Attended the daily scrum call and provided updates on ongoing development work.

Raised the required CRQs and incidents for code migrations and quick changes, based on the requirement.

Created or updated the business justification, risk assessment, back-out plan, install plan, and test-and-verification plan before raising a CRQ for code migration.

Involved in performance tuning activities and preparing deployment documents (HLD and LLD documents).

Strictly followed change control methodologies while promoting code from DEV to QA, Pre-Prod, and Production.

Supported QA and UAT by preparing the environments and providing execution instructions.

Involved in design and development of Snowflake Database components.

Prepared data-load configurations and built scripts to perform new data loads.

Handled both streaming and batch processing jobs.

Implemented a framework in Snowflake to process daily loads using Snowpipe and tasks.

Created data shares to sync production data to the testing environment.
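
A minimal sketch of such a share, assuming hypothetical object and account names:

    -- Expose selected production objects to a consumer account via secure data sharing
    CREATE SHARE prod_share;
    GRANT USAGE ON DATABASE prod_db TO SHARE prod_share;
    GRANT USAGE ON SCHEMA prod_db.public TO SHARE prod_share;
    GRANT SELECT ON TABLE prod_db.public.orders TO SHARE prod_share;
    ALTER SHARE prod_share ADD ACCOUNTS = test_account;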

Cloned testing environments from production to perform system testing.

Environment: IBM InfoSphere DataStage 11.7 and Snowflake

Client: Dell Technologies

Domain: Retail

Role: Associate Test Architect

Duration: Aug 2019 to Jul 2021

Project Description –

The PINE technology solution will replace the current solution, managed by an external vendor, which runs on decade-old IBM infrastructure going end of life in June 2019. In this project, data will be collected from various sources and pooled into Greenplum as part of the data ingestion layer. Once data is loaded into the CDI and pre-CDI layers, it will be pulled from those tables to load the post-CDI tables. Certain business rules will be applied to this data, such as identifying the winning perspective of an entity, identifying marketable contacts (direct mail, email, and phone), and deriving the respective job title for a contact. Various other dimension tables (account, residence, contact, site, phone, email, address) will be populated as part of the consumption layer. Finally, reports will be generated from these dimension tables for the campaigns conducted by the business.

Roles & Responsibilities –

Analyzed the existing information sources and methods, understood customer expectations, and identified problems.

Prepared development road maps based on the requirements and had them approved by the SME.

Once approved by the SME, designed and developed DataStage jobs in the DataStage environment.

Involved in performance tuning activities.

Strictly followed change control methodologies while promoting code from DEV to QA, Pre-Prod, and Production.

Supported QA and UAT by preparing the environments and providing execution instructions.

Took ownership of problem tickets and incidents raised by customers.

Attended the daily scrum call and provided updates on work in progress.

Prepared deployment documents (HLD and mapping documents).

DWH Tools: IBM InfoSphere DataStage 11.5 & Informatica

Client: NetApp

Domain: Retail

Role: Sr System Analyst / Assoc. Project Manager

DWH Tools: IBM InfoSphere DataStage 9.1 & Oracle

Duration: Oct 2013 to Aug 2019

Project Description -

NetApp, Inc., formerly Network Appliance, Inc., is an American multinational computer storage and data management company headquartered in Sunnyvale, California. NetApp provides optimum storage solutions for the rapidly growing data storage needs of its clients, enabling storage consolidation and reliability, simplifying data center operations, and providing disaster recovery features and distributed enterprise linkages. Network Appliance has partnered with UST to provide solutions for its Enterprise DWH and Sales & Finance Reporting systems. The BI team at Network Appliance, which is responsible for the maintenance, enhancement, and development of its DWH and reporting infrastructure, works closely with UST to achieve this.

Roles & Responsibilities

Responsible for the successful daily production run, reporting any issue immediately through client meetings and calls.

Provided the latest production code, shell scripts, and surrogate keys to developers.

Responsible for month-end, quarter-end, and year-end activities.

Executed the implementation plans shared by the development team in the production environment and resolved any issues.

Routed all production deployments through the pre-production environment and performed end-to-end dry runs to foresee any impact on the production run-time SLA.

Performed impact analysis before production deployments based on observations from the pre-production environment, and gave advance notice of major code migrations or process changes.

Provided solutions for production tickets, and created and assigned tickets to the corresponding teams.

Gave knowledge transfer (KT) sessions, mentored new members of the operations team, and took up client calls and meetings.

Followed up on production issues with the concerned SMEs and sent weekly updates to senior management.

Prepared the run manual and kept it updated to track changes.

Participated in business workshops and calls to understand the business requirements.

Coordinated with business and technical users to analyze business requirements during the development phase.

Took key ownership of all quality deliverables, including test effort estimations, the master test plan, test strategy, test scenarios, test cases, test scripts, and test metrics.

Participated in reviews of requirements, design documents, and data models, and prepared/updated the test estimation and staffing plan.

Played a key role in developing the SQL queries used in test case scripts.

Designed and executed test cases and analyzed test results.

Logged defects, tracked them to closure, and performed confirmation and regression testing.

Provided daily reports on the team's testing status to the Test Manager.

Coordinated with the development team throughout the life cycle for defect triage and clarification.

Prepared weekly status and test summary reports and participated in test completion review meetings.

Performed test closure activities and implemented best practices to improve productivity.

Client: NetApp

Role: ETL Developer

Duration: Jan 2010 to Sept 2013

Project Description –

NetApp, Inc., formerly Network Appliance, Inc., is an American multinational computer storage and data management company headquartered in Sunnyvale, California. NetApp provides optimum storage solutions for the rapidly growing data storage needs of its clients, enabling storage consolidation and reliability, simplifying data center operations, and providing disaster recovery features and distributed enterprise linkages. Network Appliance has partnered with UST to provide solutions for its Enterprise DWH and Sales & Finance Reporting systems. The BI team at Network Appliance, which is responsible for the maintenance, enhancement, and development of its DWH and reporting infrastructure, works closely with UST to achieve this.

Roles & Responsibilities

Analyzed the specification documents.

Extensively involved in preparation of low-level design documents (LLD).

Worked in DataStage Designer to extract data from source transactional databases, transform it, and load it into the target database.

Developed DataStage jobs for dimension tables, fact tables, and sequence jobs.

Designed jobs using different parallel job stages, such as Sequential File, Data Set, Aggregator, Sort, Remove Duplicates, Filter, Switch, Lookup, Join, Funnel, Surrogate Key Generator, and Copy.

Used DataStage Director to schedule, run and monitor the jobs that perform source to target data loads.

Extensively used various sequence activities, such as Job Activity, Routine Activity, Wait For File, Sequencer, Terminator, and Exception Handler.

Involved in Unit Testing and Peer Review of the job designs.


