
Project Management Data Warehousing

Location:
Monmouth Junction, NJ
Posted:
December 19, 2023

Siva Kumar Kandivalasa

New Jersey, USA • 908-***-****

ad122i@r.postjobfree.com • linkedin.com/in/siva-kandivalasa-80143777

SUMMARY

Highly proficient, versatile, and resolution-focused Data Architect/Data Warehouse Engineer/Cloud Architect offering expertise in data warehousing architecture, data engineering, data modeling, and project management, seeking a challenging role in which to apply these skills to building scalable and reliable data warehouses.

17+ Years of Professional IT experience in Data warehousing, ETL, Data modeling, Data Analysis, Business Intelligence(BI), Cloud, Database, and Project management.

Seasoned expertise in Data lake, Data warehouse and Lakehouse architectures

Proficient in Snowflake, Amazon Web Services (AWS), IICS, and ICRT, with extensive experience in designing, building, and optimizing data warehousing solutions using cloud-based technologies

Expertise in Snowflake – data modeling, ELT using Snowflake SQL, implementing complex stored procedures and standard DWH and ETL concepts

Strong development and support abilities for the successful execution of projects using industry-leading BI tools and technologies.

Designed and Implemented data warehousing projects in Banking, Insurance, Life Sciences, Retail, and Health Care domains at various client locations across the Globe.

Deep understanding of relational as well as NoSQL data stores, methods, and approaches (star and snowflake, dimensional modeling)

Experienced in integrating with external systems such as Snowflake, AWS, SAP (BW/HANA/ABAP/IDoc/BAPI), structured (RDBMS) and semi-structured (XML) data, Salesforce, and Web Services using PowerExchange

3x AWS Certified, with exposure to various AWS services including IAM, S3, EC2, EMR, Auto Scaling, Elastic Load Balancing (ELB), VPC, CloudFormation, CloudFront, DynamoDB, RedShift, Lambda, and CI/CD.

Experience using serverless services including S3, Lambda, DynamoDB, API Gateway, Step Functions, and the messaging services SQS and SNS

Aspiring Data Scientist with knowledge of Python Scripting and libraries pandas, numpy, and matplotlib

Knowledge of Hadoop technologies including HDFS, MapReduce, Spark, and Hive

MANAGEMENT SKILLS

Successfully managed complex projects with global implementation, and rapidly evolving requirements.

Collaborated and communicated effectively with cross-functional teams including business users, data analysts, and other stakeholders to comprehend their requirements and provide technical solutions.

Extensive experience in driving projects under challenging circumstances, including tight deadlines, high-pressure situations, and ambiguous conditions

Experience in project planning as well as project estimations and resource allocation

Adept in providing analytical support to key business applications/solutions

Acted as liaison between business users and the development team in translating requirements, resolving issues, and handling escalations

Possess hands-on experience operating in the Onshore-Offshore model; skillfully led teams ranging from 5 to 10 members and delivered projects ranging from 1 to 3 million

TECHNICAL SKILLS

Snowflake & AWS: Snowflake UI, SnowPipe, SnowSQL, S3, EC2, RDS, RedShift, Lambda, CloudFront, Step Functions, CloudFormation, ECS, EMR

BigData & CI/CD Tools: Databricks, HDFS, MapReduce, Spark, Hive, GitHub, BitBucket

ETL/BI Tools: Informatica PC, dbt, SSIS, IICS, ICRT, PowerExchange, IDQ, Tableau

Databases: Snowflake, Teradata 12.0/14, Netezza, SAP HANA, Oracle, DB2, MS SQL Server (T-SQL), MongoDB

Data Modeling Tools: Erwin 4.2/4.0, MS Visio

Programming Languages: SQL, Python Scripting, Pandas, UNIX Scripting, PL/SQL, and Core Java

Other Tools: Jira, SAP GUI, HP QC, Aqua Studio, Force.com, Teradata SQL Assistant, WinSQL, Toad, SQL Developer, Putty, PVCS, Aginity Workbench for Netezza, SAP HANA Studio, PAC2000 and ServiceNow

Scheduling Tools: IBM Tivoli (TWS), Tidal, Stone Branch and Autosys

EDUCATION & CERTIFICATIONS

B.Tech (Electronics & Comm), J.N.T.U, Hyderabad, India; 2000-2004

AWS 3x Certified (Certified Developer, Solution Architect-Associate and Cloud Practitioner)

SnowPro Core Certified, Snowflake

Informatica Power Center Certified Developer, Informatica Corporation.

Oracle Database 11g: SQL, Oracle Corporation

Teradata 12 Certified Professional, Teradata Corporation

Informatica Cloud 101 and 201

Certification Record: https://www.credly.com/users/siva-kumar-kandivalasa/badges

WORK EXPERIENCE

Client: Tapestry, Inc. Oct '21 to present

Role: Snowflake Architect/Data Architect

Project Name: S4 HANA to Snowflake Migration / E2EF Performance Management Program

Summary:

• Interacted with Business teams to understand their requirements, gathered all data elements, and defined timelines

• Designed and created data models and database objects in Snowflake and HANA S4 for new development and enhancements

• Integrated Snowflake with Informatica and dbt; created projects and models, custom SQLs, and cloned objects, and developed ETL pipelines
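
For illustration, a minimal dbt model sketch of the kind mentioned above: a staging view over a raw extract landed in Snowflake. The model, source, and column names here are hypothetical, not taken from the project.

-- models/staging/stg_sales_orders.sql  (hypothetical model and source names)
-- Minimal dbt model: a staging view over a raw S4/HANA extract landed in Snowflake.
{{ config(materialized='view') }}

select
    order_id,
    customer_id,
    order_date,
    net_amount
from {{ source('s4_raw', 'sales_orders') }}   -- source declared in a hypothetical sources.yml
where order_date >= dateadd(year, -2, current_date)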

• Created a data ingestion process in Snowflake to copy multiple files (bulk loading) and load them into Snowflake stage tables.

• Created Snowflake stages (internal/external), file formats, sequences, and pipes to copy data and build ETL pipelines

• Built ETL pipelines to process S3 files into Snowflake based on SQS notifications and automated the loads
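
A minimal Snowflake SQL sketch of such an auto-ingest pipeline. The object names, file format options, bucket path, and storage integration are hypothetical assumptions for illustration only.

-- Hypothetical file format and external stage pointing at the S3 landing path.
CREATE OR REPLACE FILE FORMAT etl.ff_pipe_csv
  TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1 NULL_IF = ('', 'NULL');

CREATE OR REPLACE STAGE etl.stg_s4_orders
  URL = 's3://example-bucket/s4/orders/'                 -- hypothetical bucket/path
  STORAGE_INTEGRATION = s3_int                           -- assumes an existing storage integration
  FILE_FORMAT = (FORMAT_NAME = 'etl.ff_pipe_csv');

-- Bulk load files already landed in S3 into a stage table.
COPY INTO etl.stage_orders
  FROM @etl.stg_s4_orders
  PATTERN = '.*orders_.*[.]csv';

-- Pipe with AUTO_INGEST = TRUE: Snowflake consumes the S3 event notifications (via SQS)
-- and loads new files as they arrive.
CREATE OR REPLACE PIPE etl.pipe_orders AUTO_INGEST = TRUE AS
  COPY INTO etl.stage_orders
  FROM @etl.stg_s4_orders
  FILE_FORMAT = (FORMAT_NAME = 'etl.ff_pipe_csv');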

• Demonstrated expertise in data modeling, database design, and schema management for different types of databases, such as relational, NoSQL, and columnar databases, ensuring efficient data storage and retrieval in alignment with the project requirements.

• Reduced load time by 40% by optimizing Snowflake queries through pruning, restructuring database tables, and minimizing data spillage to local and remote disks

• Used Snowflake utilities, SnowSQL, and SnowPipe as needed

• Implemented Snowflake query and search optimization, time travel, and zero-copy cloning; optimized virtual warehouses; created resource monitors and defined credit thresholds; and defined roles (RBAC) and managed users
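
Illustrative Snowflake SQL for the administration features named above. Warehouse, monitor, role, database, and table names are hypothetical.

-- Resource monitor with a credit quota, attached to a virtual warehouse.
CREATE OR REPLACE RESOURCE MONITOR rm_etl WITH CREDIT_QUOTA = 100
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE wh_etl SET RESOURCE_MONITOR = rm_etl;

-- Role-based access control: grant usage and read access to a reporting role.
CREATE ROLE IF NOT EXISTS reporting_role;
GRANT USAGE ON DATABASE edw TO ROLE reporting_role;
GRANT USAGE ON SCHEMA edw.core TO ROLE reporting_role;
GRANT SELECT ON ALL TABLES IN SCHEMA edw.core TO ROLE reporting_role;

-- Zero-copy clone and a time-travel query (data as of one hour ago).
CREATE TABLE edw.core.orders_clone CLONE edw.core.orders;
SELECT COUNT(*) FROM edw.core.orders AT (OFFSET => -3600);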

• Implemented a robust data validation framework, employing automated checks and validations, to maintain data integrity and quality throughout the project lifecycle.

• Performed rigorous data validation to ensure the accuracy and integrity of the migrated data. Compared data between the SAP HANA S/4 system and Snowflake to identify any discrepancies or inconsistencies.

• Created a data reconciliation framework in ETL to validate weekly/monthly/quarterly/yearly counts and quantities between Snowflake and HANA S4 tables
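
A simplified sketch of this kind of reconciliation query: comparing weekly counts and quantities between a Snowflake target table and the corresponding HANA extract staged in Snowflake. Table and column names are hypothetical.

-- Compare weekly row counts and quantities between the Snowflake target
-- and the staged HANA S4 extract; return only the weeks that disagree.
WITH snow AS (
    SELECT DATE_TRUNC('week', order_date) AS wk, COUNT(*) AS row_cnt, SUM(quantity) AS qty
    FROM edw.core.demand_sales
    GROUP BY 1
),
hana AS (
    SELECT DATE_TRUNC('week', order_date) AS wk, COUNT(*) AS row_cnt, SUM(quantity) AS qty
    FROM stage.hana_demand_sales_extract
    GROUP BY 1
)
SELECT COALESCE(s.wk, h.wk) AS week_start,
       s.row_cnt            AS snowflake_rows,
       h.row_cnt            AS hana_rows,
       s.qty - h.qty        AS qty_variance
FROM snow s
FULL OUTER JOIN hana h ON s.wk = h.wk
WHERE COALESCE(s.row_cnt, 0) <> COALESCE(h.row_cnt, 0)
   OR COALESCE(s.qty, 0)     <> COALESCE(h.qty, 0);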

• Implemented multiple sub-projects including Gift-card Escheatment, OMS Reconciliation for orders, Demand Sales, Traffic, DC Inventory Checks, and Store end of day comments in HANA and deployed to Snowflake

• Created Inbound and Outbound processes and designed end-to-end ETL solutions including MFT jobs creation, file generation, cleansing, loading, auditing, validation, scheduling, and archiving

• Created Unix shell scripts for XML source file validation, count checks, list file generations, and archiving

• Implemented error reporting and audit process for ETL loads

• Generated Tableau reports and dashboards using HANA and Snowflake as data sources

Environment: Snowflake, Informatica, dbt, SnowSQL, SnowPipe, AWS S3, AWS SQS, Tableau, SAP HANA S4, Python Scripting, DataFrame, Unix Shell Scripting, MFT (Axway SecureTransport), Jira & ServiceNow

Client: Wells Fargo Oct '19 to Oct '21

Role: Cloud Architect

Project Name: Capital Markets Platform Management

Summary:

• Involved in the migration of a legacy on-premises application to the AWS cloud

• As a key member of the AWS Solution Architecture team, provided input on designing, building, and modernizing applications, software, and services on the AWS platform.

• Collaborated with cross-functional teams to design and implement cloud-based solutions using AWS. Conducted system analysis and provided recommendations to improve performance and scalability

• Managed S3 buckets, set up policies, and granted least-privilege access to applications and services via IAM roles

• Configured AWS CloudTrail to log IAM events for auditing and compliance

• Developed strategies to build serverless models versus traditional models

Environment: Amazon Web Services (AWS), S3, EC2, IAM, Step Functions, Lambda, API Gateway, SQS, SNS, Glue, Informatica Power Center 10.4.1, and UNIX shell scripting

Client: Tapestry (Coach) Feb '19 to Oct '19

Role: Data Architect/ETL Engineer

Project Name: Tapestry S4 Implementation Phase 3 (Legacy to S4 HANA Data Migration)

Summary: This project migrates the data needed to carry on business in the new SAP system from the legacy systems. Migrated data includes transactional data such as inventory, open sales orders, and open purchase orders, plus the related materials (articles), customers, vendors, sites, etc. The migration is an iterative and interconnected approach in which data is analyzed, extracted, transformed, validated, and loaded into the target system(s), and the data loaded in the target system is reconciled back to the source system.

• Understanding and analyzing functional specification documents.

• Developing ETL jobs for extraction, transformation, pre-load, load and post load processes.

• Extracted data from SAP R/3 system to Netezza stage using Power Exchange SAP ALE Integration.

• The load process loads data into SAP via IDocs or BAPIs

• Generating validation scripts for reconciliation.

• Collaborated with Business and SAP functional teams to implement the conversions.

Environment: Informatica Power Center 10.2, Netezza, Oracle, Putty, WinSCP, SQL Developer, SAP GUI and UNIX shell scripting

Client: Bausch Health Companies Inc., NJ May '18 to Sep '18

Role: Cloud Engineer

Project Name: Marketo Automation Platform Integration

Summary: The purpose of this project is to integrate the Marketo email campaign system and Salesforce (Veeva instances) via an integration layer that serves as the central repository of the master data for HCPs, HCOs, Rx activity, Marketo data, and Salesforce data (6 instances), using the Informatica real-time (ICRT) integration platform. The real-time integrations are triggered from Marketo and then update the Salesforce/Veeva instances, which hold read-only attributes for sales reps' informational purposes on campaigns, brand-related information, and opt-in/opt-out information.

In addition, brand opt-outs and corporate opt-outs updated in the Veeva instances (by reps) are captured in Marketo using an Informatica batch process.

• Created and published processes in the Informatica Cloud Real Time (ICRT) process designer using REST APIs.

• Designed and developed service connectors, connections and processes in ICRT to consume the changes in Marketo system and update in Salesforce.

• Developed Informatica Cloud Mappings, task flows and processes (Data Synchronization, Data replication) in batch mode to capture the changes in Salesforce (veeva) to Marketo system

• Created error reporting for inability to access receiving systems.

Environment: Informatica Cloud Services and Real Time Services (ICS and ICRT)

PREVIOUS WORK EXPERIENCES:

Primary Job Duties:

• Contributed to the development of system requirements and design specifications

• Participated in the design and development of Dimensional modeling

• Interacted with Business users, BI teams and Business analysts to understand the requirements and capture all the business needs

• Translated business requirements into ETL specifications for development activities

• Integrated with SAP R/3 systems using PowerExchange for SAP NetWeaver methods such as ABAP integration, ALE, and BAPI/RFC function calls.

• Created ETL Inbound and Outbound mappings to generate IDOCs using SAP/ALE Integration with Informatica Power Exchange.

Client | Role | Duration | Project Name

Wells Fargo | Data Architect/ETL Engineer | Sep '18 to Feb '19 | AMCT Reporting Streamlining Project

Campbell Soups Company, WHQ | Data Architect/ETL Engineer | Apr '17 to May '18 | Corner Stone Americas

PVH | Senior Data Warehouse Engineer | Sep '16 to Mar '17 | Tommy China

The Bank of Tokyo-Mitsubishi UFG | Senior Data Warehouse Engineer | May '15 to Aug '16 | Core Banking & OFSAA, Leones 42

Campbell Soups Company, WHQ | Senior Data Warehouse Engineer | Aug '14 to May '15 | Financial Reporting

Verizon | Senior Data Warehouse Engineer | Jan '14 to Aug '14 | Migration Factory (MF) – Profile Migration

Merck (Cognizant Tech Solutions) | Solution Architect/Data Architect/Tech Lead | Oct '09 to Jan '14 | Merck Service Rendered for ET

Principal Financial Group (Fujitsu) | ETL Developer | Oct '06 to Oct '09 | Treasury Consulting Project

• Created SAP/ALE IDoc Preparer and Interpreter Transformations to process the segment data from upstream/downstream transformations

• Integrated with the SAP BW system to access the data in open hub destinations (OHD)

• Created various ETL mappings, simple to complex; designed frameworks; and implemented error handling and audit mechanisms

• Participated in creating database scripts using Erwin forward and reverse engineering techniques

• Created table partitions, sub-partitions, indexes, views, materialized views and stored procedures as and when needed
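
For illustration, Oracle-style DDL of the kind this bullet refers to. All table, index, and view names are hypothetical.

-- Range-partitioned fact table with a local index and a summary materialized view.
CREATE TABLE sales_fact (
    sale_id    NUMBER,
    sale_date  DATE,
    store_id   NUMBER,
    amount     NUMBER(12,2)
)
PARTITION BY RANGE (sale_date) (
    PARTITION p2023 VALUES LESS THAN (DATE '2024-01-01'),
    PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);

CREATE INDEX ix_sales_store ON sales_fact (store_id) LOCAL;

CREATE MATERIALIZED VIEW mv_sales_by_store
BUILD IMMEDIATE REFRESH COMPLETE ON DEMAND AS
SELECT store_id, TRUNC(sale_date, 'MM') AS sale_month, SUM(amount) AS total_amount
FROM sales_fact
GROUP BY store_id, TRUNC(sale_date, 'MM');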

• Optimized SQL queries using database tuning techniques and enhanced performance

Environment: Informatica Power Center, Power Exchange, SSIS, SAP NetWeaver, Teradata, Unix Shell Scripting, Sources (SAP BW, SharePoint List, MS SQL Server, Flat files & Oracle), Teradata SQL Assistant, Salesforce, Force.com, Putty, Autosys & Soap UI

Organization: Cognizant Technology Solutions

Client: Merck Co Oct’09 to Jan’14

Role: Solution Architect/ETL Architect/Tech Lead

Project Name: Merck Service Rendered for ET

Summary: Merck Services Rendered for ET is a development and enhancement project. It is an umbrella project in which multiple projects execute in parallel. It provides extract, transform, and load (ETL) services to 26 USHH project teams across 5 business areas. Each project is independent of the others and has strict timelines. The umbrella structure of the Merck Services Rendered for ET project provides centralized control over ETL project management.

Sub Projects:

• Emerging Markets (EMEA) – Turkey, Russia & South Africa

• Activities and X-force Dashboard (EMEA)

• Sales & Activities (EMEA)

• GENESYS

• UNIVADIS

• GHH Divisional Data warehouse (DDW)

• MCC Divisional Data warehouse (DDW)

• Managed requirements and design phase towards identifying mutually agreed solutions with technical teams and business partners.

• Responsible for providing requirements, functional knowledge and design documents to ETL team for development

• Responsible for converting requirements into comprehensive detailed design specifications (integration, databases, data flows, transformations, interfaces, etc.) for solutions

• Responsible for creating the project plan and working with the customer on a regular basis to show progress at each stage

• Responsible for all ETL deliverables in SDLC methodology such as integrated functional specification, source to target mapping sheet, design, coding, unit testing and deployments

• Integrated Informatica with Salesforce Cloud in extracting Customer flags and RFM data.

• Integrated Informatica with Web Services using Power Exchange in extracting data.

• Other major source systems used were Teradata, Oracle, SQL Server, XML, and flat files

• Extensively used the Teradata utilities MULTILOAD, FASTLOAD, TPUMP, TPT, and BTEQ scripts to load data into the data warehouse.
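
A minimal BTEQ-style sketch of this pattern, shown only as an illustration: the TDPID, credentials, database, table, and column names are all hypothetical.

.LOGON tdprod/etl_user,password
.SET WIDTH 200

-- Insert-select from a staged table (loaded earlier via FastLoad/MultiLoad) into the target dimension
INSERT INTO edw.customer_dim (cust_id, cust_name, load_dt)
SELECT cust_id, cust_name, CURRENT_DATE
FROM edw_stage.stg_customers;

.IF ERRORCODE <> 0 THEN .QUIT 8

.LOGOFF
.QUIT 0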

• Created UNIX shell scripts to run Informatica workflows, create dynamic parameter files and indirect files, watch for files, send email notifications, etc.

• Responsible for driving the team to meet the project objectives by proactively taking decisions.

Environment: Informatica Power Center 9.1, Informatica Power Exchange 9.1 for SAP NetWeaver, Teradata, Unix Shell Scripting, Sources (SAP BW, SharePoint List, MS SQL Server, Flat files & Oracle), Teradata SQL Assistant, Salesforce, Force.com, Putty, Autosys & Soap UI

Client: Principal Financial Group Oct '06 to Jan '09

Role: ETL Developer

Project Name: Treasury Consulting Project and GE Securities

Job Duties:

• Designed ETL jobs and scheduled them using third-party schedulers

• Implemented performance tuning techniques to increase the session performance.

• Responsible for Unit testing and creating migration plans.

• Coordinated with the various teams involved in the project, including database and UNIX teams, and with stakeholders

Environment: Informatica Power Center 8.1/7.1, DB2 UDB, Oracle, SQL, PL/SQL & UNIX Shell Scripting.


