
Data Aws

Location:
United States
Salary:
120000
Posted:
December 24, 2020


Resume:

PROFESSIONAL SUMMARY

Consultant with **+ years of experience in the Snowflake database and AWS cloud computing services, along with ETL tools (Ab Initio and Informatica PowerCenter/MDM).

In-depth understanding of core concepts of public, private, and hybrid clouds (AWS and GCP).

Experience working with Technical Support, Database Administration, Product Development, and Professional Services groups to drive the most optimal solution architecture within each client implementation.

Good understanding of storage systems, the Linux and UNIX kernels, the UNIX file system, and Windows infrastructure.

Hands-on working knowledge of Ab Initio, Informatica PowerCenter, and Teradata and Snowflake utilities.

Expert knowledge in implementing Snowflake features such as cloning, data sharing, stages, RBAC controls, and warehouse resizing.
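
For illustration, a minimal Python sketch of two of these operations, zero-copy cloning and warehouse resizing, using the snowflake-connector-python package; the account, credentials, and object names are placeholders, not drawn from any client engagement:

    import snowflake.connector

    # Placeholder connection details; real values would come from a vault.
    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
    )
    cur = conn.cursor()

    # Zero-copy clone: duplicates metadata only; no data is physically copied.
    cur.execute("CREATE DATABASE dev_db CLONE prod_db")

    # Resize the virtual warehouse up for a heavy load window, then back down.
    cur.execute("ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE'")
    cur.execute("ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'XSMALL'")

    cur.close()
    conn.close()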

Expert knowledge in Snowflake data modeling, ELT using SnowSQL, and data migration from on-premises warehouses to the Snowflake warehouse.

Expert knowledge of integrating Snowflake with Hive, and of loading and unloading Snowflake data using ETL tools (Ab Initio and Informatica) and external sources such as an AWS data lake.
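
For illustration, a minimal loading and unloading sketch over an S3 external stage, again via the Python connector; the stage, storage integration, bucket, and table names are hypothetical:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        database="analytics_db", schema="public", warehouse="etl_wh",
    )
    cur = conn.cursor()

    # External stage pointing at an S3 location (hypothetical bucket and
    # storage integration).
    cur.execute("""
        CREATE OR REPLACE STAGE my_s3_stage
        URL = 's3://my-bucket/landing/'
        STORAGE_INTEGRATION = my_s3_integration
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

    # Load: copy staged files into a Snowflake table.
    cur.execute("COPY INTO sales FROM @my_s3_stage/sales/")

    # Unload: export query results from Snowflake back to S3.
    cur.execute("COPY INTO @my_s3_stage/export/ FROM (SELECT * FROM sales)")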

Expert knowledge in Snowflake utilities such as Snowpipe, workflows, and streams for continuous data loading.
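
For illustration, a minimal continuous-loading sketch pairing Snowpipe with a stream; the pipe, stream, table, and column names are hypothetical, and the stage is assumed to be defined as in the previous sketch:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        database="analytics_db", schema="public", warehouse="etl_wh",
    )
    cur = conn.cursor()

    # Snowpipe: auto-ingest new files from the external stage as they land.
    cur.execute("""
        CREATE PIPE IF NOT EXISTS sales_pipe AUTO_INGEST = TRUE AS
        COPY INTO raw_sales FROM @my_s3_stage/sales/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)

    # Stream: track inserts/updates/deletes on the raw table for CDC.
    cur.execute("CREATE STREAM IF NOT EXISTS sales_cdc ON TABLE raw_sales")

    # Reading the stream inside a DML statement advances its offset, so each
    # change set is consumed exactly once.
    cur.execute("""
        INSERT INTO curated_sales
        SELECT id, amount, sold_at
        FROM sales_cdc
        WHERE METADATA$ACTION = 'INSERT'
    """)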

Good understanding of and working knowledge in AWS cloud computing services (data lake, CRC, rehydration, EC2, SNS, SQS, Athena, Glue, Lambda, and ELK).

Strong hands-on experience in Python.

Working knowledge of Hadoop batch integration with HDFS, Hive, and Sqoop.

Sound understanding of Spark and Scala.

Good working knowledge of Informatica MDM.

Good understanding of the Automic and Tivoli schedulers.

Experience working in the BFSI domain.

Technical Expertise

Databases: Oracle 10g, Teradata, Snowflake, SQL

ETL Tools & Utilities: Ab Initio, Informatica PowerCenter 9 & 10, Informatica MDM, Informatica Cloud

Operating Systems: Linux, UNIX

Schedulers: Tivoli, Automic

Hadoop Technologies: HDFS, Pig, Hive, Sqoop, Oozie, Scala

Cloud Computing: Amazon Web Services (S3, EC2, SNS, SQS, Data Pipeline, Athena, EMR)

Scripting Languages: Shell, Python

Functional Experience

Managing requirements gathering, system analysis, and finalization of technical/functional specifications.

Defining best practices for project support and documentation.

Designing, developing, testing, troubleshooting, and debugging applications across different technologies.

Conducting smooth implementation and testing of applications at client locations.

Providing post-implementation application maintenance, enhancement, and warranty support to clients.

Cooperating and communicating with other team members for efficient project delivery.

Professional Experience:

Client: PNC Bank

Role: Snowflake and ETL Senior Developer, September 2018 to Current

Description: PNC Bank is migrating its on-premises infrastructure, including applications, to Amazon Web Services while adopting new cloud technologies.

Migration of Teradata and DB2 databases to Snowflake on AWS.

Loading data to AWS S3 buckets.
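
For illustration, a minimal boto3 sketch of this landing step; the file path, bucket, and key are placeholders, and credentials are resolved through the standard AWS credential chain:

    import boto3

    s3 = boto3.client("s3")

    # Land a local extract in the S3 landing zone (placeholder names).
    s3.upload_file(
        Filename="/data/extracts/sales_20201224.csv",
        Bucket="my-landing-bucket",
        Key="landing/sales/sales_20201224.csv",
    )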

Loading data from AWS S3 buckets into the Snowflake database using Snowpipe.

Continuous data loading into Snowflake using Snowpipe, capturing CDC using streams.

Ab Initio code changes to read and write data to and from S3 buckets.

Ab Initio and Informatica code migration to Spark and Scala.

Informatica code migration to Ab Initio.

Technology Used:

Ab Initio, Informatica PowerCenter

Teradata, Oracle, Snowflake

Linux, Python

AWS cloud services

Automic Scheduler

Spark, Scala

Roles & Responsibilities:

Converting HLD into LLD and providing solutions.

Snowflake data modeling and implementation of star and snowflake schemas.
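
For illustration, a minimal star-schema sketch with hypothetical table and column names, one fact table keyed to two dimensions:

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        database="analytics_db", schema="public", warehouse="etl_wh",
    )
    cur = conn.cursor()

    cur.execute("CREATE TABLE dim_customer (customer_key INT, name STRING, segment STRING)")
    cur.execute("CREATE TABLE dim_date (date_key INT, calendar_date DATE)")
    cur.execute("""
        CREATE TABLE fact_sales (
            customer_key INT,          -- joins to dim_customer
            date_key     INT,          -- joins to dim_date
            amount       NUMBER(12,2)  -- additive measure
        )
    """)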

Migrating data from the existing Teradata warehouse to the Snowflake warehouse.

Managing the Snowflake data warehouse using services and utilities such as Snowpipe, streams, cloning, data sharing, and warehouse resizing.

Responsible for configuring various cloud services (IAM users and roles, policies, S3 buckets, Athena, Glue, EMR, and ELK).

Configuring the file gateway using S3, SNS, SQS, Lambda, and CRC.

Investigating performance issues with Ab Initio in the cloud.

Creation and configuration of S3 buckets and directories.

Rehydration of EC2 instances.

Data Lake management.

Lift-and-shift of Ab Initio ETL processes onto AWS EC2.

Responsible for end-to-end code migration.

Interacting with the business to understand requirements.

Client: Options Clearing Corporation

Role: Senior Developer, July 2015 to August 2018

Description: The Options Clearing Corporation is migrating various options applications from on-premises infrastructure to the AWS cloud via lift and shift, with a technology migration from Informatica to Ab Initio and from Teradata and Oracle to the Snowflake database.

Applications Delivered

MDA Citi (Remortgage, Purchase, Further Advance)

MDA alerts application

MDA Letters

MDA Data Lake in Hadoop

KYC (Know Your Customer)

SRP Enhancement

Informatica to Ab Initio migration

Technologies Used:

Informatica, Ab Initio

Snowflake

AWS cloud services

Oracle, SQL Server, Teradata, and Snowflake utilities

Tivoli scheduler

Shell and Python scripting

Roles & Responsibilities:

Data model design and implementation in Snowflake.

Coordinating with business users and collecting requirements.

Code migration from Informatica to Ab Initio.

Loading data from Oracle and Teradata into the AWS data lake and Snowflake data warehouse using Ab Initio and Python.

Providing ETL designs and technical solutions.

Snowflake cost and effort estimation.

Responsible for code review.

Reviewing schedules and creating jobs on Tivoli.

Helping the team with test case preparation.

Impact analysis for enhancements.

Preparing project documents.

Responsible for deliverables.

Client: Wells Fargo

Role: Senior ETL Developer, January 2013 – June 2015

Description: Wells Fargo is decommissioning its EDW system, Prime, for Germany and Spain and implementing the new VisionPLUS card system, integrated with six other applications, in its place. Prime contains around 13 years of data, which is being migrated from Oracle to Teradata. The project involves the activities below:

History migration (13 years of legacy data from Oracle to Teradata)

AnP (acquire and publish of VisionPLUS files after file validation)

CDC (Change data capture)

Stage Load

Data extraction (third party)

Data transfer over the network (HTTP)

Mart Loading

Roles & Responsibilities:

Worked as a module lead.

Performing impact analysis for any code change.

Reviewing code developed by peers.

Creating Informatica workflow schedules and jobs in Tivoli.

Assigning work tasks among team members.

Converting the HLD (high-level design) into mapping documents for code development.

Responsible for creating the LLD documents.

Knowledge transfer to the support team.

Providing warranty support to the client on developed modules.

Coordinating with other team members on SIT, OAT, VPT, and PROD code implementation.

Technology Used:

Ab Initio, Teradata, Oracle, Linux, MQ, Tivoli Scheduler

Client: Provident Bank

Role: Developer, March 2010 to December 2012

PROJECTS HANDLED

LAS, DC monitoring system, HRMS training tracking system, fraud alert system, payment collection, GPPS; projects delivered in PL/SQL and Java/J2EE technology

Team Size: 20

Responsibility: Application development, requirements gathering, single point of contact, responsible for module delivery, BCP

Technology: Oracle PL/SQL, Informatica 8.6, J2EE

Anil Vakkalanka (US Citizen)

Email: adiw98@r.postjobfree.com / Ph: 773-***-****

https://www.linkedin.com/in/anil-vakkalanka-55a3341a2/


