
Data Analyst Engineer

Location:
Charlotte, NC
Posted:
June 13, 2023


Vamshi Bandaru

adxox9@r.postjobfree.com

+1-479-***-****

Cloud Data Engineer

An enthusiastic consulting and technical expert with 7+ years of experience in Information Technology, with extensive experience in projects involving data warehousing, data integration, business intelligence, reporting, data migration, business analysis, design, development, and documentation. Expertise in the software design, development, and deployment of large, complex software applications. Has led and participated in teams performing analysis, design, implementation, enhancement, and testing of software applications.

Professional Overview:

Experience in Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification, identifying data mismatches, Data Import, and Data Export using multiple ETL tools such as DataStage and Informatica.

Experience in the design, development, and implementation of enterprise-wide architecture for structured and unstructured data, providing architecture and consulting services to Business Intelligence initiatives and data-driven business applications.

Experience in Dimensional Modeling using Snowflake schema methodologies on Data Warehouse and Integration projects.

Experience with Business Process Modeling, Process Flow Modeling, and Data Flow Modeling.

Experience in creating ETL specification documents, flowcharts, process workflows, and data flow diagrams.

Worked on processes to transfer and migrate data from AWS S3, relational databases, and flat files into common staging tables in various formats, and from there into meaningful data in Snowflake.
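
As an illustration of the S3-to-Snowflake staging pattern described above, here is a minimal sketch in Snowflake SQL; the stage, bucket, and table names are hypothetical placeholders, not objects from the projects listed here.

```sql
-- Minimal sketch: external stage over S3 plus a bulk load into a staging table.
-- All object names and the bucket path are illustrative placeholders.
CREATE OR REPLACE STAGE raw_stage
  URL = 's3://example-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Load staged flat files into a common staging table; skip bad rows.
COPY INTO staging.customer_raw
  FROM @raw_stage/customers/
  ON_ERROR = 'CONTINUE';
```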

Strong Experience in Data Migration from RDBMS to Snowflake cloud data warehouse.

Experience in all phases of the data warehouse development lifecycle, from requirements gathering through design, development, and testing.

Strong development skills with Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.

Implemented Copy activities and custom Azure Data Factory pipeline activities for on-cloud ETL processing.

Experience in Monitoring and Tuning SQL Server Performance.

Experience in Data Requirement Analysis, Design, and Development of ETL processes using IBM DataStage 11.x.

Experience in Designing and Implementing Data Warehouse applications, mainly Transformation processes, using the ETL tool DataStage.

Experience in Data Warehousing applications, responsible for the Extraction, Transformation and Loading (ETL) of data from multiple sources into Data Warehouse.

Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, as well as DML and DDL.
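
A minimal BTEQ sketch of the kind of DDL/DML scripting mentioned above; the logon string, schemas, and table names are placeholders invented for illustration.

```sql
.LOGON tdprod/etl_user,<password>;

-- DDL: create a staging table for the load (illustrative names)
CREATE TABLE stg.orders_ld (
  order_id INTEGER,
  order_dt DATE,
  amount   DECIMAL(12,2)
);

-- DML: move cleansed rows into the warehouse target
INSERT INTO dw.orders
SELECT order_id, order_dt, amount
FROM stg.orders_ld
WHERE amount IS NOT NULL;

-- Abort with a non-zero return code if the insert failed
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
```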

Served as a liaison between business and IT teams, transforming data and reporting requirements into software solutions.

Experience in Agile projects and familiarity with Agile ceremonies.

Developed reusable code components for deployment across ETL projects.

Experience in coding optimized SQL queries on databases such as MySQL and SQL Server.

Expertise in SQL and PL/SQL programming to write Views, Stored Procedures, Functions, and Triggers.
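
As a hedged sketch of that kind of PL/SQL work, the trigger below logs changes to an audit table; the tables and columns are invented for illustration.

```sql
-- Illustrative PL/SQL audit trigger; orders/orders_audit are hypothetical tables.
CREATE OR REPLACE TRIGGER trg_orders_audit
AFTER UPDATE OF amount ON orders
FOR EACH ROW
BEGIN
  INSERT INTO orders_audit (order_id, old_amount, new_amount, changed_at)
  VALUES (:OLD.order_id, :OLD.amount, :NEW.amount, SYSDATE);
END;
/
```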

Experience in using GIT/SVN for software development version control.

Hands-on experience working with Continuous Integration (CI) build-automation tools such as Maven, Jenkins.

Experience using IDEs and tools such as Eclipse, WebStorm, and Microsoft Visual Studio.

Good experience in creating Workflows, Lightning Process Builder, Reports and Force.com sites.

Experience in implementing Data pipelines using Matillion.

Good knowledge of Matillion administration and of designing orchestration and transformation jobs.

Technical Skills:

Languages: Python

Databases: SQL, PL/SQL

Web Technologies: JSON, XML

ETL Tools: IBM DataStage, dbt (data build tool), Informatica

Data Warehouse: Snowflake Cloud

BI and Reporting: Tableau, Power BI

Version Control Tools: GIT, SVN

Operating Systems: Windows, Mac, Unix, Linux

Development Tools (IDEs): Visual Studio, NetBeans, Eclipse

Cloud Technologies: AWS EC2, S3, IAM, Snowflake

Methodologies: Agile, Waterfall

Professional Experience:

Duke Energy, Charlotte, NC Oct 2020 – Present

Snowflake Cloud Data Engineer

Responsibilities:

Responsible for requirements gathering and user meetings, discussing issues to be resolved, and translating user inputs into ETL design documents.

Responsible for documenting the requirements, translating the requirements into system solutions, and developing the implementation plan as per the schedule.

Created ETL mapping document and ETL design templates for the development team.

Created external tables on top of S3 data that can be queried using AWS services such as Athena.
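
A minimal sketch of that pattern, assuming JSON files partitioned by date in S3; the database, table, and bucket names are hypothetical.

```sql
-- Athena DDL: external table over partitioned S3 data (names illustrative)
CREATE EXTERNAL TABLE IF NOT EXISTS analytics.events (
  event_id   string,
  event_type string,
  payload    string
)
PARTITIONED BY (event_date string)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://example-bucket/events/';

-- Register newly arrived partitions before querying
MSCK REPAIR TABLE analytics.events;
```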

Responsible for architecting and implementing very large-scale data intelligence solutions around Snowflake Data Warehouse.

Applied solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse.

Involved in migrating on-premises systems to the AWS cloud.

Processed Location and Segments data from S3 to Snowflake using Tasks, Streams, Pipes, and stored procedures.
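
A sketch of how those pieces typically fit together in Snowflake SQL; the stage, tables, warehouse, and schedule below are illustrative assumptions, not the actual project objects.

```sql
-- Snowpipe: auto-ingest new S3 JSON files into a one-column VARIANT landing table
CREATE OR REPLACE PIPE location_pipe AUTO_INGEST = TRUE AS
  COPY INTO landing.location_raw
  FROM @raw_stage/location/
  FILE_FORMAT = (TYPE = JSON);

-- Stream: capture rows newly landed by the pipe
CREATE OR REPLACE STREAM location_stream ON TABLE landing.location_raw;

-- Task: every 5 minutes, move captured rows into the curated table
CREATE OR REPLACE TASK load_location
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('LOCATION_STREAM')
AS
  INSERT INTO curated.location (raw_json)
  SELECT raw_json FROM location_stream;

ALTER TASK load_location RESUME;
```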

Led a migration project from Teradata to the Snowflake warehouse to meet customer SLAs.

Used analytical functions in Hive to extract the required data from complex datasets.

Extensively handled Equipment and VisionLink data in XML, XSD, and JSON formats and loaded it into a Teradata database.

Built distributed, reliable, and scalable data pipelines to ingest and process data in real time.

Created ETL pipelines using Stream Analytics and Data Factory to ingest data from Event Hubs and Topics into SQL Data Warehouse.

Responsible for the migration of key systems from on-premises hosting to Azure Cloud Services.

SnowSQL writing: SQL queries against Snowflake.

Experience migrating legacy data warehouses and other databases to Snowflake.

Data analysis and profiling to discover key join relationships and data types and to assess data quality using SQL queries. Data cleansing, data manipulation, and exploratory analysis to identify, analyze, and interpret trends and patterns in large data sets.

Data Modeling / Data Architecture: organization and arrangement of data in Staging/Intermediate/Final targets using tables, views, etc.

Used dbt to test the data (schema tests, referential integrity tests, custom tests) and ensure data quality.

Used dbt to debug complex chains of queries by splitting them into multiple models and macros that can be tested separately.
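
As a sketch of the custom-test side of that dbt usage: a singular test is just a SQL file whose returned rows count as failures. The model and column names below are hypothetical; schema tests such as unique and not_null would be declared in the models' YAML instead.

```sql
-- tests/assert_orders_have_valid_customer.sql (illustrative dbt singular test)
-- Any rows returned are reported as test failures, so this enforces
-- referential integrity between the orders and customers models.
SELECT o.order_id
FROM {{ ref('orders') }} o
LEFT JOIN {{ ref('customers') }} c
  ON o.customer_id = c.customer_id
WHERE c.customer_id IS NULL
```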

Implemented Error logging and Notifications in Matillion ETL jobs.

Worked on Migrating objects from DB2 to Snowflake.

Environment: Snowflake Cloud Data Warehouse, AWS S3, AWS IAM, dbt (data build tool), Oracle Database, Python, Tableau, SQL Server, DB2.

Ford, Norfolk, VA April 2018 – Oct 2020

Snowflake Cloud Data Engineer

Responsibilities:

Experience architecting data intelligence solutions around Snowflake Data Warehouse and building Snowflake solutions as a developer.

Built and configured enterprise-level Snowflake environments; implemented, maintained, and monitored them.

Hands-on experience with Snowflake utilities, Snowflake SQL, Snowpipe, etc.

Worked with advanced Snowflake concepts such as Resource Monitors, Role-Based Access Control, Data Sharing, Virtual Warehouse sizing, Query Performance Tuning, Snowpipe, Tasks, Streams, Zero-copy cloning, etc.
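
Two of the items above lend themselves to short sketches; the credit quota, warehouse, and database names are assumptions for illustration.

```sql
-- Resource monitor: suspend the warehouse when 90% of a 100-credit quota is used
CREATE RESOURCE MONITOR etl_monitor WITH CREDIT_QUOTA = 100
  TRIGGERS ON 90 PERCENT DO SUSPEND;
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = etl_monitor;

-- Zero-copy cloning: an instant point-in-time copy that shares storage with the source
CREATE DATABASE dev_db CLONE prod_db;
```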

Created Snowpipe for continuous data loading and used COPY to bulk-load data.

Used Spring Kafka API calls to process messages smoothly on the Kafka cluster setup.

Created internal and external stages and transformed data during load.
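
A minimal sketch of transforming during load, assuming a CSV layout with three columns; the stage and table names are illustrative.

```sql
-- Internal named stage (no URL, so Snowflake-managed storage)
CREATE OR REPLACE STAGE int_stage FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- COPY can project and cast staged columns during the load itself
COPY INTO curated.equipment (equipment_id, model, install_dt)
FROM (
  SELECT $1,
         UPPER($2),                  -- normalize model names
         TO_DATE($3, 'YYYY-MM-DD')   -- cast text to DATE while loading
  FROM @int_stage/equipment/
);
```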

Created Teradata mappings and workflows using various transformations to perform the ETL aspects of the data migration to Snowflake.

Helped map the existing data ingestion platform to the AWS cloud stack for the enterprise cloud initiative.

Used tools like IBM UCD, GIT/GitHub for version control.

Provided support for customer data warehouse data serving analytical reporting for marketing, life sciences, and various other user communities, delivering return-to-service fixes and process enhancements for consistent and timely data delivery.

Developed processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

ETL development for the control architecture, common modules, sequence controls, and major critical interfaces.

Environment: Snowflake, Python, Linux, AWS services, SQL

Caramel IT Services Private Limited, Hyderabad May 2013 – Oct 2015

Role: Data Analyst

Responsibilities:

Created automation scripts to streamline ETL processes, data imports/exports, and API pulls using Python and shell scripting.

Created reports and data visualization dashboards using complex SQL logic and BI tool Tableau.

Provided data warehousing solutions and designed data models to efficiently store and process data.

Scheduled ETL jobs on Linux servers and Amazon cloud (AWS) using Crontab and CloudWatch.

Daily activities included extracting, transforming, integrating, and loading client data; developing batch and streaming data pipelines; performing data analysis; supporting production activities; and resolving production issues.

Worked collaboratively with Technical Architects, Data Analysts, and Project Managers to gather requirements and technical specifications and performed development activities in an effective and timely manner.

Evaluated new database technologies and data integration mechanisms to stay current with industry trends.

Environment: AWS S3, GitHub, Service Now, SQL, Apache Spark, Tableau

Education:

Bachelor's: BBM, Kakatiya University, 2013

Master's: CIS, Bellevue University, 2018


