E. Bharathi
Snowflake Developer | Email: ***********@*****.*** | Mobile: 203-***-**** | Stamford, CT 06901
Professional Summary:
5+ years of IT experience, including 3+ years as a Snowflake Developer.
In-depth understanding of Snowflake architecture and cloud technology, with a focus on performance tuning and optimization of Snowflake data warehouses.
Knowledge of DBT model creation and data transformation, turning raw data into analytics-ready tables.
Hands-on experience with Snowflake utilities, including SnowSQL, Snowpipe, tasks, and streams, as well as with Amazon S3 components.
Developing Snowsight dashboards to visualize warehouse spend.
Assessing and right-sizing virtual warehouses for cost efficiency.
Detecting and purging outdated tables to minimize storage expenses.
Facilitating new user setup and managing access permissions.
Experience with Snowflake features such as Zero-Copy Cloning, Time Travel, Fail-safe, Snowpipe, Data Sharing, Secure Views, and multi-cluster warehouses (see the sketch after this list).
Experience building Fivetran pipelines to ingest data.
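A minimal Snowflake SQL sketch of Zero-Copy Cloning and Time Travel (the sales table name is hypothetical, not from a past project):

    -- Zero-Copy Clone: the new table shares the original's micro-partitions; no data is copied
    CREATE TABLE sales_dev CLONE sales;

    -- Time Travel: query the table as it existed 30 minutes ago (offset in seconds)
    SELECT * FROM sales AT(OFFSET => -1800);

    -- Recover a dropped table within the Time Travel retention window
    UNDROP TABLE sales;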
Technologies:
Databases: Oracle, SQL Server
Cloud: Snowflake, AWS
ETL Tools: Informatica, DBT
Operating Systems: Unix, Windows
Version Control / CI-CD: GitHub, Jenkins, uDeploy
Methodologies: Agile (Scrum), Waterfall
Ticketing Tools: Remedy, ServiceNow, Jira
Programming: Python
Professional Experience:
Horizon Media, Contract (Snowflake Engineer), (11/24 – Present), (New York City, NY)
Environment: Snowflake, Fivetran, HVR, DBT, SQL Server
Provide expert support to optimize Snowflake workloads.
Experienced in writing complex SQL queries, including subqueries, joins, and views, to meet various business requirements.
Migrated Talend jobs to equivalent Snowflake functionality, including unit testing.
Created dimensional models, facts, and data marts using DBT.
Built DBT models with Jinja macros, YAML configurations, and data tests (see the model sketch at the end of this section).
Developed and maintained ETL pipelines using Fivetran and HVR for seamless data integration and replication.
Implemented real-time and batch data ingestion strategies to ensure data accuracy and availability.
Optimized data movement processes across cloud and on-premise environments, improving performance and scalability.
Collaborated with cross-functional teams to support data migration, transformation, and analytics initiatives.
Prepared analytical SQL queries for data analysis (e.g., revenue, matching).
Created and managed Snowflake objects such as databases, schemas, tables, stages, sequences, views, procedures, and file formats using SnowSQL and the Snowflake UI.
Extensive hands-on use of Snowflake utilities, including SnowSQL, Snowpipe, and Snowflake Streams, to implement Slowly Changing Dimensions (SCD).
Designed and managed roles, groups, and resource tags using AWS Identity and Access Management (IAM).
Proficient in leveraging Snowflake’s data sharing and integration features to facilitate seamless collaboration.
Support Fivetran pipelines to ingest data.
Optimized Snowflake storage by implementing transient tables where applicable and ensuring optimal clustering column definitions.
Cloned database schemas, tables, and procedures from QA to production environments.
Utilized advanced Snowflake features such as Time Travel, Data Sharing, Data Cloning, Secure Views, and clustering for effective data management.
Assess and right-size virtual warehouses for cost efficiency.
Design and implement Snowflake security models with RBAC and DAC (see the sketches below).
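A minimal sketch of an incremental DBT model with Jinja, along the lines referenced above (model, source, and column names are hypothetical):

    -- models/marts/fct_revenue.sql
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        order_id,
        customer_id,
        order_date,
        amount
    from {{ ref('stg_orders') }}

    {% if is_incremental() %}
    -- On incremental runs, only process rows newer than what is already loaded
    where order_date > (select max(order_date) from {{ this }})
    {% endif %}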
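A minimal sketch of the RBAC pattern from the last bullet (role, database, and schema names are hypothetical):

    -- Read-only functional role over an analytics schema
    CREATE ROLE IF NOT EXISTS analyst_ro;
    GRANT USAGE ON DATABASE analytics TO ROLE analyst_ro;
    GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_ro;
    GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;
    GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.marts TO ROLE analyst_ro;
    -- Roll the role up to SYSADMIN per Snowflake's recommended hierarchy
    GRANT ROLE analyst_ro TO ROLE SYSADMIN;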
Conexus Solutions (Snowflake Developer), (04/23 – 03/24), (India)
Environment: Snowflake, AWS, DBT
Defined roles and privileges required to access database objects using RBAC.
Used SnowSQL to implement bulk access requests.
Created SnowSQL scripts for loading data from flat files into Snowflake tables, optimizing data ingestion processes.
Leveraged Snowflake features such as Time Travel, Data Sharing, Data Cloning, Secure Views, and clustering to manage and optimize data workflows.
Loaded data from files staged in external sources (e.g., Amazon S3) into target Snowflake tables and queried the data warehouse for business insights; worked extensively with Amazon S3 components (see the load sketch at the end of this section).
Cloned database schemas, tables, and procedures from QA to production environments to streamline the deployment process.
Implemented Slowly Changing Dimensions (SCD) using Snowflake utilities such as SnowSQL, Snowpipe, and Snowflake Streams (see the stream-and-MERGE sketch at the end of this section).
Managed and processed semi-structured data (JSON, XML) and columnar PARQUET files using the VARIANT data type in Snowflake.
Created shares, stages, file formats, and storage integrations for lines of business (LOBs).
Redesigned Snowflake views to improve query performance.
Interacted with cross-functional teams, project managers, and agile teams to estimate development effort and ensure complete delivery of solutions that fulfill requirements.
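A minimal sketch of the external-stage load and VARIANT handling described above (bucket, integration, and table names are hypothetical):

    -- External stage over S3, using a pre-created storage integration
    CREATE STAGE raw_stage
      URL = 's3://example-bucket/landing/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = JSON);

    -- Land semi-structured records in a VARIANT column, then flatten for modeling
    CREATE TABLE raw_events (payload VARIANT);
    COPY INTO raw_events FROM @raw_stage;

    SELECT payload:event_id::string  AS event_id,
           payload:ts::timestamp_ntz AS event_ts
    FROM raw_events;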
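A minimal sketch of stream-driven SCD maintenance as described above (Type 1 shown for brevity; table names are hypothetical):

    -- Stream captures inserts/updates/deletes on the raw table
    CREATE STREAM customers_stream ON TABLE customers_raw;

    -- Apply captured changes to the dimension; typically scheduled via a TASK
    -- gated on SYSTEM$STREAM_HAS_DATA('customers_stream')
    MERGE INTO dim_customers d
    USING customers_stream s
      ON d.customer_id = s.customer_id
    WHEN MATCHED AND s.METADATA$ACTION = 'INSERT' THEN
      UPDATE SET d.name = s.name, d.email = s.email
    WHEN NOT MATCHED AND s.METADATA$ACTION = 'INSERT' THEN
      INSERT (customer_id, name, email)
      VALUES (s.customer_id, s.name, s.email);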
Veeva Systems (Veeva Vault and SQL Developer), (07/22 – 03/23), (India)
Environment: Oracle, UNIX, Autosys, Informatica, Veeva
Developed PL/SQL procedures and functions.
Created shell scripts to automate export jobs and move data to the respective environments.
Loaded data from external systems into the Oracle database using SQL*Loader.
Implemented advanced features such as MERGE statements, global temporary tables, and XMLTable.
Handled exceptions and raised error messages to the front end.
Created database objects such as tables, views, sequences, and indexes.
Created ref cursors, table functions, types, and objects.
Created global temporary tables for inserting XML data.
Modified procedures according to changing requirements.
Resolved issues and applied patches.
Implemented MERGE statements to update and insert data into main tables (see the sketch below).
Added constraints and columns as per requirements.
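A minimal Oracle sketch of the MERGE upsert pattern from the bullet above (table and column names are hypothetical):

    -- Upsert staged rows into the main table in one statement
    MERGE INTO orders_main m
    USING orders_stage s
       ON (m.order_id = s.order_id)
    WHEN MATCHED THEN
      UPDATE SET m.status = s.status, m.amount = s.amount
    WHEN NOT MATCHED THEN
      INSERT (order_id, status, amount)
      VALUES (s.order_id, s.status, s.amount);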
Mindfire Solutions (Software Developer), (03/20 – 07/22), (Hyderabad, India)
Developed SQL and PL/SQL Infolets using ECP (product tool), creating scheduled Infolets to enhance query performance.
Modified existing PL/SQL subprograms based on changing business requirements (Change Requests).
Managed the import and export of database objects using IUP, including exporting database dumps and working on the GRC platform.
Launched and configured Amazon EC2 Cloud Servers using AMIs (Linux/Ubuntu), setting up servers for specific applications.
Wrote complex SQL queries using joins and subqueries to retrieve data from databases, and modified PL/SQL subprograms to meet business needs (see the sketch after this list).
Troubleshot day-to-day issues, including monitoring SQL Server error logs and checking configured email alerts.
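A minimal sketch of the join-plus-subquery query style noted above (schema is hypothetical):

    -- Customers whose total order value exceeds the average order amount
    SELECT c.customer_id, c.customer_name, t.total_amount
    FROM customers c
    JOIN (SELECT customer_id, SUM(amount) AS total_amount
          FROM orders
          GROUP BY customer_id) t
      ON t.customer_id = c.customer_id
    WHERE t.total_amount > (SELECT AVG(amount) FROM orders);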
Academic Qualification:
B.Tech from JNTU, 2015.