PRAVEENA
KRISHNAKUMAR
Hopkinton, MA linkedin.com/in/praveena-krishnakumar
781-***-**** ********.************@*******.***
Experienced Data Engineer with 10 years of expertise in ETL, database management, and cloud environments like AWS and Snowflake. Proven track record across diverse sectors, including food supply chain, healthcare, and energy, managing billions of data records and optimizing data pipelines with tools like Fivetran, Airbyte, and Pentaho. Successfully led the migration of large-scale, on-premises data to cloud platforms for various clients. Currently seeking to leverage my extensive experience in a lead or managerial role, driving innovation and tackling new challenges.
TECHNICAL COMPETENCIES
Databases - Oracle, SQL Server
Data Warehousing Tools - Snowflake
ETL Tools - Pentaho, Qlik, Airbyte, BryteFlow, Pendo
Cloud - AWS, Azure
SCM - Git, Bitbucket
Deployment Tools - Liquibase
Scripting Languages - Shell, Batch, Python
Replication Tools - Oracle GoldenGate, Qlik Replicate, Striim, SnapLogic
Scheduling/Monitoring - Paessler PRTG Network Monitor
Agile Tools - JIRA, Confluence
Other Tools - Oracle Enterprise Manager (OEM), Microsoft Visio, TOAD Data Point, AEM
EXPERIENCE
YES ENERGY
Snowflake Cloud DBA Lead Needham, MA
November 2022 - Present
Recognized for my performance as ETL Lead, I was promoted to lead efforts as Snowflake Cloud DBA, where I achieved the following:
Integrated third-party data into Snowflake, enhancing data availability and enabling real-time analytics for better decision-making.
Utilized Snowflake utilities and Snowpipe for data modeling and optimization.
Led the team in building and automating Snowflake credentials for clients, streamlining metadata functions in Oracle.
Leveraged Snowpipe for efficient, automated data loading into Snowflake, reducing data latency and enhancing real-time data availability.
Managed Snowflake tasks, procedures, and warehouse maintenance, including implementing MFA and auto-replication, improving security and reducing downtime.
Designed scalable Snowflake warehouses, increasing query performance by 25% and optimizing data analytics for high-volume datasets.
Developed data streams to process data flowing from source systems to targets such as the Snowflake warehouse and the Data Lake.
Collaborated with Engineering and Product Management teams on roadmap planning.
Built automated code deployment pipelines through Bitbucket and gained familiarity with repository tools such as Visual Studio and SourceTree.
Created and managed S3 buckets for storing database and log backups, uploading files, and customizing data through JSON for Data Lake clients.
Used Paessler PRTG to monitor data flow and data quality processes.
Collaborated with Scrum Masters, client partners, and business managers while adhering to Agile methodologies.
Mentored the data team in data warehousing principles, leveraging Snowflake to enhance data accessibility and analytics.
Developed data pipelines using SnapLogic, Qlik, BryteFlow, and other ETL tools, supporting the Product and BI teams in creating real-time dashboards and enhancing data-driven decisions.
Automated workflows using Python, SnapLogic, and shell scripting, reducing manual tasks by 60% and boosting team productivity by 20%.
Retrieved data into Snowflake through Python-based API integrations, enabling efficient data access and enhancing the performance of data-driven applications.
Ensured top-tier data quality and compliance, achieving zero data breaches and strengthening client trust.
Managed 350+ cloud client accounts, optimizing metadata structures and ensuring seamless data operations.
Directed CI/CD deployments using Liquibase, ensuring smooth integrations and faster response times.
Established PRTG alerts and monitoring tools, enabling rapid response to issues and improving overall service reliability.
Optimized Snowflake: tuned data models to cut query times by 40%, developed custom SQL scripts to automate database maintenance, integrated Snowflake with BI tools for superior data visualization, and improved data accuracy by 20% through cross-functional collaboration.
Worked with the team to create Root Cause Analysis (RCA) for any issues to ensure continuous improvement.
Developed Proofs of Concept (POCs) for new products, testing and validating new technologies and approaches to enhance data solutions.
YES ENERGY
ETL/DBA Analyst Needham, MA
September 2019 – November 2022
As an ETL DBA Analyst, I delivered the following:
Handled the replication of over 10 billion records across on-premises environments using Oracle GoldenGate, Qlik, and BryteFlow, significantly improving user access and replication performance.
Streamlined DBA processes by implementing Materialized Views (MVs), schedulers, and validation procedures, reducing errors and enhancing data accuracy.
Implemented real-time alerts and conducted in-depth analytics to detect and resolve latency issues, ensuring optimized data transfer and continuous data integrity.
Created intuitive visual tools like map alerts to enable non-technical teams to resolve issues swiftly.
Spearheaded ETL pipeline optimizations with Fivetran, Airbyte, and Pentaho, boosting data processing speeds by 50% and minimizing downtime.
Developed and deployed data solutions across multiple industries, including food supply chain, healthcare, and energy, driving operational efficiency and providing actionable business insights.
HEALTH CARE FINANCIALS
Senior ETL Lead Quincy, MA
November 2018 – September 2019
Worked on business transformation initiatives to enhance data warehousing and analytics capabilities.
Managed end-to-end Pentaho ETL processes and troubleshooting.
Created functional documents and supported system integration and user acceptance testing.
Analyzed business data, identifying gaps and ensuring alignment with business processes.
Designed and developed a Test-Driven Development framework.
FSE NET
Senior ETL Developer Waltham, MA
July 2016 – November 2018
Developed Kettle transformations and jobs for data publishing and reporting.
Created Jasper reports and deployed them on Apache Tomcat.
Designed and maintained file formats and templates for various manufacturers and recipients.
EDUCATION
MASTER OF INFORMATION SYSTEMS AND APPLICATIONS
Bharathidasan University
June 2002
BACHELOR OF COMMERCE
Madras University
May 2000