N. SWATHI
Contact: +1-512-***-****
E-mail: ********@*****.***
https://www.linkedin.com/in/swathi-lagadapati/
IT PROFESSIONAL - ETL DEVELOPER
Offering 12+ years of experience; seeking challenging assignments across the industry
SUMMARY OF SKILLS
●Technically competent professional with significant experience across SQL Server development, data warehouse technologies, client/server environments, Agile methodology, JIRA, software design & development, database administration, technical analysis, quality assurance, and team management.
●Well versed in writing complex SQL queries involving multiple tables, and in creating and maintaining stored procedures and user-defined functions.
●Expert in data Extraction, Transformation and Loading (ETL) using tools such as Pentaho, NiFi, Google Replicator, and Airflow, and in creating custom pipelines for incremental loads.
●Highly skilled in performance tuning of SQL queries and in validating data using Google Sheets.
●A quick learner who adapts to new technologies with ease and an innovative approach.
●Proficient in mapping user requirements, custom-designing solutions, and troubleshooting complex software/application problems.
●Defined, designed, and coordinated approval of validation testing controls, processes, and study documentation.
●Excellent team leader with a unique blend of technical, functional, and qualitative skills, backed by a strong record of facilitating cross-functional coordination for successful delivery of technical solutions.
Technical Skills Set
ETL Tools: Pentaho, NiFi, Data Bridge, Clarinet, Replicator, Airflow
Cloud Data Platforms: Google BigQuery
Data Analysis & Visualization: PLX, Looker Studio, Google Sheets
Programming/Scripting: JavaScript, Core Java, Python (basic)
Operating Systems: Windows family
Databases: SQL Server, MSSQL, BigQuery, PLX
Reporting: Dashboard creation, KPI monitoring, Email notifications
Agile Methodologies: Jira, Sprint Planning, On-call Support
Data Warehousing & Modeling: Dimensional Modeling, Performance Tuning
PROFESSIONAL EXPERIENCE
●Working as a Technical Manager at HCL Technologies from Feb 2021 to date
●Worked as a Software Engineer at Sagarsoft India Ltd. from Dec 2015 to Jan 2021
●Worked as a Software Engineer at Vincere Semantics from Mar 2013 to Nov 2015
EDUCATION
PG – Master of Computer Applications
Avanti Institute of Technologies, affiliated to JNTU University
ANNEXURE
Client
Role
Business Analysis – ETL
Environment
Airflow, BigQuery, PLX, PLX Dashboard
Duration
April’24 – till date
Synopsis
CMPA (Cloud Marketplace Agency) is a financial reporting project for onboarding countries for corporate entities in the cloud, serving international controllership teams and business finance accounting requirements.
Responsibilities
●Conduct interviews with business stakeholders to understand their requirements
●Review the requirements, prepare the plan, and review it with business for sign-off
●Develop simple to complex SQL queries using BigQuery and PLX
●Perform UAT and obtain sign-off from business
●Validate results in pre-prod and deploy to prod
●Create pipelines using DAGs in Airflow
●Build dashboards using PLX and send stats as email notifications to end users
●Fix customer bugs, test failures, and recon jobs
●Support ETL and DB issues occurring in higher environments
●Assist team members when required
Client
Role
ETL Engineer
Environment
Airflow, BigQuery, PLX
Duration
April’23 – April’24
Synopsis
Billerweb is a project for transitioning reports from the payments platform (PLX) to FDU (BigQuery).
Responsibilities
●Analyzed the payments reports and prepared the functional requirement document
●Reviewed the functional document with the manager and obtained approval
●Developed simple to complex SQL queries using BigQuery and PLX
●Created Blueprint files for deployment purposes
●Validated results and deployed to prod
●Created pipelines using DAGs in Airflow
●Built dashboards using Looker Studio and sent stats as email notifications to end users
●Fixed customer bugs, Twinpeak test failures, and recon jobs
●Supported ETL and DB issues occurring in higher environments
●Assisted team members when required
Client
Role
ETL Engineer
Environment
Replicator, Clarinet, BigQuery, Spanner, PLX
Duration
Feb’21 – March’23
Synopsis
DSF enables organizations to manage, enforce, and monitor their data securely.
Responsibilities
●Created various ETL pipelines in Java for use in the Replicator ETL tool
●Created Clarinet pipelines and scheduled them as per requirements
●Developed simple to complex queries using the BigQuery and Spanner databases
●Created pipelines for each environment, for ETL and for validations
●Followed up with customers after each deployment to obtain LGTM approval
●Supported production issues on priority and fixed them
●Prepared G3Docs for each adoption for on-call support
●Built PLX dashboards and workflows, and version-controlled code using Piper
Client
Point72
Role
ETL Engineer - Implementation and Development
Environment
ETL – Pentaho (Kettle, Spoon), NiFi, SQL Server
Duration
Jan’17 – Jan’21
Synopsis
Point72 Asset Management, L.P. is a privately owned family office. The firm invests in public equity markets, employing a long/short equity strategy.
Responsibilities
●Designed the ETL architecture for extracting data from different types of sources and loading it to the data lake.
●Extracted data from web services using Pentaho.
●Generated JSON output and input using Pentaho.
●Loaded data to Kafka (v10) using Pentaho.
●Extracted data from different sources such as databases, flat files, etc.
●Processed more than 125 GB of data and around 50 million records for the data lake.
●Created data marts to assemble large, complex data sets to meet business requirements.
●Created custom processors in NiFi to handle nulls and extract the Kafka publication time.
●Loaded data into target systems: Kafka, HDFS locations, tables, etc.
●Involved in the Pentaho 8 migration.
●Integrated CyberArk to retrieve passwords securely and store them in configuration properties files.
●Emailed and scheduled reports through jobs.
Client
Sagarsoft
Role
Software Engineer
Environment
Android, Core Java
Duration
Mar’15 – Dec’16
Synopsis
VAS is a web and mobile application designed to help farmers access information related to their crops. Farmers are assisted by Field Agents to use the application without difficulty. Consultants and sub-consultants guide farmers on how to resolve issues concerning their crops and increase their crop production.
Responsibilities
●Analyzed the mocks provided by the designer
●Involved in designing and building an advanced application for the Android platform
●Responsible for delivering a quality app to the client
●Worked with external APIs and data sources
●Tested compatibility of the app on different Android devices
●Ensured the app met projected needs
Client
Active Ops
Role
Software Engineer
Environment
Java, Jersey RESTful Web Services
Duration
March’13 – Dec’15
Synopsis
ActiveOps is a highly configurable cloud-based software platform enabling rapid deployment of robust accounting, financial, and travel & expense management applications. ActiveOps provides its services to users through the Active Finance mobile application, which requires REST services that provide user information and allow users to view all items in their inbox and retrieve item details, invoices, etc.
Responsibilities
●Analyzed the requirement document.
●Developed new functionality to support new services provided by the client.
●Involved in developing RESTful web services using the Jersey framework.
●Coordinated with the team.
●Attended client calls, performed code reviews, and deployed code to specific servers.
●Responsible for performance testing and unit testing.
●Responsible for bug fixing.
●Developed error handling as per client requirements.