RACHAPALLI SAI SRINIVAS
+91-949******* *********************@*****.*** https://www.linkedin.com/in/rachapalli-sai-srinivas-20250372/
PROFESSIONAL SUMMARY
Results-oriented developer with around 7 years of experience in Python, GCP, AWS, and consulting, and a strong drive to deepen expertise in Data Engineering & Analytics. Seeking to build a solid professional career in the data field and contribute to organizational growth.
SKILLSET
●Apache Beam, Python, PostgreSQL, Data Engineering, GCP (Dataflow, Pub/Sub, GCS, Cloud SQL, Airflow, Composer, BigQuery, Dataproc, and Cloud Functions), Google Agentspace, and Data Stores.
●Python, AWS (Lambda, Glue, S3, SNS), Data Analysis, SQL, JSON, Linux (basic).
●Requirements Elicitation, Design and Solution documentation, Agile, MS Excel, PowerPoint.
WORK HISTORY
Senior Associate – PwC Acceleration Center, BANGALORE
Client – Internal Feb 2025 – Aug 2025
Technologies: Google Agentspace, Data Stores
Description: Built AI agents for business use cases and demonstrated them over Google Meet to stakeholders in the USA.
Responsibilities:
Reviewed the client's technical and functional documents and filled gaps where needed.
Analyzed the requirements and developed agents accordingly.
Worked closely with the design/build, infrastructure, and cross-functional teams to gather requirements.
Gave demos to leadership teams and incorporated their feedback.
Lead Programmer Analyst – Bilvantis Technologies, HYDERABAD
Client – HSBC Sept 2023 – January 2025
Project Name: C48 Collection Management
Technologies: Python, Dataflow, Apache Beam, GCS, Airflow, Pub/Sub, PostgreSQL, Jenkins pipelines, PyTest, Git.
Description: C48 is a program to build a next-generation collections platform. The recommended collection system is an online solution that supports the collection of delinquent and over-limit customer accounts. It maintains up-to-date collection information for accounts in its database and communicates directly with the accounting system, eliminating the need for paper flow. This allows collection departments to more accurately track and forecast potential volumes each day, prioritizing their work more effectively and efficiently.
The Collections and Recovery HASE service categorizes and manages accounts that require special handling. HASE Collections primarily manages past-due accounts but can also track, analyze, and recover other account groups, including VIPs, lost or stolen cards, over-limit accounts, and bankruptcy cases. It offers flexible collection strategies driven by control record parameters. The service interfaces with AR and dialer systems. If the system fails for any reason and becomes unavailable, there is no direct impact on customers. However, collectors will be unable to update any data or collect payments from customers during the period of unavailability. From the bank's perspective, this could result in a delay in collecting payments.
Responsibilities:
Designed and implemented scalable ETL pipelines using Python and Apache Beam on Google Cloud Platform (GCP) to process and transform large datasets (a minimal pipeline sketch follows this list).
Utilized GCP services such as Cloud Dataflow, BigQuery, Pub/Sub for event publishing, and Cloud Storage for efficient data processing, transformation, and storage.
Automated data workflows using Jenkins pipelines for continuous integration and deployment, optimizing data pipelines for real-time and batch processing to reduce data latency and improve throughput.
Developed unit test cases for data pipelines and implemented continuous integration practices to ensure data accuracy and reliability.
Worked closely with the high-level design/build team, infrastructure team, and cross-functional teams to gather requirements, design data models, and build robust data infrastructure.
Debugged and resolved technical issues; also participated in code reviews and followed coding standards.
Tested and deployed applications in production environments.
Wrote and maintained technical documentation.
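A minimal sketch of the kind of streaming pipeline described above (the project, topic, bucket, and table names here are hypothetical placeholders, not the actual C48 resources):

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_event(message: bytes) -> dict:
        """Decode a Pub/Sub message payload into a dict record."""
        return json.loads(message.decode("utf-8"))


    def run() -> None:
        options = PipelineOptions(
            streaming=True,
            runner="DataflowRunner",      # executes on Cloud Dataflow
            project="example-project",    # placeholder project id
            region="us-central1",
            temp_location="gs://example-bucket/tmp",
        )
        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadEvents" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/account-events")
                | "Parse" >> beam.Map(parse_event)
                | "KeepDelinquent" >> beam.Filter(
                    lambda rec: rec.get("status") == "DELINQUENT")
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "example-project:collections.delinquent_accounts",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
            )


    if __name__ == "__main__":
        run()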
Client – Internal April 2023 – Sept 2023
●Gathered requirements and built and tested a data migration tool using Python, Apache Airflow, BigQuery, and GCS (a minimal DAG sketch follows).
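A minimal Airflow DAG sketch of one such migration step (the project, dataset, table, and bucket names are hypothetical placeholders):

    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.bigquery_to_gcs import (
        BigQueryToGCSOperator,
    )

    with DAG(
        dag_id="bq_to_gcs_migration",
        start_date=datetime(2023, 4, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Export one BigQuery table to newline-delimited JSON files in GCS.
        export_table = BigQueryToGCSOperator(
            task_id="export_table",
            source_project_dataset_table="example-project.example_dataset.example_table",
            destination_cloud_storage_uris=[
                "gs://example-bucket/exports/example_table-*.json",
            ],
            export_format="NEWLINE_DELIMITED_JSON",
        )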
Senior Software Engineer – Legato Health Technologies LLP, HYDERABAD February 2022 – April 2023
Client – Anthem
●Wrote Python scripts to automate data flows from AWS S3 to Snowflake.
●Developed generic frameworks for data migrations from S3 to Snowflake, including deploying and testing AWS Lambda and AWS Glue jobs (a minimal loader sketch follows).
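A minimal sketch of an S3-to-Snowflake loader of this kind (the connection parameters, stage, and table names are hypothetical placeholders; in practice credentials would come from a secrets store, not literals):

    import snowflake.connector


    def handler(event, context):
        """AWS Lambda entry point triggered by an S3 object-created event."""
        key = event["Records"][0]["s3"]["object"]["key"]

        conn = snowflake.connector.connect(
            account="example_account",    # placeholder credentials
            user="example_user",
            password="example_password",
            warehouse="LOAD_WH",
            database="ANALYTICS",
            schema="STAGING",
        )
        try:
            # COPY INTO reads the new file through an external stage that is
            # mapped to the S3 bucket which fired the event.
            conn.cursor().execute(
                f"COPY INTO staging_table FROM @s3_stage/{key} "
                "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
            )
        finally:
            conn.close()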
Consultant – ITCRATS INFO SOLUTIONS, HYDERABAD August 2021 – January 2022
Client – Worldline Company
●Gathered client requirements and delivered code on an as-needed basis.
●Participated in code reviews, code merges, and other development activities.
●Worked with the Django framework and wrote Python scripts.
●Delivered internal and client demos for assigned projects.
●Collaborated with team members on impact assessments, functionality analysis, and estimates.
Python Developer
Client – Globe Life Insurance Company August 2018 – August 2021
Project Name: Binary to ASCII Data Conversion
Technologies: Python, AWS Step Functions, S3, Lambda functions
●Worked with clients and other internal teams to understand client requirements and deliver tasks on time.
●Reviewed mainframe data sets against the layouts to detect discrepancies.
●Analyzed data files using COBOL, Assembler, and Easytrieve mainframe layouts.
●Developed and deployed Python scripts and parameters in the AWS environment (a minimal conversion sketch follows this list).
●Designed test cases for unit testing and acceptance testing.
●Tested and debugged Python scripts manually, both locally and in the AWS cloud environment.
●Wrote requirements and design documents for developed solutions.
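A minimal sketch of the fixed-width binary-to-ASCII conversion step (the record layout and input file name are hypothetical examples; the real field widths came from the COBOL/Assembler/Easytrieve layouts):

    # (field name, width in bytes) pairs describing one fixed-width record.
    LAYOUT = [("policy_id", 10), ("holder_name", 30), ("premium", 9)]
    RECORD_LENGTH = sum(width for _, width in LAYOUT)


    def convert_record(raw: bytes) -> dict:
        """Slice one fixed-width record and decode EBCDIC (code page 037) text."""
        fields, offset = {}, 0
        for name, width in LAYOUT:
            fields[name] = raw[offset:offset + width].decode("cp037").strip()
            offset += width
        return fields


    def convert_file(path: str):
        """Yield one decoded dict per fixed-width record in the file."""
        with open(path, "rb") as handle:
            while chunk := handle.read(RECORD_LENGTH):
                if len(chunk) == RECORD_LENGTH:
                    yield convert_record(chunk)


    if __name__ == "__main__":
        for row in convert_file("policies.ebcdic"):  # placeholder input file
            print(row)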
EDUCATION
Bachelor of Engineering (Electronics and Communications) – SRKR Engineering College Sept 2012 – May 2016
CERTIFICATION
Data Science Certification Program (6 Months) – AcadGild
Assignments: Python Statistics, Data Analysis
Key Result Areas:
●Knowledge of Python programming concepts.
●Hands-on experience applying statistical techniques to different types of data problems.
●Good at understanding issues, providing clear clarifications, and getting defects fixed quickly.
●Exposure to Jupyter Notebook tooling.
●Involved in dataset understanding, data modelling, reporting, and documentation.
Certification Project – Relative Humidity Prediction Analysis
Description: This project processed an air-quality dataset from an Italian city containing 9,358 instances of hourly averaged responses from an array of 5 metal oxide chemical sensors embedded in an Air Quality Chemical Multisensor device. Null values were replaced using forward fill. The dependent and independent variables were plotted, the data was split into train and test sets, and multiple machine learning algorithms were applied for a comparative analysis of their performance, after which the best-fitting algorithm for the dataset was selected.
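A minimal sketch of this workflow, assuming the dataset has already been reduced to numeric columns (the CSV path is a placeholder; RH is the relative-humidity target column in the UCI Air Quality dataset):

    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor

    df = pd.read_csv("AirQualityUCI_clean.csv")  # placeholder, numeric columns only
    df = df.ffill()                              # forward-fill missing values

    X = df.drop(columns=["RH"])                  # features: sensor responses
    y = df["RH"]                                 # target: relative humidity
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    # Fit several regressors and compare their held-out R^2 scores.
    models = {
        "linear_regression": LinearRegression(),
        "decision_tree": DecisionTreeRegressor(random_state=42),
        "random_forest": RandomForestRegressor(random_state=42),
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        print(name, r2_score(y_test, model.predict(X_test)))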