Akshu Patel
*********@*****.*** 980-***-**** LinkedIn
Professional Summary
Senior Software Engineer with 6+ years of experience building cloud-native backend systems, scalable microservices, and end-to-end data pipelines across enterprise, fintech, and e-commerce domains. Works with Python (Django, Flask, FastAPI), AWS services (Lambda, Glue, Redshift, Step Functions, EKS, Cognito, S3), and frontend frameworks (React/TypeScript) to develop secure, high-performance applications for analytics, reporting, and compliance. Has implemented GenAI and LLM workflows for semantic search, document summarization, and intelligent data insights. Skilled in DevOps, CI/CD, containerization (Docker/Helm), infrastructure automation with Terraform and CloudFormation, and monitoring, caching, and visualization solutions built with Prometheus, ELK Stack, Redis, D3.js, and Chart.js. Focused on security, compliance, automated testing, and full SDLC practices to deliver scalable, reliable, and maintainable software.
Professional Experience
Senior Software Engineer United Rentals Jul 2024 – Present
• Designed and implemented a retrieval-augmented generation (RAG) pipeline where vendor documents uploaded into S3 triggered Step Functions; data was vectorized with FAISS/Pinecone and surfaced via LangChain APIs, enabling semantic search and contextual summarization across millions of records (a minimal sketch of the vectorization step follows this role's bullets).
• Built an event-driven ingestion architecture using AWS Lambda, EventBridge, and SQS queues; transformed incoming vendor data with Pandas/SQLAlchemy and stored it in PostgreSQL and Amazon Redshift for analytics and reporting.
• Created React/TypeScript dashboards with D3.js and Chart.js visualizations to track ingestion jobs, schema mismatches, and file processing results; integrated APIs and WebSockets for real-time updates.
• Developed Django APIs and microservices to serve processed business data to internal teams, providing fast, reliable access, seamless integration with downstream systems, and improved operational efficiency.
• Applied GenAI workflows to process and summarize large volumes of documents, enabling teams to quickly access key information and make informed decisions.
• Developed secure microservices on AWS EKS, integrated with Cognito for authentication, IAM roles for RBAC, and encryption for sensitive customer and financial workflows; all API calls were logged in CloudTrail for audit readiness.
• Automated infrastructure provisioning using Terraform and CloudFormation templates, deploying EKS workloads, Redshift clusters, Glue jobs, and networking resources; created reusable IaC modules for team-wide adoption.
• Established monitoring and observability with Prometheus, ELK Stack, and CloudWatch metrics, with SNS alerts for pipeline delays, ingestion errors, and performance issues.
• Implemented LLM workflows using LangChain APIs to process and analyze millions of vendor documents stored in S3, generating contextual summaries, enabling semantic search, and providing internal teams with actionable insights for faster decision-making and improved operational efficiency.
• Developed and executed unit and integration tests for Django APIs and ETL pipelines, ensuring data accuracy, pipeline reliability, and robust service performance.
• Involved in the end-to-end software development lifecycle, including requirements analysis, system design, development, testing, deployment, and maintenance, ensuring delivery of scalable and reliable software solutions aligned with business objectives.
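The RAG bullets above describe an S3-to-FAISS vectorization flow; the following is a minimal sketch of that step only, assuming plain-text vendor documents, OpenAI embeddings via langchain_openai, and an in-memory FAISS index. The bucket/key values and the vectorize_document/semantic_search helpers are illustrative rather than the production implementation, and LangChain import paths vary by version.

import boto3
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import FAISS

s3 = boto3.client("s3")

def vectorize_document(bucket: str, key: str) -> FAISS:
    # Download one uploaded document from S3, chunk it, and embed the chunks into FAISS.
    text = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_text(text)
    return FAISS.from_texts(chunks, OpenAIEmbeddings(), metadatas=[{"source": key}] * len(chunks))

def semantic_search(index: FAISS, query: str, k: int = 3):
    # Return the k chunks most similar to the query for downstream summarization.
    return index.similarity_search(query, k=k)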
Software Engineer Nomura Oct 2023 – Jul 2024
• Built end-to-end data pipelines to ingest, validate, and transform enterprise trade datasets from CSV/JSON sources, applying schema validation with Pandas and persisting cleaned data in PostgreSQL and Redshift for downstream analytics and reporting.
• Implemented Flask APIs to deliver processed trade and compliance data to internal tools, enabling analysts to query information efficiently, integrate with monitoring dashboards, and access frequently requested datasets.
• Developed caching mechanisms for API responses and intermediate datasets, improving response times and reducing load on backend databases during high-volume trade data queries (a caching sketch follows this role's bullets).
• Designed and deployed fraud detection pipelines using Scikit-learn models via AWS SageMaker, integrating anomaly alerts into transaction monitoring workflows for proactive detection.
• Implemented GenAI-driven data processing workflows to extract insights from large vendor datasets, enabling semantic search, intelligent recommendations, and automated analysis across e-commerce operations.
• Created Elasticsearch-based indexing and embedding pipelines to support hybrid keyword and similarity search across trade records, improving data retrieval efficiency and compliance checks.
• Developed LLM workflows to process and analyze enterprise trade datasets from CSV and JSON sources, generating embeddings and enabling semantic search, similarity queries, and contextual insights that supported compliance checks, risk monitoring, and faster decision-making for analysts.
• Built real-time monitoring pipelines for ingestion workflows using WebSocket-driven dashboards in React/TypeScript, allowing analysts to track processing status, errors, and validation results.
• Automated CI/CD for data and API services using Jenkins and GitLab CI/CD, containerizing and deploying them to AWS EKS clusters with Helm for reproducible, scalable data processing.
• Applied Infrastructure-as-Code (IaC) practices using Terraform, provisioning AWS resources such as Redshift, EKS, and S3 for consistent, repeatable, and compliant deployments.
• Developed and executed unit and integration tests for ETL workflows and Flask APIs, ensuring pipeline reliability, data quality, and robust application performance.
• Applied pipeline-level security and compliance measures, including IAM segmentation, TLS encryption, audit logging, and RDS/Redshift encryption at rest.
• Participated in the full SDLC, from requirements analysis and development to testing, deployment, monitoring, and maintenance, ensuring alignment with business goals and operational needs.
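The response-caching bullet above can be illustrated with a minimal Flask sketch, assuming a Redis cache via redis-py; the endpoint path, cache-key scheme, 300-second TTL, and the fetch_trades placeholder are assumptions for illustration, not the production code.

import json
from flask import Flask, jsonify
import redis

app = Flask(__name__)
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def fetch_trades(trade_date: str) -> list:
    # Placeholder standing in for the real PostgreSQL/Redshift query.
    return []

@app.route("/trades/<trade_date>")
def get_trades(trade_date: str):
    key = f"trades:{trade_date}"
    cached = cache.get(key)
    if cached is not None:
        return jsonify(json.loads(cached))      # cache hit: skip the database
    rows = fetch_trades(trade_date)
    cache.setex(key, 300, json.dumps(rows))     # cache miss: store for 5 minutes
    return jsonify(rows)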
Software Engineer OpenXcell Jul 2020 – Jul 2022
• Developed FastAPI and Django microservices for e-commerce order flows, handling product, vendor, and customer data; delivered secure, reliable backend services integrated with relational (PostgreSQL) and semi-structured (MongoDB) databases, using Redis caching for sessions and frequently accessed data.
• Built end-to-end Spark ETL pipelines on AWS EMR to ingest, normalize, and transform vendor feeds into Redshift and S3, ensuring consistent schema alignment for reporting and downstream analytics (a normalization sketch follows this role's bullets).
• Automated infrastructure provisioning and deployments using Terraform and Jenkins, creating reusable modules for AWS EKS clusters, Redshift warehouses, and S3 pipelines across production and staging environments.
• Implemented security best practices across APIs and AWS resources, including IAM roles, encryption for S3 and Redshift, and role-based access control (RBAC) to safeguard sensitive e-commerce and operational data.
• Created interactive dashboards with React and Angular to visualize product analytics, vendor performance, and ETL job statuses; integrated D3.js and Chart.js charts with backend APIs to provide real-time operational insights.
• Established DevOps pipelines in GitLab CI/CD for static code analysis (Flake8, MyPy), container builds, integration testing, and automated deployment into Kubernetes clusters, enabling faster and more reliable releases.
• Monitored distributed services and pipelines using Prometheus, Grafana, ELK Stack, and CloudWatch alarms with SNS notifications, ensuring quick detection and resolution of ingestion errors and system performance issues.
• Applied automation across ETL and backend workflows, reducing manual interventions, improving reliability, and maintaining consistency across internal tools and reporting pipelines.
• Participated in the full SDLC, from requirements gathering and development to testing, deployment, monitoring, and maintenance, ensuring operational alignment with business goals.
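The Spark ETL bullet above describes normalizing vendor feeds on EMR; below is a minimal PySpark sketch of that normalization step, assuming a headered CSV feed. Column names and S3 paths are illustrative, and the Redshift load (e.g. a COPY from the curated S3 location) is not shown.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("vendor-feed-normalize").getOrCreate()

# Read the raw vendor feed (headered CSV) from the landing prefix.
raw = spark.read.option("header", True).csv("s3://example-bucket/vendor-feeds/incoming/")

# Normalize column names and types, then drop duplicate vendor/order rows.
normalized = (raw
              .withColumnRenamed("VendorID", "vendor_id")
              .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
              .withColumn("amount", F.col("amount").cast("double"))
              .dropDuplicates(["vendor_id", "order_id"]))

# Write the curated output partitioned by date for downstream reporting.
normalized.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/vendor-feeds/normalized/")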
Junior Software Engineer Arth Technologies Apr 2019 – Jun 2020
• Built and maintained Flask APIs for internal HR applications, handling employee records, payroll data, and HR workflows; ensured secure access, consistent responses, and smooth integration with backend databases and reporting tools.
• Developed end-to-end ETL pipelines in Python to ingest CSV, Excel, JSON, and XML files into MySQL and RDS PostgreSQL; applied validation, transformation, and normalization for high-quality datasets (a validate-and-load sketch follows this role's bullets).
• Automated ETL execution and backend processes using AWS Lambda triggers, scheduling batch jobs and streamlining workflows to minimize manual intervention while improving reliability and timeliness.
• Provisioned and managed AWS infrastructure (EC2, S3, RDS, VPC) with IAM roles and network security; applied IaC practices via reusable scripts for reproducible and scalable deployments.
• Implemented caching strategies with in-memory or persistent caches to optimize API response times and reduce repetitive database queries, enhancing system performance.
• Created interactive dashboards and visualization tools to monitor ETL pipelines, track data ingestion metrics, highlight anomalies, and provide operational insights to HR and management teams.
• Performed comprehensive testing including unit tests (unittest), API testing (Postman), and ETL output validation to ensure robustness, reliability, and compliance with business requirements.
• Integrated development work into CI/CD pipelines, automating code builds, containerization, testing, and deployment to improve development speed and release reliability.
• Monitored backend services and data pipelines using logs, metrics, and alerts; ensured smooth operations, minimal downtime, and proactive issue detection.
• Participated in the full SDLC and cross-functional collaboration, applying security and compliance measures (IAM policies, encryption, VPC isolation), streamlining workflows, and maintaining consistency across HR applications and internal tools.
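The ETL bullets above describe file-based ingestion into MySQL/RDS PostgreSQL; the following is a minimal pandas/SQLAlchemy sketch of the validate-and-load step for one CSV feed. The column list, table name, and connection URL are illustrative assumptions, not the actual HR schema.

import pandas as pd
from sqlalchemy import create_engine

REQUIRED_COLUMNS = {"employee_id", "name", "department", "salary"}

def load_employees(csv_path: str, connection_url: str) -> int:
    # Read the feed and fail fast if required columns are missing.
    df = pd.read_csv(csv_path)
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"feed is missing required columns: {sorted(missing)}")

    # Basic normalization: trim text fields, coerce salary to numeric, drop bad rows.
    df["name"] = df["name"].str.strip()
    df["salary"] = pd.to_numeric(df["salary"], errors="coerce")
    df = df.dropna(subset=["employee_id", "salary"])

    # Append the cleaned rows to the target table (e.g. an RDS PostgreSQL instance).
    engine = create_engine(connection_url)
    df.to_sql("employees", engine, if_exists="append", index=False)
    return len(df)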
Technical Skills
• Programming & Frameworks: Python (Django, Flask, FastAPI), React, TypeScript, Angular, Pandas, SQLAlchemy, PySpark, NumPy, Scikit-learn
• Databases & Data Management: PostgreSQL, MySQL, MongoDB, Redis, Amazon Redshift, RDS, S3, ETL pipelines, Data validation, transformation & normalization, Schema validation, CSV/JSON/XML processing
• Cloud & Infrastructure: AWS (Lambda, Glue, Redshift, Step Functions, EKS, S3, Cognito, API Gateway, CloudWatch, EventBridge, SQS), Terraform, CloudFormation, EC2, VPC, IAM, RBAC, Encryption at rest & in transit
• DevOps & CI/CD: Jenkins, GitLab CI/CD, Containerization (Docker/Helm), Kubernetes, Blue/Green deployments, Infrastructure-as-Code (IaC), Automated testing, Deployment pipelines
• Monitoring & Observability: Prometheus, Grafana, ELK Stack, CloudWatch metrics & alerts, SNS notifications, Logging, Performance monitoring
• Frontend & Visualization: React, TypeScript, Angular, D3.js, Chart.js, WebSocket integration, Interactive dashboards
• Security & Compliance: IAM roles, RBAC, TLS/SSL encryption, Audit logging, Secure API development, Pipeline-level security, Compliance-driven workflows
• AI/ML & LLM-related: Retrieval-Augmented Generation (RAG), FAISS, Pinecone, LangChain APIs, LLM workflows, Embedding pipelines, Semantic search, Fraud detection ML pipelines (Scikit-learn, SageMaker), GenAI workflows
• Testing & SDLC: Unit testing (unittest), API testing (Postman), Integration testing, Full software development lifecycle (SDLC), Agile/Scrum workflows
• Automation & Tools: ETL automation, Backend workflow automation, Infrastructure provisioning, Reusable modules, Data ingestion automation
Education Details
Master's in Computer Science, University of Texas at Arlington