Post Job Free

Cloud Engineer / AWS Developer

Location:
Dallas, TX, 75201
Salary:
$120,000
Posted:
April 08, 2025

Contact this candidate

Resume:

Navya Reddy Gogula Email: *********@*****.***

AWS Developer & Cloud Engineer Phone: +1-518-***-****

Professional Summary:

A passionate AWS Developer and Cloud Engineer with around six years of IT experience specializing in the design and development of cloud-based applications and middleware tools. Skilled in a broad range of AWS services, with hands-on experience in DevOps tools, automation, and cloud architecture. Adept at delivering high-performance, scalable, and secure cloud solutions, including Infrastructure as Code (IaC) using CloudFormation and Terraform. Strong presentation and communication skills, with a proven ability to translate technical concepts into clear, business-oriented outcomes.

Key Skills & Expertise:

Designed and deployed secure, scalable AWS environments using EC2, S3, VPC, RDS, and IAM.

Implemented security best practices, including IAM policies, least privilege access, encryption (KMS, SSE, TLS), and MFA enforcement.

Managed AWS Key Management Service (KMS) for secure data encryption and secret management using AWS Secrets Manager.

Designed and implemented event-driven architectures by integrating AWS Lambda with S3, DynamoDB, SNS, SQS, and API Gateway.

Implemented serverless architecture using API Gateway, Lambda, DynamoDB, and Step Functions.

Built and managed ECS clusters using EC2 launch type for more control over infrastructure and optimized compute usage.

Developed complex SQL queries for data extraction, transformation, and loading (ETL) from diverse sources into Amazon S3, Redshift, and RDS.

Managed S3 buckets and configured storage classes (Standard, Intelligent-Tiering, Glacier) for cost-optimized data management.

Configured AWS Backup to automate snapshot retention and recovery strategies.

Developed custom ETL scripts in Python for AWS Glue jobs, optimizing data transformation workflows.

Defined and optimized ECS task definitions, service discovery, health checks, and load balancing via Application Load Balancer (ALB).

Experience with CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy.

Managed AWS Glue Data Catalog for metadata storage, ensuring efficient schema management and data discovery.

Experience in setting up & managing Windows/Linux servers on EC2.

Automated provisioning of AWS services (EC2, Lambda, S3, API Gateway, RDS) using AWS CDK, enhancing deployment efficiency.

Automated infrastructure provisioning using AWS CloudFormation & Terraform, reducing manual setup time.

Developed custom automation scripts using Python & AWS SDKs (Boto3) to streamline cloud resource management.

Implemented AWS Config & AWS Systems Manager for compliance, auditing, and patch management.

Strong understanding of networking concepts including VPC, VPN, Subnets, NAT Gateway, Endpoints, Peering, and DNS.

Designed VPC peering and transit gateway configurations for secure, cross-region connectivity.

Configured EBS, ELB, SSL, Security Groups, IAM roles for enhanced security & availability.

Configured AWS Security Hub and AWS GuardDuty for proactive threat detection and compliance monitoring.
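As an illustration of the Boto3 automation work described above, the sketch below audits EC2 instances for required tags. It is a minimal, hypothetical example: the tag keys and the audit policy are assumptions, not taken from the resume, and the AWS call runs only when executed against a real account.

```python
"""Minimal sketch of a Boto3 tag-audit script (tag keys are assumed)."""

REQUIRED_TAGS = {"Owner", "Environment"}  # assumed tagging policy

def missing_tags(instance_tags, required=REQUIRED_TAGS):
    """Return the required tag keys absent from an instance's tag list."""
    present = {t["Key"] for t in (instance_tags or [])}
    return sorted(required - present)

def audit_instances(ec2_client):
    """Yield (instance_id, missing_keys) for instances violating the policy."""
    paginator = ec2_client.get_paginator("describe_instances")
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                gaps = missing_tags(inst.get("Tags"))
                if gaps:
                    yield inst["InstanceId"], gaps

if __name__ == "__main__":
    import boto3  # requires AWS credentials; not exercised offline
    for instance_id, gaps in audit_instances(boto3.client("ec2")):
        print(f"{instance_id} is missing tags: {', '.join(gaps)}")
```

Keeping the filtering logic in a pure function (`missing_tags`) makes the policy testable without touching AWS.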

Achievements & Impact:

Reduced manual setup time through Terraform & CloudFormation automation.

Improved data processing efficiency with optimized AWS Glue ETL pipelines.

Enhanced security compliance by implementing IAM policies, MFA, and automated security audits.

Strengthened disaster recovery by implementing AWS Backup & RDS Multi-AZ failover strategies.

Increased API performance through API Gateway optimization & caching mechanisms.

Developed Python automation tools to reduce operational overhead in cloud resource management.

Technical Skills:

Cloud Services: AWS (EC2, RDS, S3, Lambda, Glue, Route 53, SNS, SQS, CloudFront, CloudWatch, CloudTrail)

Automation & DevOps Tools: Terraform, CloudFormation, Ansible, Jenkins, CodeBuild, CodeDeploy, CodePipeline

Programming Languages: Python, Java (Core), TypeScript, JavaScript

Containerization: Docker, Kubernetes, AWS ECS, EKS, Fargate, Helm, Microservices Architecture

Networking: VPC, Subnets, NAT Gateway, VPN, Route 53, Load Balancers, Peering

Security & Monitoring: AWS IAM, GuardDuty, Security Hub, WAF, KMS, Secrets Manager

Databases: MySQL, PostgreSQL, DynamoDB, Aurora, RDS

Version Control & CI/CD: Git, CodeCommit, GitHub Actions

Web & API Technologies: REST, SOAP, API Gateway, JSON, YAML

Education and Training:

Jawaharlal Nehru Technological University, Hyderabad

Bachelor’s in Computer Science

Certifications:

AWS Certified Solutions Architect - Associate

Validation Number: 1EL26K22SNB1QWGK

Validation Link: https://aws.amazon.com/verification

Professional Experience:

Project#: 1

Cloud Engineer, Accenture (April 2023 – Present)

Client: Progressive Insurance

Description: The goal of this insurance data processing project was to build an automated ETL (Extract, Transform, Load) pipeline using AWS Glue and Amazon S3 to process, transform, and store data for reporting. This setup enables efficient, scalable handling of large volumes of claims and policy data.

Responsibilities:

Designed and implemented an automated ETL pipeline using AWS Glue and Amazon S3 for processing large-scale insurance data related to claims and policies, improving data efficiency and accuracy.

Developed AWS Glue ETL jobs to clean, transform, and process raw claims data, optimizing data workflows for reporting and analytics.

Wrote Python & PySpark scripts for data preprocessing, including removing spaces, handling missing values, standardizing formats, and calculating aggregate claim amounts.

Optimized AWS Glue job performance by tuning Spark configurations, partitioning strategies, and data serialization formats to improve processing speed.

Configured AWS Glue Workflows & Triggers for end-to-end ETL job automation, reducing manual intervention and improving process efficiency.

Implemented EventBridge rules to trigger Glue jobs upon new data uploads in S3, ensuring real-time data processing and freshness.

Built ETL pipelines using AWS Glue and SQL-based transformations to ingest and process data from on-premises databases into S3.

Used Amazon CloudWatch for monitoring AWS Glue job execution, tracking errors, latency, and resource consumption, ensuring optimal performance.

Set up CloudWatch alarms to send real-time alerts for ETL job failures and performance bottlenecks, enabling proactive issue resolution.

Enabled detailed logging for AWS Glue jobs to troubleshoot failures, analyze processing times, and enhance job observability and debugging.

Stored processed and transformed data in structured folders within Amazon S3, improving data accessibility and organization for analytics teams.

Ensured data security and compliance by implementing AWS IAM roles and policies, restricting unauthorized access to sensitive claims data.

Optimized Glue job execution costs by leveraging AWS Glue job bookmarking, AWS Cost Explorer insights, and resource monitoring techniques.

Designed and implemented Athena queries to enable ad-hoc querying and real-time data analysis of transformed datasets.

Automated AWS infrastructure provisioning using Terraform, streamlining deployment, configuration, and maintenance of AWS resources.
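The claims-cleaning responsibilities above can be sketched in plain Python. In the actual Glue jobs this logic would run in PySpark; the field names (`policy_id`, `claim_amount`) and defaulting rules here are assumptions for illustration only.

```python
"""Plain-Python sketch of the claim-cleaning and aggregation steps
(field names and defaults are assumed; the real job used PySpark)."""
from collections import defaultdict

def clean_claim(record):
    """Trim whitespace, default missing amounts to 0.0, normalize the policy id."""
    return {
        "policy_id": (record.get("policy_id") or "").strip().upper(),
        "claim_amount": float(record.get("claim_amount") or 0.0),
    }

def total_by_policy(records):
    """Aggregate cleaned claim amounts per policy id."""
    totals = defaultdict(float)
    for rec in map(clean_claim, records):
        totals[rec["policy_id"]] += rec["claim_amount"]
    return dict(totals)
```

In a Glue job the same shape would appear as DataFrame transformations (`trim`, `fillna`, `groupBy().sum()`) rather than Python dicts.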

Project#: 2

Cloud Engineer, TCS (April 2022 – April 2023)

Client: Allianz Global Corporate & Specialty (AGCS)

Description: This project aimed to digitize the eDelivery dispatcher process, focusing on securely sending, processing, and managing statement PDF files. Key components included file ingestion, broker authentication, and access control for PDF statements using AWS services.

Responsibilities:

Developed Python-based AWS Lambda functions for efficient data processing, including bulk data ingestion from S3 to DynamoDB.

Designed and implemented an automated workflow using AWS Step Functions to handle Lambda function execution, reducing manual intervention.

Configured API Gateway to securely expose AWS Lambda functions with authentication and authorization mechanisms using Cognito User Pools and IAM roles.

Integrated API Gateway with AWS Lambda, DynamoDB, and S3, ensuring secure and seamless data processing.

Optimized API Gateway configurations, including stages, request validation, throttling, and rate limiting to improve API performance and security.

Developed event-driven architectures using Amazon EventBridge to trigger AWS services based on business workflows.

Provisioned, configured, and maintained AWS infrastructure as code (IaC) using CloudFormation for resource automation and consistency.

Automated data processing pipelines with AWS Glue jobs to extract, transform, and load (ETL) data as per client requirements.

Implemented real-time monitoring and alerting using AWS CloudWatch Alarms, Logs, and Metrics, improving incident response times.

Configured and optimized S3 storage policies, including lifecycle rules and access control, ensuring cost efficiency and security.

Implemented Single Page Application (SPA) deployment using AWS S3 and CloudFront, improving application performance and end-user experience.

Managed IAM roles, policies, and security best practices, enforcing the Least Privilege Access model across AWS resources.

Developed AWS SNS notifications to inform stakeholders of system events and failures proactively.

Enabled secure file transfer via SFTP integrated with AWS services, ensuring seamless and secure data exchange.

Worked on serverless API integrations to improve system scalability and cost efficiency.

Improved system reliability by implementing disaster recovery strategies and high availability architectures across AWS services.
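The S3-to-DynamoDB ingestion pattern from the responsibilities above can be sketched as a Lambda handler. The table name `statements` and the item schema are assumptions for illustration; the S3 event shape is the standard notification payload, and the Boto3 call is deferred so the parsing logic stays testable offline.

```python
"""Sketch of the S3-to-DynamoDB Lambda pattern (table name and item
schema are assumed for illustration)."""
import urllib.parse

def items_from_s3_event(event):
    """Extract (bucket, key) pairs from a standard S3 notification payload."""
    for record in event.get("Records", []):
        s3 = record["s3"]
        key = urllib.parse.unquote_plus(s3["object"]["key"])
        yield {"bucket": s3["bucket"]["name"], "key": key}

def handler(event, context):
    """Lambda entry point: write one DynamoDB item per uploaded object."""
    import boto3  # deferred so the parsing above is testable without AWS
    table = boto3.resource("dynamodb").Table("statements")  # assumed table
    items = list(items_from_s3_event(event))
    for item in items:
        table.put_item(Item={"pk": item["key"], "bucket": item["bucket"]})
    return {"processed": len(items)}
```

Note the `unquote_plus` call: S3 URL-encodes object keys in event payloads, a common source of missed records.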

Project#: 3

Cloud Engineer, ERP Analysts (September 2020 – March 2022)

Client: City of Boston

Description: Provisioned, configured, and maintained AWS cloud infrastructure using CloudFormation. Developed and maintained operational tools for deployment, monitoring, and management of AWS infrastructure and systems in alignment with client requirements.

Responsibilities:

Designed, deployed, and maintained AWS cloud infrastructure using EC2, S3, RDS, VPC, and other services.

Implemented network security best practices using Security Groups, NACLs, AWS WAF, GuardDuty, and Shield.

Set up Load Balancers (ALB, NLB) and Auto Scaling Groups for high availability and fault tolerance.

Implemented AWS CloudWatch for monitoring system performance, logs, and alarms.

Used AWS CloudTrail and AWS Config for auditing and compliance tracking.

Developed CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy.

Automated infrastructure tasks using AWS Lambda, Step Functions, and Systems Manager.

Planned and executed cloud migrations using AWS DMS, Server Migration Service (SMS), and AWS Application Migration Service.

Implemented hybrid cloud connectivity with AWS Direct Connect and VPN.
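The CloudWatch monitoring work above can be illustrated with a Boto3 sketch that defines a CPU alarm. The threshold, evaluation window, and alarm name are assumed values, not taken from the project; building the parameters in a pure function keeps the policy inspectable without AWS access.

```python
"""Sketch of defining a CloudWatch CPU alarm via Boto3
(threshold, period, and naming are assumed values)."""

def cpu_alarm_params(instance_id, threshold=80.0, periods=3):
    """Build keyword arguments for cloudwatch.put_metric_alarm."""
    return {
        "AlarmName": f"high-cpu-{instance_id}",
        "Namespace": "AWS/EC2",
        "MetricName": "CPUUtilization",
        "Dimensions": [{"Name": "InstanceId", "Value": instance_id}],
        "Statistic": "Average",
        "Period": 300,                 # 5-minute datapoints
        "EvaluationPeriods": periods,  # require a sustained breach
        "Threshold": threshold,
        "ComparisonOperator": "GreaterThanThreshold",
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials; not exercised offline
    cloudwatch = boto3.client("cloudwatch")
    cloudwatch.put_metric_alarm(**cpu_alarm_params("i-0123456789abcdef0"))
```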

Project#: 4

Cloud Engineer, ERP Analysts (July 2019 – September 2020)

Client: Florida State College at Jacksonville

Responsibilities:

Designed and configured AWS VPCs, subnets, Security Groups, and Network ACLs to ensure secure and efficient network communication.

Deployed NAT Gateways and VPC Endpoints to enable private subnet communication and restrict unnecessary internet exposure.

Enabled Multi-Factor Authentication (MFA) and federated access to strengthen authentication mechanisms.

Automated AWS resource provisioning and management using Python (Boto3), reducing manual effort and deployment time.

Automated AMI/Snapshot/Volume management to enable seamless backup, recovery, and resource optimization.

Upgraded/downgraded AWS resources (CPU, Memory, EBS) based on usage analysis to optimize cost and performance.

Worked with Amazon RDS (MySQL, PostgreSQL) to set up scalable and cost-efficient databases.

Configured multi-AZ deployments for high availability and enabled automated backups, snapshots, and failover mechanisms.
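The snapshot-management automation above can be sketched as a retention policy: select snapshots older than a cutoff and delete them. The 30-day window is an assumed policy, and the selection logic is kept pure so it can be verified against sample data before ever touching an account.

```python
"""Sketch of snapshot-retention automation (the 30-day window is an
assumed policy, not taken from the project)."""
from datetime import datetime, timedelta, timezone

def expired_snapshots(snapshots, retention_days=30, now=None):
    """Return ids of snapshots whose StartTime is past the retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [s["SnapshotId"] for s in snapshots if s["StartTime"] < cutoff]

if __name__ == "__main__":
    import boto3  # requires AWS credentials; not exercised offline
    ec2 = boto3.client("ec2")
    snaps = ec2.describe_snapshots(OwnerIds=["self"])["Snapshots"]
    for snapshot_id in expired_snapshots(snaps):
        ec2.delete_snapshot(SnapshotId=snapshot_id)
```

Passing `now` explicitly makes the cutoff deterministic, which is what allows the policy to be unit-tested.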


