AWS Python

Location: Dublin, CA
Salary: 135,000
Posted: January 20, 2021

Prashanth Mamidala

adjkzy@r.postjobfree.com | 424-***-**** | 5542 Crestridge Terrace, Dublin, CA 94568

Professional Summary:

* ***** ** ********* ********** in the data engineering field, working hands-on in backend development and cloud engineering.

Advanced Python programming and backend REST API development using frameworks such as Django, Flask, and FastAPI, and AWS services such as API Gateway and CloudFront.

Experience developing application logic to requirements, building Docker images, and deploying them to ECS and EKS clusters.

Expertise in Bash and Python scripting with a focus on DevOps tooling and CI/CD architecture.

Experience creating AWS CloudFormation templates for all kinds of AWS services to automate provisioning.

Proficient in writing Terraform and CloudFormation scripts and deploying them using Jenkins, AWS CodeBuild, and CodeDeploy.

Expertise in writing Python scripts to automate services, from Lambda functions through API Gateway.

Expertise in Python scripting against provided requirements.

Expertise in AWS cloud services such as EC2, S3, ELB, Auto Scaling, EBS, RDS, VPC, subnets, security groups, Elastic IP, Internet Gateway, route tables, Route 53, CloudWatch, IAM, SQS, SNS, and SES, plus Azure IaaS/PaaS/SaaS offerings and Cost Explorer.

Experience in planning, designing, consulting, and implementation across AWS cloud, Python, Windows, and Linux.

Experience administering production, development, and test environments running Windows and Linux.

Experience in database query construction and data warehousing.

Experience working with databases such as Amazon RDS and Redshift, including creating instances per requirements.

Experience with the Linux command line.

Worked on processing very large datasets, handling missing values, creating dummy variables, and dealing with noise in the data.

Experience using Airflow to run data pipelines.

Experience with common data science toolkits such as NumPy, Matplotlib, and pandas.

Engineered data sets and defined their scope and features.

Strong knowledge of and experience with Amazon Web Services such as EC2, S3, Glue, Redshift, Kinesis, RDS, VPC, and IAM.

Created Lambda functions and assigned roles to run Python scripts; created Lambda jobs and configured roles using the AWS CLI (a minimal sketch follows below).
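
A minimal boto3 sketch of this kind of Lambda automation, assuming an existing execution role and a pre-built deployment package; the function name, role ARN, and region are hypothetical placeholders:

```python
import boto3

# Hypothetical names; substitute a real execution role and package.
FUNCTION_NAME = "resource-survey"
ROLE_ARN = "arn:aws:iam::123456789012:role/lambda-exec-role"

lambda_client = boto3.client("lambda", region_name="us-west-2")

# Create the function from a local zip containing handler.py.
with open("handler.zip", "rb") as f:
    lambda_client.create_function(
        FunctionName=FUNCTION_NAME,
        Runtime="python3.8",
        Role=ROLE_ARN,
        Handler="handler.lambda_handler",
        Code={"ZipFile": f.read()},
        Timeout=60,
    )

# Invoke once to verify the deployment.
response = lambda_client.invoke(FunctionName=FUNCTION_NAME)
print(response["StatusCode"])
```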

Education Details:

Masters: Northwestern Polytechnic University, Fremont, CA (2016) Major: Computer Science

Masters: University of the Cumberlands (2020)

Major: Information Systems Security

Bachelors: Kakatiya University, Warangal, India. (2014)

Major: Pharmaceutical Sciences.

Technical Expertise:

Languages: SQL, Python, Java, Hive
Cloud: AWS
Scripting: Bash, Groovy
Infrastructure: CloudFormation, Terraform
CI/CD: Jenkins, CodeDeploy
Version Control: Git, CodeCommit
Visualization Tools: Tableau, Amazon QuickSight
Virtualization: AWS EC2, EKS, ECS, ECR
Database Management: SQL Workbench, DBeaver, pgAdmin
Databases: AWS Redshift, AWS RDS (MySQL, PostgreSQL)

Professional Experience:

Genentech, SFO, CA: Feb 2019 – Present

Job Title: Data Engineer

Project: PHC Data and Analytics

Description: All enterprise data, including raw copies of source-system data, was ingested, scanned for viruses, classified, warehoused, cataloged, and used for client reporting. Data from a variety of sources was ingested using different methods and run through pipelines according to its requirements.

Roles & Responsibilities:

Writing API calls (Django and FastAPI) for various tasks in Airflow (a workflow management platform) to migrate data from on-prem to AWS, covering data movement, virus scanning, data classification, data warehousing (Redshift), data cataloging, and visualization.

Developing application programming interfaces and command-line interfaces, and creating binaries for applications, using the Python modules PyClick and PyInstaller.

Creating APIs for an interactive platform where data scientists launch RStudio, Jupyter, and SAS applications to run their analytics.

Developed a scratch-space solution (similar to Data Lab in Teradata) for data scientists to create tables and run analytics.

Crawling data in S3 with Glue crawlers and loading datasets into Redshift by creating databases, schemas, and tables (see the Glue sketch after this list).

Responsible for designing and developing SQL/Python programs to prepare, transform, and harmonize data sets for modeling.

Developed a user onboarding and access automation application to automatically create and sync AD users in Redshift and grant access to the corresponding databases and schemas.

Classified Personal Health Information (PHI) and Personally Identifiable Information (PII) using the third-party tool Dataguise.

Developed Python code to periodically survey AWS resources.

Experience working with large data sets and distributed computing tools (Hive, Redshift).

Automated cataloging of dataset metadata to CKAN using the harvest framework and JSON-LD.

Wrote Django models to store data in the database for report generation.

Developed APIs with Lambda and API Gateway for an executive-dashboard visualization app, serving UI requests for electronic health records by querying an RDS PostgreSQL database.

Developed Python/Django views and created APIs backed by the ClamAV daemon to scan files uploaded through the UI.

Built Docker images for ClamAV antivirus scanning and deployed them to Lambda with SQS to automate event-driven API calls.

Created models and schemas to persist data in the database using FastAPI.

Writing Terraform scripts to automate the required AWS infrastructure (RDS, EC2, S3, Secrets Manager, Lambda, API Gateway, etc.) for development.

Used Airflow to run the entire pipeline in an automated way (a condensed DAG sketch follows this list).

Performed CI/CD automation using Jenkins, GitHub, Terraform, and Docker.
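
A condensed sketch of how such a pipeline can be wired together in Airflow; the DAG id, task names, and callables are illustrative stand-ins for the production tasks, which called the Django/FastAPI services described above:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Illustrative stage callables; the real tasks invoked internal APIs.
def ingest(**context):
    print("copy source files from on-prem to S3")

def virus_scan(**context):
    print("submit files to the ClamAV scanning API")

def classify(**context):
    print("classify PHI/PII with the classification service")

def load_redshift(**context):
    print("COPY curated files into Redshift")

with DAG(
    dag_id="phc_ingestion",          # hypothetical DAG id
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    stages = [
        PythonOperator(task_id=name, python_callable=fn)
        for name, fn in [
            ("ingest", ingest),
            ("virus_scan", virus_scan),
            ("classify", classify),
            ("load_redshift", load_redshift),
        ]
    ]
    # Chain the stages so each runs only after the previous succeeds.
    for upstream, downstream in zip(stages, stages[1:]):
        upstream >> downstream
```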
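
And a minimal boto3 sketch of the Glue crawler step; the crawler name, role ARN, catalog database, and S3 path are assumptions for illustration:

```python
import boto3

glue = boto3.client("glue", region_name="us-west-2")

# Point a crawler at a raw-data prefix so its tables land in the catalog.
glue.create_crawler(
    Name="raw-data-crawler",
    Role="arn:aws:iam::123456789012:role/glue-crawler-role",
    DatabaseName="raw_catalog",
    Targets={"S3Targets": [{"Path": "s3://example-bucket/raw/"}]},
)
glue.start_crawler(Name="raw-data-crawler")
```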

Environment: Python, Django, FastAPI, Docker, Dataguise, AWS S3, EC2, ELB, CloudTrail, Terraform, CloudWatch, CloudFront, IAM, SNS, RDS, Route 53, EKS, ECS, ECR, Redshift, EMR, Apache Airflow, CKAN, Linux, Jenkins, LDAP, Git.

8kmiles, Pleasanton, CA 94588: July 2018 – Jan 2019

Job Title: AWS DEVELOPER

Roles & Responsibilities:

Developed Python code to periodically survey the resources in AWS.

Created & Maintaining AWS simple Active Directory to control the users and groups for a web-based application.

Programmed a backend Python service to create users and groups, and created endpoints through the Django framework to authenticate them.

Created an authentication mechanism using Django, python-ldap, and AWS Active Directory services.

Configured migration of data from on-premises data centers to AWS using Python and EC2 instances.

Converted files from CSV to Parquet using the pandas Python library (a sketch follows this list), and created static, animated, and interactive visualizations using Matplotlib and NumPy.

Wrote Python scripts to spin up instances in AWS EC2 and OpsWorks stacks, integrated with Auto Scaling to automatically launch servers from configured AMIs.

Wrote boto3 API calls to automate the bucket replication process for centralized logging.

Worked on creating a continuous delivery (CD) pipeline with AWS CodePipeline, automating builds with AWS CodeBuild.

Created a centralized logging solution using AWS Kinesis.

Performed CI/CD automation using CodeCommit, CodeBuild, CodeDeploy, and Docker.

Developed AWS CloudFormation scripts to create and standardize new AWS accounts.

Developed AWS CloudFormation scripts to automate deployment of the developed applications.

Provided SQL programming, with detailed direction, for data analysis that contributed to the final project deliverables; responsible for data mining.

Documented the complete process flow, describing program development, logic, testing, implementation, and application integration coding.

Performed and managed environment-related configuration changes as part of deployments.
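
A minimal pandas sketch of the CSV-to-Parquet conversion mentioned above; the file names are placeholders, and a Parquet engine such as pyarrow is assumed to be installed:

```python
import pandas as pd

# Read a CSV export and rewrite it as Parquet.
df = pd.read_csv("export.csv")
df.to_parquet("export.parquet", index=False)

# Sanity check: the round trip should preserve the row count.
assert len(pd.read_parquet("export.parquet")) == len(df)
```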

Environment: Python, Docker, Django, LDAP, Git, AWS Kinesis, AWS CloudFormation, CloudWatch, IAM, SNS, RDS (PostgreSQL, MySQL), Route 53, ECS, ECR, Redshift, EMR, Bash.

Panasonic Avionics, Pleasanton, CA 94566: Feb 2018 – July 2018

Job Title: AWS DEVELOPER

Roles & Responsibilities:

User account creation and management, setting file permissions, and customizing shell environment setup for users.

Working with AWS services such as EC2, S3, VPC, ELB, Auto Scaling groups, Route 53, IAM, CloudTrail, CloudWatch, CloudFormation, CloudFront, SNS, and RDS.

Writing a Python automation script that detects user logins to the AWS account from CloudTrail logs and sends an SNS notification to Lambda (a handler sketch follows this list).

Automated the Redshift user onboarding and offboarding process using SailPoint Active Directory API calls.

Automated the design, development, deployment, and support of RESTful API services per specific requirements.

Set up and built AWS infrastructure across various services by writing CloudFormation templates in JSON and YAML.

Used AWS Kinesis to load streaming data from on-prem servers into AWS S3 buckets in JSON format.

Working with configuration management tools like Ansible and the CI/CD tool Jenkins.

Developed SQL queries to create schemas, databases, and tables in the Redshift cluster.

Experience creating complex views on top of existing tables.

Created users and groups with the required privileges.

Used Git for version control and JIRA for project management.

Created and implemented Python scripts for various parts of the project as a data engineer.

Created API endpoints with AWS API Gateway.

Experience deploying infrastructure as code with Terraform.
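
A sketch of what the login-notification handler can look like, assuming CloudTrail events reach Lambda through a CloudWatch Events/EventBridge rule; the SNS topic ARN is a placeholder:

```python
import json

import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-west-2:123456789012:console-login-alerts"  # placeholder

def lambda_handler(event, context):
    # CloudWatch Events delivers the CloudTrail record under "detail".
    detail = event.get("detail", {})
    if detail.get("eventName") == "ConsoleLogin":
        user = detail.get("userIdentity", {}).get("arn", "unknown")
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="AWS console login detected",
            Message=json.dumps({"user": user, "time": detail.get("eventTime")}),
        )
    return {"status": "ok"}
```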

Environment: AWS (EC2, EBS, Redshift, ELB, CloudTrail, CloudFormation, CloudWatch, CloudFront, IAM, Kinesis Data Streams, Kinesis Firehose, SNS, RDS, QuickSight, Route 53), Bash, Jenkins, shell scripting, Python, Git.

Orabase Solutions LLC, McKinney, TX 75069: May 2017 – Jan 2018

Job Title: JUNIOR CLOUD ENGINEER

Roles & Responsibilities:

Experience with Linux and AWS services, specifically installation, maintenance, configuration, and monitoring, performed efficiently to achieve organizational goals.

Created a custom domain for API Gateway and configured it with a VPC endpoint, a load balancer, and Lambda (Python).

Involved in designing and deploying a multitude of applications utilizing almost the entire AWS stack (including EC2, Route 53, S3, RDS, DynamoDB, SNS, SQS, and IAM), focusing on high availability, fault tolerance, and auto scaling via AWS CloudFormation.

Automated manual tasks using Shell scripting.

Worked on creating CloudFormation scripts, setting up VPCs (public and private subnets), EC2 instances, ELB, ASGs, and IAM modules.

Wrote CloudFormation templates and deployed AWS resources with them.

Hands-on experience monitoring EC2 instances using CloudWatch; worked closely with EC2 infrastructure teams to troubleshoot complex issues.

Utilized CloudWatch to monitor resources such as EC2 CPU and memory, Amazon RDS database services, DynamoDB tables, and EBS volumes; set alarms for notifications or automated actions; and monitored logs for a better understanding and operation of the system (an alarm sketch follows this list).
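
A minimal boto3 sketch of one such CloudWatch alarm; the instance ID, threshold, and SNS topic are illustrative assumptions:

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-west-2")

# Alarm when average CPU on one instance stays above 80% for 10 minutes.
cloudwatch.put_metric_alarm(
    AlarmName="ec2-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-west-2:123456789012:ops-alerts"],
)
```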

Environment: AWS services (VPC, CloudFormation, CloudWatch, CloudFront, IAM, SNS, RDS, Route 53), SVN, Jenkins, Linux.

Basecamp Technologies Pvt Ltd, Pune 411009: June 2014 – March 2015

Job Title: Jr DEVOPS/AWS ENGINEER

Roles & Responsibilities:

Designing, implementing, and supporting fully automated Continuous Integration and Continuous Delivery pipelines.

Worked with and supported various development teams delivering a wide range of software applications.

Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer, and Auto Scaling groups; optimized volumes and EC2 instances.

End-to-end deployment ownership for projects on Amazon AWS, including scripting for automation, scalability, and build promotion from staging to production.

Set up the Jenkins AWS CodeDeploy plugin to deploy to AWS.

Integrated Amazon CloudWatch with Amazon EC2 instances to monitor log files, store them, and track metrics.

Created AWS S3 buckets, performed folder management in each bucket, and managed CloudTrail logs and objects within each bucket (a sketch follows this list).

Defined and implemented software configuration management standards under Agile/Scrum methodologies, in accordance with the organization's practices.
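
A minimal boto3 sketch of the bucket and prefix management described above; the bucket name, region, and prefixes are hypothetical:

```python
import boto3

s3 = boto3.client("s3", region_name="us-west-2")
BUCKET = "example-app-logs"  # placeholder name

# Create the bucket (LocationConstraint is required outside us-east-1).
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
)

# "Folders" in S3 are just key prefixes; a zero-byte object marks one.
for prefix in ("cloudtrail/", "app/", "archive/"):
    s3.put_object(Bucket=BUCKET, Key=prefix)

# List the top-level prefixes to confirm the layout.
resp = s3.list_objects_v2(Bucket=BUCKET, Delimiter="/")
print([p["Prefix"] for p in resp.get("CommonPrefixes", [])])
```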

Environment: AWS, Git, Jenkins, Maven, Apache Tomcat, Unix/Linux, Windows, Oracle, MySQL.

Academic Projects:

Market Rates Android Application:

Northwestern Polytechnic University, Fremont, CA (Masters): Sept 2016 – Dec 2016

Actively participated in developing an Android app that helps users browse all products with details and order products for delivery.

Developed an application for Android handheld devices that lets the user view product details and place orders.

Wrote Django API calls for the backend code to serve content to the frontend UI.

Created database tables in MySQL to store the required product details.

Created Django models to automate data insertion by parsing UI requests (an illustrative model follows this list).

Performed regression and performance testing to certify the stability and usability of the software systems.
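
An illustrative Django model of the kind described; the Product fields below are assumptions rather than the original schema:

```python
from django.db import models

class Product(models.Model):
    # Assumed fields for the product catalog.
    name = models.CharField(max_length=200)
    description = models.TextField(blank=True)
    price = models.DecimalField(max_digits=8, decimal_places=2)
    created_at = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return self.name
```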

Personal Projects:

Created a weather app using FastAPI to get the temperature for a given city/ZIP (a sketch follows this list).

Built a magic mirror using a Raspberry Pi 4 and Python 3.8.

Downloaded data from Kaggle, performed ETL processing with Python scripts, and loaded the data into a MySQL database.
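
A minimal FastAPI sketch of such a weather endpoint; the external weather API URL and its response fields are placeholders:

```python
import httpx
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Placeholder backend; the real app called a public weather API.
WEATHER_API = "https://api.example.com/weather"

@app.get("/temperature/{city}")
async def temperature(city: str):
    async with httpx.AsyncClient() as client:
        resp = await client.get(WEATHER_API, params={"q": city})
    if resp.status_code != 200:
        raise HTTPException(status_code=502, detail="weather backend error")
    return {"city": city, "temperature": resp.json().get("temp")}
```

Run locally with, for example, `uvicorn app:app --reload` and query `/temperature/Dublin`.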


