
Data Analyst Engineer

Location: Schaumburg, IL
Posted: April 12, 2018


DURGA PRASAD SALADI

ac44dh@r.postjobfree.com

217-***-****

linkedin.com/in/durgaprasads

Performance profile:

Highly skilled cloud computing engineer with exceptional development ability and extensive knowledge of AWS and machine learning methodologies, gained over years of professional experience.

Built tools based on Docker and AWS that are used internally at Capital One.

Exposed to all aspects of the software development life cycle (SDLC), including analysis, planning, development, testing, implementation, and post-production analysis of projects.

Expertise in AWS services for framing end-to-end solutions from on-premises environments to AWS.

Two years of experience with core AWS services (S3, EC2, ELB, EBS, Route 53, VPC, Auto Scaling, etc.), deployment services (Elastic Beanstalk, OpsWorks, and CloudFormation), and security and monitoring services (IAM, CloudWatch, and CloudTrail).

Worked under different methodologies, including Agile/Scrum software development, the waterfall model, and test-driven development.

In-depth understanding of data warehousing and predictive analytics; knowledge of business intelligence tools such as Base SAS, SAS EG, SAS Macro, SAS Enterprise Miner, and Tableau.

Experience with business process modeling, systems analysis, and requirements planning and management.

Ability to work effectively in cross-functional team environments, with experience training cross-functional teams.

Experience using bug-tracking systems such as Jira, Bugzilla, HP Quality Center, and IBM Rational ClearQuest.

Good knowledge of R and Python scripting for analyzing and visualizing data.

Strong command of Git and GitHub, with experience designing Git organization architecture for projects.

Hands-on experience working with AWS and JavaScript Object Notation (JSON).

Strong command of data mining techniques and statistical analysis tools.

Ability to use analytical expertise to parse and refine requirements into detailed system-level requirements or business processes.

Strong problem-solving and analytical abilities.

Certifications:

AWS Certified Developer – License EJYE7QFK2NVQQ13E

AWS Certified Solutions Architect – License EJYE7QFK2NLNQSE

SAS Certified Advanced Programmer for SAS 9 – License BP071336v9

SAS Certified Base Programmer for SAS 9 (A00-211) – License GR1P6RXV

Technical Skills:

Cloud Platform: Lambda, S3, DynamoDB, EC2, SageMaker, VPC, IAM, RDS, ECS, Route 53

DevOps: CloudFormation, EBS, OpsWorks, Docker, Chef, GitHub, Jenkins

Data Analysis: SAS, Python 2.x/3.x, R, SQL

Analytical Packages: Pandas, NumPy, Boto3, SAS Macro

Programming: Objective-C, C++, Python, UNIX shell, JSON, PHP, PL/SQL

Tools/Software: SAS E-Miner, SSRS, Tableau, Microsoft Visio, Microsoft Project, Signavio, Unigraphics, Pro/E, PyCharm, RStudio

Ticketing/Knowledge Tools: SharePoint, Jira, Confluence

Version Control Tools: Git, SVN, GitHub, TFS, IBM Rational ClearCase

Web/Application Servers: WebLogic, Apache Tomcat, WebSphere, JBoss

Operating Systems: Windows, UNIX, Linux (CentOS), macOS, Mainframe (z/OS)

Automation Tools: Jenkins/Hudson, Bamboo

Configuration Tools: Chef, Terraform, Ansible, Puppet

Databases: SQL Server, Teradata, Redshift, Oracle

Virtualization Tools: Docker, VirtualBox, VMware

PROFESSIONAL EXPERIENCE

Company: Capital One, Chicago, Illinois June 2017 – Present

Role: Data Analyst / Cloud Engineer

Responsibilities:

Cloud Engineer: Built an internal Capital One tool called LEGOLAND, which uses containerization via Docker and AWS ECS to let users customize their environments.

Built an AWS Lambda architecture that monitors S3 buckets and triggers updates to the daily marketing one file.
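
A minimal sketch of this kind of S3-triggered Lambda update, assuming a Python runtime; the destination bucket and key names are hypothetical placeholders, not the actual configuration.

    import json
    import boto3

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # S3 put events arrive as a list of records naming the bucket and key.
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            # Hypothetical step: write the new upload into the daily marketing file.
            s3.put_object(
                Bucket="marketing-daily-output",      # placeholder bucket
                Key="daily/marketing-one-file.csv",   # placeholder key
                Body=body,
            )
        return {"statusCode": 200, "body": json.dumps("processed")}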

Hands-on experience in Amazon Web Services (AWS) provisioning and good knowledge of AWS services such as EC2, S3, Glacier, ELB, RDS, SNS, SWF, Lambda, and EBS.

Created the AWS VPC network for the installed instances and configured security groups and Elastic IPs accordingly.

Created scripts for system administration and AWS in languages such as Bash and Python; created Lambda functions to upload code and to check for changes in S3 buckets and DynamoDB tables.

Implemented systems that are highly available, scalable, and self-healing on the AWS platform.

Provisioned highly available EC2 instances using Terraform and CloudFormation, and wrote new plugins to support new functionality in Terraform.

Created CloudWatch alarms for instances and used them in Auto Scaling launch configurations.
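
As an illustration, a CloudWatch CPU alarm of the kind used with Auto Scaling can be created with Boto3 roughly as follows; the alarm name, group name, threshold, and policy ARN are hypothetical.

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Alarm when average CPU across an Auto Scaling group exceeds 70% for 10 minutes.
    cloudwatch.put_metric_alarm(
        AlarmName="asg-high-cpu",                    # placeholder name
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "AutoScalingGroupName", "Value": "web-asg"}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=70.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:autoscaling:..."],    # placeholder scaling policy ARN
    )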

Cloud Migration: Served as an early adopter in the cloud migration effort and built standards of practice for the entire marketing segmentation team to ensure a smooth transition from legacy mainframe (z/OS) and SAS to Amazon Web Services.

Designed and implemented cloud solutions with AWS Virtual Private Cloud (VPC), Elastic Compute Cloud (EC2), Redshift, S3, Auto Scaling, Elastic Container Service, CloudWatch, and other AWS services.

Wrote Python code, along with equivalent SAS code, for the transition from legacy mainframe (z/OS) and SAS to Amazon Web Services.

Worked with business professionals to construct the right architecture for cloud migration.

Data Analyst: Performed statistical analysis and built classification models to predict customer churn in the partnership portfolios.
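
A minimal sketch of a churn classification workflow of this kind, using pandas and scikit-learn; the specific library, file, and column names are assumptions for illustration.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Hypothetical portfolio data: one row per customer with a binary churn label.
    df = pd.read_csv("portfolio_customers.csv")                   # placeholder file
    X = df[["tenure_months", "monthly_spend", "num_products"]]    # placeholder features
    y = df["churned"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42, stratify=y
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))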

Performed predictive analysis and customer segmentation to identify the customers who generate the most revenue for marketing campaigns (the 80/20 rule), which significantly improved customer response.
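
The 80/20 segmentation can be expressed in a few lines of pandas; the file and column names here are illustrative.

    import pandas as pd

    # Hypothetical revenue data: one row per customer.
    df = pd.read_csv("customer_revenue.csv")        # placeholder file
    df = df.sort_values("revenue", ascending=False)

    # Cumulative share of total revenue, top spenders first.
    df["cum_share"] = df["revenue"].cumsum() / df["revenue"].sum()

    # Customers who together account for the first 80% of revenue.
    top = df[df["cum_share"] <= 0.80]
    print(f"{len(top) / len(df):.1%} of customers drive 80% of revenue")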

Built Python code for data analysis supporting campaigns such as promotions, new-user welcome letters, and birthday campaigns.

Proficient in “Know Your Customer” (KYC) Phase 4 and Phase 5 work; executed end-to-end KYC campaigns and resolved data anomalies.

Environment: AWS, Ruby, Python, Puppet, Ansible, APIs, Docker, Terraform, S3, Java/J2EE, Jenkins, Git, shell scripting, EC2, LEGOLAND, CentOS, Boto3, SAS, mainframes.

Company: Amazon, India May 2015 – Dec 2015

Role: AWS & Data Analyst

Responsibilities:

Conducted data extraction and exploratory data analysis on large customer-query data sets using AWS Redshift and Python (Boto3).
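
One common pattern for this kind of extraction, sketched below under assumptions: Boto3 locates the cluster endpoint, and psycopg2 (Redshift speaks the PostgreSQL wire protocol) runs the query into pandas. The cluster identifier, credentials, and table name are placeholders.

    import boto3
    import pandas as pd
    import psycopg2

    # Look up the cluster endpoint (cluster identifier is a placeholder).
    redshift = boto3.client("redshift")
    cluster = redshift.describe_clusters(
        ClusterIdentifier="analytics-cluster"
    )["Clusters"][0]

    conn = psycopg2.connect(
        host=cluster["Endpoint"]["Address"],
        port=cluster["Endpoint"]["Port"],
        dbname="queries", user="analyst", password="...",  # placeholder credentials
    )
    df = pd.read_sql("SELECT * FROM customer_queries LIMIT 100000", conn)
    print(df.describe())  # quick exploratory summary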

Designed highly available, cost-effective, and fault-tolerant systems using multiple EC2 instances, Auto Scaling, Elastic Load Balancing, and AMIs.

Partnered with the data science team to perform statistical analysis with tools such as SAS to evaluate the right survey questions for measuring customer experience.

Worked with business partners and key stakeholders to identify business needs and delivered high-exposure recurring and ad-hoc reports to answer business questions.

Used a data-driven approach to improve the customer experience response rate.

Created Tableau dashboards to track progress against the different approaches applied to improve customer experience.

Environment: Git, AWS, Windows, Solaris, UNIX, C++, Java, Eclipse 3.2.0, Redshift, Tableau, Jira, Python, SAS, SQL.

Company: Survey Research Organization – Springfield, Illinois Sept 2016 – May 2017

Role: Data Analyst Intern (AWS)

Responsibilities:

Designed and implemented a command-line tool using Python and Boto3 to perform AWS EC2 tasks with simple commands and flags.
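
A minimal sketch of such a command-line tool using argparse; the command names and flags are illustrative, not the tool's actual interface.

    import argparse
    import boto3

    ec2 = boto3.client("ec2")

    def main():
        parser = argparse.ArgumentParser(description="Simple EC2 helper")
        sub = parser.add_subparsers(dest="command", required=True)

        sub.add_parser("list")              # list instance ids and states
        stop = sub.add_parser("stop")       # stop one instance by id
        stop.add_argument("--instance-id", required=True)

        args = parser.parse_args()
        if args.command == "list":
            for reservation in ec2.describe_instances()["Reservations"]:
                for instance in reservation["Instances"]:
                    print(instance["InstanceId"], instance["State"]["Name"])
        elif args.command == "stop":
            ec2.stop_instances(InstanceIds=[args.instance_id])

    if __name__ == "__main__":
        main()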

Created an AWS Lambda architecture to monitor AWS S3 buckets and trigger a thumbnail-creation event whenever a user uploads a new picture.
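
A minimal sketch of that thumbnail pipeline, assuming Pillow is packaged with the Lambda function; the destination bucket naming scheme is a placeholder.

    import io
    import boto3
    from PIL import Image

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Download the uploaded image and shrink it in memory.
            data = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            image = Image.open(io.BytesIO(data))
            image.thumbnail((128, 128))

            buffer = io.BytesIO()
            image.save(buffer, format=image.format or "JPEG")

            # Write the thumbnail to a parallel bucket (placeholder naming scheme).
            s3.put_object(Bucket=bucket + "-thumbnails", Key=key, Body=buffer.getvalue())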

Designed and developed various machine learning frameworks using R.

Summarized and visualized conclusions from large-scale data sets to support the decision-making of internal business stakeholders, using SAS and Tableau.

Administered a research study related to the 2016 elections in the state of Illinois, along with studies for other clients such as the Realtors Association.

Environment: AWS, Git, Lambda, Python, Boto3, SAS, SQL, EC2.

Company: EMorphosys Solutions Pvt. Ltd., India Mar 2014 – May 2015

Role: Data Analyst

Responsibilities:

Made extensive use of SQL Server Management Studio to write and prepare queries for further data analysis.

Designed databases, stored procedures, and reports using SQL Server 2008 and 2014 and Excel.

Imported, exported and manipulated large data sets in multi-million-row databases under tight deadlines.

Environment: Databases, SQL, Exploratory Analysis.

EDUCATION

University of Illinois Springfield, Springfield, IL May 2017

Master of Science in Management Information Systems GPA: 3.8/4.0


