Post Job Free

Software Engineer

Location:
Costa Mesa, CA
Posted:
February 18, 2020

Contact this candidate

Resume:

RAHUL TALARI

*** ******** ** #**, ***** Mesa, CA – 92627 612-***-**** ***********@*****.***

Education

M.S. ENGINEERING MANAGEMENT, CALIFORNIA POLYTECHNIC UNIVERSITY (AUGUST 2019 - CURRENT)
M.S. COMPUTER SCIENCE, UNIVERSITY OF CALIFORNIA SANTA CRUZ (DECEMBER 2016)
B.S. COMPUTER ENGINEERING, UNIVERSITY OF ILLINOIS URBANA-CHAMPAIGN (MAY 2015)

Technical Skills

PROGRAMMING LANGUAGES

Python, C/C++

PLATFORMS AND TECHNOLOGIES

OS: Linux (Ubuntu, Red Hat, Fedora), macOS

Technologies: Git, Docker, Kubernetes, Ansible, AWS [S3, EC2, EB, Route53, API Gateway, Lambda, RDS, VPN, ELB, SQS, etc.], PyTorch, Flask REST - HTTP/HTTPS, MySQL, NoSQL, MongoDB, RabbitMQ, Terraform, Bash, Datadog, PagerDuty, Slack

INTERESTS

ML/AI, Deep Learning, Distributed Systems, Backend, Performance, Monitoring & Automation

RELEVANT COURSES

COMPUTER VISION STANFORD UNIVERSITY

Implemented CNN, RNN, LSTM, GRU, and GAN architectures from scratch; covered Transfer Learning, Data Augmentation, Object Detection, Adversarial Examples, Feature Visualization and Inversion, and Reinforcement Learning
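As an illustration of the kind of from-scratch implementation this coursework involved, here is a minimal NumPy sketch of a single GRU cell step. The dimensions, seed, and weights are toy values chosen for the example, not the actual coursework code:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    # update gate: how much of the new candidate state to take
    z = sigmoid(x @ Wz + h_prev @ Uz)
    # reset gate: how much of the previous state feeds the candidate
    r = sigmoid(x @ Wr + h_prev @ Ur)
    # candidate hidden state
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh)
    # interpolate between previous state and candidate
    return (1 - z) * h_prev + z * h_tilde

# toy example: 4-dim input, 3-dim hidden state, fixed seed
rng = np.random.default_rng(0)
x, h = rng.standard_normal(4), np.zeros(3)
Wz, Wr, Wh = [rng.standard_normal((4, 3)) for _ in range(3)]
Uz, Ur, Uh = [rng.standard_normal((3, 3)) for _ in range(3)]
h_next = gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh)
print(h_next.shape)  # (3,)
```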

NATURAL LANGUAGE PROCESSING STANFORD UNIVERSITY

Implemented Word2Vec, Word Embeddings, Bi-LSTM for Translation, Character-Level Modeling, Natural Language Generation, and Neural Parsing from scratch
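A Word2Vec implementation starts from (center, context) training pairs. A minimal sketch of skip-gram pair generation, illustrative only and not the coursework code:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs for skip-gram Word2Vec."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

pairs = skipgram_pairs(["the", "cat", "sat", "on", "mat"], window=1)
print(len(pairs))  # 8: the two edge words contribute 1 pair each, interior words 2
```

These pairs would then feed a two-layer network (or negative-sampling objective) that learns the embeddings.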

MACHINE LEARNING UNIVERSITY OF CALIFORNIA SANTA CRUZ

Implemented Causal Inference, Graphical Models, and D-Separation

Recent Work Experience (3 years)

SOFTWARE ENGINEER, DATA INFRASTRUCTURE WISER SOLUTIONS AUGUST 2018 – APRIL 2019

• Developed a distributed microservices architecture for the scraped-product ETL stack and scaled it within a $2M infrastructure budget

• Improved data ingest quality and achieved a 3x speedup through automated monitoring and performance metrics with Datadog, Terraform, Python, and AWS

• Built and deployed a high-availability RabbitMQ architecture (previously a single node) to run ML algorithms on payload triggers

SENIOR SOFTWARE ENGINEER TOSHIBA MEMORY AMERICA SEPTEMBER 2017 – AUGUST 2018

• Worked on the Flashmatrix research team focused on edge/IIoT real-time analytics

• Designed and developed ML models to integrate IBM’s cloud edge platform (DSX) with our platform

• Developed dynamic storage provisioning at the application layer for the custom hardware beneath the Kubernetes layer

• Implemented the entire automation and configuration framework for the base OS image, dependencies, scripts, and RTL

MACHINE LEARNING RESEARCH ENGINEER PERCOLATA SEPTEMBER 2016 – SEPTEMBER 2017

• Built and scaled time-series Deep Learning models (Seq2Seq, LSTM) to analyze customer traffic from storefront cameras

• Implemented data-engineering algorithms to fill in missing data, improving accuracy by 18%

• Automated the infrastructure deployment using Docker and AWS
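The missing-data step above can be sketched with pandas time interpolation on a toy traffic series. The series, values, and frequency are hypothetical; the actual algorithms used at Percolata are not shown here:

```python
import numpy as np
import pandas as pd

# hypothetical hourly customer-traffic counts with gaps
ts = pd.Series([10.0, np.nan, 14.0, np.nan, np.nan, 20.0],
               index=pd.date_range("2017-01-01", periods=6, freq="h"))

# time-weighted linear interpolation fills the missing points
filled = ts.interpolate(method="time")
print(filled.isna().sum())  # 0
```

Time-weighted interpolation is one simple baseline; seasonal or model-based fills are common refinements for traffic data.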

SOFTWARE DEVELOPMENT ENGINEER NATIONAL CENTER FOR SUPERCOMPUTING APPLICATIONS SEP 2014 – SEP 2015, MAY 2014 – SEP 2014, JAN 2013 – OCT 2013

• Performed ETL and automation on healthcare data to analyze public-health hazard indicators (Bash, Python, R)

• Built the universal-accessibility protocol for the website in accordance with Illinois state law (AJAX, JS, HTML5/CSS)

• Created a web application for digital content from paper magazines via ColdFusion (HTML5/CSS, ColdFusion, PHP, Adobe)

Research (1 year)

GRADUATE RESEARCHER PROF. SESHADHRI COMMANDUR (PRINCETON) SEP 2016 – DEC 2016

• Mined directed triangle subgraph patterns in large-scale social and web networks with millions of edges

• Achieved 5-10x speedups in our experiments with a state-of-the-art algorithm using smart indexing and dynamic referencing

• Code: www.bitbucket.com/rahulraju93/escape (C++, Python)
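The linked ESCAPE code counts directed motifs in C++; as a hedged illustration of the degree-ordering ("smart indexing") idea behind fast triangle counting, here is a minimal undirected sketch (not the project's code):

```python
from collections import defaultdict

def count_triangles(edges):
    """Count triangles via neighbor-set intersection after degree ordering:
    orient each edge toward its higher-(degree, id) endpoint so every
    triangle is enumerated exactly once."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    # rank nodes by (degree, id); keep only out-neighbors of higher rank
    rank = {u: (len(adj[u]), u) for u in adj}
    out = {u: {v for v in adj[u] if rank[v] > rank[u]} for u in adj}
    # each common out-neighbor of an oriented edge closes one triangle
    return sum(len(out[u] & out[v]) for u in out for v in out[u])

# a 4-clique contains exactly 4 triangles
clique4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(count_triangles(clique4))  # 4
```

The ordering bounds each vertex's out-degree, which is what makes this approach fast on skewed real-world graphs.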


GRADUATE RESEARCH ASSISTANT STORAGE SYSTEMS RESEARCH CENTER JANUARY 2016 - SEPTEMBER 2016

• Researched benchmarking of block-based storage systems using realistic content generation and distributions; Lenovo and HP were interested in trying it for their secondary backup storage

• Used Deep Learning to model data distributions in storage sizes of 10-100 GB, characterizing access patterns and timing to model DB, cache, RAM, and other storage characteristics

• Expanded DEDISbench (https://github.com/jtpaulo/dedisbench; C, Bash) to use an entropy-based dynamic content generator grounded in information theory
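The entropy-based generation idea can be sketched in a few lines of Python (illustrative only; DEDISbench's actual generator is in C): drawing bytes uniformly from a restricted alphabet yields content whose empirical Shannon entropy approaches log2(alphabet size) bits per byte, so the alphabet size acts as an entropy dial.

```python
import math
import random

def shannon_entropy(data):
    """Empirical Shannon entropy of a byte string, in bits per byte."""
    counts = {}
    for b in data:
        counts[b] = counts.get(b, 0) + 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def generate_block(size, alphabet_size, seed=42):
    """Generate `size` bytes uniformly over `alphabet_size` symbols;
    entropy approaches log2(alphabet_size) bits per byte."""
    rng = random.Random(seed)
    return bytes(rng.randrange(alphabet_size) for _ in range(size))

block = generate_block(4096, 16)  # target entropy: log2(16) = 4 bits/byte
print(round(shannon_entropy(block), 2))
```

Tuning generated-content entropy this way matters for storage benchmarks because compression and deduplication behave very differently on low- versus high-entropy data.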


