
Team Lead Scrum Master

Location:
Justin, TX
Posted:
May 18, 2024

Meet Patibandha

E-mail: ad5ssc@r.postjobfree.com

Phone: +1-323*******

PROFESSIONAL SUMMARY

** ***** ** **** ******** experience. Currently working as a technology team lead with cutting-edge tech.

Improve code quality within the team by leading by example.

Improve tech team performance by setting coding standards, organizing work, establishing rapid development with Scrum, and mentoring team members.

Experienced with the full software development life cycle, architecting scalable platforms, object-oriented programming, database design, and Agile methodologies.

Involved as Scrum Master in setting sprint goals and running sprint demos, sprint retrospectives, backlog grooming, and daily standups.

Skills

Languages: Python, C#, C, C++, JavaScript, SQL, Perl

Cloud: Azure, AWS, GCP

Databases: MySQL, Oracle, MongoDB, SQLite, PostgreSQL, Hadoop, SequoiaDB

Frameworks: Django, Flask, Jinja, Oozie, RabbitMQ, H2O Wave, Plotly Dash

Application Servers: Apache, AWS, GCP, Azure

Design Tools: UML, MS Visio

Development Tools: Visual Studio, Sublime Text, PyCharm, Eclipse

Version Control: Git, Bitbucket, Mercurial

Testing Frameworks: pytest, mock, PyUnit, nose

Search: Solr search, Sequoia search

Deployment Tools: CircleCI 1.0 and 2.0, Jenkins, Ansible, Docker

GitHub: https://github.com/Patibandha/

LinkedIn: https://www.linkedin.com/in/meet-patibandha-1591b4143/

Education

Bachelor of Engineering in Information Technology (Saurastra University, 2011, India)

Master of Science in Computer Science (Pacific State University, 2016, USA) (3.6 GPA)

Work Experience

Shutterfly, Plano, TX May 2023 to Current

Sr. Architect / Director (GCP)

Project Description:

Design and develop distributed, microservice-based projects that support the live production environment.

Responsibilities:

Design and develop event-driven microservice applications to support the Shutterfly production floor.

Work in an onshore-offshore model, successfully leading teams and projects.

Work on an AI LLM project that enables generative AI features.

Convert .csv product-catalog data and vectorize it into a vector DB (a minimal sketch follows).
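
Below is a hedged sketch of what a catalog-vectorization step like this can look like; the file name, column names, and the sentence-transformers model are illustrative assumptions, not details from the project.

    # Sketch: read product rows from a .csv and embed them for a vector-DB upsert.
    # "product_catalog.csv", its columns, and the model name are assumptions.
    import csv

    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer("all-MiniLM-L6-v2")

    with open("product_catalog.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Build one text per product from hypothetical 'name'/'description' columns.
    texts = [f"{r['name']}: {r['description']}" for r in rows]
    vectors = model.encode(texts, normalize_embeddings=True)
    # vectors[i] would then be upserted into the vector DB keyed by the product id.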

Run the vectorized data and feed it to the LLM to build a Q&A model.

Work on LLM TruLens evaluation, which generates Q&A accuracy metrics and improves results based on feedback.

Work with business stakeholders to understand business needs, then convert business requirements into technical work by providing feasible solutions with POCs and prototypes.

Responsible for decision making across the Production platform on standards, resources, and overall direction of work through proactive partnership with executives across technology and business.

Groom and motivate the team technically to overcome any challenges.

Maintain and share knowledge and expertise on the evolving needs of clients, internal and external markets, and industry trends and changes across the production community.

Design and develop Databricks data pipelines for data-consumer microservices, enabling data to be extracted and used by different applications in production.

Use Apache Airflow to run scheduled jobs and execute DAG files in the cloud environment (a minimal sketch follows).
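
As an illustration only, a minimal Airflow 2.x DAG for a daily scheduled job might look like this; the DAG id, task, and schedule are hypothetical.

    # Minimal Airflow DAG sketch: one daily task (all names are illustrative).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_catalog():
        # Placeholder task body standing in for the real pipeline step.
        print("extracting catalog data")

    with DAG(
        dag_id="catalog_pipeline",      # hypothetical DAG name
        start_date=datetime(2023, 5, 1),
        schedule_interval="@daily",     # cron strings such as "0 6 * * *" also work
        catchup=False,
    ) as dag:
        PythonOperator(task_id="extract_catalog", python_callable=extract_catalog)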

Worked on a Python/Angular tech stack with expertise in Python-based API frameworks, utilizing automation tools/frameworks and deploying via continuous-delivery infrastructure (Jenkins).

Use Snowflake DB for transactions.

Designed and implemented API design & development and microservices architecture.

Deploy and maintain ongoing development and new features in GKE.

Deep understanding of Agile and of strategies for maximizing the methodology's effectiveness through best practices, coaching, and continuous improvement.

Tools and skills: Python, GCP DevOps, Camunda, Jenkins, etc.

AT&T, Plano, TX April 2021 to May 2023

AI/ML Lead/ Architect (AWS & Azure)

Project Description:

Design and develop AI/ML projects.

Responsibilities:

Work on predictive AI models based on current scenarios.

Work with various business stakeholders to understand business needs, and work with my technical team to deliver on those requirements.

Source data from a variety of data sources such as Hadoop big data, Snowflake, MongoDB, etc.

Wrote big queries to source and filter data and feed the data pipeline.

Automated the big-query pipeline.

Develop event-driven microservice components with Lambda and Step Functions (a minimal handler sketch follows).
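
A minimal sketch of an event-driven Lambda handler follows; the event fields are assumptions, since real payloads depend on the trigger (a Step Functions state passes the previous state's output).

    # Hedged sketch of a Python Lambda handler invoked by a Step Functions state.
    import json

    def handler(event, context):
        # 'record_id' is a hypothetical field, not a documented part of the project.
        record_id = event.get("record_id")
        # ... process the record here ...
        return {"statusCode": 200, "body": json.dumps({"processed": record_id})}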

Calculate best-case prediction models using PyTorch and TensorFlow algorithms.

All algorithms run on Ray cluster nodes in the cloud platform; Ray is the distributed framework also used by the OpenAI (ChatGPT) group.

The more the prediction pipeline runs, the better the results we see (more positive predictions).

All data pipelines collect data in a MongoDB instance running on the Azure cloud; I created and managed the MongoDB instance myself.

Apply an advanced understanding of common AI, ML, and data-science methods, and deep familiarity with natural language processing techniques.

Worked with generative AI to generate data points and simulations.

For the LLM model, convert data and documents from different sources into a vector DB and a graph DB.

Consume data from Snowflake DB.

Take templates from the application, create DAG files, and run them as scheduled cron jobs in Airflow.

Created a DB pipeline in Databricks to feed data to Snowflake.

Develop the front-end API layer in React.js to support the application's APIs.

Design and architect data-engineering projects (ACO, SUD, PAMM) on the Azure DevOps platform.

Worked on an ML module using a Ray cluster in AWS to run simulations and train models with the PyTorch and TensorFlow modules to estimate case probabilities (see the sketch below).
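
For illustration, this is one minimal way to fan simulations out on a Ray cluster; the task body is a stand-in for a real PyTorch/TensorFlow training run.

    # Minimal Ray sketch: run independent simulations in parallel on a cluster.
    import random

    import ray

    ray.init()  # on an existing cluster this would be ray.init(address="auto")

    @ray.remote
    def simulate(seed: int) -> float:
        random.seed(seed)
        return random.random()  # placeholder for a model-training/simulation run

    results = ray.get([simulate.remote(s) for s in range(8)])
    print(max(results))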

The projects source data from various platforms into their respective data lakes.

Migrated old batch procedures to data pipelines that stream data in Azure DevOps.

Deploy microservice applications in Azure AKS.

Build DevOps pipelines to automate code deployment and manage the CI/CD pipeline.

Use an API gateway to expose APIs running on K8s clusters.

Lead a team of back-end, front-end, and DevOps engineers to build and deploy ongoing development.

The project is written and maintained on the Databricks platform to deliver solutions quickly.

Extensively work on testing the back end with unit tests and the front end with Selenium tests.

Develop performance tests using JMeter.

Design and develop different test-case scenarios.

Integrated Prometheus monitoring with the Dynatrace monitoring tool.

Lead the team by example, providing grooming and guidance to all fellow teammates.

Working with data science on the Plotly Dash and H2O Wave applications.

Cloud orchestration is achieved with Terraform, which is also developed as an automated tool.

H2O Wave and Plotly Dash are, in a nutshell, both based on the Flask web framework.

Build a back-end pipeline in Databricks that performs ETL operations (a PySpark sketch follows).
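
A minimal PySpark sketch of such an ETL pipeline follows; the paths and column names are assumptions for illustration.

    # Extract-transform-load sketch in PySpark (Databricks-style mounts assumed).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

    raw = spark.read.json("/mnt/raw/events/")             # Extract (hypothetical path)
    clean = (raw
             .filter(F.col("event_type").isNotNull())     # Transform: drop bad rows
             .withColumn("day", F.to_date("timestamp")))  # Transform: add partition key
    clean.write.mode("overwrite").parquet("/mnt/curated/events/")  # Load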

Tools and skills: Python, Azure DevOps, JMeter, Selenium, PySpark, Databricks, SQL, NiFi.

Amgen, Woodland Hills, CA October 2019 to March 2021

Sr. Technical Lead (AWS)

Project Description:

Design, develop, and support an automated project used by global data-science teams.

Responsibilities:

Design and architect a global data-science tool built on Python/Django.

Lead a team of back-end, front-end, and DevOps engineers to build and deploy ongoing development.

Design a microservice solution for the global data-science workbench.

Set coding standards, establish TDD in the work culture, groom team members, and provide the right motivation to work on the solution.

All components are deployed on EC2 nodes of EKS.

Develop Python-based microservice components that write DAG files and use the provided Docker images.

Build a CI/CD pipeline using Jenkins to automate the deployment process.

Provide a Databricks platform for writing complex algorithms in PySpark notebooks.

Develop a custom Docker image to onboard and launch projects in an RStudio session from the Django project (see the sketch below).
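
One plausible shape for launching such an RStudio container from Python code (as a Django view might) is sketched below, using the Docker SDK; the image, port, and password are assumptions.

    # Hedged sketch: start an RStudio Server container from Python.
    import docker

    client = docker.from_env()
    container = client.containers.run(
        "rocker/rstudio",              # illustrative image choice
        detach=True,
        ports={"8787/tcp": 8787},      # RStudio Server's default port
        environment={"PASSWORD": "change-me"},  # required by the rocker image
    )
    print(container.short_id)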

The projects source data from various platforms into their respective data lakes.

Migrated old batch procedures to data pipelines that stream data in Azure DevOps.

The project is written and maintained on the Databricks platform to deliver solutions quickly.

Extensively work on testing the back end with unit tests and the front end with Selenium tests.

Develop performance tests using JMeter.

Integrated automated code review via SonarQube into the CI pipeline.

Design and develop different test-case scenarios.

Lead the team by example, providing grooming and guidance to all fellow teammates.

Develop Python-based microservice components that write DAG files and use the provided Docker images to run the automated cartridge console mechanism.

All components run in EC2 instances.

S3 buckets contain all the Docker image templates to pull from.

Integrated Prometheus monitoring with the Dynatrace monitoring tool and incorporated logging tracebacks for microservices running on the K8s platform.

Wrote cartridge tools in Python/Django that import data from different webhooks.

Also normalized the imported raw data.

Develop an automated Airflow orchestration tool that writes DAG files from the Python/Django application and runs in a containerized environment.

It eliminates Airflow console work as well as any manual event triggering (a minimal DAG-writer sketch follows).
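
A hedged sketch of the DAG-writing idea: render a DAG file from a template and drop it into the Airflow DAGs folder. The template, output path, and field values are assumptions.

    # Sketch: generate an Airflow DAG file from an application template.
    from jinja2 import Template

    DAG_TEMPLATE = Template(
        'from datetime import datetime\n'
        'from airflow import DAG\n'
        'dag = DAG("{{ dag_id }}", start_date=datetime(2020, 1, 1), '
        'schedule_interval="{{ cron }}", catchup=False)\n'
    )

    # Hypothetical output path and values; the Django application would supply these.
    with open("/opt/airflow/dags/generated_dag.py", "w") as f:
        f.write(DAG_TEMPLATE.render(dag_id="cartridge_job", cron="0 2 * * *"))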

Cloud orchestration is achieved with Terraform, also built as an automated tool: it has prewritten TF templates and adds components as needed for any build.

Integrated Sphinx to automate documentation generation.

Tools and skills: Python, Django, Sphinx, PostgreSQL, Airflow, AWS, S3, EC2, Lambda, SQS, API Gateway, Terraform.

Corelogic, Oakland, CA Jan 2019 to October 2019

Sr. Software Engineer (GCP)

CoreLogic, Inc. is an Irvine, CA-based corporation providing financial, property and consumer information, analytics and business intelligence. The company analyzes information assets and data to provide clients with analytics and customized data services.

Project Description:

Working on GCP to develop and automate an image factory that bakes OS images for the containerized environment.

Responsibilities:

Working on the cloud image factory, a Google Cloud Platform application based on Python, Terraform, Ansible playbooks, and PostgreSQL.

Design and develop the GCP image factory in Python 3.6, a multithreaded application that bakes all OS images and integrates all company policies into the OS.

Develop less fragile code with a sufficient amount of logging and exception handling.

To optimize resource use, develop reusable, modular code that can be consumed multiple times within the project.

The project is deployed and runs on GCP instances.

Use Terraform to automate deployment to GCP App Engine.

Working on automating project build deployment through Jenkins.

Working closely with the project architect to improve the design of the project and enhance functionality.

Involved in day-to-day standups, sprint planning, and sprint reviews.

Also involved in reviewing co-workers' committed code.

Analyzing user requirements and defining functional specifications using Agile and Extreme Programming methodologies.

Maintaining version control using GitHub.

Being a technical resource for direct communication with team members in the program development, testing, and implementation process.

Documenting modifications and enhancements made to the applications, systems, and databases as required by the project.

Wrote a multithreaded watchdog for the image factory to communicate effectively with virtual machines while the project runs in GCP (a minimal sketch follows).
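
A minimal sketch of the multithreaded-watchdog idea follows; the VM names and the check body are placeholders, not project details.

    # Sketch: one polling thread per watched VM (check logic is illustrative).
    import threading
    import time

    def watch(vm_name: str, interval: float = 30.0) -> None:
        while True:
            # ... poll the VM / image-bake status here ...
            print(f"checking {vm_name}")
            time.sleep(interval)

    for name in ("bake-vm-1", "bake-vm-2"):   # hypothetical VM names
        threading.Thread(target=watch, args=(name,), daemon=True).start()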

Built a streaming pipeline in Kafka that sources data from Oracle, IBM DB2, and Spanner DB and streams it through Kafka to the Python application (see the consumer sketch below).
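
The Python consumer end of such a pipeline could look roughly like this, using the kafka-python client; the topic name and broker address are assumptions.

    # Hedged sketch: consume JSON records streamed through Kafka.
    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "image-factory-events",                # hypothetical topic name
        bootstrap_servers="localhost:9092",    # hypothetical broker address
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    for message in consumer:
        # Hand each record sourced from Oracle/DB2/Spanner to the application.
        print(message.value)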

Maintain services that populate resource data for Kafka; the services are developed in Python with Flask, Node.js, and Java with Maven.

The project is deployed and runs on GCP stateless VMs.

Propose PoCs (proofs of concept) for the application design, and design and develop prototype applications.

Working in a Kanban rapid-development environment.

Working closely with the product owner, project manager, and QA team to resolve any bugs and issues in the existing code.

Help discover the design and methodology for development.

Represent the development team.

Coordinate with onshore and offshore team members to assign and follow up on work, and review finished work as well.

Tools and skills: Python, PostgreSQL, Docker, GCP, PyCharm, shell scripting, Windows, Linux, Git, Jira, Scrum methodology.

Elsevier/Bepress, Berkeley, CA Dec 2017 to Dec 2018

Sr. Python Developer

Bepress exists to serve academia. We empower academic communities to showcase their works and expertise for maximum impact. Through our services bepress seeks to link communities of scholars, listen to their needs, and provide solutions to support emerging academic missions and goals.

Project Description:

Continuous integration and development for DCN (Digital Commons Network): developed automated testing for the project and migrated Django 1.4 to 1.11; automated deployment; also worked on API development for the Expert Gallery Suite and wrote unit test cases for the project.

Responsibilities:

Primarily working on DCN (Digital Commons Network) and EGS (Expert Gallery Suite), which are Python-based applications.

For DCN, developed end-to-end testing for the application and updated the working Django version from 1.4 to 1.11, the latest Django version that supports Python 2.7.

On the SWNG application, initiated a TDD (test-driven development) approach to update the REST API built on DRF (Django REST Framework).

Improved some front-end functionality using JS.

Analyzing user requirements and defining functional specifications using Agile and Extreme Programming methodologies.

Act as Scrum Master for a team of 3: involved in team grooming, assigning sprint tickets among the team, managing the backlog on the Jira board, setting sprint goals, and running sprint reviews.

Developing and building applications using Python, Django, PostgreSQL, Docker, RabbitMQ, and Amazon AWS.

Created an AWS EC2 instance to host the application in AWS; also created an S3 bucket as the data bucket for Python scripts, and created an AWS Lambda to run Python scripts for the EGS project (a boto3 sketch follows).
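
For illustration, the S3 part of that setup might look like the boto3 sketch below; the bucket name, region, and file are assumptions.

    # Hedged boto3 sketch: create a data bucket and upload a Python script.
    import boto3

    s3 = boto3.client("s3", region_name="us-west-2")
    s3.create_bucket(
        Bucket="egs-data-scripts",  # hypothetical bucket name
        CreateBucketConfiguration={"LocationConstraint": "us-west-2"},
    )
    s3.upload_file("script.py", "egs-data-scripts", "scripts/script.py")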

Configure an AWS EC2 instance as a test environment for QA, and configure .yml files to set up Docker containers in the EC2 instance.

Leading multiple modeling, simulation, and analysis efforts to uncover the best Python-based solutions.

Creating and consuming RESTful web services.

Developing and implementing test validations for the applications, and building a sufficient amount of test coverage for the application.

Analyzing test results and recommending modifications to the applications to meet project specifications, using Jira.

Maintaining version control using GitHub.

Being a technical resource for direct communication with team members in the program development, testing, and implementation process.

Documenting modifications and enhancements made to the applications, systems, and databases as required by the project.

Tools and skills: Python, Django, PostgreSQL, Docker, RabbitMQ, Amazon AWS, HTML, CSS, JavaScript, PyCharm, shell scripting, macOS, Linux, Git, Jira, Atlassian, Scrum methodology, Solr search, SequoiaDB.

Apple Inc, Sunnyvale, CA January 2016 to Dec 2017

Python Developer

Apple Inc. is a technology company; it designs, develops, and sells consumer electronics, computer software, and online services.

Project Description:

On the GBI platform, the client needed an ETL API developed in Python to mine data from different databases, plus a new database schema to store all the mined data.

Responsibilities:

Design and develop ETL APIs that automate data mining across different database sources.

Use the TCP/IP protocol to connect to the servers whose databases the ETL tool needs to reach.

Use PySpark to interact with Hadoop and work with larger data sets.

The application also watches the data-mining workflows and service health checks, and integrates HQL and SQL queries.

Create custom DB APIs in Python to connect to the database and write SQL procedures to mine data (a hedged sketch follows).
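
A hedged sketch of such a DB helper follows, using cx_Oracle since the mined data lands in Oracle DB; the DSN, credentials, and SQL are assumptions.

    # Sketch: a small reusable query helper for the mining APIs.
    import cx_Oracle

    def fetch_rows(dsn: str, sql: str, params: dict):
        # Connection and cursor close automatically when the blocks exit.
        with cx_Oracle.connect(user="miner", password="secret", dsn=dsn) as conn:
            with conn.cursor() as cur:
                cur.execute(sql, params)   # named binds like :r come from params
                return cur.fetchall()

    rows = fetch_rows("dbhost/XEPDB1",
                      "SELECT * FROM sales WHERE region = :r",
                      {"r": "west"})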

Validate all mined data and store it in Oracle DB.

Migrated the Hadoop DB from the mainframe to the GCP cloud.

Migrated through Hive and validated the data through a Python application.

Unit test all developed modules using the PyUnit and nose test frameworks, and develop custom scenarios to test worst cases for the application.

Use pandas to work with large datasets.

For version control, I used GitLab.

Maintain all job reports and job health; periodically generate email to the upper management team and the associated project developers regarding job status.

Design and develop a prototype Python agent that works independently with any Python web application.

Make a TCP/IP connection, with the help of the socket module in Python, to an external Java collector to monitor application performance (a minimal sketch follows).
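
A minimal socket sketch of that agent-to-collector link follows; the host, port, and payload format are assumptions.

    # Sketch: push one metric to an external collector over TCP/IP.
    import json
    import socket

    with socket.create_connection(("collector.internal", 9400), timeout=5) as sock:
        payload = json.dumps({"metric": "request_latency_ms", "value": 42})
        sock.sendall(payload.encode("utf-8") + b"\n")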

Use dynamic instrumentation and runtime monkey-patching to change method/class behavior (see the sketch below).
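
For illustration, runtime monkey-patching in Python can wrap an existing method without touching its source; the target class here is hypothetical.

    # Sketch: patch a method at runtime to add timing instrumentation.
    import time

    class Service:
        def handle(self, request):
            return f"handled {request}"

    _original = Service.handle

    def timed_handle(self, request):
        start = time.perf_counter()
        try:
            return _original(self, request)
        finally:
            print(f"handle took {time.perf_counter() - start:.6f}s")

    Service.handle = timed_handle          # the patch, applied at runtime
    print(Service().handle("req-1"))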

I used Git for version control for this build.

Develop an interactive front end to visualize data from the back end and present it graphically for better user understanding.

Design and develop a modular tool and microservices that trigger the different workflows that mine data from the databases.

The application interacts with a number of servers to fetch workflows and start mining jobs.

If anything changes in any workflow or any database query, it detects the change, starts a new job, and logs it.

This tool dumps all mined data into Oracle from various data-mining tools like Hadoop, and it is integrated with Oozie workflows.

Tools and skills: Python, PyCharm, Hadoop, Oozie, Eclipse, Sublime, Linux, shell scripting, macOS, GitLab.

Maahi Milk Producer Company, Rajkot, India Dec 2011 to Dec 2014

Software Engineer

Maahi is a subsidiary of NDDB (National Dairy Development Board). It is one of the first producer-owned milk companies in India, and a pioneer in moving all milk data and transactions online in India.

Project Description:

Worked as a software engineer on various projects: responsible for the milk-transaction database in MS SQL, part of the team implementing SAP in the company, and part of the development and deployment of the ration-balancing software.

Responsibilities:

Responsible for all milk transactions stored in MS SQL; part of the team running the ETL process to bring all data into the database.

In the early days, I generated custom reports from MS SQL using SQL queries.

Involved in an ASP.NET application known as IMMS (Integrated Milk Management System) for generating reports from the company's database, built in C# with a front end in HTML and CSS. We used XML to fetch data from the DB.

Part of the development and implementation of SAP in the company; for this, I took full life-cycle training on the SAP ABAP module.

Looked after the MM and FICO modules for the implementation.

Part of the development team for the ration-balancing application, teaming up with various milk companies; the application was developed at NDDB HQ in Anand, Gujarat, India, using C#, HTML, and CSS, with Crystal Reports for reporting.

Assigned to train veterinary doctors and their users, and deployed the application on their local machines.

Tools and skills: C#, MS SQL, Visual Studio, Visio, HTML, CSS, XML, Crystal Report.

Dotcom Services (India) Pvt. Ltd., Mumbai, India Nov 2010 to Nov 2011

ASP.NET Developer

Dotcom Services (India) Pvt. Ltd. is a web-solutions provider, offering web development as well as web-hosting services.

Project Description:

Part of the development team that built back-end logic for interactive web pages in ASP.NET using C#.

Responsibilities:

Assigned to develop interactive websites using C#.

Responsible for back-end development in a Visual Studio 2005/2008 environment; designed and developed databases in MS SQL 2005/2008.

Worked in a team developing the back end with C#, designing the database, and creating a database schema for the web application that matched the company's requirements.

Excellent coordination of project activities, ensuring that all project phases were followed and documented properly.

Integrated different web services, such as SMS, into the admin website to create bulk messages for website users.

Tools and skills: C#, MS SQL, Visual Studio, Visio, HTML, CSS.

Projects worked on while pursuing my Master's degree

Master's Degree Project 1, worked on in 2015 (3 months)

Java developer, DB schema designer, UI developer

Project Description:

Worked on a JSP application in a team of three. Developed responsive web pages for a hotel-booking website.

Responsibilities:

Designed all web pages and the back-end algorithms for the project, as well as the DB schema in the NoSQL DB MongoDB.

Designed a responsive front end using Bootstrap, working with my team to write custom CSS classes for an attractive front end.

Designed and developed the back-end logic, writing Java code for quick, fast web pages.

Integrated Google Maps into it.

Master's Degree Project 2, worked on in 2015-2016 (4 months)

Python developer, DB schema designer

Project Description:

Worked on a Django application in a team of two. Developed responsive web pages for a social-networking website.

Responsibilities:

Designed the MVC structure for the application.

Designed a RESTful API to work with the responsive front end.

Designed and wrote the back-end logic, including custom views for the application.

Wrote the database schema as Django models that interact with PostgreSQL (a minimal sketch follows).
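
A hedged sketch of what such Django models might look like for a social-networking schema; the model and field names are assumptions.

    # Sketch: Django models backing PostgreSQL via the ORM.
    from django.db import models

    class Profile(models.Model):
        username = models.CharField(max_length=50, unique=True)
        friends = models.ManyToManyField("self", blank=True)

    class Post(models.Model):
        author = models.ForeignKey(Profile, on_delete=models.CASCADE)
        body = models.TextField()
        created_at = models.DateTimeField(auto_now_add=True)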

Included a live-chat option with friends.


