
Software Development, Data Science

Location:
VasanthaNagar, Karnataka, 560001, India
Posted:
April 01, 2024


Resume:

Sapan Sharma

+91-986-***-**** ad4o9w@r.postjobfree.com

** ***** ** ***** ********** in Software Products; AIR 330 in GATE-2008

* ***** ** ******* *** sustaining winning Products

●Helped create B2B and B2B2C products and services that use DevOps and Data Science capabilities (such as machine translation, prediction & diagnostics)

●Rich experience forming software development teams; hands-on in coding, architecture design, business networking and client relationships; exposure to solution offerings in Data Science, Virtual Reality, Cloud SaaS, Conversational AI & DevOps

●Wide experience with the Software Development Lifecycle (SDLC) and Scrum, with implementations across different business processes

●Certified Scrum Product Owner

●Regular volunteer for ecological restoration causes: reforestation, lake cleanup, etc.

Professional Experience Details:

Senior Technical Program Manager

Nu10 Technologies

October 2022 - Present

(1 year, 5 months)

My Responsibilities:

1. I work with product management to oversee product development for two products.

2. I lead the customer engagement practice for a few major customers.

Technical Architecture:

Our products are apps and APIs built on the MERN stack. The backend and middleware layers also employ the ELK stack, along with an event management system built on Apache Kafka.

The AI-based modules consist of the following:

1. Phenom's proprietary conversational AI, built on top of OpenAI's GPT-2. Models are served to upstream applications through Django REST APIs; apps consuming these models are built in React.

2. A matching functionality scores candidates against job descriptions: each candidate is given a match score for a job description. This is achieved by parsing resumes and job descriptions using NER to generate training samples; BERT sentence-pair classification is then used to rank resumes by their suitability to the job description (an illustrative scoring sketch appears below).

3. Data ingestion and processing pipelines (to load jobs, candidates and applications data from client systems) are built using Apache Spark; the database is MongoDB; every service consumes and produces events through the Kafka server. The Spark server, for example, has a producer written in PySpark that sends a 'data load complete' event to a Kafka topic (an illustrative producer sketch appears after the DevOps note below).

Alongside product development, I own end-to-end execution of large business programs, ensuring optimal product setup and smooth delivery, business objectives being met, ROI being realized and assets staying in check. I make decisions on work priorities, keeping a bird's-eye view of all development activities and tech improvements across teams in the organization.
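
The candidate-job matching step above can be pictured with the following minimal sketch. It assumes a Hugging Face Transformers sentence-pair classifier; the model name, label layout and preprocessing are illustrative placeholders, not Phenom's actual implementation.

```python
# Illustrative only: score a (resume, job description) pair with a BERT
# sentence-pair classifier. Model name and label meanings are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # placeholder; a fine-tuned matcher would be used in practice
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

def match_score(resume_text: str, jd_text: str) -> float:
    """Return the probability that the resume matches the job description."""
    inputs = tokenizer(resume_text, jd_text, truncation=True,
                       max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits, dim=-1)
    return probs[0, 1].item()  # assumes label index 1 means "match"

# Candidates would then be ranked by this score for a given job description.
```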

The DevOps architecture is built partly with Jenkins CI and Ansible playbooks, and partly with Kubernetes and Ansible playbooks. The cloud infrastructure uses AWS (EKS, EC2).
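
The 'data load complete' event mentioned above could be published with something like the sketch below. It uses the kafka-python client rather than PySpark's own integration, and the broker address, topic name and payload schema are assumptions for illustration.

```python
# Illustrative only: a Spark ingestion job signals completion by publishing
# an event to Kafka. Broker, topic and payload fields are placeholders.
import json
from kafka import KafkaProducer  # kafka-python client

producer = KafkaProducer(
    bootstrap_servers=["kafka:9092"],  # placeholder broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def signal_data_load_complete(job_name: str, record_count: int) -> None:
    """Publish a 'data load complete' event after a Spark load finishes."""
    producer.send("data-load-events", {   # hypothetical topic name
        "event": "data load complete",
        "job": job_name,
        "records": record_count,
    })
    producer.flush()

# e.g. called at the end of the PySpark ingestion job:
# signal_data_load_complete("candidates_load", loaded_df.count())
```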

My day-to-day work:

From a holistic viewpoint of the customer's solution & technology ecosystem, I spearhead client engagement for maximum value realization.

We deal with customer issues on a daily basis. I keep a vigilant eye on the process pipeline to ensure data safety and that checks and balances are in place at the right points at all times.

I work with senior leadership in the organization and on the customer side to maintain confidence and transparency and to deliver lasting results.

Engineering Manager: North America Enterprise

OpsMx

April 2017 - March 2022

(5 years)

I was a part of the core team at OpsMx. I owned software development efforts for the products Open Enterprise Spinnaker and Autopilot.

Technical architecture:

We forked Kayenta, the analysis engine of Netflix's popular CI/CD platform Spinnaker, to provide enterprise-managed software and support services to our clients.

The analysis is an implementation of the following three algorithms:

1. Algorithm 1 models time-series data using supervised learning. Specifically, a Kolmogorov-Smirnov test followed by classification-tree modeling is used for feature selection (sketched below).

A CNN, one per metric group (pre-tuned based on our knowledge base), is used to train a model. The two resulting models (of performance metrics) are then compared using statistical process control and anomaly detection to detect differences between them.
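
A minimal sketch of the feature-selection step just described, using scipy's two-sample KS test and a scikit-learn decision tree; the 0.05 threshold, the tree depth and the data layout are assumptions, not OpsMx's actual settings.

```python
# Illustrative only: keep metrics whose baseline vs. canary distributions
# differ (KS test), then rank the survivors with a classification tree.
import numpy as np
from scipy.stats import ks_2samp
from sklearn.tree import DecisionTreeClassifier

def select_features(baseline: np.ndarray, canary: np.ndarray) -> list:
    """baseline/canary: arrays of shape (samples, metrics)."""
    # Keep metrics where the KS test rejects 'same distribution'.
    candidates = [m for m in range(baseline.shape[1])
                  if ks_2samp(baseline[:, m], canary[:, m]).pvalue < 0.05]
    if not candidates:
        return []
    # Label samples by origin and let a tree rank the surviving metrics.
    X = np.vstack([baseline[:, candidates], canary[:, candidates]])
    y = np.array([0] * len(baseline) + [1] * len(canary))
    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
    order = np.argsort(tree.feature_importances_)[::-1]
    return [candidates[i] for i in order]
```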

2. Algorithm 2 compares two sample distributions using a combination of the following (primarily non-parametric) tests: Wilcoxon signed-rank test, Spearman rank correlation, quantile comparison of the two samples, and polynomial (linear) regression.

The results of these two analyses are then quantified to produce a health score for the comparison (an illustrative sketch follows).
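
As an illustration of the kind of non-parametric comparison described above, the sketch below runs the Wilcoxon, Spearman and quantile checks with scipy and folds them into a 0-100 health score; the aggregation scheme is an assumption (and polynomial regression is omitted for brevity), not Autopilot's actual scoring.

```python
# Illustrative only: compare baseline vs. canary samples of one metric and
# combine the test results into a simple 0-100 health score.
import numpy as np
from scipy.stats import wilcoxon, spearmanr

def compare_samples(baseline: np.ndarray, canary: np.ndarray) -> float:
    n = min(len(baseline), len(canary))
    b, c = baseline[:n], canary[:n]

    p_wilcoxon = wilcoxon(b, c).pvalue   # paired signed-rank test
    rho, _ = spearmanr(b, c)             # rank correlation of the pairs

    # Quantile comparison: relative gap between the two samples' deciles.
    q = np.linspace(0.1, 0.9, 9)
    qb, qc = np.quantile(b, q), np.quantile(c, q)
    quantile_gap = np.mean(np.abs(qb - qc) / (np.abs(qb) + 1e-9))

    # Hypothetical aggregation (higher = more similar between the samples).
    return float(100 * np.mean([p_wilcoxon, max(rho, 0.0),
                                1.0 / (1.0 + quantile_gap)]))
```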

3. Algorithm 3 is a proprietary conversational AI built on top of OpenAI's GPT-2. Models are served to upstream applications through Django REST APIs; apps consuming these models are built in React.


The DevOps pipelines are built partly with Jenkins CI and Ansible playbooks, and partly with Spinnaker on Kubernetes and Ansible playbooks. The cloud infrastructure uses AWS (main services: Lambda, DynamoDB, EKS, etc.).

Apps and APIs are built on the MERN stack. The backend and middleware layers also employ the ELK stack, along with an event management system built on Apache Kafka.

I also managed the project budget for cloud infrastructure and personnel (at times under very tight constraints).

Our continuous verification product (Autopilot) measures the quality of a software release by producing a health score.

This is done by analyzing three measures: 1. server performance metrics, 2. service logs, 3. application source code (an illustrative aggregation sketch follows).
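
Purely as an illustration of how such a release health score might be assembled, the sketch below combines per-dimension scores with hypothetical weights; Autopilot's actual scoring model is not described here.

```python
# Illustrative only: combine per-dimension scores (performance metrics,
# service logs, source code) into one release health score.
# The weights and 0-100 scale are hypothetical, not Autopilot's model.
WEIGHTS = {"metrics": 0.5, "logs": 0.3, "code": 0.2}

def release_health(metrics_score: float, logs_score: float, code_score: float) -> float:
    """Each input is expected on a 0-100 scale; returns a weighted overall score."""
    parts = {"metrics": metrics_score, "logs": logs_score, "code": code_score}
    return sum(WEIGHTS[k] * v for k, v in parts.items())

# e.g. release_health(92.0, 78.5, 85.0) -> 86.55
```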

My Responsibilities

Creating system and architecture designs for different product features; devising solutions to different problems; implementing POCs

I practice Scrum, conduct the various ceremonies (with varying complexity) for the projects I own, and use Jira for project management

Training and mentoring reportees to progressively take on increased responsibilities; keeping them invested and motivated in the journey of creating a new product; mentoring them to navigate complex and stressful situations in a fast-growing startup

As much as possible, actively taking part in coding and maintaining the products' frontend, backend, testing and deployment infrastructure

Working with client business partners on a continuous basis to plan and execute projects in line with the client's business objectives; tracking individual project-level metrics

Analyzing change requests to the data architecture for client-wise pipeline implementations for enterprise clients (Enterprise is one of three product licenses), defining scope and arriving at rough estimates before handing them over to the concerned dev/support teams

Created Jira dashboards to report project performance and sprint health for weekly and biweekly stakeholder reviews; created Jira Structures to track progress at program and practice level; created recurring status reports on my team's performance for my superiors.

Created guidelines for relevant release strategies that can be used at

Conducting sprint planning, daily scrum (keeping it novel and interactive), review and retrospective meetings periodically

Conducting performance reviews of my subordinates.

Technical Project Manager

Elucidata

May 2015 - March 2017

(1 year, 11 months)

Led a software development team implementing a 'Natural Abundance correction' algorithm in Python 3 (pandas, scikit-learn, numpy). In addition, employed a combination of linear-kernel SVM, linear regression and curve fitting (all using scikit-learn) to implement a MATLAB-like curve calibration package and API (an illustrative sketch follows). The application was developed using Python Flask and Redis for the backend package and API, and TypeScript/AngularJS for the web frontend. More details here: www.ncbi.nlm.nih.gov/pubmed/20236542
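
A minimal sketch of what a curve-calibration step like the one above could look like with scikit-learn; the polynomial degree, the example standards and the package's real interface are assumptions for illustration.

```python
# Illustrative only: fit a calibration curve from known standards and use it
# to map raw instrument signal to calibrated concentrations.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def fit_calibration(known_conc: np.ndarray, measured: np.ndarray, degree: int = 2):
    """Fit measured signal -> known concentration as a polynomial curve."""
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(measured.reshape(-1, 1), known_conc)
    return model

# Hypothetical standards with known concentrations and their measured signals.
standards = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
signals = np.array([0.1, 0.9, 2.1, 5.2, 9.8])
cal = fit_calibration(standards, signals)
print(cal.predict(np.array([[3.0], [7.5]])))  # calibrated concentrations
```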

Collaborated with a research institute in analyzing data generated from CRISPR editing. The application's beta release:

sapan-sharma.shinyapps.io/TSUNAMI_repos

My primary responsibility was owning software development, UAT testing, Go live and Production support.

Proficient in identity management using LDAP, OAuth and various SSO protocols

Scoping and prioritizing activities based on business and customer impact.

Working with the Business Analysts/development team to plan and execute release goals.

Managing and planning the resource and budget utilization.

Project planning using MS Project, JIRA. Regular updates on Project deliverables.

Weekly status reports to the CIO. Breaking down all projects using a WBS (Work Breakdown Structure).

Continuous improvement to Projects by meeting Business stakeholders regularly and understanding pain points.

Performed performance appraisals and career tracking of subordinates

Technical Project Manager: Risk Analytics

JP Morgan Chase Bank

July 2014 - April 2015 (10 Months)

Drove Stratus: a real-time back-office data analytics engine for investment banking products. The system is designed to operate at best-in-class latency.


Requirement gathering for BRE and Collections projects in the post-sales phase. Creating the SRS, FSD, User Manual, etc.

Building Test Cases, Test Procedures and carrying out functional unit testing, system testing and facilitating User Acceptance Testing (UAT).

Preparing the Functional Specification Document, HLD, LLD, RTM and User Manual. Presenting multiple prototype presentations and demos to the users of the system midway through the development phase, as well as the final one.

Following a process and incident management plan to oversee change management and issue tracking (using GitHub Projects and Jira)

My Responsibilities:

Customer retention and feedback: improving the solution offering both through better design that addresses more of the customers' pain points and by building better support

Driving training at the customer's end for IT and business users

Coordinating with various stakeholders such as Project Manager, Developers, Quality Assurance Team, Business Users, throughout the implementation of the solution, ensuring timely delivery of the project

Reporting the project status to stakeholders regularly and tracking the overall project progress.

Created low-level designs and user stories with detailed acceptance criteria from high-level requirements; delivered regular updates of project metrics to different stakeholders

Software Developer

Tata Consultancy Services

July 2008 - June 2014

(6 Years)

I was a part of the product development team of TCS iON, an ERP solution for small and medium manufacturing organizations. The software is a cloud-based suite of products that integrates enterprise-wide functions to provide a holistic view and control of geographically spread business processes.

I was involved in implementing the software’s modules that automated Procurement and Production management related processes.

The product had a WPF (C#) frontend built using the PRISM package, with ASP.NET RESTful web services. The system used MS SQL Server as the database. Some ETL-based modules were developed using SSIS. I was tasked with implementing the frontend, backend and ETL, and with setting up version control with automated builds for one product module.

Playing the role of technical defense in product demos for a regulatory filing product for hedge funds; taking part in solutioning discussions for custom development requirements

Performing systems integration testing for the end-to-end flow of various product functionalities for clients that I managed

Designed the application and backend architecture. This includes the front-end web applications, web and application servers, data science algorithms and the REST APIs exposing them, a caching data store and the database.

Contributed to a project called Gaitoraid, which implements an NAR-based neural network to make spatial-orientation predictions and establish a person's identity in traffic camera footage from their gait. Prototyping was carried out in Octave and the implementation in Python. Git link: https://github.com/sgang007/gaitoraid

Single-handedly re-engineered VB6-based APIs plugging into an imaging solution, Kofax Capture (~25 wireframes), into C#/.NET 4.0-based APIs. Designed the data model and created the database as an OLTP system. Implemented ETL jobs as a custom C# API. The application supported the middle-office ops team working on public finance products.

Education:

B.E. in Electronic & Telecommunications Engineering from Institute of Engineering and Technology, Devi Ahilya Vishwavidyalaya, Indore, Madhya Pradesh

Certifications

●Certified Scrum Product Owner from Scrum Alliance (completed in June 2021, valid till June 2023)

●Certified in Machine Learning & Statistical Modeling from Stanford University, USA.

●Certified in 'Hadoop Platform and Application Framework' from the University of California, San Diego

●Microsoft Certified Professional in Application Development; Windows-based client development

Software Skills:

●Analytics & Reporting Tools: SPSS, MATLAB, SAS, Tableau, EazyBI

●Project Management tools: Jira, ServiceNow, Github, Confluence, Trello

●Cloud Environments: AWS, Azure, DigitalOcean


