
Project Manager Scrum Master

Location: Kolkata, West Bengal, India
Posted: November 27, 2023


Apurbalal Gayen (PMP)

Project Manager

Email: ad1idz@r.postjobfree.com

Phone: +91-771*******

AWS Certified Solutions Architect – Professional

Summary of Experience:

15+ years of experience in analysis, design, development, support, and testing of software applications, with experience in AWS cloud, Big Data Hadoop, Enterprise Application Integration (EAI), and object-oriented programming.

Project Management Professional with 5+ years of experience in agile delivery.

Professional Agile Scrum Master/Delivery Manager with a broad range of experience in delivering projects effectively on time and within budget. Worked on a variety of complex, large-to-medium-scale projects in various IT sectors.

Extensive experience with agile techniques: user stories, agile estimation methods, and deployment.

Primarily in application development as Project Manager, Scrum Master, Team Lead, Developer, and Tester.

Thorough knowledge of the engineering life cycle with AWS cloud, Big Data, EAI, and Java.

Excellent background in software quality practices.

In-depth knowledge of delivery pipelines and DevOps.

Thorough knowledge of quality practices and processes at all agile levels.

Excellent understanding of the consumer credit reporting, insurance, information & media, healthcare, banking, manufacturing, telecom, and forex & commodity trading domains.

Vast experience working on both product- and service-based projects.

Experience in agile transformation projects, transforming delivery from waterfall to agile.

Excellent interaction skills with stakeholders such as tribe leads, technical excellence groups, Product Owners, Product Managers, business and architecture groups, and testing teams, resolving issues and impediments and tracking them to closure.

Resolved critical production support issues spanning Big Data, AWS, enterprise applications, object-oriented programming, data migration, EAI integration, and SAP support pack upgrades.

Onsite-offshore coordination to provide quality deliverables on time.

Managing offshore deliverables and cutover activities.

Managing Big Data infrastructure enhancements for release development on the data lake layer.

Good onsite/client-facing experience with both US and European customers.

Professional Certifications & Other Assets

PMP – Project Management Institute (PMI) – Nov 2023

AWS – AWS Certified Solutions Architect – Professional – Feb 2021

Scrum – Certification (agile development method) from UST Academy – Sep 2021

DevOps – Certified by Equifax Inc. – May 2022

PMP – Certification from Udemy – 2018

Big Data – Certified by L&T and Udemy Academy – Feb 2016

PMP – Project Management Professional certification from L&T Academy – Oct 2015

EAI (BW/EMS) – Certified by TIBCO Academy – May 2012

Java – Sun Certified, Sun Microsystems, Inc. – April 2009

Managerial Assets

Project Manager/Scrum Master for a leading US-based consumer credit reporting agency; led Nexus Scrum teams across the UK and India, including stakeholder management, task allocation, and release and quarterly/annual planning.

Project Manager for a North America-based customer, leading a distributed agile team located in the USA, the UK, and Bangalore.

Delivery Manager for a leading North America-based insurance customer.

Delivery Lead for a leading US-based commodity trading customer.

Project Lead/Team Lead for leading US-based manufacturing, banking, and telecom customers.

Overall delivery ownership, with resource and people management experience leading and managing large and medium-sized teams.

Efficient in release and sprint planning, execution, and tracking (quarterly release cycles), in addition to account billing and resourcing.

End-to-end resourcing, from profile screening to onboarding, across the full project/organization lifecycle (hire-to-retire).

Provide the client team with a single point of contact for project delivery.

Strong analytical skills and the ability to quickly adapt to new technologies and domains.

Manage the tracking and reporting of key solution delivery metrics for one or more projects.

Manage the continuous review and improvement of software development processes to maximise the quality of products and services.

Ability to translate business processes into clear, comprehensive, and practical implementations.

Technical Skills:

Very good understanding and hands-on experience with AWS cloud infrastructure: EC2, S3, IAM, KMS, Lambda, EBS, ELB, DynamoDB, Auto Scaling, VPC, CloudWatch, RDS, CloudFormation, CloudFront, SNS, SQS, Elastic Beanstalk, AWS analytics (AWS Glue, EMR, Redshift), application integration, and security, identity & compliance (a brief illustrative sketch follows this skills list).

Security specialty, with extensive experience including AWS cloud architecture and Big Data Hadoop solutions.

Experience in developing and deploying enterprise applications using major components of the Hadoop ecosystem: Hadoop 2.x, YARN, Hive, Pig, MapReduce, HBase, Flume, Sqoop, Spark, Storm, Kafka, Oozie, and ZooKeeper.

Experience in operating, developing, maintaining, and upgrading Hadoop clusters in the cloud as well as in-house (Apache Hadoop, Hortonworks, and Cloudera distributions).

Experience in installation, configuration, management, and deployment of Big Data solutions and the underlying infrastructure of a Hadoop cluster.

Substantial experience with AWS cloud technology.

Good experience in ETL development/support using Kafka, Flume, SQL, and Sqoop, and in data lake layer development.

Experience integrating Kafka with Spark for real-time data processing and Spark Streaming.

Experience in setting up high-availability Hadoop clusters.

Experience in Spark and Spark Streaming support.

Experience in Hadoop administration, with good knowledge of Hadoop features such as safe mode and auditing.

Experience in understanding client requirements and translating business requirements into functional and technical designs, with strong analysis, design, and project management skills.

End-to-end resourcing.

Extensive work in EAI and Java technologies.

Excellent presentation and interpersonal skills; a strong team player willing to take on new and varied projects.
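
As a brief illustrative sketch of the AWS provisioning work listed above: a minimal boto3 example that creates an S3 bucket and launches an EC2 instance. Everything here (region, bucket name, AMI ID, instance type) is a hypothetical placeholder, not a value from any project described in this resume.

    # Illustrative only: provision an S3 bucket and a single EC2 instance.
    # All names and IDs below are hypothetical placeholders.
    import boto3

    region = "us-east-1"  # placeholder region

    # Create an S3 bucket (names must be globally unique).
    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(Bucket="example-data-lake-bucket")

    # Launch one EC2 instance from a placeholder Amazon Linux AMI.
    ec2 = boto3.client("ec2", region_name=region)
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    print("Launched", response["Instances"][0]["InstanceId"])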

Technology:

Cloud: AWS Cloud – EC2, S3, IAM, KMS, Lambda, EBS, ELB, DynamoDB, Auto Scaling, VPC, CloudWatch, RDS, CloudFormation, CloudFront, SNS, SQS, DevOps, etc.

Big Data Ecosystem: Hadoop, HDFS, MapReduce, Spark, Hive, Pig, Sqoop, Oozie, Flume, ZooKeeper, Kafka, HBase, Cloudera Manager, Apache Ambari, Ganglia, Nagios, Talend, Apache NiFi

Operating Systems: Windows, Linux (Red Hat, CentOS, Ubuntu)

Servers: Tomcat 5.x, BEA WebLogic 7.x, Oracle GoldenGate 11.2

Languages & Scripting: Java (JDK), HTML, JavaScript, Perl, shell scripting

J2EE Technologies: JDBC, Servlets, JSP, Struts 1.1, Spring, Hibernate

Tools: Agile, UML, design patterns, Log4j, Ant, IBM ILOG JRules, Visio, XML Canon, SVN, QC, HPSC, DB Symphony, Service Manager, ServiceNow, BMC Remedy

Databases: Oracle 9i, Oracle 10g/11g, SQL Server 2005

IDE: Eclipse 3.x

TIBCO EAI: TIBCO BW 5.x, Hawk 4.x, TIBCO adapters (ADB, SAP, File), TIBCO Administrator 5.x, TIBCO BPM 3.4, TIBCO MFT 7.1, TIBCO ActiveSpaces 2.1, TIBCO Business Connect 5.3, EMS 4.x, TIBCO ActiveMatrix Service Grid, IBM ILOG JRules.

Global Exposure: Worked onsite in the USA and UK for US clients: Equifax, T-Mobile, Novelis, Travelers, Macys, Deutsche Bank.

Education: Master's Degree in Computer Applications (MCA)

Professional Experience:

Role: Senior Project Manager April'22 – till date

Client: Equifax Inc.

Org: UST/ PTEC (Product Technology and Engineering Center)

Roles & Responsibilities:

Accountable as the single point of contact for project deliverables from all four scrum teams.

Play the Scrum Master role dedicatedly for four scrum teams.

Embrace agile methodologies (Nexus Scrum) for all four scrum teams across the program.

Ensure that teams follow agile ceremonies and practices in line with Nexus Scrum.

Work closely with architects to ensure the best engineering practices, processes, and tools are adopted across the program.

Facilitate scrum ceremonies: sprint planning, daily scrum, sprint demo, and sprint retrospective.

Facilitate user story refinement sessions for upcoming development epics.

Communicate project progress through various agile metrics and performance indicators.

End-to-end resourcing for the project and client.

Continuously anticipate and eliminate project risks; encourage technical leads and technical product owners to manage dependencies and integration issues.

Actively participate in daily tech stand-up calls and collaborate with the global team to resolve technical dependencies.

Ensure on-time software releases by collaborating closely with technical product owners and the operations team.

Responsible for monitoring team progress against committed goals.

Support the Product Owner in maintaining the Product Backlog with priority and business value.

Role: Project Manager Sep’21 – April’22

Client: Dell

Org: UST Global

Roles & Responsibilities:

Coordinated with the business to understand business requirements, analyzing functional requirements and key developments and implementations for each project phase/release using Dell Cloud.

Followed agile methodologies: sprint planning and grooming for user stories, leading daily scrum/stand-up meetings, and coaching team members on agile methodology implementation.

Guided the team on creating STM (source-target mapping) for data archival, along with the right methodology and tools using ECS and Isilon.

Communicated with the client on a day-to-day basis, focusing on project constraints including scope, time, deadlines, quality, deliverables, resource engagement, and risks.

Responsible for proposing POCs for various legacy system changes and obtaining sign-off for implementation.

Responsible for pre-production and post-production planning and testing.

Coordinated with various teams on migrating data from data center to data center.

Proposed and implemented a long-term archival solution for the data centers.

Collaborated with business owners and various project teams on production changes and releases.

Coordinated with multiple teams to finalize production changes using the ServiceNow tool.

Environment: ECS, Isilon, ServiceNow, Dell Cloud, Cassandra, Vault

Role: Architect/Project Manager, Freelancer

Client: Cal State Aug'17 – Sep'21

Roles & Responsibilities:

Configured the cluster to achieve optimal results by fine-tuning it using the Cloudera distribution.

Ensured backup policies for high availability of the cluster at any point in time.

Extensively handled commissioning and decommissioning of nodes, targeting load balancing as per the project plan.

Actively monitored hosts, making sure health and metric data were collected for them.

Monitored services, collecting health and metric information about services and activity information from the YARN and Impala services.

Configured alerting to generate and deliver alerts for certain types of events.

Planned the cluster, sizing the number of nodes based on the estimated amount of data flow.

Approved and rolled back service roles as per the ongoing project plan.

Developed data pipelines with combinations of Hive, Pig, and Sqoop jobs scheduled with Oozie.

Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management (see the monitoring sketch after this list).

Extensively worked to manage data coming from different sources into HDFS through Sqoop and Flume.

Troubleshot and monitored Hadoop services using Cloudera Manager.
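
As a hedged illustration of the AWS monitoring bullet above: a minimal boto3 sketch that puts a CloudWatch CPU alarm on one instance. The instance ID, SNS topic ARN, alarm name, and threshold are hypothetical placeholders, not the project's actual configuration.

    # Illustrative only: alarm when a node's CPU stays above 80% for
    # two consecutive 5-minute periods. All identifiers are hypothetical.
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
    cloudwatch.put_metric_alarm(
        AlarmName="hadoop-node-high-cpu",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        Statistic="Average",
        Period=300,                      # 5-minute datapoints
        EvaluationPeriods=2,             # two breaching periods in a row
        Threshold=80.0,                  # percent CPU
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
    )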

Environment: AWS Cloud, Cloudera, Jira, Shell Script

Client: Travelers, Hartford Sep 2015 – June 2017

Role: Project Lead/ Architect

Org: L&T Infotech

Roles & Responsibilities:

Configured the cluster to achieve optimal results by fine-tuning it using the Cloudera distribution.

Ensured backup policies for high availability of the cluster at any point in time.

Extensively handled commissioning and decommissioning of nodes, targeting load balancing as per the project plan.

Designed and implemented NameNode High Availability (HA) and ResourceManager High Availability (HA) for Hadoop clusters, designing automatic failover control using ZooKeeper and Quorum Journal Nodes.

Implemented Kerberos-based Hadoop security solutions to secure Hadoop clusters.

Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs (MapReduce, Pig, Hive, and Sqoop) as well as system-specific jobs such as Java programs and shell scripts.

Exported the full mined dataset from the huge data volumes to MySQL using Sqoop.

Configured the Hive Metastore to use a MySQL database to establish multiple user connections to Hive tables (see the sketch after this list).

Performed administration using the Hue web UI to create and manage user spaces in HDFS.

Configured Hadoop MapReduce and HDFS core properties as part of performance tuning to achieve high computational performance.

Configured Cloudera to receive alerts on critical failures in the cluster by integrating with custom shell scripts.

Maintained comprehensive project, technical, and architectural documentation for enterprise systems.

Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and handled configuration management.
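
As a sketch of the Hive Metastore work above: once the Metastore is backed by MySQL (configured in hive-site.xml), any Hive-enabled Spark session shares the same table catalog, so multiple users see the same tables. The database, table, and column names below are hypothetical, not the project's actual schema.

    # Illustrative only: a Hive-enabled SparkSession registers tables in the
    # shared (e.g. MySQL-backed) Metastore. Names below are hypothetical.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("hive-metastore-demo")
        .enableHiveSupport()  # picks up hive-site.xml, incl. Metastore config
        .getOrCreate()
    )

    spark.sql("CREATE DATABASE IF NOT EXISTS claims")
    spark.sql("""
        CREATE TABLE IF NOT EXISTS claims.policy_events (
            policy_id STRING,
            event_ts  TIMESTAMP,
            amount    DOUBLE
        )
        STORED AS PARQUET
    """)

    # Any session pointed at the same Metastore can now query the table.
    spark.sql("SELECT COUNT(*) FROM claims.policy_events").show()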

Environment: CDH 5.4.5, Hive 1.2.1, HBase 1.1.2, Flume 1.5.2, MapReduce, Sqoop 1.4.6, Spark 2.1.0, Kafka, Shell Script, Oozie 4.2.0, ZooKeeper 3.4.6.

Client: Macys, Atlanta Jan’15 – Jun ‘15

Role: Project Lead/Architect

Roles & Responsibilities:

Configured the cluster to achieve optimal results by fine-tuning it using Apache Ambari.

Implemented fair schedulers and capacity schedulers to share cluster resources with other teams running MapReduce jobs.

Developed data pipelines with combinations of Hive, Pig, and Sqoop jobs scheduled with Oozie.

Created data links to transfer data between databases and HDFS, and vice versa, using Sqoop scripts.

Developed a real-time pipeline for streaming data using Kafka (see the sketch after this list).

Worked with the Hive data warehouse on HDFS to identify issues and behavioral patterns, and configured and enabled Microsoft R Server services for the analytics team.

Worked with the operations team on commissioning and decommissioning data nodes on the Hadoop cluster.

Performed both major and minor upgrades to the existing cluster, as well as rollbacks to the previous version.

Enabled Kerberos for authorization and authentication to keep the cluster safe.

Enabled HA for the NameNode, ResourceManager, YARN configuration, and Hive Metastore.

Implemented, maintained, and supported reliable, timely, and reproducible builds for project teams.

Interacted with developers and the Enterprise Configuration Management team on changes to best practices and tools, eliminating inefficient practices and bottlenecks.

Designed the cluster so that only one Secondary NameNode daemon could run at any given time.
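
As an illustration of the real-time Kafka pipeline above: a minimal producer sketch using the kafka-python client. The broker address, topic name, and event fields are hypothetical placeholders rather than the project's actual code.

    # Illustrative only: publish JSON events to a Kafka topic.
    # Broker and topic names are hypothetical placeholders.
    import json
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers="broker1:9092",  # hypothetical broker
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    event = {"sku": "12345", "action": "page_view", "ts": "2015-03-01T12:00:00Z"}
    producer.send("clickstream-events", value=event)  # hypothetical topic
    producer.flush()  # block until the message is actually delivered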

Environment: Hadoop HDP 2.3, Oracle, MS SQL, ZooKeeper 3.4.6, Oozie 4.1.0, MapReduce, YARN 2.6.1, Nagios, REST APIs, Amazon Web Services, HDFS, Sqoop 1.4.6, Hive 1.2.1, Pig 0.15.0.

Client: Novelis, Atlanta Nov’13 – Sep’14

Role: Hadoop Analyst/Project Lead

Org: Capgemini India

Roles & Responsibilities:

Installed and configured Hadoop clusters for the Dev, QA, and Production environments as per the project plan.

Developed Oozie workflows to automate the data extraction process from data warehouses.

Supported, monitored, and tuned MapReduce programs running on the cluster.

Created and maintained technical documentation for launching Hadoop clusters and for executing Pig scripts.

Extensively worked on creating Hive tables and loading them with data.

Extensively worked on loading data into HBase using the HBase shell, the HBase client API, Pig, and Sqoop.

Responsible for architecting Hadoop clusters on the Cloudera distribution platform.

Performed performance tuning and troubleshooting of various ecosystem jobs by analyzing and reviewing Hadoop log files.

Configured Spark Streaming to receive real-time data from Kafka and store the stream data to HDFS (see the sketch after this list).

Involved in creating Hive and Impala tables and loading data using Hive queries.

Involved in running Hadoop jobs to process millions of records.

Installed and configured the Hadoop NameNode HA service using ZooKeeper.

Extensively worked to manage data coming from different sources into HDFS through Sqoop and Flume.

Troubleshot and monitored Hadoop services using Cloudera Manager.
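
As a sketch of the Spark Streaming configuration above: this example uses the newer Structured Streaming API for readability (work in this period would more likely have used the older DStream API), the broker, topic, and HDFS paths are hypothetical placeholders, and it assumes the spark-sql-kafka connector package is on the classpath.

    # Illustrative only: read a Kafka topic and persist the stream to HDFS.
    # Broker, topic, and paths are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-to-hdfs").getOrCreate()

    raw = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("subscribe", "sensor-events")
        .load()
    )

    # Kafka values arrive as bytes; cast to strings before writing out.
    events = raw.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")

    query = (
        events.writeStream.format("parquet")
        .option("path", "hdfs:///data/landing/sensor_events")
        .option("checkpointLocation", "hdfs:///checkpoints/sensor_events")
        .outputMode("append")
        .start()
    )
    query.awaitTermination()

The checkpoint location is what lets the stream resume where it left off after a restart.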

Environment: CentOS 4, SAP, EAI integration, Hadoop HDP 2.1, Hive, HDFS, Sqoop, FTP, Apache, SMTP, ETL, Talend, SQL, Java, VMware, HBase, Apache Tomcat.

Client: Hisna, CA May’12 – Dec'12

Role: System Analyst

Org: APT Inc

Roles & Responsibilities:

Responsible for implementation and ongoing administration of the EAI infrastructure.

Worked on requirement gathering, analysis, and design for setting up the TIBCO environment.

Installed, configured, and maintained various TIBCO product suites.

Designed, configured, and managed backup and disaster recovery for the EAI server.

Analyzed business requirements and identified the mapping documents required for system and functional testing efforts across all test scenarios.

Set up the alerting mechanism using TIBCO Hawk rulebase implementations at the application and server levels.

Worked on creating EAR files and deploying BW projects in various environments.

Implemented various technical solutions using the TIBCO suite of solutions and messaging.

Worked on configuring File Adapter publication services to get data from files.

Worked in all stages of the Software Development Life Cycle (SDLC).

Worked on the deployment process of several ongoing EAI projects.

Maintained middleware application environments and deployment processes.

Extensive work in middleware administration.

Worked on server building and domain setup for the EAI environment.

Environment: TIBCO BW 5.9, TIBCO Hawk 4.6, TIBCO Administrator 5.6, EMS 5.1, Red Hat Linux, SAP R/3, PeopleSoft adapter, MFT 7.0

June 2004 – April 2012:

Worked on multiple projects in Developer and Technical Lead roles: Deutsche Bank (NYC) with Tata Consultancy Services (TCS), Jan 05, 2011 – March 13, 2012; T-Mobile (Seattle) with HCL Technologies, November 12, 2009 – December 2010; Hewlett-Packard GlobalSoft Ltd, May 2006 – Feb 2009; Mphasis India, June 2004 – May 2006.

Responsibilities:

Developed the technical design document and interface design document based on requirements.

Worked on developing various mapping matrix documents for data transformations.

Worked on code reviews of developed code in BW applications.

Designed, developed, and implemented middleware applications using the TIBCO suite of products.

Installed, configured, upgraded, and hotfixed TIBCO, Java, and SAP components.

Extensively worked on Servlets and JSPs based on the MVC pattern using the Struts and Spring frameworks.

Used the HP Service Center tool for call tracking in Incident Management and Change Management.

Worked on the design of the application using UML/Rational Rose.

Worked on developing the presentation layer with JSP.

Developed the front-end logic and validations.

Provided extended support: go-live and production support.

Developed JDBC code for backend processing.

Environment: TIBCO BW 5.3/5.6, TIBCO Administrator 5.0/5.1/5.3, TIBCO Hawk 4.5/4.6, TIBCO adapters (File, SAP), Business Connect 5.3, MFT 7.0, UNIX Red Hat, DB Symphony tools, JDK 1.4/1.5, Spring Framework, Web Services, iBATIS, SOAP UI, WSDL, Tuxedo, WebLogic 7.x/9.1/10.3, JSP 2.0, JavaScript, Eclipse 3.0/3.3, AccuRev, ActiveMatrix, Oracle, DB2, SQL Server


