
Manager Data

Location:
Torrance, California, United States
Posted:
September 16, 2018

Resume:

Apurbalal Gayen

Email : ac62dr@r.postjobfree.com

Phone : 424-***-****

Professional Summary:

10+ years of experience in analysis, design, development, support, and testing of software applications, including Big Data (Hadoop) and Enterprise Application Integration (EAI).

Experience in developing and deploying enterprise applications using major components of the Hadoop ecosystem: Hadoop 2.x, YARN, Hive, Pig, MapReduce, HBase, Flume, Sqoop, Spark, Storm, Kafka, Oozie, and Zookeeper.

Experience in operating, developing, maintaining, monitoring, and upgrading Hadoop clusters in the cloud as well as in-house (Apache Hadoop, Hortonworks, and Cloudera distributions).

Experience in installation, configuration, management, and deployment of Big Data solutions and the underlying Hadoop cluster infrastructure.

Good knowledge of AWS cloud technology.

Good experience in ETL development using Kafka, Flume, and Sqoop.

Involved in all phases of the Software Development Life Cycle (SDLC) and worked on all activities related to operations, implementation, administration, and support.

Experience in deploying, configuring, supporting, and managing Hadoop clusters of the Cloudera and Hortonworks distributions on the Linux platform.

Experience in installing and configuring Hive, its services, and the Metastore. Exposure to Hive Query Language (HiveQL) and to table operations such as importing data, altering tables, and dropping tables.

Experience in installing and running Pig, its execution modes, Grunt, and Pig Latin editors. Good knowledge of loading, storing, and filtering data, as well as combining and splitting data.

Experience integrating Kafka with Spark for real-time data processing using Spark Streaming.

Experience in setting up high-availability Hadoop clusters.

Experience in Spark SQL and Spark Streaming.

Experience in Hadoop administration, with good knowledge of Hadoop features such as safe mode and auditing.

Experience in understanding client requirements and translating business requirements into functional and technical designs, backed by strong analysis and design skills.

Excellent presentation and interpersonal skills; a strong team player willing to take on new and varied projects.

Technical Skills:

Big Data Ecosystem : Hadoop, HDFS, MapReduce, Spark, Hive, Pig, Sqoop, Oozie, Flume, Zookeeper, Kafka, HBase, Cloudera Manager, Apache Ambari, Ganglia, Nagios, Talend, Apache NiFi

Operating Systems : Windows, Linux (Red Hat, CentOS, Ubuntu)

Servers : Tomcat 5.x, BEA WebLogic 7.x, Oracle GoldenGate 11.2

Languages & Scripting : Core Java, HTML, JavaScript, Perl, Shell scripting

J2EE Technologies : JDBC, Servlets, JSP, Struts 1.1, Spring, Hibernate

Tools : UML, Design Patterns, Log4j, Ant, IBM ILOG JRules, Visio, XML Canon, SVN, QC, HPSC, DB Symphony, Service Manager, ServiceNow, BMC Remedy

Databases : Oracle 9i, Oracle 10g/11g, SQL Server 2005

IDE : Eclipse 3.x

TIBCO EAI : TIBCO BW 5.x, Hawk 4.x, TIBCO Adapters (ADB, SAP, File), TIBCO Administrator 5.x, TIBCO BPM 3.4, TIBCO MFT 7.1, TIBCO ActiveSpaces 2.1, TIBCO BusinessConnect 5.3, EMS 4.x, TIBCO ActiveMatrix Service Grid, IBM ILOG JRules

Education: Master's Degree in Computer Applications (M.C.A.), West Bengal University of Technology, India

Professional Experience:

Client: Travelers, Hartford Sep '15 – till date

Role: Hadoop Administrator

Roles & Responsibilities:

Configured the cluster to achieve optimal results by fine-tuning it using the Cloudera distribution.

Maintained backup policies to ensure high availability of the cluster at any point in time.

Extensively handled commissioning and decommissioning of nodes, targeting load balancing as per the project plan.
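
For illustration, a minimal sketch of decommissioning a data node and rebalancing afterwards; the hostname and exclude-file path are hypothetical and vary by distribution:

    # Add the node to the HDFS exclude list (path is hypothetical)
    echo "worker05.example.com" >> /etc/hadoop/conf/dfs.exclude
    # Ask the NameNode to re-read its include/exclude lists
    hdfs dfsadmin -refreshNodes
    # Once decommissioning finishes, rebalance block placement;
    # the threshold is the allowed % spread in datanode utilization
    hdfs balancer -threshold 10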

Designed and implemented NameNode High Availability (HA) and ResourceManager High Availability (HA) for Hadoop clusters, with automatic failover using Zookeeper and Quorum Journal Nodes.
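
A minimal sketch of the NameNode HA wiring in hdfs-site.xml, assuming a hypothetical nameservice "mycluster" and journal-node hosts; the full production property set is larger:

    # Append the core HA properties via a shell heredoc (all values hypothetical)
    cat >> hdfs-site.fragment.xml <<'EOF'
    <property><name>dfs.nameservices</name><value>mycluster</value></property>
    <property><name>dfs.ha.namenodes.mycluster</name><value>nn1,nn2</value></property>
    <property><name>dfs.namenode.shared.edits.dir</name>
      <value>qjournal://jn1:8485;jn2:8485;jn3:8485/mycluster</value></property>
    <property><name>dfs.ha.automatic-failover.enabled</name><value>true</value></property>
    EOF
    # Create the failover controller's znode in Zookeeper
    hdfs zkfc -formatZK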

Implemented Hadoop security solutions (Kerberos) to secure Hadoop clusters.

Integrated Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs (MapReduce, Pig, Hive, and Sqoop) as well as system-specific jobs such as Java programs and shell scripts.
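
A minimal sketch of driving such a workflow from the Oozie CLI; the Oozie URL, HDFS application path, and property values are hypothetical:

    # job.properties consumed by the workflow definition
    cat > job.properties <<'EOF'
    nameNode=hdfs://nn1:8020
    jobTracker=rm1:8032
    oozie.wf.application.path=${nameNode}/user/etl/workflows/daily-load
    EOF
    # Submit and start the workflow, then check on it by job id
    oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run
    oozie job -oozie http://oozie-host:11000/oozie -info <job-id>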

Exported data mined from large data volumes to MySQL using Sqoop.
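
A minimal sketch of such an export; the MySQL host, database, table, and HDFS directory are hypothetical:

    sqoop export \
      --connect jdbc:mysql://db-host/reporting \
      --username etl_user -P \
      --table daily_summary \
      --export-dir /warehouse/daily_summary \
      --input-fields-terminated-by '\t'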

Configured the Hive Metastore to use a MySQL database, enabling multiple concurrent user connections to Hive tables.
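
A sketch of the relevant hive-site.xml properties (host and credentials hypothetical); schematool then lays down the Metastore schema:

    cat >> hive-site.fragment.xml <<'EOF'
    <property><name>javax.jdo.option.ConnectionURL</name>
      <value>jdbc:mysql://db-host/metastore</value></property>
    <property><name>javax.jdo.option.ConnectionDriverName</name>
      <value>com.mysql.jdbc.Driver</value></property>
    <property><name>javax.jdo.option.ConnectionUserName</name><value>hive</value></property>
    <property><name>javax.jdo.option.ConnectionPassword</name><value>*****</value></property>
    EOF
    schematool -dbType mysql -initSchema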

Performed administration using the Hue web UI to create and manage user spaces in HDFS.
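
Behind the UI this amounts to per-user HDFS home directories; a hypothetical command-line equivalent:

    # Create a home directory for a new user and hand over ownership
    hdfs dfs -mkdir -p /user/jdoe
    hdfs dfs -chown jdoe:jdoe /user/jdoe
    hdfs dfs -chmod 750 /user/jdoe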

Configured Hadoop MapReduce and HDFS core properties as part of performance tuning to achieve high computational performance.

Configured Cloudera Manager to send alerts on critical failures in the cluster by integrating it with custom shell scripts.
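
A minimal sketch of the kind of check such a script might run; the recipient address is hypothetical, and this stands in for, rather than reproduces, the actual scripts used:

    #!/bin/bash
    # Alert if HDFS is stuck in safe mode, a common critical condition
    if hdfs dfsadmin -safemode get | grep -q 'Safe mode is ON'; then
        echo "HDFS is in safe mode on $(hostname)" \
            | mailx -s "CRITICAL: HDFS safe mode" hadoop-ops@example.com
    fi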

Maintained comprehensive project, technical and architectural documentation for enterprise systems.

Deployed and monitored scalable infrastructure on Amazon Web Services (AWS) and performed configuration management.

Environment: CDH 5.4.5, Hive 1.2.1, HBase 1.1.2, Flume 1.5.2, MapReduce, Sqoop 1.4.6, Spark 2.1.0, Kafka, Shell Script, Oozie 4.2.0, Zookeeper 3.4.6.

Client: Macy's, Atlanta Jan '15 – Aug '15

Role: Hadoop Administrator

Roles & Responsibilities:

Configured the cluster to achieve optimal results by fine-tuning it using Apache Ambari.

Implemented the Fair Scheduler and Capacity Scheduler to share cluster resources with other teams running MapReduce jobs.
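
A minimal sketch of carving out a second queue in capacity-scheduler.xml; the queue names and percentages are hypothetical:

    cat >> capacity-scheduler.fragment.xml <<'EOF'
    <property><name>yarn.scheduler.capacity.root.queues</name>
      <value>default,analytics</value></property>
    <property><name>yarn.scheduler.capacity.root.default.capacity</name><value>70</value></property>
    <property><name>yarn.scheduler.capacity.root.analytics.capacity</name><value>30</value></property>
    EOF
    # Apply queue changes without restarting the ResourceManager
    yarn rmadmin -refreshQueues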

Developed data pipelines with combinations of Hive, Pig and Sqoop jobs scheduled with Oozie.

Created data links to transfer data between databases and HDFS, and vice versa, using Sqoop scripts.
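
A sketch of the import direction; the source database, table, and target directory are hypothetical:

    sqoop import \
      --connect jdbc:mysql://db-host/sales \
      --username etl_user -P \
      --table orders \
      --target-dir /data/raw/orders \
      --num-mappers 4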

Developed a real-time pipeline for streaming data using Kafka.
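
For illustration, creating and smoke-testing a topic with the stock Kafka CLI; the broker and Zookeeper addresses and the topic name are hypothetical (the --zookeeper flag matches Kafka releases of that era):

    kafka-topics.sh --create --zookeeper zk1:2181 \
      --replication-factor 3 --partitions 6 --topic clickstream
    # Produce one test record, then read it back
    echo '{"user":"jdoe","action":"click"}' \
      | kafka-console-producer.sh --broker-list broker1:9092 --topic clickstream
    kafka-console-consumer.sh --bootstrap-server broker1:9092 \
      --topic clickstream --from-beginning --max-messages 1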

Worked with the Hive data warehouse on HDFS to identify issues and behavioral patterns, and configured and enabled Microsoft R Server services for the analytics team.

Worked with the operations team on commissioning and decommissioning data nodes on the Hadoop cluster.

Performed both major and minor upgrades to the existing cluster, as well as rollbacks to the previous version.

Enabled Kerberos for authentication and authorization to ensure cluster safety.
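
A minimal sketch of what this means at the config level, plus the basic credential check; the realm and keytab path are hypothetical:

    cat >> core-site.fragment.xml <<'EOF'
    <property><name>hadoop.security.authentication</name><value>kerberos</value></property>
    <property><name>hadoop.security.authorization</name><value>true</value></property>
    EOF
    # Obtain a ticket from a service keytab and verify it
    kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@EXAMPLE.COM
    klist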

Enabled HA for the NameNode, ResourceManager, YARN configuration, and Hive Metastore.

Implemented, maintained, and supported reliable, timely, and reproducible builds for project teams.

Interacted with developers and the Enterprise Configuration Management team on changes to best practices and tools, to eliminate inefficient practices and bottlenecks.

Designed the cluster so that only one Secondary NameNode daemon could run at any given time.

Environment: Hadoop HDP 2.3, Oracle, MS SQL, Zookeeper 3.4.6, Oozie 4.1.0, MapReduce, YARN 2.6.1, Nagios, REST APIs, Amazon Web Services, HDFS, Sqoop 1.4.6, Hive 1.2.1, Pig 0.15.0.

Client: Novelis, Atlanta Jan '13 – Dec '14

Role: Hadoop Administrator and Analyst

Roles & Responsibilities:

Installed and configured Hadoop clusters for Dev, QA, and Production environments as per the project plan.

Developed Oozie workflows to automate the data extraction process from data warehouses.

Supported MapReduce programs running on the cluster; monitored and tuned running MapReduce programs.

Created and maintained technical documentation for launching Hadoop clusters and for executing Pig scripts.

Worked extensively on creating Hive tables and loading them with data.
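
A minimal HiveQL sketch of the create-and-load pattern, run through the hive CLI; the table, columns, and paths are hypothetical:

    hive -e "
      CREATE EXTERNAL TABLE IF NOT EXISTS web_logs (
        ts STRING, user_id STRING, url STRING
      )
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
      LOCATION '/data/raw/web_logs';
      LOAD DATA INPATH '/staging/web_logs/2014-06-01' INTO TABLE web_logs;
    "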

Worked extensively on loading data into HBase using the HBase shell, the HBase client API, Pig, and Sqoop.
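
For the shell route, a minimal sketch; the table, column family, and cell values are hypothetical:

    hbase shell <<'EOF'
    create 'events', 'cf'
    put 'events', 'row1', 'cf:type', 'click'
    scan 'events', {LIMIT => 1}
    EOF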

Responsible for architecting Hadoop clusters on the Cloudera distribution platform.

Performed performance tuning and troubleshooting of various ecosystem jobs by analyzing and reviewing Hadoop log files.

Configured Spark Streaming to receive real-time data from Kafka and store the streamed data in HDFS.
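
A sketch of how such a job is typically launched; the application class, jar, arguments, and connector package coordinates are hypothetical and depend on the Spark and Kafka versions in use:

    spark-submit \
      --master yarn \
      --class com.example.ClickstreamToHdfs \
      --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.1.0 \
      clickstream-streaming.jar broker1:9092 clickstream /data/streams/clicks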

Involved in creating Hive and Impala tables and loading data using Hive queries.

Involved in running Hadoop jobs that processed millions of records.

Installed and configured the Hadoop NameNode HA service using Zookeeper.

Worked extensively on managing data ingestion from different sources into HDFS through Sqoop and Flume.
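
On the Flume side, a minimal single-agent sketch; the agent name, spool directory, and HDFS path are hypothetical:

    cat > agent.conf <<'EOF'
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1
    a1.sources.r1.type = spooldir
    a1.sources.r1.spoolDir = /var/log/ingest
    a1.sources.r1.channels = c1
    a1.channels.c1.type = memory
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = /data/raw/logs/%Y-%m-%d
    a1.sinks.k1.hdfs.useLocalTimeStamp = true
    a1.sinks.k1.channel = c1
    EOF
    flume-ng agent --conf conf --conf-file agent.conf --name a1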

Performed troubleshooting and monitoring of Hadoop services using Cloudera Manager.

Environment: CentOS 4, Hadoop HDP 2.1, Hive, HDFS, Sqoop, FTP, Apache, SMTP, ETL, Talend, SQL, Java, VMware, HBase, Apache Tomcat.

Client: Hisna, CA Jul '12 – Dec '12

Role: System Administrator

Roles & Responsibilities:

Responsible for implementation and ongoing administration of EAI infrastructure.

Worked on requirement gathering, analysis, and design for setting up the TIBCO environment.

Installed, configured, and maintained various TIBCO product suites.

Designed, configured, and managed backup and disaster recovery for the EAI servers.

Analyzed Business Requirements and Identified mapping documents required for system and functional testing efforts for all test scenarios.

Set up the alerting mechanism using TIBCO Hawk rulebase implementations at the application and server levels.

Worked on creating EAR files and deploying BW projects in various environments.

Implemented various technical solutions using the TIBCO suite of products and messaging.

Worked on configuring File Adapter publication services to get data from files.

Worked in all stages of the Software Development Life Cycle (SDLC).

Worked on the deployment process of several ongoing EAI projects.

Maintained middleware application environments and the deployment process.

Performed extensive middleware administration work.

Worked on server builds and domain setup for the EAI environment.

Environment: TIBCO BW 5.9, TIBCO Hawk 4.6, TIBCO Administrator 5.6, EMS 5.1, Red Hat Linux, SAP R/3, PeopleSoft Adapter, MFT 7.0

Client: APT INC, MN April '12 – July '12

Role: Consultant

Responsibilities:

Understood and articulated complex business issues; prepared LLDs and HLDs for various projects.

Ensured a smooth transition between pre-sales and technical work.

Worked in all stages of the Software Development Life Cycle (SDLC).

Responsible for delivering presentations to customers and partners.

Produced written proposals and technical responses to RFIs and RFPs.

Conducted hands-on demonstrations and knowledge transfer.

Developed customer solution proposals and supporting documentation as per the plan.

Led a team of 10 members.

Environment: Microsoft Office, middleware suite of products, TIBCO BW 5.7, TIBCO Hawk 4.6, TIBCO Administrator 5.x, EMS 4.5, TIBCO BusinessConnect 5.3 (B2B), MFT 7.0

Client: Deutsche Bank, NYC Jan '11 – Mar '12

Role: Project Lead

Responsibilities:

Developed the technical design document and interface design document based on requirements documents.

Worked on configuring File Adapter publication services to get data from files.

Worked on developing interfaces that are part of the adapter application.

Worked on developing various mapping matrix documents for data transformations.

Participated in code reviews of the developed BW application code.

Created .ear files from the developed projects and deployed the applications into different environments using the TIBCO Administrator GUI.

Developed various Hawk rulebases to manage and monitor the deployed processes.

Monitored the BW engine, Hawk, and the File and SAP adapters.

Provided extended support for Go-Live and production support.

Managed and led a team of 16 members.

Environment: TIBCO BW 5.6, TIBCO Administrator 5.3, Hawk 4.6, File and SAP adapters, BusinessConnect 5.3, MFT 7.0, Red Hat Linux, DB Symphony tools, JDK 1.5

Client: T-Mobile, Seattle April '09 – Dec '10

Role: Consultant

Responsibilities:

Coordinated with the onsite team to gather requirements.

Developed the technical design document and interface design document based on requirements documents.

Developed various TIBCO ActiveMatrix BW processes using different types of adapters.

Designed various input, output, and fault messages using XSD schemas.

Monitored the BW engine, Hawk, and the File and SAP adapters.

Designed SIDs and DDDs for web services and Hawk rules.

Used the TIBCO General Interface solution for rapidly building and deploying rich internet applications.

Generated unit tests for each operation using SoapUI.

Modified old services as per business requirements.

Created new web services as per requirements.

Configured the services with RSP.

Provided extended support for Go-Live and production support.

Led a team of 12 members

Environment: TIBCO BW 5.6, TIBCO Administrator 5.1, Hawk 4.5, adapters, Spring Framework, web services, iBATIS, SoapUI, WSDL, Tuxedo, WebLogic 10.3, JDK 1.5, Eclipse 3.3, AccuRev, ActiveMatrix

Client: Hewlett Packard Global Soft Ltd, India May '06 – Feb '09

Role: Technical Lead

Responsibilities:

Designed, developed, and implemented middleware applications using the TIBCO suite of products.

Developed various Hawk rulebases to manage and monitor the deployed processes.

Installed, configured, upgraded, and applied hotfixes to TIBCO components.

Generated unit tests for each operation using SoapUI.

Coordinated the development, test, and configuration teams.

Worked extensively on Servlets and JSPs based on the MVC pattern, using the Struts and Spring frameworks.

Used the HP Service Center tool for call tracking in Incident Management and Change Management.

Worked on the testing and debugging process.

Worked on the design of the application using UML/Rational Rose.

Worked on Java coding.

Worked on developing the presentation layer with JSP.

Developed front-end logic and validations.

Provided extended support for Go-Live and production support.

Developed JDBC code for backend processing.

Configured connection pools and security for the server.

Environment: TIBCO BW 5.3, TIBCO Administrator 5.0, TIBCO Hawk 4.5, TIBCO adapters, JSP 2.0, JavaScript, WebLogic 9.1, JDK 1.4/1.5, Eclipse 3.0, Unix

Client: MphasiS, India Jun '04 – May '06

Role: Developer

Responsibilities:

Designed and implemented front-end logic and validations using Core Java.

Developed server-side programming using Servlets.

Developed the front end using JSP and HTML.

Established JDBC API calls.

Worked on Oracle database design.

Prepared test plans and test cases for various modules.

Provided production support after deployment.

Worked on the design of the application using UML/Rational Rose.

Designed, developed and implemented various interfaces.

Provided extended support for Go-Live and production support.


