
Data Project

Location:
Bulandshahr, UP, 203001, India
Posted:
July 19, 2015

Contact this candidate

Resume:

AMIT SINGH Mobile: +91-991*******

Technical Architect - Big Data/Hadoop acqteb@r.postjobfree.com

Summary

* ***** ** ******** *********** experience in domains like Life Insurance, Telecommunications, Platform development, Online portal.

Cloudera Certified Hadoop Developer with good experience in Hadoop 2.4, MapReduce, YARN, HDFS, and ecosystem components, namely Pig, Hive, Sqoop 2, and Flume.

Hands-on experience with the Lambda Architecture and streaming/messaging frameworks such as Kafka, Spark, Confluent, and Camus.

Hands-on experience with NoSQL databases such as MongoDB and Cassandra.

Hands-on experience with the Tableau Public BI tool.

Working knowledge of enterprise search technologies such as Apache Solr, Elasticsearch, and Kibana.

Good experience in Core Java, JSP, Servlets, SOA architecture, Spring DI, Spring Core, Spring MVC, Liferay, and middleware technologies.

Led teams as Scrum Master; well versed in Agile methodologies.

Design and development experience with the EAI middleware suite Sun Java CAPS 5.1.3 and 6.2.

Technically adept and confident software programmer with strong coding skills.

Able to translate business requirements into technical specifications.

Experience across all phases of the project lifecycle: requirement analysis, functional specifications, design specifications, coding, testing, implementation, and support.

Client-site experience with Aegon (United States) and Safaricom (Kenya).

Technical Proficiency

Languages: Java

Big Data Technologies: Hadoop 2.4, HDFS, Pig, Hive, Sqoop 2, YARN, MapReduce, Kafka, Camus, Lambda Architecture

NoSQL Databases: MongoDB, Cassandra

BI Tool: Tableau Public

Frameworks: Spring, Liferay, Struts 1.2, Hibernate

Web Technologies: XML, HTML, JavaScript, Servlets, JSP

Portal: Liferay Portal

Application Servers: JBoss, WebSphere, WebLogic, Tomcat

Continuous Integration: Jenkins, Hudson, Nexus

Tools: Eclipse, Apache Ant, Maven, JIRA, Crucible

Configuration Management: SVN, PVCS, VSS

Visa Status

U.S.A. B1 visa, valid until November 2020.

Education

Bachelor of Technology in Computer Engineering from F/O Engg. & Technology, Jamia Millia Islamia, New Delhi 2004 – 2008

Intermediate from DAV Public School (CBSE Board), Meerut

High School from DAV Public School (CBSE Board), Meerut

Employment

MagicBricks, Noida as a Technical Architect – Big Data/Hadoop (November 2014 – present)

Project: PropIndex Automation using Hadoop, Map Reduce, Sqoop 2, Spring Batch, MongoDB, Tableau Public, DB2.

Client: MagicBricks

Duration: November 2014–present

Description: As part of www.magicbricks.com, a Property Index report is generated from data collected on user activity on the website. The report was produced quarterly and required manual effort: a DBA extracted the data from DB2 tables into Excel and handed it to the content and research team, who applied Excel formulas and generated a static PDF.

I used Hadoop for batch processing, MapReduce for the computations, MongoDB for storing pre-computed results, Sqoop 2 for exporting data from DB2 to HDFS, and Tableau Public for generating graphs with MongoDB as the data source.
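The shape of the batch computation can be sketched in plain Java (field names and data are hypothetical; the real job ran as Hadoop MapReduce over input Sqoop-exported from DB2, with results landing in MongoDB):

```java
import java.util.*;
import java.util.stream.*;

// Simplified stand-in for the PropIndex batch job: "map" emits
// (locality, price) pairs from listings, "reduce" averages per locality.
// Listing fields and values are illustrative, not the real schema.
public class PropIndexSketch {

    record Listing(String locality, double price) {}

    static Map<String, Double> averagePriceByLocality(List<Listing> listings) {
        return listings.stream().collect(Collectors.groupingBy(
                Listing::locality,                       // the "map"/shuffle key
                Collectors.averagingDouble(Listing::price))); // the "reduce"
    }

    public static void main(String[] args) {
        List<Listing> input = List.of(
                new Listing("Noida", 4200),
                new Listing("Noida", 4800),
                new Listing("Gurgaon", 6000));
        System.out.println(averagePriceByLocality(input));
    }
}
```

In the Hadoop version the same grouping happens in the shuffle phase, with the averaging done in the reducer.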

MagicBricks, Noida as a Technical Architect – Big Data/Hadoop (May 2015 – present)

Project: Data lake creation with all data sources using Kafka, Confluent, Camus, Hadoop.

Client: MagicBricks

Duration: May 2015–present

Description: Within MagicBricks there are many data sources: every user activity is tracked and dumped into DB2, along with tracking and alert data, BOT-identification data, transactional data, etc. The data is maintained in various repositories such as DB2 and MongoDB, and old data is purged once it crosses threshold limits.

I designed an architecture using Kafka (distributed messaging), Camus (ETL from Kafka topics to Hadoop), and HDFS, and successfully transported one such data source, the BOT traffic data, directly into the data lake.
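The flow can be illustrated with a small simulation (a BlockingQueue stands in for a Kafka topic, and a periodic batch drain stands in for Camus's topic-to-HDFS pull; all names are illustrative, not the production code):

```java
import java.util.*;
import java.util.concurrent.*;

// Toy illustration of the Kafka -> Camus -> HDFS flow: producers publish
// events to a topic (here a BlockingQueue), and a batch job periodically
// drains the topic and appends the batch to the "data lake" (here a list
// of batches standing in for HDFS files).
public class DataLakeSketch {

    private final BlockingQueue<String> botTrafficTopic = new LinkedBlockingQueue<>();
    private final List<List<String>> dataLake = new ArrayList<>();

    // Producer side: an application publishes a BOT-traffic event.
    void publish(String event) {
        botTrafficTopic.add(event);
    }

    // "Camus" side: drain whatever is currently on the topic into one batch.
    List<String> runBatchPull() {
        List<String> batch = new ArrayList<>();
        botTrafficTopic.drainTo(batch);
        if (!batch.isEmpty()) dataLake.add(batch);
        return batch;
    }

    List<List<String>> lakeContents() { return dataLake; }

    public static void main(String[] args) {
        DataLakeSketch pipeline = new DataLakeSketch();
        pipeline.publish("bot-hit:/listing/123");
        pipeline.publish("bot-hit:/search?q=noida");
        pipeline.runBatchPull();
        System.out.println(pipeline.lakeContents());
    }
}
```

The key property this mirrors is decoupling: producers keep publishing regardless of when the batch pull runs, so the source systems never block on the lake.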

Pitney Bowes Software, Noida as a Senior Software Engineer (October 2011 – August 2014)

1. Project: Keystone Platform – Security

Client: Different Products

Duration: February 2013-August 2013

Role: Understand the Liferay auto-login framework to extend single sign-on capabilities, and the Liferay service architecture to expose API capabilities as web services.

Tech Stack: Java, Liferay Autologin Framework, Liferay Service Builder, Spring, JBoss 7, MySQL, Maven

Description:

The Keystone platform provides different modules and integration points for different products. This project addresses the security aspect of the platform, using the Liferay auto-login framework and Service Builder module for SSO and LDAP to provide a generic security component for the products.

Responsibilities:

Analysis of the requirements to define the functional specifications.

Development of the security module using Liferay's Service Builder and auto-login frameworks.

Definition of the deployment process across platforms, including application servers and databases.
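The auto-login idea behind the security module can be sketched as follows (token header name, token store, and user ids are hypothetical; the real module used Liferay's AutoLogin hook backed by LDAP via Service Builder services):

```java
import java.util.*;

// Simplified sketch of an auto-login check: a filter inspects the request
// for a trusted SSO token and, if valid, resolves the user without a
// manual login. "X-SSO-Token" and the token map are illustrative only.
public class AutoLoginSketch {

    static final Map<String, String> SSO_TOKENS = Map.of("tok-123", "jdoe");

    // Returns the user id for a valid token, or empty for anonymous access.
    static Optional<String> autoLogin(Map<String, String> requestHeaders) {
        String token = requestHeaders.get("X-SSO-Token");
        return Optional.ofNullable(token == null ? null : SSO_TOKENS.get(token));
    }

    public static void main(String[] args) {
        System.out.println(autoLogin(Map.of("X-SSO-Token", "tok-123"))); // Optional[jdoe]
        System.out.println(autoLogin(Map.of()));                         // Optional.empty
    }
}
```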

2. Project: Keystone Platform

Client: Internal Products

Duration: April 2012-January 2013

Role: Execute POCs and suggest approaches for various aspects of the platform: portal (Liferay), jBPM designer integration with the portal, and identity management (OpenAM).

Tech Stack: Apache Hadoop, Cassandra, Core Java, Liferay, Alloy UI, OSGI, Maven, JBoss

Description:

Many of the company's products can be orchestrated together to unlock new business opportunities, but they run on different technologies. A platform therefore needed to be developed to provide integration capabilities across products, addressing common concerns such as logging, security, workflow, portal, and data services.

Responsibilities:

Execution of POCs related to workflow, portal, and data.

Suggestion of approaches based on the POC findings.

Active involvement with the architectural group in the technology selection process.

3. Project: eMessaging – Enhancements & Performance Improvement

Client: Santander, Citibank

Duration: October 2011-August 2014

Role: Development of integration module & performance improvement.

Tech Stack: Apache Hadoop, Cassandra, Core Java, Spring Core, MySQL, Tomcat

Description:

e-Messaging helps organizations maintain consistency and enhance personalization across multi-channel communication, including automated email and SMS texts. With this easily managed e-Messaging, companies can vastly improve call-center response to email and text communications.

Responsibilities:

Evaluation of the product on Hadoop and NoSQL technologies for performance bottlenecks, and generation of analytics to satisfy customer data-processing needs.

Certifying the product against new platforms.

Development of the new module using the existing base framework.

Computer Sciences Corporation, Noida as an Application Developer (July 2010 – October 2011)

1. Project: Integrated Competency Centre

Client: AEGON

Duration: July 2010-October 2011

Role: Development, Implementation

Tech Stack: Java, JCAPS 5.1.3, 6.2, XML, WSDL, SOAP, MQ, JMS

Description:

ICC is a Java CAPS-based implementation on versions 5.1.3 and 6.2, enabling AEGON's enterprise applications to talk to these interfaces and perform certain validations depending on the business requirement, e.g. agent validation and policy validation.

Responsibilities:

Development and testing of new interfaces created as per the business needs.

Integration of communication with various mainframe systems using the Java Message Service (JMS) to post messages to various types of queues and topics.
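The queue-versus-topic distinction involved above can be sketched in plain Java (a stand-in only; the real integration used a JMS provider's API, and the message names here are made up):

```java
import java.util.*;

// Plain-Java sketch of JMS delivery semantics: a queue delivers each
// message to exactly one consumer (point-to-point), while a topic
// delivers a copy to every subscriber (publish/subscribe).
public class MessagingSketch {

    // Point-to-point: one message, one receiver; the message is consumed.
    static String consumeFromQueue(Deque<String> queue) {
        return queue.pollFirst();
    }

    // Publish/subscribe: every subscriber gets its own copy of the message.
    static Map<String, String> publishToTopic(String msg, List<String> subscribers) {
        Map<String, String> delivered = new LinkedHashMap<>();
        for (String sub : subscribers) delivered.put(sub, msg);
        return delivered;
    }

    public static void main(String[] args) {
        Deque<String> queue = new ArrayDeque<>(List.of("policy-validation-request"));
        System.out.println(consumeFromQueue(queue));     // exactly-once hand-off
        System.out.println(publishToTopic("agent-updated",
                List.of("billing", "claims")));          // fan-out to all subscribers
    }
}
```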

Mahindra Comviva, Gurgaon as an Engineer (July 2008 – June 2010)

1. Project: PreTUPS

Client: Safaricom, Tigo

Duration: July 2008-June 2010

Role: Development and Support

Tech Stack: Core Java, Struts 1.2, PL/SQL, Oracle 10g, Tomcat, Eclipse, VSS

Description:

The PreTUPS system is a web-based application used for recharging and billing of prepaid subscribers. It enables the service provider to define the domains and hierarchy of the entire retailer chain. The e-recharge is integrated with Intelligent Networks (IN), USSD gateways, and POS (external gateways) in order to receive and process recharge requests; it is a mobile-to-mobile top-up system. PreTUPS also provides the operator's prepaid retailers with the ability to accept subscribers' postpaid bill payments, which increases the operator's collection points, since postpaid subscribers no longer have to seek drop boxes or pay by cheque to settle their bills.
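The core of a mobile-to-mobile top-up of the kind described above can be sketched as a balance transfer (account identifiers, amounts, and validation rules here are illustrative, not PreTUPS internals):

```java
import java.util.*;

// Toy sketch of a top-up transaction: debit the retailer's e-recharge
// balance, credit the subscriber, and reject the request when funds are
// insufficient. All accounts and amounts are made up for illustration.
public class TopUpSketch {

    final Map<String, Long> balances = new HashMap<>(); // account -> balance (minor units)

    boolean topUp(String retailer, String subscriber, long amount) {
        long retailerBal = balances.getOrDefault(retailer, 0L);
        if (amount <= 0 || retailerBal < amount) return false; // reject: invalid or insufficient
        balances.put(retailer, retailerBal - amount);          // debit retailer
        balances.merge(subscriber, amount, Long::sum);         // credit subscriber
        return true;
    }

    public static void main(String[] args) {
        TopUpSketch sys = new TopUpSketch();
        sys.balances.put("retailer-001", 10_000L);
        System.out.println(sys.topUp("retailer-001", "sub-254711111111", 2_500L));
        System.out.println(sys.balances);
    }
}
```

In the real system this debit/credit would be one atomic database transaction, since a crash between the two updates must not lose money.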

Responsibilities:

Design of HLDs and LLDs based on change requests or new feature requests.

Development, Unit Testing and Implementation along with support.

Environment Setup including Application Server setup and VSS related activities.

Database setup for the application.

Defect analysis and bug fixing.

Migration of the live environment to a new code base during scheduled downtime.

Trainings

Completed Scrum Master training programme at Pitney Bowes Software.

Completed Agile methodologies programme at Pitney Bowes Software.

Completed LOMA Insurance learning programme at CSC.

Personal Strengths

Good analytical skills and a quick learner.

Believe in teamwork and in working collaboratively to achieve the best output.

Process-oriented and hard-working.

Personal Profile

Father’s Name : Suraj Pal Singh

Date of Birth : 11 June 1985

Gender : Male

Marital Status : Married

Nationality : Indian

Languages : English, Hindi

Passport : Yes


