Post Job Free

Java Data

Location:
United States
Posted:
September 26, 2013

Contact this candidate

Resume:

Hyder Khan

*****@********.***

703-***-****

Professional Summary:

* 7+ years of experience with an emphasis on Big Data technologies and the development and design of Java-based enterprise applications.

* More than two years of experience in Hadoop development and five years in Java application development.

* Experience in installing, configuring, supporting, and managing Hadoop clusters.

* Helped set up standards and processes for Hadoop-based application design and implementation.

* Responsible for writing MapReduce programs.

* Implemented logical data models and interactions with HBase.

* Developed MapReduce jobs to automate the transfer of data from HBase.

* Performed data analysis using Hive and Pig.

* Loaded log data into HDFS using Flume.

* Gained good knowledge of creating strategies for handling risky transactions.

* Successfully loaded files to Hive and HDFS from MongoDB.

* Worked on installation and configuration in multiple environments.

* Documented and explained implemented processes and configurations during upgrades.

* Supported development, testing, and operations teams during new system deployments.

* Evaluated and proposed new tools and technologies to meet the needs of the organization.

* Experience in using Sqoop, ZooKeeper, and Cloudera Manager.

* Good knowledge of Hadoop cluster architecture and cluster monitoring.

* Experience in administering, installing, configuring, troubleshooting, securing, backing up, performance monitoring, and fine-tuning Red Hat Linux.

* Worked with debugging tools such as DTrace, truss, and top. Expert in setting up SSH, SCP, and SFTP connectivity between UNIX hosts.

* An excellent team player and self-starter with good communication skills and a proven ability to finish tasks before target deadlines.
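
The MapReduce programs mentioned above follow the standard map-then-shuffle-then-reduce pattern. As an illustrative sketch only (plain Java with no Hadoop dependencies; the class and method names are hypothetical, not taken from any project listed here), the core of a word-count-style job looks like:

```java
import java.util.*;

// Illustrative sketch of the map/reduce pattern behind a word-count job,
// written as plain Java so it runs without a Hadoop cluster. Names are
// hypothetical, not from the projects described in this resume.
public class WordCountSketch {

    // Map phase: emit a (word, 1) pair for every token in a line.
    public static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String token : line.toLowerCase().split("\\s+")) {
            if (!token.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(token, 1));
            }
        }
        return pairs;
    }

    // Reduce phase: sum the counts emitted for each distinct word.
    public static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : new String[] {"hadoop hdfs hive", "hive pig hive"}) {
            pairs.addAll(map(line));
        }
        System.out.println(reduce(pairs)); // {hadoop=1, hdfs=1, hive=3, pig=1}
    }
}
```

In a real job the Hadoop framework performs the shuffle and invokes reduce once per key across the cluster; this sketch simply mirrors that logic in memory.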

Technical Skills:

* Programming Languages : Java 1.4, C++, C, SQL, Pig Latin, PL/SQL.

* Java Technologies : JDBC.

* Frameworks : Jakarta Struts 1.1, JUnit, JTest, LDAP.

* Databases : Oracle 8i/9i, NoSQL (HBase), MySQL, MS SQL Server.

* IDEs & Utilities : Eclipse, JCreator, NetBeans.

* Web Dev. Technologies : HTML, XML.

* Protocols : TCP/IP, HTTP, and HTTPS.

* Operating Systems : Linux, Mac OS, Windows 98/2000/NT/XP.

* Hadoop Ecosystem : Hadoop MapReduce, Sqoop, Hive, Pig, HBase, HDFS, ZooKeeper, Lucene, Sun Grid Engine administration.

Education:

Bachelor's in Information Technology (IT), India.

Professional Experience:

Smith & Nephew Nov 2012 – Present

Memphis, TN

Role: Hadoop Developer

Responsibilities:

* Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.

* Responsible for building scalable distributed data solutions using Hadoop.

* Implemented a nine-node CDH3 Hadoop cluster on Red Hat Linux.

* Involved in loading data from the Linux file system to HDFS.

* Created HBase tables to store various data formats of PII data coming from different portfolios.

* Implemented test scripts to support test-driven development and continuous integration.

* Worked on tuning the performance of Pig queries.

* Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.

* Responsible for managing data coming from different sources.

* Involved in loading data from the UNIX file system to HDFS.

* Coordinated cluster services through ZooKeeper.

* Managed and reviewed Hadoop log files.

* Exported the analyzed data to relational databases using Sqoop for visualization and to generate reports for the BI team.

* Installed the Oozie workflow engine to run multiple Hive and Pig jobs.

* Analyzed large data sets to determine the optimal way to aggregate and report on them.

* Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.

Environment:

Hadoop, HDFS, Pig, Sqoop, HBase, shell scripting, Ubuntu, Red Hat Linux.
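
One bullet above mentions HBase tables storing PII data from different portfolios; a common design for such tables is a composite row key. The sketch below is plain Java with no HBase client dependency, and the field names and salting scheme are hypothetical examples, not the actual key design used in the project:

```java
// Sketch of a composite HBase row key combining a salt bucket (to spread
// writes across regions), a portfolio id, and a record id. The fields and
// bucket count are hypothetical; a real implementation would use the HBase
// client API to put rows under these keys.
public class RowKeySketch {

    static final int SALT_BUCKETS = 16;

    // Build a key of the form "<bucket>|<portfolio>|<recordId>".
    public static String rowKey(String portfolio, String recordId) {
        int bucket = Math.floorMod(recordId.hashCode(), SALT_BUCKETS);
        return String.format("%02d|%s|%s", bucket, portfolio, recordId);
    }

    public static void main(String[] args) {
        System.out.println(rowKey("cards", "txn-1001"));
    }
}
```

Salting the key prefix is one common way to avoid region hotspotting when record ids are monotonically increasing; the resume itself does not describe the key design that was actually used.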

Kohl's Feb 2011 – Oct 2012

Milwaukee, WI.

Role: Hadoop Developer

Responsibilities:

* Involved in review of functional and non-functional requirements.

* Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.

* Imported and exported data into HDFS and Hive using Sqoop.

* Defined job flows.

* Managed and reviewed Hadoop log files.

* Extracted files from CouchDB through Sqoop, placed them in HDFS, and processed them.

* Ran Hadoop streaming jobs to process terabytes of XML-format data.

* Loaded and transformed large sets of structured, semi-structured, and unstructured data.

* Responsible for managing data coming from different sources.

* Gained good experience with NoSQL databases.

* Involved in loading data from the UNIX file system to HDFS.

* Installed and configured Hive and wrote Hive UDFs.

* Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.

* Gained very good business knowledge of health insurance, claim processing, fraud-suspect identification, the appeals process, etc.

* Developed a custom FileSystem plug-in for Hadoop so it can access files on the data platform. This plug-in allows Hadoop MapReduce programs, HBase, Pig, and Hive to work unmodified and access files directly.

* Designed and implemented a MapReduce-based large-scale parallel relation-learning system.

* Extracted feeds from social media sites such as Facebook and Twitter using Python scripts.

* Set up and benchmarked Hadoop/HBase clusters for internal use.

* Set up a Hadoop cluster on Amazon EC2 using Apache Whirr for a proof of concept.

* Wrote a recommendation engine using Apache Mahout.

Environment: Java 6 (JDK 1.6), Eclipse, Oracle 11g/10g, Subversion, Hadoop (Hortonworks and Cloudera distributions), MapReduce, HDFS, Hive, HBase, Linux, DataStax, IBM DataStage 8.1, PL/SQL, SQL*Plus, Toad 9.6, Windows NT, UNIX shell scripting.
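
A Hive UDF like those mentioned above is essentially a class exposing an evaluate() method that Hive calls once per row. Sketched here as standalone Java (a real UDF would extend Hive's UDF base class, which requires the hive-exec dependency); the masking rule is a hypothetical example, not a UDF from the actual project:

```java
// Core logic of a simple Hive UDF that masks all but the last four
// characters of a value (e.g. for sensitive columns). In a real UDF this
// would be the evaluate() method of a class extending
// org.apache.hadoop.hive.ql.exec.UDF; the masking rule is hypothetical.
public class MaskUdfSketch {

    public static String evaluate(String value) {
        if (value == null || value.length() <= 4) {
            return value; // too short to mask meaningfully
        }
        int keep = 4;
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < value.length() - keep; i++) {
            masked.append('*');
        }
        return masked.append(value.substring(value.length() - keep)).toString();
    }

    public static void main(String[] args) {
        System.out.println(evaluate("4111111111111111")); // ************1111
    }
}
```

Once packaged in a jar and registered (ADD JAR / CREATE TEMPORARY FUNCTION), such a UDF can be called from HiveQL like any built-in function.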

Boston Scientific, Nov 2010 – Jan 2011

Natick, MA.

Role: Hadoop and Java Developer

Responsibilities:

* Worked with several clients with day to day requests and responsibilities.

* Installed, configured, and maintained Apache Hadoop clusters for application development, along with Hadoop tools such as Hive, Pig, HBase, ZooKeeper, and Sqoop.

* Involved in analyzing system failures, identifying root causes, and recommending courses of action.

* Worked on Hive to expose data for further analysis and to transform files from different analytical formats into text files.

* Wrote shell scripts to monitor the health of Hadoop daemon services and respond to any warning or failure conditions.

* Managed and scheduled jobs on a Hadoop cluster.

* Utilized Java and MySQL daily to debug and fix issues with client processes.

* Developed, tested, and implemented a financial-services application to bring multiple clients into a standard database format.

* Assisted in designing, building, and maintaining a database to analyze the life cycle of checking and debit transactions.

* Excellent Java/J2EE application development skills with strong experience in object-oriented analysis; extensively involved throughout the Software Development Life Cycle (SDLC).

* Strong experience with J2SE, XML, Web services, WSDL, SOAP, UDDI, and TCP/IP.

* Strong experience in software and system development using JSP, Servlets, JavaServer Faces, EJB, JDBC, JNDI, Struts, Maven, Trac, Subversion, JUnit, and SQL.

* Rich experience in database design and hands-on experience with large database systems: Oracle 8i/9i, DB2, and PL/SQL.

* Hands-on experience with Sun ONE Application Server, WebLogic Application Server, WebSphere Application Server, WebSphere Portal Server, and J2EE application deployment technology.

Environment: Hive, Pig, HBase, ZooKeeper, Sqoop, Java, JDBC, JNDI, Struts, Maven, Trac, Subversion, JUnit, SQL, Spring, Hibernate, Oracle, XML, Altova XMLSpy, PuTTY, and Eclipse.
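
The daemon health checks described above typically boil down to parsing a process listing (e.g. jps output) for the expected Hadoop daemons. A plain-Java sketch of that check follows; the daemon list is illustrative, and the actual monitoring in this role was done with shell scripts:

```java
import java.util.*;

// Sketch of a Hadoop daemon health check: given jps-style output
// ("<pid> <MainClass>" per line), report which expected daemons are
// missing. The expected-daemon list is an illustrative assumption.
public class DaemonCheckSketch {

    public static List<String> missingDaemons(String jpsOutput, List<String> expected) {
        Set<String> running = new HashSet<>();
        for (String line : jpsOutput.split("\n")) {
            String[] parts = line.trim().split("\\s+");
            if (parts.length >= 2) {
                running.add(parts[1]); // main class name after the pid
            }
        }
        List<String> missing = new ArrayList<>();
        for (String daemon : expected) {
            if (!running.contains(daemon)) {
                missing.add(daemon);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        String jps = "2101 NameNode\n2398 DataNode\n2710 Jps";
        System.out.println(missingDaemons(jps,
                Arrays.asList("NameNode", "DataNode", "SecondaryNameNode")));
        // [SecondaryNameNode]
    }
}
```

A monitoring wrapper would run this periodically and raise an alert (or restart the daemon) whenever the missing list is non-empty.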

Pfizer Global Research & Development, Nov 2007 – Oct 2010

New York, NY.

Role: Java/JEE Architect/developer

Responsibilities:

* Architected a 24x7 Web application based on JSF, WebSphere, Oracle, Spring, and Hibernate.

* Built an end-to-end vertical slice for a JEE-based billing application using popular frameworks such as Spring, Hibernate, JSF, Facelets, XHTML, Maven2, and Ajax, applying OO design concepts, JEE and GoF design patterns, and best practices.

* Integrated other subsystems, such as the loans application, the equity markets online application system, and the documentation system, with the structured-products application through JMS, WebSphere MQ, SOAP-based Web services, and XML.

* Designed the logical and physical data models, generated DDL scripts, and wrote DML scripts for an Oracle 9i database.

* Tuned SQL statements, Hibernate mappings, and the WebSphere application server to improve performance, consequently meeting the SLAs.

* Gathered business requirements and wrote functional specifications and detailed design documents.

* Improved the build process by migrating it from Ant to Maven2.

* Built and deployed Java applications into multiple Unix based environments and produced both unit and functional test results along with release notes.

Environment: Java 1.5, JSF (Sun RI), Facelets, Ajax4jsf, RichFaces, Spring, XML, XSL, XSD, XHTML, Hibernate, Oracle 9i, PL/SQL, MINA, Spring-WS, SOAP Web services, WebSphere, JMX, Ant, Maven2, Continuum, JUnit, SVN, TDD, and XP.

Nash Infotech July 2006 – Sep 2007

India

Role: Java/J2EE Developer (development focus on Java/J2EE-based applications)

Responsibilities:

* Designed and developed a Struts-like MVC 2 Web framework using the front-controller design pattern, which has been used successfully in a number of production systems.

* Spearheaded the “Quick Wins” project, working closely with the business and end users to improve the website’s ranking from 23rd to 6th in just three months.

* Normalized Oracle database, conforming to design concepts and best practices.

* Resolved product complications at customer sites and funneled the insights to the development and deployment teams to adopt long term product development strategy with minimal roadblocks.

* Convinced business users and analysts to adopt alternative solutions that were more robust and simpler to implement from a technical perspective while still satisfying the functional requirements from the business perspective.

* Applied design patterns and OO design concepts to improve the existing Java/JEE based code base.

* Identified and fixed transactional issues due to incorrect exception handling and concurrency issues due to unsynchronized blocks of code.

Environment: Java 1.2/1.3, Swing, Applets, Servlets, JSP, custom tags, JNDI, JDBC, XML, XSL, DTD, HTML, CSS, JavaScript, Oracle, DB2, PL/SQL, WebLogic, JUnit, Log4j, and CVS.
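
The front-controller pattern mentioned above routes every request through a single dispatcher that looks up a handler by action name and delegates to it. A minimal plain-Java sketch (no servlet API; the action names and return strings are hypothetical, not from the actual framework):

```java
import java.util.*;
import java.util.function.Function;

// Minimal sketch of the front-controller design pattern: one dispatcher
// maps action names to handlers and delegates each request. In the real
// framework the dispatcher would be a servlet; names here are hypothetical.
public class FrontControllerSketch {

    private final Map<String, Function<String, String>> handlers = new HashMap<>();

    public void register(String action, Function<String, String> handler) {
        handlers.put(action, handler);
    }

    // Dispatch a request to the handler registered for its action.
    public String handle(String action, String payload) {
        Function<String, String> handler = handlers.get(action);
        if (handler == null) {
            return "404: no handler for " + action;
        }
        return handler.apply(payload);
    }

    public static void main(String[] args) {
        FrontControllerSketch controller = new FrontControllerSketch();
        controller.register("greet", name -> "Hello, " + name);
        System.out.println(controller.handle("greet", "world")); // Hello, world
        System.out.println(controller.handle("login", ""));      // 404: no handler for login
    }
}
```

Centralizing dispatch this way keeps cross-cutting concerns (logging, security checks, view selection) in one place instead of scattered across individual page controllers.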


