Senior Hadoop / ETL Developer

Location:
Jersey City, NJ, 07302
Posted:
May 29, 2017

Marimuthu Murugan

Employer details:

Ganesh

+1-516-***-****

acz9zc@r.postjobfree.com

PROFESSIONAL SYNOPSIS

More than 12 years of experience across the complete Software Development Life Cycle, including analysis/planning, design, development, testing, and implementation, in both technical and managerial roles in Information Technology.

Around 3 years of experience in Hadoop and 7 years of experience in ETL development.

Also 3 years of experience in Mainframe, Teradata, and testing.

Worked on various data warehousing projects using Informatica PowerCenter 9.6.1/9.1 and DataStage 8.5, along with Pig, Hive, Sqoop, HBase, and Cloudera Manager, importing and exporting data between HDFS and relational database systems using Sqoop.

Hands-on proficiency in HP ALM Quality Center 11.5, Informatica, Teradata, Mainframe, SQL Assistant, GUI, Workbench, Control M, Endevor, MS SQL Server, SQL Developer, and DB2 Studio.

Good knowledge of Python, SAS, Spark, and Agile/Scrum methodologies.

Worked at Computer Sciences Corporation (CSC) as a full-time employee for more than 12 years, in both onshore and offshore roles.

Involved in various phases of STLC activities, including requirements analysis, test planning, test execution, defect management, and reporting. Performed ETL, unit, functional, database, integration, smoke, and regression testing.

Core domain skills include Healthcare, Banking, Retail, Term Life, and P&C Insurance.

Experience in performance tuning at the mapping level, debugging existing ETL processes, and problem handling and troubleshooting.

Excellent interpersonal and communication skills; creative, research-minded, technically competent, and results-oriented, with problem-solving and leadership skills.

Technical Expertise

Primary Skills: Hadoop, Hive, Pig, Sqoop, Oozie, Informatica, DataStage
Secondary Skills: MapReduce, HBase, Flume, Spark, Insync, Unix, SQL Server, Oracle, Cloudera 4.3, Tableau, Scala, Python, SAS

Primary Skills: Mainframe, Teradata, MF COBOL
Secondary Skills: SQL Assistant, DB2 Studio, JCL, VSAM

Primary Skills: HP ALM Quality Center, QTP automation testing (functional)
Secondary Skills: Endevor, Mgen, SPUFI, Control M, SQL, GUI, Workbench

Education

Masters: Master of Business Administration (MBA – Systems), University of Madras, Tamil Nadu
Bachelors: Bachelor of Business Administration (BBA), University of Madras, Tamil Nadu
Diploma: Diploma in Computer Technology (D.C. Tech), Valivalam Desikar Polytechnic, Nagapattinam, Tamil Nadu

Professional Activities, Certifications & Training

Teradata Certification (V2R5) – Teradata Corporation
LOMA Certification – Life Office Management Association (LOMA)
NCFM (NSE Certification in Financial Markets) – NSE India
PMP/ASSET/SPARC

PROJECT EXPERIENCE SUMMARY:

Project Title – 1: STATEFARM – TIMECARD
Client: State Farm Insurance, USA (Bloomington, IL)
Role: Senior Hadoop / ETL Developer
Duration: Jan 2013 – till date
Cluster: 20 nodes

Project Description:

State Farm is a group of insurance and financial services companies in the United States. The group's main business is State Farm Mutual Automobile Insurance Company, a mutual insurance firm that also owns the other State Farm companies. The corporate headquarters are in Bloomington, Illinois.

Roles and Responsibilities:

Involved in design, development, integration, deployment/production support, and other technical aspects of developing and modifying the applications.

Worked with Project Managers, Architects, Designers, Analysts, Programmers, System Analysts, and Testers to ensure development requirements are properly documented per SDLC standards.

Created Informatica source-to-target mappings using different transformations to implement business rules and fulfill the data integration requirements.

Prepared DML to perform audit & error handling.

Provided project estimates, coordinated the development efforts and discussions with the business partners, updated status reports and handled logistics to enable smooth functioning of the project and meet all the deadlines.

Worked in SQL Developer, querying the source/target tables to validate the SQL and Lookup overrides.

Converted data integration models and other design specifications to Informatica source code.

Worked extensively on performance improvement & query tuning tasks in Oracle.
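
For illustration, tuning work of this kind usually starts from the execution plan; a minimal sketch in SQL*Plus (the table, column, and connection names are hypothetical, not from the project):

    # Capture and read the optimizer's plan for a slow query (Oracle)
    sqlplus -s etl_user/etl_pwd@DWPROD <<'EOF'
    EXPLAIN PLAN FOR
      SELECT policy_id, premium_amt
      FROM   policy_fact
      WHERE  policy_status = 'ACTIVE';

    -- A FULL TABLE SCAN in the plan output typically points at a missing
    -- index on the filter column or a predicate that defeats an index
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
    EOF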

Involved in providing user support for executing System Integration Testing and Acceptance Testing.

Performed unit testing and was responsible for system integration and regression testing until the code moved to production.

Installed and configured Hadoop MapReduce and HDFS.

Imported and exported data into HDFS and Hive using Sqoop.
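
A minimal sketch of such a transfer (the connection string, tables, and paths are illustrative, not from the actual project):

    # Pull a DB2 table into HDFS and register it in Hive
    sqoop import \
      --connect jdbc:db2://db2host:50000/CLAIMSDB \
      --username etl_user --password-file /user/etl/.db2pw \
      --table TIMECARD_ENTRIES \
      --target-dir /data/raw/timecard_entries \
      --hive-import --hive-table staging.timecard_raw \
      --num-mappers 4

    # Push aggregated results back to the relational side
    sqoop export \
      --connect jdbc:db2://db2host:50000/CLAIMSDB \
      --username etl_user --password-file /user/etl/.db2pw \
      --table TIMECARD_SUMMARY \
      --export-dir /data/out/timecard_summary \
      --input-fields-terminated-by '\t'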

Experienced in defining job flows and in managing and reviewing Hadoop log files.

Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
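
As a sketch, a streaming job over XML might be launched like this (the jar path varies by distribution, and the scripts and paths are hypothetical):

    # StreamXmlRecordReader splits input on the given begin/end tags, so each
    # <record>...</record> element arrives at the mapper as one record
    hadoop jar /usr/lib/hadoop-mapreduce/hadoop-streaming.jar \
      -inputreader "StreamXmlRecordReader,begin=<record>,end=</record>" \
      -input  /data/raw/xml/ \
      -output /data/out/xml_parsed \
      -mapper parse_record.py \
      -reducer aggregate_counts.py \
      -file parse_record.py -file aggregate_counts.py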

Loaded and transformed large sets of structured, semi-structured, and unstructured data.

Responsible for managing data coming from different sources.

Supported MapReduce programs running on the cluster.

Installed and configured Hive, and wrote Hive UDFs.

Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
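
For example, a table definition and aggregation of this shape (the database, table, and column names are illustrative only):

    # External table over data already sitting in HDFS; the SELECT below is
    # compiled by Hive into one or more MapReduce jobs
    hive -e "
      CREATE EXTERNAL TABLE IF NOT EXISTS staging.timecard_raw (
        emp_id    STRING,
        work_date STRING,
        hours     DOUBLE
      )
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      STORED AS TEXTFILE
      LOCATION '/data/raw/timecard_entries';

      SELECT emp_id, SUM(hours) AS total_hours
      FROM   staging.timecard_raw
      GROUP  BY emp_id;
    "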

Working knowledge of modifying and executing UNIX shell scripts. Involved in web testing using SoapUI for different member and provider portals.

Involved in building and maintaining test plans, test suites, test cases, defects and test scripts

Conducted functional, system, data and regression testing.

Involved in Bug Review meetings and participated in weekly meetings with management team.

Environment: Hadoop, Cloudera 4.3, MapReduce, Hive, Pig, Sqoop, Oozie, Informatica 9.6, DB2 Studio, SQL, Unix

Project Title – 2: UNUM – Data Warehouse
Client: UNUM, USA (Portland, ME)
Role: Senior Developer
Duration: Apr 2010 – Jan 2013

Project Description: UNUM Group is a leading provider of financial protection benefits to assist working families through life’s challenges, such as disability insurance, life insurance, voluntary insurance, and long-term care insurance as part of employer group plans.

The team was responsible for development, enhancement, monitoring of the sub-production batch cycle, fixing abends within SLA, and ongoing activities across the UNUM applications, all of which fell under the responsibility of CCP-AMS.

Roles and Responsibilities:

•Coded Informatica mappings to load data into various layers such as Staging, Integration, and Presentation.

•Expertise in the ETL process using Informatica PowerCenter components: Mapping Designer, Workflow Manager, and Workflow Monitor.

•Expertise in transformations such as XML Parser, Source Qualifier, Aggregator, Filter, Joiner, Expression, Lookup, Sorter, Union, Router, and Sequence Generator, according to the business rules and technical specifications.

•Expertise in loading data using the Teradata loader connection (MultiLoad) and working with loader logs.

•Expertise in reading data from MQs (XML) and loading it into Teradata tables.

•Used the pmcmd command to start, stop, and abort workflows and sessions from UNIX (a sketch follows at the end of this project section).

•Used different tasks such as Session, Command, Decision, and Email.

•Proficient in understanding business processes/requirements and translating them into technical requirements, creating conceptual designs, developing and implementing software applications, and integrating new enhancements into existing systems.

•Updated numerous BTEQ/SQL scripts, made appropriate DDL changes, and completed unit and system tests (see the BTEQ step in the sketch below).

•Worked in SQL Developer, querying the source/target tables to validate the SQL and Lookup overrides.

•Converted data integration models and other design specifications to Informatica source code.

•Developed and maintained Informatica and Teradata objects. Involved in peer reviews.

Environment: Informatica 9.1, DB2, SQL, Teradata, UNIX, Control M, Windows XP
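
As noted in the bullets above, a sketch of chaining a BTEQ script and an Informatica workflow launch from UNIX; the service, domain, folder, workflow, and table names are hypothetical, and the pmcmd switches follow the standard startworkflow form:

    #!/bin/sh
    # Step 1: run a small BTEQ maintenance script against Teradata
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pwd;
    UPDATE stg.policy_stage
    SET    load_status = 'READY'
    WHERE  load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    [ $? -ne 0 ] && { echo "BTEQ step failed" >&2; exit 8; }

    # Step 2: start the Informatica workflow and wait for completion;
    # stopworkflow / abortworkflow take the same connection switches
    pmcmd startworkflow -sv INT_SVC_PROD -d DOM_PROD \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f UNUM_DW -wait wf_load_policy_dim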

Project Title – 3: Kaiser Permanente
Client: Kaiser – OneLink
Role: Developer
Duration: Mar 2008 – Mar 2010

Project Description: For more than 60 years, Kaiser Permanente has provided quality health care. Each entity of Kaiser Permanente has its own management and governance structure, although all of the structures are interdependent and cooperative to a great extent. There are multiple affiliated nonprofits registered with the U.S. Internal Revenue Service. The POU solution is part of SCTI and will enable accurate collection of medical/surgical utilization data to begin demand forecasting and improve end-to-end supply chain management.

Roles and Responsibilities:

•Understanding the complete requirements and business rules as given in the mapping specification document.

•Used DataStage for extracting, transforming, and loading data to the target.

•Extracted data using the Data Set stage and loaded it into DB2 relational database tables.

•Created parallel jobs using various stages, including ODBC Connector, Filter, Remove Duplicates, Sort, Transformer, Funnel, Lookup, Join, Pivot, Aggregator, and Sequential File.

•Used job parameters and parameter sets on different jobs, based on various business requirements, to make jobs more flexible.

•Involved in unit testing, integration testing, and defect fixing.

•Created sample data for testing the jobs initially.

•Used Quality Center for test and defect tracking, and fixed any defects found during system testing.

•Prepared extensive unit test cases based on the business logic and validation rules given in the mapping document.

•Wrote queries to test the functionality of the code during testing (a sketch follows at the end of this project section).

Environment: Datastage 8.5, DB2, SQL, UNIX
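
The test queries mentioned above were typically reconciliation checks; a sketch using the DB2 command line (the schema and table names are invented for illustration):

    # Compare source/staging and target row counts after a job run
    db2 connect to POUDB user etl_test using "$DB2_PWD"
    db2 "SELECT COUNT(*) FROM STG.SUPPLY_TXN"
    db2 "SELECT COUNT(*) FROM DWH.SUPPLY_TXN_FACT"

    # Sample any rows present in staging but missing from the target
    db2 "SELECT s.txn_id
         FROM   STG.SUPPLY_TXN s
         LEFT JOIN DWH.SUPPLY_TXN_FACT f ON f.txn_id = s.txn_id
         WHERE  f.txn_id IS NULL
         FETCH FIRST 10 ROWS ONLY"
    db2 connect reset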

Project Title – 4: Distribution Center
Client: Gap, USA
Role: Software Engineer
Duration: May 2006 – Feb 2008

Project Description: Distribution Centers are warehouses that receive the goods shipped by Gap's suppliers (referred to as vendors), stock them, and distribute them to the various stores they serve. Gap has several DCs across the USA, Europe, and Japan, servicing the 3,300+ stores spread across the USA, Canada, UK, France, Germany, Holland, and Japan. Each DC stocks and distributes goods to the stores under its service.

Roles and Responsibilities:

Attended to calls related to problems/abends in production jobs (the day-to-day problems of the business community are logged in an application called Service Center).

Communicated with business partners to understand the problems and clarify their queries.

Analyzed and resolved problems within the stipulated time.

Made temporary or urgent fixes when required and moved them to production.

Analyzed data and production problems.

Environment: Mainframe, Teradata, MF-COBOL, SQL Assistant, Control M, Easytrieve, SAS

Project Title – 5: Diamond (Personal LOB), USA (Cincinnati)
Client: Cincinnati Insurance Company, USA
Role: Test Engineer

Project Description: The Cincinnati Insurance Companies (CIC) use Diamond for personal lines insurance administration. This application is governed, designed, and maintained by the Personal Lines Department of CFC. Diamond is a web-based application that supports policy processing functions for the Personal Auto, Homeowners, Dwelling Fire, Personal Articles, Umbrella, and Watercraft lines of business. Diamond also provides the ability to process billing and invoicing activities and to interface with a third-party application for client validation.

Roles and Responsibilities:

•Requirements study and understanding the functionalities

•Involved in Requirement Analysis, Test coverage Analysis, Test case Design, Review & Execution and Defect Management and Reporting

•Preparing the Software QA Test plan

•Allocating tasks to the team members, ensuring that all of them have sufficient work, and tracking the progress of work

•Conducting and Participating in Daily Team Meeting and Planning Meeting

•Keeping track of the new requirements from the Project

•Tracking and reporting the testing activities, including testing results, test case coverage, required resources, defects discovered and their status etc.

•Interacting with the Dev team and BAs for any pending clarifications

•Reviewing the complex requirements and helping the team in understanding the requirements

•Providing Test estimates and review QA efforts

•Logging / verifying the defects in Quality Center - ALM

•Using the Dashboard in the Quality Center - ALM and generating the Reports

•Participating in the Client calls, Project meetings and Team meetings

•Sending Daily and Weekly status reports to client

•Ensuring to provide Quality, on time delivery of agreed deliverables

•Giving knowledge transfer (KT) and training sessions to new joiners

•Submitting Test Completion report at the end of Test execution phase

Environment: HP Quality Center, QTP, DOORS, SQL, Informatica


