
Data Developer

Location:
Dallas, TX
Posted:
January 19, 2018

Resume:

Name: MAGUS “GUS” CHIHABA

Skills: Hadoop Developer

Objectives:

Detail-oriented analytical thinker and complex problem-solver with fifteen-plus years in telecommunications and software development, three-plus years of hands-on big data expertise, a strong technical background, and a recent degree emphasis in mathematics, statistics, and data analytics, seeking a position applying Big Data analytics with MapReduce and Apache Hadoop tools to transform raw data into information, insight, and business decisions.

SUMMARY:

Three-plus years of hands-on expertise in big data, analytics, and cloud computing (Hadoop ecosystem, Java, MapReduce, Hive, Impala, Pig, Oozie, HBase, Cassandra, Sqoop, ZooKeeper, Redshift, AWS).

Two years’ experience installing, configuring, deploying and testing Hadoop ecosystem components using Cloudera Manager.

Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.

Excellent working knowledge of RDBMS and NoSQL databases (MS SQL Server, Cassandra, and MongoDB), as well as Apache Mahout.

Experience developing data transformation and loading workflows using advanced data visualization tools.

Clear understanding of Hadoop architecture and its components, including HDFS, the JobTracker and TaskTrackers, NameNode and DataNodes, the Secondary NameNode, and MapReduce programming (a minimal word-count sketch follows this summary).

Familiar with Spark, Spark SQL, Python, and Flume, and willing to learn new technologies.

Strong leadership, excellent mentoring and superb interpersonal skills
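
The summary above mentions MapReduce programming on HDFS. The sketch below is a minimal, generic word-count job in Java, included only to illustrate the mapper/reducer model; the class names and the input/output paths are placeholders, not artifacts of any project listed on this resume.

// Minimal, generic MapReduce word-count sketch (illustrative only).
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in the input split read from HDFS.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: sums the counts for each word after the shuffle/sort phase.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

Packaged into a jar, a job like this is typically submitted with hadoop jar wordcount.jar WordCount <hdfs-input> <hdfs-output>.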

TECHNICAL SKILLS:

Languages: Visual Studio .NET, VBA, VB, C#, C/C++, R, Java and MATLAB

Packages: Excel 2013, Access 2013, PowerPoint 2013, SQL and SAS

Scripts: JavaScript, HTML5, CSS3, XML, Unix Shell Scripting

Operating Systems: Windows (all versions), Linux (Ubuntu 16.04), Mac OS X

Databases: Microsoft SQL Server (via SSMS), MySQL, MongoDB, HBase

EXPERIENCE

Client: Bank of America, Addison, TX September 2017 – December 2017

Role: Big Data Applications Developer

Data Reports Migration (from Platfora to Arcadia):

Imported tables from MySQL and Oracle databases into HDFS using Sqoop

Pre-processed the imported data with Hive and Impala (a minimal Hive JDBC sketch follows this project's bullets)

Developed data models using advanced big data visualization tools: Platfora and Arcadia

Learning Tableau, Informatica and Ab Initio
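
A minimal sketch of the Hive pre-processing step mentioned above, done from Java over JDBC. It assumes a HiveServer2 endpoint; the host, database, table (raw_accounts), and column names are hypothetical placeholders, not actual client assets.

// Illustrative only: query a Hive table over JDBC, assuming HiveServer2 at hive-host:10000.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HivePreprocessSketch {
  public static void main(String[] args) throws Exception {
    // HiveServer2 JDBC driver (requires the hive-jdbc dependency on the classpath).
    Class.forName("org.apache.hive.jdbc.HiveDriver");

    try (Connection con = DriverManager.getConnection(
            "jdbc:hive2://hive-host:10000/default", "user", "");
         Statement stmt = con.createStatement()) {

      // Example pre-processing step: filter and aggregate raw records
      // that were previously imported into HDFS with Sqoop.
      ResultSet rs = stmt.executeQuery(
          "SELECT account_type, COUNT(*) AS cnt "
        + "FROM raw_accounts WHERE load_date = '2017-10-01' "
        + "GROUP BY account_type");

      while (rs.next()) {
        System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
      }
    }
  }
}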

Client: P&G, Cincinnati, OH June 2014 – September 2017

Role: Hadoop Developer/Administrator

Hadoop / Big Data (Big Data Platform):

Developed data pipelines that run seamlessly on Hadoop clusters using Pig, Hive, and Impala

Implemented Hadoop-based data warehouses and integrated Hadoop with enterprise data warehouse systems

Expertise in data ingestion using Sqoop or Hue

Excellent data modeling skills using Platfora, Arcadia, Tableau, Informatica, Ab Initio, UDFs, and Hive scripts (a minimal Hive UDF sketch follows this section)

Designed, developed, tested, tuned, and built a large-scale data processing system for data ingestion and data products serving both operational applications and analytical needs

Troubleshot and developed on Hadoop technologies including HDFS, Hive, Pig, Flume, HBase, Spark, Storm, Impala, and Hadoop ETL

Proficient in Linux scripting and in managing Hadoop administration rights and access

Expert knowledge of Hadoop/HDFS, MapReduce, HBase, Pig, Impala, Sqoop, Amazon Elastic MapReduce (EMR), Amazon EC2, and Cloudera Manager
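
As a sketch of the Hive UDF work referenced above, a minimal user-defined function in Java is shown below. The function name and masking behavior are invented for illustration; it follows the classic org.apache.hadoop.hive.ql.exec.UDF pattern, in which Hive locates the evaluate() method by reflection.

// Illustrative Hive UDF: masks all but the last four characters of a string.
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MaskAllButLastFour extends UDF {

  // Hive calls evaluate() once per row; returning null propagates SQL NULL.
  public Text evaluate(Text input) {
    if (input == null) {
      return null;
    }
    String s = input.toString();
    if (s.length() <= 4) {
      return new Text(s);
    }
    StringBuilder masked = new StringBuilder();
    for (int i = 0; i < s.length() - 4; i++) {
      masked.append('*');
    }
    masked.append(s.substring(s.length() - 4));
    return new Text(masked.toString());
  }
}

After packaging the class into a jar and adding it with ADD JAR, the function can be registered with CREATE TEMPORARY FUNCTION and called from Hive scripts like a built-in.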

Client: CTDI, Flower Mound, TX and South Bend, IN June 2004 – May 2014

Role: Java Developer/Engineer

Designed, developed, tested, supported and deployed desktop, custom web, and mobile applications for departmental projects within an Agile environment.

Developed software, architecture, specifications, and technical platforms using C# (.NET) and Java (Spring Tool Suite and Eclipse).

Developed automation scripts to support testing of telecommunications equipment

Integrated application code with database functions through a Java REST API (a minimal JAX-RS sketch follows this role's bullets)

Designed and deployed an application that implemented statistical/predictive models and cutting-edge algorithms to measure vehicle safety and depreciation.

Led the migration of monthly statements from a UNIX platform to an MVC web-based Windows application using Java.

Built sustainable and modular automation solutions using agile and continuous delivery best practices.

Very good knowledge of OOP and OOAD concepts since 2008
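
A minimal sketch of the kind of Java REST-to-database integration described above, using JAX-RS annotations. The resource path, table, and column names are hypothetical, and it assumes a JAX-RS runtime (such as Jersey) with a JSON provider and a configured JDBC DataSource.

// Illustrative JAX-RS resource that reads rows from a database and returns them as JSON.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.ArrayList;
import java.util.List;

import javax.sql.DataSource;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/units")
public class RepairUnitResource {

  // In a real deployment the DataSource would be injected or looked up via JNDI;
  // it is passed in here only to keep the sketch self-contained.
  private final DataSource dataSource;

  public RepairUnitResource(DataSource dataSource) {
    this.dataSource = dataSource;
  }

  // GET /units/{serial} returns the recorded test results for one unit as JSON.
  @GET
  @Path("/{serial}")
  @Produces(MediaType.APPLICATION_JSON)
  public List<String> testResults(@PathParam("serial") String serial) throws Exception {
    List<String> results = new ArrayList<>();
    try (Connection con = dataSource.getConnection();
         PreparedStatement ps = con.prepareStatement(
             "SELECT result FROM test_results WHERE serial_no = ?")) {
      ps.setString(1, serial);
      try (ResultSet rs = ps.executeQuery()) {
        while (rs.next()) {
          results.add(rs.getString("result"));
        }
      }
    }
    return results;
  }
}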

Role: Data Analyst/Repair Supervisor

Turned around loss-making projects to profitability and kept them above 60%

Developed custom financial and auditing reports using Crystal Reports with T-SQL, MS Excel and Access.

Analyzed large datasets to provide strategic direction to the company.

Performed quantitative analysis of product sales trends to recommend pricing decisions.

Conducted cost-benefit analyses of new ideas.

Scrutinized and tracked customer behavior to identify trends and unmet needs.

Assisted in developing internal tools for data analysis.

Implemented process improvements to increase efficiency while utilizing limited operational resources

Effectively recruited, selected, trained, and assigned personnel

Efficiently scheduled projects and assignments

Role: Lead Electronics Technician

Installed, tested, turned on, modified, upgraded and debugged telecommunication equipment

Adapted to rapid change and possessed the determination and focus to get the job done while meeting assigned schedules and adhering to strict technical standards

EDUCATION AND ACTIVITIES:

University of Texas at Dallas

BS in Actuarial Science (2014)

1st Place, Phi Beta Lambda Leadership Conference: Statistical Analysis, State of Texas, March 2013

1st Place, Phi Beta Lambda Leadership Conference: Telecommunications, State of Texas, March 2013

Semi-finalist, UTD Business Idea Competition, November 2012


