Post Job Free

Developer Analyst

Location:
New South Wales, Australia
Posted:
March 11, 2016

Contact this candidate

Resume:

Marcelo Nagy

Address: unit **/** Darley St, Sydney, NSW 2103

Mobile: 046*-***-***

Email: actv5y@r.postjobfree.com

Summary of Qualifications

Ability to work in groups, team leadership skills, adaptable to change, and good at facing challenges. 6 years' experience in web analysis/development, 2 years as a configuration analyst (version control), 1 year of mobile development, and 3 years working with Hadoop technology.

Almost 3 years working at an online advertising company as a back-end programmer. I was responsible for the contextual URLs system, developed MapReduce jobs to process our campaign logs, and also worked on the advertising systems when needed.

I have been a Unix user since 2008.

I have worked with SVN and Git as version control systems.

Since 2011 I have been studying Big Data technologies such as the Hadoop ecosystem.

Since 07/2014 I have worked as a systems engineer at Semantix Brazil, planning and implementing solutions (POCs) using Hadoop and Solr.

Employment History – Brazil

Semantix Brazil (Cloudera partner)

Position: Systems Engineer – I was responsible for choosing the technologies and implementing the solutions for customers.

Time period: 07/2014 to 12/2015.

Duties: Planning and implementing Big Data proof-of-concept (POC) solutions using Java, Hadoop (MapReduce, Pig, Spark, Impala, Hive, Sqoop, Flume and HBase) and Solr. After moving to Australia, I have continued to provide support when requested.

Projects

Itau Bank

Installation and configuration of a Hadoop cluster.

Development of a MapReduce indexer for Solr, and a Java API to query those records. (Java MapReduce, Solr, HDFS)

The indexer was used for daily insertion of terabytes of bank statements.

Java MapReduce – used to implement the indexer.

Java – API for querying the records in Solr.

Solr – NoSQL database on top of HDFS.

HDFS – Hadoop's file system.
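The grouping step behind such an indexer can be sketched in plain Java. This is a simplified, stdlib-only illustration of the map/group logic (the real job ran on Hadoop and wrote documents to Solr); the `accountId;record` line format and field names are invented for the example, not the bank's schema.

```java
import java.util.*;
import java.util.stream.*;

// Stdlib-only sketch of a MapReduce indexer's core logic: a "map" step parses
// each raw statement line into a (key, value) pair, and a "reduce" step groups
// values by key, as the Hadoop shuffle would before indexing into Solr.
public class StatementIndexerSketch {

    // "Map" step: parse one raw line, e.g. "12345;2015-03-01 deposit 100.00",
    // into (accountId, statement record).
    static Map.Entry<String, String> map(String rawLine) {
        String[] parts = rawLine.split(";", 2);
        return Map.entry(parts[0], parts[1]);
    }

    // "Reduce" step: group all statement records by account id.
    static Map<String, List<String>> reduce(List<String> rawLines) {
        return rawLines.stream()
                .map(StatementIndexerSketch::map)
                .collect(Collectors.groupingBy(
                        Map.Entry::getKey,
                        Collectors.mapping(Map.Entry::getValue, Collectors.toList())));
    }

    public static void main(String[] args) {
        List<String> lines = List.of(
                "12345;2015-03-01 deposit 100.00",
                "12345;2015-03-02 withdrawal 40.00",
                "99999;2015-03-01 deposit 10.00");
        System.out.println(reduce(lines).get("12345").size()); // 2
    }
}
```

In the real pipeline each reducer's output would be turned into Solr documents rather than kept in memory; the sketch only shows the shuffle-and-group shape of the job.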

Porto Seguro

Installation and configuration of a Hadoop cluster.

Development of a data hub for merging information from different data sets.

(Pig, Hive, Impala, Sqoop, Flume, Spark, HDFS)

Pig – used to perform the delta insertion.

Hive and Impala – used as the interface with Tableau.

Sqoop – used for the first insertion, and for the delta insertion of the databases that had dateModified fields.

Flume – used to insert the logs of the GPS machines into the data hub.

Spark – used to process the web server logs and insert the results into the data hub.

HDFS – Hadoop's file system.
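The "delta insertion" idea above can be illustrated with a minimal sketch: after the first full load, only rows whose dateModified is newer than the previous import's high-water mark are pulled in (Sqoop automates this with its incremental mode). The Row type and its fields below are invented for illustration, not the customer's actual schema.

```java
import java.time.LocalDateTime;
import java.util.*;
import java.util.stream.Collectors;

// Stdlib-only sketch of an incremental (delta) import: keep only the rows
// changed since the last import ran, using dateModified as the high-water mark.
public class DeltaImportSketch {

    static class Row {
        final String id;
        final LocalDateTime dateModified;
        Row(String id, LocalDateTime dateModified) {
            this.id = id;
            this.dateModified = dateModified;
        }
    }

    // Select only the rows modified after the previous import's timestamp.
    static List<Row> delta(List<Row> table, LocalDateTime lastImport) {
        return table.stream()
                .filter(r -> r.dateModified.isAfter(lastImport))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Row> table = Arrays.asList(
                new Row("a", LocalDateTime.of(2015, 1, 10, 0, 0)),
                new Row("b", LocalDateTime.of(2015, 3, 5, 0, 0)));
        LocalDateTime lastImport = LocalDateTime.of(2015, 2, 1, 0, 0);
        System.out.println(delta(table, lastImport).size()); // 1: only "b" changed
    }
}
```

After each run, the newest dateModified seen becomes the next run's high-water mark, so each import only moves the changed rows into the data hub.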

VIVO

Installation and configuration of a Hadoop cluster.

Development of Spark jobs for processing logs.

(Spark, HDFS)

Spark – used to process the web server logs.

HDFS – Hadoop's file system.
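The per-line transformation in such a log-processing job can be sketched without Spark. This stdlib-only illustration extracts the HTTP status code from a common-log-format line and tallies codes across lines; in production the same logic would run as a Spark map over files in HDFS. The log format is the standard Apache access log, assumed here since the actual format is not stated.

```java
import java.util.*;
import java.util.regex.*;

// Stdlib-only sketch of web server log processing: parse the HTTP status code
// out of each common-log-format line, then count occurrences per status.
public class LogParseSketch {

    // Matches the request part and captures the 3-digit status that follows,
    // e.g. ... "GET /index.html HTTP/1.0" 200 2326
    private static final Pattern CLF = Pattern.compile("\"[A-Z]+ [^\"]+\" (\\d{3}) ");

    // Extract the status code from one log line, or -1 if the line is malformed.
    static int statusCode(String logLine) {
        Matcher m = CLF.matcher(logLine);
        return m.find() ? Integer.parseInt(m.group(1)) : -1;
    }

    // Count how many lines carried each status code.
    static Map<Integer, Long> countByStatus(List<String> lines) {
        Map<Integer, Long> counts = new TreeMap<>();
        for (String line : lines) {
            counts.merge(statusCode(line), 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String line = "127.0.0.1 - - [10/Oct/2015:13:55:36 -0700]"
                + " \"GET /index.html HTTP/1.0\" 200 2326";
        System.out.println(statusCode(line)); // 200
    }
}
```

In Spark the same two steps become a `map` (parse) followed by a key count, with the results written back to HDFS instead of returned in memory.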

HotWords

Position: Developer/Analyst

Time period: from 10/2011 to 06/2014.

Duties: Analysis/development of Java (J2EE) applications using NoSQL MongoDB, MySQL, JSON, Redis and Hadoop (MapReduce, Impala, Pig, Hive and HBase). At this online advertising company I worked as a back-end programmer. I was responsible for the URL contextualization system, developed MapReduce jobs to process our campaign logs, and also worked on the advertising systems when needed.

Projects

URL contextualization system – a Java system to identify the subject matter of a URL so that we could choose which advertising to place on each one.

I was the owner/maintainer of the project.

Technologies

Java

HTML parsing with Apache Tika.

JSON – to deal with MongoDB records.

Redis – used as a queue for the URLs to be processed, and also as a cache.

MySQL – used to keep the partners' (sites') information.

Log processing system – Java MapReduce/Pig processing of online campaign logs, used to account for partner (site) payments, plus a query interface built on Hive and Impala so that the partners and the company (HotWords) could follow how the campaigns were being delivered and recorded.
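The contextualization flow described above can be sketched in miniature: URLs wait in a queue (Redis in production, a plain Deque here), each page's extracted text is scored against per-topic keyword sets, and the best-scoring topic decides which advertising to serve. The topics, keywords and scoring rule below are invented for illustration; the production classifier is not described in this resume.

```java
import java.util.*;

// Heavily simplified sketch of a URL contextualization pipeline: pop a URL
// from the work queue, take the page text (fetched and parsed with Apache
// Tika in production), and classify it by keyword overlap per topic.
public class ContextualizerSketch {

    // Illustrative topic model: topic name -> keyword set.
    static final Map<String, Set<String>> TOPICS = Map.of(
            "sports", Set.of("match", "goal", "team"),
            "finance", Set.of("bank", "loan", "market"));

    // Pick the topic whose keywords appear most often in the page text.
    static String classify(String pageText) {
        String[] words = pageText.toLowerCase().split("\\W+");
        String best = "unknown";
        int bestScore = 0;
        for (Map.Entry<String, Set<String>> topic : TOPICS.entrySet()) {
            int score = 0;
            for (String w : words) {
                if (topic.getValue().contains(w)) score++;
            }
            if (score > bestScore) {
                bestScore = score;
                best = topic.getKey();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Stand-in for the Redis queue of URLs awaiting processing.
        Deque<String> queue = new ArrayDeque<>(List.of("http://example.com/markets"));
        String url = queue.poll();
        String pageText = "The bank raised its loan rates as the market fell";
        System.out.println(url + " -> " + classify(pageText)); // -> finance
    }
}
```

A real classifier would be far richer (stemming, weighting, a larger taxonomy), but the queue-then-classify shape is the part the Redis and Tika bullets above describe.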

Gimon Telecomunicações Ltda.

Position: Developer/Analyst

Time period: 01/2011 to 10/2011.

Duties: Analysis/development of applications with Java (J2ME)/Android, Canvas, SOAP, RMS, regular expressions, Tomcat and JSON. Analyst/developer of the company's application used by Petrobras, Holcim and the Brazilian Ministry of Tourism.

Shift (MPS.BR-certified company, level F)

Position: Developer/Analyst

Time period: 06/2009 to 09/2010.

Duties: Systems analysis/development using the Zen framework (Caché database), documentation using the Enterprise Architect tool (use-case methodology), and system deployment. Configuration analyst: configuration management processes (activities for controlling system versions, related to MPS.BR) using SVN (Subversion). Analyst/developer on the DB2-to-Caché (ObjectScript) migration of a laboratory's operational control system (technical area, automating programs).

Visual Systems

Position: Intern

Time period: from 15/09/2008 to 30/01/2009.

Duties: Development of JEE applications; study of Java 1.4 and 1.5, the JSF, ADF and TopLink frameworks, and certification material. IDEs: JDeveloper 10 and 11.

Additional Experience

Inclua - from 06/2007 to 12/2007 – position: Intern (Websites development);

Control Point - from 27/06/2006 to 05/02/2007 – position: Electronic technician;

Compuway Informática - from 13/08/2004 to 09/06/2006 – position: IT instructor;

Seyprol Eletronic Security - from 14/02/2003 to 01/06/2004 – position: Maintenance/installation technician;

TW Automation Secunet - from 20/02/2001 to 29/07/2002 – position: Operation/maintenance of property automation;

JDA Eletro Metalúrgica Indústrias Ltda – from 06/2000 to 11/2000 – position: Intern.

Education

FIAP – from 06/2011 to 12/2012 – postgraduate MBA in Service-Oriented Software Engineering (SOA).

FATEC – from 06/2006 to 06/2009 – Technologist degree in Business Management Information Technology.

ETE P. G. Netto – from 02/1999 to 06/2000 – Technical degree in Electrotechnics.

Training

Semantix Brazil – 02/2015 – Apache Spark developer.

Semantix Brazil – 02/2014 – Data analyst: Impala/Pig/Hive/Solr.

Semantix Brazil – 10-11/2013 – Hadoop developer (Java MapReduce); Hadoop administrator.

Caelum – 2011 – Mobile development with Google Android.

Global Code – 2011 – Java for web development (JSF, JPA).

Shift Systems Consulting – 2010 – Training in the object-oriented database Caché: SQL, web programming (JavaScript, CSS, HTML, ObjectScript), UML, and MPS.BR level F processes (roles: configuration analyst, analyst/developer).

Caelum – 2008 – Java and object orientation; Java for web development (JSP).

SENAC – 2007 – Web designer technician: Fireworks 8, Flash 8, Dreamweaver 8; Linux Mandriva systems administrator.

ETE Philadelpho Gouveia Netto – 1999 – Manufacturing of printed circuits – CETEISA.

All in São Paulo, Brazil.


