
Data Developer

Location: Birmingham, West Midlands, United Kingdom

Posted: May 26, 2020


Mr Olakunle Kuye

Mobile: 079*-*******

Email: addfi7@r.postjobfree.com

Profile

An IT practitioner with over 17 years of experience, with excellent requirements gathering, documentation, stakeholder management, project planning, strategy implementation and scheduling skills, and a focused, progressive attitude towards the consistent delivery of high-quality products throughout the project lifecycle.

I have a strong background in distributed file systems in the big-data arena; I understand the complex processing needs of big data and have experience developing code and modules to address those needs.

I have degrees in Computing, and I am interested in understanding the business vision while using either business processes or technology to resolve business problems.

Work Experience

NCP

June 19 – April 20 Business Intelligence Developer

Designing, building, testing and maintaining data pipelines to connect operational systems with analytics and BI systems, using Azure Data Factory, Cosmos DB, Azure DevOps and Power BI.

Design, maintain and optimize the Data Warehouse and ELT/ETL pipeline solutions to maximize performance.

Designing and maintaining on-premises transactional and analytical MS SQL Server infrastructure.

Designing and developing data warehouse solutions.

Developed Azure Function Apps in C# that consumed REST APIs, among other functions.

WEJO

July 18 – May 19 Senior Data Engineer

• Engineering the company's data platforms for scale, performance, reliability and security.

• Work with other members of the Data Engineering team to design and build big data streaming capabilities using AWS Data Pipeline, S3, SQS, SNS, EMR and Lambda, as well as leveraging technologies such as Scala, Spark, Pulsar and Kafka.

• Work with product owners and business analysts in analysing business requirements to design and implement data processing pipelines and the associated data and database structures, and fine-tune performance to meet those requirements.

• Review new external data sets and open data sources to understand potential usage.

• Work with Infrastructure and DevOps teams to release and maintain live products.

• Design, implement and test all data processing systems.

• Participate in establishing processes and best practices around development standards, version control, quality control, deployment, maintenance and change management.

Shop Direct Group

Jan 18 – June 18 Scala Developer

• Spark / Scala coding, unit testing, system testing

• Agile Backlog Grooming

• Definition of Acceptance Criteria for QA and Business Analysts

• Writing technical design documentation (high and low level) as required

• Liaising with QA team to ensure that the documentation is fit for purpose

• Working with system team to perform load, performance and destructive testing

• Developing CI/CD pipeline for production and pre-production environments

Using primarily AWS services such as S3, SQS, SNS, EMR and Lambda, leveraging Kafka and Cassandra.
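For illustration only, a minimal sketch of the kind of Spark Structured Streaming job described in this role, reading raw events from Kafka and landing them on S3 as Parquet for downstream batch processing on EMR. The object name, broker, topic, bucket and paths below are placeholders, not details of the actual project.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object ClickstreamIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("clickstream-ingest") // hypothetical job name
      .getOrCreate()

    // Read raw events from a Kafka topic (broker and topic are placeholders).
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .select(col("value").cast("string").as("payload"))

    // Land the stream on S3 as Parquet for downstream batch jobs on EMR.
    events.writeStream
      .format("parquet")
      .option("path", "s3://example-bucket/landing/events/")
      .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
      .start()
      .awaitTermination()
  }
}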

Cap Gemini HMRC EDH Project

Dec 14 – May 17 Architect/Analyst Developer

Designed and developed a web application using Django, which parsed Solr queries based on parameters chosen at runtime.

Used Sqoop and Flume to move data through various landing stages in a Hadoop ecosystem once it had been through a cleansing process using shell scripts, placing Hive tables over the data where required.

Developed MapReduce programs in Java to parse raw data, and used Morphlines to perform ETL operations on the data before indexing it for Solr.

Used Pentaho PDI and Informatica to transform data.

Used Spark and Scala for program development and data analysis, following TDD methods.
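As an illustration of the TDD approach mentioned above (the object, class and data below are hypothetical and not drawn from the HMRC project), a small Spark transformation and its ScalaTest specification might look like this:

import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, lower, trim}
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical transformation: normalise a free-text column before it is loaded into Hive.
object CleanRecords {
  def normalise(df: DataFrame, column: String): DataFrame =
    df.withColumn(column, lower(trim(col(column))))
}

// Specification written first, in TDD fashion, against a local SparkSession.
class CleanRecordsSpec extends AnyFunSuite {
  private val spark = SparkSession.builder().master("local[1]").getOrCreate()
  import spark.implicits._

  test("normalise trims whitespace and lower-cases the target column") {
    val input  = Seq("  TaxPayer ").toDF("name")
    val result = CleanRecords.normalise(input, "name").as[String].collect()
    assert(result.sameElements(Array("taxpayer")))
  }
}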

Used AWS as an environment for proof of concept development and deployments.

Williams Grand Prix Engineering IT

Mar 02 – Dec 14 Lead Application Architect/Developer

Developed Scala programs using TDD methods

Used AWS in the development of proof of concept programs

Shared responsibility for administration of Hadoop.

Created Hive queries that helped with the comparison of car data models and historical metrics.

Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in Hive tables.

Produced high- and low-level designs for numerous multi-threaded desktop, web and mobile applications, utilising frameworks and methods such as TOGAF, UML, OOD and Agile.

Led the development of multiple applications utilising technologies such as C#, C++, J2EE, WPF, Windows Forms, Visual Studio, DevExpress, Dundas and SOA.

Translated business requirements into functional specifications highlighting interactions between system interfaces and application functionality.

Acted as technical design authority, ensuring that systems were developed in accordance with policies and standards whilst promoting reusability of components.

Liaised with stakeholders, including C-level, to elicit, analyse, communicate and validate requirements for changes to business processes, policies and information systems.

Co-ordinated and contributed to RFP, RFQ and vendor selection processes.

Translated business requirements into business and system processes using BPM.

Led various system integration and report authoring projects utilising technologies such as XSD/WSDL, Oracle, Crystal Reports, Cognos, Perl, C# and MS SharePoint.

Managed the logical design and physical implementation of databases and data warehouses using Microsoft and Oracle technologies.

Ensured that functional and non-functional requirements were analysed, captured, prioritised and validated.

Designed and developed the data extraction, transformation and load strategy from disparate sources into a centralised data warehouse using Oracle and Microsoft BI technologies.

Removed bottlenecks in business processes through transformation, and managed the upgrade and consolidation of IT infrastructure.

Managed end-to-end system testing including analysis of specifications, reviewing documentation, internal/client liaison, regression testing and OAT.

Built programs that used large datasets and leveraged the Hadoop platform alongside MongoDB, where MongoDB was used as the real-time data store and Hadoop was used for batch data processing and analysis.

Unisoft Corporation LTD IT

Jan 98 – Mar 02 Analyst Developer

Led the development of multiple web, mobile and desktop applications utilising technologies such as C#, C++, J2EE, WPF, Windows Forms, Visual Studio, DevExpress, Dundas and SOA.

Managed the implementation of application development with a value of circa £5M whilst managing a team of 25 based onshore and offshore.

Co-ordinated and performed Oracle and SQL Server upgrades, migrations and decommissioning.

Created and maintained the information modelling approach, and monitored and enforced compliance with information standards to minimise redundancy and enhance information quality.

Built and configured Unix and Windows servers, ensuring compliance with regulations.

Produced business analysis and design documentation in accordance with policies and standards.

Liaised with internal stakeholders and third-party vendors to develop desktop, web and mobile applications.

Translated business requirements into functional and non-functional requirements whilst ensuring business needs were met.

Produced application roadmaps, development standards and testing strategy.

Defined and implemented architecture improvement efforts across a large team of architects and developers.

Environment

* Linux, Unix, AIX

* Microsoft Windows 2000/XP/2003/2008/2012

* Hadoop - Cloudera and Hortonworks

Application Software

* Java/J2EE 1.1 - 8

* Oracle versions 7, 8, 9, 9iAS, 10g/11g/12c

* Spark

* WPF, WCF, Silverlight

* PRISM, MVVM

* Hive

* Impala

* Kafka

* Scala

* Storm

* HBase

* ASP/ASP.NET (2.0, 3.5, 4)

* JSP

* JDBC

* SQL Server (2008 and 2012)

* PL/SQL

* SharePoint 2003/2007/2010

* Oracle BPM

* SafeNet Sentinel HASP

* XML technologies (XSD, XQuery, etc.)

* SQL*Plus, MySQL, T-SQL

* Perl, VB, Python, JavaScript

* Rational Software Architect

* C, C#, C++

* CORBA

* CUDA

* Cassandra

Development Tools:

Visual Studio (6 – 2015)

IntelliJ

Eclipse

Oracle Developer Suite 10g, 11g and 12c

NetBeans

Power Builder

Crystal Reports

Cognos Reports

DevExpress

Dundas Charts

Qualifications and Education

Cranfield University, Cranfield, UK

Software Engineering for Technical Computing

Oxford University, Oxford, UK

Advanced Diploma in Computing

Greenwich University, Greenwich, UK

B.Tech Business Studies


