Data Manager

Location:
Madison, Wisconsin, United States
Posted:
January 29, 2017


Hepsi
Email: acyjav@r.postjobfree.com

Mobile: +* 608-***-****

Professional Summary:

Currently working as a Data Analyst in Business Intelligence on the Data Management team.

Very good professional experience in all phases of the Software Development Life Cycle (SDLC), including design, implementation, and testing during the development of software applications, along with Hadoop, big data, Oracle Database (11g, 12c), SQL Server 2012, MySQL, Derby, etc.; Oracle database warehouse (DBW), Informatica PowerCenter, and IBM Data Manager for BI; data analysis, data extraction, data transformation, and data loading; and data marts and the IBM Cognos reporting tool to create cubes and reports.

Strong experience writing complex Hive and SQL queries, scheduling jobs using the Windows job scheduler and Oozie, testing and implementing triggers, stored procedures, functions, packages, and PL/SQL, building reports using Tableau, and performing statistical, pattern, and algorithmic analysis using R programming, Excel, and pivot tables.
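
For illustration, a minimal HiveQL sketch of the kind of query described above; all table and column names here are hypothetical, not from an actual engagement:

    -- Hypothetical: count distinct enrolled students per course
    -- for a given load partition, keeping only well-attended courses.
    SELECT c.course_name,
           COUNT(DISTINCT e.student_id) AS enrolled_students
    FROM   enrollments e
    JOIN   courses c ON c.course_id = e.course_id
    WHERE  e.load_date = '2017-01-01'
    GROUP BY c.course_name
    HAVING COUNT(DISTINCT e.student_id) > 10;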

Hands-on coding experience with Java (MapReduce), Python scripts, PHP, HTML, XHTML, XML, and PL/SQL.

Full-stack experience in J2EE programming with different web services (RESTful, RESTless), architectures (MVC), frameworks (Struts and Spring), and Java design patterns (Singleton, Factory, etc.).

Experience in front-end programming, web design, and analysis with JavaScript, JSP, PHP, HTML, and CSS.

Experience in back-end programming with EJB and the JVM; developing, debugging, and unit testing with Eclipse and IBM WebSphere.

E-commerce experience with supply chain and retail websites.

Experience in data mining extremely large data sets; high proficiency in SQL across Oracle, MySQL, MS SQL Server 2012, and DB2 relational database systems, plus JDBC connection and design.

Experience in data migration and upgrades across different databases, the cloud, and Amazon Web Services (AWS).

Sound knowledge of RDBMS, SQL, and NoSQL databases (MongoDB), open-source stores (Apache CouchDB), and storage systems (Hadoop Distributed File System, HDFS).

Very good understanding of object-oriented programming (OOP) concepts and system design with Microsoft Visio and Framework Manager.

Experience in enterprise Java programming with IBM WebSphere.

Very good understanding of and experience with WebLogic and JBoss.

Experience with data warehouses and OLAP data.

Experience in SQL query tuning and instance tuning as part of performance tuning.

Excellent analytical and problem-solving skills; a motivated team player with strong interpersonal skills.

Functional expertise includes telecom, manufacturing, health care, insurance, banking, retail, supply chain, stock market, automotive, and financial accounting.

Experience in UNIX shell scripting.

Good knowledge of and hands-on experience with Hadoop and big data technologies such as Hive, Pig, MapReduce, YARN, Oozie, and Sqoop, and cloud environments such as AWS.

Good experience installing Hadoop in an AWS environment; worked on POCs to extract data from traditional databases into the HDFS environment, with data transformation and data ingestion.

Education:

Database and Data Analysis

University of California Santa Cruz (UCSC)

Master of Philosophy (MPhil) - Data Mining

Madurai Kamaraj University (MK University)

Master of Computer Application (MCA)

Madurai Kamaraj University (MK University)

Bachelor of Science (BSc) - Physics

Manonmaniam Sundaranar University (MS University)

CERTIFICATION:

Oracle Certified Professional (OCP) in Oracle 11g, Advanced PL/SQL Programming - 98%.

Oracle Certified Associate (OCA) in Oracle 11g, Database Administrator I - 93%.

Workshops:

IBM Cognos 11 Analytics Workshop – Nov 2016

Hadoop 1.1 Workshop – May 2014 – single-node setup for word count

Big Data Analysis with Medical Devices (Hadoop 2.0) – Real-time analytics on medical device data with Kudu and StreamSets

The Role of Data Science in Providing Loans to the Underbanked

Apache Edgent and IBM Watson Data Platform Workshop

Courses:

Big data, Hadoop, data modeling, Java, Python, database management and data warehousing, Google Analytics, R programming, Tableau.

PHP, UNIX server, object-oriented programming analysis and design, XML, XHTML.

Technical Skills:

Hadoop Tools & Concepts

Hive, Oozie, Storm, Spark, HBase, Flume, HCatalog, Pig, HDFS, MapReduce, YARN, Hadoop ecosystem, ZooKeeper, Cassandra, Sqoop, Chukwa, Pentaho Kettle, and Talend

Programming & Scripting Languages

Java, Python, PHP, JavaScript, PL/SQL, SQL, HTML, XHTML, XML, Bash shell scripting

Java / J2EE specific skills

J2EE Common Services APIs, web services, JSP, Servlets, Struts-driven websites, JDBC connections, Hibernate, Applets, SOAP, RESTful and RESTless services, JUnit, Eclipse, IBM WebSphere.

Database

Oracle 11g, Oracle 12c, MS Access, MySQL, SQL Server 2012

NoSQL: MongoDB, Couchbase.

Web Server

Apache Tomcat 7.0, Derby

ETL Tools

IBM Data Manager & Informatica PowerCenter

Designing and modeling Tools

OOAD: DIA, MS Visio, StarUML, IBM Framework Manager

Tools & Utilities

Eclipse IDE, Netscape, JDK 1.6, SQL*Plus, SQL & PL/SQL Developer, SQL*Loader, CVS, SVN, JIRA, JAMA, IBM RAD, WebSphere

Domain

Manufacturing, health care, medical insurance, telecom, automotive, finance

Internet Technologies

Oracle Web Toolkit, web services

Analytics, Visualization tools

IBM Cognos, RStudio, R programming, Tableau 8.0, Pentaho, advanced Excel (VBA, macros)

Methodologies

Agile and Waterfall methodologies

Operating system

Windows Vista/XP/2000/7/8, UNIX, Ubuntu

Professional Experience:

Madison Area Technical College (MATC), Madison, WI Oct 16 – Present

Data Analyst

Responsibilities:

Working in IT services with the Data Management team, supporting different customer teams (enrollment, student help, HR, payroll, etc.) to secure data based on FERPA and create different reports based on requirements.

Working on building cubes and reports.

Create business-metric KPIs (Key Performance Indicators) to evaluate factors for different modules; a sketch follows this item.
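
As a minimal SQL illustration of such a KPI; the table and column names are hypothetical, not the actual MATC schema:

    -- Hypothetical KPI: enrolled headcount per term, used to track trends.
    SELECT term_code,
           COUNT(DISTINCT student_id) AS enrolled_headcount
    FROM   enrollment_fact
    GROUP BY term_code
    ORDER BY term_code;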

Create catalogs in Data Manager; fact builds, dimension builds, and reference dimensions; customize, create, and deploy ETL jobs from the transaction data.

Working on database migration, upgrades, and maintenance.

Cleansing, mapping, and transforming data; creating job streams and adding or deleting components in job streams in Data Manager based on requirements.

Use ServiceNow for incident management.

Schedule jobs through the Windows job scheduler.

Migrate data from the Dev environment to PROD.

Run ETL processes on data warehousing and OLAP data.

Create look-up tables for data processing.

Create different reports and cubes using Cognos.

Build pivot tables and various analysis reports using MS Excel.

Use statistical R packages and R programming for factor analysis, quantitative analysis, and k-means clustering.

Use Python scripting and APIs alongside R programming analysis.

Create Tableau reports and dashboards and distribute them in PDF format to the administration team.

Environment: IBM Cognos, Tableau, R programming, Data Manager, Data Mart, SQL Server 2012, MS Excel, ServiceNow.

Modules: Student, HR, Payroll & Blackboard.

Deloitte, Madison, WI Sep 15 – Sep 16

Web Analyst

Responsibilities:

Worked on different modules of the CARES and ACCESS projects with the Architecture & Maintenance Function Area (FA6) teams for the State of Wisconsin Department of Children and Families (DCF) and Department of Workforce Development (DWD).

Involved in system analysis and design methodology as well as object-oriented design and development using OOAD methodology to capture and model business requirements.

Responsible for module development, web services (RESTful, RESTless), frameworks (MVC), Java design patterns (Factory, Singleton, etc.), and JVM memory management and tuning.

New page design, object mapping, and migration to HTML5 tag changes.

Incident Management through JIRA and JAMA

Development and Unit test using IBM WebSphere

SVN version control management for code repository.

Experience with different development and maintenance projects

Full stack custom design for the new implementation.

Mapped back-end data from DB2 and Oracle 11g to the J2EE applications.

CARES and ACCESS module integration and implementation based on law updates.

Wrote packages, functions, procedures, and triggers in Oracle 11g SQL and PL/SQL.

Job scheduling, migration, and updates.

Efficient J2EE stack design, analysis, and implementation for different applications.

Wrote XML scripts for different purposes.

Environment: Java EE, JSP, Servlets, JSF, Spring DI/IoC, Hibernate, XML, HTML, JS, CSS, DB2, web services, Rational Software Architect, WebSphere Application Server, UNIX, JUnit, Log4j, SVN, Linux/Windows, Oracle 11g.

Explorys Mar 15 – Aug 15

Big data Analyst

Responsibilities:

Worked as a Hadoop developer/administrator on the big data stack design and architecture research team to build a strong data repository by developing, installing, and configuring Hadoop ecosystem components that moved data from individual servers to HDFS.

Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the Enterprise Data Warehouse (EDW).

Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics; see the sketch after this item.
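
A minimal HiveQL sketch of this kind of comparison, with hypothetical table and column names:

    -- Hypothetical trend check: flag products whose fresh weekly volume
    -- runs well above the historical average held in the EDW.
    SELECT f.product_id,
           f.week_volume,
           h.avg_week_volume
    FROM   fresh_weekly_sales f
    JOIN   edw_weekly_history h ON h.product_id = f.product_id
    WHERE  f.week_volume > 1.5 * h.avg_week_volume;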

Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.

Provided design recommendations to improve review processes and resolved technical problems.

Managed and reviewed Hadoop log files.

Tested raw data and executed performance scripts

Shared responsibility for administration of Hadoop, Hive, and Pig.

Installed and configured MapReduce, Hive, and HDFS.

Implemented CDH3 Hadoop cluster on CentOS. Assisted with performance tuning and monitoring.

Created HBase tables to load large sets of structured, semi-structured and unstructured data coming from UNIX, NoSQL and a variety of portfolios.

Created reports for the BI team, using Sqoop to move data into HDFS and Hive.

Developed multiple MapReduce jobs for data cleansing and preprocessing.

Assisted with data capacity planning and node forecasting.

Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.

Administration of Pig, Hive, and HBase: installing updates, patches, and upgrades.

Installed, designed, and built Hortonworks and MapR clusters and compared them for performance and research purposes.

Environment: Data mining, data analysis, and data profiling using Hadoop, Amazon Web Services (AWS), Hive, MapReduce, Pig scripts, Oozie, SQL, HDFS, Python, Java, CDH3, GitHub, Excel, Hortonworks, MapR, Apache Falcon, Sqoop, and Google Analytics.

TLS IT - Stone Bridge, System Technology Sep 12 - Feb 14

Database Administrator

Worked as a PL/SQL developer / database administrator (DBA) on the TLS IT Solution network team for different applications.

Responsibilities:

Coordinated with the front end design team to provide them with the necessary stored procedures and packages and the necessary insight into the data.

Worked with SQL*Loader to load day-to-day data from flat files obtained from various facilities.

Created and modified UNIX shell scripts to load the cleansed data into the base tables.

Developed PL/SQL triggers, stored procedures, functions, and packages for moving the data from the staging area to the data mart; a sketch follows this item.
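
A minimal PL/SQL sketch of a staging-to-mart load; all object and column names are hypothetical:

    -- Hypothetical procedure: merge cleansed staging rows into a mart table.
    CREATE OR REPLACE PROCEDURE load_sales_mart IS
    BEGIN
      MERGE INTO mart_sales m
      USING stg_sales s
      ON (m.sale_id = s.sale_id)
      WHEN MATCHED THEN
        UPDATE SET m.amount = s.amount, m.sale_date = s.sale_date
      WHEN NOT MATCHED THEN
        INSERT (sale_id, amount, sale_date)
        VALUES (s.sale_id, s.amount, s.sale_date);
      COMMIT;
    END load_sales_mart;
    /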

Created scripts to create new tables, views, sequences, and queries for new enhancements to the application using TOAD.

Performed SQL and PL/SQL tuning using various tools (EXPLAIN PLAN, SQL*TRACE, TKPROF, AUTOTRACE, etc.).

Used bulk collections for better performance and easier data retrieval by reducing context switching between the SQL and PL/SQL engines, as in the sketch after this item.
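
A minimal sketch of the bulk-binding pattern with hypothetical names: BULK COLLECT fetches the batch in one switch to the SQL engine, and FORALL applies the DML in one more:

    DECLARE
      TYPE t_order_ids IS TABLE OF stg_orders.order_id%TYPE;
      l_ids t_order_ids;
    BEGIN
      -- Single context switch to fetch the whole batch into a collection.
      SELECT order_id BULK COLLECT INTO l_ids
      FROM   stg_orders
      WHERE  status = 'CLEANSED';

      -- Single context switch to apply all the updates.
      FORALL i IN 1 .. l_ids.COUNT
        UPDATE mart_orders
        SET    processed_flag = 'Y'
        WHERE  order_id = l_ids(i);
      COMMIT;
    END;
    /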

Extensively involved in using hints to direct the optimizer to choose an optimum query execution plan; an example follows this item.
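
For example, an index hint like the following (hypothetical table and index names):

    -- Hypothetical: steer the optimizer to the facility/date index
    -- instead of a full table scan.
    SELECT /*+ INDEX(o idx_orders_facility_date) */
           o.order_id, o.amount
    FROM   orders o
    WHERE  o.facility_id = :fac_id
      AND  o.order_date  > SYSDATE - 7;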

Used PRAGMA AUTONOMOUS_TRANSACTION to avoid the mutating-table problem in database triggers, as in the sketch after this item.
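
A minimal sketch of the pattern with a hypothetical audit table: the autonomous block runs as an independent transaction that sees only committed data (which is how it sidesteps the mutating-table error) and must commit its own work:

    -- Hypothetical audit trigger: the autonomous transaction lets the
    -- trigger commit its own insert independently of the parent statement.
    CREATE OR REPLACE TRIGGER trg_orders_audit
      AFTER UPDATE OF amount ON orders
      FOR EACH ROW
    DECLARE
      PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
      INSERT INTO orders_audit (order_id, old_amount, new_amount, changed_at)
      VALUES (:OLD.order_id, :OLD.amount, :NEW.amount, SYSDATE);
      COMMIT;  -- an autonomous transaction must end with COMMIT or ROLLBACK
    END;
    /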

Handled errors using exception handling and used advanced features and concepts (dynamic SQL, collections, bulk binding).

Partitioned the fact tables and created materialized views to enhance performance; a sketch follows this item.
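
A minimal DDL sketch with hypothetical names: a fact table range-partitioned by date, plus a summary materialized view:

    -- Hypothetical fact table, range-partitioned by sale date.
    CREATE TABLE sales_fact (
      sale_id   NUMBER,
      sale_date DATE,
      amount    NUMBER
    )
    PARTITION BY RANGE (sale_date) (
      PARTITION p2013h1 VALUES LESS THAN (DATE '2013-07-01'),
      PARTITION p2013h2 VALUES LESS THAN (DATE '2014-01-01'),
      PARTITION pmax    VALUES LESS THAN (MAXVALUE)
    );

    -- Hypothetical monthly summary, refreshed on demand.
    CREATE MATERIALIZED VIEW mv_monthly_sales
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT TRUNC(sale_date, 'MM') AS sale_month,
             SUM(amount)            AS total_amount
      FROM   sales_fact
      GROUP BY TRUNC(sale_date, 'MM');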

Involved in Logical & Physical database layout design.

Environment: Java, SQL*Loader, Data Mart, TOAD, SQL Developer, Oracle, UNIX.

Client: Mepline Technical Services Nov 05 – Aug 12

Software Developer

The Intelligence Information System is deployed for use by MEP quantity surveyors to prepare estimation sheets. It covers AutoCAD drawing measurements, area calculations, commission percentages, unit and point calculations, wire and pipe lengths, wire and pipe types and costs, manpower, and the time limit to finish the work. All calculations are done internally: on the form, quantity engineers select their options from pull-down menus, enter the needed details in text boxes, and get the report automatically, with a preview available before printing. This makes it easy to produce the estimation sheet and helps them finish their work without risk.

Responsibilities:

Tracked transactions and created analytics reports using Google Analytics in the web-based system.

Involved in full life cycle of Intelligence Information System application development.

Built the model of the IIS system modules (HLD & LLD).

Translated the business narrative into a graphical representation (flow diagram) of business information needs and rules.

Confirmed and refined the model with the analysts and experts.

Stored the AutoCAD diagrams and their storage locations in the database.

Created and added new fields to the existing database.

Participated in meetings with development team & support team.

Installed and configured environment for both development and production.

Created packages for easy handling.

Handling Deployment Activity.

Maintained Release Plan.

Prepared performance analysis reports for the Oracle databases in both development and production.

Environment: Oracle PL/SQL, J2EE, Hibernate, Google Analytics, Windows.

Client: The Commerce Benefit Group Mar 04 – Oct 05

Role: Java Developer / Data Analyst

Project Description:

The Medical Accounting Administration system supports the business in handling various accounting activities, such as different ledgers and balance sheet preparation. It handles customers' medical claim forms and their accounting summaries and detailed expenses. Valid expenses and their claim amounts are passed to the approval section and retained for security.

Responsibilities:

Requirement gathering, analysis and customer interaction.

Created Tableau reports and dashboards and distributed them in PDF format.

Designed the database objects.

Module development; involved in creating views for the Ford of Mexico region.

Wrote inline views with join conditions based on the requirements and conditions; a sketch follows this item.
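
A minimal SQL sketch of an inline view with a join condition, using hypothetical names:

    -- Hypothetical: join each claim to a per-member expense total
    -- computed in an inline view.
    SELECT c.claim_id,
           c.member_id,
           t.total_expense
    FROM   claims c
    JOIN  (SELECT member_id,
                  SUM(expense_amount) AS total_expense
           FROM   claim_expenses
           GROUP BY member_id) t
      ON  t.member_id = c.member_id
    WHERE  c.status = 'APPROVED';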

Packages, Procedures, Triggers, functions developed for this application.

Prepared Application Design Document.

Involved in data modeling; constraints and relationships decided based on the business logic.

Enhancements.

Integrated data warehouse (DW) data from different sources in different formats (PDF, TIFF, JPEG, web crawl, and RDBMS data from MySQL, Oracle, SQL Server, etc.).

Prepared small-change documents.

Client interaction: interacted with the business unit, assigned schedules, and organized meetings.

Prepared unit test cases.

Uploaded the developed database code (views and database objects) to the database under the VIP schema in the DEV environment and migrated it to the QA and Prod environments.

Granted privileges and created synonyms for these objects.

Compiled and executed packages, stand-alone procedures, functions, and triggers using the DBCR-ADS tool.

Environment: Java 1.6, Hibernate web services framework, Oracle PL/SQL, SQL Developer, Tableau 8.2, Oracle 11g, Access, NoSQL connector, MongoDB, Linux (PuTTY).

Projects:

E-commerce online retail business

Responsibilities:

Developed the website design.

Dynamic page creation and connection with the database.

Designed the database objects and tables.

Secured data and wrote code to avoid SQL injection.

Module development for payment processing, signup, login, and adding or removing items from the cart.

Environment: PHP, HTML, CSS, JavaScript, MySQL

Web design

Responsibilities:

Designed and developed a website for a small business.

Created ten pages and connected them with the database.

Designed the database objects and tables.

Created different forms and secure password handling.

Module development for sending and receiving email, and for footer, header, and slider views.

Environment: XHTML, JavaScript, MySQL

Trade market analysis (Big data Analysis)

Responsibilities:

Designed and developed the big data stack.

Analyzed the financial trade market data.

Used regression methods to build the model.

Created graphs and charts for visualization.

Cleansing and mapping data.

Environment: R programming, statistical analysis

Big data Analysis on Banking

Responsibilities:

Credit card analysis using MapR.

Stack design and development.

Virtual machine setup and system build.

Distributed system allocation.

Cleansing and mapping data.

Environment: MapR, Hadoop

Marketing analysis (Big data Analysis)

Responsibilities:

Cleansing, mapping, and grouping the data.

Analyzed the business market data.

Designed the dashboard and KPIs.

Generated PDF files to share with the marketing team.

Environment: Tableau, Excel, Microsoft SQL Server.


