Resume

Data Analyst

Location:
Clifton, New Jersey, United States
Salary:
$45/hr
Posted:
September 30, 2019

Resume:

Location: Boston, MA Mobile: 617-***-**** Email: adahi2@r.postjobfree.com

Akash Patel

Data Analyst/Business Analyst

Summary

3+ years of IT experience in data/business analysis, data modeling, and project management, with good experience in Big Data and related Hadoop technologies.

Solid understanding of all phases of the Software Development Life Cycle (SDLC) methodology (requirements, analysis, design, data modeling, business process modeling, implementation, and deployment). Excellent understanding of Agile and Scrum development methodologies.

Experience in gathering user requirements, preparing and analysing Business Requirement Documents (BRD) and Functional Requirement Documents (FRD), and proposing changes per internal and external requirements gathered for process improvement.

Identified stakeholders using a power/interest grid and elicited requirements through personal interviews, requirements workshops, JAD sessions, questionnaires, document analysis, focus groups, surveys, and brainstorming for software development.

Fluency in Python with working knowledge of Machine Learning & Statistical libraries (e.g. Scikit-learn, Pandas).
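
As an illustrative sketch only (the data and model choice below are made up, not taken from the resume), this is the kind of scikit-learn/pandas workflow the bullet above refers to:

```python
# Illustrative sketch: fitting a simple classifier on a pandas DataFrame
# with scikit-learn. All data here is hypothetical toy data.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.DataFrame({
    "hours": [1, 2, 3, 4, 5, 6],
    "visits": [0, 1, 1, 2, 3, 3],
    "converted": [0, 0, 0, 1, 1, 1],  # binary target
})

model = LogisticRegression()
model.fit(df[["hours", "visits"]], df["converted"])

# Predict on the training frame just to show the round trip.
preds = model.predict(df[["hours", "visits"]])
print(preds.tolist())
```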

Good knowledge of SQL, including writing complex stored procedures, user-defined functions, views, cursors, and triggers.

Experienced in writing PL/SQL and ETL routines for data warehousing applications; involved in database design and data modeling.

Involved in the entire data science project life cycle, including data cleaning, data extraction, and data visualization on large sets of structured and unstructured data; created ER diagrams and schemas.

Experience working with Tableau Desktop, Tableau Server, and Tableau Reader across various versions of Tableau, as well as related BI technologies such as Teradata.

Ability to write and optimize diverse SQL queries, working knowledge of RDBMS like SQL Server, Oracle, MySQL and PostgreSQL.
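
A minimal sketch of the query-writing skill named above. The RDBMS, table, and data are all stand-ins (sqlite3 is used instead of SQL Server/Oracle/MySQL/PostgreSQL only so the example is self-contained and runnable):

```python
# Illustrative aggregate query with an index on the grouping column,
# run against an in-memory SQLite database (hypothetical data).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("East", 100.0), ("East", 250.0), ("West", 75.0)],
)
# An index on the grouping column helps on larger tables.
conn.execute("CREATE INDEX idx_sales_region ON sales (region)")

rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # → [('East', 350.0), ('West', 75.0)]
```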

Experience in deploying Hadoop clusters on public and private cloud environments such as Amazon AWS, Rackspace, and OpenStack.

Experienced in creating filters, quick filters, data source filters, global filters, context filters, user filters, and actions, and in creating dashboards for key performance indicators (KPIs).

Designed and implemented large-scale, business-critical systems using object-oriented design and programming concepts in Python and Django.

Supervised and streamlined the daily delegation of incoming issues using Jira.

Capable of forming and maintaining positive and productive working relationships in internal/external and team environments with excellent communication skills, strong team-building and conflict management skills.

Education

Master’s in Data Analytics, Northeastern University, Boston, MA

Experience

AmerisourceBergen, MA Jan 2019 – Current

Role: Data Analyst/Business Analyst

Responsibilities:

Generated PL/SQL scripts for data manipulation, validation and materialized views for remote instances.

Performed the requirement analysis, impact analysis and documented the requirements using Rational Requisite Pro.

Participated in the Sprint Planning meetings and Vetting sessions to understand requirements and acceptance criteria.

Participated in creating the User Interface Mock-ups for the monitoring system using MS Visio.

Produced and maintained Functional Requirements Specifications (FRD/FRS) and Requirement Traceability Matrix (RTM) for various Releases of the project.

Performed a cost-benefit analysis to indicate the benefits from the ongoing consolidation in the future.

Built multifunction readmission reports using Python pandas and the Django framework.

Independently organized the project, analysed the data, and developed detailed reports, dashboards, and scorecards in Tableau.

Built Tableau visualizations against various Data Sources like Flat Files (Excel, Access) and Relational Databases (Teradata, Oracle, SQL Server).

Developed Python modules to extract data from a MySQL database.
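
A hedged sketch of the extraction-module pattern this bullet describes. The resume names MySQL; the same DB-API pattern applies with a MySQL connector, but sqlite3 is used here (with hypothetical table and data) so the sketch runs standalone:

```python
# Generic DB-API extraction helper (illustrative; sqlite3 stands in for MySQL).
import sqlite3

def fetch_rows(conn, query, params=()):
    """Run a parameterized query and return all rows as a list of tuples."""
    cur = conn.cursor()
    cur.execute(query, params)
    try:
        return cur.fetchall()
    finally:
        cur.close()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Ada')")
result = fetch_rows(conn, "SELECT name FROM users WHERE id = ?", (1,))
print(result)  # → [('Ada',)]
```

Parameterized queries (the `?` placeholder) keep the helper safe against SQL injection regardless of which backend is plugged in.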

Evaluated data profiling, cleansing, integration and extraction tools (e.g. Informatica).

Assisted Business users and recommended features, charts, KPI’s for their business analysis in Tableau.

Maintained data dictionary by revising and entering definitions.

Designed and developed UNIX shell scripts as part of the ETL process to automate data loading and extraction.

Performed data analysis using Python pandas. Wrote scripts in R to implement predictive modelling algorithms.
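
As a small illustration of the pandas analysis mentioned above (the department names and readmission flags below are invented, not from the resume):

```python
# Illustrative groupby aggregation of the sort used in readmission reporting.
import pandas as pd

df = pd.DataFrame({
    "dept": ["A", "A", "B", "B"],
    "readmitted": [1, 0, 1, 1],  # hypothetical binary flags
})

# Readmission rate per department.
rate = df.groupby("dept")["readmitted"].mean()
print(rate.to_dict())  # → {'A': 0.5, 'B': 1.0}
```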

Created user stories and maintained and tracked them using JIRA.

Vivma, India Jan 2015 – June 2017

Role: Data Analyst

Responsibilities:

Involved in every phase of the SDLC process starting from Inception to Transition.

Analysed Sales data mart model to identify the tables and their relations in order to extract required data.

Wrote SQL queries to extract data from the Sales data marts as per requirements.

Created source-to-target data mappings for the individual reports.

Created pivot tables in Excel to analyse data across several dimensions.
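
The same multi-dimensional pivot described for Excel can be mirrored in pandas. The regions, quarters, and amounts below are illustrative only:

```python
# pandas equivalent of an Excel pivot table (hypothetical data).
import pandas as pd

sales = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "amount": [100, 150, 80, 120],
})

# Rows = region, columns = quarter, values = summed amount.
pivot = pd.pivot_table(sales, values="amount", index="region",
                       columns="quarter", aggfunc="sum")
print(pivot.loc["East", "Q1"])  # → 100
```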

Created mock-ups of dashboards and individual data visualizations in MS Excel.

Developed Tableau data visualization using Scatter Plots, Geographic Map, Pie Charts and Bar Charts and Density Chart.

Prepared test cases to validate the data and the look and feel of the reports.

Documented all data mapping and transformation processes in the functional design documents based on the business requirements.

Performed data analysis, data migration, data cleansing, transformation, integration, data import, and data export using multiple ETL tools such as Ab Initio and Informatica PowerCenter.

Experience in testing and writing SQL and PL/SQL statements: stored procedures, functions, triggers, and packages.

Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.

Created and executed test cases for ETL jobs to upload master data to repository.

Skills

SDLC, Agile, Scrum, Python, R, Hadoop, Spark, Crunch, Pig, Avro, Flume, SQL, SQL Server, Oracle, MySQL, PostgreSQL, Django, Informatica, Power BI, MS Office, Rational Requisite Pro, MS Visio, Jira, Scikit-learn, Pandas, AWS, Windows, Linux


