
Data Analyst

Location: Centerton, AR
Posted: July 28, 2020


SANJANA KRITHI  Phone: 906-***-****  Email: adew7n@r.postjobfree.com

Summary:

• Expertise in big data analysis and development using the Hadoop framework, SQL, and related Big Data technologies.

• Experience in interpreting and analyzing data to drive successful business solutions.

• Excellent understanding of business operations and analytics tools for effective analysis of data.

• Experience in developing ETL applications on large volumes of data using tools such as MapReduce, Spark (Scala), PySpark, and Spark SQL.

• An individual with a positive attitude, good interpersonal skills, and a can-do approach.

Technical Skills:

Programming: Python, R, MATLAB, Java, Scala
Big Data: Hadoop, HDFS, Hive, Spark, Apache Kafka
Operating Systems: Linux, Ubuntu, Windows
Visualization Tools: Tableau, Power BI, Excel
Databases: Oracle SQL, Teradata, MongoDB, Cassandra
Infrastructure: AWS, Azure (Databricks)
Python packages: NumPy, Pandas, scikit-learn, Matplotlib, SciPy
Deep learning: Keras, TensorFlow, CNTK, NLP

Education:

Master of Science, Data Science GPA: 3.5/4

Michigan Technological University - Houghton, MI

Coursework: Artificial Intelligence, Deep Learning, Machine Learning, Data Mining, Data Visualization, Predictive Analysis, Business Analysis, Time Series Forecasting, Statistical Analysis and Modeling, Hadoop, Cognitive Science

Bachelor of Technology, Electronics and Communication Engineering GPA: 3.25/4
Jawaharlal Nehru Technological University - Hyderabad, India

Work Experience:

Walmart, Data Engineer June’19-Present

• Analyzing large data sets to determine the optimal way to aggregate and report on them.

• Working as a big data developer, dealing with Data Lake, Hadoop, Hive, Teradata, the Automic automation tool, and JIRA.

• Experience in developing, testing, and deploying code in YAML.

• Worked on developing scripts to analyze and collect statistics for tables.

• Delivered on Tableau, Google Cloud Platform, and the Global Data Portal to maintain data integrity and quality.

• Worked on developing workflows to ingest data from Hive tables into GCP buckets (a minimal sketch of this kind of pipeline appears after these bullets).

• Coordinated with business partners to troubleshoot and resolve data discrepancies and provide solutions.

• Used Hive SQL and Spark SQL for ETL jobs, choosing the right technology to get the job done.

• Worked on publishing interactive data visualization dashboards, reports, and workbooks on Tableau.
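
A minimal PySpark sketch of the kind of Hive-to-GCS ingestion described above; the table name, bucket path, and aggregation are illustrative assumptions, not the actual Walmart pipeline.

    # Illustrative sketch only: table and bucket names are hypothetical.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive_to_gcs_ingest")
             .enableHiveSupport()          # allows reading managed Hive tables
             .getOrCreate())

    # Pull and aggregate the source data with Spark SQL (table name is an assumption).
    daily_sales = spark.sql("""
        SELECT store_id, item_id, sale_date, SUM(amount) AS total_amount
        FROM   warehouse.daily_sales
        GROUP  BY store_id, item_id, sale_date
    """)

    # Write the aggregate to a GCS bucket as partitioned Parquet
    # (requires the GCS Hadoop connector on the cluster classpath).
    (daily_sales.write
        .mode("overwrite")
        .partitionBy("sale_date")
        .parquet("gs://example-bucket/ingest/daily_sales/"))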

Data Analyst, Superior National Bank, Houghton Jan’19-May’19

• Performed data analysis of customer data to highlight current drawbacks, analyze problems within the company, and provide a business solution for the bank's customer-service issues. This analysis helped the bank understand what resources it needed to hire to improve on existing problems.

• Worked on writing SQL queries for DDL. Performed data analysis including data cleaning, exploratory data analysis, and statistical modeling to help the bank improve its customer service (a small pandas sketch of these steps follows this list).

• Experience in interacting with stakeholders and customers, gathering requirements through interviews, and presenting the findings to the company's VP.
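
A small, hypothetical pandas sketch of the cleaning and exploratory steps described above; the columns and values are invented stand-ins for the bank's customer data.

    import numpy as np
    import pandas as pd

    # Hypothetical customer-service data; the real data came from the bank's systems.
    customers = pd.DataFrame({
        "branch": ["Houghton", "Houghton", "Hancock", "Hancock"],
        "wait_minutes": [12.0, np.nan, 25.0, 8.0],
        "satisfaction_score": [4, 3, 2, 5],
    })

    # Basic cleaning: drop exact duplicates, fill missing wait times with the median.
    customers = customers.drop_duplicates()
    customers["wait_minutes"] = customers["wait_minutes"].fillna(
        customers["wait_minutes"].median())

    # Exploratory summaries used to surface service problems by branch.
    print(customers.describe())
    print(customers.groupby("branch")["satisfaction_score"].mean().sort_values())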

Teaching assistant, Michigan Technological University, Houghton Aug’18-May’19

• Responsible for conducting labs, communicating with multiple students, maintaining grade reports and student records, and handling other administrative work related to the position.

Projects:

Self-driving cars using Deep Q-Learning (using Python/PyTorch) Aug’18-Jan’19

• Worked on building an environment containing the map for the cars to drive on, and programmed an AI agent with deep Q-learning models in PyTorch. The agent was trained with reinforcement learning.

• The cars learn about the map from their on-board sensors and run iteratively to understand the environment, choosing the optimal path using the trained agent.
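
A minimal PyTorch sketch of a deep Q-network and its update step, assuming a 5-signal sensor state and 3 driving actions; these sizes are assumptions, not the project's exact configuration.

    import torch
    import torch.nn as nn
    import torch.optim as optim

    class DQN(nn.Module):
        """Small Q-network: sensor readings in, one Q-value per action out."""
        def __init__(self, n_inputs=5, n_actions=3):   # sizes are assumptions
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_inputs, 64), nn.ReLU(),
                nn.Linear(64, n_actions))

        def forward(self, state):
            return self.net(state)

    model = DQN()
    optimizer = optim.Adam(model.parameters(), lr=1e-3)
    gamma = 0.9   # discount factor

    def learn(state, action, reward, next_state):
        """One temporal-difference update on a single transition."""
        q_value = model(state)[action]
        with torch.no_grad():
            target = reward + gamma * model(next_state).max()
        loss = nn.functional.smooth_l1_loss(q_value, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()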

Image classification using TensorFlow (using TensorFlow, Keras, Python) March’18-July’18

• Worked on classifying a data set of clothing images using a neural network. The images were first flattened into an array of vectors and fed to a network with two hidden layers.

• The network is a 3-layer feed-forward neural network; the hidden layers use the ReLU activation function, and a softmax function on the output layer produces 10 outputs (one for each class).
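
A short Keras sketch of the flatten-plus-dense classifier described above, assuming the Fashion-MNIST clothing data set that ships with Keras; the layer widths and epoch count are illustrative.

    import tensorflow as tf

    # Assumed data set: Fashion-MNIST (28x28 grayscale clothing images, 10 classes).
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),   # flatten each image to a vector
        tf.keras.layers.Dense(128, activation="relu"),   # hidden layer 1
        tf.keras.layers.Dense(64, activation="relu"),    # hidden layer 2
        tf.keras.layers.Dense(10, activation="softmax")  # one output per clothing class
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))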

Predictive analytics for fraud detection (using SQL, Python) Nov’17-Feb’18

• Performed data cleaning and exploratory data analysis to understand the underlying relationships between predictors in the data set. Performed data visualization and statistical modeling for the analysis.

• Created various linear and non-linear predictive models such as logistic regression, neural networks, ridge regression, lasso regression, and decision trees, and compared them to find the best-fitting model based on accuracy, precision, and other metrics.
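
A brief scikit-learn sketch of the model comparison described above; a synthetic data set stands in for the cleaned fraud data, and the model list and metrics mirror the bullet.

    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression, RidgeClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score, precision_score

    # Synthetic stand-in for the cleaned fraud data (real predictors are not shown here).
    X, y = make_classification(n_samples=2000, n_features=10,
                               weights=[0.9, 0.1], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    models = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "ridge_classifier": RidgeClassifier(),
        "decision_tree": DecisionTreeClassifier(max_depth=5),
    }

    # Fit each model and compare on the held-out split.
    for name, clf in models.items():
        clf.fit(X_train, y_train)
        preds = clf.predict(X_test)
        print(name,
              "accuracy:", round(accuracy_score(y_test, preds), 3),
              "precision:", round(precision_score(y_test, preds), 3))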


