

Data Python

Location:
Tampa, FL
Posted:
March 21, 2020

Contact this candidate

Resume:

Malathy Munusamy

***** ****** ***** *****, ***** - 33613
813-452-7258 | adceci@r.postjobfree.com | https://www.linkedin.com/in/malathy90/

Professional Summary

• AWS Certified Developer and Python programmer with over 7 years of experience and expertise in analysis, design, implementation and management of data in the healthcare and insurance industry.
• Versatile in SQL and PL/SQL. Procured in-depth domain knowledge working for the most reputed clients in the US healthcare industry, with proven ability to form, manage and lead teams efficiently and a record of delivering tasks within stringent deadlines.

University of South Florida – Business Analytics and Information Systems, M.S (GPA-3.7), Expected May ****
ICFAI University – Human Resources, Master of Business Administration (GPA-4), May ****
Anna University – Computer Science and Engineering, B.E (GPA-9.32), May 201*

Technical Skills

Languages: Python, R, SQL, C#, Shell script
Skills: Data Visualization, Big data (Map Reduce, PySpark, Hive, Impala, Hadoop)
Databases: Cassandra, Oracle, Sybase, MySQL, NOSQL, MS SQL server, Hive
Tools: Tableau, PowerBI, Stata, Weka, AzureML, SAS Miner, Cloudera, MS Office, MS Visio, Databricks, Argo, UML
Cloud Platforms: Amazon Web Services

Certifications
AWS Certified Developer – Validation number: 0S1CJ67C2FQQ1LSB, Nov. 2018 – Nov. 2020

Professional Experience

University of South Florida – Data Science Intern (Project: Bank of America)  Tampa, FL  Feb. 2020 - Present
• Analyzed twitter events, performed NLP on tweets to find patterns, and predicted volume jumps on different time intervals.
• Designed dashboards for weekly presentations through Tableau. Performed web scraping using Python to gather tweets.
• Monitored social media events to establish statistical metrics for volume and drivers that helped in decision-making.
• Executed a real time event score using decision trees to predict rapid volume jumps.

Graduate Teaching Assistant - Cyber Security  Tampa, FL  Dec. 2018 – Dec. 2019

• Assisted in the redesign of the Information Security & IT Risk Management course for graduate students, covering Shell scripting, cryptography, Incident Handling and Risk assessment.
• Setup a fully functional virtual Linux CentOS 6 environment and was responsible for assisting students with queries on Linux.

NTT data - Business Systems Analysis Specialist  Bangalore, India  Jul. 2016 – Jun. 2018

• Served as the liaison between the technology team and business stakeholders to create efficiencies through a creative approach.
• Obtained healthcare data in the form of raw flat files, cleaned and preprocessed it using SQL and excel. Configured the deployment of the SQL pipeline to production using MS Access.
• Prepared visualization of the processed data using Tableau to infer the health insurance claims trend.
• Operational impact: Improved the first pass rate of insurance claims by close to 40%, with an average impact in the magnitude of $2.7 million.
• Business impact: Root cause analysis leading to closure of 60 issues per quarter, with savings realized of $50,000 per issue.

Cognizant - Product Specialist  Chennai, India  Jun. 2015 – Jul. 2016

• Created and led a team of 10 people to develop and maintain a US healthcare configuration project that comprised 5 million users.
• Gathered and pre-processed health insurance data using Python. Performed an exploratory analysis of the data using Tableau to find patterns of authorization failures.
• Applied feature selection and predicted whether an incoming claim will fail authorization, using Logistic regression and Decision tree models. Adjusted hyperparameters to achieve a better accuracy of 78%.
• Led a rigorous data security initiative and identified system-based loopholes, leading up to savings of more than $30,000 for the client.
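A minimal sketch of the kind of claim-authorization classifier described above: Logistic regression compared against a Decision tree whose hyperparameters are tuned by grid search. The feature names and data here are invented for illustration; this is not the actual project code or data.

```python
# Hypothetical claim-authorization failure classifier (assumed features).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
n = 2000
# Invented claim features: billed amount, patient age, prior coding errors.
X = np.column_stack([
    rng.lognormal(3, 1, n),    # claim amount
    rng.integers(18, 90, n),   # patient age
    rng.poisson(0.5, n),       # prior coding errors on the claim
])
# Synthetic label: claims with more errors and larger amounts fail more often.
y = (0.8 * X[:, 2] + 0.001 * X[:, 0] + rng.normal(0, 0.5, n) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Baseline linear model.
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

# Tune the tree's depth and leaf size, as the bullet above describes.
grid = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    {"max_depth": [3, 5, 7], "min_samples_leaf": [5, 20]},
    cv=5,
).fit(X_tr, y_tr)

print(f"logistic accuracy: {logit.score(X_te, y_te):.2f}")
print(f"tuned tree accuracy: {grid.score(X_te, y_te):.2f}")
```

On real claims data the same pattern applies; only the feature engineering and the tuning grid change.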

• Re-designed the stored procedures to reduce variation and improve batch performance by 60% through Indexing and implementing efficient queries using joins, thus increasing the efficiency.
• Was accountable for designing and implementing automation frameworks for performance tuning, using Sybase stored procedures and ad-hoc queries.

Cognizant - Junior Product Specialist  Coimbatore, India  Aug. 2011 – May 2015

• Worked with cross-functional teams and maintained daily communication with customers.
• Involved in the performance tuning of the Oracle queries to improve the system efficiency.
• Gathered data in the form of flat files and loaded it to the system using Sybase and SQLDeveloper.
• Prepared Technical and Functional documents, the Detailed Design Document and the Test Case document for the Membership and Eligibility module. Completed the platform migration from Sybase to Oracle over a period of 6 months, which eventually led to a 74% increase in efficiency.

Academic Projects

Occupancy detection – Big data Machine Learning
• Built a model to detect occupancy of a room with CO2, temperature, humidity & light sensors to help conserve energy when in idle state, using the PySpark Machine Learning Library.
• Applied Logistic regression and Decision trees to determine occupancy.
• Achieved an accuracy of 90% in both models.

Airplane crash analysis - Tableau
• Preprocessed the airplane crash dataset for missing values using Excel.
• Visualized the states that had the most airplane crashes in the US and inferred the causes, using Tableau.

Gun Shot Classification - Predictive analytics, Sound classification
• Condensed the dataset by merging several features. Used Mel Frequency Cepstral Coefficients (MFCC) for feature transformation.
• Created models using Decision trees, SVM, Random Forest and Voting classifiers to predict the gunshot sound, using Python.
• Collected firecracker sounds similar to gunshots manually, for testing purposes.
• Tuned the model by adjusting estimators and tree depths, and achieved accuracy of 85% on the test dataset.

Churn Prediction modelling - Predictive analytics
• Preprocessed the churn data for a telecom company. As the data is skewed, randomly sampled the train dataset into 1:n samples through the bagging technique to find the best model, using Python.
• Created models using Logistic regression, Decision trees and Random Forest to predict the churners.
• Calculated a confusion matrix and a cost matrix by assigning cost to various scenarios, to advise telecom companies on retaining churners.

Airbnb data analysis – Statistical Data Mining

• Analyzed a dataset with 90 attributes to test hypotheses based on how multiple factors affect the occupancy rate.
• Modelled the high variance data with Quasi-Poisson using R and inferred insights that will help retailers gain the best returns.

Professional Accolades

• Acknowledged with the "Spotlight" award [2016] in NTT data for coming up with implementable process alternatives to improve performance by 60%, leading up to savings of $2.7 million.
• Awarded the Silver Award [2015] at Cognizant for successfully implementing a challenging pricing configuration within a short duration of 5 months to meet the project deadlines, thus saving more than $30,000 for the client.
• Recipient of multiple "Associate of the Month" [2013] awards, 5 consecutive times, at Cognizant Technology Solutions for automating the real time code.


