
Data Engineer

Location: Columbus, OH
Posted: December 06, 2023


Mounisha Gotte

ad1q4q@r.postjobfree.com

https://www.linkedin.com/in/mounisha-gotte-7073b6207/

+1-443-***-****

SUMMARY

Overall 6+ years of professional experience in the IT industry, including 8+ months as a Data Engineer, 4+ years in data processing and data management at HSBC, 1+ year as an Operations Officer, and 6+ months as an XML Developer; Master's graduate in Data Science. Seeking a position that leverages technical skills and a passion for data to contribute to an organization's success, backed by a strong background in the Hadoop ecosystem, Apache Spark, Python, SQL, data warehousing, data processing and ETL cloud services, shell scripting, data analytics, reporting, and data quality solutions. Able to work independently and collaboratively in a variety of settings, conditions, and environments.

WORK EXPERIENCE

Idol Soft Inc, USA

Data Engineer (January 2023 – Present)

Responsibilities:

Worked with the Business Analyst team on requirements gathering, preparing functional specifications, and translating them into technical specifications.

Responsible for resolving errors, generating client payment data, transmitting data to clients, and ensuring conflict resolution using MS Access, MySQL, Perl scripting, and various other tools.

Wrote PL/SQL statements and stored procedures in DB2 for extracting and writing data.

As a PL/SQL developer, migrated legacy SQL stored procedures to Oracle, drawing on a solid understanding of both SQL and PL/SQL and experience migrating data between platforms.

Analyzed the existing SQL stored procedures to identify compatibility issues with Oracle and rewrote portions of the code to make them compatible with the Oracle database.

Experienced with data visualization tools; created dashboards using Tableau.

Worked extensively on SQL querying using joins, aliases, functions, triggers, and indexes.

Developed mappings in IBM InfoSphere DataStage to load data from various sources, including SQL Server, DB2, Oracle, and flat files, into the data warehouse.

Cleansed, extracted, and analyzed business data on a regular basis and generated ad-hoc analytical reports using Excel and T-SQL.

Responsible for database creation, login and user creation, and granting object-level permissions.

Worked with the Implementation team to ensure a smooth transition from the design phase to implementation.

HSBC (HDPI, HSBC Data Processing), HYD, Telangana, India

Trainer and Data Programming Associate (July 2017 – June 2018)

Data Referral Associate (January 2016 – July 2017)

Customer Service Executive (March 2015 – January 2016)

Process: Financial Crime Compliance (Sanction Screening): The process deals with live payments made to sanctioned countries in the Middle East Region.

Responsibilities:

Corroborate payments stopped at the initial level of screening, verify them against various parameters, and decide whether they should be referred.

Coordinate with Relationship Managers to establish the detailed purpose of each payment, and contact the concerned authorities by email with complete details about the stopped payments.

Monitor the errors and provide appropriate feedback to the staff. Render training to the new staff and monitor their learning curve.

Achievements:

Received the Top Performer of the Month award three times, including July ’16 and Oct ’17, for achieving the best results within the team on all performance parameters.

Awarded STAR Performer in February ’18 for achieving outstanding results and meeting business requirements.

Presented with a Certificate of Appreciation for maintaining 100% quality and adhering to compliance in terms of processing the payments.

VFS Global Service Pvt Ltd., HYD, Telangana, India

Operations Officer (March 2013 – August 2014)

Responsibilities:

Managed and processed visa applications for the UK and Schengen countries.

Performed non-judgmental administrative tasks on behalf of client governments.

Handled customer queries regarding applications.

Marker Soft Solution, KNR, Telangana, India.

XML Developer (August 2011 – December 2012)

Responsibilities:

E-publishing: preparing XML stylesheets based on client requirements.

Automating work through scripting (specific editors) and macros.

Used MathType 6.0 for typesetting math books and edited images using Photoshop 7.0.

Engineered web development across all layers, from database to services to user interfaces.

Supporting legacy systems with backups of all cases to/from parallel systems.

Analysis and design of databases and user interfaces.

EDUCATION

Master of Science in Data Science Jan 2021 - Dec 2022

Rowan University, Glassboro, NJ, USA

Relevant Coursework: Data Warehousing, Information Visualization, Artificial Intelligence, Natural Language Processing, Advanced Database Systems: Theory and Programming, Data Mining 1, Patient Data Understanding, Probability and Statistics, Applied Multivariate Statistics, Big Data Tools and Techniques

Bachelor of Technology in Computer Science Engineering Sep 2007 – Aug 2011

Sree Chaitanya College of Engineering (JNTUH affiliate), Telangana, India

Relevant Coursework: Advanced Data Structures, Software Engineering, Database Management Systems, Design and Analysis of Algorithms, Computer Design and Networks, Object-Oriented Analysis and Design, Network Programming, Data Warehousing and Data Mining

TECHNICAL SKILLS

Languages

Python, R, CSS, HTML, D3.js

Cloud Technologies

AWS (S3, Athena, EC2, IAM)

Big Data Technologies

Hadoop, Apache Spark, data mining, machine learning, data analysis

IDEs & Tools

PyCharm, Sublime Text 3, Databricks

Operating Systems & Productivity Tools

Windows, Linux, Microsoft Word, Excel, PowerPoint, Google Drive

Methodologies

Agile (Scrum)

Databases

Oracle, Snowflake, MySQL, SQLite, MongoDB

ACADEMIC PROJECTS

1. Building a Walmart Grocery Data Warehouse

As part of an ETL team, implemented a data warehouse using Python and SQLite containing a year's worth of sales and transaction data, supporting data cleansing, transformation, data cubes, and business reports.
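A minimal sketch of the kind of Python + SQLite ETL flow described above; the table schema, store names, and cleansing rule are illustrative assumptions, not the project's actual design.

```python
import sqlite3

# Illustrative warehouse table (assumed schema, not the project's real one).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sales (
    sale_id INTEGER PRIMARY KEY, store TEXT, item TEXT,
    qty INTEGER, price REAL, sold_on TEXT)""")

# Extract: raw rows (hard-coded here; in practice read from flat files).
raw = [
    (1, "Store_A", "milk", 2, 3.49, "2022-01-05"),
    (2, "Store_A", "bread", None, 2.99, "2022-01-05"),  # missing qty -> cleanse
    (3, "Store_B", "milk", 1, 3.49, "2022-01-06"),
]

# Transform: drop rows with missing quantities (a simple cleansing rule).
clean = [r for r in raw if r[3] is not None]

# Load the cleansed rows into the warehouse.
conn.executemany("INSERT INTO sales VALUES (?, ?, ?, ?, ?, ?)", clean)

# Report: revenue per store, the kind of aggregate a business report would use.
report = conn.execute(
    "SELECT store, ROUND(SUM(qty * price), 2) FROM sales "
    "GROUP BY store ORDER BY store"
).fetchall()
print(report)  # [('Store_A', 6.98), ('Store_B', 3.49)]
```

Real data cubes and reports would build further aggregates on top of tables like this one.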

2. Language Prediction Using Python

Language detection is the task of automatically identifying the language in which a text was written. Developed a naive Bayes classifier in Python to identify the language of a given text: applied text preprocessing, a bag-of-words representation, and vectorization, then trained and tested the model to obtain accurate predictions without compromising the data.
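The bag-of-words naive Bayes approach can be sketched in a few lines of plain Python; the toy two-language corpus below is an assumption standing in for the project's real training data.

```python
import math
from collections import Counter

# Toy labeled corpus (assumed); the real project used a larger dataset.
train = [
    ("the cat sat on the mat", "en"),
    ("a dog ran in the park", "en"),
    ("le chat est sur le tapis", "fr"),
    ("un chien dans le parc", "fr"),
]

# Bag-of-words counts per language (the "vectorization" step).
word_counts = {}
class_counts = Counter()
vocab = set()
for text, lang in train:
    class_counts[lang] += 1
    counts = word_counts.setdefault(lang, Counter())
    for w in text.split():
        counts[w] += 1
        vocab.add(w)

def predict(text):
    """Multinomial naive Bayes with Laplace (add-one) smoothing."""
    best_lang, best_score = None, float("-inf")
    for lang in word_counts:
        # log prior for the class
        score = math.log(class_counts[lang] / sum(class_counts.values()))
        total = sum(word_counts[lang].values())
        for w in text.split():
            # log likelihood of each word, smoothed over the vocabulary
            score += math.log((word_counts[lang][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best_lang, best_score = lang, score
    return best_lang

print(predict("le chat dans le parc"))  # fr
```

Text preprocessing (lowercasing, stripping punctuation) would normally run before counting; it is omitted here for brevity.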

3. Real Estate Price Prediction

Implemented a house price prediction model for Bangalore, India in Python. Applied machine learning techniques to historical property transactions to discover models useful for house buyers and sellers, including data pre-processing and model building with K-fold cross-validation and linear regression.
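The K-fold cross-validation and linear regression combination can be illustrated with a single-feature least-squares fit; the synthetic area/price data below is an assumption standing in for the Bangalore housing records.

```python
import random

# Synthetic stand-in for the housing data (an assumption):
# price grows roughly linearly with area, plus Gaussian noise.
random.seed(0)
data = [(area, 50 + 0.8 * area + random.gauss(0, 5))
        for area in range(500, 2000, 30)]

def fit(points):
    """Ordinary least squares for one feature: price = a + b * area."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    b = (sum((x - mx) * (y - my) for x, y in points)
         / sum((x - mx) ** 2 for x, _ in points))
    return my - b * mx, b  # intercept, slope

def kfold_rmse(points, k=5):
    """K-fold cross-validation: average held-out RMSE over k splits."""
    random.shuffle(points)
    folds = [points[i::k] for i in range(k)]
    rmses = []
    for i in range(k):
        test = folds[i]
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        a, b = fit(train)
        mse = sum((y - (a + b * x)) ** 2 for x, y in test) / len(test)
        rmses.append(mse ** 0.5)
    return sum(rmses) / k

print(round(kfold_rmse(data), 2))  # held-out RMSE near the noise level (~5)
```

The real project would use multiple features (location, bedrooms, etc.) and a library such as scikit-learn, but the validation logic is the same.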

4. Academic Scholarly Paper on Patient Data Understanding

5. VAST Challenge 2018 MC3

This team challenge required extensive use of Python, Tableau, network graphs, and timeline graphs in R to draw insights for the given questions. Data cleaning in Python converted timestamp data to datetime format. Created a dynamic report using HTML, CSS, and JavaScript; developed Tableau graphs showing the growth of the company's purchases over the years; highlighted suspicious purchases using Tableau projections; and projected network graphs in R.

6. Medical Cost Prediction

A statistical data analysis project for health insurance: predicted healthcare costs and calculated premiums. Used regression analysis in RStudio to identify the factors that most affect insurance costs, built a regression model from those factors to calculate health insurance premiums, and ran hypothesis tests to verify the statistical assumptions. With proper classification techniques, the approach offers a fair process to the customer while maximizing the company's profit.

7. Indian Summers over the Years

An end-to-end big data engineering project using Apache Spark and AWS services on an Indian-summers dataset (2011–2022), with a three-step ETL pipeline: data ingestion, data curation, and data provisioning using Databricks. Deployed and monitored the Spark job with a small EMR cluster and an AWS Lambda script, orchestrating the Lambda script with Step Functions and a CloudWatch rule. Created Bronze, Silver, and Gold Delta versions of the dataset by validating and cleaning null values, then built dashboards and reports from the Silver and Gold datasets by connecting them to Tableau through Athena.
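The Bronze/Silver/Gold layering can be illustrated without a Spark cluster; this pure-Python sketch stands in for the PySpark transformations, and the column names and records are hypothetical.

```python
# Bronze: raw ingested records, nulls and all (illustrative column names).
bronze = [
    {"city": "Delhi", "date": "2015-05-01", "max_temp_c": 44.2},
    {"city": "Delhi", "date": "2015-05-02", "max_temp_c": None},  # bad record
    {"city": "Chennai", "date": "2015-05-01", "max_temp_c": 41.0},
    {"city": None, "date": "2015-05-01", "max_temp_c": 39.5},     # bad record
]

# Silver: validated records -- drop rows with null keys or measurements.
silver = [r for r in bronze if all(v is not None for v in r.values())]

# Gold: an aggregated, report-ready view -- e.g. peak temperature per city,
# the kind of table a Tableau dashboard would read via Athena.
gold = {}
for r in silver:
    gold[r["city"]] = max(gold.get(r["city"], float("-inf")), r["max_temp_c"])

print(gold)  # {'Delhi': 44.2, 'Chennai': 41.0}
```

In the actual pipeline these steps would be Spark DataFrame transformations writing Delta tables, with Lambda and Step Functions handling orchestration.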


