
Engineer Software

Location:
Vasant Nagar, Karnataka, India
Posted:
September 30, 2020


Email id – adgi9t@r.postjobfree.com

Mob: +968*******

SAVITA A

CAREER OBJECTIVE

Seeking a challenging position in a dynamic organization where I can apply my Storage Testing and Big Data Testing skills.

PROFESSIONAL SUMMARY

Around 7 years of experience in Storage Testing and 1 year of experience in Big Data Testing.

Working knowledge of Python, including NumPy, Pandas, and core data structures (tuples, lists, dictionaries).

Experience with HBase, Hive, S3, Spark, Kafka (producer and consumer), and Informatica.

Project experience with the visualization tool Tableau: built a bullet chart using data blending and created a dashboard from the view.

Have worked on technologies like FC, FCoE, DAS, SAN, NAS, iSCSI, virtualization, VMware, backup and recovery, and data migration.

Have worked on FC switches like Cisco and Brocade, FCoE switches like IBM's Gryphon FC and Compass FC and HP's ICM, and components like HBAs, CNAs, and rack servers.

Good knowledge of storage concepts (FC, FCoE, DAS, NAS, SAN, RAID) and protocols such as FC, FCoE, FIP, and SCSI.

As a test engineer, was involved in configuring setups, writing and updating test plans, test execution, defect logging, and defect verification.

EDUCATION

B.E in Information Science from B.T.L.I.T.M, Bangalore

WORK EXPERIENCE

Worked as a Technical Lead with Aricent from August 2015 to Jan 2017.

Worked as a Senior Software Engineer with Calsoftlabs from February 2012 to Jan 2015.

Worked as a Software Engineer for Hewlett-Packard in R&D division from August 2009 to January 2012.

Worked as Software Engineer for Ektha from April 2008 to September 2008.

Worked as an Associate SQA for InMage from December 2006 to March 2008.

CERTIFICATIONS

Pursuing a certification in Data Analysis.

Pursuing a certification in Machine Learning.

Tableau Project (Sales Performance Analysis)

Used the Sample Superstore dataset. Created a bullet chart with the Category and Segment dimensions and the Sales measure. Blended the data with a second dataset, Sales Target. Color-coded the chart to identify categories and segments that are above or below target. Added the year of sales to the view to show trends and outliers.

Added a filter so that the user can select one, more than one, or all years. Created a dashboard with this view.

Professional Experience: Projects Involved

Project Undertaken in Aricent

Project 1. Big Data

Project : Big data

Organisation : Aricent

Role : Tech Lead

Duration : August 2015 to Jan 2017

Client : HP

Team Size : 8

Tools : Hue, S3, Kafka, Spark, Hive, HBase, PuTTY

Project Description

The project covers data migration validation and lookup validation. A sample file is loaded to an S3 bucket, and Kafka, Spark, and Hive are run to fetch data from the source, migrate it to the target DB, and view the data.

Kafka has a producer and a consumer: the producer fetches the data from S3, and the consumer transfers it to the target DB. In Hive we query and inspect the results.
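The producer/consumer flow described above can be sketched in Python, with an in-process queue standing in for the Kafka topic; the record contents and function names here are illustrative only, not taken from the project.

```python
import queue

# Stand-in for a Kafka topic (the real project used Kafka between S3 and the target DB).
topic = queue.Queue()

def produce(records):
    # Producer role: fetch records from the source (S3 in the project) and publish them.
    for rec in records:
        topic.put(rec)

def consume(target_db):
    # Consumer role: drain the topic and write each record to the target store.
    while not topic.empty():
        target_db.append(topic.get())

source_records = [{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]
target_db = []

produce(source_records)
consume(target_db)

# Migration validation: every source record must arrive in the target unchanged.
assert target_db == source_records
```

In the real pipeline the queue is a durable, partitioned Kafka topic and the validation step is a query against the target DB, but the producer-publishes / consumer-writes / validate-after shape is the same.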

Key Responsibilities:

Involved in Data Migration Validation.

Involved in lookup validation; used NumPy.

Used Hive for querying and viewing the results.

Ran Kafka topics.

Responsible for configuring different setups, such as setting up the topology.

Wrote learning notes on each new topic worked on, helping newcomers get hands-on quickly and understand the feature at a glance.
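The NumPy-based lookup validation mentioned in the responsibilities above can be sketched as follows; the lookup table and column values are hypothetical, shown only to illustrate the shape of the check, not the project's actual data.

```python
import numpy as np

# Reference lookup table: code -> expected label (assumed example data).
lookup = {1: "gold", 2: "silver", 3: "bronze"}

# Migrated rows as parallel arrays: the code column and the label
# that actually landed in the target DB after migration.
codes = np.array([1, 2, 3, 2])
labels = np.array(["gold", "silver", "bronze", "silver"])

# Expected labels derived from the lookup table.
expected = np.array([lookup[c] for c in codes])

# Boolean mask of rows whose migrated label matches the lookup.
matches = labels == expected
print(matches.all())  # True only when every row passes lookup validation
```

Rows where `matches` is False would be flagged as migration defects; `np.flatnonzero(~matches)` gives their indices for defect logging.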

Project Undertaken in Calsoftlabs

Project 2. Shogun (FCoE) for client IBM

Project : Shogun

Organisation : Calsoftlabs

Role : Senior Software Engineer

Duration : February 2012 to Jan 2015

Client : IBM

Team Size : 8

OS : Windows, Linux

Hardware : FCoE switches, Cisco, Brocade, SANBlaze, IXIA, CNA

Project Description

Compass-FC and Gryphon-FC are being jointly developed by IBM and QLogic to provide FC/FCoE solutions on IBM CEE switching platforms. QLogic provides the FC hardware module hosting the QLogic stack and the software module containing the FC/FCoE stack; IBM provides the hardware (Compass-FC/Gryphon-FC baseboard) and software (BladeOS) infrastructure to host the QLogic hardware and software.

Key Responsibilities:

● Mostly involved in manual regression testing and new-feature test design.

● Went through the use cases and wrote test plans.

● Wrote test plans (Entrance, Functional, Scalability).

● Was involved in Regression, Scaling and feature testing.

● Responsible for configuring different setups, such as setting up the test topology.

● Responsible for Defect tracking, Test reporting, Defect Closure.

● Wrote learning notes on each new topic worked on, helping newcomers get hands-on quickly and understand the feature at a glance.

Project Undertaken in HP

Project 3. Data Protector[Backup and Recovery Software]

Project Name : Data Protector R&D QA

Organisation : HP

Role : Software Engineer

Duration : August 2009 to January 2012

Team Size : 25

OS : Windows, Linux, HP-UX, Solaris

Hardware : HP blade servers, FC switches, servers, HBA, disk drives, tape devices, MSL library, etc.

Project Description:-

HP Data Protector software is automated backup and recovery software for single-server to enterprise environments, providing cross-platform, online backup of data on many flavours of Windows, Linux, Solaris, and HP-UX. Data Protector works with many devices (switches, servers, HBAs, arrays, tapes, etc.) and can take backups over heterogeneous environments like DAS, NAS, and SAN.

Key Responsibilities:

• Responsible for configuring different setups for Data Protector.

• Worked on backup over SAN and took backups to different devices.

• Carried out test requirements.

Project Undertaken in Ektha

Project 4. Data Migration in a SAN Environment (using switches, arrays, HBAs, data movers, hosts)

Project Name : iADM, iNSP

Organisation : Ektha

Duration : April 2008 to September 2008

Team Size : 10

OS : Windows, Linux, Solaris

Hardware : Cisco, Brocade, and McDATA switches, servers, HBA, Data Movers, iADM, iNSP

Project Description:-

The Incipient Network Storage Platform (iNSP) is software designed for customers that have ongoing data-mobility needs and little to no downtime available. iNSP combines storage virtualization technology with robust management, allowing customers to move data across their SAN environment without any downtime.

Key Responsibilities:

• Configuring storage devices.

• Responsible for configuring switches and zoning.

Project Undertaken in InMage

Project 5. Backup and Recovery Software

Project Name : DR Scout

Organisation : InMage

Role : Associate SQA

Duration : December 2006 to April 2008

Team Size : 10

OS : Windows, Linux, Solaris

Hardware : HP blade servers, FC switches, servers, HBA, disk drives, tape devices, MSL library, etc.

Project Description:-

DR Scout is a continuous backup and recovery solution. The aim was to test the functionality of DR Scout; this testing was done manually, regression-testing each feature of the product.

Installed product agents on VMware or physical machines with multiple NICs, which provides fault tolerance and load sharing.

The aim was to use a public IP instead of a private IP when doing replication, so that the private IP is not exposed to the outside world; this can be used for security purposes.

Key Responsibilities:

• Installing Backup Software on Server and Client.

• Performed performance testing.

Personal Strengths:

• Excellent presentation and communication skills.

• Good planning, listening and time managing skills.

• Good Inter-personal and team playing skills.

• Inbuilt willingness to learn.

• Adapting to new skills and technology.

Personal Details

Name : Savita A

Contact Number : 968-***-****

Email id : adgi9t@r.postjobfree.com

Education : B.E in Information Science


