Resume

Data Security

Location:
United States
Salary:
80000
Posted:
October 16, 2020

Contact this candidate

Resume:

Kavitha Guntuka

Visa: H* EAD Cell: +1-682-***-**** adg04v@r.postjobfree.com Cincinnati, OH – 45216

https://www.linkedin.com/in/kavitha-guntuka-b689471b5/

Professional Summary

●Strong experience with Splunk 6.x and with distributed Splunk architecture and components, including search heads, indexers, and forwarders

●Experience in customizing Splunk for monitoring, application management, and security per customer requirements and industry best practices

●Experience in installing and configuring Splunk forwarders on Linux, Unix, and Windows

●Knowledge of Splunk configuration files (props.conf, transforms.conf, outputs.conf, inputs.conf)

●Create and maintain reports and alerts in APM tools.

●Good experience with ticketing tools (Jira, ServiceNow, BMC Remedy)

●Hands-on experience upgrading Splunk from older versions to newer versions.

●Expert in building custom searches and visualizations in both Splunk Core and Splunk ITSI.

●Managed Splunk user roles by mapping them to newly created and existing AD groups.

●Configured and pushed configuration files such as inputs.conf, outputs.conf, props.conf, transforms.conf, and deploymentclient.conf from the deployment server to agent servers, depending on user requirements.

●Published data into Splunk through configuration files such as serverclass.conf, server.conf, app.conf, inputs.conf, and outputs.conf.
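
The deployment-server workflow described above might be sketched as a serverclass.conf fragment (illustrative only; the class name, hostname pattern, and app name are placeholder assumptions, not taken from any real environment):

```ini
# serverclass.conf on the deployment server (hypothetical names)
[serverClass:linux_web]
whitelist.0 = web-*.example.com

# Push the hypothetical "web_inputs" app to matching forwarders
[serverClass:linux_web:app:web_inputs]
restartSplunkd = true
stateOnClient = enabled
```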

●Standardized Splunk forwarder deployment, configuration, and maintenance across all the UNIX and Windows platforms.

●Installation and capacity management of CyberArk Privileged Session Manager

●Implemented a streamlined process for Splunk requests from users.

●Experience with cloud services such as AWS and Microsoft Azure.

●Monitored the health of indexer cluster members, search head cluster members, and index volume utilization via the Monitoring Console and built-in tools.

●Experience in installing and configuring Dynatrace application monitoring components

●Configured SCAPM environment alerting mechanism on the Splunk servers.

●Automated Splunk deployment on servers using shell scripting and integrated it with Resolve (an FIS automation tool). Integrated data from various devices into the Splunk environment and created dashboards and reports in Splunk.

●Built, customized, and deployed Splunk apps per internal customers' requirements

●Experience in creating complex queries, alerts, reports and dashboards

●Experience in setting up heavy forwarders (HF) for all Splunk components, indexer clustering, and search heads

●Develop Splunk based Dashboards, Reports and Alerts catering to the needs of Application Support teams.

●Evaluate processes and requirements to determine the best Splunk-based dashboard for tracking and reporting

●Knowledge of Splunk architecture and its components: indexer, search head, deployment server, heavy and universal forwarders, and the license model.

●Analysis, design, and development of Splunk queries to generate reports, and experience running SPL queries.

Education

●Master of Computer Application - Satavahana University, Karimnagar, India (1997-2000)

●Bachelor of Science in Electronics - Government Degree College, Siddipet, India (1994-1997)

Technical Skills

SIEM TOOL

IBM QRadar, Splunk, IBM Guardium.

SPLUNK:

Splunk 5.x and 6.x, Splunk Enterprise, Splunk on Splunk, Splunk DB Connect, Splunk IT Service Intelligence, Splunk Web Framework, Splunk Machine Learning Toolkit, Hunk.

OPERATING SYSTEMS

Windows 2000, XP, Windows NT, Unix/Linux (Red Hat), VMWare.

DATA ANALYSIS

Requirement Analysis, Business Analysis, detail design, data flow diagrams, data definition table, Business Rules, data modelling, Data Warehousing, system integration

RDBMS

Oracle 11g/10g/9i/8i, MS SQL Server 2000/2005/2008, Sybase, DB2, MS Access.

WEB TECHNOLOGIES

HTML, DHTML, JavaScript, XML.

WEB/APP SERVERS

Apache Tomcat 6.0, WebLogic 8.1/9.2, WebSphere 6.0

CONCEPTS

SDLC, Object Oriented Analysis and Design.

Professional Experience

Splunk Admin/Developer Feb 2018 – Present

Fifth Third Bank, Cincinnati, OH

●Installed & Configured Splunk, Apps & relevant components.

●Installed Sandboxes, Search head clusters and Indexer clusters.

●Performed advanced searching, throttling, reporting, and alert scheduling with Splunk.

●Freed up disk space on the search heads and forwarders.

●Worked on Onboarding logs into Splunk.

●Monitored Splunk Infrastructure for capacity planning and optimization.

●Used Splunk Enterprise Security to configure correlation searches, key indicators, and the risk scoring framework.

●Assist internal users of Splunk in designing and maintaining production quality dashboards

●Arrange necessary trainings to Splunk internal customers

●Design core scripts to automate Splunk maintenance and alerting tasks

●Create presentation layers for Technical, Business and Executive Management showing environment operational health based on Key Performance Indicators

●Experienced in working with Splunk authentication and permissions and having significant experience in supporting large scale Splunk deployments.

●Involved in admin activities; worked with inputs.conf, indexes.conf, props.conf, and transforms.conf to set up time zone and timestamp extraction, complex event transformations, and event line breaking.
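
The kind of props.conf tuning described above might look like the following sketch (the sourcetype name, time zone, and formats are hypothetical, not from any real deployment):

```ini
# props.conf (hypothetical sourcetype)
[app:events]
TZ = America/New_York
TIME_PREFIX = ^\[
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 30
# Break events on newlines followed by a bracketed timestamp
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\[
TRUNCATE = 10000
```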

●Created dashboards, visualizations, statistical reports, scheduled searches, alerts, and various other knowledge objects.

●Experienced in Security Information and Event Management (SIEM) with good knowledge of information security products (firewalls, IDS/IPS)

●Created macros using REST APIs for various saved searches in our environment.

●Created the reports and saved searches for the development environment.

●Developed various dashboards, reports for IT Infrastructure, IT Security, Leadership and other relevant stakeholders.

●Created custom app configurations (deployment-apps) within Splunk to parse and index multiple log formats.

●Created Splunk Apps using XML and Web Components. Knowledge of app creation, user and role access permissions.

●Monitored and administered multiple monitoring tools such as AppDynamics, Dynatrace, New Relic, and CA APM.

●Developed an XML parser in Python for extracting data from Dynatrace server profiles

●Configurations including AD integration and management of CyberArk Enterprise Password Vault.

●Resolved issues preventing CyberArk's Central Password Manager from communicating with hosts to reconcile credentials

●Converted data types (list, raw, table) from the Splunk environment to QRadar metrics

●Conducted security investigations into customer incidents using QRadar Security Intelligence

●Extensive experience implementing Splunk service and app monitoring for new applications, devices, and platform components.

●Developed Splunk Search Processing Language (SPL) queries, created Reports, Alerts and Dashboards and customized them.

●Designing and maintaining production-quality Splunk dashboards.

●Extensively used AppDynamics to monitor CPU and memory usage, JVM heap health, session and thread counts, and application log errors.

●Understood client business requirements and translated them into technical requirements and use cases.

●Experience with automation of operational tasks in a fast-growing environment

●Basic administrative skills in Linux and Windows environments

●Strong quantitative and problem-solving skills

●Supported and assisted in the design, configuration, deployment, and integration of Splunk Enterprise Security Suite based on defined requirements and objectives.

●Experienced in Configuring and Monitoring Splunk Behavior

●Handled tickets and created change requests through BMC Remedy tool.

●Worked with Splunk Support team for various Splunk issues and Splunk licenses.

●Worked on the UNIX platform to perform server health checks and troubleshooting.

●Troubleshot log validation issues and wrote regular expressions.

●Creation of knowledge objects (Lookup Tables, alerts).

●Field Extraction using Delimiters and Regular Expressions.
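
The two extraction styles named above (delimiters and regular expressions) can be illustrated in Python; the regex here is the kind that might back a Splunk rex or EXTRACT- rule, and the log formats are made-up examples, not from any real environment:

```python
import re

# Regex-based extraction with named capture groups (hypothetical web-access format)
LOG_PATTERN = re.compile(
    r"(?P<client_ip>\d{1,3}(?:\.\d{1,3}){3})\s+"  # client IP address
    r"(?P<method>GET|POST|PUT|DELETE)\s+"          # HTTP method
    r"(?P<uri>\S+)\s+"                             # request URI
    r"(?P<status>\d{3})"                           # HTTP status code
)

def extract_fields(line):
    """Return a dict of named fields, or an empty dict if the line doesn't match."""
    m = LOG_PATTERN.search(line)
    return m.groupdict() if m else {}

# Delimiter-based extraction (analogous to Splunk's DELIMS/FIELDS settings)
def extract_delimited(line, names, delim="|"):
    """Split a delimited line and zip the pieces with field names."""
    return dict(zip(names, line.split(delim)))
```
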

●Assigned users and the corresponding roles.

Splunk Engineer Sept 2016 - Jan 2018

Capital One, Tampa, FL

●Experience installing a complex, distributed Splunk environment with search head and indexer clustering enabled, and maintaining all instances hosted on AWS.

●Upgraded the Splunk instances and syslog servers/heavy forwarders to a stable version on a quarterly basis.

●Helped with on-boarding data into Splunk using HTTP Event Collector (HEC), Fluentd forwarder, and Splunk UF.
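The HEC on-boarding mentioned above can be sketched in Python (a hedged illustration; the endpoint URL, token, and field values are placeholders, not real deployment details):

```python
import json
import urllib.request

# Placeholder HEC endpoint and token -- not real values
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_event(message, source, sourcetype, index):
    """Build the JSON body the HEC /services/collector/event endpoint expects."""
    return json.dumps({
        "event": message,
        "source": source,
        "sourcetype": sourcetype,
        "index": index,
    })

def hec_request(body):
    """Construct (but do not send) the authenticated POST request."""
    return urllib.request.Request(
        HEC_URL,
        data=body.encode("utf-8"),
        headers={"Authorization": "Splunk " + HEC_TOKEN},
        method="POST",
    )

body = build_hec_event("app started", "app.log", "app:log", "main")
# urllib.request.urlopen(hec_request(body))  # would actually send the event
```
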

●Developed Glass Tables using Splunk ITSI to help visualize flow of transactions, and to identify weak links across the application's architecture.

●Managed user roles to complement Security and operational utilization.

●Managed alerts and scheduled reports across multiple teams to keep the scheduled search utilization under operational limit.

●Incorporated deep dives into glass tables for a better view into application performance over time.

●Optimized Splunk performance by separating indexers and search heads across different machines.

●Maintain and manage the search head and index clusters.

●Administered and helped install universal forwarders on geographically distributed servers to forward log events to the indexer clusters.

●Participated in identifying, designing and building Splunk dashboards with drilldown.

●Created dashboards related to financial and business transactions, infrastructure monitoring, and various applications.

●Installed and used the Splunk App for Unix and Linux.

●Created historical and real-time Dashboards, reports, scheduled searches.

●Helped create and set up alerts to monitor the environment and data inputs.

●Created and managed data models using the CIM add-on, and helped users acclimate to the Pivot tool to create reports from data models.

●Worked with props.conf and transforms.conf files to reduce license costs by routing all unnecessary log events to the null queue.
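
The null-queue routing described above is typically a props.conf/transforms.conf pair along these lines (a sketch; the sourcetype, stanza names, and regex are hypothetical):

```ini
# props.conf -- attach a routing transform to a hypothetical sourcetype
[app:verbose]
TRANSFORMS-drop_debug = drop_debug_events

# transforms.conf -- send matching events to nullQueue so they are never indexed
[drop_debug_events]
REGEX = \sDEBUG\s
DEST_KEY = queue
FORMAT = nullQueue
```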

●Created KPIs using ITSI to monitor performance of various applications on an enterprise level.

●Worked with summary indexing to help improve search performance.

●Integrated log sources into the Splunk solution to enable monitoring and reporting.

●Worked with Splunk app deployments, troubleshooting of Splunk instances, log formatting, license management, and user roles and capabilities.

●Experience in troubleshooting activities across all instances, such as configuration issues and indexer clustering.

●Experience in troubleshooting activities with the Splunk technical team.

●Engaged as an SME and developed documentation and SOPs to enable the monitoring team.

●Wrote Splunk queries using SPL to visualize data in various dashboards and reports, and built correlation searches that generate notable events.
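
A scheduled correlation search of the kind described above might be defined in savedsearches.conf roughly as follows (the search name, index, threshold, and schedule are all placeholder assumptions):

```ini
# savedsearches.conf (hypothetical correlation search)
[Excessive Failed Logins]
search = index=auth action=failure | stats count by user, src | where count > 10
cron_schedule = */15 * * * *
enableSched = 1
# Raise a notable event in Enterprise Security when results are returned
action.notable = 1
action.notable.param.severity = high
```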

●Creating new use cases that enhance the company's security threat detection.

●Created knowledge objects and utilized them in search queries.

●Utilized threat intelligence and developed use cases around it.

●Troubleshot Splunk application issues related to SPL (Search Processing Language).

Splunk Developer Nov 2015 – Aug 2016

Macy's Systems - Atlanta, GA

●Worked as Splunk Developer and analyst in banking/financial projects.

●Created and managed dashboards of various types of logs using customized Splunk queries.

●Created dashboard visualizations with the objective of making it simpler for stakeholders to make informed decisions.

●Worked on lookups and KV store.

●Created and optimized alerts and reports as required by the stakeholders.

●Worked on DB Connect (DBX) to onboard data from databases into Splunk.

●Identified critical data sources requiring faster retrieval and placed them in summary indexes.

●Monitored the license usage and created custom dashboards and alerts to closely observe the usage.

●Worked on writing complex regular expressions to extract fields from different sources of logs.

●Extensively worked on logs to identify sensitive data and masked it using sed-style commands.
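
In Splunk, sed-style masking is typically applied at index time with a SEDCMD rule in props.conf; a hedged sketch (the sourcetype and pattern are illustrative, not from any real environment):

```ini
# props.conf -- mask the first 12 digits of a 16-digit card number
[app:payments]
SEDCMD-mask_card = s/\d{12}(\d{4})/XXXXXXXXXXXX\1/g
```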

●Worked on creating tags and event types to introduce modularity in dashboards.


