
Location:
Vasant Nagar, Karnataka, India
Posted:
January 22, 2021


Resume- Splunk

Maruthi Prasad

Mobile: +**-776*******

E-mail: adjmeo@r.postjobfree.com

13+ years of total IT experience, including 5+ years as a Senior SIEM Splunk Analyst covering threat hunting, threat intelligence, risk management and enterprise security, along with software analysis, design and development for various applications providing Business Intelligence solutions in data warehousing for decision support systems and database application development.

3+ years of experience as a Splunk Admin and more than 3 years of experience as a Splunk Developer.

Strong experience with Cyber Threat Intelligence tooling, particularly using Splunk.

Experience in Threat investigations, reporting, investigative tools and laws/regulations.

Reviewing incident and penetration-testing reports and corresponding logs to identify gaps in detection capability and provide recommendations to improve them.

Expertise in security implementation, enterprise security planning and administration of security solutions.

Good working knowledge of the Python scripting language and of integrating it with Splunk.
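
As a minimal sketch of the kind of Python–Splunk integration referred to above (the sample data, endpoint behavior and function name here are illustrative assumptions, not from an actual engagement), the snippet below parses the line-delimited JSON that Splunk's search-export endpoint emits in JSON output mode:

```python
import json

# Illustrative sample of what Splunk's /services/search/jobs/export endpoint
# returns in JSON mode: one JSON object per line, each search result carried
# under a "result" key. (Hypothetical hosts/values for demonstration only.)
SAMPLE_EXPORT = "\n".join([
    json.dumps({"result": {"host": "web01", "status": "200"}}),
    json.dumps({"result": {"host": "web02", "status": "500"}}),
])

def parse_export(raw):
    """Collect the "result" payload from each line of an export stream."""
    events = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue  # skip blank keep-alive lines
        obj = json.loads(line)
        if "result" in obj:
            events.append(obj["result"])
    return events

print(parse_export(SAMPLE_EXPORT))
# → [{'host': 'web01', 'status': '200'}, {'host': 'web02', 'status': '500'}]
```

In practice the raw stream would come from an authenticated HTTP call to the Splunk REST API rather than a canned string.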

5+ years of experience in Data Warehouse, Data Mart, Data Integration and Data Conversion ETL projects using Informatica PowerCenter 10.2/9.6/8.6.

Good knowledge of Splunk architecture and its various components (indexer, forwarder, search head, deployment server), heavy and universal forwarders, and the license model.

Expert in extracting, transforming, analyzing, visualizing and presenting data from diverse business areas in insightful ways that enable Directors, Vice Presidents and executives to take informed action.

Experience in working with Splunk authentication and permissions and having significant experience in supporting large scale Splunk deployments.

Worked on large datasets to generate insights by using Splunk for Cyber Security.

Attended Microsoft Azure Cloud trainings for AZ-500 and AZ-104.

Expertise in preparing, arranging and testing Splunk search strings and operational strings.

Working knowledge of the Splunk apps for AWS and Azure.

Knowledge of the ELK stack (Elasticsearch).

Experience in developing Splunk queries and dashboards targeted at understanding application performance and capacity analysis.

Creating and managing apps; creating users and roles and assigning permissions to knowledge objects.

Creating dashboards from searches; scheduling searches (inline vs. scheduled searches in a dashboard); various chart types; alert settings; app creation and user/role access permissions.

Experienced in all data processing phases, from the enterprise data model (logical and physical) through ETL.

Good understanding of data flow diagrams, data dictionary techniques, entity-relationship modeling and design techniques, and database normalization theory in RDBMS.

TECHNICAL PROFICIENCY

Monitoring Tools for Cyber Security : Splunk 6.0, 6.2, 6.3.3, 6.4, 7.2

Data Warehousing : Informatica, Tableau, MSBI and Business Objects

RDBMS : SQL/PL-SQL, SQL Server, DB2, GreenPlum

Schedulers : Control-M, $U

Middle ware : TIBCO BW

Programming Languages : Java, JSF, Struts

Configuration Mgmt : Ansible

Operating Systems : Windows & Unix

Scripting : Unix Shell Scripting, Perl Scripting, Python Scripting, PowerShell Scripting

Concepts : Data Warehousing, Unit Testing, Manual Testing and Performance Testing

Tools : ServiceNow, JIRA, HP QC, Remedy, CQSR

ACADEMIC CREDENTIALS

Completed MCA at Sri Krishna Devaraya University, Anantapur, A.P., in 2005.

Certifications and Trainings

Completed NCFM – Financial Markets Certification

Completed NCFM – Mutual Funds Certification.

Completed SCJP – Sun Certified Java Programmer.

Completed Splunk User certification.

Completed Splunk Power User and Admin Certification.

Attended and completed the Microsoft Azure Cloud trainings below.

-AZ-104 : MS Azure Administrator

-AZ-500 : MS Azure Security Technologies

Microsoft Cloud certification

-AZ-900 : Microsoft Azure Fundamentals

Completed the Qualys certifications below:

Vulnerability Management

AssetView and Threat Protection

Scanning Strategies and Best Practices

Reporting Strategies and Best Practices

Policy Compliance Strategies & Best Practices

PCI Compliance

Web Application Scanning

Cloud Agent

PROFESSIONAL WORK EXPERIENCE

Worked as a Sr Splunk Analyst in Wipro, Bangalore from 15th April 2019 to 31st Dec 2020.

Worked as a Sr Cyber Splunk Analyst in Unisys, Bangalore from 13th Feb 2017 to 21st June 2018.

Worked as a Sr Splunk Developer in Accenture Services Ltd, Bangalore from 16th August 2011 to 2nd Dec 2016.

Worked as a BI Lead in Tata Consultancy Services, Bangalore from 3rd March 2010 to 12th August 2011.

Worked as a BI Developer in Cognizant Technology, Bangalore from 11th Sept 2006 to 26th Feb 2010.

PROJECTS HANDLED

Project #1: ANZ Splunk Team

Environment: Splunk 7.2

Duration: 15th April 2019 to 31st Dec 2020

Client : ANZ Bank

Role: Sr Splunk Analyst

Description

ANZ focuses on ensuring the business is managed to take account of social, environmental and economic risks and opportunities.

By taking these factors into consideration across all areas of the business, ANZ can create and preserve value for customers, shareholders, people, the environment and the communities in which it operates.

The ANZ Sustainability Framework supports the business strategy, reflects material issues and aligns with the bank's purpose.

At the core of the framework is fair and responsible banking: keeping pace with the expectations of customers, employees and the community, behaving fairly and responsibly, and maintaining high standards of conduct.

Key Responsibilities

Monitored Splunk health status per application interface and per business phase.

Developed the Ambit app and delivered it successfully.

Involved in the ERTS app to configure the Infra and Business dashboards for the CL, CRIS, CRE, RAZOR, BECCS and RAY applications.

Created critical, hardware and warning alerts for the CL, RAZOR, CRIS and BECCS applications.

Involved in Splunk installation and configuration to onboard data.

Created apps to onboard the respective data for the CRE, BECCS and RAY applications.

Reviewed existing dashboards and performed gap analysis.

Ensured each dashboard was fit for purpose and accurate.

Made granular-level details available via drill-down on the dashboards.

Provided correlation between the application and business dashboards.

Provided a breakup of new vs. existing contracts.

Involved with N number of applications per business phase and trend.

Reported the average cycle time for contracts per business phase.

Project #2: Unisys UIT

Environment: Splunk 7.2

Duration: Feb 2017 to 21st June 2018

Role: Sr Cyber Security (Splunk) Analyst

Description

Unisys is a worldwide information technology company. We provide a portfolio of IT services, software, and technology that solves critical problems for clients. We specialize in helping clients secure their operations, increase the efficiency and utilization of their data centers, enhance support to their end users and constituents, and modernize their enterprise applications. To provide these services and solutions, we bring together offerings and capabilities in outsourcing services, systems integration and consulting services, infrastructure services, maintenance services, and high-end server technology.

Key Responsibilities

Splunk resource administration & monitoring (CPU, memory, disk, network, hot and cold locations).

Splunk Admin for creating and managing apps and for creating users, roles and permissions for knowledge objects.

Splunk scheduled job administration & monitoring

Ownership responsibilities to obtain and maintain User Access Information.

Collaboration with the wider Cyber security functions to develop hypotheses for new attack techniques and evasion methods.

Created Splunk dashboards for various types of business users in the organization.

Reviewing incident and penetration testing reports and corresponding logs, to identify gaps in our detection capability and provide recommendations to improve them.

Splunk License usage monitoring & system optimization.

Audited Splunk user activity.

Created Splunk alerts based on critical parameters, which trigger emails to the operational team.

Contributing to the continued evolution of hunting, monitoring, detection, analysis and response capabilities and processes.

Prepared, arranged and tested Splunk search strings and operational strings, including writing regexes.
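
The regex-writing side of that work can be illustrated with a small, self-contained Python sketch (the log line and field names below are hypothetical examples, not client data); it applies the same kind of named-group regex that a Splunk rex/EXTRACT field extraction would use:

```python
import re

# Hypothetical raw event of the kind a Splunk field extraction targets.
LOG = "2018-03-14 10:22:01 user=jsmith src_ip=10.1.2.3 action=login status=failure"

# Named capture groups play the role of extracted Splunk fields.
FIELD_RE = re.compile(
    r"user=(?P<user>\S+)\s+src_ip=(?P<src_ip>\S+)\s+"
    r"action=(?P<action>\S+)\s+status=(?P<status>\S+)"
)

def extract_fields(line):
    """Return a dict of extracted fields, or an empty dict on no match."""
    m = FIELD_RE.search(line)
    return m.groupdict() if m else {}

print(extract_fields(LOG)["src_ip"])  # → 10.1.2.3
```

The same pattern, minus the Python wrapper, could sit in a props.conf EXTRACT stanza or an inline `rex` command.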

Splunk architecture and various components (indexer, forwarder, search head, deployment server), Heavy and Universal forwarder, License model.

Parsing, indexing, Hot, Warm, Cold & Frozen bucketing.

Training, developing, mentoring and inspiring colleagues across the function in area(s) of specialism, strengthening Cyber security Operations capabilities.

Researching new and existing threat actors and associated tactics, techniques and procedures (TTPs).

Developing a detailed understanding of their potential impact to the organization, providing recommended solutions for improving our defensive and detective capability.

Optimized Splunk for peak performance by splitting Splunk indexing and search activities across different machines.

Created many proof-of-concept dashboards for IT operations and service owners, which are used to monitor application and server health.

Project #3: Sonepar IT

Employer: Accenture India Pvt Ltd, Bangalore

Environment: Cyber Security - Splunk 6.4

Duration: July 2013 to 2nd Dec 2016

Role: Sr Cyber Security - Splunk Developer

Sonepar USA is made up of multiple electrical supply distributors (Op-Cos) across the United States and Puerto Rico. They are a member of the Sonepar group, the world's largest privately-held electrical distributor, based in France. Sonepar USA has grown tremendously in the past 5 years through strategic acquisitions of small and medium regional competitors. To date the company has acquired over 15 companies, 2 within the last year. Due to its acquisition strategy and rapid growth, Sonepar USA faced many information management challenges: disparate system platforms, limited integration, manually intensive and informal data collection, limited reporting and analytics capabilities, etc. Sonepar partnered with Accenture to define its information management strategy, with the vision of creating a best-in-class business intelligence solution to deliver a consolidated view of key data and provide accurate, consistent and timely analytics and reporting.

Key Responsibilities

Splunk resource administration & monitoring (CPU, memory, disk, network, hot and cold locations).

Managing and monitoring of indexes, indexers, search heads, deployment, intermediate, rsyslog, syslog-ng and other Splunk supporting systems.

Splunk scheduled job administration & monitoring

Splunk License usage monitoring & system optimization

Training, developing, mentoring and inspiring colleagues across the function in area(s) of specialism, strengthening Cyber security Operations capabilities.

As per user requests, developed, tested and implemented new field extractions, alerts, reports and dashboards.

Developing a detailed understanding of their potential impact to the organization, providing recommended solutions for improving our defensive and detective capability.

Collaboration with the wider Cyber security functions to develop hypotheses for new attack techniques and evasion methods.

Ownership responsibilities to obtain and maintain User Access Information

Proactive Splunk issue identification & resolution.

Optimized Splunk for peak performance by splitting Splunk indexing and search activities across different machines.

Audited Splunk user activity.

Project #4: BIaaS DW

Client: EMC,USA

Employer: Accenture India Pvt Ltd, Bangalore

Environment: Informatica 8.6, Informatica Power Exchange, GreenPlum, Control-M, Tableau

Duration: 16th Aug 2011 to June 2013

Role: BI Lead

Description

The EMC BIaaS project presents a number of challenges not found in traditional ETL/BI development. The initial implementation of this endeavor consists of a number of "proof of concept" iterations that load data from existing EMC sources into the central repository, a Greenplum MPP database. Greenplum is a newer technology occupying the same technical space as Teradata, Netezza and other MPP platforms. As an MPP application, its greatest strength is the ability to perform fast searches across large data spaces. To meet the project's performance requirements, high-speed methods of data loading must be developed and an overall framework must be implemented and published.

Key Responsibilities:

Reviewed code and handled unit testing and performance testing; ensured adherence to best-practice methodology during project implementation.

Developed the Dashboards using Tableau.

Designed and developed Extract, Transform and Load (ETL) and data integration solutions for complex functional and technical requirements, primarily using Informatica PowerCenter.

Delivered high-quality, reliable software as per the client's requirements.

Act as a key point of contact for interfacing with clients and for resolving contingencies.

Involved in creating WR#s to migrate code from DEV to QA & PROD.

Involved in creating Harvest packages for DB scripts (DDL/DML) for the maps in the GP database.

Involved in creating QC docs, code review docs and technical design docs and uploading them into EMC's eRoom.

Joined the daily status call for updates and shared action items with the team.

Motivated the team while troubleshooting issues and during escalation periods for deliverables.

Experience in Incident/Change/Problem Management using the HP Service Manager 9 application.

Ensured the team closed tickets within the Service Level Agreement.

Project #5: CISCO Fin ITDS

Client: CISCO, USA

Employer: Tata Consultancy Services Pvt Ltd, Bangalore

Environment: Informatica 8.6, TeraData, Business Objects, OBIEE, $U Scheduler

Duration: Jan 2011 to Aug 2011

Role: BI Lead

Description:

CISCO mainly deals in telecom-domain products, providing both products and services. In FIN ITDS, the Finance Bookings & Revenue universe supports all DW users across CISCO.

Responsibilities:

Involved in understanding business requirements to pull reports using Business Objects.

Imported table definitions; involved in designing the star schema.

Understood the data process flow and workflows.

Involved in performance tuning.

Resolved Remedy tickets/cases without missing any SLA.

Worked 24/7 on this project to assist users with security, access and general issues in Business Objects.

Migrated reports from the MARS universe to the FIN BI Bookings and Revenue universes in the CISCO FIN ITDS project.

Monitored daily job runs from $Universe (scheduling tool) in the UNIX environment.

Scheduled reports based on business users' requirements, including customized dates.

Ran Informatica daily/weekly/monthly/quarterly jobs using the $U scheduler.

Communicated with business users about delayed jobs, issues and maintenance activities.

Ensured the team closed tickets within the Service Level Agreement.

Project #6: USAA -MRKT

Client: USAA, USA

Employer: Tata Consultancy Services Pvt Ltd, Bangalore

Environment: Informatica 8.6, DB2,Control-M, Unix Shell Scripting

Duration: 3rd Mar 2010 to Dec 2010

Role: BI Lead

Description:

USAA is a full financial services company that proudly serves the military and their families. USAA's products and services include insurance, banking, shopping & discounts, financial planning services and investments. USAA is more than just doing business with an organization that's financially strong: it's getting access to truly competitive products, award-winning customer service and the convenience of banking, investments and insurance when and where you want. USAA facilitates the financial security of its members, associates and their families by providing a full range of highly competitive financial products and services. Unlike so many other financial services companies, USAA earns some of the highest financial strength ratings.

Responsibilities:

Involved in understanding business requirements for the design and implementation of tasks and activities.

Understood the data process flow and workflows.

Involved in performance tuning.

Created an issue log and discussed it in the daily status call.

Involved in solving Priority 1 business cases and in maintaining and supporting production on a full-cycle basis.

Handled critical, high-priority, time-bound issues, providing detailed root cause and resolution to business users and documenting issues and solutions for future reference.

Provided production and non-production support and coordinated with DBAs and relevant teams.

Extensively involved in data extraction, transformation and loading (the ETL process) from source to target systems.

Ensured the team closed tickets within the Service Level Agreement.

Project #7:CompuCredit EDW

Client: CompuCredit, USA

Employer: Cognizant Technology Pvt Ltd, Bangalore

Environment: Informatica 8.6, Oracle, Control-M

Duration: Jan 2008 to Feb 2010

Role: BI Developer

Description:

CompuCredit provides car loans & credit cards. The company requires different levels of analysis regarding loan amount, type of customers, type of payment schedules and interest calculations. The data warehouse is basically a Data Mart, as it covers only one domain of the business and captures data from the transactional database maintained under client/server architecture. CompuCredit is a leading provider of financial services, including credit cards, to consumers underserved by traditional financial institutions.

Responsibilities:

Involved in understanding business requirements for the design and implementation of tasks and activities.

Understood the data process flow and workflows and was involved in performance tuning.

Studied the existing architecture and developed new requirements on the ETL end.

Involved in reviewing code and uploading relevant documents to the client SharePoint.

Monitored daily/weekly jobs and fixed P1/P2/P3 severity issues.

Interacted with multiple teams during the integration and user acceptance testing phases.

Involved in creating functional and technical designs for complex flows.

Experience in Incident/Change/Problem Management using the HP Service Manager application.

Maintained scorecards based on the SLAs.

Project #8:WellCare EDW

Client: WellCare, USA

Employer: Cognizant Technology Pvt Ltd, Bangalore

Environment: Informatica 8.6, Oracle, SAP, Siebel

Duration: Sept 2006 to Dec 2007

Role: BI Developer

Description:

Whenever a customer wants to order a product, that customer must be registered in the Autodesk database. After successful registration, the customer receives a username and password. Customer details such as contact information, username and password are stored in the Siebel database. All products available to order are maintained in the SAP application.

Responsibilities:

Involved in understanding business requirements for the design and implementation of tasks and activities.

Understood the data process flow and workflows and was involved in performance tuning.

Studied the existing architecture and developed new requirements on the ETL end.

Involved in reviewing code and uploading relevant documents to the client SharePoint.

Interacted with multiple teams during the integration and user acceptance testing phases.

Experience in Incident/Change/Problem Management using the HP Service Manager application.

Maintained scorecards based on the SLAs.


