
Location:
Mumbai, MH, India
Posted:
January 26, 2016


Anil Kumar

Splunk Developer/Admin

Knowledge of Splunk architecture and its components (indexer, forwarder, search head, deployment server), heavy and universal forwarders, and the license model.

Experience with Splunk Enterprise deployments; enabled continuous integration as part of configuration management.

Expertise in Actuate reporting: development, deployment, management, and performance tuning of Actuate reports.

Understanding of parsing, indexing, and searching concepts, including the hot, warm, cold, and frozen bucket lifecycle.

Maintenance of Splunk Environment with multiple Indexers.

Worked on large datasets to generate insights by using Splunk.

Set indexing property configurations, including time zone offsets and custom source type rules, and configured regex transformations on data inputs in transforms.conf, used in tandem with props.conf.
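For illustration, the props.conf/transforms.conf pairing described above might look like the following sketch (the source type name, regex, and time zone are hypothetical, not taken from the actual deployment):

```
# props.conf -- custom source type with a time zone offset and a transform
[my_app:access]
TZ = America/Chicago
TRANSFORMS-mask = mask_account

# transforms.conf -- index-time regex transformation referenced above
[mask_account]
REGEX = (\d{4})\d{8}(\d{4})
FORMAT = $1********$2
DEST_KEY = _raw
```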

Helped application teams on-board to Splunk and created dashboards, alerts, and reports.

Integration of Splunk with LDAP

Knowledge of chart types, alert settings, app creation, and user and role access permissions; created and managed apps, created users and roles, and assigned permissions to knowledge objects.

Worked as Splunk developer to setup Splunk app for a critical new launch.

Field extraction using IFX, the rex command, and regular expressions in configuration files.
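As a sketch, a rex-based extraction of the kind described might look like this (the field names and pattern are hypothetical):

```
... | rex field=_raw "user=(?<username>\w+)\s+status=(?<status>\d{3})"
    | stats count by username, status
```

The same pattern could instead live in props.conf as an EXTRACT- setting to make the extraction permanent at search time.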

Knowledge of the extract keyword, sed, and knowledge objects; familiar with search commands such as stats, chart, timechart, transaction, strptime, strftime, eval, where, xyseries, and table, including the difference between eventstats and stats.

Familiar with timechart attributes such as span and bins, tags, and event types; created dashboards and reports using XML, built dashboards from searches, and understood inline versus scheduled searches in a dashboard.
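A sketch of the commands mentioned above, with hypothetical index and field names: timechart buckets events over time, while eventstats appends an aggregate to every event instead of collapsing them into one row per group the way stats does.

```
index=web sourcetype=access_combined
| timechart span=1h count BY status

index=web
| eventstats avg(bytes) AS site_avg
| where bytes > site_avg
```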

Worked on setup of various dashboards, reports and alerts in Splunk.

Applied techniques to optimize searches for better performance, including search-time versus index-time field extraction, with an understanding of configuration files, their precedence, and how they work.

Responsible for managing data inputs, app creation, knowledge objects, and views in Splunk.

Strong data warehousing ETL experience with Pentaho 9.1/8.6.1/8.5/8.1/7.1 and Power Center client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools; proficient in Oracle 11g/10g/9i/8i/7.3 and PL/SQL back-end application development using Toad, SQL*Plus, and PL/SQL Developer.

Experience with Splunk technical implementation, planning, customization, integration with big data and statistical and analytical modeling.

Well-versed with different stages of Software Development Life Cycle (SDLC).

TECHNICAL SKILLS:

Splunk Modules

Splunk 6.1.3, Splunk on Splunk, Splunk Enterprise, Splunk DB Connect, Splunk Cloud, Splunk Web Framework

Languages

SQL, PL/SQL, Perl, Unix shell scripts (Korn shell), JSP, C, C++, Java, J2EE, CSS, HTML, XML.

RDBMS

Oracle 11g/10g/9i/8i, MS SQL Server 2000/2005/2008, Sybase, DB2, MS Access.

Tools

APEX 4.2/3.2, Pentaho 4.2/4.5/5.0/5.2/5.3, Power Center, Oracle Forms 10g/9i/6i and Reports 10g/9i/6i, Erwin, Visio.

Data Modeling

Dimensional Data Modeling (Star Schema, Snow-Flake, FACT-Dimensions), Conceptual, Physical, and Logical Data Modeling, ER Models, OLAP, OLTP concepts.

Operating Systems

Red Hat Enterprise Linux 7.x/6.x, Sun Solaris 9/10, ES 3/4, HP-UX 11.11/11.23, AIX 5.3/6.1, Windows 2010/2008, OS X 10.6/10.7/10.8/10.9

WORK EXPERIENCE:

Bank of America, Austin, TX Aug 2014 to Current

Splunk Developer/Admin

Responsibilities:

Worked with Splunk architecture and its components (indexer, forwarder, search head, deployment server), heavy and universal forwarders, and the license model.

Created a Splunk app for Enterprise Security to identify and address emerging security threats through continuous monitoring, alerting, and analytics.

Developed an intuitive command-line tool that allowed QA and developers on the Product Team to efficiently interact with our RESTful API as a variety of different user types to test new changes merged to the staging environment and to inspect/share relevant responses/exceptions, using Python and Redis.

Created pytest helper functions and refactored significant portions of unit and integration test code to provide a consistent pattern that new tests could follow.
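A hypothetical sketch of the pytest helper pattern described above: a small factory that hands tests an API client configured for a given user type, so every test follows the same setup path. All names here (UserClient, as_user) are invented for illustration, not the project's real API.

```python
class UserClient:
    """Minimal stand-in for an API client parameterized by user role."""

    def __init__(self, role):
        self.role = role
        self.headers = {"X-Role": role}  # a real client would attach auth here

    def get(self, path):
        # A real client would issue an HTTP request to the staging API;
        # this stub just returns enough structure for tests to assert on.
        return {"path": path, "role": self.role, "status": 200}


def as_user(role):
    """Helper used by tests to obtain a client for a given user type."""
    return UserClient(role)


def test_qa_user_can_read_reports():
    resp = as_user("qa").get("/reports")
    assert resp["status"] == 200
    assert resp["role"] == "qa"
```

The point of the pattern is that every new test starts from `as_user(...)` rather than hand-rolling its own client setup.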

Added new features to the Django/Tastypie API of the core product.

Analyzed and tuned JVM heap settings; worked with Apache web servers fronting standalone managed servers and clusters of managed servers.

Developed UNIX shell scripts and auto deployment process. Maintenance of DB objects to maintain the database growth.

Designing and developing dynamic web pages using XHTML, CSS and JavaScript.

Coded extensively in jQuery to make rich Internet web pages and created custom date picker on the website.

Developed mockups and prototypes using HTML and CSS.

Designed the website and kept its content up to date.

Improved documentation for setting up backend developer environment and created a setup script to automate as much of the process as possible.

Created dashboards, reports, scheduled searches, and alerts.

Performed installation, new database design, configuration, backup, recovery, security, upgrades, schema changes, tuning, and data integrity work.

Provided regular support and guidance to Splunk project teams on complex solutions and issue resolution.

Helped application teams on-board to Splunk and created dashboards, alerts, and reports.

Involved in standardizing Splunk forwarder deployment, configuration and maintenance across UNIX and Windows platforms.

Assisted with sizing, query optimization, buffer tuning, backup and recovery, installations, upgrades and security including other administration functions as part of profiling plan.

Extensive experience setting up Splunk to monitor customer volume and track customer activity.

Ability to carry out security tasks at network level such as TCP/IP ports through firewall.

Integrated ServiceNow with Splunk to generate incidents from Splunk.

Worked on Splunk DB Connect configuration for Oracle, MySQL and MSSQL

Configured Splunk for different web applications and batch jobs; created saved searches, summary searches, and summary indexes.

Created many of the proof-of-concept dashboards for IT operations and service owners, used to monitor application and server health.

Prepared, arranged and tested Splunk search strings and operational strings.

Expertise in Actuate reporting: development, deployment, management, and performance tuning of Actuate reports.

Created Splunk dashboards for various types of business users in the organization.

Understanding of parsing, indexing, and searching concepts, including the hot, warm, cold, and frozen bucket lifecycle.

Field extraction using IFX, the rex command, and regular expressions in configuration files.

Applied techniques to optimize searches for better performance, including search-time versus index-time field extraction, with an understanding of configuration files, their precedence, and how they work.

Created Splunk dashboards from searches and scheduled searches; understood inline versus scheduled searches in a dashboard.

Helped the UNIX and Splunk administrators deploy Splunk across UNIX and Windows environments.

Troubleshot searches for performance issues; integrated Splunk with LDAP.

Knowledge of chart types, alert settings, app creation, and user and role access permissions; created and managed apps, created users and roles, and assigned permissions to knowledge objects.

Environment: Splunk 6.1.3, Tomcat 7.x, F5 Load Balancers, Wily Introscope 6.0, Python scripting, Apache HTTP Server 2.4, JVM tuning, Red Hat Linux 6.x, LDAP, Splunk UI, JDBC, JDK 1.7, J2EE, XML, Oracle 11g, MS SQL Server 2012, SQL, Solaris 10, SVN, CVS.

Direct TV, El Segundo, CA Aug 2013 - Jul 2014

Splunk Power User

Responsibilities:

Expertise with Splunk UI/GUI development and operations roles.

Prepared, arranged and tested Splunk search strings and operational strings.

Played a major role in understanding the logs, server data and brought an insight of the data for the users.

Optimized Splunk for peak performance by splitting Splunk indexing and search activities across different machines.

Set up alerts for different types of errors and helped the client configure them.

Analyzed security-based events, risks, and reporting instances.

Developed, evaluated and documented specific metrics for management purpose.

Created visualizations using SPL to get value out of the data.

Installed and configured Splunk Enterprise, agents, and Apache Server for user and role authentication and SSO.

Provided input on best-fit architectural solutions and deployment for the Splunk project.

Deployed, configured, and maintained Splunk forwarders on different platforms.

Created Dashboards for various types of business users in organization.

Provided technical services to projects, user requests and data queries.

Involved in assisting offshore members to understand the use case of business.

Assisted internal users of Splunk in designing and maintaining production-quality dashboards.

Wrote complex IFX, rex, and multikv commands to extract fields from log files.

Set up dashboards for senior management and production support teams required to use Splunk.

Helped the UNIX and Splunk administrators deploy Splunk across UNIX and Windows environments.

Worked with administrators to ensure Splunk is actively and accurately running and monitoring on the current infrastructure implementation.

Involved in installing and using the Splunk App for Unix and Linux.

Environment: Splunk, Linux, Windows Server 2012/2008, Splunk Enterprise Security, ESX, applications development, big data analysis, operations analysis.

Agilent Technologies, Wilmington, DE Nov 2012 - Jul 2013

Splunk Engineer and Developer

Responsibilities:

Expertise with Splunk UI/GUI development and operations roles.

Prepared, arranged and tested Splunk search strings and operational strings.

Helped the client set up alerts for different types of errors.

Set up trusted proxies for single sign-on authentication.

Configuration management and event triggering using ServiceNow.

Played a major role in understanding the logs and server data and provided insight into the data for users.

Involved in setting up alerts for different types of errors.

Assisted internal users of Splunk in designing and maintaining production-quality dashboard.

Wrote complex IFX, rex, and multikv commands to extract fields from log files.

Helped the UNIX and Splunk administrators deploy Splunk across UNIX and Windows environments.

Worked with administrators to ensure Splunk is actively and accurately running and monitoring on the current infrastructure implementation.

Environment: Splunk 5.0, Pivotal HD, Datameer, Linux, Bash, Perl, HBase, Hive, Pig, HAWQ, sed, rex, erex, Splunk knowledge objects.

T-Mobile USA, Bellevue, WA Sept 2011 - Oct 2012

Pentaho ETL Developer

Description: T-Mobile USA is a national provider of wireless voice, messaging, and data services capable of reaching over 293 million Americans where they live, work, and play. This project focused exclusively on improving the IT ecosystem within the prepaid business, which includes the subscriber types Prepaid, Flex Pay, and Wal-Mart Family Mobile. Prepaid subscribers are in scope, as are subscribers that are part of a hybrid account in Samson; the in/out-of-scope account type/sub-type distinction refers to Postpaid (Samson) accounts only.

Responsibilities:

Involved in gathering and analyzing the requirements and preparing business rules.

Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic.

Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager.

Developed and maintained ETL (Extract, Transform, Load) mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files and load it into Oracle.

Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.

Extracted data from different databases like Oracle and external source systems like flat files using ETL tool.

Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.

Developed Mapplets, Reusable Transformations, Source and Target definitions, mappings using Informatica 9.1.0.

Generated SQL queries to check the consistency of data in the tables and to update the tables per business requirements.
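As an illustrative sketch only (not the original queries): a row-count consistency check between a staging table and its warehouse target, of the kind used to validate an ETL load. The table names are hypothetical, and SQLite stands in for Oracle here.

```python
import sqlite3

# Build a throwaway database with a hypothetical source/target table pair.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

def counts_match(cur, src, tgt):
    """Return True when source and target row counts agree."""
    src_n = cur.execute("SELECT COUNT(*) FROM " + src).fetchone()[0]
    tgt_n = cur.execute("SELECT COUNT(*) FROM " + tgt).fetchone()[0]
    return src_n == tgt_n

print(counts_match(cur, "src_orders", "tgt_orders"))
```

In practice the same check is often extended to column-level sums or checksums, not just counts.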

Involved in Performance Tuning of mappings in Informatica.

Good understanding of source to target data mapping and Business rules associated with the ETL processes.

Environment: Informatica 9.1, Oracle 11g, SQL Server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL*Loader, Tidal Enterprise Scheduler 5.3.1, Unix, flat files.

First Republic Bank, San Francisco, CA Feb 2010 - Aug 2011

Pentaho ETL Developer

Description: The Enterprise Data Warehouse (EDW) provides services to Regency's business and IT clients. EDW acquires, prepares, and delivers reporting data for business decision making. This operational data supports trending analysis and near real-time reporting (transactional data). Services include data source analysis, modeling, quality analysis, and structure design to enable reporting, analysis, mining, and discovery of data for an integrated view of Regency's business and customers.

Responsibilities:

Extensively worked with Business Users in gathering requirements and actively cataloging and supporting various issues and providing their solution.

Responsible for coding SSIS processes to import data into the Data Warehouse from Excel Spreadsheet, Flat Files and OLEDB Sources.

Used a range of Pentaho transformation steps, including Row Normaliser, Row Denormaliser, Database Lookup, Database Join, Calculator, Add Sequence, and Add Constants, along with various input and output steps for data sources including tables, Access, text files, Excel, and CSV files.

Participated in the design of staging databases and the data warehouse/data mart using star schema/snowflake schema data modeling.

Troubleshoot BI tool problems and provide technical support as needed. Perform other tasks as assigned.

Worked very closely with Project Manager to understand the requirement of reporting solutions to be built.

Gathered business requirement by understanding business Processes and needs.

Installed and Configured Pentaho BI Suite 4.2 & 4.4 along with Enterprise Repository in Pentaho BI server.

Used the Pentaho import/export utility to migrate Pentaho transformations and jobs from one environment to another.

Used different types of input and output steps for various data sources including Tables, Access, Text File, Excel and CSV files.

Configured Pentaho BI Server for report deployment by creating database connections in Pentaho enterprise console for central usage by the reports deployed in the repository.

Implemented logic with a database lookup table to maintain parent-child relationships and hierarchy.

Used Pentaho Design Studio for creating custom parameters as well as generating report.

Used Pentaho Report designer to create various reports having drill down functionality by creating Groups in the reports and drill through functionality by creating sub-reports within the main reports.

Automated file transfer processes and mail notifications using the FTP and Send Mail steps in transformations.

Applied configuration, logging, and error reporting to all transformations and jobs to ease deployment and run-time troubleshooting.

Used Pentaho Enterprise Console (PEC) to monitor the ETL Jobs/Transformation on Production Database.

Resolved connectivity issues across servers by using the kettle.properties file and setting up variables for each DB connection.
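A sketch of the kettle.properties approach (the variable names and hosts are hypothetical):

```
# kettle.properties -- one variable set per DB connection, so the same
# transformation runs unchanged on dev, test, and production servers
EDW_DB_HOST=dw-prod.example.com
EDW_DB_PORT=1521
EDW_DB_NAME=EDW
STAGE_DB_HOST=stage-db.example.com
STAGE_DB_PORT=1521
```

In the Pentaho connection dialog the corresponding fields are then set to ${EDW_DB_HOST}, ${EDW_DB_PORT}, and so on, so moving between environments only requires editing kettle.properties.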

Performed Data cleansing by creating tables to eliminate the dirty data using SSIS.

Involved in performing incremental loads while transferring data from OLTP to the data warehouse using different data flow and control flow tasks in SSIS.

Responsible for creating database objects such as tables, views, stored procedures, triggers, and functions using T-SQL to structure stored data and maintain the database efficiently.

Extensively used joins and subqueries to simplify complex queries involving multiple tables.

Smart Technologies, Hyderabad, India Mar 2009-Jan 2010

Description: Smart Technologies provides software development, telecom, and IT-enabled services to a wide range of industries: telecommunications, government, utilities, retail, banking, finance, and insurance.

Responsibilities:

System Administrator on LabTech projects

Project lead for RMM implementation of LabTech to manage all client devices and streamline remote support and monitoring.

Managed LabTech for our clients with over 1000 agents deployed.

Worked with clients on IT roadmaps and planning for the technology spending and growth.

Managed the day-to-day IT work for all my clients, including server, network, workstation, and mobile device support.

Automated installations of workstations, using scripting, group policy, deployment packages and documentation.

Assisted in the planning and execution of ConnectWise.

Schedule and manage tickets on the service board.

Project-manage new and existing projects with clear, measurable goals.

Proactively research and maintain knowledge of IT solution providers and related industries.

Responsible for the oversight and support of the organization's infrastructure systems, such as file and print services, email, network OS, and applications.

Perform maintenance and support of the availability and functionality of these systems.

Maintenance (including OS patching and upgrades), implementation rollouts of new systems, and L1/L2/L3 break-fix.

Resolve inbound tickets (Level 1 and 2 helpdesk as needed) and ensure SLAs are maintained.

Review and monitor our service-ticket boards and ensure routine network maintenance occurs.

Manage all servers/desktops to keep them up to date with Microsoft and third-party patches, virus definitions, and malware protection, using our Remote Monitoring and Management (RMM) software.

Assist with consistent monitoring of backup devices and manage/escalate failures as needed.

Installed and configured Windows Server 2003/2008/2012, VMware vSphere, Hyper-V, Exchange Server 2010/2013, and Linux operating systems.

Perform day-to-day Macintosh support activities and processes to deliver enterprise-wide technical support for Macintosh systems and applications.

Installed and Configured applications like Veritas NetBackup / Veeam Backup and Replication.

Experience as a system administrator on various Linux, Windows Server 2003/2008/2012, and Mac OS X Server (Snow Leopard/Lion/Mavericks) systems.

Record my own work and maintain, update, and create technical support and end-user support documentation.

Escalate technical issues outside my skill set to other technical team members.

Environment: MS SQL Server 2005/2008, LabTech RMM tool, ConnectWise.

EDUCATIONAL DETAILS:

Bachelor's in Computer Science


