
Access Management Hands-On

Location:
Hyderabad, Telangana, India
Posted:
October 25, 2025


Narendra Pillutla

(Kafka Administrator)

Cell: +91-938*******

Email:

***.******@*****.***

Career Objective

Seeking a challenging role in a reputed organization where I can apply my technical, administrative, and management skills to the growth of the organization while enhancing my knowledge of new and emerging trends in the IT sector.

Professional Summary

Kafka Administration (Open source/Confluent)

• 10 years of IT experience as a Kafka administrator, including DevOps tools.

• Experience managing business-critical 24/7 Kafka clusters: design, architecture, implementation, and ongoing support.

• Administering the Kafka platform, including backup and mirroring of Kafka cluster brokers, broker sizing, topic sizing, hardware sizing, performance monitoring, broker security, topic security, and consumer/producer access management (ACLs).

• Experience building Kafka pipelines and working with the Kafka APIs.

• Experience with Confluent Kafka, ZooKeeper, Kafka Connect, Schema Registry, Avro schemas, and Confluent KSQL.

• High-availability cluster setup and maintenance, topic creation, and redundancy cluster setup.

• Managing large-scale, multi-node Kafka cluster environments.

• Capacity planning, cluster setup, performance tuning, and ongoing monitoring.

• Researching and recommending innovative and, where possible, automated approaches to system administration tasks.

• Experience with Kafka's mirroring tool (MirrorMaker), used to replicate a source Kafka cluster into a target (replica) Kafka cluster.

• Ensuring high availability, security, and disaster recovery for the Kafka platform.

• Defining key performance metrics and monitoring the utilization, performance, and overall health of the cluster.
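The topic-creation and ACL work described above can be sketched with Kafka's stock CLI tools. This is a minimal illustration, not a record of any specific cluster: the broker address, topic name, principal, and `admin.properties` file are all placeholders.

```shell
# Create a topic with explicit partition count, replication factor,
# and a 7-day retention override (broker address is a placeholder).
kafka-topics.sh --bootstrap-server broker1:9092 \
  --create --topic payments.transactions \
  --partitions 6 --replication-factor 3 \
  --config retention.ms=604800000

# Grant a producer principal write access to that topic
# (assumes a security-enabled cluster and an admin client config file).
kafka-acls.sh --bootstrap-server broker1:9092 \
  --command-config admin.properties \
  --add --allow-principal User:payments-producer \
  --operation Write --topic payments.transactions
```

Consumer access would be granted the same way with `--operation Read` plus a `--group` ACL for the consumer group.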

Creating Real-Time Streaming ETL with Kafka Connect

• Working experience with ETL between Kafka and RDBMS databases using the JDBC connector.

• Connecting Kafka to external systems such as databases, key-value stores, search indexes, and file systems using both source and sink connectors.

• Working experience applying Single Message Transformations (SMTs) to messages as they flow through Connect.

• Using converters such as AvroConverter, JsonConverter, StringConverter, and ByteArrayConverter to translate messages from one format to another.

• Installing and managing Connect plugins.
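The JDBC-connector, SMT, and converter items above come together in a single connector registration against the Connect REST API. A hedged sketch follows; the connector name, database URL, credentials, table, and Schema Registry address are illustrative placeholders, not details from any engagement described here.

```shell
# Register a JDBC source connector that streams new rows from an
# "orders" table into Kafka as Avro, stamping each record with an
# ingestion timestamp via the InsertField SMT. All hosts/names are
# placeholders for illustration.
curl -s -X POST http://connect:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "orders-jdbc-source",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
      "connection.url": "jdbc:mysql://db:3306/shop",
      "connection.user": "connect",
      "connection.password": "secret",
      "mode": "incrementing",
      "incrementing.column.name": "id",
      "table.whitelist": "orders",
      "topic.prefix": "mysql.",
      "value.converter": "io.confluent.connect.avro.AvroConverter",
      "value.converter.schema.registry.url": "http://schema-registry:8081",
      "transforms": "addTs",
      "transforms.addTs.type": "org.apache.kafka.connect.transforms.InsertField$Value",
      "transforms.addTs.timestamp.field": "ingested_at"
    }
  }'
```

Swapping `value.converter` to `JsonConverter` or `ByteArrayConverter` changes only the wire format; the connector and SMT configuration stay the same.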

Career Highlights:

Client: Satanda Bank, Intuitive.Cloud (Kafka Administrator / MLOps), 3-Sep-2024 to 9-May-2025

Satanda is modernizing its current enterprise data environment, moving from SQL Server to Snowflake to cater to evolving business and analytical needs. Intuitive supports continuous builds, automation of Confluent updates/enhancements, and Confluent platform evaluation and enhancement.

Client 1: Deutsche Bank, Luxoft (Kafka Administrator / DevOps tools), 13-May-2023 to 3-Jun-2024

The DWS client exposes a set of RESTful APIs to customers, partners, and third-party developers, providing a level of interoperability with DWS product information. The service transforms the DaaS NAV messages into the client-facing format, the DWS client NAV schema, according to the mapping document.

Client 2: Nokia, Capgemini India Pvt Ltd (Kafka Administrator), 15-Jan-2021 to 30-Nov-2022

Nokia supports flexible network configurations, helping lower the total cost of ownership while providing flexibility to scale capacity and coverage in line with traffic growth. These highly versatile and compact solutions are suitable for deployment in locations requiring additional capacity or coverage, and are key to delivering the highest capacity and performance.

Client 3: Discover Financial Services, Cognizant (Kafka Administrator), 13-May-2019 to 10-Jul-2020

Discover has embarked upon establishing a modern, Big Data-enabled information architecture to provide its applications with a centralized framework for managing its enterprise data assets. This includes environment support, change implementation, and service requests for Hadoop clusters.

Client 4: CPB, Virtusa Consulting Services (Kafka Administrator), 25-May-2018 to 3-Dec-2018

CPB initiated a program that aggregates client data from different sources into a central database and provides a single view of each client's relationship with the bank. CPB has multiple systems supporting different products in different regions, referred to as product processors. These product processors maintain all the financial information, such as transactions and accounts, for the clients who have a relationship with them.

Client 5: Wipro (Hadoop Administrator), 20-Dec-2016 to 30-Apr-2018

The system provides an OTC clearing solution with connectivity to multiple CCP clearing platforms. It also provides position keeping for house and client accounts, margin calculation for independent verification and more granular margin allocation, and reconciliation with the clearing house's EOD files/reports. In addition, it provides collateral management for clients' pledged collateral, prepares client statements, and handles settlement and accounting with sub-ledger functionality.

Client 6: Wipro (Linux Administrator), 6-Nov-2013 to 5-Dec-2016

AIRB is the project for the internal ratings-based module. The project exchanged data between the bank's clients and the backend systems; GXS (Global Exchange System) was used for easy communication.

Responsibilities:

• Managed and resolved incident tickets opened by clients as well as those logged by the event monitoring system.

• Wrote scripts for cron job entries, including maintenance of various log files.

• Created logical volumes and extended file systems on Linux servers.

• Provided day-to-day support to IT applications and user groups.

• User Management

• Incident management according to issue priority.

• Interacted with users on entry-level problems and their resolution.

• Provided assistance and documentation that allowed the 24/7 operations department to troubleshoot and correct problems without needing to page other employees.

Technical Skills

• Operating Systems: Windows, Red Hat Linux, Ubuntu.

• Cloud Computing: AWS (EC2, S3, IAM).

• Monitoring Tools: Cloudera Manager, Grafana.

• Database: HBase, MySQL, Amazon Relational Database Service.

• Big Data Technologies: Hadoop, YARN, HDFS, MapReduce, Zookeeper.

• Ecosystem: Sqoop, Hive, Flume, Oozie, Impala, Kerberos, TLS, Sentry.

• Languages/Tools: SQL, Linux shell scripting, Terraform.

Declaration

I hereby declare that all the above-furnished details are true to the best of my knowledge.

Place:

Date:

(Narendra.P)


