
Kafka Administrator with 10+ Years IT Experience

Location:
Plano, TX
Posted:
March 23, 2026


Resume:

Mobile: +1-980-***-****

RAKESH ANUMULA

Email: ****************@*****.***

https://www.linkedin.com/in/rakesh-reddy-anumula-4753b91b4/

SUMMARY

10+ years of IT experience, predominantly in middleware products such as Apache/Confluent Kafka, IBM DataPower Gateway, IBM MQ, and IBM API Connect, and in automating CI/CD pipelines, infrastructure, and deployments. Proficient in cloud technologies, configuration management, and containerization, ensuring scalable and resilient systems. Seeking a position that uses my current skill set and offers the opportunity to explore new technologies.

SKILLS

Technical Skills : Apache Kafka, IBM DataPower, IBM API Connect, IBM MQ

Cloud Skills : Azure, Kubernetes

Cloud Tools : ArgoCD, Git, Azure DevOps Repos, Prometheus, InfluxDB

Monitoring : Grafana, Kibana, Splunk, AppDynamics, Instana, Lpar2rrd, Chronograf

Technical Tools : Venafi, F5 LBG, ServiceNow (SNOW), SOAP UI, Postman

Others : WinSCP, MobaXterm, PuTTY, BeyondTrust, Linux

EDUCATION

Bachelor of Technology in Electronics & Communication Engineering, JNTU Hyderabad, India (graduated 2012).

WORK EXPERIENCE

USA Experience

Employer : Pyramid Consulting

Duration : September 2025 to present.

Designation : Senior System Engineer.

Employer : Tech Vision Inc.

Duration : May 2023 to September 2025.

Designation : Senior System Engineer.

Offshore Experience: India

Previous Org : HCL India Pvt. Ltd.

Duration : Feb 2020 to March 2023.

Designation : Senior Software Engineer.

Previous Org : Ensono Limited LLP (Wipro)

Duration : June 2017 to Feb 2020.

Designation : Senior System Administrator.

Previous Org : ASICS Technologies

Duration : March 2014 to June 2017.

Designation : System Administrator

CERTIFICATIONS

Microsoft Certified: Azure Fundamentals (AZ-900).

Certified Kubernetes Administrator (CKA), issued by The Linux Foundation.

Completed the Apache Kafka Series beginner course certification.

Data Streaming Engineer certification by Confluent.

Confluent Certified Administrator for Apache Kafka (CCAAK)

PROJECT DETAILS

Current Project: Bank of America

Duration : September 2025 to present.

Carry out administration, SRE, and data engineering duties for the Kafka streaming platform.

Design, implement, and support Kafka-based data streaming solutions in line with defined architecture.

Experience in creating, administering, managing, and monitoring Kafka clusters.

Build and manage integrations between Kafka and various systems using Kafka Connect. Oversee the performance and reliability of the Kafka cluster, and escalate complex issues to senior team members when necessary.

Experienced in tasks such as topic creation, connector creation, and access provisioning. Experienced in handling Incident/Change/CTask/PRB/PTask/RCA/DEP items.

Configure, monitor, and maintain Kafka clusters to ensure performance and reliability.

Utilize Confluent Kafka to build robust CI/CD data pipelines for real-time data processing and analytics.

Kafka Admin with good hands-on experience in Linux Environment.

Strong knowledge of Kafka architecture and internals, including Kafka Connect, Schema Registry, ZooKeeper, KRaft, and Confluent Control Center.

Knowledge of Kafka cluster stability, performance optimization, and memory management.

Deep hands-on experience with Kafka on Azure and AWS, Terraform, Kubernetes, and cloud-native operational safety practices.

Good hands-on software development experience with high-performance, highly available, scalable, distributed applications.

Strong experience in ksqlDB setups and filtering streaming data.

Experience with Azure, AWS, GCP, and other relevant Confluent Cloud infrastructure.

Hands-on experience with Venafi certificate management, including manual and auto renewals.

Securing Kafka using SSL/TLS, with the SASL_PLAINTEXT and SASL_SSL mechanisms.
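A minimal broker-side sketch of a SASL_SSL listener configuration (illustrative only; hostnames, ports, and keystore paths are placeholders, not from any actual environment):

```properties
# Illustrative broker settings for a SASL_SSL listener (placeholder values)
listeners=SASL_SSL://0.0.0.0:9093
advertised.listeners=SASL_SSL://broker1.example.com:9093
security.inter.broker.protocol=SASL_SSL
sasl.enabled.mechanisms=PLAIN,SCRAM-SHA-512
ssl.keystore.location=/etc/kafka/ssl/broker1.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.truststore.location=/etc/kafka/ssl/truststore.jks
ssl.truststore.password=<truststore-password>
```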

Experience in consumer/producer access management (RBAC & ACL).

Good experience with OpenShift (OCP) and Kubernetes container orchestration.

Perform high-level, day-to-day operational maintenance, support, and upgrades.

Experience on Kafka Sink/Source connectors, schema registry and Cluster Linking.

Experience in setting up cluster linking and MirrorMaker2/Data Replication.

Experience with Cluster Linking and the Kafka ZooKeeper quorum.

Experience automating topic creation and changes using Terraform/Helm charts.
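As an illustrative sketch of such automation (assuming the Confluent Terraform provider; the topic name and settings are hypothetical, and the required cluster/credential arguments are omitted for brevity):

```hcl
# Hypothetical topic definition using the confluentinc/confluent provider
resource "confluent_kafka_topic" "orders" {
  topic_name       = "orders.v1"
  partitions_count = 6
  config = {
    "cleanup.policy" = "delete"
    "retention.ms"   = "604800000" # 7 days
  }
}
```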

Kafka security, availability, failover, and monitoring experience with Confluent Kafka; built automated pipelines for self-service topic and connector deployments.

Familiar with SOPs for cluster maintenance processes; implement changes per the documented installation and validation plans.

Work on creating automated scripts to upgrade/deploy Kafka on all lanes (Terraform).

Implemented a suite of interactive dashboards using Power BI.

Experience with Terraform automation tooling, using the IaC (Infrastructure as Code) model.

Implemented sink/source connectors for MQ, JDBC, PostgreSQL, MySQL, Snowflake, etc.

Experience using Git and continuous-delivery build systems (build/test/release infrastructure, Azure DevOps).

Design and deploy Kafka source and sink connectors for SQL Server and Snowflake.
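A sketch of what such a sink connector definition can look like (connector name, topic, and connection details are placeholders; the class name is the standard Confluent JDBC sink connector):

```json
{
  "name": "orders-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "orders.v1",
    "connection.url": "jdbc:sqlserver://db.example.com:1433;databaseName=orders",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "auto.create": "true"
  }
}
```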

Hands-on experience with AWS services such as EC2, S3, Glue, Lambda, REST APIs, and Kafka.

Handle issues in SQL Server/PostgreSQL/Oracle DB/Debezium/MongoDB system connectors.

Played a major role in CVS Health Welcome Season support activities and in trainings on Splunk/AppDynamics/Grafana/JMX/Prometheus alerts, reports, and dashboard creation and monitoring as part of application support.

Used shell scripts and Autosys/JIL jobs for Kafka performance tuning, monitoring, alerting, and reporting activities.
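A small sketch of the kind of monitoring helper such scripts implement: summing consumer lag from `kafka-consumer-groups --describe` output (the sample output below is illustrative, not from a real cluster):

```python
# Sketch: sum the LAG column of a consumer-group describe listing,
# as a monitoring/alerting script might before comparing to a threshold.
SAMPLE = """\
GROUP   TOPIC      PARTITION  CURRENT-OFFSET  LOG-END-OFFSET  LAG
orders  orders-in  0          1500            1620            120
orders  orders-in  1          980             1010            30
"""

def total_lag(describe_output: str) -> int:
    """Sum the LAG column (last field of each data row)."""
    total = 0
    for line in describe_output.splitlines()[1:]:  # skip the header row
        fields = line.split()
        if fields:
            total += int(fields[-1])
    return total

if __name__ == "__main__":
    print(total_lag(SAMPLE))  # 150
```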

Assist in diagnosing and resolving issues within the streaming platform. Help maintain clear and up-to-date documentation.

Previous Project: CVS & Aetna HealthCare (USA)

Duration: May 22, 2023 to September 2025.

Kafka Administrator/Production Support/SRE:

Proven expertise in the installation, administration, production support, and maintenance of Confluent Platform components, including Apache Kafka and related tools.

Hands-on experience with event-driven architecture components such as Apache Kafka, Kafka Brokers, and Zookeeper.

Proficient in scripting with technologies like Shell and Python to automate software processes, system performance checks, and monitoring activities.

Strong ability to analyze and identify root causes, business impact, and underlying issues in complex systems.

Extensive experience in administration, L3 support, and engineering for Confluent Kafka, with a strong track record of managing large-scale, multi-region Kafka cluster environments.

Project #3: Volvo Cars and Trucks Automobile Services

Technologies Handled : Apache Kafka

Duration : Feb 2020 to Mar 2023.

Hands on experience in creating the topics and doing configuration changes.

Experience with Kafka single-node and multi-node installation on-premises on Linux-based VMs.

Excellent hands-on experience implementing Kafka Replicator from a source cluster to a destination cluster.

Experience in upgrading Confluent and Apache Kafka.

Experienced in onboarding applications in a secure, authenticated way.

Collaborate with DevOps and data engineering teams to ensure end-to-end data delivery.

Experience with Kafka topic log compaction.

Hands on experience in setting up the Kafka file systems.

Analyzing Kafka performance with disk IO statistics.

Authorizing the Kafka clients by giving proper ACLs.

Experience with Cloud Platforms Azure/AWS/GCP.

Hands on experience in using Kafka GUI tools like Conduktor and Offset Explorer.

Experience in CPU and memory verifications and reports on Azure/GCP.

Experienced in integrating automated certificate delivery with Azure Key Vault, AWS Certificate Manager (ACM), and various cloud services.

Hands on experience in resetting the topic offsets for consumer groups.

Experience in analyzing Kafka and ZooKeeper logs.

Kafka on Azure SRE Activities:

• Creating a namespace and deploying the applications in the namespace.

• Deploying and managing Confluent Kafka on Azure Kubernetes Service (AKS)/Docker.

• Creating, editing, and upgrading the YAML files of Kafka Helm charts and applying them to the current Kafka cluster in AKS.

• Experience in debugging, diagnosing and troubleshooting production issues.

• Creating and exposing Kubernetes services as LoadBalancer services for Kafka in AKS, for internal and external application access.
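A minimal sketch of such an external LoadBalancer Service (names, labels, and ports are placeholders; per-broker exposure also requires a matching advertised listener on the broker):

```yaml
# Illustrative Service exposing one Kafka broker pod externally from AKS
apiVersion: v1
kind: Service
metadata:
  name: kafka-0-external
spec:
  type: LoadBalancer
  selector:
    app: kafka
    statefulset.kubernetes.io/pod-name: kafka-0
  ports:
    - name: kafka
      port: 9094
      targetPort: 9094
```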

Hands-on experience with MQ queue manager migration from the MQ Appliance to AKS.

Creating and deploying Azure Kubernetes/Docker services in Azure portal.

Experience in implementing CI/CD pipelines using GitHub/Azure DevOps Repos and Argo CD.

Experience in horizontal pod autoscaling (and manual scaling) to support increased load.

Involved in implementing metric flows from InfluxDB/Prometheus to Grafana/Chronograf/ELK nodes.

Hands on in deploying the applications in container orchestration methodology like Kubernetes in Azure.

Experience with alert mechanisms on Chronograf/Kapacitor using TICKscripts.

Worked on generating alerts for Kafka CPU usage and RAM usage when thresholds were met.
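As a rough sketch, a Kapacitor TICKscript alert of this kind could look like the following (measurement, field name, threshold, and log path are placeholders):

```
stream
    |from()
        .measurement('cpu')
    |alert()
        .crit(lambda: "usage_system" > 90)
        .message('Kafka node CPU above 90%')
        .log('/tmp/kafka_cpu_alerts.log')
```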

Project #2: Health Plan Services (India)

Technologies : IBM DataPower, IBM API Connect.

Duration : June 2017 to Feb 2020. (Under Wipro and Ensono).

Development and administration work on IBM DataPower.

Experience with Web Services standards and coding (XML, SOAP, XSLT, XSD and WSDL).

Modified Web Service Proxy, Multi-Protocol Gateway, and XML Firewall services.

Responsible for promoting code changes created through all the environments.

Maintaining service status/Network Interface status report as per Daily activity.

APIM administration: led installation, network and monitoring setup, firmware upgrades, domain creation, secure backup and recovery, user group creation, access control, etc.

Infrastructure monitoring and observability were key responsibilities.

Experience handling ServiceNow Incident/Change/CTask/PRB/PTask/RCA/DEP items.

Good experience creating WSP, XFW, MPGW, FSH, and log targets.

Experience configuring DataPower appliances for security, traffic management, mediation, and data transformation policies.

Experience implementing IBM DataPower and APIC on Linux.

Experience managing IBM APIC gateways on DataPower using API Connect, including access control, rate limiting, and throttling.

Experience installing, maintaining, and upgrading the DataPower/API Connect products.

Managing security functions such as AAA (authentication, authorization, auditing), encryption/decryption, and URL rewrite policies.

Hands on experience in creating Application Domains, User groups, Packet Capturing.

Performing secure backup and configuration export as monthly maintenance.

Set up and maintain a centralized repository for managing the API inventory, including classification, ownership assignment, and rating mechanisms.

Design and enforce API governance frameworks and compliance standards.

Adding new Cluster servers and rebooting/restarting servers based on requirements.

Knowledge of creating APIs and mapping them to the respective revision.

Configured SSL Communication between external parties and DataPower.

Troubleshooting the errors by verifying application and system logs.

Resolving issues that may occur in regular workflows, identification and modification of existing configurations and applications according to requirements.

Project #1: North Western Mutual Finances (India)

Technologies Handled : IBM DataPower and IBM MQ (Message Queuing).

Duration : March 2014 to June 2017.

Worked as a Tier-1 administrator and handled Tier-1 troubleshooting.

Updated the PROD error-log report captured in the Splunk app.

Troubleshooting the errors by verifying application and system logs.

Hands on experience in Application domain creations, user creation and providing access.

Good experience running regression suites and updating system stability reports.

Reported TPS volumes and observations for each peak day.

Maintaining monthly secure backup and export configuration of all the domains.

Remote Administration of Queue Managers through MQ Explorer

Taking Backup of Queue Managers with Scripts

Creating Queue Managers with Linear Logs and Circular Log Modes
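A command sketch of this kind of queue manager setup (queue manager and queue names are placeholders; log sizes are illustrative):

```
# Create and start a queue manager with linear logging (-lc gives circular)
crtmqm -ll -lf 16384 -lp 3 -ls 2 QM1
strmqm QM1

# Define a local queue and an alias via runmqsc
runmqsc QM1
DEFINE QLOCAL('APP.REQUEST') MAXDEPTH(50000)
DEFINE QALIAS('APP.ALIAS') TARGET('APP.REQUEST')
END
```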

Administering MQ software-level errors and queue-manager-level errors

Applying OAM Security to MQ Objects and Queue Managers

Created and configured MQ objects such as queue managers, remote queues, local queues, and queue aliases.

Supporting applications: increasing queue depth, browsing message content, and stopping/starting application-related instances.


