Srimukh Yerneni Phone: 469-***-****
Email: *******.***@*****.***
ACCOMPLISHMENTS:
●9+ years of professional IT experience across the full Software Development Life Cycle (SDLC) using Java/J2EE technologies: requirements analysis, design, development, testing, and deployment of software applications, and maintenance of client/server applications.
●Extensive programming and development experience using Core Java and J2EE technologies including JSP, Servlets, Struts (MVC2 architecture), Hibernate, Spring (IoC, HibernateTemplate), Spring Boot, Spring Security, Spring Cloud Foundry, XML, JDBC, LDAP, Log4j, AngularJS, JavaScript, HTML5, CSS3, and the BankFusion tool.
●Project implementation skills using Core Java, J2EE technologies (JDBC, JNDI, EJB, RMI, Servlets, JSPs, JMS), and XML.
●Experienced in designing interactive web applications using AJAX, DWR, JSON and DOJO.
●Experienced in implementing application logic using Spring MVC architecture.
●Experience with Teradata load utilities such as BTEQ, MLOAD, and FASTLOAD, and with Teradata SQL optimization.
●Skilled in optimizing websites for mobile devices using CSS media queries.
●Expertise in developing applications using the Spring Framework and Spring modules such as IoC/dependency injection, AOP, and MVC, and configuring the application context for the Spring bean factory.
●Implementing Service Oriented Architecture (SOA) using Web Services (SOAP, REST, WSDL, UDDI).
●Extensive experience in implementing J2EE design patterns like Session Façade, Singleton, MVC pattern, Business Delegate, Data Access Object, Service Locator, Transfer Object.
●Experience using NoSQL databases like MongoDB, Cassandra and Redis.
●Experience in using design tools like Rational Rose, MS Visio for Object Oriented Analysis (OOA) and Object-Oriented Design (OOD) using UML.
●Familiar with various SDLC methodologies such as RUP and Agile.
●Experience in unit testing, integration testing, and performance testing, including writing UTDs and UTRs and using the JUnit testing tool.
●Experience coding Maven build scripts and configuring frameworks such as Log4j and JUnit.
●Experience working extensively on both UNIX based and Windows environments.
●Skilled in documentation and user presentation.
●Highly motivated team player with the ability to work independently and adapt quickly to new and emerging technologies.
Education:
●Bachelor's in Information Technology – JNTUH, India
●Master's in Computer Science – University of Central Missouri
SKILLS & ABILITIES:
●Java/J2EE Technologies - Java 8, Spring JDBC, Groovy, Bash scripting, DOS commands, Eclipse, Apache Tomcat, EJB 3.1, Spring MVC, JSP, Tiles, Hibernate 3.0, JNDI, JMS, MQ Series, JDBC, RMI, JXL, Java Mail, JavaBeans, SOAP UI, WSDL, RESTful web services, Maven, Gradle, Kafka, RabbitMQ, Node.js, Elasticsearch, SonarQube, SonarLint, Docker, Prometheus, Python 3.7, Spring Boot 2.0+, Spring 5.0+, Grafana
●RDBMS – PostgreSQL, H2, DB2, MS SQL Server, Oracle, MS Access, MySQL, Cassandra
●SCM Tools - Git, Subversion, CVS, Bitbucket, CodeBuild, CodeCommit, CodePipeline.
●AWS Services: AWS Lambda, API Gateway, Step Functions, SQS, SNS, SES, CloudWatch, CFN templates, AWS Connect, Greengrass, IoT
Subject matter expert in Lambda and IoT services.
PROFESSIONAL EXPERIENCE:
Amazon Web Services Nov’19 - Present
Cloud Support Engineer II - DMS SME - Lambda and IoT
This is a client-facing role that involves extensively adopting AWS cloud technologies and collaborating with cross-functional teams to get products developed and delivered on time, ensuring that customer deliverables are met and customer problem statements are addressed.
●Managed and maintained the company’s AWS infrastructure, including EC2 instances, VPCs, Elastic Load Balancing (ELB), Security Groups (SG) and CloudFormation templates.
●Designed and managed event source mappings on Lambda functions (SQS, Kafka/MSK, IoT, Kinesis, etc.) for both synchronous and asynchronous invocation types.
●Designed and implemented solutions for automating repetitive tasks using tools such as CloudFormation, deploying changes without granting developers elevated access to accounts or environments.
●Provided technical expertise in cloud architecture design principles and best practices to ensure optimal performance and reliability of systems.
●Implemented Lambda best practices, such as declaring database connections outside the handler so the connection is reused across invocations of the same execution environment rather than re-created each time.
●Educated and guided enterprise and business-level customers on using Lambda and IoT more efficiently and lowering their operational bills.
●Trained North American engineers on Lambda/IoT and best practices; troubleshot event source mapping issues to decrease iterator age on functions by increasing parallelization on the function rather than raising throughput limits on the source.
●Upgraded hundreds of function runtimes at once via CloudFormation templates.
●Implemented mTLS on APIs for security using a certificate and private key.
●Leveraged the IoT credentials provider with X.509 certificates so access and secret keys need not be saved on devices; this drastically improves security because, even if a device's credentials are exposed, the entire fleet is not compromised.
●Implemented custom authorizers where other identity provider services like Cognito were not wanted; this suits only low-traffic device management applications, because the backing Lambda is subject to 3K concurrent executions per minute.
●Deep knowledge of the MQTT pub/sub service, which can update the status of devices deployed in remote locations with limited connectivity, using OTA updates and persistent sessions with QoS 1.
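The connection-reuse best practice above can be sketched as follows. This is a minimal illustration only: an in-memory SQLite database stands in for a real RDS client such as psycopg2, and the table and event shape are hypothetical.

```python
import sqlite3  # stand-in for a real database client (e.g. psycopg2); hypothetical choice

# Created once per execution environment (cold start) and reused on warm invokes,
# because module-level code runs only when the container is initialized.
_connection = sqlite3.connect(":memory:")
_connection.execute("CREATE TABLE IF NOT EXISTS hits (id INTEGER)")

def handler(event, context):
    # The connection above is NOT re-created here on warm invocations,
    # so each invoke avoids the connection-setup cost.
    _connection.execute("INSERT INTO hits (id) VALUES (?)", (event.get("id", 0),))
    count = _connection.execute("SELECT COUNT(*) FROM hits").fetchone()[0]
    return {"statusCode": 200, "invocations_seen": count}
```

Because the in-memory table survives between calls, repeated invocations see accumulated state, mirroring how a warm Lambda container keeps its connection open.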
Technologies used: Serverless - Lambda, API Gateway, Step Functions, SNS, SQS, SES, CloudWatch, IoT and Greengrass, boto3, AWS CLI v2, Amazon Linux 2, Ubuntu and Debian, Python 3.8+, CFN, CloudTrail, DDB.
T-Mobile, Bothell, Washington July ’19 – Nov’19
Sr. Java Developer, Distributed Systems
DEEP.io (Digital Enterprise Event Processing) is a messaging framework which enables Microservices to communicate data through events. It is offered as "software as a service".
It leverages RabbitMQ messaging infrastructure for event delivery. RabbitMQ is an open-source broker that implements AMQP (Advanced Message Queuing Protocol). Kafka is a distributed streaming platform, leveraged for event audit and monitoring. A message queue uses an asynchronous protocol for communication: the publisher (or producer, or sender) of a message and the consumer (or receiver) of the message need not interact with the message queue in real time. Messages placed on the queue are stored until the consumer retrieves them. Typically, messaging systems provide checks and balances to ensure that messages do not get lost in case of system failures, along with retry mechanisms and the ability to monitor event/consumer activity.
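The queueing semantics described above can be sketched with a toy in-process broker. This is only an illustration under simplifying assumptions: real RabbitMQ handles acks and redelivery inside the broker over AMQP, and the function names here are hypothetical.

```python
import queue

# Toy broker: messages persist in the queue until a consumer retrieves them,
# and a failed consumer re-queues the message (at-least-once delivery).
broker = queue.Queue()

def publish(message):
    # The producer returns immediately; it never waits for a consumer.
    broker.put(message)

def consume(process):
    message = broker.get()
    try:
        process(message)      # consumer does its work...
        # ...and the message is implicitly "acked" (dropped) on success.
    except Exception:
        broker.put(message)   # "nack": re-queue so a retry can pick it up
        raise
```

The key property mirrored here is decoupling in time: `publish` succeeds even when no consumer is running, and a consumer failure does not lose the message.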
Responsibilities:
●Utilized Kafka and Cassandra as main data stores to create data pipelines to gather analytics from devices for various business needs
●Created an application to connect Prometheus with application instances instead of scraping metrics from actuators, enabling monitoring of the application at the individual-instance level over the RSocket protocol.
●Acted as DevOps backup, and supported during production deployments.
●Increased test coverage by 20% by adding new test cases and refactoring existing ones.
●Created REST APIs using Spring Boot, Spring MVC, Spring Data, and Spring Security.
●Utilized Docker for containerized deployment
●Deployed microservices applications to Pivotal Cloud Foundry and services to Kubernetes.
Environment: Java 8, Spring Integration, PL/SQL, Oracle 12c – SQL Developer, RabbitMQ, Spring Boot, Spring MVC, Akka, Kafka, Cassandra, Linux, Oracle, Splunk, GitHub, Jenkins, RESTful web services, JSON, JUnit and Mockito, Prometheus, Grafana
Discover Financial Services, Houston Feb’18 – July ‘19
Java / Spark Developer
EPP: Enterprise Payments Platform
The Enterprise Payments Platform initiative began under the name Common Payments Platform (CPP), which consisted of collective development teams across the Discover Network, Diners Club International, and PULSE networks. The administration of development was separated between PULSE and the Discover Global Network (DGN) as product offerings such as ProtectBuy and Digital Payments expanded on the Discover and Diners Club International networks.
Responsibilities:
●Developed enterprise-level applications for reporting in a monolithic pattern using Spring Integration.
●Used the Spring IoC container to create Spark SQL contexts at runtime.
●Used Kafka for log aggregation: gathering physical log files off servers and placing them in a central location such as HDFS for processing.
●Maintained common libraries for all dependencies used within the EPP.
●Acted as module owner for a specific repo.
●Hands-on experience writing Spark SQL queries and working with DataFrames: importing data from data sources, performing transformations and read/write operations, and saving results to an output directory in HDFS.
●Collected data using Spark Streaming from various sources (Linux) in near real time, performed the necessary transformations and aggregations to build the data model, and persisted the data in HDFS.
●Used open-source technologies throughout to reduce organizational spending on licenses.
●Formatted reports to match IBM sources.
●Used the Java API to perform various aggregation and transformation operations.
●Used the WebHDFS REST API to make HTTP GET, PUT, POST, and DELETE requests from the web server to perform analytics on the data lake.
●Used GIT for source code management and to resolve merge conflicts
●Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs.
●Responsible for writing REST-based web services for mapping institution names to processor and intercept IDs.
●Wrote Awk scripts for masking test data in lower environments to protect actual customer details.
●Used Spark for interactive queries, processing of streaming data, and integration with popular NoSQL databases for huge volumes of data.
●Wrote shell scripts to invoke PL/SQL for incentive calculation at the institution level.
●Wrote dispute-related logic defining card adjustment, card adjustment settlement, terminal adjustment, terminal adjustment settlement, institution summary, and processor summary.
●Extensive experience with the Concourse user interface to manage tasks for data loading in various environments.
●Extensive experience demoing a sprint's worth of work at the end of every sprint (system demo) and at the end of every PI.
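The WebHDFS REST calls mentioned above follow a simple URL scheme. A small sketch of how such request URLs are formed (the host, port, and paths here are hypothetical; the operation names are the standard WebHDFS v1 ops):

```python
from urllib.parse import urlencode

def webhdfs_url(host, path, op, **params):
    # WebHDFS v1 addresses HDFS paths under /webhdfs/v1 and selects
    # the operation with the "op" query parameter.
    query = urlencode({"op": op, **params})
    return f"http://{host}/webhdfs/v1{path}?{query}"

# GET reads a file, PUT creates one, DELETE removes a path:
read_url   = webhdfs_url("namenode:9870", "/lake/events.json", "OPEN")
create_url = webhdfs_url("namenode:9870", "/lake/out.csv", "CREATE", overwrite="true")
delete_url = webhdfs_url("namenode:9870", "/lake/tmp", "DELETE", recursive="true")
```

These URLs would then be issued with the matching HTTP verb (GET for OPEN, PUT for CREATE, DELETE for DELETE) by any HTTP client.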
Environment: Java 8, Spark RDD, Spring Integration, PL/SQL, Oracle 12c – SQL Developer, Spring, Kerberos, Unix, shell, Web services (REST), Gradle, Hadoop, Hive, HDFS, HPC, WebHDFS, WebHCat, Spark, Spark SQL
General Electric (GE) Oil & Gas, New Orleans, Louisiana Dec ‘16 – Feb’18
Predix Back End Developer
The Plant Operation Advisor (POA) application, built for oil and gas plants, helps process engineers by automatically notifying them of plant vulnerabilities and serves as a one-stop portal for surveillance and monitoring of a plant's underlying machinery. POA supports operations, reduces on-site risk, and identifies unplanned downtime for assets by providing early warnings and an advisory system for facility-health vulnerabilities and operating-envelope excursions, along with complete health reports, system trends, and a watchlist for each classification level (L1, L2, L3, L4). Owner of the Anomaly (low-severity threat detection), Analytics, and Cases modules.
Responsibilities:
●Participated in requirement analysis of the applications along with business owners.
●Responsible for writing microservices using Spring Boot and Spring Cloud Foundry
●Responsible for API platform design on a highly available and scalable microservices architecture
●Development of Spring Boot microservices in the Predix cloud environment.
●Developed secured, bearer-token-based Spring RESTful (JAX-RS) web services, both consumed and provided, using JSON
●Deployed and managed applications in Cloud Foundry and created database instances of PostgreSQL
●Used AWS (Amazon Web Services) S3 (Simple storage service) data storage for storing the images of cases, machinery data and accessed them using AWS SDK for Java
●Implemented security using Spring Security OAuth2
●Good experience with cloud configuration, deployment, and managing applications
●Analyzed log files using the Cloud Foundry console
●Responsible for implementing Data Migration as a batch job.
●Responsible for Time Series data ingestion and accessing asset model data at different levels of the platform
●Responsible for creating custom, general-use modules and components that extend the elements and modules of core AngularJS and Polymer JS
●Wrote JPA queries for the PostgreSQL and Apache Cassandra databases (time series and asset databases).
●Used JPA to access data from the database
●Written unit test cases, and tested using JUnit and Mockito.
●Implemented the logging mechanism using Log4j.
●Using GIT for Source Control and Version Management.
●Followed Agile methodology with Rally in the development of the project; involved in requirement gathering, analysis, and design.
●Developed several Use Case diagrams, Class Diagrams and Sequence diagrams using StarUML tool
●Responsible for finalizing the requirements and getting it implemented and delivered as per schedule
●Also involved in testing and deployment of the application during the integration and QA testing phases.
●Played key role in estimating the timelines for the newly provided requirements in Scrum Sprint meetings.
●Responsible for converting the requirements into Technical Design Document
●Responsible for development and execution in the demo system for every release.
Environment:
Java 8, Spring Boot, Spring Security, Spring Cloud Foundry, Web services (REST), AWS, JPA, PostgreSQL, Cassandra, AngularJS, Postman, JSON, GIT, StarUML, Rally
T-Mobile, Bothell, Washington Sep’15 – Dec’16
Software Developer
FTR Project:
●Participated in all phases of SDLC, involved in Agile Methodology.
●Used RAD to develop web components such as JSPs, a controller tier that includes action classes, and a business tier that includes EJBs.
●Developed web application using JSF Framework.
●Used JSF framework in developing user interfaces using JSF UI Components, Validator, Events and Listeners.
●Used jQuery and the JSF validation framework for front-end validations.
●Used Clear Case version controlling to maintain project versions.
●Extensively used the LOG4j to log regular Debug and Exception statements.
●Closely worked with Test Team to identify bugs in application.
●Performed security scans and developed solutions where necessary after reviewing the results.
●Assisted with creation of test data for the Quality Assurance team.
REOS Project:
●Involved in various SDLC phases like Requirement gathering, Design, Analysis and Code development.
●Developed the business layer using Spring, Hibernate, and DAOs.
●Worked in Agile Methodology and involved in the project discussions.
●Used Spring JDBC and DAO layers to abstract database-related code (CRUD) away from the business logic.
●Implemented Spring beans using IoC, AOP, and transaction management features to handle transactions and business logic.
●Designed and developed RESTful services using Spring MVC for AJAX calls for the UI.
●Customized DataTables and Highcharts into AngularJS directives.
●Implemented Angular Controllers to maintain each view data.
●Used Log4J to capture the log that includes runtime exceptions and debugging the application.
●Developed queries using PL/SQL to retrieve data from the database.
●Involved in bug fixing, enhancements and support.
●Used WebSphere Application Server as part of production implementation.
Environment: JDK 1.6/1.7, J2EE, JSP, Web services (SOAP), REST, JSF 2.0, MyFaces, OmniFaces, Spring, Hibernate 3.0, WebSphere Application Server 7.0, JavaScript, AngularJS, HTML, CSS, ANT, JUnit, jQuery, XML, Log4j, SOA, and PL/SQL.
Aditya Birla Group, India May ‘12– Feb ‘14
Java Developer
●Gathered requirements from business analysts, analyzed them, and converted them into technical designs.
●Played an active role in gathering system requirements from Business Analysts.
●Developed the application using Struts MVC for the web layer.
●Developed UI layer logic of the application using JSP, JavaScript, HTML/DHTML, and CSS.
●Involved in developing complex Hibernate mapping files, mapping different kinds of associations between tables.
●Developed queries using PL/SQL to retrieve data from the database.
●Developed Test plans, cases and executed them in Test and Stage environments.
●Developed GUI and Business Logic using JSP and Servlets.
●Involved in requirements gathering and converting them into specifications.
●Designed JSP pages using different Tag libraries.
●Involved in bug fixing, enhancements and support.
●Created Stored Procedures, Triggers for the application.
●Developed unit test cases using JUnit for testing functionalities/performed integration testing of application.
●Implemented client-side validations using JavaScript functions.
●Support to UAT, production environments and resolving issues with other deployment and testing groups.
●Extensively involved in Production Support and in fixing defects.
Environment: Java, Servlets 2.1, JSP 1.0, JDBC, XML, Hibernate, Oracle, HTML, JavaScript, GlassFish, NetBeans