Sridhar Chandra Koritala
*******.********@*****.*** Fremont, CA 94555
Objective
A challenging position in an organization seeking architectural experience and complete software development life cycle experience in delivering product- and services-based applications with a commitment to enterprise quality.
Profile
* Architectural experience with full lifecycle development of various products and services in a distributed environment.
* Solid experience working with various technologies and frameworks involving web services in SOA using REST and Jersey; Hibernate, Spring, Spring Boot, and Guice; the OSGi frameworks Apache Felix and Eclipse; EMF, JDBC, JMS, JNDI, JNI, JSP, Servlets, and EJBs; Netty, Apache Traffic Server, Apache Tomcat, and JBoss; Maven and Ant; and Apache Atlas.
* Solid experience in distributed architecture using distributed systems such as Apache Spark, Hadoop, Hive, and ZooKeeper, storage systems such as S3 and HDFS, and SOA.
* Experienced with NoSQL technologies such as Apache Cassandra, MongoDB, Hive, and Impala.
* Extensive programming knowledge and professional experience in Java, C++, and C, including in large-scale distributed environments.
* Experienced in writing complex, performance-oriented, and scalable server applications and services on various flavors of Unix/Linux and on Windows operating systems.
* Over 14 years of experience in software development at various levels of involvement, as a team member as well as an individual contributor.
* Experience in defining and implementing frameworks, SDKs, and REST APIs for various products, services, and components, resulting in better integration with partner products at various levels.
* Experience in leading teams to the successful delivery of various products using agile and other traditional software development methodologies.
* Patent holder from the United States Patent Office (Ref: US 7,162,643 B1) and the European Patent Office (Ref: 02732069.6-2413-US0218808) for a data transfer technology developed as part of a team while working at Informatica.
Professional Experience
Staff Software Engineer
ServiceNow, Santa Clara (April 4th 2022 - current)
I work as part of the CMDB Integrations team, which ingests data from external services and, after performing ETL, publishes the received data into ServiceNow's CMDB.
Staff Open Source Software Engineer
Ahana, Mountain View (Feb 1st 2022 - March 25th 2022)
I worked as a member of the team developing Presto, an open source distributed SQL query engine originally contributed to open source by Facebook. I worked on connectors that fetch data from external services and databases; as part of this effort, I added pushdown optimizations to some of the connectors for performance improvements.
Staff Software Engineer
Cloudera, Palo Alto (May 9th 2016 - Feb 1st 2022)
* Architected and implemented the data analytics backend services for Cloudera Navigator, used for data governance and metadata search services. Proposed new features such as auto data classification and data loss prevention using Navigator.
* Implemented a scalable audits framework using other Cloudera technologies such as Kudu and Impala, and made it available as a SaaS offering.
* Implemented a cloud service framework for producing various kinds of storage reports for operational services in multi-tenant environments.
* Enhanced the Apache Atlas framework for data governance purposes and made it available for Cloudera's common metadata service offering using a microservices architecture. Also improved Apache Atlas performance for search and for data ingestion into Atlas, which uses the JanusGraph database.
* Worked as lead engineer on the Workload Management team, defining architectural enhancements to Workload Management services and implementing them as needed. These services are built using AWS technologies such as Kinesis and DynamoDB; the work involved defining and designing microservices that gather data changes from other hosted services and aggregate them in central Cloudera-hosted services for user reporting and monitoring.
* Most recently, worked with metadata objects from Cloudera's Hive HMS-related service, publishing them to Cloudera's services and generating analytics data for reporting. This also involved processing the data received by Cloudera's services and storing it in Amazon S3 as daily partitions for AWS Athena tables; the data is maintained at a daily cadence to cover historical data over time.
Sr. Software Engineer (Data Infrastructure & Architecture)
Jiff Inc, Mountain View (July 20th 2015 - May 8th 2016)
* Provided a new infrastructure for the data services team using the Apache Spark and Cassandra data frameworks.
* As a member of the Data Infrastructure and Architecture team, architected and implemented a new infrastructure that continuously receives user location information and identifies the top five points of interest for each user, using Apache Kafka and Cassandra, REST services developed with Spring Boot, and a data lake built on Apache Spark and Spark Streaming.
* Implemented user profiling services using Apache Spark.
* Worked on a feature called "Home Stream" that is functionally similar to news feed applications like Flipboard; it provides users with news related to their healthcare.
Principal Software Engineer (Data Architecture)
Lending Club, San Francisco (June 2nd 2014-July 19th 2015)
* As a member of the Data Architecture team, I worked with Product Managers and business team members, implementing various services/frameworks for integration with other business components. Work also involved defining the design/architecture and coordinating efforts with team members. I also worked on implementation of frameworks to import data from various heterogeneous sources. In doing so, the frameworks are also configured to generate hive tables and map data for generated hive tables and partitions. I have also written mapreduce applications for generating data for business reports. I also worked with the data services team to create optimized hive Queries for reporting purposes. Worked on setting up web services for other Lending Club business partners.
Principal Software Engineer
Symantec Corporation, San Francisco (May 12th '11 - June 2nd '14)
* Worked on implementing the new architecture to make the product work in cloud deployments, either in-house or as a service offered by Symantec. The work involved researching and proposing solutions using various cloud technologies, including ZooKeeper, Cassandra, Memcached, Apache Traffic Server, OpenStack Swift file systems, and Apache Felix.
* Worked as a team lead on improving the performance and scalability of the Symantec DLP product using various open source products, and on defining the standards for REST-based web services provided by the Symantec DLP product team. The work also involved researching and implementing POC ideas with various NoSQL-based services (using MongoDB).
* Worked as a lead on providing REST-based integration services between two Symantec products.
* Worked as a lead on a POC project for incorporating Hadoop- and Hive-based solutions for reporting enhancements.
* Worked as a lead on implementing an in-house reporting infrastructure project for generating reports using metrics collected from various installations of the Symantec DLP product. The reporting framework uses various open source technologies, including Spring, Hibernate, and an Eclipse-based reporting engine.
* Worked as a lead on improving the performance of various web pages of Symantec DLP products. The work involved refactoring the existing models and database schema and coming up with Ajax- and REST-based solutions.
Principal Software Engineer
Yahoo! Inc., Sunnyvale (Java, C++ and Scala) (Dec 14th '09 - May 11th '11)
* Worked as a member of the Cloud Computing Serving and Infrastructure team.
* Worked on jDISC (a Java-based data-intensive service container framework), implementing an infrastructure similar to Amazon EC2. jDISC is implemented using the OSGi implementation Apache Felix and JBoss Netty, and allows hosting of various kinds of high-performing, scalable Yahoo internal services that are based on Memcached and provide REST APIs for use by other Yahoo services and properties.
* Involved in defining the architecture of the infrastructure and owned three traffic-serving components: Cloud Router, Cloud Gateway, and BRM. At a high level, BRM manages the bindings (network traffic paths from entry point to servicing nodes), and the other two components use these paths to serve traffic by directing users to one of the nodes in a tier containing hundreds of nodes. The technologies involved included Apache ZooKeeper, Apache Traffic Server, Guice, and Yahoo's proprietary NoSQL implementation.
* Investigated an approach to automatic deployment and management of software applications on cloud computing nodes numbering in the hundreds or thousands, using Scala.
Technical Lead Engineer
SuccessFactors, Redwood City (Java) (Oct 13th '08 - Dec 13th '09)
* Worked as a lead on the Analytics Ad-hoc Report Builder, which allows users to build custom ad-hoc reports for various SuccessFactors modules. Defined the functionality of the Report Builder, and designed and implemented a generic server-side framework that fetches data in heterogeneous formats (from various SuccessFactors modules) and produces data for report generation.
* Work involved investigating the various open source report generation technologies available and making decisions on their use.
Principal Software Engineer
Informatica Corporation, Redwood City (C, C++ and Java) (Jul'99 - Oct 13th'08)
* Worked at various levels on both Informatica Tools and Server Platforms.
* Work involved defining the new architecture for the Informatica platform using technologies such as Eclipse and Hibernate. My role involved gathering requirements and defining the functionality for various frameworks, including object editors, object validations, drawing components/libraries for use in editors, views using SWT, object modeling using the Informatica modeling framework based on EMF, and object persistence using Hibernate and EMF. Worked as a team lead for some of these frameworks. While the short-term goal of this project was to use these frameworks for Informatica's new product, the long-term goal was to bring all of Informatica's products into one single IDE; currently, these products have different client applications (IDEs) for use by end users.
* Worked as a team lead on the project integrating PowerExchange, previously owned by a company called Striva (now part of Informatica), with PowerCenter. The work involved identifying the method of integration, proposing the integration architecture, defining the functionality, and leading the development efforts across teams located in multiple locations (U.S., U.K., and India).
* Worked as a team lead developing a new data transformation widget for Informatica PowerCenter that allows the user to write and compile Java code within PowerCenter, without the need for a separate Java IDE.
* Involved as a team lead in delivering a connectivity product that calls web services from PowerCenter.
* Led the effort to bring the newly established R&D team at a new off-site branch up to Informatica's R&D development standards and culture.
* Worked as a team lead, at various levels of involvement, on PowerCenter connectivity products for extracting/loading data from/to JMS, webMethods, MQ, and TIBCO.
* Involved in defining various sets of API frameworks allowing third-party vendors to integrate their products with PowerCenter using C, C++, and Java.
* Involved in defining web services exposing the PowerCenter Server's administrative and monitoring services.
* Developed prototype ODBC and JDBC drivers for integrating Informatica's PowerCenter and PowerChannel products.
* Designed the architecture of and implemented a new product called "PowerChannel" that sends large amounts of data over the Internet after compression and encryption, and was involved in the complete life cycle of this product.
* Designed and implemented the globalization of data extraction from PeopleSoft using PowerConnect for PeopleSoft.
* Designed and implemented new features for extracting data from PeopleSoft in PowerConnect for PeopleSoft.
Software Intern
CACI Federal Inc., Washington, D.C. (C and C++) May'98 - Aug'98
* Designed and developed MERBS (Multi-Echelon Readiness-Based Sparing) in VC++ and VB for forecasting budgetary requirements for inventory at various naval bases, creating DLLs in VC++ and the GUI in VB.
* Assisted team members in developing the application.
Senior Software Analyst
Data Graph Technologies, Hyderabad, India. (C and C++) Feb'96 - Dec'96
* Led a team for the full life cycle of the Point of Sales project, a multi-restaurant billing system, using VC++, Oracle, and the Win API.
* Designed and developed the Bill of Materials module of the MDI application MRP 2000 (a manufacturing resource planning system) in VC++ and Oracle.
* Used ODBC for data retrieval from the Oracle database.
Software Programmer
Data Graph Technologies, Hyderabad, India (C and C++). Jan'95 - Jul'95
* Designed and developed the front office module for FOCUS, a complete hotel management package, in C++ and VC++.
* Created an API for porting the front office module, originally written for DOS, to Windows without affecting the existing code.
Software Programmer
Bureau of Data Processing Systems Ltd., Visakhapatnam, India. Dec'93 - Dec'94
* Implemented image processing techniques in C++ for Vision Developer, an application for creating pamphlets. Used DOS interrupts for graphics.
* Designed and developed AOMS (Apartment Occupants and Maintenance System) in C++. AOMS records information about new apartments leased to the organization and maintains apartment history along with the expenditure incurred on these apartments.
* Designed and developed a Hospital Management System in C on UNIX. HMS included a patient history module and an inventory module for the hospital's medical department; ISAM was used for data retrieval.
Skills
Technologies: Hibernate, web services including REST, ODBC, JDBC, SQL, JMS, JNDI, JNI, TIBCO, webMethods, JSP and Servlets, Tomcat, Axis, JBoss, MapReduce using Hadoop, ZooKeeper.
Languages: C++/C, Java, VC++ (MFC and Win API), SQL, C#, and VB.NET.
Scripting Languages: Flex/ActionScript, Python, VBScript, JavaScript.
Tools: Visual Studio, Eclipse, Ant, Maven, Rational Rose, Visio.
Operating Systems: Windows 9x/NT 4.0, Windows 2000, AIX, HP-UX, Solaris, Linux.
Databases: Oracle, MS Access, SQL Server, MySQL, MongoDB.
Education
M.S. in Computer Science,
East Tennessee State University, Johnson City, TN, USA. Jan'97 - May'99
Diploma in Advanced Computing,
Centre for Development of Advanced Computing, Pune, India. Jul'95 - Jan'96
B.S. in Electronics and Communications,
Andhra University, Visakhapatnam, India. Aug'89 - Jun'93