Summary
Highly skilled Full Stack Engineer with a proven track record in the architecture, design, and development of frontend, middleware, backend, cloud, and Big Data technologies.
Extensive experience includes:
Eliciting business requirements from various stakeholders.
Designing and delivering transformative, data-intensive front-end and back-end applications.
Domain expertise in diverse industries such as insurance, finance, retail, and telecom.
Experience in UI design and development using HTML, JavaScript, JSP, JSF, and Angular.
Experience with headless architecture, Spring Boot, REST APIs, and related patterns.
Developing data extraction, ingestion, and transformation (ETL) programs for diverse source systems using technologies such as Spark, Databricks, Snowflake, Hadoop, Hive, Pig, and Java MR, and cloud services such as ADF.
Expertise in various services & technologies on Azure and AWS.
Extensive experience migrating legacy applications and data (e.g., Netezza, Greenplum, Teradata, and Hadoop) to modern cloud-based application stacks and architectures.
An excellent team player with exceptional communication and interpersonal skills.
Adapting quickly to new technologies and environments.
Managing technical teams in the US and offshore.
TECHNICAL SKILLS
Big Data: Hadoop, HDFS, Spark, Scala, Kafka, Java MR, Hive, Pig, HBase, Flume, Sqoop, ETL, Oozie, Cloudera, Hortonworks, and IBM BigInsights
OS/Cloud: MS Windows, Linux, Azure, and AWS
Engineering: Core Java, JSP, JSF, Spark, Angular, C, C++
Frameworks: Microsoft Azure, AWS, Spring Boot, Spring, Struts, MVC, and EJB
Databases: Snowflake, MongoDB, Cosmos DB, Cassandra, SQL, Oracle, Sybase, Informix, and DB2
IDEs & Utilities: Eclipse, RAD, JBuilder, NetBeans
Application Servers: Apache Tomcat, WebSphere, and WebLogic
Client Technologies: CSS, HTML, JavaScript, jQuery, AJAX, and Swing
EDUCATION & CERTIFICATIONS
Microsoft Certified Data Engineer. https://www.credly.com/badges/5d76a8dd-44a1-46fc-ad97-f02ff6e573ef
Snowflake Certification: https://www.credly.com/badges/92b9b98e-c20c-440d-b732-32e4f8cba839
Certified Hadoop Developer by Cloudera (license ID 100-014-551).
Spark Certified Developer by Databricks.
Certified Hadoop Developer by Hortonworks. http://bcert.me/sloxqbtg
Cloudera-trained in Java MR and Spark.
Advanced Java training.
1988 – 1992 Osmania University, Hyderabad, India. BS in Electrical Engineering.
1993 - 1995 Illinois Institute of Technology. Completed several graduate courses toward an MS in Computer Engineering.
DETAILED PROFESSIONAL EXPERIENCE
Citibank Lead Software Engineer/Architect 10/16/2024 – Present
PBWMT Data & Analytics Snowflake - The Personal Banking & Wealth Management Technology (PBWMT) Snowflake Data Lake aims to create a hybrid data lake that supports all integration patterns between on-prem and cloud data lake environments. Citi-managed AWS services in the public cloud and Snowflake as a SaaS offering were selected as the cloud services for PBWMT.
Designed and delivered end-to-end pipelines to ingest EAP data into final staging tables in Snowflake (a minimal sketch of the final load step appears below).
Investigated infrastructure issues, both hardware and software.
Worked with offshore resources, assisting them with both development and testing deliverables.
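For illustration, a minimal sketch of the kind of final load step such a pipeline ends with, using the Snowflake Python connector; every object name and connection parameter here (EAP_STAGE, STG_ACCOUNTS, the credentials) is an invented placeholder, not the actual Citi pipeline:

```python
# Minimal sketch: load staged files into a Snowflake staging table.
# All object names and credentials are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="...",            # supplied from a secret store in practice
    warehouse="ETL_WH",
    database="EAP_DB",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # COPY INTO moves files from an external stage into the staging table;
    # ON_ERROR = 'CONTINUE' records per-file errors instead of aborting.
    cur.execute("""
        COPY INTO STG_ACCOUNTS
        FROM @EAP_STAGE/accounts/
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        ON_ERROR = 'CONTINUE'
    """)
    print(cur.fetchall())      # per-file load results
finally:
    conn.close()
```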
7-Eleven Lead Software Engineer/Architect 04/01/2024 – 08/01/2024
Integrated 7-Eleven and Speedway backend retail systems supporting in-store systems such as POS, printers, safes, coin machines, lottery machines, payment gateways, and handheld devices.
Enhanced backend services (as part of the agile sprint plan) for inventory management, accounting, transaction receipts, logging, email receipts, etc.
Technology consists of Java, Spring Boot, MongoDB, and AWS.
Accenture Lead Software Engineer/Architect 07/06/2022 – 12/1/2023
While delivering client projects, participated in numerous Accenture internal projects, proposals, and estimates.
Navy Federal Credit Union
Designed and developed data pipelines (ETL) to migrate legacy data to modern data architecture on Azure.
Transformed the data using a layered approach consisting of Raw (Bronze), Refined (Silver), and Reporting (Gold) zones (see the sketch below).
Coding in Azure Databricks notebooks was done in PySpark using Databricks packages and commands.
Final reporting data was loaded into Cosmos DB.
Technology: Azure, Azure Data Factory (ADF), Cosmos DB, Synapse, Databricks, and PySpark
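A minimal PySpark sketch of the Bronze/Silver/Gold layering described above; the paths, columns, and cleansing rules are illustrative placeholders, not the actual Navy Federal code:

```python
# Minimal sketch of a medallion (Bronze -> Silver -> Gold) pipeline.
# All paths, table shapes, and rules below are invented placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw data as landed, read with schema-on-read and no changes.
bronze = spark.read.format("delta").load("/mnt/raw/loans")

# Silver: cleansed and conformed -- dedupe, typed columns, basic rules.
silver = (
    bronze.dropDuplicates(["loan_id"])
          .withColumn("origination_date", F.to_date("origination_date"))
          .filter(F.col("loan_amount") > 0)
)
silver.write.format("delta").mode("overwrite").save("/mnt/refined/loans")

# Gold: aggregated, reporting-ready shape (later loaded into Cosmos DB).
gold = silver.groupBy("branch_id").agg(
    F.count("loan_id").alias("loan_count"),
    F.sum("loan_amount").alias("total_amount"),
)
gold.write.format("delta").mode("overwrite").save("/mnt/reporting/loan_summary")
```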
Centene
Analyzed the legacy source data in Oracle, healthcare external flat files from third parties, and current legacy EDW Greenplum to design a migration strategy to move to Snowflake on AWS.
Created source-to-target mapping, refined data definitions, and documented business logic.
The source consisted of about 3K database objects, including tables, views, and functions.
Communicated with the client on the analysis, estimates, timelines, and resource needs.
Managed the team, including offshore resources.
Technology consisted of AWS, Snowflake, and Upsolver.
Wipro Technologies Senior Architect 08/18 – 7/1/2022
Citibank 01/01/2019 – 7/1/2022
Big Data Enterprise Analytic Platform (EAP) is a platform as a service that enables businesses to use modern machine learning and artificial intelligence techniques to solve large-scale data challenges.
The EAP footprint entails ~70 globally distributed clusters with ~25PB of storage and hundreds of applications serving all lines of business and major world regions. With the increase in deployments and the complexity of managing shared-service platforms, accurate monitoring and utilization reporting, together with infrastructure and process automation, became essential to EAP's service offering.
A Platform Utilization Metrics Analytics (PUMA) application is built to address these needs.
Collaborated with users to elicit requirements and UI flows; identified and prioritized use cases.
Designed and delivered end-to-end implementation, including automation framework.
Designed and integrated several features into the application: authentication, authorization, roles/access, service delivery ticket creation, and billing for projects and teams based on their utilization of the platform.
Numerous standard reports were created to provide information and visibility to users.
Real-time alerts were provided on the application's health and any errors.
Managed a team of about 15 offshore and US-based resources.
Technology consisted of Hadoop on-premises clusters, Angular 8, Ansible, Spring Boot, and MongoDB.
Capital One Bank Senior Architect 08/06/2018 – 12/31/2018
Capital One is working on a major initiative to move all the databases (SQL Server, Teradata) and Hadoop to Snowflake on the AWS cloud. Data is sourced from different loan servicing applications.
Frameworks were built to ingest, cleanse, and validate the data:
Managed a team that ingested large amounts of data from Kafka topics, flat files from Verizon, and historical data from Teradata and SQL Server into S3 and Snowflake.
Transformed de-normalized data from Hadoop into a star schema to create reusable entities within Snowflake. Converted Teradata BTEQ scripts into Snowflake SQL.
Created automated test scripts to validate data between Teradata and Snowflake and between Hadoop and Snowflake while migrating the modules and data (a minimal validation sketch appears below).
Responsible for team management, sprint planning, and delivery.
Technology consisted of AWS, SQL Server, Teradata, Snowflake, Kafka, etc.
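A minimal sketch of the shape such an automated parity check takes, assuming DB-API cursors for Teradata and Snowflake are already open; the table names are invented, not Capital One's:

```python
# Minimal sketch of a migration parity check: compare row counts between
# the legacy (Teradata) and target (Snowflake) copies of each table.
# Connection setup is omitted; td_cur and sf_cur are assumed DB-API cursors.

TABLES = ["loans", "payments", "servicing_events"]   # illustrative names

def row_count(cur, table: str) -> int:
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

def validate(td_cur, sf_cur):
    failures = []
    for table in TABLES:
        td, sf = row_count(td_cur, table), row_count(sf_cur, table)
        if td != sf:
            failures.append((table, td, sf))
    return failures   # empty list means counts match on both sides

# In practice a checksum query (e.g., SUM over a hashed key column) is run
# the same way, to catch content drift that equal row counts would miss.
```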
Sirius XM Senior Architect 08/17 – 5/30/2018
SiriusXM provides commercial-free music around the clock, plus sports, entertainment, talk, and news programming.
Designed and documented the architecture for connected vehicles.
Solicited requirements and use cases from end users and designed a data model.
Implemented the ingestion framework with Spark jobs, both batch and streaming; ingested the data into HDFS and a Cassandra database using Spark and Scala (see the sketch below).
Implemented the shell scripts to run the Spark jobs in development, QA, and production environments.
Implemented the data ingestion from Adobe into SiriusXM environments.
Collaborated with the testing team, enabling them to write the automated scripts for testing/validating in staging and integration environments.
Technology included AWS, Hadoop 1.6, Spark 2.1, Spark 1.6, and DataStax 5.x.
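The project itself was written in Scala; the following PySpark sketch shows the equivalent dual-sink streaming pattern. The paths, event schema, and Cassandra keyspace/table are invented, and writing to Cassandra assumes the spark-cassandra-connector package is on the classpath:

```python
# Minimal sketch: one streaming source fanned out to HDFS (batch analytics)
# and Cassandra (serving). All names and paths are illustrative placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("vehicle-ingest-sketch").getOrCreate()

# Stream of landed JSON events (e.g., connected-vehicle telemetry).
events = (
    spark.readStream.format("json")
         .schema("vin STRING, ts TIMESTAMP, event_type STRING, payload STRING")
         .load("hdfs:///landing/vehicle_events")
)

# Sink 1: append raw events to HDFS as Parquet.
hdfs_query = (
    events.writeStream.format("parquet")
          .option("path", "hdfs:///data/vehicle_events")
          .option("checkpointLocation", "hdfs:///chk/vehicle_events")
          .start()
)

# Sink 2: write each micro-batch into Cassandra via the connector.
def write_to_cassandra(batch_df, batch_id):
    (batch_df.write.format("org.apache.spark.sql.cassandra")
             .options(keyspace="telemetry", table="events")
             .mode("append")
             .save())

cassandra_query = events.writeStream.foreachBatch(write_to_cassandra).start()
spark.streams.awaitAnyTermination()
```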
Cognizant Technology Solutions Architect 07/15 - 07/2017
Hudson's Bay (Major Retailer) Architect 04/17 – 07/1
Designed and documented the Enterprise Data Lake (EDL) architecture based on the IBM Bluemix cloud.
Designed a methodology for an ingestion framework (tables with millions of rows) using Hive, Pig, HBase, and shell scripts. The framework processes large data sets of Replace, Insert, and Update data.
Delhaize Lead/Architect 04/16 – 12/31/2016
Designed, developed, and supported an Enterprise Data Lake (EDL) to enable supply chain management to access and analyze data from various corporate systems, including mainframe (VSAM, flat files) and DB2 sources.
Engaged business and technical teams to document EDL business and technical requirements.
Designed a code generation framework using UNIX shell and Python to automate the Hadoop code artifacts (Big SQL, Hive, HBase, Oozie coordinator, Oozie workflow); see the sketch after this list.
Designed and developed a custom data ingestion framework to load hundreds of tables (with millions of rows) using Hive, Pig, and shell scripts. The framework processes large data sets of Replace, Insert, and Update data.
Designed data analysis and visualization using IBM Big SQL, DSM, and IBM BigSheets.
Created Pig/Hive scripts (joins, grouping, filtering, etc.) to validate incoming data, perform error checking, and create an MD5 hash key (primary key) and table audit columns.
Used HBase to load/process large amounts of UPSERT data.
Coordinated work between onshore and offshore teams.
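A minimal Python sketch of the metadata-driven generation idea behind such a framework; the real framework emitted Big SQL, Hive, HBase, and Oozie artifacts, while this toy version renders only a Hive DDL from invented table metadata:

```python
# Minimal sketch of metadata-driven artifact generation: given a table
# definition, emit the corresponding Hive DDL. Names are illustrative.
from string import Template

HIVE_DDL = Template("""\
CREATE EXTERNAL TABLE IF NOT EXISTS ${db}.${table} (
${columns}
)
STORED AS ORC
LOCATION '/data/edl/${db}/${table}';
""")

def generate_hive_ddl(db: str, table: str, cols: dict) -> str:
    # Render one column per line from the {name: type} metadata.
    columns = ",\n".join(f"    {name} {dtype}" for name, dtype in cols.items())
    return HIVE_DDL.substitute(db=db, table=table, columns=columns)

if __name__ == "__main__":
    print(generate_hive_ddl("sales", "orders",
                            {"order_id": "BIGINT", "store_id": "INT",
                             "order_ts": "TIMESTAMP"}))
```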
BlueCross BlueShield of IL (BCBSIL) HCSC Architect/Lead Engineer 09/12 - 07/2015
Project: Inter Par Plan Pricing Application (IPPA)
The BlueCard program defines the business rules and processes for communicating with insurance plans in states where Blue Cross is absent. IPPA is the application that implements the business rules, processes, and pricing across all plans used by all 39 Blue companies.
Project: Heron
Heron is the enterprise renewal and proposal rating system for custom markets. It is used to determine monthly premiums for large groups and national accounts. The software takes various parameters, such as experience data, claims data, stop loss, and specific relevant customer data, and implements an algorithm (a series of 15 to 20 math-intensive formulas) that calculates the premium for each customer.
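Purely for illustration, a sketch of the chained-factor shape such a rating calculation takes; the factor names and numbers below are invented, not Heron's proprietary formulas:

```python
# Illustrative sketch of a chained premium calculation; the factors and
# values are invented placeholders, not Heron's actual rating formulas.
def rate_premium(base_rate: float, factors: list) -> float:
    premium = base_rate
    for name, factor in factors:
        premium *= factor          # each formula adjusts the running premium
    return premium

monthly_premium = rate_premium(
    base_rate=425.00,              # hypothetical per-member base rate
    factors=[
        ("claims_experience", 1.12),
        ("stop_loss_load",    1.04),
        ("demographic_adj",   0.97),
    ],
)
print(f"{monthly_premium:.2f}")
```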
Responsibilities:
Interacted with business users and various stakeholders to elicit requirements.
Hands-on front and back-end development.
Designed and delivered several enhancements to the user interface and back end to meet business users' requirements. Designed and developed DAOs to access the database.
Technologies include JSP, HTML, JavaScript, Struts, Java, Spring, Hibernate, IBM WebSphere, Eclipse, and SQL Server.
Multiplan Lead Software Engineer 10/11 - 08/12
Multiplan supplies independent, network-based cost management solutions in the healthcare industry. Fee Manager enables users to modify the discount percentages applied to providers.
Responsibilities:
Interacted with business users and various stakeholders to elicit requirements.
Hands-on front and back-end development.
Swing, EJBs, WebSphere, RAD, SAP, HTML, JavaScript, Ajax, and jQuery.
State Farm Insurance Lead Software Engineer 11/08 - 10/11
State Farm provides Activity Management software to agents and agent offices to manage clients. This product enables the insurance of cars, homes, and small businesses.
Responsibilities:
Gathered business requirements from users.
Designed the Spring and Struts framework classes.
Implemented DAOs and EJBs for the backend; completed Action and Bean classes for the front tier.
Java, J2EE, and IBM Rational Software Architect
Argonne National Labs Senior Software Engineer 07/06 - 10/08
eBUD is an integrated web-based budget system used by financial managers in the scientific divisions and the Office of the Chief Financial Officer at Argonne National Laboratory to prepare scientific proposals and report current year budget plans for funded proposals.
Responsibilities:
Designed and implemented entity and session beans on the server side.
Implemented JSF pages in the front end and generated Entity Beans with XDoclet.
Java, JSP, JSF, Eclipse, Sun ONE Application Server, XML, HTML, and JavaScript.
Walgreens Software Consultant 05/06 - 7/06
WCard is the Walgreens discount card introduced to save money on prescription drugs. This card is targeted at people who do not have insurance and are not receiving benefits from a publicly funded health care program.
Responsibilities:
Designed page flows with HTML, JSP, JavaScript, and style aspects.
Implemented the Request Handlers and DAOs for data persistence.
Java, J2EE, WSAD, Apache FOP (Formatting Objects Processor) and XML.
Northern Trust Bank Software Consultant 09/05 – 04/06
Designed and developed Client Letter Tracking (CLT), one of the components of an integral platform and suite of products in Investors Works Station.
Responsibilities:
Designed the framework classes and completed the implementation.
Tested the code from unit testing through UAT. Worked with the deployment team to deploy the application on the cluster.
Java, J2EE, Apache FOP (Formatting Objects Processor), WebLogic 8.1, Eclipse 3.0
Chicago Board of Trade Software Consultant 05/04 – 04/05
Project Denali is the broker-to-broker match solution for CBOT pit-traded products, designed to enhance the efficacy of open-auction trading at the Chicago Board of Trade. The project aimed to reduce the time of order acceptance and confirmation and to provide near real-time trade matching.
Responsibilities:
Interacted with the business team and analyzed requirements.
Worked in the development team implementing the message sender and listener classes.
Provided support on the trading floor, matching the trades manually and sending them to CME for clearing.
Technology: Java, WebLogic, Eclipse 2.0, Struts 1.1, JSP, Oracle, and MQSeries
TIGroup Software Consultant 12/03 – 4/04
Navigation Technologies is a leading provider of digital maps for in-vehicle satellite (GPS) navigation, Internet/Wireless applications, Enterprise/Fleet solutions, and Location Based Services (LBS).
Responsibilities:
Configured the project environment and developed the JSP and velocity pages.
Implemented the security and authentication component.
Northern Trust Bank Software Consultant 04/03 - 06/03
Northern Trust delivers superior custody, trust, investments, and banking services to personal, corporate, and institutional clients. Its FundStrategy application is a custom tool that Risk & Performance Services users use to generate and deliver boardroom-quality performance reports to clients. The tool empowers users to initiate and design performance reports at their desktops to meet their clients’ unique reporting needs.
Responsibilities:
Developed C++ applications for the performance calculations.
Worked on the stored procedures, which generated data and fed several performance reports.
Citadel Investment Group Lead Engineer 07/01 – 01/02
Citadel Investment Group manages substantial investor capital using various sophisticated investment strategies.
Responsibilities:
Retrieved real-time data from Reuters data feeds and updated the Sybase database.
Worked with TIBCO publish/subscribe messaging model.
Options Clearing Corp Software Engineer 05/00 – 06/01
OCC is a financial derivative instrument clearing organization that provides highly reliable clearance, settlement, and guarantee services.
Responsibilities:
Designed and delivered front-end applications, implementing the GUI with Java Swing.
Java Swing-based GUI, JBuilder, CORBA with C++ business objects, VisiBroker ORB, DB2, and MQSeries.
Various positions Engineer/Consultant 05/93 – 04/00
Hub Group, Lombard 09/99 – 02/00
Designed and developed in Java, JSP, HTML, and XML with IBM HTTP Server.
Lucent Technologies 05/97 - 10/01/98
Designed and developed new Java services on Sun Solaris 2.5 with Oracle 7.x, Java WorkShop, and Sablime, using STP, HP-UX, and C++.
Ameritech, Chicago Senior Software Engineer 07/95 - 09/96
Designed and developed an object-oriented distributed client-server system that replaced a legacy mainframe application and was implemented in C++ with Rogue Wave Classes.
Motorola, Inc. Software Engineer 05/93 - 07/95
At Motorola, participated in research, design, and development for the Centralized Base Station Controller (CBSC). The software was developed for Tandem computers using UNIX, C, C++, shell scripts, and the Oracle and Informix databases.