
Data Engineer Node Js

Location:
Jacksonville, FL
Posted:
June 30, 2025


Resume:

Pushpam Inigo Peter

682-***-****

************@*****.***

https://www.linkedin.com/in/pushpaminigo-peter-7266b016/

SUMMARY

17+ years of experience in requirements gathering, analysis, design, development, testing, and maintenance of Health Care, Ecommerce, Financial, Kiosk, Cargo, Airlines, and Telecom services.

Expertise in server-side Java computing based on Java/J2EE, Spark, Scala, Python, Google Cloud Platform, Hadoop, Spring Boot, Kore.ai, AI/LLM, microservice architecture, AngularJS, WebSphere Application Server (WAS), Struts, Spring MVC, NiFi, Node.js, and the Flex framework. Proficient in Hive tables, Impala, MongoDB, PostgreSQL, SQL Server, Parquet, JSON, AWS S3, AWS EMR, vector databases, AI, and Oracle.

Experience in developing and deploying enterprise applications using WebLogic, IBM WebSphere, WebSphere Liberty, Tomcat, and Eclipse IDE (Scala), plus a Customer Data Platform (CDP) using Segment and Looker dashboards.

Experienced in build management and CI/CD pipelines using Git, JIRA, Jenkins, and OpenShift.

Proficient in tools such as WSDL, Eclipse IDE, SOAP UI, Topcased, TFS, SVN, and ClearCase.

Expertise in tuning Spark Scala/Java programs for memory management, the caching framework, MapReduce, Spark Streaming, etc.

Knowledge of Hibernate, Kafka Streaming, the Akka framework, Snowflake, AWS, and JMS messaging.

TECHNICAL SKILLS

Architecture/Frameworks

Spark 3.x/2.x/1.x, J2EE, MVC, Struts, MVP, Spring, Flex, Adobe

Languages

Java, Scala, Python, XML, SQL, Fortran, COBOL

Application Server

Liberty, WebLogic, Spring Boot, WebSphere 6, Tomcat 4.x/5.x/6.x

Script Languages

Node.js, XML, HTML, JavaScript, CSS, JSON, Ajax, AngularJS

Server Programming

Node.js, J2EE, Java, JSP, JDBC, Servlets, EJB, Web Services, IBM WebSphere

Design Methodologies

Design Patterns, UML

GUI /Visualization

Java, JSP, Swing, Applets, HTML, GWT

Tools/Technologies

Eclipse IDE, WebSphere Studio Application Developer (WSAD), Rational Application Developer (RAD), Jenkins, Git, Ant, ClearCase, Maven, SVN, SoapUI, Impala, NiFi, and Looker dashboards

Databases

SQL Server, Oracle, GCP, MongoDB, PostgreSQL, Hive, HBase, AWS S3, AWS EMR, and IBM Sailfish

Version Control Tools

TFS, ClearCase, Jenkins, SVN, Visual Studio 2010, WinSCP, PuTTY, OpenShift, Toad 12.6, JIRA

Frameworks

MVC, Struts, MVP and Spring

Operating Systems

Windows 2000/2008, Windows NT 4.0

EDUCATION

Master of Computer Applications (MCA), University of Madras, Loyola College, Chennai, Tamil Nadu, India - 2007

Bachelor of Mathematics, St. Xavier's College, Palayamkottai, Tamil Nadu, India - 2004

Diploma in Computer Applications, St. Xavier's College, Palayamkottai, Tamil Nadu, India - 2002

OCJP: Oracle Certified Java Programmer

PROFESSIONAL EXPERIENCE

Employer: Synergy Technologies LLC

Client: Zelle [Early Warning banking] Jan 2025 to present

Project: Enterprise Data Infrastructure [EDS]

Role: Sr Software Engineer

Location: Scottsdale, Arizona

Working with Zelle, a digital payment network run by a private financial services company owned by multiple banks; the system handles petabytes of data, and transactions flow through a secured gateway.

Developing a Kore.ai bot for users, integrated with IVR and an AngularJS front end; verifying users via POST requests (REST API calls) to Spring Boot services and saving the JSON messages in Aerospike/RDBMS.

Reading from multiple Kafka topics and saving the JSON messages in HBase.

Applied NumPy for feature scaling, normalization, and dimensionality reduction in machine learning pipelines, improving model convergence speed.
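The feature-scaling step above can be illustrated with a minimal, dependency-free sketch; the production pipeline used NumPy vectorized operations, but the arithmetic of min-max normalization is the same:

```python
# Minimal sketch of min-max feature scaling (illustrative only; the actual
# pipeline described above used NumPy arrays and vectorized operations).

def min_max_scale(values):
    """Scale a list of numbers into the [0, 1] range."""
    lo, hi = min(values), max(values)
    if hi == lo:  # constant feature: map everything to 0.0
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_scale([10.0, 20.0, 30.0]))  # prints [0.0, 0.5, 1.0]
```

With NumPy the same result comes from `(a - a.min()) / (a.max() - a.min())` over the whole array at once, which is what makes convergence-speed gains possible at scale.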

Optimized memory usage of large datasets by applying efficient Pandas data types and chunking strategies, improving runtime performance.
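The chunking idea above (as in pandas' `read_csv(chunksize=...)`) can be sketched with only the standard library; the function name and tiny chunk size here are illustrative, not from the original project:

```python
import csv
import io

# Sketch of a chunked read: aggregate a large CSV without holding all rows
# in memory at once. chunk_size=2 is tiny purely for demonstration.
def sum_column_chunked(fileobj, column, chunk_size=2):
    reader = csv.DictReader(fileobj)
    total, chunk = 0.0, []
    for row in reader:
        chunk.append(row)
        if len(chunk) == chunk_size:
            total += sum(float(r[column]) for r in chunk)
            chunk = []  # release the processed chunk
    total += sum(float(r[column]) for r in chunk)  # leftover partial chunk
    return total

data = io.StringIO("amount\n1.5\n2.5\n4.0\n")
print(sum_column_chunked(data, "amount"))  # prints 8.0
```

Pandas applies the same pattern via an iterator of DataFrames, optionally combined with narrower dtypes (e.g. `float32`, `category`) to cut per-chunk memory further.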

Built Spark jobs using the Java 11 Stream API to save multi-record messages in HBase.

Built Spark jobs in Scala 2.11 that trace data lineage and flag errors via column-level validation in HBase, Aerospike, RDBMS, etc.; error logs are saved to Hive and SQL depending on the threshold limit.
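The column-level validation with a threshold described above can be sketched in plain Python; all names here are hypothetical stand-ins for the real Spark/Scala job:

```python
# Hypothetical sketch of column-level validation with an error threshold,
# mirroring the lineage/error-logging flow described above (illustrative
# names; the real job ran in Spark Scala against HBase/Aerospike/RDBMS).
def validate_rows(rows, required_columns, error_threshold):
    """Split rows into (good, errors) and flag whether errors exceed the threshold."""
    good, errors = [], []
    for row in rows:
        missing = [c for c in required_columns if row.get(c) in (None, "")]
        (errors if missing else good).append(row)
    exceeded = len(errors) > error_threshold  # routes the error log target
    return good, errors, exceeded
```

In the described pipeline, `exceeded` would decide whether the error log lands in Hive or SQL; here it is just a boolean.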

Client: Florida Blue – GuideWell Feb 2022 to Dec 2024

Project: Enterprise Integration Team – EIS [Multiple Projects]

Role: Senior Software Engineer [Java/Scala/Python]

Location: Jacksonville, FL

Responsibilities: Florida Blue/GuideWell provides coverage for hospital services, and Blue Shield covers physicians' services. The insurer offers some form of health insurance coverage in every US state.

Worked on migration tasks for multiple project services written in Java Spring Boot [OpenShift version 4].

Worked on the front end; for multiple provider-submission tasks used AngularJS and React.

Developed the Individuation project, an initiative enabling identity management life-cycle support for existing Florida Blue constituents (shoppers, members, agents) and providers through a redesigned party repository, a comprehensive set of services written using state-of-the-art technologies, an exception-process utility, and a synchronization service. For members and agents I developed services covering name, email, telephone, address, and SSN roles, including upsert party, split, and merge, using J2EE, Spring Boot, microservices, Redis, and a caching framework and implementation.

Delivered multiple ETL jobs for Fast Access (MongoDB/PostgreSQL/Parquet files), including an HDP-to-CDP migration and a Spark Streaming process reading messages from Kafka. The projects (Agent, BA, Group, Provider, Rewards, care gaps, inter plan, coding gaps, claims, etc.) draw on different collection sources depending on the requirement; transformations join and filter tables per the business requirement, sourcing from Hive/HBase/MongoDB/DB2/IBM Sailfish tables, with the final action loading or saving into MongoDB/PostgreSQL. Some projects read from AWS S3 locations. All of these were written in Spark Java/Scala/Python, developed in the Scala IDE, built with Gradle/SBT, and deployed to a Hadoop cluster (Hortonworks Cloud Data Platform, AWS EMR, AWS S3); we used the Jenkins CI/CD pipeline when moving to prod.

Utilized NumPy for high-performance matrix operations and numerical simulations, reducing computation time by 50% compared to native Python.

Integrated NumPy arrays with Pandas and Scikit-learn to streamline preprocessing for large-scale predictive modeling projects.

Developed a dashboard application using Spark 3 (Java/Scala/Python) ETL for the provider service, loading the final table into PostgreSQL; the transformation joins multiple Sailfish tables. The complexity of this project lay in shaping the final data for graph charts in the AngularJS UI (circle charts, pie charts, etc.).
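The shaping step above (joined rows aggregated into chart-ready fractions) can be sketched in plain Python; the real job did this with Spark DataFrame joins before loading PostgreSQL, and these field names are illustrative:

```python
from collections import defaultdict

# Illustrative sketch: turn joined provider rows into pie-chart slices
# (label -> fraction of total). Field names are hypothetical examples.
def pie_slices(rows, label_key, value_key):
    totals = defaultdict(float)
    for row in rows:
        totals[row[label_key]] += row[value_key]
    grand = sum(totals.values())
    return {label: value / grand for label, value in totals.items()}

rows = [{"region": "N", "claims": 3.0}, {"region": "S", "claims": 1.0}]
print(pie_slices(rows, "region", "claims"))  # prints {'N': 0.75, 'S': 0.25}
```

The UI layer then only has to map each fraction to an arc or wedge, which keeps the chart logic out of the ETL.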

Enjoy working with the latest design developments in AI and Databricks; we use the Spark framework to connect to vector DBs and Iceberg.

Developed the care gaps project, which concerns medical members with health issues: the provider sends a reminder note to a member about a pending health checkup (for example, a diabetic member needs a blood test three times a month). Developed this project in Node.js; agents send the customer details, and the care-gap details request and response are exchanged as JSON structures.

Responsible for guiding and mentoring team members when they face technical issues related to Spark Scala, Java/J2EE, JUnit, jars, etc. Worked on production setup (Git, Jenkins, IBM WebSphere, OpenShift) and deployment to prod, applying Spark knowledge in data processing and parallel processing.

Wrote shell scripts for all the projects to execute on Linux and Unix boxes.

Environment: Spark 3/2.11 Scala/Java/Python, Spark/Kafka Streaming, Cloudera platform, Hadoop, Hive tables, HBase, MongoDB, Java/J2EE, Spring Boot, Servlet, IBM Sailfish, AWS EMR, AWS S3, PostgreSQL, PL/SQL stored procedures, IBM WebSphere, Parquet format, Kafka, REST, WebLogic, PL/SQL, Windows XP, Maven, Shell Script, Impala, JSON, Unix, Kafka Streaming, Toad 12.6, Jenkins 2/1, Git, WinSCP, Gradle build, SBT, PuTTY, Unix Box, Linux Box, OpenShift, Agile methodology.

Client: Master Card Aug 2021 to Feb 2022

Project: Master Access

Role: Senior Software Engineer (Spark Scala Hadoop & Java)

Location: Jacksonville, FL

Responsibilities: Worked on Mastercard fraud detection and the shop-local system.

Working on the LTV process, a complex module: a re-engineering project eliminating the Netezza backup to an HDFS location. All queries are developed in SQL and, depending on the module, in Spark Scala/Java; the whole process flows through NiFi (Livy process, SQL triggering, etc.).

Delivered a Spark-with-Java framework for the shop-local system, reading a TSV file and moving it downstream into a pipeline as a Parquet file using the NiFi/Livy process.

Worked on Spark Scala, Java, NiFi, Hive tables, and batch programs.

Worked on projects written in SQL, Scala, Java, Hadoop, Netezza, and Oracle.

Environment: Spark 3 Scala/Java, Spark with Java coding, Hadoop, NiFi, Hive tables, Java/J2EE, PL/SQL stored procedures, PL/SQL, MacBook, Maven, Shell Script, JSON, Unix, Bitbucket, Terminal, Unix Box, Linux Box, Agile methodology, Splunk alerts.

Client: Walmart ECommerce Jan 2021 to Jul 2021

Project: Customer Backbone

Role: Senior Software Engineer

Location: Jacksonville, FL

Responsibilities: Worked on customer shopping data. Customers have a lot of personal information collected through store and online transactions; we are transparent with that transaction information, and customers can delete it, auto-purge it, or access it.

Worked on the customer privacy access project (NiFi); migrated the existing HDFS location to Google Cloud Platform, and used Segment tools for the customer data platform, joining multiple data sources.

Delivered a Spark Scala project for the opt/access/delete flows: a Kafka streaming job reads from multiple topics and multiple offsets, dynamically stores the offset values in the Kafka broker and commits them using the group ID, then filters the JSON and saves the files to a GCP location. Once all processing is done, a success message is sent to an API via a PUT request, with an audit logger included.
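The per-topic, per-partition offset bookkeeping described above can be sketched without a real Kafka client; this class and its names are illustrative only, mirroring the commit-by-group-id idea:

```python
# Sketch of manual offset tracking keyed by (topic, partition), mirroring
# the commit flow described above. No real Kafka client is involved; a
# committed offset conventionally means "next offset to read".
class OffsetTracker:
    def __init__(self, group_id):
        self.group_id = group_id
        self.pending = {}    # (topic, partition) -> offset to commit next
        self.committed = {}

    def record(self, topic, partition, offset):
        """Note that `offset` was processed; commit position is offset + 1."""
        self.pending[(topic, partition)] = offset + 1

    def commit(self):
        """Move all pending positions to committed and return a snapshot."""
        self.committed.update(self.pending)
        self.pending.clear()
        return dict(self.committed)
```

In the real job this state lives in the Kafka broker per consumer group, so a restart resumes from the last committed position rather than reprocessing the stream.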

Worked on Looker dashboards: chart creation for the access-module flow, writing complex queries to render multiple line, pie, and column charts.

Worked on Spark Scala, Hive tables, GCP, batch programs, and the customer data platform (CDP).

Worked on projects written in Python for execution as Airflow tasks.

Environment: Spark 3 Scala/Java, Python, Hadoop, Hive tables, HBase, Google Cloud Platform, Java/J2EE, PL/SQL stored procedures, Kafka, REST, PL/SQL, MacBook, Maven, SBT, Shell Script, JSON, Unix, Kafka Streaming, Jenkins, Git, Terminal, Unix Box, Linux Box, Jira, Agile methodology, Looker.

Client: Florida Blue – GuideWell Jun 2017 to Dec 2020

Project: Enterprise Integration Team – EIS [Multiple Projects]

Role: Senior Software Engineer [Java/Scala/python]

Location: Jacksonville, FL

Responsibilities: Florida Blue/GuideWell provides coverage for hospital services, and Blue Shield covers physicians' services. The insurer offers some form of health insurance coverage in every US state.

Using Scala, generated multiple reports for the Florida Blue project, named membership by contracts, OE renewals, pending binder, membership by contract bronze, etc. This complex project handles multiple source databases (DB2, Hive tables, SQL, mainframe DB2), performs all the transformations, and as its final action creates a dynamic CSV or pipe-delimited file. Due to the huge data volume, we faced heap issues in the Hadoop cluster and solved them with repartitioning, shuffling, and in-memory caching.

Developed a Spring Boot service with asynchronous calls updating DB2 and MongoDB, wrote a microservice for the care gaps project sourced from MongoDB, and wrote a Spark streaming job for the provider update process that receives messages from Kafka and updates a PostgreSQL table.

Developed the care gaps project, which concerns medical members with health issues: the provider sends a reminder note to a member about a pending health checkup (for example, a diabetic member needs a blood test three times a month). Developed this project in Node.js; agents send the customer details, and the care-gap details request and response are exchanged as JSON structures.

Worked on Spark Scala, Python, Hive tables, HBase, microservices, batch programs, Spring Boot, and Splunk.

Responsible for guiding and mentoring team members when they face technical issues related to Spark Scala, Java/J2EE, jars, etc. Worked on production setup (Git, Jenkins, OpenShift) and deployment to prod, applying Spark knowledge in data processing and parallel processing.

Wrote shell scripts for all the projects to execute on Linux and Unix boxes.

Environment: Spark 3/2.11 Scala/Java, Cloudera platform, Hadoop, Hive tables, HBase, MongoDB, Java/J2EE, Spring Boot, Servlet, IBM Sailfish, AWS S3, PostgreSQL, PL/SQL stored procedures, Kafka, REST, WebLogic, PL/SQL, Windows XP, Maven, Shell Script, Impala, JSON, Unix, Kafka Streaming, Toad 12.6, Jenkins 2/1, Git, WinSCP, Gradle build, SBT, PuTTY, Unix Box, Linux Box, OpenShift, Agile methodology.

Employer: Cognizant Technologies [2.1 years]


Client: AT&T Project: Retailers & Credit and OPUS April 2015 to May 2017

Role: Senior Java Developer,

Location: Richardson, Texas & Alpharetta GA

Developed AT&T Retailers and Credit; these projects ensure efficient wireless service to the millions of customers associated with the USA project and all its brands. The project mainly focuses on pricing rate plans for AT&T customers, named Catalog. It comprises the cpc, gears, and rome applications, with databases named authoring & delivery staging.

Developed a Wi-Fi catalog project and a trinity project with entity-relationship mapping, built output in JSON/XML format, published the files to DMaaP for AT&T customers, and worked on multiple projects.

Developed two WLL catalog projects. The first reads an Excel sheet, gets the zip-code details, collects all addresses for each zip code through GSD API calls, validates the address details using multithreading, and then updates the valid addresses in a table. The second reads data from an Excel sheet, verifies the addresses via an API call, then updates or inserts into multiple tables and refreshes the view table for viewing the details.
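The concurrent address-validation step above can be sketched with a thread pool; `validate` here is a hypothetical stand-in for the GSD API call, which is not shown:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of validating addresses concurrently, as in the WLL catalog flow.
# `validate` stands in for the GSD API call (an assumption, not the real API);
# threads suit this because the work is I/O-bound network calls.
def validate_addresses(addresses, validate, max_workers=4):
    """Run validate() over addresses in parallel; keep only the valid ones."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = list(pool.map(validate, addresses))  # preserves input order
    return [a for a, ok in zip(addresses, results) if ok]

# Example with a trivial validator that rejects empty strings:
print(validate_addresses(["1 Main St", "", "9 Elm Ave"], lambda a: bool(a)))
```

Because `pool.map` preserves order, the valid subset comes back in the same order the addresses were read from the sheet.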

Developed a new project able to activate, reactivate, redeem, and support customer subscriptions at the AT&T-Mobile carrier-switch level using AT&T-Mobile's Wholesale Enterprise Services Platform. This project implements the ability to add, remove, or modify a customer's service package at the carrier rather than through software deployed to the handset, and generates PL/SQL stored procedures for retrieve and insert operations.

Developed the SafeLink project to create, with AT&T-Mobile, the technology that drives TracFone's ability to service subscriber accounts using the AT&T-Mobile carrier switch.

Developed a catalog service, including the rate-plan service for AT&T, migrating the WebTop framework to Java, J2EE, and Struts, alongside the DMaaP service and the Spring framework.

Developed the trinity, WLL, Wi-Fi, hcv, clarify, eis, ams, my att consumer, my att premier, and Telegence catalogs.

Responsible for creating PL/SQL stored procedures for each requirement using the Spring framework JdbcTemplate. Worked on DMaaP network-access publishing and receiving using network port registration. Worked on production support for the cpc project.

Developed the application based on the design, adhering to the Struts and Spring frameworks, including Java/J2EE, Servlets, EJB, PL/SQL stored procedures, JSON, Toad, and Maven.

Environment: Java/J2EE, Servlet, Spring, EJB, PL/SQL stored procedures, DMaaP, REST, WebLogic, PL/SQL, Windows XP, Maven, Shell Script, JSON, Unix, Toad 12.6, Jenkins, WinSCP, PuTTY, Unix Box, Linux Box, MongoDB, Agile methodology.

Employer: Hexaware Technologies, USA [7.8 years]

Client: Northern Trust October 2013 to March 2015

Project: Fundamental Development and Dashboard Development project

Role: Senior Java Developer (Location: Chicago, IL)

Environment: Java/J2EE, Struts, Servlet, JSP, JavaScript, CSS, Spring, EJB, PL/SQL stored procedures, Flex Builder, JAX-WS, Adobe, Ajax, AngularJS, WebLogic 10.3.6/10.3.2, Windows XP, Maven, Ant, Unix, Sonar violations, Oracle, WinSCP, PuTTY, Lotus Notes, Visual Studio 2010, XML Spy, JSON, and SoapUI.

Fundamentals, Fund Brief, and the performance dashboard applications (collectively 'Fundamentals') are online performance reporting and analysis tools. They are used by internal performance and risk analysts, external asset owners, and external asset managers to generate performance reports. Fundamentals, Fund Brief, and the Fundamentals dashboard are web-based applications delivered to clients over the internet via a web browser.

Fundamentals is a standalone web application. Fund Brief and the Fundamentals dashboard are delivered as portal blocks on a Northern Trust portal containing other blocks.

Developed a new copy/share snapshot feature, sharing dashboard tabs among internal and external users, who can copy 1 to N tabs from the front-end dashboard application; in the backend Oracle database this interlinks with 10 dashboard tables in a parent-child relationship via the primary key of core-ID generation, with PL/SQL stored procedures for the retrieve and insert operations.

Developed a new dashboard URL portal link to publish the dashboard website, interlinked with a Flex page; the user ID is verified by a separate security-controller EJB. This includes a tab whose exhibits can dynamically display 1 to 4 exhibits row- or column-wise, with the Internet Explorer window size adjusting dynamically on maximize or minimize.

Developed a new metadata service for the dashboard using the RPM service; a proxy class handles the Java class object request and response, calculates metrics for the dashboard, and saves them to the dashboard table for the specific property selection.

Developed a strategic plan to migrate the application from the Documentum WebTop framework to the Java, J2EE, Struts, and Flex frameworks.

Developed a new JSP page for the dashboard application portlet using the SWF file as the view.

Developed new Spring transaction management for the dashboard project, per the copy-and-share snapshot requirement.

Responsible for creating PL/SQL stored procedures for each requirement using the Spring framework JDBC template; each client requirement was designed with UML and Visio 2010 diagrams.

Design and development of functional modules for the fundamental development and dashboard projects. Implementation of Service-Oriented Architecture (SOA).

Client: Meritain Health Care – An Aetna Company, Buffalo NY USA

Project: EDI Monitoring Oct 2012 to Sep 2013

Role: Senior Java Developer (Location: Buffalo New York)

Environment: Windows XP, Java/J2EE 1.6, DG System, SOA, JavaScript, Struts, XML, SQL Server, HTML, WebSphere.

This project works closely with the monitoring team on the end-to-end process of a health insurance application. Monitoring includes advising whether records have loaded into the processing system, and the role is responsible for monitoring FTP (File Transfer Protocol) and the intermediate locations from which files are sent and received/loaded into the destination system. The key requirement is to monitor the error locations and send updates to management. The company is one of the USA's largest administrators of self-funded benefit plans and the leading independent administrator of consumer-directed health plans, providing plan administration, innovative wellness, medical management, disease management, network management, and cost-management services.

Responsibilities:

Diagnosed the health care (EDI) applications and applied the appropriate functional complexities of the database relevant to each application.

Identified the business rules for health care insurance and delivered enhancements, working across multiple applications.

Developed a strategic plan to migrate the application from the Documentum WebTop framework to Java, J2EE, and Struts. Performed development and unit testing of the application customizations.

Developed a custom Java framework for the application to interact with Documentum, the content server, and the database server. Developed extensible Java web services using Tomcat.

Client: DELTA Airlines, ATLANTA GA May 2010 to Sep 2012

Project: AXIS –Re-platform CMS & CMS Rewrite Study Phase I & II

Role: Senior Software Engineer and Module Lead (Location: Chennai, India)

Re-engineering the Customer Management System (CMS), to be renamed "AXIS". AXIS was developed to eliminate manual processes, automation inconsistencies between working groups, toggling between systems, and duplication of training material and processes, while reducing handling time and improving the overall customer experience. Key requirements of AXIS include a new user interface; streamlined call-flow processes; structure with enough flexibility for agents to service the customer effectively; DL Term functionality and manual work processes minimized or eliminated; an enhanced customer experience by providing agents high-value customer information and suggested actions; total utilization of SOA services; speed-to-market implementation of technology changes; and a single user interface to multiple applications (Rejects, Groups, JV, etc.).

The Customer Management System (CMS) is a complex passenger service application that provides reservations representatives with an intuitive, streamlined interface for making reservations, checking flight data, reviewing SkyMiles accounts, and more. It was developed using Java Swing, which is outdated because of its standalone nature and installation requirements. This project, the "Delta CMS rewrite study", analyzes the complexity (including interface complexities) of the existing application and proposes a solution for replacing the outdated technology with new-generation technologies.

Responsibilities:

Coding & Unit Testing, Conducting System & Regression Testing, Conducting Peer reviews and Independent Testing. Designing the application using UML modeling tools.

Design and Development of Functional Modules for AXIS such as Seat Map Allocation, Components, Pricing and Flight Segment.

Developed the application based on the design, adhering to the Google Web Toolkit (GWT) framework, Struts, Java/J2EE, XML, and CSS.

Environment: Windows XP, JDK, Google Web Toolkit, IBM WebSphere, TIBCO AMS, Enterprise SOA, JAX-WS, JUnit 4, IBM ClearCase, Checkstyle Eclipse (RAD) plug-in, Struts, Servlets, Ant build, and SQL.

Client: ANA Airlines Jan 2009 – Apr 2010

Project: ANA PSS - Role: Developer, onsite coordinator - Location: Japan, Haneda Airport

Environment: ASCII FORTRAN, COBOL, UNISYS OS2200, FCSS.

Responsibilities:

Maintenance and enhancements for the All Nippon Airways passenger service application. This includes analysis, design, coding, unit testing, and system testing of the mainframe system; global and local release management for the application; and real-time maintenance and problem resolution for end users.

This includes DEPCK, a complex functionality developed to minimize the airline's human-resource effort. It ensures that passengers from different locations get connected to the appropriate flight at the correct time during transits. This is achieved by calculating and generating reports on passengers from inbound as well as various outbound flights for each location. Passenger boarding requests, baggage, connecting-flight details, and the gate number of the connecting flight are all handled.

Understood client requirements by studying the functional documents. Wrote basic design documents for client requirements. Attended weekly status meetings and provided detailed status reports to the client. Discussed new requirements with the end users and the client.

Developed and deployed high-priority changes; performed root-cause analysis and code optimization.

Coordinating with the business for architectural changes. Helping the Development teams for application related knowledge. Release management for the application.

Client: Unisys Aug 2007 to Dec 2008

Project: Unisys X-Cargo - Role: Developer - Location: Chennai, India

Environment: XML, ASCII FORTRAN, COBOL, UNISYS OS2200.

USAS: the project re-engineers the existing LMS legacy technology to an "open" J2EE Air Core architecture based on WebLogic and Oracle. During this period I was primarily involved in understanding the existing Unisys ClearPath mainframe, functional analysis, and migration to use cases.

Responsibilities

Involved in coding based on the design documents, testing, code-mining the existing code, and reviews.


