
Senior Technical Cloud Software

Location:
Plano, TX
Posted:
April 01, 2025

Contact this candidate

Resume:

Mr. Munish Parihar 972-***-****

********@*****.***

EXPERIENCE SUMMARY

•19+ years of experience in the analysis, design, development, implementation, and management of full life cycle commercial applications.

•Proficient in designing domain-level solution architecture and developing event-driven applications using Java/J2EE and SQL/No-SQL databases.

•Experienced in AI/ML data science projects including "Credit Card Users Churn Prediction" and "Personal Loan Campaign."

•Demonstrated success in creating lean, scalable, and reliable architectural solutions.

•16+ years of development experience on Java/J2EE platform-based technologies, messaging, data processing, and indexing technologies, with custom rule engine implementation.

•Sound understanding of architecture and strong development skills using the latest digital technologies.

•Experienced in using DevOps tools like Docker and Kubernetes for continuous integration, build, and deployment.

•Ability to work adaptively both as a team member and individual contributor, build rapid prototypes, and implement scalable mission-critical solutions.

•Functional experience includes Telecom Development Applications, Network Management systems, Media Streaming, and Shipping Domain products.

ACADEMICS

Master of Computer Application, Rajeev Gandhi University, India, 2001

PROFESSIONAL CERTIFICATIONS

Brainbench certification in ITIL

Brainbench certification in OO Concepts and Design

Data Scientist's Toolbox (Johns Hopkins University)

(https://www.coursera.org/account/accomplishments/certificate/D3FL49SNU3)

R Programming (Big Data)

(https://coursera.org/share/5350c6781e91802f229ca1ce0181f949)

TOGAF® Foundation

Pursuing a Post Graduate program in AI/ML and NLP (The University of Texas at Austin)

TECHNICAL SUMMARY

Competencies

Operating Systems: Windows, Solaris, HP UX, Linux

Middleware Technologies: IBM MQ, Message Brokers, JBOSS FUSE, Apache Kafka

Database (SQL/No-SQL): Oracle, MS Access, MySQL, Cassandra, Snowflake (4/5)

IDE: NetBeans, Eclipse, IntelliJ IDEA

Profiling Tools: OptimizeIt, JProfiler, VisualVM

Programming Languages: Java (JDK 1.4-1.8), J2EE (JSP, Servlet, EJB, JDBC, JMS), Spring, Struts, REST Web Services, PL/SQL, CQL, Spark SQL, Shell Scripting, ColdFusion, Python (Pandas, scikit-learn, NumPy, Matplotlib)

Microservices Technology: Spring Boot, Docker, Kubernetes, Swagger

Rule Engine: IBM ODM

Web/Application Servers: Apache, Tomcat, iPlanet, JBoss, WebLogic, Nginx

Frameworks: Struts, Spring Boot, Hibernate

Big Data: Data Science, R Programming, Apache Spark, Pig, Hive, Flume

DevOps/Containerization: Docker, Kubernetes, ELK stack

Design Tools: UML, Microsoft Visio, ArchiMate

Cloud Platforms: AWS, Azure

WORK HISTORY

Duration

Organization

Designation

June 2023 – Present

AT&T USA

Solution Architect

April 2022 – June 2023

T-Mobile USA Inc

Domain Architect

July 2017 – April 2022

Tech Mahindra Ltd.

Digital Architect

Aug 2014 – June 2017

Tech Mahindra Ltd.

Technical Lead

Jan 2011 – September 2014

Tech Mahindra Ltd.

Sr. Programmer Analyst

March 2006 – Dec 2010

Tech Mahindra Ltd.

Sr. Software Developer

March 2005 – March 2006

Network Programs India Ltd.

SMTS (Senior member of Technical Staff)

Aug 2004 – March 2005

Agilis Software International Pvt. Ltd.

Sr. Software Developer

Dec 2003 – July 2004

Koenig Solutions Pvt Ltd.

Software Developer

WORK EXPERIENCE

AIML PROJECTS

Stock Market News Sentiment Analysis and Summarization

Developed an AI-driven sentiment analysis system that automatically processes and analyzes news articles to gauge market sentiment and summarizes the news at a weekly level, in order to enhance the accuracy of stock price predictions and optimize investment strategies.

Large Language Models, Transformers, Prompt Engineering, Exploratory Data Analysis, Data Manipulation, Word Embeddings, Text Preprocessing.
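The weekly aggregation step can be sketched in miniature. The toy lexicon below stands in for the actual LLM/transformer sentiment model, and all article text and word lists are made up for illustration:

```python
from collections import defaultdict
from datetime import date

# Toy lexicon standing in for a transformer sentiment model (illustrative only).
POSITIVE = {"beat", "growth", "surge", "upgrade"}
NEGATIVE = {"miss", "loss", "drop", "downgrade"}

def score(article: str) -> int:
    """Net sentiment of one article: positive hits minus negative hits."""
    words = article.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def weekly_sentiment(articles):
    """Aggregate per-article scores into ISO-week buckets (year, week)."""
    weeks = defaultdict(list)
    for day, text in articles:
        weeks[day.isocalendar()[:2]].append(score(text))
    # Mean score per week: >0 reads bullish, <0 bearish.
    return {wk: sum(s) / len(s) for wk, s in weeks.items()}

articles = [
    (date(2024, 1, 2), "Earnings beat forecasts, analysts upgrade the stock"),
    (date(2024, 1, 3), "Shares drop after revenue miss"),
    (date(2024, 1, 10), "Strong growth drives a surge in demand"),
]
print(weekly_sentiment(articles))  # → {(2024, 1): 0.0, (2024, 2): 2.0}
```

A production version would replace `score` with a model call and add summarization, but the week-level rollup shape stays the same.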

Credit Card Users Churn Prediction

Customers leaving its credit card services would cause the bank losses, so the bank wants to analyze customer data to identify the customers who will leave the credit card service and the reasons for it, so that the bank can improve in those areas. As a Data Scientist at the bank, the task was to build a classification model to help the bank improve its services so that customers do not renounce their credit cards.

Development, Train and Test: Performed EDA and data pre-processing; split the data into train, validation, and test sets; developed the following ML models and evaluated Recall, Precision, and F1-score on the given dataset; created confusion matrix and feature importance matrix: BaggingClassifier, LogisticRegression, XGBClassifier, DecisionTreeClassifier, RandomForestClassifier, and AdaBoostClassifier.

Tuning: Tuned the models with oversampling (SMOTE) and undersampling techniques combined with hyperparameter tuning, and published the tuned results.
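The split, minority-class oversampling, and metric evaluation described above can be sketched as follows. This uses a synthetic dataset and naive random oversampling as a stand-in for SMOTE; the real project used the bank's dataset and SMOTE proper, which synthesizes new minority samples rather than duplicating:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, f1_score, precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced dataset standing in for the bank data:
# ~15% of customers are churners (class 1).
X, y = make_classification(n_samples=1000, n_features=10, weights=[0.85, 0.15], random_state=42)

# Train / validation / test split (60/20/20); the validation set would be
# used for model selection among the candidate classifiers.
X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, test_size=0.4, stratify=y, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.5, stratify=y_tmp, random_state=42)

# Naive random oversampling of the minority (churn) class until both classes
# are equal in size (stand-in for SMOTE).
minority = np.where(y_train == 1)[0]
extra = np.random.default_rng(42).choice(minority, size=len(y_train) - 2 * len(minority))
X_bal = np.vstack([X_train, X_train[extra]])
y_bal = np.concatenate([y_train, y_train[extra]])

model = RandomForestClassifier(random_state=42).fit(X_bal, y_bal)
pred = model.predict(X_test)
print("recall:", recall_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
print("f1:", f1_score(y_test, pred))
print(confusion_matrix(y_test, pred))
```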

Personal Loan Campaign

A campaign that the bank ran last year for liability customers showed a healthy conversion rate of over 9%. This encouraged the retail marketing department to devise campaigns with better-targeted marketing to increase the success ratio. As a Data Scientist at the bank, the task was to build a model to help the marketing department identify potential customers with a higher probability of purchasing the loan.

Development, Train and Test: Performed EDA and data pre-processing; split the data into train, validation, and test sets; developed the following ML models and evaluated Recall, Precision, and F1-score on the given dataset; created confusion matrix and feature importance matrix: DecisionTreeClassifier, RandomForestClassifier, and AdaBoostClassifier.

Tuning: Tuned the models with class-imbalance techniques combined with hyperparameter tuning, and published the tuned results.
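The class-imbalance handling combined with hyperparameter tuning can be sketched like this. The dataset is synthetic and the parameter grid is illustrative, not the project's actual configuration:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic imbalanced data standing in for the bank's campaign dataset:
# ~10% of customers accepted the personal loan (class 1).
X, y = make_classification(n_samples=800, n_features=8, weights=[0.9, 0.1], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Class-imbalance handling via class_weight, tuned jointly with tree depth:
# 'balanced' re-weights mistakes on the rare "buys the loan" class.
grid = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": [3, 5, 7], "class_weight": [None, "balanced"]},
    scoring="recall",  # recall matters: missing a likely buyer costs the campaign
    cv=5,
)
grid.fit(X_train, y_train)
print(grid.best_params_, round(grid.best_score_, 3))
```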

Tech Mahindra (Americas) Inc.

Solution Architect

June 2023 – Present

AT&T USA - OG (Order Gateway)

A set of microservices that stores order data quickly in a common format and checks the status of an order regardless of where it is in the lifecycle, channel, or product. A common order status and format for all orders across channels, products, and fulfillment types enables a customer-friendly display of the order from one source. This avoids order information being shown across different channels from different sources in different formats, which could confuse the customer. OG covers products such as Wireline, Wireless, and Wireless Broadband.

As a Solution Architect:

Lead the end-to-end architecture of the Order Graph (OG) microservices platform to support a common order data model and order lifecycle orchestration.

Ensure stateless, scalable, and resilient service design that supports rapid order creation and status retrieval.

Design a canonical order format to unify order representation across products and fulfillment types.

Define APIs for storing, updating, and retrieving order information in real-time, agnostic of source systems or sales channels.

Architect the system to track and evolve order states based on incoming events, handling complex scenarios like split orders, partial fulfillment, cancellations, and returns.

Collaborate with downstream systems (billing, provisioning, logistics, etc.) to ensure event-driven integration and status updates.

Enable a centralized and consistent view of order status, eliminating discrepancies across channels such as web, mobile, call center, and in-store.

Ensure OG supports real-time visibility and troubleshooting for customers and support agents.

Define standards and patterns for order microservices, including versioning, event schemas, error handling, and retries.

Work with enterprise architects to align with overall digital architecture and governance frameworks.

Partner with Product Managers, Engineering Leads, DevOps, and Business Stakeholders to translate business needs into scalable tech solutions.

Provide technical leadership, mentoring developers and ensuring adherence to architectural principles and best practices.
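The standards work around event schemas, error handling, and retries can be illustrated with a minimal sketch. The event fields and retry policy below are hypothetical, not AT&T's actual OG schema:

```python
import json
import time

# Hypothetical canonical order event envelope; the field names are
# illustrative, not the actual order schema.
def make_event(order_id: str, status: str, version: str = "1.0") -> str:
    return json.dumps({"schemaVersion": version, "orderId": order_id, "status": status})

def process_with_retry(handler, event: str, retries: int = 3, base_delay: float = 0.01):
    """Run a possibly-failing event handler, retrying with exponential backoff."""
    for attempt in range(retries):
        try:
            return handler(json.loads(event))
        except Exception:
            if attempt == retries - 1:
                raise  # retries exhausted: surface the error (or dead-letter it)
            time.sleep(base_delay * 2 ** attempt)

calls = []
def flaky(evt):
    """Simulated downstream consumer that fails on its first two calls."""
    calls.append(evt)
    if len(calls) < 3:
        raise RuntimeError("downstream unavailable")
    return evt["status"]

print(process_with_retry(flaky, make_event("ORD-1", "SHIPPED")))  # → SHIPPED
```

Versioning the envelope (`schemaVersion`) lets consumers evolve independently, which is the point of defining such standards once for all order microservices.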

Environment: Linux, Java 8, Apache Maven 3.1, SpringBoot 1.5.3.RELEASE, Postman REST Client, Swagger 1.5.15, Soap UI 5.3.0, Jenkins 2.5, Microsoft Azure, IntelliJ IDEA, ELK, Code Cloud, Kafka.

T-Mobile (USA) Inc. (FTE)

April 2022 – June 2023

IDS (Innovative Data Solutions)

Innovative Data Solutions enables decisions, analysis and reporting insights for the T-Mobile Enterprise. IDS mission is to empower T-Mobile business and technology partners to leverage data as a highly valuable asset and to consume information for critical insights enhancing T-Mobile's customer experience. IDS vision is to revolutionize the enterprise to become a data-driven culture.

As Sr. Architect (Domain), responsible for:

•Develop innovative solutions and strategic vision for multiple domains such as Customer Care, Marketing, Digital, Device and Supply Chain, and Consumer.

•Support domain-level solutions and design strategies for IDS (Innovative Data Solutions).

•Develop architectural solutions to migrate business products from Teradata to Snowflake.

•Assist in designing the Unified Data Platform (UDP), a Data Mesh that connects multiple business domains to share data products via a domain-agnostic, cross-cutting data platform.

•Design architectural solutions to ingest data into UDP (ADLS and Snowflake) from the source systems for existing products.

•Developed process for the Technology Selection and Assessment within IDS.

•Collaborate with Enterprise architects, Business users and data architects to execute data strategies.

•Oversee quality, consistency and alignment of design to industry standards.

•Develop and implement proofs-of-concept and prototypes to illustrate technology approaches.

•Develop high-level solution design artifacts to support the DA and DevOps teams.

•Design resilient, scalable, and simplified architectural solutions to solve customer business needs.

Environment: Archi 4.8.0, GitLab 15.5.0, Snowflake, Teradata, Azure.

Tech Mahindra (Americas) Inc.

March 2021 – April 2022

Verizon- Cyber Risk Monitoring (CRM) (Plano TX)

Cyber Risk Monitoring allows customers to understand their security posture and threat environment. By integrating multiple data sources, the Dashboard provides a 360-degree view of a customer's security posture with actionable data that lets users focus on a clear action plan that will deliver a quantifiable return on their security investments. The Dashboard is designed for consumption from the Board level down to the Security Operating Manager level. In addition, the Dashboard benchmarks the company against its industry and allows monitoring of suppliers' and/or partners' outside-in security posture, leveraging a third-party scoring method that is becoming a risk requirement in some industries and a best practice in others.

The Cyber Risk Monitoring Dashboard (“Dashboard”) serves as an overview of the customer's overall standing, while granular, raw data is available for more in-depth analysis. Cyber Risk Monitoring can be purchased at three levels:

● Level 1

● Level 2

● Level 3

Each level incorporates a variety of risk vectors, which examine and quantify a specific aspect of a company’s cybersecurity posture. Risk vectors are weighted and aggregated to determine each level’s score and all level scores flow into your Overall Security Posture.

As Sr. Technical Architect, responsible for:

•Migrating CRM capabilities to AWS for the EMEA and APAC regions.

•Designing PCI Cloud integration architecture with various data points.

•Performing root cause analysis of CRM on-premises and AWS production issues.

•Designing the upgrade plan for the EKS cluster with Terraform-based automation.

•Developing Terraform templates/scripts for AWS resource creation.

•Participating in meetings with project stakeholders to understand and analyze new requirements and to plan design and development activities.

•Designing low-level solutions with various POCs.

•Developing iterative architectural artifacts to respond to CRM security architecture needs.

Environment: Linux, Java 8, Apache Maven 3.1, SpringBoot 1.5.3.RELEASE, Node.js 12.0.3, Postman REST Client, Swagger 1.5.15, Soap UI 5.3.0, Jenkins 2.5, AWS, Terraform 0.13, Amazon DynamoDB.

November 2017 – Mar 2021

T-Mobile Event Tracking Repository for All (TETRA) (Plano TX)

Cloud-Native Digital Transformation | SAP Modernization | Microservices | EDA

TETRA (T-Mobile Event Tracking Repository for All) is a cloud-native, greenfield application designed to modernize and migrate legacy SAP systems (ECC, OER, HANA, PI) into a Domain-Driven, Event-Driven Architecture (EDA). It involved decomposing complex monolithic processes related to device lifecycle, supply chain, inventory, and event tracking into scalable microservices.

TETRA serves as a centralized event repository that connects diverse data sources with varying formats and velocities. It efficiently processes structured, semi-structured, and unstructured data for real-time analytics and dashboard/report generation.

As Digital Architect, responsible for:

•Led TETRA capability decomposition using Domain-Driven Design (DDD) principles to transform legacy SAP monolithic architecture (ECC, OER, HANA, PI) into independently deployable microservices for an Event-Driven Architecture (EDA) platform.

•Designed TETRA digital API’s low-level architecture, breaking down SAP-based workflows (device lifecycle, supply chain, inventory management) into agile REST APIs, promoting scalability, modularity, and resilience.

•Conducted in-depth analysis of SAP ECC/OER device lifecycle processes and HANA analytical flows to define bounded contexts, data contracts, and event models for decoupled service design.

•Modernized SAP PI integration interfaces (IDoc, RFC, SOAP) by introducing Kafka-based asynchronous messaging and standardized JSON over RESTful APIs.

•Designed TETRA’s digital integration architecture, enabling seamless interoperability with heterogeneous data sources—structured, semi-structured, and unstructured—at variable data speeds.

•Responsible for technology selection and platform design, including Apache Spark, Kafka, IBM ODM, ElasticSearch, Apache Solr, MySQL, and Tomcat to support TETRA’s performance, analytics, and rule execution needs.

•Designed and developed Spark-based microservices including tetra-device-service, tetra-event-consumer, and tetra-xmine-processor to prototype large-scale data processing frameworks adopted by DEV teams.

•Built TETRA’s core base framework to enable real-time event subscription and feed ingestion pipelines, forming the backbone of its EDA.

•Designed and prototyped IBM ODM-based rule engine capability for managing complex device workflows, such as event routing, state transitions, and compliance.

•Engineered Spark-based generic feed ingestion architecture, supporting scalable and reusable ingestion pipelines across TETRA’s microservices.

•Prototyped Apache Solr-based indexing layer to optimize response times of metadata-heavy microservices handling event and device history queries.

•Designed Spark High Availability (SparkHA) and Zookeeper-based standalone cluster setups, enabling high-throughput and fault-tolerant distributed processing.

•Participated in meetings with project stakeholders to understand and analyze new requirements and to plan design and development activities.

•Created low-level solutions and technical POCs to validate digital architecture assumptions and guide iterative development.

•Actively engaged with stakeholders and cross-functional teams, converting business requirements into technical designs and delivery roadmaps.

•Managed a large team of developers, defined development standards, supervised sprint execution, and reviewed code (PRs) to ensure architectural alignment and quality.
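The generic feed-ingestion idea can be illustrated with a minimal, single-machine sketch. The real pipelines ran on Apache Spark; the feed layout and column names below are made up:

```python
import csv
import io

# A tiny pipe-delimited .dat feed (made-up columns) standing in for the
# structured/semi-structured feeds TETRA ingested at scale with Spark.
FEED = "device_id|event|ts\nD100|ACTIVATED|2021-03-01\nD101||2021-03-02\n"

def ingest(feed: str):
    """Parse a delimited feed, validate each record, route good vs. bad rows."""
    good, bad = [], []
    reader = csv.DictReader(io.StringIO(feed), delimiter="|")
    for row in reader:
        # A record is valid only if every column is populated.
        (good if all(row.values()) else bad).append(row)
    return good, bad

good, bad = ingest(FEED)
print(len(good), len(bad))  # → 1 1
```

The routing of invalid rows to a separate sink mirrors the reusable, generic ingestion pattern the bullets above describe: one parser, per-record validation, and a clean/reject split that every downstream microservice can rely on.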

Environment: Linux, Java 8, Apache Maven 3.1, SpringBoot 1.5.3.RELEASE, Apache Spark 2.2.1, Pivotal Cloud Foundry, slf4j 1.7.12, Logback 1.1.3, JUnit 4.12 & Mockito 1.10.19, Postman REST Client, Swagger 1.5.15, Soap UI 5.3.0, Jenkins 2.5, IBM ODM 8.9.0, Cassandra, JDK 1.8, ElasticSearch 7.1.

Buy Online Pickup in Store (BOPIS) (T-Mobile D2R)

The Digital to Retail program leverages T-Mobile digital assets to drive new customer candidates into the retail channel by allowing customers to complete their transaction online and pick up in store. Customers can book an appointment online and find the required device in a nearby store, so there is no need to visit every store to enquire about a device. The Inventory Availability APIs calculate sellable stock on hand for the product searched within a store, determine whether a store or product participates in product search, and report product availability status within stores back to the customer.

As Digital Architect, responsible for:

•Participating in meetings with project stakeholders to understand and analyze requirements and to plan design and development activities.

•Feasibility study and Impact assessment.

•Designing Inventory availability architecture to support BOPIS.

•Solution designing and development of inventory-availability-service microservices.

•Design solution to facilitate event subscription and Spark based feed (.dat file) ingestion.

•Development of Spark based inventory Sales forecast analytics (master-data-changes-d2r and forecast-data-processor).

•Designing a solution to populate sales history to calculate DOS (days of supply).

•Integration strategies and Deployment plan.

•Oversee the preparation of High Level and Low Level Design Documents.

•Daily scrum activities
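The DOS (days of supply) calculation mentioned above reduces to stock on hand divided by average daily sales. A minimal sketch, with made-up numbers; the real service derived average daily sales from the Spark-processed sales-history feeds:

```python
# Illustrative days-of-supply (DOS) calculation from sales history.
def days_of_supply(stock_on_hand: int, sales_history: list[int]) -> float:
    """DOS = sellable stock on hand / average daily units sold."""
    avg_daily = sum(sales_history) / len(sales_history)
    # No sales at all means the stock never depletes at the current rate.
    return float("inf") if avg_daily == 0 else stock_on_hand / avg_daily

# 14-day sales history for one device SKU at one store (made-up numbers).
history = [3, 2, 4, 3, 1, 5, 2, 3, 4, 2, 3, 1, 2, 3]
print(round(days_of_supply(54, history), 1))  # → 19.9
```

A low DOS signals the store should be excluded from BOPIS product search before stock runs out mid-pickup-window.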

Environment: Linux, Java 8, Maven 3.1, SpringBoot 1.5.3.RELEASE, Apache Spark 2.2.1, Pivotal Cloud Foundry, slf4j 1.7.12, Logback 1.1.3, JUnit 4.12 & Mockito 1.10.19, Postman REST Client, Swagger 1.5.15, Soap UI 5.3.0, Jenkins 2.5, IBM ODM 8.9.0, Cassandra, JDK 1.8.

November 2015 – October 2017 (23 months)

Order Capture Engine (Richardson TX.)

OCE is a strategic (end state target) Order Lifecycle Management system aimed to serve all consumer and business products. OCE will leverage existing order management systems to fulfil / provision the orders.

OCE – Key Capabilities: order decomposition and orchestration; store-and-forward capability for orders; error correction and resubmit capability for orders; based on reusable APIs and components; agent interface for order queues, search, status, error correction, etc.; horizontally scalable; not tied to any one commerce engine or platform; configurable workflows and rules; common reporting solution; leverages open source or products with an ELA to reduce licensing costs.

As Technical Project Lead, responsible for:

•Developing scripts/pipelines to implement Dockerization/containerization, deploying various API images on a Kubernetes cluster.

•Worked on DevOps clustering technologies to manage application containers with Kubernetes.

•Worked with Kafka 0.10.2.0 and ELK stack for logging and dashboarding.

•Worked on memory leaks in Informatica ActiveVOS.

•Tuning JVM heap size, Connection pool, Threads for STAGE or STB environment to optimize performance.

•Fixing technical issues pertaining to Redhat JBOSS FUSE 6.1/6.2.

•OCE Application migration from Redhat JBOSS FUSE 6.1 to 6.2.

•Resolving issues concerning AJSC5 (AT&T Java-based Service Container) based services.

•Resolution of technical issues pertaining to JAVA/J2EE/Weblogic 12c/Docker.

•Fixing Production issues related to JBOSS/Weblogic server/Java/J2EE by taking Thread/Heap dumps, log analysis.

•Developing iterative architectural artifacts to respond to OCE digital architecture needs.

•Involved in low-level design and various POCs.

•Oversees development of solutions by onshore or offshore resources.

•Collaboration with peer organizations, quality assurance and other stakeholders.

•Unit Testing of the logic.

Environment: Linux, Jenkins/Maven 2, Java, J2EE 1.4, JMS, XML, XSLT, XSDs, Oracle 10g, WebLogic 12c, WLST, ActiveVOS 2.0, ATG, Red Hat JBoss Fuse 6.2/6.3, Camunda BPM, MariaDB, Karaf, Docker 1.8.0, Kubernetes, AJSC5, JDK 1.8.

January 2015- November 2015 (10 months)

GCP for AT&T (at Richardson TX.)

GCP is the platform application supporting all the AT&T Target Architecture clients. The system is divided into four major components: the EDB layer covers the application layer supported by web service components deployed on the WebLogic platform; the ETL layer covers bulk data processing; the ESDB layer covers the DB layer; and the BO layer covers reporting. Along with these four, EDB/DBOR also includes legacy migrated applications. Worked on development of the EDB/ETL layer and on resolving batch data processing issues.

As Technical Project Lead, responsible for:

•Worked on development of ETL Jobs for ETL layer.

•Designing and developing various components of the GCP platform.

•Modify existing ETL jobs to address business requirements or improve workflow.

•Suggest resolutions/alternate strategies to complete data migrations timely and effectively.

•Setting up data reconciliation process across systems.

•Lead the team on new technology initiatives, business processes, and deployment best practices.

•Defect management, E2E communication with Cross application components for issues resolutions.

•Contribute to process improvement, such as creating various automation scripts by implementing diagnostic modules, setting up SMTP notifications, and configuring WLST/Unix scripts to better monitor WebLogic-specific processes.

•Conduct proof-of-concepts (POC) to meet the business use cases for data migrations.

•Coach and mentor other delivery, quality assurance, and support personnel.

•Resolution of technical issues pertaining to ETL/JAVA.

•Supporting GCP cost reduction by improving services and providing innovative technology solutions.

•Unit testing and monitoring GCP performance metrics with VisualVM.

Environment: Sun Solaris 5.10, Maven 2, Java, J2EE 1.4, Python 2.6, WebLogic 8.1 SP6/9.x/10.1.3/10.3.6, WLST.

October 2009 – December 2014 (62 months)

IPOS (Xi) for AT&T (Onsite at Richardson TX.)

IPOS is essentially a middleware that sits between source systems (SystemX, OPUS, PDC, Phoenix, etc.) and back-end systems (CSI, CAM, Telegance, CARE, SalesTally, ED, Echeck, etc.). The functionalities provided by IPOS are: create and manage accounts; reserve subscriber CTN; Migration – migrate customer CTN from the Blue billing system (formerly AT&T) to Cingular billing systems; WLNP – port non-Cingular/non-Blue customers to the Cingular billing system and network (Orange network); Activation – provision to activate new phones; Fulfilment – deliver phones, accessories, and collateral to the customer; Customer Service – inquire about customer account details and rate plan changes; and wireline product ordering such as DSL, DTV, and Uverse ordering.

As Sr. Programmer Analyst, responsible for:

•Interpret business requirements to articulate the business needs to be addressed.

•E2E communication with Cross application components for issues resolutions.

•Develop technical solution with supporting business cases and road maps.

•Development of Uverse Ordering functionality.

•Development of DTV order flows.

•Coordination with development team, onsite team and client.

•Contribute to process improvement, such as creating various automation scripts by implementing diagnostic modules, setting up SMTP notifications, and configuring WLST/Unix scripts to better monitor WebLogic-specific processes.

•Resolution of defects pertaining to DTV and Uverse.

•Provide technical coaching and mentorship of resources.

•Enhancements/ Code fix.

•Unit Testing of the logic.

Environment: Struts 2.0, Web Services, WebLogic Application Server 10.3, JDeveloper 11.1.1.3, TeamCity 6.0.2, Bugzilla 3.6.1, Apache Maven 3.0.3, Sun Solaris 5.10, Maven 2, CVS/SVN, Java, J2EE 1.4, WebLogic 10.3, Apache Nexus repo.

July 2007 - October 2009 (27 months)

Inbound Architect for BT NGN (Onsite at British Telecom R&D Lab located at Ipswich U.K)

The BT Inbound Architect component comprises four sub-components: the BT Inbound Architect web-based module, the Call Plan Editor (CPE), IA CISL, and NSU.

BT Inbound Architect is a web-based portal that provides secure access to the customer's call data and to BT's reporting and call routing control services. Its features include easy access to reports and controls via the internet using a standard web browser, information that helps the customer decide how best to maximise call completion, and the ability to change the percentage of calls routed to different termination points. BT Inbound Architect provides customer access to BT Inbound Services products such as Full and Simple Control, Rapid Reports, Enhanced Information Statistics (EIS), and Enhanced Raw call Data (ERD).

The IA CISL application is a mediatory component that bridges the Inbound Architect web module to the Alcatel-provided CISL platform. It receives provisions from Oracle AQ (Oracle's JMS implementation), validates the messages, and forwards them to the CISL platform for further processing. Once CISL receives a message, it responds to the IA CISL application with a response message, which IA CISL logs for further usage; if a request message is invalid, the cause of the error is logged as well. The application also keeps track of network connectivity with CISL by sending continuous heartbeat requests and receiving suitable response messages.

DT (Data Transfer) is a servlet-based application that downloads online data to the client machine in Access format, enabling users to create/update call plans offline.

NSU (Network Side Update) is a servlet-based application invoked by the CISL platform to inform Inbound Architect about any updates that take place on the CISL-side network.
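The IA CISL mediation flow described above (validate, forward, log, heartbeat) can be sketched in miniature. The message fields and logic below are hypothetical simplifications, not the actual BT/Alcatel interface:

```python
import json
import time

# Required fields for a provisioning message; hypothetical, not the actual
# BT/Alcatel message format.
REQUIRED = {"messageId", "callPlanId", "action"}

forwarded, error_log = [], []

def mediate(raw: str) -> bool:
    """Validate an inbound provisioning message; forward it or log the error."""
    msg = json.loads(raw)
    missing = REQUIRED - msg.keys()
    if missing:
        error_log.append(f"rejected {msg.get('messageId', '?')}: missing {sorted(missing)}")
        return False
    forwarded.append(msg)  # in the real flow, sent on to the CISL platform
    return True

def heartbeat(send, interval: float = 0.0, beats: int = 3) -> int:
    """Ping the peer repeatedly and count acknowledged heartbeats."""
    acked = 0
    for _ in range(beats):
        acked += bool(send({"type": "HEARTBEAT", "ts": time.time()}))
        time.sleep(interval)
    return acked

print(mediate('{"messageId": "m1", "callPlanId": "cp7", "action": "UPDATE"}'))  # → True
print(mediate('{"messageId": "m2"}'))  # → False (cause recorded in error_log)
```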

As Sr. Developer, responsible for:

•Consult with stakeholders to ensure agreement on system Design.

•Responsible for devising solutions, solution architecture descriptions.

•Development of IA-CISL,NSU J2EE component.

•Works with Enterprise and Application Architects to ensure consistency in technical design.

•Development of Inbound architect interface.

•Providing maintenance of the WebLogic server for internal development and IVV&T.

•Enhancements/code fixes for faults raised on IA CISL and NSU middleware (J2EE/JMS).

•Providing assistance in design/development of CRs.

•Designing mission-critical strategies such as disaster recovery for production.

•Documenting the “as-is” business design and using business design re-engineering techniques.

Environment: Sun Solaris, C++, Java, J2EE 1.4, JMS, Oracle 10g, WLS Application Server 8.1 sp4, Cold Fusion 8.

March 2006 – July 2007 (16 months)

Work Manager Interface Gateway (British Telecom FFS project)

It is a gateway application that interacts with the Hub and the Vidus Work Manager. The role of WMIG revolves around validating the faulty orders posted by the Job Managers and providing an appropriate exception-handling mechanism when orders fail to meet business requirements or due to technical system failure. WMIG is responsible for keeping a vigil on the communication links among all FFS system components; it also raises a link test when a component loses its communication link to neighbouring systems.

As Sr. Developer / WebLogic, responsible for:

•Design & development of Workmanager application.

•Installation of BT WebLogic 8.1 SP4 on various production servers, including the Openreach, Bletchley, and Cardiff sites.

•Setting up a disaster recovery (Cardiff) plan for WMIG; preparation of various mission-critical documents.

•Individually developed the DR migration script, which is the basis of WMIG's mission-critical strategy.

•Setting Up numerous Domains for WMIG Application for ADE and Production Environments.

•Debugging of Weblogic Related Problems.

•Work with various Test/Production Environments team to solve system bugs.

•Bringing the entire project from Pune to Noida.

•Building up a very strong customer support and communication structure.

•Stabilizing WMIG from its very volatile state.

•Worked on Various BT Business processes and tools as per requirements.

Environment: Sun Solaris, C++, Java, J2EE 1.4, XML, XSLT, XSDs, Oracle 10g, WLS Application Server, MS VSS, Test Director.

Network Programs India Ltd.

March 2005 – March 2006 (12 months)

NSP (New Software Platform) Proalarm for Tekelec

The ProAlarm application is used for network planning and supervision of faults in network elements such as signaling links, linksets, signaling points, monitor units, Message Switch, ProTraq, ICP, etc., in the form of system alarms, processing errors, hardware failures, etc. ProAlarm is composed of two applications, i.e., Map Viewer and Map Config.

As a Sr. Software Developer, was involved in the following:

•Involved in development of Map Configuration and Viewer Part.

•Determining memory leaks inside the application and providing solutions through architectural enhancements.

•Assisting teams in the development of system sub-components dependent on ProAlarm.

•Involvement in the development of various sub-components (ProTraq, ProTraffic, SysAlarm).

•Assisting the Core team with enhancements of the Core layer as per changing interfacing system requirements.

•Solving technical problems for issues reported by the test team.

•Was involved in modifications and enhancements of the application.


