Data Informatica

Location: Toronto, ON, Canada

Posted: May 08, 2020


Swadesh Misra

Enterprise/Solution/Data Architecture

647-***-****

adc45v@r.postjobfree.com

linkedin.com/swadesh-misra

Technology Architect with 10 years of experience and a track record of success in challenging, fast-paced IT organizations, focused on the development and deployment of new technologies to drive growth, performance improvement, and automation.

An equal blend of business function and technology, able to lead design in agile, multicultural, global project teams through accelerated technology development.

Built Target State Architecture using the TOGAF framework to support business transformation for a new organizational undertaking.

Led POCs to modernize legacy applications and drive business transformation using cutting-edge technologies.

Proficient in applying best practices and in designing and working with RDBMS and NoSQL databases in complex business processes.

Expertise in application integration solutions using SOA middleware and REST architecture, including API Gateway and EIAM, on-premises and in the cloud.

Strong knowledge of financial application integration for risk reporting, monitoring, and quality control.

Experience in processing complex standard data formats for POS systems (NRF-ARTS), payment systems (EDI/X12), and healthcare, using file formats such as XML, JSON, EBCDIC, COBOL, and ASCII.

Successfully designed and led development of real-time data processing to support retail business, and of data quality for a freight tracking system at a Canadian railway.

Fast-paced learner of new business functions and technologies, with strong problem-solving skills.

SKILLS

Technology Governance

Data Governance

TOGAF-ArchiMate

ETL – real time, batch, scheduler

EAI – API, SOA, REST

Cloud – Azure, GCP

Data Modeling – ER, star schema, schema-on-read

Analytics & Reporting

OOP – C++, VB, .NET, Python

Standards & best practices – coaching

Vendor Management

Machine Learning

HIGHLIGHTS

Successful academic project building a machine learning model

Led a team in a GCP hackathon and secured a top-5 ranking among 30 teams

Effectively solved the performance and sizing issues of a Big Data platform for loyalty management at the largest retailer (100m)

Vendor management and architecture for large hybrid cloud deployment initiatives at the largest Canadian retailer and a bank

Stable and performant data integration application design and development (30m)

EDUCATION

Master of Management in AI (2020), QUEEN'S UNIVERSITY, Stephen J.R. Smith School of Business, Kingston, ON

Big Data course – Coursera, UC San Diego

Algorithms and Analytics courses – Ryerson University

Dual Bachelor's in Technology (1999) and Science (1996), Calcutta University, Kolkata, WB, India

MMAI Academic Projects – Smith School of Business, Queen's University, Toronto (08/2019-Present)

Increase customer dwell time, and hence revenue, in Oxford Properties' Yorkdale Mall

• Source data – 1.7M customer journeys: stores, revenue per store, store distance

• Data exploration, cleaning, feature generation and engineering

• Algorithm evaluation using R – XGBoost, Naïve Bayes, SVM, Logistic Regression

Image Classification – positive content and negative content

• Source data – 15K images from data.world with sentiment scores

• Exploration, handling class imbalance, cleaning, resizing

• CNN in TensorFlow, chosen after comparison with PyTorch/Keras

• Hyper-parameter tuning, transfer learning, object detection

Environment: R Studio, Anaconda, Azure Notebook, ML Studio, Databricks, Queen's JupyterLab, GitHub
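For illustration, a minimal sketch of the kind of TensorFlow CNN described above, with class weights standing in for the imbalance handling; the input shape, layer sizes, and weight ratio are assumptions, not the project's actual model.

```python
# Minimal sketch: binary image-sentiment CNN in TensorFlow/Keras.
# Input shape, layer sizes, and class weights are illustrative
# assumptions, not the exact model from the project.
import tensorflow as tf

def build_model(input_shape=(128, 128, 3)):
    return tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=input_shape),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # positive vs. negative content
    ])

model = build_model()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# Class weights counteract the class imbalance noted above (ratio assumed):
# model.fit(train_ds, validation_data=val_ds, epochs=10,
#           class_weight={0: 1.0, 1: 3.0})
```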

Solution Architect, BMO: Corporate & Commercial Lending, Toronto (01/2019-01/2020)

Recruited as Lead Architect to manage design and architecture of commercial lending applications and risk reporting. The application portfolio includes facility management (APMS), commercial lease/loan fulfilment (LoanIQ), online loan booking (LTS & Intellect), and legal document management (LiveLink-OpenText). The interaction of these applications with BMO corporate departments such as Risk, GL, and Enterprise Reporting is critical.

Responsibilities:

• Governance of design and architecture across the commercial lending floor, interacting with BMO centralized hubs – SmartCore & Connector Grid, IDP

• Creating the standards for application design, real-time/batch integration, job scheduling, and middleware

• Building the data model of the mobile/desktop deal-booking application; digitization

• Interacting with vendors such as DTCC, Adaptive, and Greenplum (rebuild in Greenplum from Netezza)

• Evaluating vendor-proposed solutions as lead architect for LoanIQ, LiveLink, LTS, Intellect, and APMS (Asset Portfolio Management System), e.g., CECL

• Design of the Commercial Lending common API gateway – evaluation of Zuul using Spring Boot

• BMO payment hub (Enterprise Payment Business Service) architecture – loan servicing application using the payment domain application (WRAP), clearing with SWIFT for Canada and Fedwire for the US

• Payment systems interaction patterns:

o Channel initiates an individual payment via the Smart Core REST/JSON API, and Smart Core calls EPBS services via SOAP API with an ISO 20022 payload

o Product systems initiate with an ISO 20022 payment (SOAP API for online, messages via JMS/MQ, batch payments with ISO 20022, EDI/X12 via EMFT or CIPG)
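For illustration, a minimal Python sketch of the kind of ISO 20022 payload referenced above, built as a pain.001-style credit transfer initiation. The element subset, schema version, and values are illustrative assumptions; the actual EPBS/Smart Core contract (SOAP envelope, full schema) is not reproduced.

```python
# Minimal sketch: building an ISO 20022 pain.001-style payload in Python.
# Element subset and values are illustrative; the real EPBS/Smart Core
# message set is assumed, not reproduced.
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

NS = "urn:iso:std:iso:20022:tech:xsd:pain.001.001.03"  # assumed schema version

doc = ET.Element("Document", xmlns=NS)
init = ET.SubElement(doc, "CstmrCdtTrfInitn")

hdr = ET.SubElement(init, "GrpHdr")
ET.SubElement(hdr, "MsgId").text = "MSG-0001"  # hypothetical message id
ET.SubElement(hdr, "CreDtTm").text = datetime.now(timezone.utc).isoformat()
ET.SubElement(hdr, "NbOfTxs").text = "1"

pmt = ET.SubElement(init, "PmtInf")
ET.SubElement(pmt, "PmtMtd").text = "TRF"  # credit transfer
tx = ET.SubElement(pmt, "CdtTrfTxInf")
amt = ET.SubElement(ET.SubElement(tx, "Amt"), "InstdAmt", Ccy="CAD")
amt.text = "1250.00"  # hypothetical amount

payload = ET.tostring(doc, encoding="unicode")
print(payload)  # would be wrapped in a SOAP envelope before calling EPBS
```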

Environment: Red Hat Enterprise Linux Server release 6.4, HDP, Hive version 0.13.1, Shell Script, Oracle 12c, Excel and VBA, IIS, WAS, TIBCO EMS, Spotfire, IBM MQ, JMS provider, DataPower

Enterprise/Solution Architect, Loblaw-SDM, Toronto (08/2016-12/2018)

The program allows customers to participate in a unified loyalty program across the enterprise, delivered through a customer-centric, omni-channel solution. The business opportunity is to understand customer behavior and preferences across the grocery and drugstore divisions, thereby allowing targeted and personalized offers that increase the business's share of wallet.

Loblaw Companies wanted to build an integrated platform to support loyalty management across multiple retail domains: Grocery, PC Bank, and Shoppers Drug Mart. Some existing technologies/toolsets are leaving the stack and a few new ones are introduced, such as EagleEye (EE) and the Hadoop ecosystem. As SAP CRM is replaced by EE, there are many streams of integration work across several external systems, the internal ESB, and a new persistence layer in Hadoop.

Responsibilities:

• As Enterprise Architect:

Created reference architecture for data integration (real-time & batch), using the Sparx Enterprise Architect tool

Participated in RFP review and SaaS vendor selection, and formulated the scoring for the Procurement team (Vendor Portal & CSI project)

Made the key design decision for the Guiding Star project by evaluating two solution alternatives

• As Solution Architect, created solution blueprints for:

Guiding Star (redesign from Teradata/BODS to Informatica/BDE & Hive) – re-engineering

Enterprise store locator in Bullseye – T4G & Shoppers store data ETL (PWC, BDE, Hive) – re-engineering

Data analytics platform in Azure PaaS (DistCp, BLOB, HDInsight, Databricks, Spark, WASB)

Digital middleware in Google Cloud Platform (Akamai, API Gateway, Apigee, GKE, VPC, Interconnect, DSG)

Targeted offer flow – PCMS & SWEDA for automating offer push retry – re-engineering

PC Tender Bonus real-time (impact assessment and accommodating new changes in POS, ESB – IIB, event processing – JSON, transaction warehouse changes – Hive, ETL changes – BDE & HQL)

CASL optimization – centralized opt-out data and CRM update

SFMC integration – batch, real-time, call center modernization

Hadoop Data Lake build using Informatica BDE

• Solution architecture included, but was not limited to:

Understanding the requirements, conducting sessions with BSAs and business owners, and reviewing the existing process and code

Identifying network, firewall, API gateway, hardware, and data security changes for the solution, with infrastructure & security architects

Identifying monitoring and alerting mechanisms for failure points

Presenting the proposed solution to the Architecture Review Board

Once approved, socializing the solution with the delivery team and providing continuing support until it is deployed to production

• Data architecture included:

Working with the analytics team on a common transaction model in Hive (logical & physical modeling using Sparx)

Using DbVisualizer/Beeline to query data, analyzing discrepancies, behavior, and insights into the event flow

Consolidating the XML schema from PCMS, which inherits the NRF-ARTS schema

Automating model creation in Sparx using JavaScript

Data mapping, Interface Agreement, Functional Design review

• Mentored the developer team in XML transformation and B2B Studio for dynamic parsing of offer JSON.

• Build prototype solutions based on initial business requirements.

• Performed POCs with the dev team:

Informatica Cloud integration for SFMC

Informatica DIH for data ingestion into Hadoop

Atlas

Exploring different integration alternatives

Environment: Sparx, HDP (Hadoop) on a Teradata appliance, SUSE Linux, Informatica 9.6.1 and BDE, DIH, Shell Script, Excel and VBA, Oracle, Teradata, Azure, Google Cloud, IIB, B2B Studio

Data/ETL Architect, Scotiabank, Toronto (02/2016-08/2016)

Re-engineered the A4-OSFI regulatory report: analyze the data, develop the ETL solution according to the new A4 changes, and produce the report. Phase 2 includes moving data to the data lake and generating the report in Cognos from Hive tables.

Responsibilities:

• Reviewed the BRD, analyzed the OSFI changes, liaised with the business on BRD changes, and translated the BRD into a solution.

• Conceptual, Logical, and Physical Data Modeling for the OSFI reporting.

• Prepared data mapping for the ETL and the solution design, including DB object creation, Informatica mappings, alerting, and scheduling.

• Built prototype solutions using Informatica mappings, mapplets, and workflows, scheduled in Maestro.

• Performed a POC on Spark and Sqoop for loading data to the HDP platform.

• Conducted weekly status update meetings with the team and IT management.

• Coordinated with testing to define test strategies and test plans, and to resolve defects using Quality Center.

Environment: Red Hat Enterprise Linux Server release 6.4, DB2, Informatica 9.6.1 and BDE, Cognos 10.2, Shell Script, Excel and VBA, DataStage

Solution Designer, CIBC: Enterprise Data Hub, Toronto (07/2015-12/2015)

EDH is a hub that houses data from different sources throughout the enterprise and is meant to be a single enterprise-wide source. The Cloudera suite is used to build the infrastructure: Hadoop, HDFS, Hive, Impala, etc. Many downstream systems consume data from EDH, such as the Risk department, Finance Technology, and fraud reporting. A variety of file formats (EBCDIC, COBOL, XML, ASCII, etc.) from sources were loaded to Hive via Informatica BDE and shell scripts. Many challenges were encountered in integrating this new technology with Informatica, Cognos, SAS, etc.

Responsibilities:

• Reviewed the master Enterprise Data Hub final architecture and recommended modifications based on POCs performed on file formats such as Parquet, SequenceFile, and Avro, as well as file compression such as LZO and Snappy.

• Analyzed interface agreements and source data to propose a generic, reusable design solution for an N-tier application environment.

• Prepared a generic design document and presented it to multiple development teams to adopt the generic methodology, reducing effort from 5 days to 1 using shell scripting and HQL on Linux (a minimal sketch of the pattern follows this role's environment list).

• Troubleshot the Cognos reporting solution on Hadoop-Hive via Impala – performance issue resolution, establishing best practices such as statistics gathering on indexes.

• Built prototype solutions based on initial business requirements; VBA tool for code generation and configuration file creation.

• Reviewed data mappings, developed BDE mappings, and reported limitations to Informatica, requesting resolution.

• POC on HParser to build a generic parsing framework for EDH.

• Designed, documented, and developed Autosys job dependencies and scheduling for Informatica workflows/shell scripts.

• Integration testing and performance benchmarking for Big Data.

Environment: Red Hat Enterprise Linux Server release 6.4, Hadoop 2.5.0-cdh5.3.0, Hive version 0.13.1, Informatica BDE, Cognos 10.2, Shell Script, Oracle 11g, Excel and VBA
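The generic, config-driven load pattern mentioned above might look like this minimal Python sketch (the original used shell scripting and HQL); the paths, table names, and Beeline JDBC URL are hypothetical.

```python
# Minimal sketch of the generic, config-driven load pattern described
# above: one script plus per-feed config instead of a hand-written job
# per source. Paths, tables, and the JDBC URL are hypothetical.
import subprocess

FEED_CONFIG = {  # one entry per source feed (illustrative)
    "gl_balances": {"landing": "/data/landing/gl_balances",
                    "table": "edh_stage.gl_balances"},
    "positions":   {"landing": "/data/landing/positions",
                    "table": "edh_stage.positions"},
}

def load_feed(feed: str) -> None:
    cfg = FEED_CONFIG[feed]
    hql = f"LOAD DATA INPATH '{cfg['landing']}' INTO TABLE {cfg['table']};"
    # Run the HQL through Beeline (connection URL assumed).
    subprocess.run(
        ["beeline", "-u", "jdbc:hive2://edh-host:10000/default", "-e", hql],
        check=True,
    )

if __name__ == "__main__":
    for feed in FEED_CONFIG:
        load_feed(feed)
```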

Solution/Data Architect, BMO Capital Market, Toronto (11/2014-07/2015)

Recruited for QRM tool implementation for SMR and later for re-architecting the Fast Pre-Deal Check (FPDC) application. FPDC is envisioned as a lightweight gateway that can be logically inserted between a trading system and a risk system such as SunGard Adaptiv or Jaguar (referred to here as the Master System) to facilitate an expedited credit check, by reserving credit ("carving out") in the Master System ahead of time and issuing it to other systems after a quick (overestimating) check. This mitigates the following delays:

1. Physical communication – FPDC is deployed geographically closer than the risk system, thereby eliminating physical communication delays

2. Processing – FPDC's credit-availability algorithms trade accuracy for speed (and hence overestimate the risk involved), allowing the credit validation check to be returned to the requesting party more quickly.

Assignment: Add a new channel to the existing application and set up disaster recovery for the whole application. A minimal sketch of the carve-out idea follows.
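```python
# Minimal sketch of the FPDC "carve-out" idea: reserve credit from the
# master risk system ahead of time, then answer pre-deal checks locally
# (fast, deliberately overestimating) against that reservation.
# Class, field names, haircut, and amounts are hypothetical, not the
# actual FPDC design.

class FastPreDealCheck:
    def __init__(self, carved_out_limit: float):
        # Credit reserved ("carved out") in the master system up front.
        self.available = carved_out_limit

    def check(self, estimated_exposure: float, haircut: float = 1.25) -> bool:
        """Approve if the overestimated exposure fits the local reservation."""
        required = estimated_exposure * haircut  # trade accuracy for speed
        if required <= self.available:
            self.available -= required
            return True
        return False  # fall back to a full check in the master system

fpdc = FastPreDealCheck(carved_out_limit=10_000_000)
print(fpdc.check(estimated_exposure=2_000_000))  # True: served locally
```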

Responsibilities:

• System study and analysis of the existing system integration design documents between sources such as Broadway, Adaptiv, and WSS for the two instances, NY and LDN.

• Prepared the design solution for adding a new channel for IET trading using MQ, GemFire, and Netezza instance tables for security, credit lines, and deal info.

• Designed disaster recovery by proposing minimal code changes to replicate the GemFire in-memory database updates via MQ pub-sub; integration testing for the new instance and DR.

• LoanIQ – wire routing via WRAP-SWIFT, and LoanIQ payment execution via Smartcore service gateway-PSH-Fedwire MQ-Fedwire through EPBS, which performs payment workflow/orchestration, invokes the Wire Payment Engine (EWP) to generate the wire payment delivery information, and then delivers it to the clearing/settlement system.

• Wire payments involve:

o Channel – Smart Core – EPBS

o PSH (Payment Engine)

o WRAP – FedWire/SWIFT (Clearing/Settlement System)

• Prepared the solution design for the SMR reporting requirement to OSFI.

• Analyzed and designed the solution for loading data from EDW to QRM.

Environment: MQ on WebSphere, GemFire, Netezza, MS SQL Server, Autosys, C# .NET Windows Services and Web Services, QRM, SQL Server 2008, SQL Server SSRS, Excel and VBA

Recruited as FTE by VERAX SOLUTIONS CORP, Toronto (Sep 2011-Nov 2014)

Architect/Technical Lead, CIBC Treasury: BASEL3 (05/2012-11/2014)

One major deliverable for the Basel III Liquidity Risk project is to create a complete and sufficiently granular cash flow for all on- and off-balance-sheet instruments that give rise to liquidity risk, delivering on regulatory requirements for a robust liquidity management process, the ability to produce OSFI regulatory reporting, and support for FSA reporting requirements. The solution consists of:

(1) Building a master staging environment that contains granular on- and off-balance-sheet data and has reconciliation capabilities, to be used by the liquidity management group for liquidity analysis and as input to the QRM system

(2) Building the infrastructure to run the QRM system

(3) Employing a reporting capability; the options under consideration for the reporting tool: (i) SSRS, (ii) QRM reporting capability

(4) Building end-to-end processes

Responsibilities:

• System study and analysis of the existing source system feeds (Retail and Capital Market) and new feeds with respect to requested instrument attributes in an N-tier application environment – re-engineering.

• Analyzed and implemented profit and loss attribution and IR derivatives, options, swaptions, etc. for Capital Market feeds.

• Analyzed and reviewed the Bloomberg feed for rating changes of securities, and its integration with the nightly batch job.

• Conceptual, logical, and physical data modeling for the data warehouse in Erwin & Oracle Data Modeler.

• Designed the end-to-end solution framework to meet Basel III reporting requirements for OSFI and management reporting – 36 million records daily, 50 million at month end – using reusable components.

• Made the key design decision, after analyzing the solution, on loading data via Oracle external tables, SQL*Loader, or Informatica file loader.

• Performed POCs:

Performance benchmark using Informatica and SQL Server 2008 versus Oracle 11g on Exadata

Integration between EDW and QRM via VB scripting on ETL, stratification, planning, and reporting

QRM DB creation automation

Informatica Cloud integration for customer data

• Proposed a hybrid model using Informatica pushdown optimization.

• Build prototype report in SSRS based on initial business requirements.

• Designed and developed Autosys job dependency and scheduling for Informatica Workflows.

• Resolved 32-bit/64-bit QRM-Autosys integration via a wrapper in VB.NET and IExpress, and the 32/64-bit Oracle client for SSRS and QRM.

• Worked with enterprise architecture team for Infrastructure estimation, volume analysis, and cost estimation.

• Designed and developed a reusable UNIX shell script for file validation, watching, and retry using a control table (a minimal sketch follows this role's environment list).

• Designed and developed several reusable components – mapplets and transformations – translating complex business requirements.

• Translated business requirements into data mappings and Informatica mappings/workflows; developed the mapping and workflow strategy.

• Reviewed and recommended performance tuning of PL/SQL code – processing time brought down from over 16 hours to 2 hours.

Environment: Linux, Oracle 11g on Exadata, ERWIN, TOAD, Informatica 9, HP Quality Center, Autosys, SQL Server 2008, SQL Server SSRS, QRM, Excel and VBA.
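The reusable file validation/watcher/retry pattern flagged above was originally a UNIX shell script driven by a control table; a minimal Python sketch of the same idea follows, with an illustrative control-table layout, path, and retry counts.

```python
# Minimal sketch of the file-validation/watcher/retry pattern described
# above (originally a reusable UNIX shell script driven by a control
# table). Table layout, path, and retry counts are illustrative.
import os
import time

CONTROL = [  # stand-in for the control table rows
    {"file": "/landing/retail_feed.dat", "max_retries": 6, "wait_secs": 300},
]

def watch(entry: dict) -> bool:
    """Poll for the file; retry up to max_retries, then report failure."""
    for _attempt in range(entry["max_retries"]):
        if os.path.exists(entry["file"]) and os.path.getsize(entry["file"]) > 0:
            return True  # basic validation passed; hand off to the ETL job
        time.sleep(entry["wait_secs"])
    return False  # retries exhausted; raise an alert instead

for row in CONTROL:
    if not watch(row):
        print(f"ALERT: {row['file']} missing after retries")
```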

Architect/Technical Lead, CP Railway: Data Quality project (09/2011-04/2012)

The railway company provides freight transportation services over a 14,000-mile network in Canada and the US. Its high-density network serves virtually every major sector and ships commodities. This project aims to implement an Informatica-based solution within the Equipment Movement Events (EME) system, currently in PL/SQL, to:

• Protect downstream systems and database from data issues

• Identify frequent event issues and channel them back to the source

Responsibilities:

• System study and analysis of the application with the client and the various business owners.

• Performed information systems needs assessment and information gathering, and recommended appropriate business systems and IT infrastructure for the business user group.

• Designed and architected the data quality solution using Informatica PowerExchange, Designer, and Workflow Manager 9.0, which loads real-time data from MQ into stage tables and processes it into the final tables.

• Designed and developed a UDT component using Informatica Data Transformation Studio (XSD schema).

• Designed and developed several reusable components – mapplets, transformations.

• Translated business requirements into Informatica mappings/workflows; developed complex mapping and workflow strategies.

• Conducted a proof of concept for the design approach of converting the PL/SQL code using Informatica PowerExchange and PowerCenter.

Environment: Linux, Oracle 11g and PL/SQL, TOAD, Informatica 9, PowerExchange, IBM MQ, Informatica B2B Studio (XSD schema), HP Quality Center, Control-M, Excel and VBA, Visual SourceSafe

Recruited as FTE by CGI, Halifax, NS (Feb 2011-Aug 2011)

Team Lead, ETL Architect, Tufts Health Plan (02/2011-08/2011)

The objective of the IHM project is to lower medical trend and administrative costs while improving members' health by preventing the onset and/or reducing complications of chronic disease. In addition, the health management landscape is continuously evolving, and Tufts Health Plan needs to stay aligned with competitors by shifting from disease/condition-specific programs to more disease-agnostic, risk-stratified population management, with seamless integration across programs.

Responsibilities:

• Requirements analysis and detailed functional design for the various blocks identified within the application

• Study of various interfaces, custom-built application data, and data sources such as the legacy system, Oracle 9i, and PL/SQL, for re-engineering many into Informatica mappings.

• Designed and architected the Integrated Health Management solution for sub-vendors using Informatica Designer and Workflow Manager 9.0.

• Led support and development projects for the ETL production framework in Informatica, following SDLC methodology and conceptual, logical, and physical data modeling.

• Communicated with developers, infrastructure, project and program managers, and business users to develop and deploy project solutions including hardware and software deployments and upgrades.

• Developed several reusable components for the project team; source control for Oracle procedures and packages in WinCVS.

• Job automation and dependency setup in Cisco Tidal

Environment: UNIX, Oracle 11g and PL/SQL, TOAD, Informatica 9, Cisco Tidal, WinCVS, Lotus Notes application

Recruited as FTE by TATA CONSULTANCY SERVICES LTD, India and USA (Nov 2002-Jan 2011)

BI Project Lead, ASP / J&J – Irvine, CA, USA (03/2008-01/2011)

Advanced Sterilization Products (ASP), a division of Ethicon, Inc., a Johnson & Johnson company, has a proven track record designing and delivering innovative infection prevention solutions. The BI department maintains an EDW system for the sales organization and for financial & supply chain reporting. Assessed the current business data quality and implemented data governance, in partnership with the business, to support reliable business reporting and analytics. Gathered business requirements for effective, goal-driven analytics and reporting dashboards.

Responsibilities:

• Designed and architected a multi-tiered application for the sales promotional project; responsibilities included requirement gathering, system architecture design, ETL development, data modeling, data integration/migration, and custom reporting.

• Dimensional data modeling using star schema for the various metrics-related reports; logical and physical data modeling.

• Developed, maintained, and managed Cognos cubes, Transformer models, and Framework Manager models.

• Build prototype solutions based on initial business requirements.

• Managed Cognos models and reports; UI design specifications to be used for report creation and testing.

• Designed the dashboard templates and created various dashboard reports for the business users.

• Implemented an offshore-onsite support and delivery center for BI technology (SQL Server administration, Informatica, and Cognos).

• Design of the ETL system in DTS, SSIS & Informatica, and complex stored procedures for Cognos reports – re-engineering many DTS/SSIS packages to Informatica.

• Enterprise-level backup & recovery strategies; implemented Quest LiteSpeed backups to network drives.

• Performed the data center consolidation project; job automation & alerting.

• Led the team for Informatica administrative tasks such as installation, configuration, upgrades, hotfix application, workflow scheduling, repository backup, and disaster recovery planning for SQL Servers.

• Led the team for SQL Server administrative tasks such as installation, configuration, physical/logical database design, database creation, upgrades, cloning, patching, proactive performance tuning and monitoring, troubleshooting, backup and recovery, partitioning, and replication.

• Involved in the Global Shared Service Model implementation for J&J companies

• Accountable for multiple project deliverables from assessment through SOW, cost/budget control, planning and scheduling, and team management.

Environment: Windows XP, ERWIN, SQL Server 2000/2005/2008, Informatica 8.6, Cognos 8.4, Excel and VBA, Visual SourceSafe

DBA & Integration Architect, Administrative Office of the Courts, Olympia, WA and India (09/2007-02/2008)

Courts in the state of Washington currently use the JIS application, built on legacy systems, to manage the life cycle of filed cases; about 230 courts currently operate in the state. The current case management system stores case-related data, and data on persons directly or indirectly involved with the cases, in the IBM DB2 legacy system, using a common database for person data related to cases filed in all courts throughout the state. The Administrative Office of the Courts (AOC) maintains the current case management application.

Responsibilities:

• Designed web services exposing COBOL programs via an SOA framework, BPEL processes, and XSL transformations.

• Troubleshot performance issues reported by clients using monitoring tools such as HP OpenView.

• Configured, managed, and resolved issues in standby databases; Oracle installation, migration, and upgrade services for all 9i to 10g; set up the offshore development environment – Windows DB2 and SQL 2005 databases.

• Designed and constructed DTS packages for extracting data from the SQL database and loading it into the SQL 2005 database.

• Designed and constructed procedures and cursors in DB2 and SQL Server; performance tuning (explain plan, SQL tracing, and STATSPACK); defined standards for database documentation and created documentation of the database environment.

• Established and provided schema definitions, as well as tablespace, table, constraint, trigger, package, procedure, and index naming conventions.

• Managed all aspects of product development, including gathering system and reporting requirements, defining project scope, scheduling programming tasks, defining milestones, setting up development environments, enforcing coding standards, writing test cases, and delivering the finished product on time and on budget.

Environment: Windows XP, SQL Server 2005, Oracle SOA Suite, Excel and VBA

Integration/Technical Lead, Toyota Financial Services – Torrance, CA, USA and India (03/2005-08/2007)

OSCAR has a core product called ALSCOM, which interfaces with different discrete systems for contract acquisition. A contract is an application for an automotive loan; applicants can be individuals or businesses. An application is entered through delivery channels such as OCA, Dealer Daily, and RouteOne. The data enters ALSCOM via a VB interface application as XML. Analysts in TFS branch offices open it for decision in ALSCOM; applications can be declined, countered, or approved, and are then uploaded to the Host (Lease/Retail mainframe), where they get discounted (the rate of loan repayment is finalized) and booked. Approved applications then get funded. OSCAR also has VB applications interfacing with the TFS data warehouse and credit scoring (based on bureau reports of individual applicants), the TFS Lexus database, OFSC, etc.

Responsibilities:

• Application support (24x7 on-call by rotation) – L2/L3 support, change management (ticket resolution and scheduled releases).

• Setup and configuration of a clustered BizTalk server with SQL Server 2000, and a disaster recovery environment for the BizTalk server.

• Installed and upgraded SQL Server and application tools; physical data modeling of the OLTP system.

• Designed and developed the database structure and stored procedures as necessary for business needs.

• Worked on business requests for various decision-making analyses.

• Implemented the record retention policy for data archival – keeping 135 days of data.

• Performance tuning of queries; managed procedures and triggers; damage control operations when the system malfunctioned.

• Supported production database backup and log shipping jobs; managed service pack and security updates on servers.

• Managed SiteScope alert setup and Microsoft MOM; coordination between offshore and onsite.

• Prepared and conducted training through technical documentation.

Environment: Windows 2000, SQL Server 2000, BizTalk 2002 (XSD schema), Excel and VBA, Visual SourceSafe

ETL Programmer, ALCOA Integration Services, ALCOA Inc., USA and India (03/2003-02/2005)

ETL development and testing for the INO project using Informatica. The summary bill invoices for the ACP and Presto business units of Alcoa are handled by the INO project. INO transforms XML data received from the XML Gateway into the EDI file that is fed to Gentran for further processing. The output schema was generated using Specbuilder, and ETL Designer was used to map the XML data to the EDI layout. A trading partner lookup file was used to map the interchange ID and qualifier of the TP, based on the customer ID received from Oracle (XML Gateway). A minimal sketch of this transform appears after this role's environment list.

Responsibilities:

• Analysis, design, and development; SCM (software configuration management)

• Coordinated with different support groups.

• Worked on change controls raised.

• Damage control operations when the system malfunctioned

• All reporting activity

• Coordination between offshore and onsite

Environment: UNIX, Oracle 8 and PL/SQL, TOAD, Informatica 7, WinCVS, XSD schema
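A minimal Python sketch of the INO transform described above: read customer and invoice fields from XML Gateway output, look up the trading partner's interchange qualifier and ID by customer ID, and emit an EDI-style record for Gentran. Element names, the lookup contents, and the segment layout are illustrative, not the actual Specbuilder/ETL Designer mapping.

```python
# Minimal sketch of the INO flow described above: XML from the Oracle
# XML Gateway -> EDI for Gentran, with a trading-partner lookup keyed
# by customer ID. Names, lookup values, and layout are illustrative.
import xml.etree.ElementTree as ET

TP_LOOKUP = {  # customer ID -> (interchange qualifier, interchange ID)
    "ACP-1001": ("ZZ", "ACPPARTNER"),
}

def xml_to_edi(xml_text: str) -> str:
    root = ET.fromstring(xml_text)
    cust = root.findtext("CustomerId")    # element names assumed
    total = root.findtext("InvoiceTotal")
    qual, interchange_id = TP_LOOKUP[cust]
    # A toy EDI-style summary-bill record, not a full X12 interchange.
    return "*".join(["ISA", qual, interchange_id, "SUMBILL", total])

sample = ("<Invoice><CustomerId>ACP-1001</CustomerId>"
          "<InvoiceTotal>512.75</InvoiceTotal></Invoice>")
print(xml_to_edi(sample))  # ISA*ZZ*ACPPARTNER*SUMBILL*512.75
```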

Module Leader, Origination System – GE Finance (Watford, UK) (11/2002-02/2003)

• Daily job monitoring, bug fixing, enhancement request handling, HLD/LLD design document preparation, coding, and testing.

• Developed SQL procedures; designed triggers and cursors.

• Performance tuning of the queries

• Preparation of Test Specifications.

• Modified and developed Omnis programs, designing the screens using Omnis Classic.

• Unit Testing.

• Ticket management

Recruited as FTE by CEPL, WB, India (Aug 1999-Nov 2002)

REFERENCES AVAILABLE UPON REQUEST


