Resume


Manager Data

Location:
Fremont, California, United States
Posted:
February 19, 2018


Resume:

SUMITHA SURESHKUMAR

ac4jca@r.postjobfree.com / 443-***-****

Work Status – Green Card

Current Location: Fremont, CA (Working Remote)

PROFESSIONAL SUMMARY

** ***** ** ********** ** IT industry. Roles held over the course of the career include Software Development Engineer, Senior Developer, Onsite Coordinator, Business Analyst, SME, SPOC, Application Developer and Quality Analyst.

7 years of extensive hands-on experience with Informatica client tools such as Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Metadata Manager.

Experience working with Hive databases.

Expertise in running and exporting lineage in Metadata Manager and using it for analysis.

Took part in a proof of concept (POC) for an IDQ project, using Informatica Developer to evaluate data quality checks and creating several mappings.

2 years of mentoring junior developers and MTU (MetLife Technology University) associates, helping them technically and taking them through business sessions.

Slowly Changing Dimensions, Change Data Capture, Dimensional Data Modeling, Ralph Kimball Approach, Star/Snowflake Modeling, Data Marts, OLAP, FACT & Dimensions tables, Physical & Logical data modeling.
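The Slowly Changing Dimension technique listed above can be illustrated with a short sketch. The following is a minimal, illustrative Python model of a Type 2 SCD load, where a changed attribute expires the current dimension row and inserts a new current version (the actual work was done in Informatica mappings; the function and field names such as `apply_scd2`, `eff_from` and `is_current` are hypothetical):

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked_attrs, today=None):
    """Apply a Type 2 Slowly Changing Dimension update: expire the
    current row when a tracked attribute changes, then insert a new
    current row with fresh effective dates."""
    today = today or date.today()
    current = next(
        (row for row in dimension
         if row[key] == incoming[key] and row["is_current"]),
        None,
    )
    if current is None:
        # Brand-new business key: insert as the first current version.
        dimension.append({**incoming, "eff_from": today,
                          "eff_to": None, "is_current": True})
    elif any(current[a] != incoming[a] for a in tracked_attrs):
        # Tracked attribute changed: close the old version, open a new one.
        current["eff_to"] = today
        current["is_current"] = False
        dimension.append({**incoming, "eff_from": today,
                          "eff_to": None, "is_current": True})
    return dimension
```

For example, loading a customer whose city changes from "Fremont" to "Cary" leaves two rows in the dimension, with only the newer one flagged current.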

Worked in different phases of the projects involving Requirements Gathering, Design, Development, Deployment and Project Closure

Domain knowledge in the healthcare, automobile, finance and insurance industries.

Hands on Experience in Application Development, Production support and Application Quality Assurance.

Extensively worked with EDI transactions (835, 277, 820).

Worked with X12 Transactions.
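As a rough illustration of the X12 transactions mentioned above, the sketch below splits an interchange into segments and elements using the common default delimiters ("~" and "*"); real interchanges declare their delimiters in the ISA envelope, and the helper names here are illustrative, not a production X12 parser:

```python
def parse_x12_segments(raw, seg_term="~", elem_sep="*"):
    """Split a raw X12 stream into segments, and each segment into
    elements. Delimiters default to the common conventions; real
    interchanges define them in the ISA envelope."""
    return [seg.strip().split(elem_sep)
            for seg in raw.strip().split(seg_term) if seg.strip()]

def transaction_set_id(segments):
    """Return the transaction set code (e.g. '835' for a remittance
    advice) from the ST header segment, or None if absent."""
    for seg in segments:
        if seg and seg[0] == "ST":
            return seg[1]
    return None
```

Running this over an 835 fragment such as `ST*835*0001~BPR*I*1500*C~SE*3*0001~` yields the segment list and identifies the transaction set as "835".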

Strong skills and a clear understanding of requirements, and of solutions to implementation issues, throughout the Software Development Life Cycle (SDLC).

Excellent knowledge in creating test scripts and executing them across environments. Strong experience in the execution and documentation of regression, system and user acceptance test plans, test scenarios and test results for large mainframe systems.

Expertise using the defect tracking tools like Test Director, Quality Center and Silk Test.

Worked as SME and/or Business analyst for Vehicle Accounting (EVMS), Order Management Services (OMS), Order Fulfillment Portfolio, Order Management, Vehicle allocation, Conflict Management and After-Sales in MOPAR-Chrysler.

Gathered a variety of automobile industry domain knowledge from Chrysler, Ford and Nissan.

Received many recognitions, including the Vendor Excellence Award and Pat-on-the-Back awards.

Expertise in OLAP, OLTP and MQ Series.

Managing overall product performance, providing oversight of significant enhancements to the product and setting priorities to ensure client satisfaction.

Strong analytical ability and excellent communication skills.

Provide leadership and strategic direction to the project team, providing coaching, training, development and feedback to line managers as needed.

Hands on experience in Change management and Project Management.

Built a standards-driven, scalable, secure and dynamic portal for application-specific information delivery.

Develop enhancements, resolve issues, coordinate testing of statement releases and oversee regular production efforts including data extracts, data warehouse feeds and printing.

EDUCATION

Bachelor of Engineering (Electrical and Electronics Engineering)

CERTIFICATIONS

Mainframe

VHDL (VHSIC Hardware Description Language)

C, C++

Certified in Business Communication Etiquette.

TECHNICAL EXPERTISE

Operating Systems

Windows, Mac, OS/390, ES/9000

Database

DB2, IMS-DB, Oracle, SQL Server, MS Access, Hive, Teradata

Languages

COBOL, C, C++, PL/SQL, VHDL, JCL, REXX

OLTP

CICS, IMS-DC

ETL

Informatica PowerCenter 9/8.6.1/7.x/6.x, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, Informatica Developer, SQL Server DTS, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), PowerExchange, Tableau, MDM

Data Modeling

Dimensional Data Modeling (Star Schema Modeling, Snowflake Schema Modeling, Fact and Dimension Tables), Relational Modeling, Database Normalization, ER Diagrams, Physical and Logical Data Modeling

Development Tools

VSAM, TSO/ISPF, SPUFI, File-Aid, Endevor, Expediter, Panvalet, File Manager, Easytrieve, DCCS, IBM Fault Analyzer, Changeman, DB2 Connect, Zeke, COOL:Gen, Roscoe, TraceMaster, SCLM, QMF, Boole & Babbage, IBM Debugger, GQL Tool, IMC DD, SAS, QlikView, Cognos

Packages

HTML, MS Office

Methodologies

OOSD, UML

Skilled

Oracle, Sybase, Visual Basic, PowerBuilder, MQSeries, IMS DC, Strobe, Teradata

Quality Assurance

Silk Test (Automation)

PROFESSIONAL EXPERIENCE

METLIFE Software Development Engineer

Cary, NC Jun 2014 – Present

Description: MetLife is one of the world's largest insurance providers, with over 90 million customers in 60 countries. The company combines global scale with surprising agility: in three months its Global Technology & Operations group developed and implemented a new MongoDB/NoSQL application that aggregates data across 70 legacy systems, storing 24 terabytes of data and handling 45 million agreements with 140 million transactions.

Responsibilities:

• Responsible for the ETL data integration process, accomplished through Informatica PowerCenter, Informatica Metadata Manager and Data Quality, SSIS and the Tivoli Scheduler.

Exported metadata into the repository and performed data lineage through Informatica Metadata Manager for analysis and debugging.

Performed POC for IDQ data quality checks and created multiple mappings.

Strong understanding of and hands-on experience with data quality, data profiling and metadata management

Strong understanding of data structure classifications and workflows

Created mappings to match data to a source system and confirm correctness of data using IDQ

Gathered requirements and obtained clarification on any queries related to them

Provided the technical masking solution and design

Coordinated with the data masking team and the client on work distribution for on-time, defect-free delivery

Developed and tested data masking code

Help define workflows that will include Informatica

Document all solutions created

Performance-tuned existing solutions as well as newly created ones

Work with multiple teams, development and business, to create solutions designed around their needs

Importing and extracting data from multiple sources like Teradata, Oracle, SQL Server, files, MF…

Extensively used a variety of complex mappings involving aggregator, filter, router, expression, joiner, and sequence generator to load data into data marts.

Developed UNIX Shell scripts to automate repetitive database processes.

Involved in performance tuning on Informatica mappings and worked with both look up and unconnected lookup transformations.

Used update strategy transformation and configured the mappings to handle updates, and to preserve the existing records.
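The update-strategy behavior described above can be sketched outside Informatica. PowerCenter flags rows with constants such as DD_INSERT (0) and DD_UPDATE (1); the Python helper below, `flag_rows`, is a hypothetical stand-in for that transformation logic (it is not Informatica code), flagging new keys for insert and existing keys for update so current records are preserved rather than overwritten:

```python
# PowerCenter-style update-strategy flag values (DD_DELETE = 2 also exists).
DD_INSERT, DD_UPDATE, DD_REJECT = 0, 1, 3

def flag_rows(existing_keys, incoming_rows, key):
    """Mimic an Update Strategy transformation: flag each incoming row
    for insert when its key is new to the target, or for update when
    the key already exists (so the existing record is revised in
    place rather than duplicated)."""
    flagged = []
    for row in incoming_rows:
        flag = DD_UPDATE if row[key] in existing_keys else DD_INSERT
        flagged.append((flag, row))
    return flagged
```

In a mapping this decision is typically driven by a lookup against the target table, which is what the `existing_keys` set stands in for here.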

Created tasks, sessions, worklets and workflows in the workflow manager, as well as worked on status updates using workflow monitor.

Worked with reusable transformations and mapplets.

Mentored MTU and junior associates both technically and on business processes.

Perform data object analysis by defining, analyzing, and validating data objects and relationship through the use of scripting and Informatica toolset.

Design, develop and maintain ETL data integration process with the Business Intelligence and IT team for multiple Admin Systems like PAS.

Implement data objects and integrating data shared across multiple source and target systems. Establish interfaces, developing and modifying functions, programs, routines, and stored procedures to export, transform, and load data

Coordinating actions among developers, production control, DBAs, the infrastructure team and the project vendor to implement the ETL scripts and objects.

Validated ETL by developing and executing test plans and scenarios, including data design and data quality. The toolsets include Informatica PowerCenter and Informatica Data Quality (with data profiler and analyzer).

Maintained data warehouse performance by identifying and resolving data conflicts and optimizing databases, networks and hardware.

Improved data integration by designing and evaluating new data interchange formats. The environment consists mainly of SQL Server database technologies, Informatica and supporting Business Intelligence toolsets such as Cognos.

Analyzed and documented user requirements and changes on a regular basis to maintain the integrity of the development.

Held regular meetings with BA’s for a clear business requirement understanding and to reduce gaps between the actual requirement and implementation.

Worked with various sources such as relational and XML files to populate the data mart.

Studied the current transactional databases in order to understand the existing system and created the mapping of the database objects from source to target database.

Involved with the business/functional team on requirements gathering for converting the old legacy system to mainframes.

Responsible for the delivery

Coordinating with different systems for data sync issues.

Involved in preparing weekly status report and monthly job processed report.

Carry out Quality Assurance activities for all deliverables.

Provide Technical Guidance to the team

Worked with IMS Data to compare the productivity of the products.

Day-to-day Activities:

• Design Review, Peer/Code Review

• Mentoring MTU and Junior developers

• Sonar Scan

• Veracode Scan

• Unit Testing Coverage

• Currency Upgrades

• Functional Signoff

• Performance Signoff

• Deployment Plan

• Release/Rollback plan

• ROS Guide/Run book update

• Monitoring Jobs

• Release Roster Entry

• Turnover checklist

• Change ticket state

Environment: Informatica PowerCenter 9, Oracle, PL/SQL, Toad, Workflow Manager, Workflow Monitor, Informatica Developer (IDQ), Informatica Metadata Manager, Maestro, IWR, Erwin, Unix Shell Scripting, XML, COGNOS, RADGauge, FTP, QlikView

NCTracks MMIS Senior Consultant

Raleigh, NC Jun 2011 – Jun 2014

Description: MMIS is a medical assistance program administered in North Carolina by the Division of Medical Assistance (DMA) for certain low-income individuals and families. Medicaid is a health insurance program for low-income individuals and families who cannot afford health care costs. Medicaid serves low-income parents, children, seniors, and people with disabilities.

NCTracks is the system that converts the old legacy system to mainframes. It also manages the NC MMIS program through subsystems such as Recipient, Provider, Claims, Reference, Finance, TPL and Managed Care.

Responsibilities:

Analyzed and documented user requirements and changes on a regular basis to maintain the integrity of the development.

Held regular meetings with BA’s for a clear business requirement understanding and to reduce gaps between the actual requirement and implementation.

Worked with various sources such as relational and XML files to populate the data mart.

Studied the current transactional databases in order to understand the existing system and created the mapping of the database objects from source to target database.

Extensively used a variety of complex mappings involving aggregator, filter, router, expression, joiner, and sequence generator to load data into data marts.

Developed UNIX Shell scripts to automate repetitive database processes.

Involved in performance tuning on Informatica mappings and worked with both look up and unconnected lookup transformations.

Used update strategy transformation and configured the mappings to handle updates, and to preserve the existing records.

Created tasks, sessions, worklets and workflows in the workflow manager, as well as worked on status updates using workflow monitor.

Worked with reusable transformations and mapplets.

Involved with the business/functional team on requirements gathering for converting the old legacy system to mainframes.

Responsible for the delivery

Coordinating with different systems for data sync issues.

Involved in preparing weekly status report and monthly job processed report.

Carry out Quality Assurance activities for all deliverables.

Provide Technical Guidance to the team

Worked with IMS Data to compare the productivity of the products.

Environment: Informatica PowerCenter 9, Oracle, PL/SQL, Toad, Workflow Manager, Workflow Monitor, Maestro, IWR, Erwin, Unix Shell Scripting, XML, IBM Mainframe/PC, COBOL, JCL, DB2, File Manager, VSAM, Endevor, IBM Debug Tool, IBM DB2 File Manager, SPUFI, QMF, FTP.

MOPAR ORDER PROCESSING Senior Consultant

Detroit, MI Jan 2011 to June 2011

Description: MOPAR Order Processing supports the central DB2 repository for all parts orders and is a vital part of Vehicle Order Fulfillment. In this project, we transferred stock from DAIMLER to FIAT warehouse locations based on stocking levels, excess and availability.

Responsibilities:

Responsible for writing BRDs for requirements from users and the Chrysler team; also coordinated with the offshore team on understanding the requirements and allocating work.

Generated many ad hoc reports using Easytrieve programs for business users to validate orders and to check details of back orders and superseded orders.

Coordinated with the offshore team on work allocation and detailed the business requirements on a daily basis to set targets.

Rewrite or create batch interfaces that extract data from the Part Master and Dealer tables according to various criteria and send files to numerous legacy systems.

Develop modules involved in vehicle order process. It involves coding and testing several new BMPs and DB2 programs.

Coding, Testing and Documenting the BMPs.

Setting up the JCL control cards in QA

Environment: MVS/ESA, Windows 2000, VS-COBOL II, JCL, MQ Series, IMS/DB-DC,

DB2, DCCS, SPUFI, QMF, File-Manager, DB2 CONNECT, IBM Fault Analyzer.

ORDER MANAGEMENT SERVICES Senior Software Engineer

FORD Motor Company Sep 2009 – Dec 2010

Description: OMS/SOB supports the central DB2 repository for all vehicle orders and is a vital part of Vehicle Order Fulfillment. In this project, we are rewriting or creating batch interfaces that extract data from the Single Order Bank tables according to various criteria and send files to numerous legacy systems. We are also developing modules involved in vehicle order processing/edit. It involves coding and testing several new BMPs and DB2 Stored Procedures.

Responsibilities:

Assign tasks to the team members

Provide Status report to Client on Weekly basis

Analysis and Understanding of the Spec and Provide technical guidance to the team members

Problem Analysis

Problem Solving/Fix on existing modules / functionality / programs

Enhancements to existing system based on the specification provided.

Address reported production problems

Coding, Testing and Documenting the New DB2 Stored Procedures and BMPs.

Setting up the JCL control cards in QA.

Environment: MVS/ESA, Windows 2000, VS-COBOL II, JCL, MQ Series, IMS/DB-DC,

DB2, Change man, SPUFI, QMF, File-Aid, DB2 Development Center, Expediter.

CVMS Developer

FORD Motor Company, UK Apr 2006 to Dec 2008

Description: Company Vehicle Management System is a system that tracks and records the inventory of company vehicles, beginning with the order process and ending with disposal. The system was originally created by Ford US to manage company vehicles in North America and the system was adapted to Ford of Britain. CVMS is jointly owned by the Employee Transportation Department (ETD)/Inventory Accounting. The system maintains the status of a vehicle as it progresses through its life cycle starting from Ordering of a vehicle, generation of license, title, and tax information, and recording the disposal of vehicles from information received from MIVIS (which is the remarketing interface). CVMS system interfaces with many other systems both external and internal viz, eCCO, OVID (VISTA), EUROMIVIS, EUROVAC, VIPER, AFRL, PAYROLL, MARITZ.

Responsibilities:

As a Senior Developer, involved in effort estimation, coordinating activities, adopting the CMM process across the entire software life cycle, production support, design, coding, testing and review.

Co-ordination with the client during UAT for compliance of functional requirements

Requirements gathering from the Clients

Coordinating tasks with offshore

Production Support working onsite

Application Design, coding, testing, Review, Implementation

Worked in UK and was supporting the application from UK.

Had a Direct communication with Business Users and was also handling Customer Queries.

SPOC for Vehicle Management-Accounting related queries.

Also maintained metrics data and SCRPs, and took care of disaster recovery.

Received customer appreciation.

APPLICATION MAINTENANCE / ENHANCEMENTS

Responsible for all small enhancements and for moderate or cosmetic production support code fixes, implemented in releases managed using SDM. As a one-person team, handled the complete application.

Ensured all enhancements covered the standard SDM processes of Id & Assess, Analysis, Design, Build (including testing), Implementation and transition to Support and Maintain. All relevant documentation (application-specific and SDM) was included and maintained in line with Ford corporate standards.

Responsible for the release implementation.

PRODUCTION SUPPORT ACTIVITIES

Provide production support from FITSI.

Monitor EVMS Production Batch Jobs.

Provide primary support contact of the EVMS system during these working hours.

Monitor EVMS scheduled and on-demand batch processes.

Monitor the EVMS Common email account during the production support hours.

Investigate and resolve Production Abends or related production system issues.

Provide a weekly written summary of Production abends and system outages to the EVMS lead.

Daily transition updates - discuss the daily status, open issues, and obtain daily direction from the EVMS mainframe lead.

Work on system defects.

Communicate directly with Business Users and Onsite Supervisor.

Environment: MVS/ESA, Windows 2000, VS-COBOL II, JCL, MQ Series, IMS/DB-DC,

DB2, Change man, SPUFI, QMF, File-Aid, DB2 Development Center, Expediter.

Service Warranty Deals – Package 5 Developer

NISSAN AUTOMOBILES Nov 2006 to Mar 2007

Description: Nissan North America's Service Warranty system deals with the processing of claims submitted by dealers for repairs done on Nissan vehicles. CPIA is the major system handling the business rules that affect claims processing. The project is primarily a production support project, with a few minor enhancements also in scope.

Responsibilities:

Mainly involved in Production Support

Analysis of the specifications provided by the clients

Handling Trouble Tickets

Understanding Logical flow of the Application from Business Knowledge

Estimate the effort needed to execute the project.

Implementing the changes given by client.

Documentation

Knowledge sharing

Ensuring timely and defect free delivery.

Onsite communication.

Environment: MVS/ESA, VS-COBOL II, JCL, VSAM

DB2, SPUFI, QMF, File-Aid, Panvalet, Endeavor

AIG Insurance Developer

American International Group Jan 2005 to Oct 2006

Description: American International Group, collectively called AIG Insurance, is one of the leading U.S.-based international insurance and financial services organizations. AIG comprises many modules, collectively referred to as the claims reporting system, along with all policy-related information. The project follows the onsite-offshore methodology, and its objective is to reduce cost and improve services.

Responsibilities:

Knowledge sharing

Ensuring timely and defect free delivery.

Onsite communication.

Responsible for overseeing the Quality procedures related to the project.

Environment: MVS/ESA, COBOL, JCL, VSAM

DB2, SPUFI, QMF, File-Master, Panvalet, Intertest, MAPR2

Unified Contract System (UCS) Developer

Dec 2002 to Dec 2004

Description: DL is one of Germany's largest and most successful leasing finance companies. The company provides a comprehensive range of leasing services for Information & Communication Technology assets, vehicles, machinery & industrial equipment, medical technology equipment, and energy products on behalf of thousands of business customers throughout Europe. DL belongs to the Sparkassen-Finanzgruppe, one of Europe's most successful financial services organizations.

UCS processes the offer from the UOS (Unified Offering System), creates a new contract and manages the contract throughout the contract period. UCS contains the following modules: Product Management System, Core Contract System, Service Contract System, Asset Management System, Finance & Collections, Unified Customer Model, and Account Logic Processor.

Core contract gains its significance in UCS, because it covers the contract life cycle process like Contract Registration, Contract Administration, and Contract Termination.

Responsibilities:

Finance & Collection, Contract Registration.

Analysis of the specifications provided by the clients

Active involvement in review processes for quality delivery.

Bug Fixing and making changes as per the client requirement.

Design and Development

Coding using COBOL

Testing - unit testing & integration testing

Responsible for overseeing the Quality procedures related to the project.

Environment: MVS/ESA, COBOL, JCL, VSAM

DB2, SPUFI, QMF, File-Master, Panvalet, Intertest
