
Senior Bigdata Engineer/ Lead SDET

Location:
Irvine, CA
Posted:
October 07, 2023


Priyanka Das

Senior ETL Architect/ Bigdata/ Lead SDET

Rancho Santa Margarita, CA 92688

adz7ro@r.postjobfree.com

+1-714-***-****

Objective: A dynamic ETL, DWH/BI, and Big Data with Cloud Services professional with more than 12 years of experience, seeking an opportunity in a truly global and challenging environment that offers both organizational and personal growth.

Summary:

• 12+ years of IT experience, including leading teams to deliver multi-stakeholder analytics projects.

• Experience in different tools and technologies like Informatica, Datastage, Teradata, Netezza, Unix, Oracle, PL/SQL, Autosys, Cronacle, CA Workload Automation.

• Senior level experience in full life cycle (SDLC) Data Warehousing, Data Integration, Business Intelligence as Sr. ETL/Informatica Developer, Data Warehouse Developer and Tester.

• Good knowledge of the Hadoop ecosystem and the frameworks within it – HDFS, YARN, MapReduce, Hive, Sqoop, ZooKeeper, Oozie, Kafka, NiFi, and real-time processing with Apache Spark.

• 5 years of project experience as SDET Lead in a Big Data project, covering Unit Testing, SIT, and UAT with zero defect leakage to PROD. In-depth knowledge of STLC, SDLC, Bug Life Cycle, and Testing Concepts.

• Automated test scripts using Python and Robot Framework and performed end-to-end Automation/Regression Testing of the ETL jobs. Also used Selenium with C# and the NUnit framework.

• Used Dell Boomi for its iPaaS capabilities and ETL features; Dell Boomi supports event-based, real-time, and batch ETL processing. Migrated ETL flows from Tibco XML to Boomi JSON and validated them. Performed Splunk and MLA validations.

• Experience in cloud integrations using Dell Boomi, connecting cloud-to-on-premise and cloud-to-cloud applications.

• Experience in using ServiceNow (SNOW) for incident ticketing and change management.

• Skilled in system integration testing of event-driven architecture and publish/subscribe messaging patterns using Tibco BusinessWorks, Tibco Enterprise Message Service, Dell Boomi, and Apache Kafka.

• Worked on API testing using Postman and ReadyAPI.

• Developed a Test Data Generation Tool/Framework using Python and Robot Framework and integrated it with Jenkins for one-click generation of test files. Successfully demonstrated the framework to all stakeholders.

• Expertise in using JIRA with Jenkins and GitHub for real-time bug tracking, issue management, and implementing CI/CD.

• Domain knowledge in functional areas like Customer Data Engineering, Insurance, Retail, Healthcare and Banking.

• Working closely with stakeholders and the solution architect to ensure the architecture meets business requirements. Experience in building highly scalable, robust, and fault-tolerant systems. Improving data quality, reliability, and efficiency of the individual components and the complete system.

• Setting & achieving individual as well as team goals while working in an Agile environment (regular participation in Sprint Planning and Demo, Backlog Grooming, Sprint Retro, and Daily Stand-Up calls).

• Designing, creating, and tuning physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models. Maintained the referential integrity of the database. Created proper Primary Indexes considering both planned data access and even distribution of data across all available AMPs. Also performed Collect Statistics on FACT tables for performance tuning (a minimal Teradata sketch appears at the end of this summary).

• Worked in the capacity of Business Requirements analysis, Data Mapping, Database design and designer (ETL & Reporting) of large data warehouses.

• Specializing in Design, Development and Testing of ETLs that load data from multiple sources (Oracle, Teradata, Netezza, DB2, Flat Files) into the Warehouse & Mart using Informatica, Datastage, SQL Loader; Design & Development of PL/SQL routines (Stored procedures, triggers, functions), SQL Queries, Teradata BTEQs and UNIX Shell scripts for data load & processing and testing.

• Defined the schema, staging tables, and landing tables; configured base objects and foreign-key relationships.

• Scheduling of ETL workflows; Data Modeling (Logical & Physical) – using Erwin; Optimization of ETL routines; Workflow/Session level parameterization, Implementing partitioning and concurrency concepts in workflows for performance enhancements.

• Creating ServiceNow Request/Incidents for Deployments.

• Testing and Support during Implementation of ETL Mappings.

• Handling/Maintaining Quality and Process Related Aspects of a Project; Work on Risk assessment and Mitigation and Contingency plans, Capture data for Quality of Service, Prepare the Monthly Governance Decks, WSRs, BCM Plans, Conducting Defect Prevention Meetings; Suggesting various Quality and Process improvement measures; Conducting Reviews; Maintaining Review Checklists/Issue Trackers and resolving issues with appropriate solutions.

• Worked as a single point of contact for the Testing Team for ETL related issues. Mentored members of the team. Conducted technical Knowledge Transfer sessions for the team on Informatica, Datastage, Teradata, Cronacle, Autosys, SQL, ETL, and DWH concepts.

• Good understanding of the US healthcare domain, including claims, billing, payment processing, Medicaid, Medicare, and authorization.

• Prepared Defect Summary report by using the bug tracking tools like Azure DevOps.

• Facilitating Onsite-Offshore coordination and status meeting with the team. Task allocation and collecting status updates from the team to manage smooth completion of the task in hand.

• Excellent in Performance Tuning and optimizing ETL Jobs, SQL queries, BTEQs and HQLs.

• Experience in migrating Informatica mappings/Workflows/sessions to Production Repository.

• Ability to effectively and efficiently lead a team as well as work individually, with good interpersonal, technical, and communication skills.

• Excellent analytical, organizational, and problem-solving skills. Ability to focus and manage time and priorities. Excellent communication skills; proactive, self-starter, good team player, and enthusiastic.

Authorized to work in the US for any employer.
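
To illustrate the primary-index and Collect Statistics work described in the summary above, here is a minimal sketch using the Teradata teradatasql Python driver. The host, credentials, database, table, and column names are placeholders for illustration only, not actual project objects.

# Minimal sketch: create a Teradata table with a deliberately chosen Primary Index
# and collect statistics on it. All object names, host, and credentials are
# illustrative placeholders.
import teradatasql

# order_id is assumed to be a high-cardinality column, so rows hash evenly across AMPs.
DDL = """
CREATE TABLE sales_mart.fact_orders (
    order_id    BIGINT NOT NULL,
    customer_id BIGINT NOT NULL,
    order_date  DATE,
    order_amt   DECIMAL(18,2)
)
PRIMARY INDEX (order_id)
"""

# Statistics on the PI and a common filter column help the optimizer choose good plans.
STATS = "COLLECT STATISTICS COLUMN (order_id), COLUMN (order_date) ON sales_mart.fact_orders"

def main():
    with teradatasql.connect(host="td-host", user="etl_user", password="***") as con:
        cur = con.cursor()
        cur.execute(DDL)
        cur.execute(STATS)
        print("fact_orders created and statistics collected")

if __name__ == "__main__":
    main()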

Work Experience

Lead SDET/ Bigdata Architect

Vaco - Hyderabad, Telangana

November 2021 to July 2023

Client: CareSource

Project: ESB Core Project

Position: Lead SDET / Bigdata Architect

Description: Enterprise Service Bus (ESB) is a middleware platform which provides a secured, scalable and cost-effective infrastructure that enables real-time data exchange among many systems. An ESB may connect with services, processes, applications, internal systems, data stores, or analytical systems to facilitate the flow of data.

Responsibilities:

• As SDET Lead, worked in a Big Data project covering Unit Testing, SIT, and UAT with zero defect leakage to PROD. In-depth knowledge of STLC, SDLC, Bug Life Cycle, and Testing Concepts.

• Automated test scripts using Selenium with C# and the NUnit framework, and performed end-to-end Automation/Regression Testing of the ETL jobs.

• Used Dell Boomi for its iPaaS capabilities and ETL features; Dell Boomi supports event-based, real-time, and batch ETL processing. Migrated ETL flows from Tibco XML to Boomi JSON and validated them. Performed Splunk and MLA validations.

• Creating incident tickets and change management requests using ServiceNow (SNOW).

• Validating HL7 Inbound and outbound interfaces through Boomi, TIBCO.

• Testing various HIE (Health Information Exchange), Clinical files using TIBCO, Boomi.

• Worked on migration project from TIBCO to Boomi cloud.

• Involved in reading Kafka messages sourced from the flat files (a consumer sketch appears at the end of this section).

• Involved in testing REST and SOAP services using ReadyAPI, Postman, and SoapUI, and tested both XML and JSON formats.

• Ability to read various system input/output files (XML, JSON, flat files, 834, 835, etc.) using the STM (Source-to-Target Mapping).

• Involved in writing SQL queries to validate migrated data during backend testing (see the validation sketch at the end of this section).

• Using MS SQL Server Management Studio to create and execute simple to complex SQL queries for testing data loaded in tables.

• Developing test scenarios, preparing test cases and test scripts, test execution, defect management, and daily status reporting.

• Prepared Test Automation Plan, Estimations and reviewed Test Scenarios to be covered in the script.

• Developed a Test Data Generation Tool/Framework using Python and Robot Framework and integrated it with Jenkins for one-click generation of test files. Successfully demonstrated the framework to all stakeholders.

• Used GitHub and Jenkins for implementing CI/CD.

• Writing Test Plan/ Strategy document, Test Cases and doing Test Data Setup.

• In the ETL track, worked on Manual and Automation Testing, Backend Testing, Smoke Testing, Integration Testing, Functional Testing, Load Testing, Regression Testing, User Acceptance Testing and End-to-End Testing.

• Followed Defect life Cycle to open and track bugs in TFS, HP ALM and Azure DevOps.

• Domain knowledge in functional areas like Customer Data Engineering, Insurance, and Healthcare.

• Working closely with stakeholders and the solution architect to ensure the architecture meets business requirements. Experience in building highly scalable, robust, and fault-tolerant systems. Improving data quality, reliability, and efficiency of the individual components and the complete system.

• Setting & achieving individual as well as team goals while working in an Agile environment (regular participation in Sprint Planning and Demo, Backlog Grooming, Sprint Retro, and Daily Stand-Up calls).

• Designing, creating, and tuning physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models. Maintained the referential integrity of the database. Also performed Collect Statistics on FACT tables for performance tuning.

• Worked in the capacity of Business Requirements analysis, Data Mapping, Database design and designer (ETL & Reporting) of large data warehouses.

• Specializing in Design, Development and Testing of ETLs that load data from multiple sources into the Warehouse & Mart using Informatica; Design & Development of PL/SQL routines (Stored procedures, triggers, functions), SQL Queries and UNIX Shell scripts for data load & processing and testing.

• Defined the schema, staging tables, and landing tables; configured base objects and foreign-key relationships.

• Provided Project Implementation Support at the time of successful go-live.

• Provided Warranty Support and provided the Complete Project KT to Production Support Team to hand over the project successfully.

• Facilitating Onsite-Offshore coordination and status meeting with the team. Task allocation and collecting status updates from the team to manage smooth completion of the task in hand.

• Performance Tuning and optimizing ETL Jobs, SQL queries, C# scripts and Test Cases/Strategies.

• Trained the QA team on how to execute test automation scripts and produce reports.

Environment Used: Dell Boomi, Tibco GEMS tool, SQL Server 2018, C# with Selenium using NUnit Framework, Postman, ReadyAPI, HP ALM, ServiceNow, Splunk, GitHub, Jenkins, Kafka, Confluent, Tibco EMS BW, FACETS, Selenium WebDriver, TFS, Quality Center, Azure DevOps, AXWAY, Visual Studio, UltraEdit, Notepad++.
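
As referenced in the SQL validation bullets above, below is a minimal sketch of how such backend checks can be scripted in Python with pyodbc against SQL Server. The connection strings and source-to-target table pairs are hypothetical examples, not the project's actual STM.

# Minimal backend-validation sketch: compare row counts between staging (source)
# and target tables after an ETL load. Connection strings and table pairs are
# illustrative placeholders.
import pyodbc

SRC_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=src-host;DATABASE=staging;Trusted_Connection=yes"
TGT_CONN = "DRIVER={ODBC Driver 17 for SQL Server};SERVER=tgt-host;DATABASE=mart;Trusted_Connection=yes"

# (source table, target table) pairs taken from a source-to-target mapping.
CHECKS = [
    ("dbo.stg_member", "dbo.dim_member"),
    ("dbo.stg_claim", "dbo.fact_claim"),
]

def row_count(conn_str: str, table: str) -> int:
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def main() -> None:
    for src, tgt in CHECKS:
        src_rows, tgt_rows = row_count(SRC_CONN, src), row_count(TGT_CONN, tgt)
        status = "PASS" if src_rows == tgt_rows else "FAIL"
        print(f"{status}: {src} ({src_rows}) -> {tgt} ({tgt_rows})")

if __name__ == "__main__":
    main()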
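
The Kafka bullet above points to the consumer sketch below, which uses the kafka-python client to read JSON messages from a topic and check for required fields. The broker address, topic, and field names are assumptions made purely for illustration.

# Minimal sketch: consume JSON messages from a Kafka topic and verify that each
# record carries the fields expected by the downstream ETL (kafka-python client).
import json
from kafka import KafkaConsumer

REQUIRED_FIELDS = {"member_id", "event_type", "timestamp"}  # assumed field names

consumer = KafkaConsumer(
    "member-events",                       # hypothetical topic
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=10_000,            # stop iterating when no new messages arrive
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

bad = 0
for message in consumer:
    missing = REQUIRED_FIELDS - message.value.keys()
    if missing:
        bad += 1
        print(f"offset {message.offset}: missing fields {sorted(missing)}")

print(f"validation finished, {bad} bad message(s)")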

Lead SDET / Bigdata Architect

Cognizant Technology Solutions - Kolkata, West Bengal

January 2019 to November 2021

Client: Manulife Financials

Project: DMO and SMDM Project

Position: Lead SDET / Bigdata Architect

Description: This Client has expertise in supporting medium to large sized projects that require data movement or that impact their existing applications and processes (DMO - Distributed Middleware Office, C360 - Customer 360, SMDM - Semantic Master Data Mart).

Responsibilities:

• As SDET Lead, worked in a Big Data project covering Unit Testing, SIT, and UAT with zero defect leakage to PROD. In-depth knowledge of STLC, SDLC, Bug Life Cycle, and Testing Concepts.

• Automated the test scripts using Python and Robot Framework and performed end to end Automation/ Regression Testing of the ETL jobs. Used CA Workload Automation and CA DevTest for the same earlier.

• Prepared Test Automation Plan, Estimations and reviewed Test Scenarios to be covered in the script.

• Developed a Test Data Generation Tool/Framework using Python and Robot Framework and integrated it with Jenkins for one-click generation of test files (a sketch appears at the end of this section). Successfully demonstrated the framework to all stakeholders.

• Expertise in using JIRA with Jenkins and GitHub for real-time bug tracking, issue management, and implementing CI/CD.

• Writing Test Plan/ Strategy document, Test Cases and doing Test Data Setup.

• In the ETL track, worked on Manual and Automation Testing, Backend Testing, Smoke Testing, Integration Testing, Functional Testing, Load Testing, Regression Testing, User Acceptance Testing and End-to-End Testing.

• Followed Defect life Cycle to open and track bugs in JIRA and XRAY.

• Domain knowledge in functional areas like Customer Data Engineering, Insurance, and Healthcare.

• Working closely with stakeholders and the solution architect to ensure the architecture meets business requirements. Experience in building highly scalable, robust, and fault-tolerant systems. Improving data quality, reliability, and efficiency of the individual components and the complete system.

• Setting & achieving individual as well as team goals while working in an Agile environment (regular participation in Sprint Planning and Demo, Backlog Grooming, Sprint Retro, and Daily Stand-Up calls).

• Designing, creating, and tuning physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models. Maintained the referential integrity of the database. Also performed Collect Statistics on FACT tables for performance tuning.

• Worked in the capacity of Business Requirements analysis, Data Mapping, Database design and designer (ETL & Reporting) of large data warehouses.

• Specializing in Design, Development and Testing of ETLs that load data from multiple sources into the Warehouse & Mart using Informatica; Design & Development of PL/SQL routines (Stored procedures, triggers, functions), SQL Queries and UNIX Shell scripts for data load & processing and testing.

• Defined the schema, staging tables, and landing tables; configured base objects and foreign-key relationships.

• Provided Project Implementation Support at the time of successful go-live.

• Provided Warranty Support and provided the Complete Project KT to Production Support Team to hand over the project successfully.

• Facilitating Onsite-Offshore coordination and status meeting with the team. Task allocation and collecting status updates from the team to manage smooth completion of the task in hand.

• Performance Tuning and optimizing ETL Jobs, SQL queries, Python scripts and Test Cases/Strategies.

• Trained the QA team on how to execute test automation scripts and produce reports.

Environment Used: Informatica 10.1.1, SQL Server 2018, Python 3.7, Robot Framework, CA Workload Automation, CA DevTest, PyCharm, Kafka, NiFi, Sqoop, Hive, Oozie 3.0, JIRA, XRAY, Confluence, Slack, Jenkins with Docker, GitLab 13, GitHub 2
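
As a rough illustration of the Test Data Generation framework mentioned above, the sketch below is a small Python keyword library that Robot Framework can import and that a Jenkins job could invoke for one-click file generation. The file layout, column names, and sample values are hypothetical, not the actual project format.

# testdata_generator.py - minimal sketch of a test-data generation keyword library.
# Robot Framework can load it with:  Library    testdata_generator.TestDataGenerator
# and call the keyword:              Generate Member File    member_extract.dat    rows=50
# Column names, delimiter, and values are illustrative assumptions.
import csv
import random
import uuid
from datetime import date, timedelta


class TestDataGenerator:
    """Generates pipe-delimited test extracts with synthetic records."""

    def generate_member_file(self, path: str, rows: int = 100) -> str:
        start = date(1950, 1, 1)
        with open(path, "w", newline="") as fh:
            writer = csv.writer(fh, delimiter="|")
            writer.writerow(["member_id", "first_name", "dob", "plan_code"])
            for _ in range(int(rows)):  # int() so Robot can pass rows as a string
                writer.writerow([
                    uuid.uuid4().hex[:10],
                    random.choice(["ALEX", "SAM", "JORDAN", "PRIYA"]),
                    (start + timedelta(days=random.randint(0, 25_000))).isoformat(),
                    random.choice(["GOLD", "SILVER", "BRONZE"]),
                ])
        return path


if __name__ == "__main__":
    # One-click entry point a Jenkins job could call directly.
    print(TestDataGenerator().generate_member_file("member_extract.dat", rows=10))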

Team Lead/Lead Developer – ETL/Semantic

Wells Fargo Home Mortgage - Fort Mill, SC

February 2014 to November 2018

Client: Wells Fargo Home Mortgage

Project: Semantic Systems Data Mart Project

Position: Team Lead/Lead Developer – ETL/Semantic (Teradata Views and Macros / Teradata BTEQs / Unix Scripts / Crontab Scheduler)

Description: As part of Functional/Technical delivery, the service includes activities like: development of Teradata Views and Macros, Teradata BTEQs, Unix Scripts, and Crontab scheduling of jobs and loads. Handling MSP SOURCING, SSD, CDS, SASGRID MIGRATION, and BUILD TRACKING TABLE. Regression testing support. Performance tuning of ETL codes and reporting queries. All BI milestone related documentation. Occasional production support.

Responsibilities:

• Involved in Requirement gathering, business Analysis, Design and Development, testing and implementation of business rules.

• Performed bulk data loads from multiple data sources (Oracle, legacy systems) to the Teradata RDBMS using BTEQ, MultiLoad, TPT, and FastLoad.

• Created, optimized, reviewed, and executed Teradata SQL test queries to validate transformation rules used in source to target mappings/source views, and to verify data in target tables

• Performed tuning and optimization of complex SQL queries using Teradata Explain.

• Responsible for Collect Statistics on FACT tables.

• Performance Tuning of sources, Targets, mappings and SQL queries in transformations

• Designing, creating and tuning physical database objects (tables, views, indexes, PPI, UPI, NUPI, and USI) to support normalized and dimensional models. Maintain the referential integrity of the database.

• Created proper Primary Indexes considering both planned data access and even distribution of data across all available AMPs.

• Built a tracking table to monitor load status, loading steps, and the severity of errors.

• Helped BI Capabilities manage the complex inter-operations between shops and manage the cross functional demands with JIRA.

• Updating SLA reporting, customizable team queues/dashboards, real-time reporting and ownership notification in JIRA 5.2

• Involved in building Data marts, data structures, data storages, data warehouses, data archives, and data analysis.

• Wrote numerous BTEQ scripts to run complex queries on the Teradata database.

• Used volatile tables and derived queries to break up complex queries into simpler ones.

• Created a cleanup process for removing all the Intermediate temp files that were used prior to the loading process.

• Defined the schema, staging tables, and landing tables; configured base objects and foreign-key relationships.

• Writing Teradata SQL queries for joins and table modifications.

• Involved in SASGRID migration, including configuring queues/policies in the SASGRID environment, usage tracking, monitoring, alert notifications, and load balancing.

• Involved in Teradata 14 migration; streamlined Teradata scripts, shell scripts, and DDLs to accommodate Teradata 14 reserved keywords.

• Developed UNIX shell scripts to run batch jobs in production.

• Involved in analysis of end user requirements and business rules based on given documentation and working closely with tech leads and analysts in understanding the current system.

• Developed Teradata views and macros to accomplish the ETL work.

Environment Used: Teradata 15, Teradata SQL Assistant, Teradata Manager, Teradata Viewpoint, BTEQ, MLOAD, FLOAD, FASTEXPORT, Erwin Designer, UNIX, Korn Shell scripts.

Lead Developer - ETL (Informatica / Teradata BTEQs / Cronacle Objects)

Tata Consultancy Services (TCS) - Kolkata, West Bengal

September 2010 to October 2011

Client: GE Healthcare

Project: Life Sciences General Ledger Visibility Project

Position: Lead Developer - ETL (Informatica / Teradata BTEQs / Cronacle Objects)

Description: As part of Functional/Technical delivery, the service includes activities like: Development of Teradata BTEQs, Unix Scripts and Cronacle objects (Job Chains, Calling Scripts, Event Waits, and Locks). Enhancement of existing Informatica Mappings and BO reports. Support for existing LS GL functionality. Regression testing support. Performance tuning of ETL codes and Reporting queries. All BI milestone related documentation. Knowledge transfers to the RTS team.

Responsibilities:

• Understanding of the business requirements and high-level design of the systematic flow of the mappings and database tables. Created the DLD document from the business requirements and implemented it in the design. Performed data mapping as per the requirements.

• Detail level design and data flows creation for loading from Oracle, Teradata tables and flat files to staging database as per business logic and handling error scenarios for the same.

• Changes in design/coding with the frequent changes in business requirements.

• Data from Oracle, Teradata tables, and flat files were segregated, fine-tuned, and transformed using Informatica mappings.

• Determination/Coding/Implementation of UNIX shell scripts for mapping execution and data validation and test automation.

• Analyzed and developed new codes to suggest new functionality and to support reusability.

• Development of Teradata BTEQs to Insert, Update, Delete the Target table data.

• Used Redwood Cronacle as the Scheduling tool. Developed the Cronacle Job Chains, Calling Scripts, Event Waits, and Locks. Scheduled them at proper time and window to run the Informatica mappings and Teradata BTEQs.

• Developing unit test and integration test plan and test cases for different business scenarios.

• End to End Software development including coding of mappings, creating sessions and workflows, unit testing, system testing, regression testing and integration testing as well as analysis of errors occurred.

• Unit test planning and execution and UAT testing coordination. Zero defects at UAT.

• UNIX scripting for test automation.

• Detailed design of performance testing plan and resolving and closing the Change Requests as and when raised.

• Handling/Maintaining Quality and Process Related Aspects of the Project to attain CMM Level 5 for the Project.

• Prepared the Project related documents like: KM Entitlement Plan, BCP, RTO Calculation Doc, UPP Doc, Induction Manual, Inventory List, Project Deployment Doc, Performance Matrix for the Teradata BTEQs etc.

• Submitted Case Study Documents for upload in K-Bank.

• Provided Project Implementation Support at the time of successful go-live.

• Provided Warranty Support and provided the Complete Project KT to RTS (Production Support) Team to hand over the project successfully.

Environment Used: Informatica 8.6.1, Teradata, UNIX, Redwood Cronacle, SVN

Sr. ETL Developer and Module Lead

Infosys - Bhubaneshwar, Orissa

November 2009 to September 2010

Client: Northwestern Mutual Life Insurance

Project: Field Financial Systems & Form 5500 Reporting Project

Position: Sr. ETL Developer and Module Lead

Description: The Field Financial System effectively evaluates the expansion of the field force and assures the financial health of Network Offices and District Network Offices. The primary objective of FFS is to deliver the required and necessary functionality that the current systems (GABS and QuickBooks) lack. The project scope includes:

• Replacing GABS within the Network Offices and ensure the new system meets general accounting standards and new requirements.

• Replacing QuickBooks by integrating the District Network Offices to align with Network Offices and manage their businesses.

The Form 5500 Schedule C Required Reporting Project involved development and modification of ETL mappings to produce Schedule C reports, which are mailed to plans with 80 or more participants. The DOL (Department of Labor) requires that Form 5500 Schedule C be completed for all DB and DC plans covered by ERISA with 100 or more participants where the service provider received $5,000 or more in compensation at the plan level.

Responsibilities:

• Understanding of the business requirements and High level Design of systematic flow of the mapping and database tables. Created DLD document from the business requirements and implemented that in the design.

• Detail level design and data flows creation for loading from DB2 tables and flat files to staging database as per business logic and handling error scenarios for the same.

• Changes in design/coding with the frequent changes in business requirements.

• Data from DB2 tables and flat files are segregated, fine-tuned and transformed using Informatica features like Filtration, Aggregation, Joiner, Sorter, Expression, Connected and Unconnected Lookups on DB2 tables, Router, Mapplets, Update Strategy and Source Qualifier.

• Developed Datastage Job Sequences/routines to transform the data as per the requirement.

• Data Processing experience in designing and implementing Data Mart applications, mainly transformation processes using ETL tool DataStage (Ver8.2/7), designing and developing jobs using DataStage Designer, Data Stage Manager, DataStage Director and DataStage Debugger.

• Experience in Mapping Server/parallel Jobs in DataStage to populate tables in Data warehouse and Data marts.

• Used DataStage stages namely Hash file, Sequential file, Transformer, Aggregate, Sort, Datasets, Join, Lookup, Change Capture, Funnel, Peek, Row Generator stages in accomplishing the ETL Coding.

• Developed job sequencer with proper job dependencies, job control stages, triggers.

• Determination/Implementation of reusable transformation components and Mapplets in different downloads and fine tuning of transformations to reduce the coding/testing effort for mapping; and UNIX scripts for mapping execution and data validation.

• Performance tuning of PLSQL/SQLs used in the mapping and reducing the execution time.

• Developing unit and integration test plan and test cases for different business scenarios.

• End to End Software development including coding of mappings, creating sessions and workflows, unit testing, system testing, regression testing and integration testing as well as analysis of errors occurred.

• Unit test planning and execution and UAT testing coordination.

• Detailed design of performance testing plan and automation for generation of data for performance testing.

• Scheduling ETL Jobs using Autosys.

• Handling/Maintaining Quality and Process Related Aspects of the Project to attain CMM Level 5 for the Project.

Environment Used: Informatica 8.1.1, Datastage 8.2, DB2, UNIX, Autosys

ETL Developer and Module Lead

Infosys - Pune, Maharashtra

December 2008 to October 2009

Client: Bank of America

Project: GTS Merchant Database and Reporting Project

Position: ETL Developer and Module Lead

Description: This project involves design, development, and testing of scripts and Informatica mappings. As part of the development, the ETL process loads data from the source files to the Staging Area, which is logically separate from the target database. The staging area is in a Netezza database and has table structures that are replicas of the source files. While loading data from source to staging, basic validation checks are done on the source files. Based on this loaded data, the reports are developed and delivered to the client.

Responsibilities:

• Designed & Developed LLD Document, ETL mappings that load data from source files (UNIX, Mainframes) using Informatica and PL/SQL routines (Stored procedures, triggers, functions) and UNIX shell scripts for data load & processing. Optimization of ETL routines.

• Analysis of data files to incorporate the appropriate changes in the source definitions in order to read data correctly.

• Testing and Support during Implementation of ETL Mappings. Loaded data in Production environment using the mappings and scripts.

Environment Used: Informatica 8.6, Netezza 4.5 Database Appliance, UNIX

ETL Developer and Module Lead

Infosys - Bhubaneshwar, Orissa

October 2007 to November 2008

Client: PETSMART

Project: Informatica / Netezza Migration Project

Position: ETL Developer and Module Lead

Description: This project involves migration, development, and testing of a large number of scripts and Informatica mappings in a short duration of time. Migration of the DW environment to a more scalable DW appliance-based infrastructure. As part of this migration, the client planned to retire its Oracle-based DW environment and migrate/convert all the existing data/structures in Oracle (an Oracle footprint may be retained to provide for functionality that does not exist in Netezza) and the ETLs to a Netezza-based DW environment.

Responsibilities:

• Designed & Developed PL/SQL routines (Stored procedures, triggers, functions) and UNIX shell scripts and ETL mappings that load data from multiple sources (Oracle, Db2, Flat Files) using Informatica, SQL Loader.

• Testing and Support during Implementation of ETL Mappings.

Environment Used: Informatica 8.5.1, Netezza 4.5 Database Appliance, Oracle 9i, UNIX, SQL Server

Education

B.Tech in Information Technology

West Bengal University of Technology - Kalyani, West Bengal

M.S. in Software Engineering

International Technological University - San Jose, CA

Skills

• DATABASE (10+ years)

• ETL (10+ years)

• SQL (9 years)

• UNIX (9 years)

• XML (2 years)

• APIs (2 years)

• Kafka (5 years)

• Dell Boomi (2 years)

• Python (5 years)

• C# (2 years)

• Agile (5 years)

• Git (5 years)

• Jenkins (3 years)

• Azure (5 years)

Additional Information

Certifications:

• Informatica® Certified Developer

Exam R: PowerCenter 8 Architecture and Administration

Exam U: PowerCenter 8 Advanced Mapping Design

Exam S: PowerCenter 8 Mapping Design

• Informatica® Power Center 7 Mapping Design

Achievements:

• Received Award for exceptional performance in the CareSource ESB Project 2022.

• Received WOW Award for Excellent Delivery 2020, in Manulife Project – Always Striving Never Settling.

• Received multiple client appreciations on the TDG Tool, Automation of the ETL scripts and multiple successful/defect-free Production Implementations in 2019, 2020 and 2021.

• Prepared the Project related documents like: KM Entitlement Plan, BCP, RTO Calculation Doc, UPP Doc, Induction Manual, Inventory List, Project Deployment Doc, Performance Matrix for the Teradata BTEQs, Defect Tracker, Test Case Document, Test Reports etc.

• Submitted Case Study Documents for upload in Knowledge-Bank. Provided Knowledge Transfer sessions to the team members.

• Provided Project Implementation Support at the time of successful project go-live.

• Provided Warranty Support and provided the Complete Project KT


