
Data Manager

Location:
Pune, Maharashtra, India
Posted:
June 29, 2020


Resume:

Ankur Nayak

COVER LETTER

Over * years of IT experience in ETL application analysis, design, development, and implementation of Database Systems, Data Warehousing (DWH), Dimensional Data Modelling, and Data Integration for clients such as NORTHWESTERN MUTUAL, BANK OF AMERICA, GE, AT&T, TATA COMMUNICATIONS, and WELLS FARGO in organizations GENERAL ELECTRIC (GE) HEALTHCARE, INFOSYS LIMITED, TATA CONSULTANCY SERVICES (TCS), and TECH MAHINDRA, and currently with PERSISTENT SYSTEMS as Lead Project Engineer.

add7bs@r.postjobfree.com | 095******** | Hinjewadi, Pune, India

WORK EXPERIENCE

02/2017 – Present

Lead Project Engineer

Persistent Systems

Pune

03/2015 – 02/2017

ETL Consultant

Tech Mahindra

Pune

12/2013 – 03/2015

Senior ETL Informatica Developer

Tata Consultancy Services

Pune

03/2011 – 12/2013

Software Engineer

Infosys Limited

Pune, Bhubaneswar, Mysore

11/2010 – 03/2011

Trainee

General Electric

Bangalore

CERTIFICATES

RPA Advanced Developer Level 3 (03/2020 – Present)

UiPath

EDUCATION

07/2011 – 06/2013

Master of Business Administration

Annamalai University

Chennai

Information Systems

07/2006 – 06/2010

Bachelor of Engineering

Technocrats Institute of Technology

Bhopal


HONOR AWARDS

PAT ON BACK

Infosys Limited

INSTA AWARD

Tata Consultancy Services

LANGUAGES

English

Full Professional Proficiency

Hindi

Full Professional Proficiency


ANKUR NAYAK

add7bs@r.postjobfree.com | 09511866315 | https://www.linkedin.com/in/ankurnayak18/

PROFESSIONAL SUMMARY

● Over 9 years of IT experience in ETL application analysis, design, development, and implementation of Database Systems, Data Warehousing (DWH), Dimensional Data Modelling, Data Analysis, Data Integration, Reporting, Business Intelligence (BI), code testing strategies and plans, technical troubleshooting, debugging, ETL scheduling and scripting, and deployment and migration of code; along with technical analysis for complex projects, testing and Root Cause Analysis (RCA)/diagnosis, client communications and presentations, solution design, analyzing data and making it visible, collaboration on strategic consulting and decision support, cross-functional coordination, project scoping, budgeting, goal setting, and development of applications in web-based, distributed n-tier and client/server environments; for clients such as NORTHWESTERN MUTUAL, BANK OF AMERICA, GE, AT&T, TATA COMMUNICATIONS, and WELLS FARGO in organizations GENERAL ELECTRIC (GE) HEALTHCARE, INFOSYS LIMITED, TATA CONSULTANCY SERVICES (TCS), and TECH MAHINDRA, and currently with PERSISTENT SYSTEMS as Lead Project Engineer.

● Strong Data Warehousing ETL experience using Informatica 9.1/8.6.1/8.5.

● Experience in all the phases of the Data warehouse lifecycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.

● Experience in creating High-Level Design and Detailed Design in the Design phase.

● Strong experience in the analysis, design, development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI and Client/Server applications.

● Worked with PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).

● Expertise in Data Warehouse/Data Mart, ODS, OLTP and OLAP implementations, teamed with project scoping, analysis, requirements gathering, data modelling, effort estimation, ETL design, development, system testing, implementation and production support.

● Strong experience in Dimensional Modelling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modelling using Erwin and ER-Studio.

● Expertise in working with relational databases such as Oracle 12c/11g/10g, SQL Server 2008, and Teradata.

● Strong experience in Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor) as the ETL tool on Oracle and SQL Server databases.

● Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.

● Extensive experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.

● Proficient in the integration of various data sources, with multiple relational databases like Oracle 12c/11g/10g, MS SQL Server, Teradata, and flat files, into the staging area, ODS, Data Warehouse and Data Mart.

● Experience in using Automation Scheduling tools like Autosys.

● Worked extensively with slowly changing dimensions.

● Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.

EXPERIENCE

LEAD PROJECT ENGINEER / TEAM LEAD
Persistent Systems / Pune / February 2017 - Current

Project 1: iHUB Development, Maintenance, and Migration from Informatica to Ab Initio

Description: Wells Fargo has several SORs (Systems of Record) with many interdependencies among them. This creates a complex web that must be transformed into a relatively optimized information hub that interconnects those SORs in a simpler manner. We worked on about 20+ Credit, 40+ Deposit and 25+ Treasury SORs. The project follows the OFSAA model to provide appropriate information to the end user; however, since we sit in the middle layer, we offer the flexibility to customize/abstract information for downstream users like E-one, WDM etc. based on business needs. We get data from the SORs, which is processed through a validation and conformance process performed in the OPERATIONAL DATA STORE (ODS) and then processed via the OFSAA model. iHub is an RDBMS repository of source system data which works as a provisioning layer for downstream systems, and it follows the industry-standard data model called OFSAA. Each System of Record (SOR) is integrated into iHub by the Informatica framework, which was developed using Informatica with third-party APIs utilizing Java and PL/SQL coding. The Informatica ETL framework enables binding the downstream functional areas into a group and scheduling the batch for a functional area as a single group for processing; with the framework JAR file, building one-to-one mappings becomes very easy. As per business requirements, we were migrating the Informatica code to Ab Initio with a similar implementation.

Responsibilities:

● Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.

● Parsed high-level design specification to simple ETL coding and mapping standards.

● Designed and customized data models for the Data Warehouse, supporting data from multiple sources in real time.

● Involved in building the ETL architecture and Source-to-Target mappings to load data into the Data Warehouse.

● Created mapping documents to outline data flow from sources to targets.

● Involved in dimensional modelling (Star Schema) of the Data Warehouse and used Erwin to design the business process, dimensions and measured facts.

● Extracted data from flat files and other RDBMS databases into the staging area and populated the Data Warehouse.

● Maintained stored definitions, transformation rules and target definitions using Informatica Repository Manager.

● Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.

● Developed mapping parameters and variables to support SQL override.
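
To make the mechanism concrete, here is a minimal sketch of how such a parameter file can be produced and consumed; the folder, workflow, session and parameter names are hypothetical, not taken from the project:

    # Hypothetical sketch: write an Informatica parameter file from a shell
    # wrapper; the $$ variables can then be referenced inside a Source
    # Qualifier SQL override (all names below are illustrative).
    cat > /apps/etl/params/wf_daily_load.param <<'EOF'
    [DWH_Folder.WF:wf_daily_load.ST:s_m_load_orders]
    $$LOAD_DATE=2020-06-25
    $$SRC_SCHEMA=STG
    EOF
    # Example SQL override using the parameter:
    #   SELECT ... FROM STG.orders
    #   WHERE load_dt = TO_DATE('$$LOAD_DATE','YYYY-MM-DD')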

● Created mapplets to use them in different mappings.

● Developed mappings to load into staging tables and then to Dimensions and Facts.

● Used existing ETL standards to develop these mappings.

● Worked on different tasks in Workflows like sessions, events raise, event wait, decision, e-mail, command, worklets, Assignment, Timer and scheduling of the workflow.

● Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.

● Used Type 1 SCD and Type 2 SCD mappings to update Slowly Changing Dimension tables.

● Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
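
As an illustration of this kind of flat-file load, a minimal SQL*Loader sketch; the table, file and account names are hypothetical:

    # Hypothetical sketch: load a comma-delimited flat file into an Oracle
    # staging table with SQL*Loader.
    cat > load_stg_customer.ctl <<'EOF'
    LOAD DATA
    INFILE '/apps/etl/srcfiles/customer_20200625.csv'
    APPEND INTO TABLE stg_customer
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (cust_id, cust_name, city)
    EOF
    sqlldr userid=stg_user/"$STG_PWD" control=load_stg_customer.ctl \
           log=load_stg_customer.log bad=load_stg_customer.bad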

● Modified existing mappings for enhancements of new business requirements.

● Used Debugger to test the mappings and fixed the bugs.

● Wrote UNIX shell scripts and PMCMD commands for FTP of files from a remote server and for backup of the repository and folders.
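
A minimal sketch of the kind of script this bullet describes, combining SFTP file transfer with a pmcmd workflow trigger; the hosts, credentials and workflow names are hypothetical:

    #!/bin/sh
    # Hypothetical sketch: pull a source file via SFTP, then start an
    # Informatica workflow with pmcmd and wait for it to finish.
    sftp etl_user@remote.host <<'EOF'
    get /outbound/orders_20200625.dat /apps/etl/srcfiles/
    bye
    EOF

    pmcmd startworkflow -sv INT_SVC -d DOM_DEV \
          -u "$INFA_USER" -p "$INFA_PWD" -f DWH_Folder \
          -paramfile /apps/etl/params/wf_daily_load.param \
          -wait wf_daily_load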

● Involved in Performance tuning at source, target, mappings, sessions, and system levels.

● Prepared migration documents to move the mappings from development to testing and then to production repositories.

● Used data lineage to trace the sources of specific business data, which enables teams to track errors and apply steadier data governance protocols.

● Development in Ab Initio: designed and developed Ab Initio graphs for loading into the DWH. Involved in creating Ab Initio DMLs and writing XFRs to implement the business-logic transformations. Translated the logical data model in Erwin into a physical database design, database construction, design optimization and functional testing. Performed reverse engineering on existing graphs and stored procedures to understand their present functionality and, in turn, where specific enhancements are to be applied. Implemented data parallelism through graphs that divide data into segments and operate on each segment simultaneously, using the Ab Initio partition components to segment data. Developed parameterized graphs using formal parameters. Used various Ab Initio graph components like Aggregator, Match Sorted, Join, Denormalize Sorted, Reformat, Rollup, Scan, Dedup, Lookup, Partition by Key, Round Robin, Gather, Merge, Filter-by-Expression etc. Worked with departition components like Concatenate, Gather and Interleave.


● Scheduled the processes using Autosys; created Autosys job streams to schedule jobs by creating box jobs and JIL templates.
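
For illustration, a minimal JIL sketch of a box job with one command job inside it, submitted through the jil utility; the job, owner and machine names are hypothetical:

    # Hypothetical sketch: define an Autosys box job and a command job.
    jil <<'EOF'
    insert_job: BX_IHUB_DAILY    job_type: BOX
    owner: etluser
    date_conditions: 1
    days_of_week: all
    start_times: "02:00"

    insert_job: CMD_IHUB_STAGE   job_type: CMD
    box_name: BX_IHUB_DAILY
    machine: etlhost01
    command: /apps/etl/scripts/run_wf.sh wf_stage_load
    EOF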

Environment: Informatica PowerCenter 10.x, Workflow Manager, Workflow Monitor, PL/SQL, Oracle 12c, Erwin, Autosys, SQL Server, Teradata, GIT, Ab Initio GDE 3.x

Project 2: Development and Enhancement

Description: The basic architecture is to process source system data with Informatica ETL mappings into staging tables in the staging DB on Teradata; the mappings handle all the data transformations. The staging tables are then processed with a BTEQ SQL script into the final target table, and the BTEQ manages the record tracking (i.e. SCD Type 2 or 1) for the target table. Once the data is in the target production table, various BI tools including BO / Power BI / Necto can access it. Therefore, the primary objectives of the project are performance tuning of the existing ETL design at the Informatica and BTEQ script levels, and implementing changes and new development as per business requirements.

Responsibilities:

● Single point of contact for all ETL-related activities, understanding the current design and implementing ETL changes accordingly.

● Responsible for ETL technical discussions and code-review calls with the Architect, and for deployment of the ETL code in all required environments before production.

● File extract generation, data quality implementation in some Informatica jobs as per business logic.

● The main objective was to bring rigour and quality focus to the ETL implementation so that a quality architecture and quality infrastructure were delivered.

● Created shortcuts for reusable source/target definitions, reusable transformations and mapplets in the Shared folder.

● Involved in performance tuning and optimization of mapping to manage a very large volume of data.

● Prepared technical design/specifications for data Extraction, Transformation and Loading.

● Worked on the Informatica utilities Source Analyzer, Mapping Designer, Mapplet Designer and Transformation Developer.

● Created mappings using various transformations like Joiner, Filter, Aggregator, Lookup, Router, Sorter, Expression, Normalizer, Sequence Generator, Update Strategy, Stored Procedure, and connected and unconnected Lookups to implement complex business logic.

● Implemented SCD Type 1 for loading data into the data warehouse dimension tables.
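
As a sketch of the BTEQ-based record tracking mentioned in the project description, a hypothetical SCD Type 1 upsert from staging into a Teradata dimension; the database, table and column names are illustrative:

    # Hypothetical sketch: BTEQ script applying an SCD Type 1
    # update-then-insert from staging into the target dimension.
    bteq <<'EOF'
    .LOGON tdpid/etl_user,password;

    UPDATE tgt
    FROM dwh.dim_customer tgt, stg.stg_customer src
    SET cust_name = src.cust_name, city = src.city
    WHERE tgt.cust_id = src.cust_id;

    INSERT INTO dwh.dim_customer (cust_id, cust_name, city)
    SELECT s.cust_id, s.cust_name, s.city
    FROM stg.stg_customer s
    WHERE NOT EXISTS (SELECT 1 FROM dwh.dim_customer d
                      WHERE d.cust_id = s.cust_id);

    .LOGOFF;
    .QUIT;
    EOF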

● Implemented error handling for invalid and rejected rows by loading them into error tables.

● Analyzed the sources, transformed the data, mapped the data and loaded the data into targets using Informatica PowerCenter Designer.

● Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.

● Used variables and parameters in the mappings to pass values between mappings and sessions.

● Designed presentations based on the test cases and obtained UAT sign-offs.

● Documented test scenarios as part of unit testing before requesting migration to higher environment levels, and handled production deployments.

● Recorded defects as a part of the Defect tracker during SIT and UAT.

● Identified performance bottlenecks and suggested improvements.

● Performed unit testing on the mappings developed, to ensure that they meet the requirements.

● Handled major production go-live and user acceptance test activities.

Environment: Informatica PowerCenter Designer 9.x, Informatica Repository Manager, Oracle 10g, Erwin, TOAD, PL/SQL, SQL Developer, Teradata

ETL CONSULTANT
Tech Mahindra Limited / Pune / March 2015 - February 2017

Project: DTI (Data Transformation and Integration)

Description: The DTI project collects telecom details in each region for the different vendors, with respect to a lot of different AT&T products. Our job is to provide real-time, lowest-latency, highest-quality transformed and integrated data from various data sources: we receive batch feeds in various formats (flat file, mainframe EBCDIC, XML etc.), identify daily deltas, and integrate, transform and load the data into target databases using Informatica. The main functionality of this project is to collect the data from each region on a daily, weekly, monthly and quarterly basis in different languages and to process the data through ETL on a per-requirement basis. The data is later used by the reporting team to produce reports for the vendors and clients as per their requirements.

Responsibilities:

● Created Informatica ETL mappings using different transformations like Source Qualifier, Filter, Aggregator, Expression, connected and unconnected Lookup, Sequence Generator, Router and Update Strategy.

● Tested the data with all conceivable conditions in the DEV and UAT cycles so that errors could be resolved before production.

● Designed and developed Informatica mappings from scratch to load the data from various sources into the staging system and the warehouse system.

● Applied slowly changing dimensions like Type 1 and 2 effectively to handle the delta Loads.

● Responsible for mapping migration, importing and exporting maps from different servers.

● Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.

● Created Test cases for the mappings developed and then created integration Testing Documents.

● Prepared the error-handling document to maintain the error handling process.

● Scheduled Informatica jobs using Autosys and Scheduler.
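
As an illustration of the day-to-day scheduling work, a couple of standard Autosys commands for monitoring and restarting jobs; the job names are hypothetical:

    # Hypothetical sketch: report status of matching jobs, then
    # force-start one that needs a rerun.
    autorep -J CMD_DTI_DAILY%
    sendevent -E FORCE_STARTJOB -J CMD_DTI_DAILY_LOAD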

● Closely worked with the reporting team to ensure that correct data is present in the reports.

Environment: Informatica PowerCenter Designer 9.x, Informatica Repository Manager, Oracle 10g, Erwin, TOAD, PL/SQL, SQL Developer, Teradata

SENIOR INFORMATICA ETL DEVELOPER
Tata Consultancy Services / Pune / December 2013 - March 2015

Project: Sales Use Tax (SUT) & Value-Added Taxation (VAT)

Description: The Sales Use Tax project collects tax details in each region for the different vendors, with respect to a lot of different products. The project collects the data from each region on a daily, weekly, monthly and quarterly basis in different languages and processes the data through ETL on a per-requirement basis. The data is later pulled by the reporting team to produce reports for the vendors and clients as per their requirements.

Responsibilities:

● Designed ETL specification documents for all the projects.

● Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.

● Extracted data from flat files, DB2, SQL Server and Oracle to build an Operational Data Store. Applied business logic to load the data into the Global Data Warehouse.

● Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.

● Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.

● Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.

● Extensively used the Add Currently Processed Flat File Name port to load the flat-file name, and the contract number derived from it, into the target.

● Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.

● Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E- mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.

● Extensively used workflow variables, mapping parameters and mapping variables.

● Created sessions, batches for incremental load into staging tables and scheduled them to run daily.

● Used shortcuts to reuse objects without creating multiple objects in the repository, and to inherit changes made to the source automatically.

● Implemented Informatica recommendations, methodologies and best practices.

● Implemented performance-tuning logic on targets, sources, mappings and sessions to provide maximum efficiency and performance.

● Involved in Unit, Integration, System, and Performance testing levels.

● Wrote documentation to describe program development, logic, coding, testing, changes and corrections.

● Migrated the code into QA (Testing) and supported the QA team and UAT (User).

● Created a detailed Unit Test Document with all possible Test cases/Scripts.

● Conducted code reviews of code developed by my teammates before moving the code into QA.


● Provided support to develop the entire warehouse architecture and plan the ETL process.

● Modified existing mappings for enhancements of new business requirements.

● Prepared migration documents to move the mappings from development to testing and then to production repositories.

Environment: Informatica PowerCenter 9.1.0, Oracle 11g, SQL Server 2008, Unix, Autosys, SVN, Toad, SQL Developer

SOFTWARE ENGINEER / SOFTWARE ETL DEVELOPER
Infosys Limited / Mysore, Bhubaneswar, and Pune / March 2011 - December 2013

Project 1: EPH & Fiserv Development

Description: In this project, we took care of the entire data warehousing process for the Enterprise Problem Handling & Fiserv applications of Bank of America. As a team member, I was responsible for development and support. I was responsible for extracting data from various source systems within databases like SQL Server, Oracle and Teradata, and from files in the form of flat files, through Unix scripts and SFTP, then cleansing the data and applying various business rules in a staging area through Informatica mappings. Later, we constructed mappings whereby we imported data from flat files, RDBMS and XML files and used various transformations like Lookup (connected, unconnected), Sorter, Aggregator, Router, Filter, Update Strategy, Sequence Generator, Joiner, Normalizer and Stored Procedure to manipulate the data. Finally, we loaded the data into the target and built an Enterprise Data Warehouse.

Responsibilities:

● Understood the business requirements and the existing system.

● Understood the OLTP database data model and the OLAP database dimensional model.

● Extracted data from source to staging, and from staging loaded it into the Data Warehouse.

● Developed Informatica Mappings, Sessions and Workflows.

● Developed the Mapping to load the data into Fact tables.

● Developed the mappings to load data from fact and dimension tables into flat files and send it to outside vendors.

● Developed and maintained the customer address history through ETL code using SCD Type 1 and SCD Type 2.

● Performed Unit testing and Performance Tuning testing.

● Efficiently created migration packages for migrating the code from DEV to SIT/UAT and PROD environments.

Environment: Informatica PowerCenter 9.1.0, Oracle 11g, SQL Server 2008, Unix, Autosys, SVN, Toad, SQL Developer

Project 2: NM FPS Development

Description: The project was a part of Northwestern Mutual (NM) Insurance Company, which deals with the insurance business. This project linked the Sales, Accounts, Business Units, Payment Plans, General Ledgers, Budget Ledgers, Policy Benefits, Insurance Revenues, Profit & Loss etc. data in accordance with the inventory. In this project, we loaded the data into the target system, a SQL Server DB. We developed Informatica mappings whereby we imported data from flat files and from an Oracle DB. Using various transformations like Lookup (connected, unconnected), Sorter, Aggregator, Router, Filter, Update Strategy and Sequence Generator, we cleansed the data and applied business rules to it. We also used SQL and Lookup overrides to extract the desired data. Finally, we loaded the data into the Data Warehouse and subsequently to Sales. We used Informatica as the ETL tool and SQL Server as the DB. The project received an award from the client.

Responsibilities:

● Worked with various internal teams to understand the existing systems and environments.

● Understood the mapping specification documents and business rules.

● Extracted data from source (Oracle tables) to staging, developed the business logic and loaded the data into the Inventory Data Mart.

● Developed Informatica Mappings, Sessions and Workflows using transformations (reusable & standalone).

● Designed the lookup transformation and Sequence generator transformations.

● Created ETL mappings including various transformations such as Source Qualifier, Lookup, Update Strategy, Expression, Stored Procedure, Filter, Router and Sequence Generator.

Environment: Informatica PowerCenter 9.1.0, Oracle 11g, SQL Server 2008, Unix, Autosys, SVN, Toad, SQL Developer

TRAINEE
General Electric / Bangalore / November 2010 - March 2011


TOOLS, TECHNOLOGIES, AND LANGUAGES

● Data Integration ETL Tools: Informatica PowerCenter 10.x/9.x (Expert), Ab Initio GDE 3.0 & Co-op 3.0 (Intermediate)

● Databases: Oracle 12c/11g (Intermediate), Microsoft SQL Server 2012 (Intermediate), Teradata (Basic)

● Database Tools: Oracle SQL Developer, SQL Server, Teradata

● Load Utilities: SQL*Plus

● Version Control Tools: GIT, SVN

● MS Office Tools: Microsoft Word, Microsoft PowerPoint, Microsoft Excel, Microsoft Outlook

● Scheduling Tools: Autosys (worked on scheduling & monitoring through Unix), Crontab; FileZilla 3.9 (FTP solution)

● Infrastructure Tools: Jira, VersionOne

● Languages: Unix shell scripting, SQL

● Environment: Linux, Unix, Windows

● Domain Experience: Banking, Financial Services and Insurance (BFSI), Taxation, Telecom, Healthcare

● Methodologies: Scaled Agile/Scrum, SDLC (Waterfall)

Self-Study/Re-Skilling (studying and preparing for certification through online courses):

● Data Integration ETL Tools: Talend Data Integration, Pentaho Kettle

● Data Quality and MDM Tools: Informatica Data Quality 9.6.1 (IDQ) & MDM (Master Data Management)

● Data Modelling Tool: Erwin

● Data Lineage and Data Visualization: Talend Open Studio, Tableau, QlikView, Power BI

● Cloud Platforms: AWS (Amazon Redshift, Amazon EC2, Amazon S3, Amazon RDS, VPC, IAM, Route 53, Elastic Load Balancing, Auto Scaling, CloudFront, CloudWatch, SNS, SES, SQS)

● Data Science:

Machine Learning: Logistic Regression, Naive Bayes, Decision Tree, Random Forest, KNN, Linear Regression, Lasso, Ridge, SVM, Regression Tree, K-means
Analytics/Predictive Modelling Tools: Alteryx, Knime, Statistica, Toad Data Point
BI Reporting/Visualization Tools: Tableau, QlikView, Python (Matplotlib, Seaborn)
Languages: Python, R

● Robotic Process Automation Tools and Technologies: UiPath (prepared 7 robots)

● DevOps: Docker, Kubernetes

AWARDS AND APPRECIATIONS:

● Pat on the Back: given for showing commitment and technical excellence in the project.

● FSI Insta Award: given for showing commitment and technical excellence, along with hard work, in the project.

● BCM Certificate of Appreciation: given for efforts to learn and implement Banking and Capital Markets fundamentals in project deliverables.

CERTIFICATION:

● Certified RPA Advanced Developer (UiPath), March 2020

● Certified IBM Cognos 10 BI Data Warehouse Developer.

● Certified TS: MS SQL Server 2008 BI Development & Maintenance (Exam 70-448).

● Successfully cleared the Infosys training program of 5 months, which includes more than 32 courses.

EDUCATION

MASTER OF BUSINESS ADMINISTRATION (INFORMATION SYSTEMS)
Annamalai University / Chennai / 2013 / 70 GPA

BACHELOR OF ENGINEERING (B.E.)
Technocrats Institute of Technology, Bhopal / Madhya Pradesh / 2010 / 74.97 GPA (Hons. by Vice Chancellor Grace, VCG)


