DEEPAK DHANIGOND
Cell: 416-***-****
https://www.linkedin.com/in/deepak-dhanigond-1134604/
adbakw@r.postjobfree.com
Objective:
Seeking a challenging position as a Software Engineer in a reputable company where I can apply and further develop my skills.
Highlights
Around 16 years of professional IT experience.
14 years in ETL analysis, design, development, testing and implementation using technologies such as Ab Initio and data warehousing applications.
2 years of Hadoop administration experience, responsible for building an enterprise Hadoop cluster (HDP 2.x) and for day-to-day administration tasks.
Solid experience in the analysis, design, development and testing of data warehousing solutions and in developing Extraction, Transformation and Loading (ETL) strategies using Ab Initio.
In-depth knowledge of UNIX shell scripting.
Prepared complex SQL queries and worked on various databases such as DB2 and Oracle.
Expertise in working with Agile methodology (SCRUM using Rally), DevOps environments and Waterfall methodology.
Experience in analyzing business specification documents, developing Test Plans, Test Strategy, Test Scope and defining Test Cases and Automating Test Scripts.
Extensively worked on both manual and automated testing: smoke, unit, functional, system and integration, UAT, performance and regression testing.
Complete life-cycle implementation experience across SDLC and STLC: gap analysis, data mapping, writing specifications, design, development, testing, quality adherence, implementation, troubleshooting and customer support.
Strong domain knowledge in banking, logistics and retail.
Technical Skills:
ETL Tools
Ab Initio
Hadoop Eco System
HDP 2.x, CDH 5.x, HDFS, MapReduce, Spark, Hive, HBase, Sqoop, Pig and Oozie
Databases / RDBMS
Oracle 8i/9i, Teradata 13.10
Languages
SQL, PL/SQL, Shell Scripting
Operating System(s)
Sun Solaris, SUSE Linux, Windows XP, Windows Server, IBM AIX, CentOS
Schedulers
Control-M, Autosys, CA-7
Deployment Tools
Jenkins, Ansible
EXPERIENCE
NOV 2019 – PRESENT
ETL DEVELOPER, JPMORGAN CHASE, COLUMBUS, OHIO
The BIA Data Lake POC project measures developer productivity across CCB (Consumer and Community Banking), which has close to 12,000 developers. The segment involved pulling data from JIRA and other tools (Team Central) to track deliverables and story-point calculations for teams, with drill-down to the individual developer level.
Responsibilities:
Requirement gathering, effort estimation and impact analysis.
Developed and implemented extraction, transformation and loading of data using Ab Initio.
Worked with a team of analysts, developers and testers to drive delivery of ETL solutions.
Managed all aspects of the BI technical roadmap, including platform requirements.
Designed KPIs.
Worked with report developers to understand their needs.
Responsible for data definitions
Source system identification
APRIL 2018 – SEP 2019
HADOOP ADMINISTRATOR/PLATFORM SUPPORT, MANULIFE, TORONTO
Responsibilities:
Responsible for building an enterprise Hadoop cluster (HDP 2.x) and for day-to-day administration tasks.
Coordinated with networking, infrastructure and development teams to ensure high availability.
Performed cluster maintenance, including adding new nodes to the existing cluster and realigning server components between master nodes based on requirements.
Added and removed nodes using Ambari.
Expanded cluster capacity from 18 to 24 nodes (add-node operations) and removed problem nodes for diagnosis and remediation.
Provided input to the architecture team for capacity planning.
Worked with the Hortonworks support team on high-severity tickets.
Collaborated with the enterprise application team to install operating system and Hadoop updates, patches and version upgrades when required.
Drove an initiative to automate recurring manual monitoring and operations activities using UNIX scripting.
Enabled high availability for multiple server components.
Implemented Kerberos and AD security authentication.
Set up HDFS and Hive authorization using Ranger.
Automated code promotions/deployments using Ansible.
FEB 2017 – MARCH 2018
ANALYST / DEVELOPER, SCOTIABANK, TORONTO
Tech Lead for the HollisWealth migration project. With the business moving from Scotiabank to Industrial Alliance, HollisWealth (HW) and some of its associated applications may no longer be needed. Some upstream data feeds to the HW corporate database are still required up to a certain point to meet compliance requirements, and some downstream data extracts/reports must continue to run per business requirements.
Responsibilities:
•Production support for DCDB and migration support for the HollisWealth migration to Industrial Alliance
•Manage all aspects of a BI technical roadmap, including platform requirements and support for end-user BI tools.
•Lead architect in the design, development, execution and implementation of complex dashboards and BI self-service solutions.
•Worked on UNIX shell scripting in the development and support of various existing shell scripts.
•Worked on multiple use cases involving Hive/HBase/HDFS/Kafka
•Partner with internal teams and vendor partners to provide technical leadership in order to develop and support effective BI solutions, performance tuning and reports
JUL 2013 – JAN 2017
ETL LEAD, BANK OF MONTREAL, TORONTO
Managed all ETL deliverables from the IBM Global resourcing team, guiding onshore and offshore DEV leads, supporting BAs in the work stream, providing business analysis support, development and production deployment plans, and reporting to client stakeholders. The work segment covered the ETL portion of data integration across the bank's various lines of business, including Mortgage, Everyday Banking, Lines of Credit, Investments and Loan Origination, and supported BASEL components including bank-wide asset reconciliation.
Responsibilities:
•Created Ab Initio graphs, performed Ab Initio code reviews and tuned graphs for better performance.
•Worked on UNIX shell scripting in the development and support of various existing shell scripts.
•Created the naming conventions document for all components.
•Acted as team lead and key point of contact for the business for any production issues
•Managing any production issues raised and follow up with business and other vendors through to final resolution.
•Created the UTC documents
•Worked with Ab Initio Multi File Systems (MFS) and complex ETL transformations.
•Database design, tuning and configuration management.
•Created and coded functional and business constraints, packages and triggers.
NOV 2011 – JUL 2013
ETL DEVELOPER/SEMANTIC MODELER, TERADATA CORPORATION, TORONTO
Responsibilities:
•Information strategy roadmap
•Data acquisition and environmental review
•Delivered a roadmap
•Coordinated all teams
•Involved in semantic layer design
•Performance tuning of BI applications
•Worked with report developers to understand their performance requirements for report generation
•Supported production implementation
•Responsible for data definitions
•Source system identification
•Worked with end users to define business requirements and needs, performing data validation
•Wrote business rule mappings for the ETL team to code
(CLIENTS FOR THE ABOVE ROLE: LOBLAWS, WIND MOBILE AND MCCAIN FOODS)
JUL 2010 – NOV 2011
SENIOR ABINITIO DEVELOPER, AMERICAN EXPRESS, SALT LAKE CITY, USA
This project takes input from TSYS US, the card processing platform used for authorizations and settlements of credit card data; data that has passed through Amex, MasterCard and VISA from merchants is processed here and passed five days a week to CPS for additional processing and reporting. TSYS Europe is the equivalent card processing platform for Europe; data that has passed through MasterCard and VISA from merchants is processed there and passed five days a week to GE CPS for additional processing and reporting.
Responsibilities:
•Analyze and understand complex business requirements from high-level documents.
•Estimate the effort for enhancement and development activities based on complexity.
•Create low-level design documents based on the HLD understanding and develop Ab Initio graphs based on the requirements.
•Create test plans, capture test results and review the results with the business.
•Periodically conduct code reviews to ensure the code meets standards and best practices.
•Highlight issues and concerns early and perform defect prevention/resolution activities.
•Coordinate with the offshore development team and provide guidance to junior members of the team.
•Involved in performance tuning to decrease run times.
•Provided L1-L3 support for all applications and handled change management and incident ticket resolution.
JUL 2009 – JUL 2010
SENIOR ABINITIO DEVELOPER, FEDEX, MEMPHIS, USA
This project provides the processes, the engine to uniquely identify a business at a location or individuals at their business location that interact with FedEx at any level through touch points. The project covers the US-inbound, US-outbound shipping data. This helps the marketing/retail/sales and other business users to identify unique individuals and businesses that interact with FedEx without account numbers. Helps functional areas such as Sales, Marketing, Retail understand the various roles such as Shipper, Payer and Consignee
Responsibilities:
•Analyze and understand complex business requirements from high-level documents.
•Estimate the effort for enhancement and development activities based on complexity.
•Create low-level design documents based on the HLD understanding and develop Ab Initio graphs.
•Developed and implemented extraction, transformation and loading of data from legacy systems using Ab Initio.
•Created detailed data flows with source-to-target mappings and converted data requirements into low-level design templates.
•Responsible for data cleansing from source systems using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression and Rollup.
•Worked with partition components such as Partition by Key, Partition by Expression and Partition by Round Robin to partition data from a serial file.
•Developed Generic graphs for data cleansing, data validation, and data transformation.
•Involved in System and Integration testing of the project.
•Tuning of Ab Initio graphs for better performance.
•Used phases/checkpoints to avoid deadlocks and improve efficiency.
MAR 2008 – JUL 2009
SENIOR ABINITIO DEVELOPER, BANK OF MEXICO, MEXICO CITY, MEXICO
EPM management reporting is a project for Banamex, one of the leading banks in Mexico. Alfa GL provides GL ledger balances as a file feed required for EPM management reporting and the enterprise data warehouse (EDW). An Ab Initio interface reads the ledger balances, translates the balance data and transforms it into formats suitable for loading into the EPM GL Ledger table in the Financial Warehouse (FW). As part of the transformation process, balances are converted from a horizontal to a vertical format to facilitate management reporting.
Responsibilities:
•Worked closely with business analysts and application leads on requirement study, analysis and design of the application.
•Created low-level design documents for the various application processes.
•Developed Ab Initio graphs that transfer data from flat files to the Teradata data warehouse.
•Extensively worked with components ranging from the most commonly used to complex ones, such as Scan, Rollup, Assign Keys, Partition, Departition, Sort, Join and Reformat.
•Used Ab Initio multifile systems to improve heavy data loads.
•Migrated Ab Initio graphs according to new specifications; responsible for ad-hoc code changes for migrations.
•Involved in end-to-end data validation by interacting with the business (UAT).
•Responsible as delivery lead for an offshore team.
JUNE 2006 – MAR 2008
ABINITIO DEVELOPER, BANK OF MONTREAL, TORONTO
The New Basel Capital Accord (NBCA) initiative is a complex, multi-year, enterprise-wide program to design and implement business processes and supporting IT systems required to ensure compliance with revised capital adequacy regulations based on the NBCA. While meeting the Office of the Superintendent of Financial Institutions (OSFI) minimum reporting requirements is mandatory, it is critical that additional functionality is provided to ensure that OSFI’s Minimum AIRB Requirements are met and that all related processes are successfully integrated into the Bank’s processes.
Responsibilities:
•Created Ab Initio graphs, performed Ab Initio code reviews and tuned graphs for better performance.
•Worked on UNIX shell scripting in the development and support of various existing shell scripts.
•Created the naming conventions document for all components.
•Created the UTC documents.
•Worked with Ab Initio Multi File Systems (MFS) and complex ETL transformations.
FEB 2005 – JUNE 2006
ABINITIO DEVELOPER, WIPRO TECHNOLOGIES, BENGALURU / CHENNAI
This project was for Capital One Canada. I was part of the team responsible for extraction, transformation and loading of data. The existing system used PL/SQL code with Oracle as the database. The main aim of the project was to convert all the PL/SQL code to Ab Initio and load the target, which was Teradata. The project dealt with credit card holders who had insurance and were deceased.
Responsibilities:
•Created Ab Initio graphs, performed Ab Initio code reviews and tuned graphs for better performance.
•Worked on UNIX shell scripting in the development and support of various existing shell scripts.
•Created the naming conventions document for all components.
•Created the UTC documents.
•Worked with Ab Initio Multi File Systems (MFS) and complex ETL transformations.
NOV 2003 – FEB 2005
ABINITIO DEVELOPER / PROD SUPPORT, TESCO, LONDON, UK
This project was for Samsung Tesco in Korea. Daily transaction data from around 36 Tesco stores in Korea comes to the UK in the form of flat files at the end of each day. This data is extracted and loaded into the data warehouse in the UK, with Teradata as the database and Ab Initio as the ETL tool. The Ab Initio .ksh scripts are scheduled through the CA-7 scheduler (mainframes).
Responsibilities:
•Production Support
•Enhancements to current Ab Initio code for fixes and performance tuning
•Worked on UNIX, Shell scripting in development and support of various existing shell scripts.
•Dealing with database errors
•Resolving network issues
•Generating reports at the end of the batch using Business Objects
EDUCATION
APRIL 2002
MASTER’S IN COMPUTER APPLICATIONS, PES College of Engineering, Mandya