Ajinkya U Bhole
Contact: *********@*****.**.** +91-880*******
Career Summary
• A focused and dedicated IT professional with over 9 years of experience in AWS Cloud, ETL development, Snowflake, data visualization, and DevOps engineering, with strong exposure to Informatica PowerCenter, shell scripting, Python, Oracle Database, and Control-M.
• Hands-on experience with technologies such as AWS, Informatica PowerCenter, and IICS, and with Agile, Scrum, and Waterfall methodologies and tools.
• Excel in AWS, Snowflake, and ETL development: analyzing requirements, providing work-effort estimates, and identifying and implementing corrective measures based on gathered metrics.
• Developed Informatica mappings and AWS Glue jobs to load data from various sources into targets using different transformations (a sample Glue job sketch follows this summary).
• Worked in Snowflake for six months on data storage, processing, and analytics solutions that are faster, easier to use, and more flexible than traditional offerings.
• Used up-to-date data and visualization tools to provide the business with insights on customers and make recommendations for improvement.
• Possess a solid technical background and assisted in ensuring successful delivery of agreed cloud services.
• Able to create, maintain, and evolve AWS cloud infrastructure for running applications.
• Extensive exposure to managing data masking and data quality projects, identifying and developing processes using Unix shell scripting and Python, and creating designs and plans for the integration of all data warehouse technical components.
• Interfaced with clients to identify business problems and develop solutions aligned with business objectives.
• Skilled in the creation and maintenance of Oracle and Snowflake databases and standards following best practices.
• A keen analyst with a flair for adapting quickly to dynamic business environments and for devising business strategy, processes, services, and roadmaps.
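As referenced in the summary above, here is a minimal sketch of the kind of AWS Glue job described: read raw files from S3, project a few columns, and write Parquet to a curated location. The bucket paths, column names, and job parameters are hypothetical placeholders, not objects from any actual project.

```python
# Sketch of a simple AWS Glue ETL job (runs inside the Glue job environment).
# All S3 paths and field names below are illustrative placeholders.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source: raw CSVs landed in S3 (placeholder path).
src = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://raw-landing-bucket/orders/"]},
    format="csv",
    format_options={"withHeader": True},
)

# Transformation: keep only the columns the target model needs.
projected = src.select_fields(["order_id", "customer_id", "amount"])

# Target: write Parquet for downstream consumption (placeholder path).
glue_context.write_dynamic_frame.from_options(
    frame=projected,
    connection_type="s3",
    connection_options={"path": "s3://curated-bucket/orders/"},
    format="parquet",
)
job.commit()
```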
Professional Experience
● Working with Capgemini Technology Services India Limited since Oct 2021 (current).
● Worked with Accenture Solutions Pvt. Ltd. from June 2016 to Oct 2021.
● Worked with Syntel Pvt. Ltd. from March 2014 to June 2016.
Trainings/Certifications
● Post-graduation in Big Data Engineering from BITS.
● AWS Certified Developer - Associate certification
Tools and Technology
● AWS Cloud: S3, Athena, Glue, Lambda, Airflow
● ETL Tools: Informatica 9.x, 10.x, Informatica IICS
● Languages: SQL, PL/SQL, Python, Spark, Unix
● Job Scheduling: AutoSys (JIL), Control-M
● Databases: SQL Server 2014, Oracle, Teradata, Netezza, Snowflake
● BI Tools: Tableau
● Big Data Tools: Pig, Hive, Sqoop
● DevOps Tools: JIRA, Jenkins, Bitbucket, Stash, Confluence, SharePoint
Projects and Assignments
Feb 2023 – present
● Project – BI AH Engage Analytics
● Job Location – Offshore (Pune)
The Engage Analytics project builds an ecosystem platform for the animal health commercial business, covering data, reporting, self-service enablement, and the transition from legacy data warehouses.
Tools/Technology
AWS Lambda, AWS Glue, S3, Airflow, Snowflake
Responsibilities:
• Working as Solution Architect / Technical Lead to understand the business and technical requirements of AWS development.
• Creating the data engineering roadmap for the team.
• Managing and mentoring the data engineering team.
• Gathering user stories, and designing and managing the implementation of data engineering pipelines.
• Connecting with source-system owners/PoCs to gather information on the systems and cataloguing it as part of data profiling.
• Responsible for requirement gathering, design, and implementation of data engineering pipelines.
• Designing and developing reports for the BA and data science teams.
• Modelling the DWH and data lake.
• Ingesting data into Snowflake from AWS S3 via Glue/Snowpipe (see the sketch after this list).
• Working on SQL and stored procedures in Snowflake.
• Handled technical training for freshers as part of onboarding.
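A minimal sketch of the S3-to-Snowflake ingestion pattern mentioned above, using an external stage plus COPY INTO (the statement Snowpipe automates on new-file notifications). The account, stage, table, and bucket names are hypothetical placeholders.

```python
# Sketch of bulk-loading staged S3 files into Snowflake via the Python
# connector. Connection details and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account identifier
    user="etl_user",        # placeholder user
    password="...",         # supplied via a secret store in practice
    warehouse="LOAD_WH",
    database="ENGAGE_DB",
    schema="RAW",
)

with conn.cursor() as cur:
    # External stage pointing at the S3 landing bucket (placeholder URL).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw_s3_stage
        URL = 's3://engage-landing-bucket/daily/'
        STORAGE_INTEGRATION = s3_int
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Bulk-load staged files; Snowpipe automates this same COPY INTO.
    cur.execute("COPY INTO raw.customer_events FROM @raw_s3_stage")

conn.close()
```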
Oct 2021 – Jan 2023
● Project – BI Medex
● Job Location – Offshore (Pune)
We worked on a total health-care management system that manages patient information across three modules: Patient Information, Accounts Information, and Monitoring. The Patient Information module contains patient details and up-to-date health information. In the Accounts Information module, day-to-day bill settlements are entered into the online system. The Monitoring module tracks the patient from day one through the discharge date, and a confidential case sheet is maintained.
Tools/Technology
Informatica 10.x, Oracle, Control-M, Jira, Tableau
Responsibilities:
• Worked as Offshore Technical Lead alongside the Scrum Master, managing a team of 14 members and responsible for all Scrum activities.
• Worked with the Solution Architect to understand the business and technical requirements of ETL development.
• Owned client requirements, deliverables, and accountabilities to ensure adherence to architecture governance processes and industry best practices, and to maintain consistency with the clients' architecture vision.
• Drove solution development and the documentation of solution designs, ensuring good architectural practices throughout the solution development lifecycle.
• Provided architectural direction to the team on behalf of our clients.
• Worked with agile development teams and delivery architects through all phases of the solution lifecycle.
• Took ownership of solutions and acted as the interface to internal and external senior stakeholders.
• Developed excellent working relationships with clients to become their trusted advisor.
• Managed technical relationships with vendors, third parties, and the wider Capgemini.
• Ensured solution delivery was performed according to the agreed specification.
• Owned technical problem management and resolution relevant to solution development.
Aug 2018 - Oct 2021
● Project – EDW Integrated Data Layer
● Job Location – Pune/Dublin
● Company name: Accenture Services Pvt Ltd.
EDW Integrated Data Layer is the centralized data warehouse for the bank, aimed at enterprise-wide data integration to enable global reporting and analytics capabilities. Our development, enhancement, and maintenance efforts were directed mainly at this goal.
Tools/Technology
Informatica 9.6, Informatica IICS, Teradata, Unix, Python, Control-M, Jira, Bitbucket, Jenkins
Responsibilities:
• Worked as Offshore Technical Team Lead, managing a team of 8 members.
• Worked with business analysts to understand the business and technical requirements of ETL development.
• Evaluated the effectiveness and accuracy of new data sources and provided timely communications on significant issues and developments.
• Collaborated cross-functionally and implemented analytics and visualization components for the device data analysis platform.
• Deployed product updates as required and implemented integrations as they arose.
• Specified, documented, and developed new product features, and wrote shell scripts.
• Coordinated with project management teams to monitor progress and implement initiatives.
• Suggested technology-based solutions for enhancing functional efficiency and achieving business excellence; managed risks and planned for contingencies to minimize the effect on deliverables.
• Loaded data from various data sources, reviewed and suggested improvements to data models, and enforced data-modelling standards and best practices in the design.
• Developed process-automation tools (a sketch follows this list) and maintained program modules, covering operational support, problem resolution, production support, preventative and corrective maintenance, and enhancements.
• Interacted with team members to ensure smooth progress of project work and adherence to quality norms throughout the implementation process.
• Worked on Big Data technologies for recent requirements.
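A minimal sketch of the kind of process-automation utility referenced above: a pre-load data-quality check that reconciles a landed file's row count against its control file before the scheduler triggers the load. Paths and file names are hypothetical placeholders.

```python
# Sketch of a pre-load row-count reconciliation step. A non-zero exit code
# fails the scheduler (Control-M) job and stops the downstream load.
import csv
import sys
from pathlib import Path

LANDING = Path("/data/edw/landing/customers.csv")  # placeholder data file
CONTROL = Path("/data/edw/landing/customers.ctl")  # placeholder control file

def main() -> int:
    with LANDING.open(newline="") as f:
        data_rows = sum(1 for _ in csv.reader(f)) - 1  # minus header row
    expected = int(CONTROL.read_text().strip())
    if data_rows != expected:
        print(f"ABORT: landed {data_rows} rows, control file says {expected}")
        return 1
    print(f"OK: {data_rows} rows match control file")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```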
June 2016 - Aug 2018
● Project - CIM
● Job Location - Offshore (Pune)
● Company name: Accenture Services Pvt Ltd.
The Commercial Information Management (CIM) platform supports the customer CRM application. The project's objective was to develop a data integration layer model that integrates data from disparate sources, both external and internal systems, in a configurable, scalable, consistent, and reusable manner. Another objective was to develop consistent code across the CIM platform so that, with few customizations, the code could be reused across all countries where the client operates.
Tools/Technology
Informatica 9.6, Informatica 10.1, Oracle 12c, Unix, AutoSys, Jira, Stash
Responsibilities:
• Involved in a functional study of the application and interacted with the client to understand the tech spec.
• Developed new mappings, sessions, workflows, stored procedures, and Unix shell scripts, implementing business logic to meet customer requirements and satisfaction.
• Understood and modified existing mappings, Sybase stored procedures, Unix shell scripts, etc.
• Scheduled jobs (Informatica workflows, Unix shell scripts, file watchers) using AutoSys.
• Created test cases and ETL design specifications, and developed ETL code documentation.
• Interacted with business analysts and dev teams to resolve issues.
• Worked on production issues on priority.
• Ran the jobs/workflows for the ETL process.
• Prepared and ran SQL queries to verify dimension and fact tables (see the sketch after this list).
• Created/updated scripts for recasting the old structure into the new one for the business recast.
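A minimal sketch of the dimension/fact verification queries mentioned above: an orphan-key check run through python-oracledb. The table names, columns, and connection details are hypothetical placeholders.

```python
# Sketch of a fact/dimension reconciliation: every dimension key referenced
# by the fact table must exist in the dimension. Names are placeholders.
import oracledb  # python-oracledb (successor to cx_Oracle)

ORPHAN_CHECK = """
    SELECT COUNT(*)
    FROM   fact_sales f
    LEFT JOIN dim_customer d
           ON f.customer_key = d.customer_key
    WHERE  d.customer_key IS NULL
"""

with oracledb.connect(user="etl_qa", password="...", dsn="cimdb") as conn:
    with conn.cursor() as cur:
        cur.execute(ORPHAN_CHECK)
        orphans = cur.fetchone()[0]
        # A non-zero count means the fact load referenced keys missing from
        # the dimension; fail the verification step in that case.
        print("PASS" if orphans == 0 else f"FAIL: {orphans} orphan fact rows")
```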
March 2014 - June 2016
● Project - BIDW
● Job Location - Offshore (Pune)
● Company name: Syntel Pvt Ltd.
The client is an American multinational mutual insurance company that provides financial services to cooperatives, credit unions, their members, and other customers worldwide. The client sells commercial and consumer insurance and protection products. The Product Manager Component (PMC) deals with product insurance-related information: all information related to insurance is stored in the database as product information with a predefined hierarchy, and all attributes of a card are stored as features or groups in the PMC database. Along with the PMC database, product information is also stored in the Common Reference Tables (CRT) database, the most critical part of PMC. The CRT database is a common product repository for all the client's platforms.
Tools/Technology
Informatica 10.1, SQL Server 2014, Unix, Control-M
Responsibilities:
• Understood existing business processes and project functional and technical specifications.
• Developed code using shell scripts, Informatica mappings, workflows, and sessions as per ETL specifications.
• Checked and analyzed data quality and interacted with business analysts and clients on a regular basis.
• Created unit test cases for the developed scripts/mappings and analyzed the data.
• Conducted code reviews and performance tuning to ensure quality and compliance.
• Created System Design Specifications (SDS), performance metrics, and deployment documents.
• Scheduled data loads through Control-M and maintained report relevance.
• Scheduled Informatica jobs.
Educational Qualification
Post-Graduation Aug’18-Aug’19
Birla Institute of Technology and Science (BITS), Pilani – Big Data Engineering
Post-Graduation Diploma Aug’13-Feb’14
Centre for Development of Advanced Computing (CDAC), Bengaluru – System Software Development
Graduation (BE) Jun’09-July’13
Ramrao Adik Institute of Technology (RAIT) – Mumbai
Personal Details
Sex: Male
Marital Status: Single
Date of Birth: 10th November, 1991