
Data Integration Lead Developer


Ravi Adulla

San Antonio, TX | 210-***-**** | ad0d3v@r.postjobfree.com | LinkedIn: Ravi Adulla

PROFESSIONAL SUMMARY:

** ***** ** **** ** experience in Data Warehousing and Data Integration/Migration ETL projects. Experienced in Agile methodology while managing assignments for a global multinational client across the Banking, Financial Services, and Insurance (BFSI) sector. As an Informatica lead developer, involved in requirements gathering, analysis, development, testing, deployment, monitoring, and ongoing support of data extracts, imports, and data warehouse transformations and modeling for the business.

TECHNICAL EXPERTISE:

Informatica Tools:

Informatica PowerCenter (PC) 10.5/10.2/9.5/9.1, Informatica Intelligent Cloud Services (IICS/IDMC), Informatica Master Data Management (MDM), Informatica Data Quality (IDQ), Informatica PowerExchange CDC

Databases:

On-Premises Databases: Oracle and Netezza; flat files (delimited and fixed-width), including DAT, CSV, and XML/JSON files

Cloud Databases: AWS S3, Snowflake, Salesforce

Languages: SQL, PL/SQL, SnowSQL, Unix/Linux shell scripting; working knowledge of Python

DBMS Tools: Toad, SQL Developer, SQuirreL SQL, DBeaver

Scheduling Tools: Control-M Scheduler, Informatica Scheduler

Version Control System: GitHub; code migration via CI/CD pipelines

Project management tools: Jira and Rally

Operating Systems: Windows XP/NT/2000/10/12

Desktop Tools: Notepad++, Snipping Tool, Beyond Compare

WORK HISTORY:

Working with Tata Consultancy Services Limited (TCS) from Sept’2017 to date (deputed onsite in San Antonio, Texas, USA since April 2022).

Worked with Capgemini Technology Services India Limited, Bangalore/Hyderabad, India, from May’2014 to Jul’2017.

Worked with Accenture Services Pvt. Ltd. (on the payroll of Datamatics Global Services Ltd.), Bangalore, India, from Dec’2012 to Mar’2014.

Worked with Acme Tele Power Ltd. (on the payrolls of Acme, MaFoi Randstad, Adecco, and TDS), Hyderabad, India, from Jul’2008 to Jan’2012.

EDUCATION:

Bachelor of Technology from TKR College of Engineering and Technology, affiliated with Jawaharlal Nehru Technological University, Hyderabad, India, July 2005 to April 2008.

Project: 1

Project Name: Compliance and Regulatory (CNR) Modernization

Client Name: United Services Automobile Association (USAA), San Antonio, Texas, USA.

Technologies: Informatica IDMC/IICS, Snowflake Cloud, Unix, BMC Control-M 9.0, GitLab, Jira

Role: Informatica Cloud Developer

Duration: Apr’2022 – Present

Roles and Responsibilities:

Experienced in the Agile software development lifecycle and its underlying processes of planning, design, development, and deployment. Used Jira for tracking stories.

Worked with business users to understand their data needs; gathered business requirements, technical design, and technical specification documents; analyzed the scope of the work; raised clarifications on acceptance criteria; and documented the processes as part of the lift-and-shift project.

Attended daily stand-ups and reported to the Scrum Master on the stories assigned in Jira; discussed any impediments in the parking lot with tech leads and obtained approval for the next iteration.

Involved in Program Increment (PI) planning and iteration planning alongside the Scrum Master and tech leads; also involved in capacity planning and assigned stories to team members by priority.

Involved in retrospective meetings with the Product Owner and tech leads; captured what went well, what didn’t, and why on the Mural board, and raised impediments if any.

Good knowledge of the P&C insurance domain and Guidewire conversion/integration across PolicyCenter, ClaimCenter, and BillingCenter.

Analyzed and interpreted the business logic of existing legacy Informatica PowerCenter mappings and their dependencies in stat and regulatory reports such as PASP, PLSP, MCAR, TXQD, etc.

Prepared source-to-target (S2T) design documents with input from the tech leads and had them reviewed before starting development.

Rewrote and optimized financial premium-and-loss source queries in SnowSQL, joining upstream databases such as Policy and Claims, and replicated the functionality of the existing PAS (Policy Accounting System) hosted on the on-premises Netezza database.

Experienced with columnar databases like Snowflake; optimized existing Informatica mapping logic by converting it into SnowSQL queries built on WITH (CTE) clauses, as sketched below.
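
For illustration only, a minimal sketch of this kind of CTE-based rewrite; the table and column names (stg_policy, stg_claims, etc.) are hypothetical, not the client’s actual objects:

-- Hypothetical example: aggregation logic that previously lived in an
-- Informatica mapping, expressed as Snowflake CTEs.
WITH policy AS (
    SELECT policy_id, state_cd, annual_premium
    FROM stg_policy
    WHERE record_status = 'ACTIVE'
),
claims AS (
    SELECT policy_id, SUM(paid_amount) AS total_loss
    FROM stg_claims
    GROUP BY policy_id
)
SELECT p.state_cd,
       SUM(p.annual_premium)          AS written_premium,
       SUM(COALESCE(c.total_loss, 0)) AS incurred_loss
FROM policy p
LEFT JOIN claims c ON c.policy_id = p.policy_id
GROUP BY p.state_cd;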

Knowledge of Informatica Cloud solution architecture and concepts such as the Secure Agent, connectivity, and local systems.

Exposure to Amazon Web Services (AWS) and cloud computing; experienced in configuring and using the Amazon S3 V2, Snowflake, Salesforce, and flat file connectors in IICS connections. Extracted and loaded file data in S3 buckets.

Created mapping assets on the Informatica Cloud (IICS/IDMC) canvas in Cloud Data Integration (CDI), using transformations such as Expression, Filter, Router, Aggregator, connected/unconnected Lookup, Joiner, Union, Sequence Generator, Sorter, Transaction Control, and Normalizer, per the business logic.

Experienced in configuring the REST API and REST V2 connectors using Swagger files, business services, and the Web Services transformation. Exposure to Postman for designing, building, sharing, and test-running mapping tasks and taskflows, and for inspecting HTTPS requests and responses.

Configured input and output macros in the Expression transformation to cleanse source files for repetitive and complex mappings.

Configured IN parameters for the source, target, source object, and target object. Implemented dynamic file names for creating new targets at runtime. Used IN/OUT parameters to pass one MCT’s output to another.

Designed Mapping Configuration tasks, PowerCenter tasks, Synchronization tasks, Replication tasks, and Dynamic Mapping tasks.

Configured and designed mass ingestion tasks on the database (Netezza) for initial and incremental loads, and on files by using the file listener.

Experienced in extracting and transforming complex XML- and JSON-format data using the Structure Parser (intelligent structure model) and Hierarchy Parser (hierarchical schema) transformations.

Created taskflows to orchestrate the framework controlling data tasks, using Assignment, Decision, Notification, Command, File Watch, Sub-taskflow, Sequential, Parallel, and Linear tasks, along with Jump, Throw, and Wait steps.

Prepared test cases and unit testing documents per the ETL logic, executed them, and captured actual versus expected results.

Performed unit testing for the tasks I developed and peer-reviewed mappings developed by team members.

Performed reconciliation between Netezza final reports and Snowflake final report formatter files, and ran regression testing on modified code if any; a sketch of the recon pattern follows.
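
Purely illustrative, with hypothetical names: one common recon pattern is to compute the same aggregate “fingerprint” on each side and compare the outputs:

-- Hypothetical recon query: run once against the Netezza report table and
-- once against its Snowflake counterpart, then diff the two result sets.
SELECT report_month,
       COUNT(*)            AS row_cnt,
       SUM(premium_amount) AS premium_total,
       SUM(loss_amount)    AS loss_total
FROM final_report
GROUP BY report_month
ORDER BY report_month;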

Collaborated and interacted with distributed team members and customers, and raised impediments with upstream teams whenever there was a data discrepancy in their tables.

Involved in UAT with business users; analyzed and fixed business concerns and obtained sign-off.

Performance-tuned Informatica jobs at the mapping and session levels by eliminating bottlenecks using key-range partitioning, fixed partitioning, and pushdown optimization (PDO) in IICS.

Involved in Unix shell scripting and configured the E3 setup in Unix for all Informatica, recon, and FTP jobs to B2B servers.

Able to read and understand Python scripts used as part of ETL.

Designed Control-M jobs for all Informatica, recon, and FTP jobs to B2B servers per scheduling requirements such as weekly and monthly schedules.

Migrated Informatica components, Snowflake tables, Unix components, and Control-M cycles using the GitHub deployment tool: created local branches from the master branch, raised merge requests, and executed CI/CD pipelines with proper approvals on the change request.

Demonstrated Jira stories at the end of every sprint and had them closed by the Product Owner.

Provided knowledge transfer (KT) to the support team and production support during the warranty period.

Excellent analytical, organizational, presentation, and facilitation skills; able to handle multiple tasks under tight deadlines.

Project: 2

Project Name: Statistics and Regulators Reporting

Client Name: United Services Automobile Association (USAA), Hyderabad, India

Technologies: Informatica 10.2, Netezza, Unix, BMC Control-M 8.0, Jira

Role: Informatica Power Center Developer

Duration: Jan’2019 – Mar’2022

Roles and Responsibilities:

Worked on enhancement requirements and changed Netezza SQL queries in existing STAT mappings for Premium and Auto reports.

Designed Informatica mappings for new requirements, deployed the code to production after regression testing, and reran the required reports from Control-M.

Worked with Informatica Designer components: Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.

Created mappings using transformations such as Source Qualifier, Filter, Router, Aggregator, Lookup, Joiner, Union, Sequence Generator, Stored Procedure, etc.

Designed Informatica workflow components such as Sessions and tasks including Command, Email, Assignment, Control, Event Wait, and Decision tasks.

Designed and prepared test plans and scripts for unit testing per the requirements.

Performed unit testing for the mappings I developed and peer-reviewed mappings developed by my team members.

Strong experience in debugging and analyzing session log files on production failures.

Compared target results before and after changes to the Netezza final files and shared them with the client’s tech lead for review.

Experienced in improving the performance of existing jobs using partitioning and pushdown optimization.

Analyzed and raised impediments with upstream teams regarding data discrepancies in their tables.

Good knowledge of PL/SQL components such as cursors, procedures, functions, triggers, and packages; a small cursor sketch follows.
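
Purely illustrative: a tiny PL/SQL sketch of a cursor-driven procedure of the kind referenced above; the table and column names (stg_orders, order_status, load_date) are hypothetical:

-- Hypothetical PL/SQL procedure: flag staging rows older than 30 days
-- using an explicit cursor with WHERE CURRENT OF.
CREATE OR REPLACE PROCEDURE mark_stale_orders AS
  CURSOR c_orders IS
    SELECT order_id
    FROM stg_orders
    WHERE load_date < SYSDATE - 30
    FOR UPDATE;
BEGIN
  FOR r IN c_orders LOOP
    UPDATE stg_orders
       SET order_status = 'STALE'
     WHERE CURRENT OF c_orders;
  END LOOP;
  COMMIT;
END mark_stale_orders;
/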

Involved in Unix shell scripting for Informatica job execution using the pmcmd command.

Wrote Unix shell scripts for the FTP process from remote servers to the Informatica servers.

Handled parameter files and source files in Unix using vi commands and other utilities.

Monitored the Control-M Informatica, recon, and FTP jobs to B2B servers as per schedule.

Analyzed production issues at Level 3, found the root cause of the issues reported by the business and support teams, and helped fix them.

Project: 3

Project Name: Data Management

Client Name: Wells Fargo, Hyderabad, India.

Technologies: Informatica MDM 10.0, Informatica IDQ, Oracle-SQL, Unix, BMC Control-M

Role: Informatica MDM Developer

Duration: Sept’2017 – Dec’2018

Roles and Responsibilities:

Worked with Solution Architect and Lead to analyze business requirements and derive technical solutions.

Created documentation covering the MDM Party data model, source system definitions, data mapping and cleansing requirements, trust scores, and matching rule definitions.

Created landing tables and Base Object tables according to client requirements.

Defined source systems to create staging tables and defined relationships between the Base Objects.

Created mappings by using cleanse functions and configured delta detection in the staging process.

Defined the Trust and Validation rules to get the right master records.

Created match and merge rule sets for the Base Objects by defining match path components, match columns, and rules.

Configured the tokenization process in Base Objects to generate tokens for match processing.

Created match rules such as fuzzy match for probabilistic matching and exact match for deterministic matching of records in Base Objects; the sketch below illustrates the distinction.
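
Purely illustrative: MDM match rules are configured in the Hub console rather than written by hand, but the fuzzy-versus-exact distinction can be sketched in Oracle SQL (party_stg and its columns are hypothetical):

-- Exact (deterministic) match: pairs of records with identical key columns.
SELECT a.rowid_object, b.rowid_object
FROM party_stg a
JOIN party_stg b
  ON a.ssn = b.ssn
 AND a.rowid_object < b.rowid_object;

-- Fuzzy (probabilistic) match: name similarity above a threshold,
-- restricted by a blocking key to limit the candidate pairs.
SELECT a.rowid_object, b.rowid_object,
       UTL_MATCH.JARO_WINKLER_SIMILARITY(a.full_name, b.full_name) AS score
FROM party_stg a
JOIN party_stg b
  ON a.postal_code = b.postal_code
 AND a.rowid_object < b.rowid_object
WHERE UTL_MATCH.JARO_WINKLER_SIMILARITY(a.full_name, b.full_name) >= 90;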

Configured match process by using match rules to consolidate matched records.

Created queries and packages to display and update data in Base Objects.

Triggered the match and merge processes and analyzed various infrastructure tables such as VCT, VXR, STRP, MTCH, XREF, DRTY, and history tables.

Configured Batch group to execute Stage, Load, Tokenization, Match, and Merge processes as per the dependency.

Worked on Informatica IDQ for different types of data profiling, data standardization, and data rules to find data anomalies using Address Doctor and other advanced transformations.

Knowledge of User Exit interface, SIF API, IDD, and ActiveVOS.

Project: 4

Project Name: ETL 2.0 (WSS, ATOM)

Client: General Electric (GE), Bangalore, India

Technologies: Informatica PC 9.5, Informatica Power Exchange CDC, Oracle 11g, Rally

Role: Informatica ETL Developer

Duration: May’2014 – Jul’2017

Roles and Responsibilities:

Good knowledge of trading and large investment banking technologies such as the ATOM and WSS applications.

Experienced with Informatica Designer components: Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.

Experienced in using transformations such as customized Source Qualifier, Expression, Filter, Router, Aggregator, Lookup, Joiner, Union, Sequence Generator, Stored Procedure, Normalizer, Sorter, and Update Strategy.

Experienced with Informatica workflow components such as Sessions and tasks including Command, Email, Assignment, Control, Event Wait, and Decision tasks.

Prepared source system documents by using existing Informatica mapping analysis documents.

Prepared DMS (Data Mapping Sheet) documents based on the flow of the existing mappings.

Prepared ETL LLD (low-level design) documents based on the HLD and briefings from the module lead. Prepared source queries per the business logic and used them in customized Source Qualifiers.

Created capture registrations/data maps on source Oracle tables in Informatica PowerExchange CDC for application integration modules and real-time data integration. Imported CDC data maps into Informatica PowerCenter as Application Source Qualifiers.

Developed Informatica mappings for incrementally loading (delta load) stage tables, using different transformations to cleanse the data; a sketch of the typical delta filter follows.
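
For illustration, a minimal sketch of the delta-extraction pattern such a mapping’s source query typically uses; src_orders, last_update_ts, and etl_control are hypothetical names:

-- Hypothetical delta-load source query: pick up only rows changed since
-- the last successful run, as recorded in an ETL control table.
SELECT o.*
FROM src_orders o
WHERE o.last_update_ts >
      (SELECT c.last_extract_ts
       FROM etl_control c
       WHERE c.mapping_name = 'M_LOAD_STG_ORDERS');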

Worked with different modeling approaches and understood data warehousing concepts such as dimensional modeling (star schema and snowflake schema).

Developed one-to-one ETL mappings from source to staging and SCD Type 1 (SCD1) mappings for the integration layer.

Developed SCD Type 2 (SCD2) mappings for dimension and fact tables in the enterprise data warehouse (EDW) for OLAP, used by analytics reporting teams; see the SCD2 sketch below.
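
Illustrative only: the SCD2 pattern these mappings implement, expressed in plain SQL with hypothetical names (dim_customer, stg_customer); the actual Informatica mappings do this with Lookup and Update Strategy transformations:

-- Step 1: close out the current version of rows whose tracked attribute changed.
UPDATE dim_customer d
   SET d.eff_end_date = CURRENT_DATE,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
               FROM stg_customer s
               WHERE s.customer_id = d.customer_id
                 AND s.address <> d.address);

-- Step 2: insert a new current version for new and changed customers.
INSERT INTO dim_customer
       (customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
WHERE d.customer_id IS NULL
   OR s.address <> d.address;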

Followed coding standards and best practices to develop code and check code quality; shared developed code with my supervisor for review.

Prepared test cases per the requirement documents and executed them.

Performed parallel testing between existing output data and new data for mapping.

Analyzed discrepancies in the reconciliation process and fixed them.

Prepared the deployment plan to promote the code to higher environments.

Collaborated with other team members and customers during stand-up meetings.

Involved in Unix shell scripting for Informatica job execution and the FTP process.

Project: 5

Project Name: Black Box

Client Name: Cisco Systems, Bangalore, India

Technologies: Informatica 9.1, Oracle 10g, Salesforce, Dollar U Scheduler

Role: Informatica Support

Duration: Dec’2012 to Mar’2014

Roles and Responsibilities:

Worked on an Informatica support project using Informatica PowerCenter, PL/SQL, Unix, and Dollar U scheduling skills.

Well versed with all Informatica Client Components (PowerCenter Designer, Workflow Manager, Workflow Monitor, Repository Manager).

Experienced in using Informatica debug mode to triage and analyze session logs for production errors.

Worked with the support team and ensured that the application was running per the SLAs and metrics.

Communicated with stakeholders in case of delays or SLA breaches.

Responsible for 24x7 production system support, including off-hours and weekend on-call responsibilities.

Supported weekly and monthly customer report processing and monitored daily system health checks.

Involved in weekly incident/bug review meetings with the application team and in issue tracking for production issues.

Managed the incident life cycle in the Remedy tool, drove incidents to permanent resolution, and implemented preventive measures to avoid repetitive issues.

Debugged, troubleshot, conducted root cause analysis, and resolved production problems and data issues within the given SLA.

Followed up with third party vendors for issue resolution and prevention.

Supported the release of deliverables into production and held on-demand meetings with the development team on issues encountered due to production changes.

Performed custom, ad-hoc data analysis and reporting based on stakeholder requests within SLA.

Good knowledge of PL/SQL components such as cursors, procedures, functions, triggers, packages, SQL*Loader, and bulk binds; a bulk-bind sketch follows.
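
Illustrative only: a small PL/SQL block showing the bulk-bind pattern (BULK COLLECT with FORALL) mentioned above; src_txn and its columns are hypothetical:

-- Hypothetical bulk-bind example: purge transactions older than 12 months
-- in two round trips instead of row-by-row processing.
DECLARE
  TYPE t_ids IS TABLE OF src_txn.txn_id%TYPE;
  v_ids t_ids;
BEGIN
  -- Fetch all candidate keys at once.
  SELECT txn_id BULK COLLECT INTO v_ids
  FROM src_txn
  WHERE txn_date < ADD_MONTHS(SYSDATE, -12);

  -- Apply the deletes in a single bulk-bound statement.
  FORALL i IN 1 .. v_ids.COUNT
    DELETE FROM src_txn WHERE txn_id = v_ids(i);

  COMMIT;
END;
/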


