
Business Intelligence Sql Developer

Location:
Irving, TX, 75063
Posted:
April 08, 2025

Contact this candidate

Resume:

Pasha Mohammed

Ph# 913-***-****

Email: *******.****@*****.***

Sr. ETL Developer, SQL PL/SQL Developer

Summary

Data Warehousing and Business Intelligence professional with 18 years of experience in the IT industry, working with industry-leading tools and technologies including Informatica PowerCenter, Informatica PowerExchange, Snowflake, IICS, Informatica IDMC, Oracle SQL, PL/SQL, and Unix shell scripting.

Used OBIEE (Oracle Business Intelligence Enterprise Edition) to gather, store, and visualize enterprise data, enabling organizations to deliver data to decision-makers in a timely and understandable manner.

Created DAC (Data Warehouse Administration Console) execution plans, which act as a framework to run Informatica ETL routines using pmcmd commands. Involved in migrating DAC metadata from lower environments to higher environments.
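A DAC execution plan ultimately shells out to pmcmd to start each PowerCenter workflow. As an illustrative sketch (the service, domain, folder, and workflow names below are hypothetical), the command line can be assembled like this:

```python
import subprocess

def build_pmcmd_startworkflow(service, domain, user, password, folder, workflow):
    """Assemble the pmcmd command line used to kick off a PowerCenter
    workflow; DAC execution plans issue commands of this shape."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,    # Integration Service name
        "-d", domain,      # Informatica domain name
        "-u", user,
        "-p", password,
        "-f", folder,      # repository folder holding the workflow
        "-wait",           # block until the workflow finishes
        workflow,
    ]

cmd = build_pmcmd_startworkflow("IS_PROD", "Dom_ETL", "etl_user", "secret",
                                "REWARDS", "wf_load_rewards")
# subprocess.run(cmd, check=True)  # uncomment on a host where pmcmd is installed
```

Building the argument list first keeps credentials and workflow names out of shell quoting, and the `-wait` flag lets the caller detect workflow failure from the exit code.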

Good exposure to Informatica IDMC, which is used for Data Integration, Data Catalog, Data Quality, Data Governance, APIs, Data Preparation, and Data Marketplace in cloud environments.

Implemented a CI/CD pipeline for a web application using Jenkins and Git.

Good experience developing Python scripts that are called from IICS pre-processing or post-processing commands based on requirements.
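As a minimal sketch of such a pre-processing script (the size threshold and exit codes here are illustrative choices, not an IICS requirement), the script can gate the mapping task by exiting non-zero when the source file is missing or empty:

```python
import os
import sys

def preprocess_check(path, min_bytes=1):
    """Gate an ETL task: return a non-zero code when the source file
    is missing or empty, so the task fails fast on bad input."""
    if not os.path.isfile(path):
        return 1  # missing file
    if os.path.getsize(path) < min_bytes:
        return 2  # empty (or too small) file
    return 0      # safe to proceed

if __name__ == "__main__" and len(sys.argv) > 1:
    sys.exit(preprocess_check(sys.argv[1]))
```

IICS treats a non-zero exit from a pre-processing command as a task failure, so the mapping never runs against a missing or truncated feed.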

Good understanding of JavaScript, XML, Java basics, SQL, and MDM (Master Data Management) concepts.

Performed the Informatica Admin role: user management, production and lower-environment code deployments, monitoring workflows, and recovering and restarting workflows in case of failures.

Created ETL mappings in IICS (Informatica Cloud) using mapping assets to support business decisions.

Developed procedures, functions, and packages using Oracle PL/SQL and MS SQL Server.

Expert in transforming complex business logic into database designs, implemented through stored procedures, views, and T-SQL scripting.

Performance Tuning of Informatica mappings and SQL queries.

Experienced in using Informatica Admin console for creating repository services, recycling services, creating users and assigning privileges and permissions.

Technical Skills

ETL

Informatica Intelligent Cloud Services (IICS),

Informatica PowerCenter 10.2 and 10.5,

Informatica Power Exchange,

Snowflake,

Informatica IDMC,

OBIEE/OBIA Business Intelligence tools,

DAC (Data Warehouse Administration Console) and

Python programming techniques

Databases

Oracle 14g/13g,

MS SQL Server 2008/2005/2000,

PostgreSQL and

IBM DB2

Programming

SQL, PL/SQL, Unix Shell scripting

Data modelling

Erwin Tool

Operating Systems

UNIX, Windows, MS-DOS

Version Control

Git, GitLab, Azure Git

Project Management

JIRA

Cloud

AWS, Azure and Google Cloud

Education

B. Tech (Bachelor of Technology) from Jawaharlal Nehru Technological University, Hyderabad, India

Experience Details:

Organization

Designation

Duration

CGI Technologies And Solutions Inc.

Sr. Informatica Developer

Jun/2022 – Present

Mastech Digital Systems Inc., USA

Sr. Informatica Developer

Feb/2021 – Jun/2022

Tech Mahindra Limited, USA

ETL Lead Developer

July/2012 - Feb/2021

Mphasis, an HP Company

Software Developer

Jan/2007-Jul/2012

Professional Experience

Client : PNC Bank Feb 2021 – Present, Pittsburgh, PA

Project : Rewards Agility Loyalty Role: Sr. Informatica Developer

Project Details:

This is a technology modernization project for one of the leading banks in the US, which aims to modernize the current (legacy) Rewards platform into a modern microservices-based system using a container-based platform and the Micron CI/CD pipeline. The system aims to enhance customer experience by making rewards enrollment and redemptions across various programs real-time with the new vendor's Agility platform.

Responsibilities:

Design and develop Informatica ETL mappings and workflows to read the data from Mainframe source system and generate AL files for Epsilon (Third party vendor)

Consider existing OBIEE Oracle jobs and prepare design and implementation documents to convert the same programs to IICS Informatica mappings and tasks.

Develop ETL workflows to read reject files created by Epsilon and feed the data back into the Mainframe ATZ system, which is the actual source system.

Prepare DAC commands to run through ETL post-session commands.

Develop Unix shell scripts to read the latest files from the source system and rename them to the fixed data file name used in the ETL data flow, archiving the old files once processed.
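A minimal Python sketch of that stage-and-archive step (the file names and pattern are hypothetical; the production scripts were Unix shell):

```python
import glob
import os
import shutil

def stage_latest(src_dir, pattern, staged_name, archive_dir):
    """Pick the newest file matching `pattern`, rename it to the fixed
    name the ETL mapping reads, and archive any older matches."""
    matches = sorted(glob.glob(os.path.join(src_dir, pattern)),
                     key=os.path.getmtime)
    if not matches:
        return None                      # nothing to stage
    os.makedirs(archive_dir, exist_ok=True)
    for old in matches[:-1]:             # everything but the newest
        shutil.move(old, os.path.join(archive_dir, os.path.basename(old)))
    staged = os.path.join(src_dir, staged_name)
    os.replace(matches[-1], staged)      # rename newest to the fixed name
    return staged
```

Renaming to a fixed name means the Informatica source file definition never changes, while the timestamped originals stay recoverable in the archive directory.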

Establish connectivity from Snowflake to AWS and GCP

Used Snowflake to read AWS and GCP data files from the S3 bucket and load them into the Snowflake Rewards warehouse. Created Snowpipe pipes to pick up the files automatically and load them into the Snowflake warehouse.

Perform unit and integration testing based on technical and functional specification for each feed

Performance testing to analyze how fast data is read from the source system and how fast the AL file is generated.

Develop Mainframe CA7 jobs to schedule ETL Informatica workflows to run at specified time.

Active involvement in production deployments and post-production validations. Used Artifactory and Repository Manager tools for code migration to higher environments.

Client : Swiss Re Management (US) Corporation Mar 2018 – Feb 2021, Kansas, KS

Project : Finance MDM using EBX Role: Sr. Informatica / SQL Developer

Project Details:

The Finance MDM application offers master data management, developed in the EBX software product, mandated by Finance domain users and dependent systems. Finance MDM co-exists with Group MDM data, adhering to the basic principle that Group MDM is the first choice of consumption for Finance master data. The function of Finance MDM is to provide services such as managing, integrating, and distributing master source data. The application provides bi-temporality of master data, seamless integration with Group MDM, transparency in the distribution of master data by integrating with Group MDM, and a single consumption platform for various Finance applications and users. The application is scalable to on-board master data of any type from various Swiss Re sources, such as ATLAS and CorFIT, as long as it adheres to the core principles.

Responsibilities:

Design and develop an Operational Data Store by capturing real-time data from IBM DB2 using Informatica PowerExchange and applying business logic using various transformations.

Skilled in Informatica mappings using Source Qualifier, Lookup, Router, Normalizer, Joiner, Update Strategy, Aggregator, Stored Procedure, XML generator and SQL transformation.

Prepared ETL design documents by analyzing and reviewing complex business needs and functional requirements.

Implement SCD Type 2 logic in Group MDM objects using PowerCenter ETL jobs: expire the old record with the current date and insert the active record with an infinite end date.
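The expire-and-insert pattern can be sketched outside PowerCenter as well; this minimal example uses SQLite as a stand-in warehouse (the table and column names are hypothetical):

```python
import sqlite3

def scd2_upsert(conn, key, value, load_date):
    """Expire the current row for `key` when its value changed
    (end_date = load_date, is_current = 0), then insert a new active
    row with an open-ended end date -- classic SCD Type 2."""
    cur = conn.cursor()
    cur.execute(
        """UPDATE dim_customer
           SET end_date = ?, is_current = 0
           WHERE cust_key = ? AND is_current = 1 AND value <> ?""",
        (load_date, key, value))
    expired = cur.rowcount
    active = cur.execute(
        "SELECT 1 FROM dim_customer WHERE cust_key = ? AND is_current = 1",
        (key,)).fetchone()
    if expired or active is None:   # value changed, or brand-new key
        cur.execute(
            """INSERT INTO dim_customer
               (cust_key, value, start_date, end_date, is_current)
               VALUES (?, ?, ?, '9999-12-31', 1)""",
            (key, value, load_date))
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    cust_key TEXT, value TEXT, start_date TEXT,
    end_date TEXT, is_current INTEGER)""")
scd2_upsert(conn, "C1", "GOLD", "2024-01-01")      # new key -> insert
scd2_upsert(conn, "C1", "GOLD", "2024-02-01")      # unchanged -> no-op
scd2_upsert(conn, "C1", "PLATINUM", "2024-03-01")  # changed -> expire + insert
```

An unchanged value matches no rows in the UPDATE and an active row already exists, so nothing is inserted; only a genuine change grows the dimension's history.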

Involved in troubleshooting production issues and collecting performance data of transformations; optimized and tuned ETL processes at the design, mapping, workflow, and database levels. Implemented pushdown optimization and partitioning to improve performance.

Created indexes, partitions, functions, procedures, triggers, and packages using PL/SQL in Oracle. Experienced in performance tuning queries using explain plans, identifying performance bottlenecks so that appropriate indexes, partitions, and optimizer hints are applied.
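That tuning loop (inspect the plan, add an index, confirm the optimizer uses it) can be illustrated with SQLite standing in for Oracle's explain plan (the table and index names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, amt REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, 10.0) for i in range(1000)])

query = "SELECT * FROM orders WHERE cust_id = 42"

# Without an index, the optimizer falls back to a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]

conn.execute("CREATE INDEX ix_orders_cust ON orders (cust_id)")

# With the index in place, the same query becomes an index search.
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()[0][3]
```

The same workflow applies in Oracle with `EXPLAIN PLAN FOR ...` and `DBMS_XPLAN.DISPLAY`: the plan, not the runtime, is the first evidence that a bottleneck is fixed.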

Developed analytical reports for business planning and reviews.

Performed the Informatica Admin role: user management, production and lower-environment code deployments, monitoring workflows, and recovering and restarting workflows during failures.

Client : Swiss Re Management (US) Corporation Aug 2016 – Feb 2018, Kansas, KS

Project : NACA (North America Cash Allocation System) Role: Sr. Informatica Developer

Project Details:

In Swiss Re, NACA enables the user to analyze customer information based on the different payment options, allocate cash, and update journal entries. NACA is the only system available that stores document images, with the ability to view the check image from an in-house repository. NACA has another data feed called Lockbox, which receives data from Bank of America in the form of batches.

Responsibilities:

NACA is a cash allocation system holding financial information from different banks; maintained data security while feeding data into the Oracle database and archiving image and data files using Informatica PowerCenter workflows.

Identified data issues where the wrong check image was tied to a different data element, and resolved incidents on a priority basis by connecting the right image with the right data.

Handled missing JE20s and JE30s (the credit and debit amounts of vendor checks and wires) on a priority basis, getting the missing data loaded into the system within the expected time.

Improved system performance by running jobs in off hours, making data available to end users during working hours of the day.

Developed Informatica mappings and workflows to create a new feed for Bank Fees automation transactions coming from Robotics (GIGI) into NACA.

Queried the NACA Oracle database to identify data quality issues and reported them to the upstream data feed delegates to fix quality issues raised while generating complex reports.

Created analytical reports for business planning and review. Collaborate and work with business analysts and data analysts to support their data warehousing and data analysis needs.

Worked on existing Informatica workflows for performance tuning and query optimizations

Provided on-call production support.

Client : InVentiv Health Care Feb 2015 – Jul 2016, New York, NY

Project : InVentiv SFDC Dev and Support Role: Sr. Informatica Developer

Project Details:

The inVentiv Health Clinical Division is a leading provider of global drug development services to pharmaceutical, biotechnology, generic drug, and medical device companies, offering therapeutically specialized capabilities for Phase I-IV clinical development, bioanalytical services, and strategic resourcing from a single clinical professional to an entire functional team. With 7,000 passionate employees, the inVentiv Health Clinical Division works to accelerate high-quality drug development programs of all sizes around the world.

This project's frontend application is built in Salesforce and maintains the data of Patients, Practitioners, Facilities, Applications, and Orders objects. Data migration jobs run to migrate data from different source systems (Oracle, flat files, and SFDC) to load the above objects.

Responsibilities:

Involved in Data migration from Oracle to Salesforce by developing Informatica Cloud mappings

The Health Insurance Portability and Accountability Act (HIPAA) sets the standard for protecting the sensitive patient data used in the frontend portal.

Understood the Salesforce data model and implemented solutions to load related objects using upsert or insert operations.

Used Informatica Cloud to verify the source data in Oracle before loading it into Salesforce.

Created Data Synchronization tasks in Informatica Cloud to sync data between source and target.

Performed unit and integration testing on loaded objects and resolved issues raised by the client team.

Resolved data load and data synchronization issues raised in Salesforce, ensuring data is loaded into Salesforce without warnings or rejected records.

Scheduled Informatica Cloud jobs, monitored the logs, and identified any missing-data or data-mismatch issues.

Worked closely with the client team to gather requirements and develop the mappings accordingly.

Client : Con-way Aug 2012 – Feb 2015, Portland, OR

Project : Con-way Freight BIDM Role: Sr. Informatica Developer

Project Details:

Con-way Freight is the premier provider of reliable, regional and nationwide LTL (Less-than-truckload) service to customers all over the world. In addition to world-class transportation performance, it offers exceptional customer service at every level, supported by industry professionals and state-of-the-art processes and technology that save time and ensure consistent, exception-free shipping. The Con-way Freight Enterprise Business Intelligence project involves creating data models, moving data from the point of origin into the Data Warehouse using Informatica (ETL tool), and collaborating on MicroStrategy and Business Objects reporting analysis.

Responsibilities:

Gathered business scope, technical requirements, and technical specifications of ETL jobs.

Participated in all phases of the software development life cycle (SDLC) and prepared LLD and HLD design documents.

Architected highly optimized ETL solutions end to end and conducted a review meeting with technical SMEs.

Developed complex mappings in Informatica to load the data from various sources using different transformations like Source Qualifier, Look up (connected and unconnected), Expression, Aggregate, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router transformations

Fine-tuned existing Informatica mappings for performance optimization using pushdown optimization, concurrent run method, partitioning, persistent cache mechanism in lookups.

Developed slowly changing dimensions of all varieties, including Type I, Type II, and Type III.

Analysed existing system and created source to target profiles and business documentation on the enhancements required

Adjusted/created conceptual and physical dimensional data models aligned with business requirements using the Erwin tool, and generated DDL scripts.

Conducted the Unit testing, UAT and System testing of the ETL processes.

Handled the migration process from development, test and production environments using Deployment groups, Xml import/export and GitHub.

Client : Dollar Thrifty Automotive Group Jan 2007 – Jul 2012, Tulsa, OK

Project : DTAG Data Warehouse Role: Informatica Developer

Project Details:

DTAG is a car rental business. It gets rentals and reservations from several locations and also from third-party agents. This data comes in the form of flat files; using Unix and Informatica, it is loaded into the Data Warehouse and from there passed down to downstream applications such as Commission/Loyalty, TIPS, TLS, and the data marts.

The primary purpose of the DTAG Data Warehouse is to maintain historical data, including rentals, reservations, and customer data. It contains rental information for both the Dollar and Thrifty brands, including Licensee rentals. It stores all reservations data, using files from Trips and Quick Keys as sources, along with all customer-related information and the Blue-chip and Dollar Express profile data. Data from different applications is analyzed through dashboard apps using the warehouse as a source.

Responsibilities:

Involved in production support of the Oracle and Thrifty warehouses.

Involved in project enhancements; analyzed the business rules and implemented them in mapping designs.

Created Informatica mappings using different sources (flat files, Oracle) and most of the common transformations: Source Qualifier, Aggregator, Lookup, Filter, Rank, and Joiner.

Used workflow manager to create and manage sessions and batches and to configure session connections.

Worked extensively with complex mappings to develop and feed data marts.

Configured the workflows to send e-mail notifications on suspension or success.

Involved in performance tuning of ETL.

Environment: Informatica PowerCenter, Oracle SQL, PL/SQL, PuTTY, GitHub, SQL Developer, Unix, Unix shell scripting, SQL Server

Domains worked

Banking

Insurance

Healthcare

Logistics

Distribution and Fulfilment
