
Abhinay Pilli

Sr. Informatica Developer

Phone: 984-***-****

Email: *******.********@*****.***

PROFESSIONAL SUMMARY:

Overall 9+ years of IT experience in the analysis, design, development, and implementation of Business Intelligence solutions.

Informatica: Eight plus years of experience in data warehousing using Informatica Power Center and Informatica Power Exchange. Experience in performance tuning of sources, targets, mappings, transformations, and sessions, as well as data partitioning and session partitioning.

Data Analysis: Eight plus years of strong Business Analysis experience on Data Analysis, Business Requirement Documents, Data Migration Documents, User Requirement Gathering, User Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis, Impact Analysis Documents.

Eight plus years of experience with Oracle database. Experience with SQL, PL/SQL, SQL*Plus, SQL*Loader, Triggers and Oracle query optimization using PARALLEL HINTS, Explain Plan.

UNIX: More than six years of experience in UNIX shell scripting.

Experience in prioritizing and handling multiple tasks at any given time and estimating accurate level of effort (LOE).

Knowledge of NoSQL technologies like MongoDB, Redis, Cassandra, and DynamoDB, and of relational databases like Oracle, PostgreSQL, MariaDB, and MySQL.

Developed Python scripts to parse flat files, CSV, XML, and JSON, extract data from various sources, and load it into the data warehouse.

Worked on Data Profiling using IDE-Informatica Data Explorer and IDQ-Informatica Data Quality to examine different patterns of source data. Proficient in developing Informatica IDQ transformations like Parser, Classifier, Standardizer and Decision.

Proven experience with the development and implementation of large-scale, enterprise data warehouse solutions.

Experience in the full Software Development Life Cycle (SDLC), including Agile development using Scrum and test-driven development (TDD), the Waterfall model, and Kanban boards.

Strong understanding of the AWS product and service suite, primarily EC2, S3, VPC, Lambda, Redshift, Secrets Manager, EMR, and related monitoring services, including their applicable use cases, best practices, implementation, and support considerations.

Monitored resources and applications using AWS CloudWatch, including creating alarms on metrics for EBS, EC2, ELB, RDS, S3, and SNS, and configured notifications for the alarms generated by the defined events.

Experience in configuring, deploying, and supporting cloud services on Amazon Web Services (AWS).

Experience using automated scheduling tools like AutoSys and Control-M.

Experience working with offshore development teams and senior-level managers, coordinating with business teams, and leading project execution.

Knowledge of Dimensional Modeling, the Ralph Kimball approach, Star/Snowflake Modeling, Data Marts, OLAP, Fact and Dimension tables, and Physical and Logical data modeling.

Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.

Experience using GitHub, Bitbucket, Jenkins, uDeploy, and other version control and deployment tools for code check-in and deployment.

Diverse background with fast learning skills and creative analytical abilities with good communication and technical skills.

Good team player with flexible and adaptable approach to work.

Certifications:

AWS Certified Solutions Architect – Associate

https://www.youracclaim.com/badges/555e15d5-3bcf-407d-b7b6-24385b49960b/linked_in

Education Summary:

Bachelor’s in Computer Science (CS) from Jawaharlal Nehru Technological University, Hyderabad, 2014.

Master’s in Computer Science from SNHU, Manchester, 2016.

Technical Skills:

ETL: Informatica Power Center CDIPC, 10.x, 9.x/8.x/7.x/6.x

Informatica Tools: Informatica B2B Data Exchange, Informatica Developer

Methodologies: Star Schema, Snowflake Schema

O/S: Windows 2000/XP/7/8/10, UNIX, MS-DOS, macOS

Programming Languages: UNIX shell scripting, SQL, PL/SQL, Perl, Python

Databases: Oracle 19c/12c/11g/10g/9.x/8.x, SQL Server 2014/2012, Teradata, PostgreSQL, DB2

Cloud Environment: Snowflake on AWS, AWS RDS, AWS Aurora, Redshift, EC2, EMR, S3, Lambda, Glue, Athena, SQS, SNS, ELB, VPC, EBS, Route 53, CloudWatch, AWS Auto Scaling, AWS CLI

GUI: TOAD 16.x/12.x/9.x, SQL Developer

Scheduling Tools: Control-M, AutoSys

Reporting Tools: Tableau, OBIEE, Power BI

Other Tools: Jenkins, Quick Build, SVN, Git, Bitbucket, AWS services, uDeploy, Liquibase, Splunk

PROFESSIONAL EXPERIENCE:

Client: Fannie Mae, Richmond, VA Jan 2024 – Current

Role: Sr. Data Engineer

Responsibilities:

Worked extensively with AWS services like EC2, S3, CloudFront, SNS, SQS, RDS, IAM, CloudWatch, VPC, ELB, Auto Scaling groups, Route 53, Secrets Manager, Lambda, Step Functions, and Glue.

Designed and developed various ETL processes in AWS Glue to migrate data from external sources such as S3 text, CSV, and Parquet files into AWS Redshift.

Designed Python scripts to parse data from XML, JSON, CSV, and Parquet files and added controls to check data quality and perform cleansing before loading data to RDS.
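
As a rough illustration of that parse-and-cleanse step, the following is a minimal pandas sketch; the column names, file layout, and rejection rules are hypothetical, not the project's actual schema.

import pandas as pd

REQUIRED_COLUMNS = ["customer_id", "txn_date", "amount"]  # illustrative schema

def load_and_cleanse(path):
    """Parse a delimited feed and apply basic quality checks before loading."""
    df = pd.read_csv(path, dtype=str)

    # Reject the file outright if mandatory columns are missing.
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError("Missing required columns: %s" % missing)

    # Basic cleansing: trim whitespace, drop fully empty rows, de-duplicate.
    df = df.apply(lambda col: col.str.strip())
    df = df.dropna(how="all").drop_duplicates()

    # Format checks: amounts must be numeric, dates must parse; park bad rows.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    df["txn_date"] = pd.to_datetime(df["txn_date"], errors="coerce")
    rejects = df[df["amount"].isna() | df["txn_date"].isna()]
    if not rejects.empty:
        rejects.to_csv("rejects.csv", index=False)  # side file for review
        df = df.drop(rejects.index)
    return df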

Hands-on experience with Amazon EventBridge, AWS Step Functions, Elasticsearch, and AWS Secrets Manager.

Hands-on experience working with databases like Redshift and Aurora PostgreSQL.

Worked on daily jobs to move files from remote hosts to the S3 environment.

Designed stage jobs to pick up data files from S3 and load them into the Redshift staging layer, as sketched below.
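
A simplified sketch of such a stage load, assuming boto3 and psycopg2, an IAM role authorized for Redshift COPY, and placeholder bucket, prefix, table, and role names:

import boto3
import psycopg2

S3_BUCKET = "example-landing-bucket"                       # placeholder
S3_PREFIX = "daily/"                                       # placeholder
IAM_ROLE_ARN = "arn:aws:iam::123456789012:role/copy-role"  # placeholder

def list_landed_files():
    """List the data files landed in S3 for today's load."""
    s3 = boto3.client("s3")
    resp = s3.list_objects_v2(Bucket=S3_BUCKET, Prefix=S3_PREFIX)
    return [obj["Key"] for obj in resp.get("Contents", [])]

def copy_to_staging(conn_params):
    """Truncate the staging table, then COPY the landed files from S3 into it."""
    copy_sql = """
        COPY stg.daily_transactions
        FROM 's3://{bucket}/{prefix}'
        IAM_ROLE '{role}'
        FORMAT AS CSV
        IGNOREHEADER 1
        TIMEFORMAT 'auto';
    """.format(bucket=S3_BUCKET, prefix=S3_PREFIX, role=IAM_ROLE_ARN)
    # psycopg2's connection context manager commits the transaction on success.
    with psycopg2.connect(**conn_params) as conn:
        with conn.cursor() as cur:
            cur.execute("TRUNCATE TABLE stg.daily_transactions;")
            cur.execute(copy_sql)

The COPY-from-S3 pattern keeps the load set-based on the Redshift side rather than inserting row by row.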

Created a Python script to start and stop EC2 instances in AWS, as sketched below.
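
A minimal boto3 sketch of that start/stop utility; the region and instance IDs are placeholders.

import sys
import boto3

REGION = "us-east-1"                    # placeholder region
INSTANCE_IDS = ["i-0123456789abcdef0"]  # placeholder instance IDs

def main(action):
    ec2 = boto3.client("ec2", region_name=REGION)
    if action == "start":
        ec2.start_instances(InstanceIds=INSTANCE_IDS)
    elif action == "stop":
        ec2.stop_instances(InstanceIds=INSTANCE_IDS)
    else:
        raise SystemExit("usage: ec2_toggle.py start|stop")

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else "")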

Worked on identifying bottlenecks in existing Production jobs and performed various performance tuning strategies like Database tuning, Partitioning, Index Usage, Aggregate tables, Session Partitioning, Load Strategies, Commit intervals and transformation tuning.

Wrote complex SQL using joins, sub-queries, and correlated sub-queries to load data and help business analysts perform validations.

Participated in Failover testing and performed necessary steps to switch the jobs, services and database to contingency from production and then switch back to Production after successful testing.

Worked on application migration to Cloud from On-prem and setup Informatica services and configured jobs on Cloud.

Analyzed the system for new enhancements and functionality and performed impact analysis on the application for the anticipated ETL changes.

Used CloudWatch to monitor CPU utilization and system memory on EC2 instances.
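
For illustration, a boto3 call of the kind such monitoring relies on, here pulling average CPUUtilization for one instance (memory metrics require the CloudWatch agent and a custom namespace, so only CPU is shown; the instance ID is a placeholder):

import boto3
from datetime import datetime, timedelta

def average_cpu(instance_id, hours=1):
    """Average CPUUtilization for an EC2 instance over the last N hours."""
    cw = boto3.client("cloudwatch")
    end = datetime.utcnow()
    stats = cw.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=end - timedelta(hours=hours),
        EndTime=end,
        Period=300,                 # 5-minute datapoints
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    return sum(p["Average"] for p in points) / len(points) if points else 0.0

print(average_cpu("i-0123456789abcdef0"))   # placeholder instance ID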

Worked on cost optimization strategies such as automatically shutting down RDS and EC2 instances after business hours, applying S3 object lifecycle policies, and using ELB efficiently.

Worked on a POC for merging two applications to reduce maintenance cost and eliminate multiple data hops.

Found and resolved complex build and deployment issues during deployments and provided 24x7 support for troubleshooting and debugging production issues.

Worked on performance tuning in Informatica by implementing Bulk load strategy and using higher commit interval.

Developed complex mappings and enhanced existing mappings functionality using various transformations to implement the business logic and load data incrementally.

Worked on various Cloud maintenance activities like upgrading database versions, docker image version upgrade, manage service updates, compliance and Cloud Governance activities.

Environment: Informatica Power Center 10.6.1, Informatica CDIPC, SQL, Oracle, PL/SQL, SQL Loader, TOAD 16.x, Autosys, UNIX, AWS (S3, RDS, Glue, PostgreSQL, Redshift, CloudWatch, EC2, Step Functions, Lambda, SQS, SNS, ELB), GitHub, Jira, Ansible, Splunk, Docker, SQL*Plus, Bitbucket, Jenkins, Python.

Client: Wells Fargo, Charlotte, NC July 2022 – Dec 2023

Role: Sr. ETL Informatica Developer

Responsibilities:

Worked on various phases of SDLC from requirement gathering, analysis, design, development, testing and production migration.

Worked on Informatica Power center tools like Designer, Workflow manager, Workflow monitor and Repository manager for translating business requirements and generating data.

Extensively used various transformations like Source Qualifier, Filter, Router, Expression, connected and unconnected Lookups, Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop complex mappings in the Designer.

Extracted data from various source systems like Oracle, Teradata, SQL server and flat files as required and designed the ETL process to load data into Oracle database.

Developed the code to maintain database security by enabling auditing and DB objects utilization by users and applications.

Developed post-session and pre-session shell scripts to perform tasks such as merging flat files after creation, deleting temporary files, and renaming files as required.
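
The originals were UNIX shell scripts invoked as pre-/post-session commands; below is a Python rendering of just the merge-and-cleanup step, with placeholder paths, to illustrate the idea.

import glob
import os
import shutil

PART_PATTERN = "/data/out/orders_part_*.dat"  # placeholder session output parts
MERGED_FILE = "/data/out/orders_full.dat"     # placeholder merged target file

def merge_part_files():
    """Concatenate per-partition flat files into one file, then remove the parts."""
    with open(MERGED_FILE, "wb") as merged:
        for part in sorted(glob.glob(PART_PATTERN)):
            with open(part, "rb") as src:
                shutil.copyfileobj(src, merged)
            os.remove(part)  # delete the temporary part file once merged

if __name__ == "__main__":
    merge_part_files()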

Designed and developed Unix shell scripts as part of ETL process to compare control totals, automate the process of loading, pushing and pulling data to and from different servers.

Involved in Structural and Functional testing and code migration to production via Jenkins, Liquibase and Udeploy.

Captured end-user stories in Jira, analyzed and laid out designs in a continuous integration environment, groomed user stories with business users and analysts, and delivered user stories at the end of every sprint.

Developed mapping parameters and variables to support SQL override and dynamic parameter.

Worked with data modelers to update the logical and physical data models in Erwin for data warehouse tables.

Worked extensively on real-time Change Data Capture (CDC) using Informatica Power Exchange to load data from DB2 mainframe and VSAM files.

Worked on performance tuning to optimize session performance using partitioning, pushdown optimization, and pre- and post-load stored procedures to drop and rebuild constraints.

Created new Autosys jobs and modified existing ones to update dependencies and job schedules.

Worked on resolving failures with Jenkins Build and deployment pipelines.

Environment: Informatica Power Center 10.5.3, Informatica Power Exchange, SQL, Oracle, PL/SQL, SQL Loader, TOAD 16.x, Control-M, UNIX, AS-400 (DB2), GitHub, Jira, ALM, SQL*Plus, Bitbucket, Jenkins, Teradata.

Client: The Hartford, Charlotte, NC Jan 2021 – June 2022

Role: Sr. ETL Informatica Developer/Production Support

Responsibilities:

Designed and customized data models for the data warehouse, supporting data from multiple sources in real time. Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.

Involved in design, development and implementation of ETL Processes in Power Center. Responsible for managing, scheduling and monitoring workflow sessions.

Developed Transformation logic to cleanse the source data with any inconsistencies before loading data to Staging area which is the source for Stage tables.

Designed and implemented Informatica B2B Data Transformation to convert unstructured input data into the required output formats using B2B Data Parser, Mapper, and Serializer, executing them in Informatica PowerCenter workflows.

Involved in creating and managing Informatica B2B data Exchange objects like partners, profiles, Endpoints, workflows, event attributes and statuses based on the input file process flow requirements.

Designed and implemented Informatica B2B and Informatica B2B managed file transfer for transferring files between different partner systems.

Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in Informatica Designer.

Created PL/SQL code using cursors, triggers, and procedures to load data into the database and generated scripts for data migration and data validation.

Created a Perl script to copy files from source server to the windows NAS location for the Informatica jobs to process.

Created an Autosys job to automatically clean up older files to resolve space issues on the production server.
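
A small Python sketch of the purge step such a job could invoke; the directory and retention window are placeholders (the production job itself was scheduled through Autosys).

import os
import time

ARCHIVE_DIR = "/data/informatica/archive"  # placeholder path
RETENTION_DAYS = 30                        # placeholder retention window

def purge_old_files():
    """Delete files older than the retention window to free space."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for root, _dirs, files in os.walk(ARCHIVE_DIR):
        for name in files:
            path = os.path.join(root, name)
            if os.path.getmtime(path) < cutoff:
                os.remove(path)

if __name__ == "__main__":
    purge_old_files()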

Conducted functionality, integration, and system testing and investigated defects. Conducted root cause analysis and resolved production issues.

Performance tuning and support of ETL and database jobs.

Developed Informatica ETL code to load data into DWH tables.

Involved in creating Autosys jobs to trigger UNIX shell scripts and stored procedures.

Leading Production deployments from onsite and coordinating with offshore resources.

Used Liquibase for database component deployments, and GitHub and Bitbucket for code check-in and migration.

Environment: Informatica Power Center 10.5.3, Informatica B2B Data Exchange, Informatica Developer, SQL, Oracle, PL/SQL, SQL Loader, TOAD 16.x, Autosys, UNIX, Perl, GitHub, Liquibase, Jira, SQL*Plus, Bitbucket, Jenkins.

Client: Wells Fargo, Charlotte, NC Apr 2019 – Dec 2020

Role: Sr. ETL Informatica Developer

Responsibilities:

Worked with business analysts to understand the business requirements and created technical requirements documents.

Prepared the Technical Design according to the business requirements.

Designed and developed complex mappings using Lookup, Expression, Sequence Generator, Update Strategy, Aggregator, Router, and Stored Procedure transformations to implement complex logic.

Experience in creating complex database subprograms in PL/SQL (created Packages, Stored procedure, Triggers, Functions, Exception Handlers).

Worked with business analysts to understand the business requirements and created Technical Requirements Documents and Technical Design documents according to the business requirements.

Created and used reusable Transformation, Mapplets using Informatica PowerCenter.

Developed mappings using Informatica to load the data from sources such as Relational tables, Flat files, Oracle tables into the target data warehouse.

Contributed to and actively provided comments in user story review meetings within an Agile Scrum environment.

Configured workflows with Email tasks to send mail with the session log on session failure and for target failed rows.

Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).

Worked on Change Data Capture (CDC) using CHKSUM to handle changes in the data when no flag or date column is present to represent the changed row.
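
The idea is to hash the concatenated non-key columns of each row and compare that hash with the one stored on the target, so any attribute change is detected even without a flag or date column. A small illustrative sketch (the key and column names are hypothetical; the actual implementation used checksum logic inside the Informatica mappings):

import hashlib

def row_checksum(row, key_cols):
    """Hash all non-key columns so a change in any of them changes the checksum."""
    payload = "|".join(str(row[c]) for c in sorted(row) if c not in key_cols)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def changed_rows(source_rows, target_hashes, key_cols=frozenset({"customer_id"})):
    """Yield source rows whose checksum differs from the stored target checksum."""
    for row in source_rows:
        if target_hashes.get(row["customer_id"]) != row_checksum(row, key_cols):
            yield row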

Worked on reusable code known as tie-outs to maintain data consistency; it compares source and target after the ETL load completes to validate that no data was lost during the ETL process.

Developed mappings, transformations, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.

Used the Slowly Changing Dimensions (SCD type 2) to update the data in the target dimension tables.
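
In SCD Type 2, a changed dimension row is never overwritten: the current version is end-dated and flagged inactive, and a new current version is inserted. A schematic sketch of that decision for one business key (the effective-date and flag column names are illustrative, not the project's actual model):

from datetime import date

def apply_scd2(existing, incoming, today=None):
    """Return the dimension rows to keep for one business key under SCD Type 2."""
    today = today or date.today()
    if existing is None:
        # New key: open the first version.
        return [dict(incoming, eff_start=today, eff_end=None, current_flag="Y")]
    if all(existing.get(k) == v for k, v in incoming.items()):
        # No attribute changed: keep the current version as-is.
        return [existing]
    closed = dict(existing, eff_end=today, current_flag="N")   # expire old version
    opened = dict(incoming, eff_start=today, eff_end=None, current_flag="Y")
    return [closed, opened]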

Extensively used types of caches like static, dynamic and persistent caches while creating sessions and tasks.

Extensively used the Informatica Debugger to find problems in mappings; also involved in troubleshooting and rectifying bugs.

Environment: Informatica Power Center 10.4.1, SQL, Oracle 12c/11g, PL/SQL, SQL Loader, TOAD 9.x, Autosys, UNIX (Sun OS), OFSAA, OBIEE, Quick Build, GitHub, Jira, ALM, SQL*Plus.

Client: Bank of America, Charlotte, NC Aug 2017 – Mar 2019

Role: Sr. ETL Informatica Developer

Responsibilities:

Used features like email notifications, scripts and variables for ETL process using Informatica Power Center.

Involved in data extraction from Oracle and flat files using SQL*Loader. Designed and developed mappings using Informatica Power Center.

Extensively used Power Center to design multiple mappings with embedded business logic.

Actively involved in migrating the Informatica version, interacted with DBA, Middleware, Informatica administrators and UNIX/Linux administrators to identify and analyze the impact/risk on the upstream/downstream systems.

Involved in fixing of invalid Mappings, Performance tuning, testing of Stored Procedures and Functions, Testing of Informatica Sessions, Batches and the Target Data.

Carried out defect analysis and fixed bugs raised by users.

Played a key role in migrating Oracle 11g databases to Exadata.

Implemented security by creating User logins, Roles and granting Users access to the database per their role.

Developed new features and fixes following an Agile methodology.

Assisted Scrum team in preparing the Sprint backlog as part of Agile methodology.

User interaction, requirement analysis, functional specification, design, development and implementation.

Involved in setting up transmissions for the external sources to send their files through SFTP, NDM, and FTP.

Automated the code deployment and EC2 provisioning using Ansible and Terraform.

Have done POC on Redshift spectrum to create external tables by using S3 files.

Involved in writing Java API for Amazon Lambda to manage some of the AWS services. Have done POC on AWS Athena service.

Involved in Linux server migration from RHEL 5.11 to RHEL 7.6.

Involved in code migration from SVN to Bitbucket.

Loaded data into Amazon Redshift and used AWS CloudWatch to collect metrics and monitor AWS RDS instances.

Created external tables with partitions using Hive, AWS Athena, and Redshift.
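
For illustration, a partitioned external table over S3 files can be defined through the Athena API roughly as below; the database, table, bucket, and query result location are placeholders.

import boto3

DDL = """
CREATE EXTERNAL TABLE IF NOT EXISTS analytics.daily_txn (
    txn_id      string,
    customer_id string,
    amount      double
)
PARTITIONED BY (load_date string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 's3://example-data-lake/daily_txn/';
"""

def run_athena(sql):
    """Submit a statement to Athena (results land in the configured S3 location)."""
    athena = boto3.client("athena")
    return athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )

run_athena(DDL)
run_athena("MSCK REPAIR TABLE analytics.daily_txn;")  # register landed partitions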

Created database objects like tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, PL/SQL Developer, and SQL*Plus.

Partitioned the fact tables and materialized views to enhance the performance.

Created Informatica maps using various transformations like SAP BAPI/RFC, SAP IDOCs transformations, Web services consumer, XML, HTTP transformation, Source Qualifier, Expression, Look up, Stored procedure, Aggregate, Update Strategy, Joiner, Union, Filter and Router.

Extensively used ETL to load data from flat files, both fixed-width and delimited, as well as from the relational database, Oracle 11g.

Involved in Unit testing, User Acceptance Testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.

Environment: Oracle 12c/11g, SQL, PL/SQL, Autosys, ANSI SQL, Oracle RDBMS, WinSCP, UNIX, AWS console, Informatica Power Center v 9.6.1 & 10.2.0, SVN and GIT, Toad for Oracle 12.9, Oracle EDQ.

Client: Clean Harbors Environmental Services, Norwell, MA July 2016 - July 2017

Role: ETL Informatica Developer.

Responsibilities:

Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.

Designed ETL specification documents for all the projects.

Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.

Extracted data from Flat files, SQL and Oracle to build an Operation Data Source. Applied business logic to load the data into Global Data Warehouse.

Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.

Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.

Created IAM policies for delegated administration within AWS and configured IAM users, roles, and policies to grant fine-grained access to AWS resources, as sketched below.
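
As a sketch of the fine-grained access piece, the snippet below creates a hypothetical read-only bucket policy with boto3; the policy name, bucket, and permissions are illustrative only.

import json
import boto3

# Hypothetical policy: read-only access to a single project bucket.
POLICY_DOC = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-project-bucket",
            "arn:aws:s3:::example-project-bucket/*",
        ],
    }],
}

def create_readonly_policy():
    iam = boto3.client("iam")
    resp = iam.create_policy(
        PolicyName="ExampleProjectS3ReadOnly",
        PolicyDocument=json.dumps(POLICY_DOC),
    )
    return resp["Policy"]["Arn"]   # attach this ARN to a user, group, or role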

Improved the infrastructure design and approaches of different projects on the Amazon Web Services (AWS) cloud platform by configuring security groups, Elastic IPs, and storage on S3 buckets.

Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.

Extensively used the Add Currently Processed Flat File Name port to load the flat file name, and the contract number derived from the file name, into the target.

Worked on complex Source Qualifier queries, Pre and Post SQL queries in the Target.

Worked on different tasks in Workflow Manager such as Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, and on scheduling of workflows.

Extensively used workflow variables, mapping parameters and mapping variables.

Created sessions, batches for incremental load into staging tables and scheduled them to run daily.

Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.

Implemented Informatica recommendations, methodologies and best practices.

Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.

Involved in Unit, Integration, System, and Performance testing levels.

Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Migrated the code into QA (testing), and supported the QA team and UAT (user acceptance testing).

Created detailed Unit Test Document with all possible Test cases/Scripts.

Responsible for Query Tuning, Performance tuning Informatica jobs and support during warranty period.

Driving the Data Mapping sessions with customers and creating the detailed data mapping documents.

Involved in Data Model design sessions with DBA.

Worked as a fully contributing team member under broad guidance, with independent planning and execution responsibilities.

Environment: Informatica Power Center 9.1.0, Oracle 11g, SQL Server 2008, MS Access, Python 2.x/3.x, JavaScript, MySQL, AWS console, Windows XP, Toad.

Client: Liberty Solutions, India. July 2014 – Dec 2014

Role: Data Analyst

Responsibilities:

Developing mapping document indicating the source tables, columns, data types, transformation required, business rules, target tables, columns and data types.

Working with multiple sources such as Relational Databases, Flat files for extraction using Source Qualifier and Joiner.

Developing various mappings to load data from various sources using different transformations to store the data in target table.

Supporting the Onsite Production Team to resolve issues.

Closely working with Log files to trace for any issues.

Importing flat files from Core FTP to the Informatica server, as well as from the Informatica server to Core FTP.

Built PL/SQL procedures for applying business logic.

Communicate with the business users to understand the requirement.

Explain team goals and objectives to TM and assist the team in organizing to accomplish work.

Communicate assignments, milestones, and deadlines to the team and individuals.

Functioning as a technical expert of the team.

Review of codes and Technical Support to the Team.

Conduct System Testing and Unit Testing.

Responsible for on-time delivery with zero defects to the client.

Environment: Informatica Power Center 8.1, Oracle9i, TOAD, UNIX shell scripting, SQL Developer


