
SQL Server Business Intelligence

Location:
Lawrenceville, NJ
Posted:
August 08, 2025


AARATI NARAHARI

609-***-****

***.***@*****.***

SUMMARY:

A motivated Informatica-certified, Scrum-certified developer with over 14 years of experience in the IT industry in the analysis, design, development, and maintenance of software applications in client-server environments. Experience providing Business Intelligence solutions in data warehousing and decision support systems using Informatica Cloud (IICS), PowerCenter, and SSIS. Experience in software analysis, testing, design, development, and production support in data warehousing using Postgres, Snowflake, Oracle, SQL Server, and Teradata.

Experience developing strategies for Extraction, Transformation, and Loading (ETL) mechanisms using Informatica PowerCenter and Informatica IICS.

Experience generating dashboards for performance monitoring.

Experience in Database design and development using Postgres, Snowflake, Oracle, and SQL Server.

Experience in PL/SQL Programming.

Experience in UNIX shell scripting and automation of sessions.

Experience using Airflow, Autosys, and Tidal.

Certified Scrum Master.

Excellent problem-solving skills with a strong technical background and good interpersonal skills.

Managed GitHub repositories and permissions, including branching and tagging.

Version control systems: Git, Subversion, and Informatica versioning.

CERTIFICATIONS:

Informatica Certified Designer (Architecture and Administration, Mapping Design, Advanced Mapping Design).

Certified Scrum Master.

EDUCATION:

Bachelor of Engineering (Electronics and Communications Engineering), Osmania University, Hyderabad, India.

PROFESSIONAL EXPERIENCE:

Centers for Medicare and Medicaid Services (CGI Federal). Oct 2022 – Present

Senior ETL Developer/Production Support

Worked on an enhancement project for the Provider Enrollment, Chain, and Ownership System (PECOS). PECOS supports Medicare providers through the enrollment process by allowing users to securely and electronically submit and manage their Medicare enrollment information.

Accomplishments and responsibilities:

●Captured the existing functionality as per requirements for phase 2 implementation.

●Created and debugged JSON data to capture requirements accurately against their definitions.

●Extracted data from multiple sources, such as flat files and relational databases, and integrated the data using Informatica PowerCenter 10.5.

●Used Confluence and JIRA for tracking and documentation.

●Conducted peer reviews.

●Applied a strong understanding of data modeling, schema design, and ETL processes.

●Migrated Oracle procedures and functions to Postgres (an illustrative sketch follows this list).

●Performed performance tuning and testing of complex SQL queries.

●Provided production support for smooth execution of workflows by debugging issues and performing root-cause analysis.

●Created workflows and loaded data into the Snowflake data warehouse using Informatica IICS.

●Used Jenkins, GitHub, and Airflow for CI/CD and job orchestration.

●Coordinated with the testing team to help create scenarios for testing.

●Collaborated with the Business Analyst and Project Manager to develop, design, and schedule the jobs as needed per requirements.

●Worked on extract generation and enhancements.

●Accessed S3 buckets for file storage and retrieval.
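
For illustration, the following is a minimal sketch of the kind of Oracle-to-Postgres conversion this involved; the function, table, and column names (get_enrollment_status, provider_enrollment, enrollment_id, status) are hypothetical placeholders, not objects from the actual PECOS codebase.

-- Oracle original (PL/SQL), shown as a comment:
-- CREATE OR REPLACE FUNCTION get_enrollment_status (p_enrollment_id IN NUMBER)
--   RETURN VARCHAR2 IS
--   v_status VARCHAR2(20);
-- BEGIN
--   SELECT status INTO v_status
--     FROM provider_enrollment
--    WHERE enrollment_id = p_enrollment_id;
--   RETURN v_status;
-- END;

-- Postgres equivalent (PL/pgSQL):
CREATE OR REPLACE FUNCTION get_enrollment_status(p_enrollment_id BIGINT)
RETURNS VARCHAR
LANGUAGE plpgsql
AS $$
DECLARE
  v_status VARCHAR(20);
BEGIN
  SELECT status INTO v_status
  FROM provider_enrollment
  WHERE enrollment_id = p_enrollment_id;
  RETURN v_status;
END;
$$;

Typical concerns in such migrations include mapping Oracle types (NUMBER, VARCHAR2, SYSDATE) to Postgres equivalents (NUMERIC or BIGINT, VARCHAR, CURRENT_TIMESTAMP) and restructuring package-level code as standalone functions.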

Environment: Informatica PowerCenter 10.5, Informatica Cloud (IICS), Postgres, Snowflake, SQL, Oracle, SQL Server, Unix, Python, Jenkins, Airflow, GitHub

Bank of America. Oct 2021 – Oct 2022

Senior ETL Developer

Bank of America is an American multinational investment bank and financial services holding company. RILR and RAILS are critical applications that process compliance data. This data is processed with Informatica PowerCenter and loaded into a SQL Server data warehouse.

Accomplishments and responsibilities:

●Created data pipelines to load data and ensure data quality.

●Implemented Slowly Changing Dimensions as part of full and incremental loads into the staging area and data warehouse (an illustrative sketch follows this list).

●Worked with the testing team to help create scenarios for testing.

●Worked on implementing complex business logic through SQL procedures and functions.

●Coordinated deployments.

●Participated in code/peer reviews after performing unit testing and integration testing with the team.
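
A minimal sketch of the Type 2 Slowly Changing Dimension pattern referenced above, written for SQL Server; the table and column names (stg_customer, dim_customer, risk_rating, region) are hypothetical placeholders rather than the actual RILR/RAILS schema.

-- Step 1: expire the current dimension rows whose tracked attributes changed in staging.
UPDATE d
SET is_current = 0,
    effective_end_date = GETDATE()
FROM dim_customer AS d
JOIN stg_customer AS s
  ON s.customer_id = d.customer_id
WHERE d.is_current = 1
  AND (d.risk_rating <> s.risk_rating OR d.region <> s.region);

-- Step 2: insert a new current row for every customer that no longer has a current row
-- (covers both the changed customers expired in step 1 and brand-new customers).
INSERT INTO dim_customer (customer_id, risk_rating, region,
                          effective_start_date, effective_end_date, is_current)
SELECT s.customer_id, s.risk_rating, s.region, GETDATE(), NULL, 1
FROM stg_customer AS s
LEFT JOIN dim_customer AS d
  ON d.customer_id = s.customer_id AND d.is_current = 1
WHERE d.customer_id IS NULL;

Expiring changed rows before the insert means only customers without an open (is_current = 1) row receive a new version, so rerunning the load does not duplicate unchanged keys.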

Environment: Informatica PowerCenter 10.4, SQL Server, Unix, Bitbucket, GitHub.

Girl Scouts of the USA, New York. Oct 2019 – Sept 2021

Senior ETL Developer

Girl Scouts is the preeminent leadership development organization for girls. As part of the Data Engineering team, data from different sources, such as Salesforce, Opsuite, NetSuite, Google Analytics, and Litmos APIs, is extracted, transformed, and loaded into the warehouse, which resides in Snowflake. The data is then presented to business users through Looker Explores.

Accomplishments and responsibilities:

●Captured the existing SSIS functionality per requirements for migration to Informatica.

●Worked on designing and coding the mappings and mapplets as per the requirements.

●Eliminated redundant data storage to enhance the performance of the existing code.

●Used Informatica IICS (Cloud services) to speed up data loads into Snowflake (an illustrative sketch follows this list).

●Resolved business user issues in Looker and traced them back to source systems as needed.

●Wrote design documents translating business needs into technical specifications.

●Used transformations and mapplets to implement business requirements efficiently.

●Worked with Informatica support on various issues with the tool.

●Managed Informatica permissions and privileges as an admin.

●Used Git for versioning, including code pushes, pulls, and merges.
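
For illustration only: the production loads ran through Informatica IICS, but the equivalent Snowflake load step looks roughly like the sketch below. The stage, path, and table names (@raw_stage, salesforce/, stg_salesforce_contacts) are hypothetical placeholders.

-- Bulk-load staged CSV files into a Snowflake staging table.
COPY INTO stg_salesforce_contacts
FROM @raw_stage/salesforce/
FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
ON_ERROR = 'ABORT_STATEMENT';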

Environment: Informatica Cloud/IICS, Snowflake, Looker, SQL Server, Salesforce, Google Analytics.

Albridge Solutions (Bank of New York Mellon). Oct 2008 – Aug 2019

Lead Developer/Production Support Specialist/Data Research Analyst

Albridge is a data management and financial technology solutions provider for financial services companies. Worked as a Data Research Analyst analyzing issues raised by clients and proactively identifying and fixing existing data anomalies. As a developer and production support specialist, supported many applications, including front-end reports, replication feeds, data warehouse loading from an ODS (Operational Data Store), and extraction processes that feed downstream systems.

Accomplishments and responsibilities:

Worked with the business to understand requirements and created the data model accordingly. Wrote the technical requirements for several implementations and enhancements.

Worked on the design and architecture of the implementation of disaster recovery.

Made performance enhancements at the database and mapping levels by tuning mappings and sessions for all processes.

Analyzed and supported several version upgrades of Informatica 7.x/8.6.1/9.5.1/10.1.1/10.2 HF1.

Guided team with code consolidation, deployment, and troubleshooting of production issues.

Designed a technical reconciliation key to trace the source record in all the target systems.

Created mappings using transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to implement business logic.

Analyzed and supported Oracle upgrade from 11g to 12c.

Loaded data into Teradata database.

Wrote several PL/SQL procedures, functions, and packages to implement the design.

Supported the SVN to Git conversion for the entire code base.

Worked as production support for data loads to provide timely execution of the daily and monthly loads.

Managed GitHub repositories and permissions, including branching and tagging.

Worked on enhancements to several reports to monitor performance.

Designed the batch control and error capture data models (an illustrative sketch follows this list).

Prepared structured data flow diagrams and simulated source files for development purposes.

Worked on shell scripts to support the execution of Informatica workflows and processes.

Participated in peer reviews to approve the functional requirements.

Conducted design reviews for the code.

Collaborated with business analysts to support their data warehousing needs.

Worked as a production support analyst to analyze and resolve the root causes of issues.

Collaborated with several teams, such as business analysts, DBAs, and system administrators, to meet objectives on time.

Worked with SFTP servers for file processing.
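
A minimal sketch of the kind of PL/SQL error-capture routine that supports the batch control and error capture models mentioned above; the etl_error_log table and its columns are hypothetical placeholders.

-- Record an ETL error without disturbing the calling load's transaction.
CREATE OR REPLACE PROCEDURE log_etl_error (
  p_batch_id  IN NUMBER,
  p_process   IN VARCHAR2,
  p_error_msg IN VARCHAR2
) IS
  PRAGMA AUTONOMOUS_TRANSACTION;  -- keep the error row even if the load rolls back
BEGIN
  INSERT INTO etl_error_log (batch_id, process_name, error_msg, logged_at)
  VALUES (p_batch_id, p_process, p_error_msg, SYSDATE);
  COMMIT;
END log_etl_error;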

Environment: Informatica 10.2 HF1, Oracle 9i/10g/11g/12c, Teradata, Linux, Tidal, SVN, Git

Lehman Brothers, Jersey City, NJ. Sep 2007 – Sep 2008

Senior ETL Analyst/Developer

The objective of the DC project was to acquire the account activity data that is valued and classified by strategy, the account configuration data that defines the scope of activity to be processed, and the referential data required for activity enrichment from various WMR and non-WMR sources, and to transform it into the format required by the Performance Reporting (PR) and Client Review (CR) components.

Accomplishments and responsibilities:

● Designed and developed ETL technical requirements for activity adapters (trades, journals, holdings, and transactions), FX rates, and accrual services from functional requirements.

● Developed mappings and workflows to extract data from various sources, such as DB2 and mainframe flat files, into the report data store to analyze the data as part of input analysis.

● Worked on input analysis to determine the quality of source systems and SLA requirements.

● Analyzed the source data from Reuters, Lipper, Morningstar, and ADP.

● Worked on technical requirements for Alternative Investments (Private Equity and Hedge Funds).

● Worked on UNIX scripts for request and response mechanisms from various source systems, such as ESM (Enterprise Security Master), TMS (Transaction Management System), ADP, RDS (Reporting Data Store), and Lipper.

● Developed ETL support documents and scheduled workflows in Autosys.

● Used Business Objects (XI, 6.x and earlier), with experience in Supervisor, Designer, Reporter, BCA Server (Publisher and Scheduler), Web Intelligence 2.5/2.6.1/6.0, InfoView, Application Foundation, and schema design.

● Worked on designing and building Business Objects universes, classes, and objects for the business requirements.

● Developed mappings using web services and XML components.

● Worked on product master data as part of reference data management to provide one version of the truth across all applications enterprise-wide, since product data is sourced from many internal and external source systems and products are identified differently in each source system (e.g., by SEDOL or CUSIP); an illustrative query follows.
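
To illustrate the idea, a hypothetical cross-reference lookup is sketched below; the table and column names (product_master, product_xref, identifier_type, identifier_value) and the bind variables are placeholders, not the actual reference data model.

-- Resolve a source system's product to the master product by SEDOL or CUSIP.
SELECT m.master_product_id, m.product_name
FROM product_master m
JOIN product_xref x
  ON x.master_product_id = m.master_product_id
WHERE (x.identifier_type = 'SEDOL' AND x.identifier_value = :src_sedol)
   OR (x.identifier_type = 'CUSIP' AND x.identifier_value = :src_cusip);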

Environment: Informatica PowerCenter 7.1/8.1, DB2 UDB, SQL, Business Objects XI/6.5, Windows NT/2000, Sun Solaris 5.8, PVCS, DBArtisan, UNIX-AIX 4.3, Autosys

UBS. Dec 2006 – Aug 2007

ETL/Oracle Developer

UBS is one of the world’s leading investment banking and securities businesses, serving a wide range of clients, including corporations, governments, hedge funds, financial sponsors, private equity firms, banks, brokers, and asset managers worldwide. The main objective of this project was to populate EUE, an Operational Data Store that calculates entitlements for SOX-relevant applications from user entitlement data. These data feeds come from different legacy systems and relational databases, often in nonstandard formats. The data is cleansed and transformed per the business rules and populated into the data warehouse (GERS). GERS is a shared data warehouse that integrates data from various global authorization systems in UBS to provide entitlement information for reporting purposes.

Accomplishments and responsibilities:

● Cleansed, transformed, and loaded data into ODS.

● Created several triggers and packages to load data into subsequent systems (an illustrative sketch follows this list).

● Used Business Objects to create universes with objects and classes.

● Built reports to project data from different perspectives using slicing-and-dicing and drill-down methodologies.

● Tuned SQL queries for performance.
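
A minimal sketch of the kind of trigger used to feed downstream loads; the trigger, table, and column names (trg_user_entitlement_feed, user_entitlement, entitlement_feed_queue) are hypothetical placeholders, not the actual EUE/GERS schema.

-- Queue changed entitlement rows for downstream processing.
CREATE OR REPLACE TRIGGER trg_user_entitlement_feed
AFTER INSERT OR UPDATE ON user_entitlement
FOR EACH ROW
BEGIN
  INSERT INTO entitlement_feed_queue (user_id, application_id, queued_at)
  VALUES (:NEW.user_id, :NEW.application_id, SYSDATE);
END;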

Environment: Informatica PowerCenter 7.1, PowerExchange Navigator, Oracle 10g/9i, PL/SQL Developer 5.1.4, shell scripting, Business Objects 6.5, UNIX-AIX 4.3


