
Sql Server Informatica Developer

Location:
Avon, CT
Posted:
January 02, 2024


PRRAVEENA RAJASEKARAN

ETL Informatica Developer

Email: ad2d9d@r.postjobfree.com Cell: +1-860-***-****

SUMMARY OF QUALIFICATIONS:

•Six years of experience in ETL development spanning the entire development life cycle, from requirements gathering and design through implementation, testing, and production deployment, with a comprehensive understanding of each phase. Experienced with a range of databases, including Oracle and SQL Server, and with building integrations for enterprise data warehouse applications in the healthcare insurance and financial domains.

•Performed a proof of concept (POC) on IICS as part of an Informatica PowerCenter to Informatica Cloud conversion in the current project; created IICS mappings using Source, Target, Lookup, Expression, and Filter transformations.

•Experience writing SQL using analytical functions such as RANK and CASE expressions in Oracle 12c/11g and SQL Server 2017.

•Excellent experience in the design and development of ETL methodology using Informatica PowerCenter 10.4/9.6/8.6.

•Interacted with end users to identify requirements, develop the Business Requirement Document (BRD), and transform it into technical requirements.

•Experience in debugging and performance tuning of targets, sources, mappings, and sessions in Informatica.

•Worked with the Change Data Capture methodology using Informatica PowerCenter 10.x/9.x.

•Prepared various project documents, including requirements-gathering documents, ETL specifications, data mappings, test cases, and a data dictionary.

•Expert in writing SQL queries against multiple databases, including Oracle and SQL Server.

•Performed unit, system, and integration testing, supported users during UAT, and prepared test reports in different phases of the project.

•Experience with Data Extraction, Transformation, and Loading from Multiple Sources.

•Good understanding of the Data modeling concepts like Star-Schema Modeling, Snowflake Schema Modeling.

•Experience in optimizing mappings and implementing complex business rules by creating reusable transformations, Mapplets, and lookups.

•Extensively used the Slowly Changing Dimension (SCD) technique in business applications.

•Experience in Performance tuning of ETL process using pushdown optimization and other techniques.

•Working experience in Agile and Waterfall methodologies.

•Excellent communication skills with the ability to communicate effectively with executive and management teams; strong analytical and problem-solving skills.

•Strong skills in data analysis, problem-solving, and team collaboration; have a proven ability to work effectively in a fast-paced environment, manage multiple priorities, and meet tight deadlines while maintaining quality output.
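As an illustration of the analytical SQL mentioned above (RANK and CASE), the sketch below runs a window-function query against an in-memory SQLite table standing in for Oracle/SQL Server; the table and column names are hypothetical.

```python
import sqlite3

# In-memory SQLite stands in for Oracle/SQL Server; its window functions
# (SQLite 3.25+) behave like the RANK analytic in those databases.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (member_id TEXT, amount REAL)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [("A", 500.0), ("A", 120.0), ("B", 900.0)])

rows = conn.execute("""
    SELECT member_id,
           amount,
           RANK() OVER (PARTITION BY member_id ORDER BY amount DESC) AS amt_rank,
           CASE WHEN amount >= 500 THEN 'HIGH' ELSE 'LOW' END AS tier
    FROM claims
    ORDER BY member_id, amt_rank
""").fetchall()
print(rows)
```

RANK is computed per member here; on Oracle the same query text works unchanged apart from the connection setup.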

Education and certifications:

•Master of Business Administration, Bharathiar University, India

Technical Skills:

ETL TECHNOLOGY

Informatica PowerCenter 10.4/9.x, Informatica Intelligent Cloud Services - IICS

DATABASES

Oracle 12c/11g, MS SQL Server 2017/ 2014

OTHERS

TOAD, SQL, Rally, JIRA, Jenkins, Airflow, MS Office

Professional Experience:

Client: Bank Of America, Charlotte, NC May 2023 to Current

Role: Application Programmer / ETL Informatica Developer

Bank of America is one of the largest financial institutions in the United States, providing a wide range of banking and financial services to individuals, businesses, and institutional clients. Bank of America serves millions of customers, offering them access to various financial products and solutions like business banking, mortgages, loans, investment management, and wealth management.

The objective of this project is to convert Informatica and SQL scripts embedded within Perl scripts to Python scripts for Bank of America's Corporate Investments Data Warehouse (CIDW). The migration process will be facilitated through an in-house tool called DH360. The converted Python scripts will then be deployed using Jenkins, while the ETL (Extract, Transform, Load) orchestration will be managed through Airflow.

Responsibilities:

•Converted Informatica and SQL scripts by understanding the logic and functionality of the existing scripts and rewriting them so they can be embedded in a Python wrapper.

•Integrating Python scripts with DH360, an in-house migration tool, to facilitate a streamlined migration process.

•Conducting rigorous testing and validation, including unit, integration, and end-to-end testing, to ensure the accuracy and performance of the migrated scripts within the CIDW environment.

•Deploying converted DH360 scripts using Apache Airflow and ensuring their availability for execution in the production environment.

•Orchestrating ETL workflows for the CIDW using Apache Airflow, including scheduling, dependencies, and execution of Python scripts.

•Documenting the migration process through comprehensive documentation, creating step-by-step instructions to capture data lineage.

•Participating in daily scrum calls to provide updates on progress, discuss any challenges, and align with the team on the overall project goals.

•Collaborating with team members during sprint planning sessions to prioritize tasks, assign responsibilities, and estimate efforts for timely completion.

•Engaging in regular code reviews to ensure code quality, share knowledge, and provide constructive feedback to fellow team members.

•Contributing to sprint retrospectives, offering insights and suggestions for process improvement, team productivity, and efficiency.

•Conducting demos at the end of each sprint to showcase completed work, gather feedback from stakeholders, and incorporate necessary adjustments into subsequent iterations.

•Collaborating with cross-functional teams, such as database administrators and quality assurance, to address dependencies, resolve issues, and ensure smooth execution of the migration project.

•Actively participating in team discussions and knowledge-sharing sessions to exchange ideas, leverage collective expertise, and foster a collaborative work environment.
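The Perl-to-Python conversion pattern described above can be sketched as a small Python wrapper that executes a SQL step with transaction handling and logging. This is a simplified illustration only: SQLite stands in for Oracle, and the step and table names are made up, not the bank's actual DH360 interfaces.

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("cidw_step")

def run_sql_step(conn, name, statements):
    """Run one converted ETL step: SQL that was previously embedded in a
    Perl script, now executed inside a Python wrapper with logging."""
    log.info("starting step %s", name)
    with conn:  # one transaction per step, rolled back on failure
        for stmt in statements:
            conn.execute(stmt)
    log.info("finished step %s", name)

# Illustration against an in-memory SQLite database; the production target
# was Oracle, and these table names are hypothetical.
conn = sqlite3.connect(":memory:")
run_sql_step(conn, "load_positions", [
    "CREATE TABLE positions (acct TEXT, qty INTEGER)",
    "INSERT INTO positions VALUES ('X1', 100)",
])
print(conn.execute("SELECT COUNT(*) FROM positions").fetchone()[0])
```

Wrapping each step in one transaction mirrors the all-or-nothing behavior the original Perl-driven scripts relied on.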

Environment: Informatica PowerCenter 10.5, SQL, Oracle 12c, TOAD, Airflow, Jenkins, Jira, Autosys.

Client: Aetna, a CVS Health Company, Hartford, CT Mar 2021 to May 2023

Role: ETL Informatica Developer

Aetna is one of the nation's leading diversified health care benefits companies, serving an estimated 46.7 million people with information and resources to help them make better decisions about their healthcare.

The objective of this project is to enrich data from new sources into the existing Integrated Data Warehouse (IDW), enabling improved decision-making, strategic planning, and solutions that favorably impact costs, quality of care, outcomes, and customer satisfaction through an information-driven environment that leverages integrated data assets for competitive advantage.

Responsibilities:

•Actively involved in interacting with business users to record user requirements and Business Analysis.

•Documented Data Mappings/ Transformations as per the business requirement.

•Prepared Technical Design documents and Test cases.

•Creating and maintaining source-target mapping documents for ETL development team.

•Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.

•Developed Informatica Mappings and Reusable Transformations to facilitate timely loading of data into a star schema.

•Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.

•Created Sessions, extracted data from various sources, transformed data according to the requirement, and loaded it into data warehouse tables.

•Developed several reusable transformations and mapplets that were used in other mappings.

•Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.

•Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.

•Migrated Mappings, Sessions, Workflows from Development to Test and then to UAT environment.

•Wrote hundreds of SQL queries for data analysis to validate input data against business rules.

•Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.

•Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.

•Used PMCMD command to start, stop and ping server from UNIX.

•Improved performance at the mapping and session levels.

•Coordinated with scheduling team to run Informatica jobs for loading historical data in production.

•Maintained bug status using the JIRA tool.

•Worked on a POC on IICS to convert Informatica PowerCenter workloads to Informatica Cloud.
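The post-load validation mentioned above (running SQL in TOAD to check warehouse loads) typically reconciles row counts and looks for source rows missing from the target. A minimal sketch of that check, with SQLite standing in for Oracle and hypothetical staging/warehouse table names:

```python
import sqlite3

# Hypothetical source-vs-target reconciliation of the kind run after a
# warehouse load; stg_member is the staging feed, dw_member the target.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_member (member_id TEXT);
    CREATE TABLE dw_member  (member_id TEXT);
    INSERT INTO stg_member VALUES ('M1'), ('M2'), ('M3');
    INSERT INTO dw_member  VALUES ('M1'), ('M2'), ('M3');
""")

src, tgt = (conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in ("stg_member", "dw_member"))
missing = conn.execute("""
    SELECT COUNT(*) FROM stg_member s
    WHERE NOT EXISTS (SELECT 1 FROM dw_member d
                      WHERE d.member_id = s.member_id)
""").fetchall()[0][0]
print(src, tgt, missing)  # counts match and nothing is missing
```

A nonzero `missing` count flags rows that were extracted but never landed, which is the kind of bottleneck/defect triage the role involved.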

Environment: Informatica PowerCenter 10.4, SQL, Oracle 12c, TOAD, Control M, Rally

Client: The Hartford, Hartford, CT Jan 2020 to Mar 2021

Role: ETL Informatica Developer

Description: The Hartford is an insurance company that provides a range of insurance and financial products and services to individuals and businesses. The offerings include property and casualty insurance, group benefits, and mutual funds. The Hartford serves a diverse range of customers, including individuals, small and mid-sized businesses, and large corporations.

The Data Warehouse project involves designing and building a large-scale database that consolidates data from various internal and external sources. This data warehouse is used to support the Hartford's business intelligence and reporting needs, enabling it to make data-driven decisions.

My responsibilities included building ETL processes to load data into the warehouse, ensuring data quality, triaging and resolving issues, and supporting downstream business users' access to data.

Responsibilities:

•Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.

•Actively involved in interacting with business users to record user requirements and Business Analysis.

•Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this project.

•Translated high-level design specifications into simple ETL coding and mapping standards.

•Maintained warehouse metadata, naming standards and warehouse standards for future application development.

•Created the design and technical specifications for the ETL process of the project.

•Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.

•Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.

•Worked on various complex mappings and designed Type 1 and Type 2 slowly changing dimensions.

•Performance tuning of the process at the mapping level, session level, source level, and the target level.

•Created Workflows containing command, email, session, decision and a wide variety of tasks.

•Tuning the mappings based on criteria, creating partitions in case of performance issues.

•Performed data validation after the successful End to End tests and appropriate error handling in ETL processes.

•Resolving the tickets based on the priority levels raised by QA team.

•Developed Parameter files for passing values to the mappings as per requirement.

•Scheduled batches and sessions within Informatica using the Informatica scheduler and customized pre-written shell scripts for job scheduling.
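The Type 2 slowly changing dimension design mentioned above can be sketched in plain Python: when a tracked attribute changes, the current dimension row is expired and a new current version is inserted. This is a simplified illustration; the field names and the policyholder example are hypothetical.

```python
from datetime import date

def scd2_merge(dim_rows, incoming, today):
    """Slowly Changing Dimension Type 2: expire the current row when an
    attribute changes and insert a new current version. Each row is a dict
    with a natural key, the tracked attribute, effective dates, and a
    current-record flag."""
    by_key = {r["key"]: r for r in dim_rows if r["current"]}
    for key, attr in incoming.items():
        cur = by_key.get(key)
        if cur is None:
            # brand-new key: insert first version
            dim_rows.append({"key": key, "attr": attr, "eff_from": today,
                             "eff_to": None, "current": True})
        elif cur["attr"] != attr:
            cur["eff_to"] = today      # close out the old version
            cur["current"] = False
            dim_rows.append({"key": key, "attr": attr, "eff_from": today,
                             "eff_to": None, "current": True})
    return dim_rows

# Hypothetical policyholder dimension: city changes for key 'P1'.
dim = [{"key": "P1", "attr": "Hartford", "eff_from": date(2020, 1, 1),
        "eff_to": None, "current": True}]
dim = scd2_merge(dim, {"P1": "Boston"}, date(2020, 6, 1))
print(len(dim), dim[0]["current"], dim[1]["attr"])
```

In PowerCenter this same expire-and-insert logic is expressed with Lookup, Expression, and Update Strategy transformations rather than Python, but the row lifecycle is identical.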

Environment: PowerCenter 9.6.1, Oracle 12c, SQL Server 2017, Control M, Putty, WinSCP, Notepad++, JIRA

Client: Liberty Mutual Insurance, Boston, MA Sept 2017 to Dec 2019

Role: ETL Informatica Developer

Description: Liberty Mutual Insurance creates innovative insurance products, services, ideas, and technologies to meet the world's ever-changing needs, breaking away from old mindsets and thinking outside traditional insurance roles. It offers personal insurance products for home, auto, and property-casualty; small business insurance; mid-to-large commercial and specialty insurance; surety; and reinsurance, and it manages globally invested financial assets. This project involves building a data warehouse by consolidating data from a variety of sources into a central data warehouse to solve multiple use cases and support downstream applications.

Responsibilities:

•Worked with the business team to gather requirements for projects and created strategies to handle the requirements.

•Documented Data Mappings/ Transformations as per the business requirement.

•Worked on project documentation which included the Technical and ETL Specification documents.

•Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.

•Involved in extracting the data from the Flat Files and Relational databases into staging area.

•Migrated Mappings, Sessions, and Workflows from Development to Test and then to the UAT environment.

•Developed Informatica Mappings and Reusable Transformations to facilitate timely loading of data into a star schema.

•Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.

•Created Sessions, extracted data from various sources, transformed data according to the requirement, and loaded it into the data warehouse.

•Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.

•Extensively worked on Unit testing for the Informatica code using SQL Queries.

•Implemented various Performance Tuning techniques.

•Wrote SQL queries to view and validate the data loaded into the warehouse.

•Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.

•Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.

•Used PMCMD command to run workflows from command line interface.

•Improved performance at the mapping and session levels.
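Running workflows with pmcmd from the command line, as mentioned above, usually means assembling a `pmcmd startworkflow` invocation with the Integration Service, domain, credentials, and folder. The sketch below only builds the command string; every name in it (domain, service, user, folder, workflow) is a hypothetical placeholder, and the password is referenced via an environment variable rather than inlined.

```python
import shlex

def pmcmd_start(domain, service, user, folder, workflow):
    """Build a pmcmd startworkflow command line of the form used to launch
    an Informatica workflow from UNIX. -pv names an environment variable
    holding the password so it never appears in the shell history."""
    return ("pmcmd startworkflow "
            f"-sv {service} -d {domain} -u {user} -pv PMPASS "
            f"-f {folder} {workflow}")

# Hypothetical names; in practice these come from the environment config.
cmd = pmcmd_start("Dom_DW", "IS_DW", "etl_user", "FLD_CLAIMS", "wf_load_claims")
print(shlex.split(cmd)[:2])
```

The same pattern with `startworkflow` swapped for `stopworkflow` or `pingservice` covers the start/stop/ping usage described above.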

Environment: Informatica PowerCenter 9.1, SQL Server 2014, Oracle 11g, TOAD, Putty, Control M, UNIX, WinSCP
