Data Engineer Informatica

Location:
Richmond Heights, MO, 63117
Salary:
80
Posted:
January 25, 2023

Resume:

Anirudh Raj

aduxf1@r.postjobfree.com

469-***-****

PROFESSIONAL SUMMARY

Over 8 years of IT experience as a Data Engineer in the telecommunications industry, with data warehousing (ETL) experience using Informatica PowerCenter.

Responsible for all phases of the System Development Life Cycle (SDLC), from requirement gathering and analysis through design, development, and production, across industries including Insurance, Finance, Telecom, and Retail.

Strong understanding of the principles of Data Warehousing using Fact Tables, Dimension Tables, and Star Schema modeling.

Worked with Informatica PowerCenter using transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, and XML Source Qualifier.

Developed Mappings, Mapplets, Sessions, Workflows, Worklets, and Tasks using the Informatica Designer and Workflow Manager.

Experienced in unit testing of Informatica mappings using a debugger.

Experienced in Performance Tuning and Debugging of existing ETL processes.

Excellent analytical and troubleshooting skills.

Excellent communicator with exceptional team-building skills.

Designed and created templates and design documents for offers in Production.

Worked closely with the Data Governance and Division teams to help resolve issues through OFA.

Mapped offers in the Enterprise Product Catalog (EPC) database by opting markets into Availability, Pricing, and Rate Codes for various Comcast products and promotions on the Amdocs billing system.

Co-developed the SQL server database system to maximize performance benefits for clients.

Performed rate code to product mapping for financial impact.

Led the EPC data setup to support Dot Com integration testing.

Investigated and resolved EPC data-related issues and tickets to support multiple integration environments.

Investigated and resolved Dot Com defects related to non-EPC mappings in the integration environment.

Fluent in scripting languages (Python and JavaScript) and object-oriented programming languages (Java and Python).

TECHNICAL SKILLS

Data Warehousing / ETL

Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter 10.x/9.x/8.x, Informatica Data Quality, PowerExchange, WhereScape, Talend.

Cloud Technologies

AWS, Azure

Data Modelling

Star schema modeling, snowflake schema modeling, fact and dimension tables, physical and logical data modeling, data marts, OLAP, OLTP, Erwin, and Oracle Designer.

Operating Systems

Windows, Linux

Programming languages

Linux shell scripts, PL/SQL, Python

Databases

Oracle, Teradata, DB2, SQL Server, Netezza 7.1, Snowflake, Salesforce, Salesforce Marketing Cloud (SFMC).

Scheduling Tools

Autosys, Tidal, and Control-M.

Methodologies

Agile and Waterfall.

Education: Bachelor's in Computer Science and Engineering, JNTUH, India

WORK EXPERIENCE

Client: PNC, PA Dec 2021 – Present

Senior Data Engineer

Responsibilities:

Developed the mappings using transformations in Informatica according to technical specifications.

Created complex mappings that involved the implementation of Business Logic to load data into the staging area.

Designed and developed a custom data warehouse to support cross-platform analytics.

Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union.

Created IICS jobs to extract data from flat files and databases, transform it according to business logic, and load it into target systems.

Implemented exception-handling mappings in IICS to remove error and invalid records.

Extracted data from various sources such as Oracle, flat files, and XML.

Implemented Slowly Changing Dimension (SCD) methodology to preserve the full history of accounts.

Developed Application integration services using REST, SOAP, and WSDL.

Created Mapplets to reduce development time, simplify mappings, and improve maintainability; worked with different sources such as Oracle and flat files.

Implemented performance-tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.

Managed postproduction issues and delivered all assignments/projects within specified timelines.
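The Slowly Changing Dimension work above can be sketched in plain Python; this is a minimal Type 2 illustration only, and the table shape, field names (account_id, status, valid_from), and dates are hypothetical examples, not taken from the actual implementation:

```python
from datetime import date

def scd2_upsert(dimension, incoming, today):
    """Type 2 SCD: when an incoming record differs from the current row,
    expire the old row and insert a new current row, keeping full history."""
    for rec in incoming:
        current = next((row for row in dimension
                        if row["account_id"] == rec["account_id"]
                        and row["is_current"]), None)
        if current is None:
            # brand-new account: insert as the current row
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif current["status"] != rec["status"]:
            # tracked attribute changed: expire the old row, add a new one
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**rec, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension

# usage: one account changes status; both versions remain in the dimension
dim = [{"account_id": 1, "status": "active",
        "valid_from": date(2020, 1, 1), "valid_to": None, "is_current": True}]
dim = scd2_upsert(dim, [{"account_id": 1, "status": "closed"}],
                  today=date(2022, 6, 1))
```

In an Informatica mapping the same decision is typically made with a Lookup plus an Update Strategy transformation; the sketch just shows the row-versioning logic itself.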

Environment: Informatica Intelligent Cloud Services (IICS), Informatica PowerCenter 10.x, AWS S3, Salesforce, SQL Server 2016, PL/SQL, Agile, WhereScape, SQL, Erwin 4.5, Business Objects, Windows scripting, flat files.

Client: CITI Group, India May 2017 – Sep 2021

Data Engineer

Responsibilities:

Worked closely with the clients in understanding the Business requirements, data analysis, and delivering the client’s expectations.

Used Informatica PowerCenter 9.6 for extraction, transformation, and loading (ETL) of data into target systems.

Extracted data from different sources such as relational databases and flat files.

Created mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, Source Qualifier, and Stored Procedure transformations.

Developed mappings, reusable objects, and mapplets using the Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter Designer.

Worked extensively with caches such as the Index, Data, and Lookup caches (static, dynamic, persistent, and shared).

Worked with the Spark ecosystem, using Spark SQL and Scala queries on formats such as Parquet and CSV files.

Developed error handling & data quality checks in Informatica mappings.

Used the Informatica PowerCenter Workflow Manager to create sessions and batches that run with the logic embedded in the mappings.

Built an end-to-end ETL pipeline in Talend Studio for Big Data, extracting data from third-party vendor APIs and ingesting and storing it in an Amazon S3 bucket.

Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts.

Utilized MongoDB to create NoSQL databases that harvest data from a variety of sources.

Actively participated in daily conference bridges concerning QA test scripts and related data integrity.

Ensured effective triaging of defects found and reduced defect rejection.

Participated in future-state design meetings impacting OFT (Order Flow Through).

Proactively kept the EPC Design Team aware of testing-related issues and expected design changes in the integration environment.

Analyzed and designed existing and new business processes spanning multiple areas of the organization, including Customer Operations.

Completed assessments of business processes and recommended solutions.

Involved in Performance tuning for sources, targets, mappings, and sessions.

Migrated mappings, sessions, and workflows from development to testing and then to Production environments.

Documented the process for further maintenance and support.
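The error-handling and data-quality checks mentioned above follow a common reject-routing pattern: valid rows pass through while failing rows are diverted with a reason, similar to an ETL reject file. A small sketch under assumed rules (the field names customer_id and amount and the checks themselves are invented for illustration):

```python
def quality_check(rows):
    """Split rows into (valid, rejects); each reject carries its reason."""
    valid, rejects = [], []
    for row in rows:
        if not row.get("customer_id"):
            rejects.append((row, "missing customer_id"))
        elif row.get("amount") is None or row["amount"] < 0:
            rejects.append((row, "invalid amount"))
        else:
            valid.append(row)
    return valid, rejects

# usage: one clean row, one missing key, one bad value
good, bad = quality_check([
    {"customer_id": "C1", "amount": 125.0},
    {"customer_id": "",   "amount": 10.0},
    {"customer_id": "C2", "amount": -5.0},
])
```

Keeping the reject reason with the row makes the downstream triage and defect reporting described above straightforward.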

Client: Syntel, India Jul 2014 – Mar 2017

Data Engineer

Responsibilities:

Collaborated on the configuration and implementation of offers and rules within EPC to support Marketing needs, and expanded the use of the Sales Portal across additional channels.

Monitored the offer request queue and team capacity for implementation; ensured proper work-queue prioritization and timely escalation of issues.

Facilitated collaboration among the EPC, Divisional IS, and Marketing teams to ensure clarity of direction and proper implementation of offers through EPC and the billing systems.

Monitored existing reports and alerts to identify offer data discrepancies between EPC and the billers, and coordinated with the EPC, Divisional IS, and Marketing teams to facilitate analysis and timely resolution.

Managed offer QA reporting: eliminated or consolidated reports and added new reports or alerts as needed to drive improved offer quality assurance.

Proactively identified and implemented improvements to increase the efficiency and quality of offers implemented through EPC and the billers.

Collaborated on Offer Management process improvement across all business lines of the company.

As part of the EPC team, helped define a cohesive plan covering the design of technology, people, and processes to support strategic improvements in offer management.

Collaborated with various teams to support strategic improvements in offer management.

Responsible for identifying and resolving information flow and content issues.

Participated in facilitating team meetings, creating status reports, and preparing executive presentations to communicate EPC accomplishments, challenges, opportunities, and recommended courses of action.

Designed offers as needed by the market to help the business arrive at better solution designs.

Experienced with ETL tools including Informatica, Talend, and WhereScape.

Extracted terabytes of data from MS SQL Server and Teradata to One Lake.

Provided production support for new offer designs and updated existing offers to meet requirements.

Gained exposure to NoSQL technologies such as MongoDB.

Designed and created templates and design documents for offers in Production.

Worked closely with the Data Governance and Division teams to help resolve issues through OFA.

Mapped offers in the Enterprise Product Catalog (EPC) database by opting markets into Availability, Pricing, and Rate Codes for various Comcast products and promotions on the Amdocs billing system.

Performed rate-code-to-product mapping for financial impact analysis.
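The rate-code-to-product mapping described above amounts to a lookup join between billing rate codes and catalog products, with unmatched codes flagged for investigation. A hedged Python sketch; the codes, product names, and revenue figures below are placeholders, not real catalog data:

```python
def map_rate_codes(rate_codes, product_lookup):
    """Join billing rate codes to catalog products; unmatched codes are
    collected separately so their financial impact can be reviewed."""
    mapped, unmatched = [], []
    for rc in rate_codes:
        product = product_lookup.get(rc["code"])
        if product:
            mapped.append({"code": rc["code"], "product": product,
                           "revenue": rc["revenue"]})
        else:
            unmatched.append(rc["code"])
    return mapped, unmatched

# usage: one code resolves, one is absent from the catalog
lookup = {"RC100": "Internet 100", "RC200": "Basic TV"}
mapped, missing = map_rate_codes(
    [{"code": "RC100", "revenue": 49.99},
     {"code": "RC999", "revenue": 12.00}], lookup)
```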


