
Informatica Developer Data Analyst

Location:
Charlotte, NC, 28202
Posted:
March 10, 2023


Resume:

Nirmala E Contact: +1-952-***-****

***********@*****.*** Work Status: Green Card

PROFESSIONAL SUMMARY:

9+ years of experience as a Sr. Data Engineer on Enterprise Data Warehouse projects across the Healthcare, Financial, and Insurance domains, using AWS Glue, Terraform, Amazon Web Services, ELT frameworks, SNS, and SQS.

Strong SQL and PL/SQL experience writing, debugging, and optimizing complex queries.

Good working experience in AWS Glue, ELT frameworks, Talend, and Informatica.

Experience writing programs in Java and Python.

Experience in Hadoop-Hive, PySpark and Hive Queries

Experience in Tableau BI Reporting tool

Extensively worked with Jenkins to implement CI/CD build automation; set up continuous integration with Jenkins and used its wide range of plugins to create smooth, developer-friendly workflows.

Proven experience leading teams technically, providing solutions, and mentoring team members on technology stacks.

Working experience with cloud infrastructure, including on-premises Kubernetes environments and AWS S3.

Experience using JIRA with Jenkins and GitHub for real-time bug tracking and issue management as part of Agile methodology.

Experience in Extraction, Transformation and Loading (ETL) of data from various data sources into Data Marts and Data Warehouses using Informatica PowerCenter components (Repository Manager, Designer, Workflow Manager, Workflow Monitor, and the Informatica Administration Console).

Strong in implementing data profiling and documenting data quality metrics such as accuracy, completeness, duplication, validity, and consistency.

Skilled in understanding and developing business rules for the standardization, cleansing, and validation of data in various formats.

Strong experience developing sessions/tasks, worklets, and workflows using Workflow Manager tools (Task Developer, Workflow and Worklet Designer).

Experience in using Automation Scheduling tools like Autosys and TWS.

Knowledge of SoapUI and Postman; tested web services (SOAP and RESTful) using SoapUI.

Experience in writing Unix shell scripts.

Expertise in team building, problem solving, resource management, and multitasking.

EDUCATION:

Master’s in Business Administration (Information System Management)

Bachelor of Science

TECHNICAL SKILLS:

ELT and ETL Tooling: AWS Glue, Talend, Informatica, DataStage

Tools: AWS, Kafka, SoapUI, Postman, Jenkins, Google Bucket

Version Control: GitLab, GitHub and Bitbucket, VersionOne

ETL Scheduling Tools: Stonebranch, Autosys, TWS

RDBMS: Postgres, Redshift, Oracle 11g/12c, DB2

Reporting Tools: SSRS, Tableau 9

Methodology: Agile and Waterfall

Language/Scripting: Python, Java, Scala

Containerization: Docker & Kubernetes

Data Modelling Tools: Power Designer and Erwin

PROFESSIONAL EXPERIENCE (most recent first):

Blue Cross Blue Shield, MN Jan’22 - Present

Role: Sr. Data Engineer Lead

Responsibilities:

Responsible for code changes and for researching and developing solutions to challenging new or unknown issues.

Created instances in AWS and worked on migration from a legacy system (SQL Server) to AWS cloud services: EC2, SNS, CloudWatch, and SQS.

Performed ELT operations to load data from Postgres tables into the Redshift database; a Python script copied data from the Postgres database to an S3 bucket and then into Redshift landing tables (a sketch of this pattern follows).
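A minimal sketch of this copy pattern, assuming psycopg2 and boto3; the bucket, table, query, and IAM role names below are placeholders rather than the project's actual values:

import csv
import io

import boto3
import psycopg2


def unload_postgres_to_s3(pg_dsn, query, bucket, key):
    # Run a query against Postgres and stage the result in S3 as CSV.
    buf = io.StringIO()
    writer = csv.writer(buf)
    with psycopg2.connect(pg_dsn) as conn, conn.cursor() as cur:
        cur.execute(query)
        writer.writerow([col[0] for col in cur.description])
        writer.writerows(cur.fetchall())
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=buf.getvalue())


def copy_s3_to_redshift(rs_dsn, landing_table, bucket, key, iam_role):
    # COPY the staged file from S3 into the Redshift landing table.
    copy_sql = (
        f"COPY {landing_table} FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' CSV IGNOREHEADER 1;"
    )
    with psycopg2.connect(rs_dsn) as conn, conn.cursor() as cur:
        cur.execute(copy_sql)
        conn.commit()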

Provisioned Glue crawlers through Terraform to create Athena tables over data in the S3 archival layer.

Experienced with cloud provisioning tools such as Terraform and CloudFormation.

Wrote AWS Lambda functions in Python that invoke Python scripts to perform various transformations on large data sets in Redshift (a minimal handler sketch follows).
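A minimal handler sketch along these lines, using the Redshift Data API for illustration; the environment variables and procedure name are placeholders, and the real functions may have connected to Redshift differently:

import os

import boto3

redshift_data = boto3.client("redshift-data")


def lambda_handler(event, context):
    # Submit a transformation statement to Redshift via the Data API.
    # Cluster, database, user, and procedure names are placeholders.
    response = redshift_data.execute_statement(
        ClusterIdentifier=os.environ["REDSHIFT_CLUSTER"],
        Database=os.environ["REDSHIFT_DB"],
        DbUser=os.environ["REDSHIFT_USER"],
        Sql="CALL staging.transform_daily_load();",
    )
    return {"statementId": response["Id"]}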

Created SNS topics for data load success/failure; the SNS topics send email notifications to end users and are provisioned in Terraform scripts (an illustrative publish call follows).
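A small illustrative publish call for such a notification, using boto3 with a placeholder topic ARN (per the bullet above, the topics themselves were created via Terraform):

import boto3


def notify_load_status(topic_arn, job_name, succeeded, detail=""):
    # Publish a success/failure message to the SNS topic that emails end users.
    status = "SUCCESS" if succeeded else "FAILURE"
    boto3.client("sns").publish(
        TopicArn=topic_arn,
        Subject=f"{job_name} data load {status}",
        Message=detail or f"{job_name} finished with status {status}.",
    )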

Processed data loads for SCD Type 1 and SCD Type 2 dimensions; the two patterns are sketched below.
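For illustration only, the two patterns roughly follow the statements below; the table and column names are placeholders and the actual loads may differ in detail:

# Placeholder table/column names; these statements would run on Redshift
# through the same database connection used for the landing loads.

# SCD Type 1: overwrite the changed attribute in place.
SCD1_SQL = """
UPDATE dim_member
SET    address = s.address
FROM   stg_member s
WHERE  dim_member.member_id = s.member_id
  AND  dim_member.address <> s.address;
"""

# SCD Type 2: close out the current row, then insert the new version.
SCD2_SQL = """
UPDATE dim_member
SET    effective_end_date = CURRENT_DATE, is_current = FALSE
FROM   stg_member s
WHERE  dim_member.member_id = s.member_id
  AND  dim_member.is_current
  AND  dim_member.address <> s.address;

INSERT INTO dim_member (member_id, address, effective_start_date, is_current)
SELECT s.member_id, s.address, CURRENT_DATE, TRUE
FROM   stg_member s
LEFT JOIN dim_member d
       ON d.member_id = s.member_id AND d.is_current
WHERE  d.member_id IS NULL;  -- no open row: new member, or version closed above
"""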

Designed and developed ETL processes in AWS Glue to migrate data from external sources such as S3 (Parquet/text files) into AWS Redshift, and created AWS Athena tables through crawlers.

Experience with cloud databases and data warehouses (Postgres DB to Redshift DB).

Used normalization to improve performance and wrote ETL code in PL/SQL to meet requirements for extraction, transformation, cleansing, and loading of data from source to target data structures.

Used Stonebranch as the job scheduler.

Environment: GitLab, Python scripting, Jira, Terraform, AWS ecosystem (S3, Redshift, Athena, Glue, Lambda, EC2, CloudWatch, SNS).

UnitedHealth Group, MN Nov’19 - Jan’22

Role: Sr. Talend Developer

Responsibilities:

Developed high-level technical design specifications and low-level specifications based on the business requirements.

Resolved issues found during documentation review (documentation errors, missing information).

Trained other individuals within the department per assigned training plans.

Worked in an Agile environment supporting the SDLC (requirements, design, build, test, and implementation) and coordinated with other team members in the development process.

Worked on Talend components such as tReplace, tMap, tSortRow, tFilterColumn, tFilterRow, tJava, tJavaRow, tConvertType, etc.

Worked with various File components like tFileCopy, tFileCompare, tFileExist, tFileDelete, tFileRename.

Extensively used the tMap component for lookup and joiner functions, along with the tJava and tLogRow components.

Experience in working with Hadoop-Hive, PySpark and Hive queries

Experience in Tableau BI Reporting tool

Worked on improving the performance of Talend jobs.

Created triggers for a Talend job to run automatically on the server.

Worked on Exporting and Importing Talend jobs.

Created jobs that pass parameters from child to parent jobs.

Monitored daily, weekly, and ad hoc runs that load data into the target systems.

Debugged numerous issues in Talend; assisted with code review feedback and provided feedback for improved efficiency. Able to work with teams in multiple time zones and join meetings.

Understood job execution in TWS Workload and helped the team run batch jobs when needed. Worked with PMs on time management and project allocation.

Environment: Talend Open Studio for ESB 6/7, Kubernetes, on-premises cloud infrastructure, Oracle 12c, WinSCP, Jenkins, GitHub, Amazon Web Services.

Ameriprise Financial Inc., MN Sep’18 - Oct’19

Role: ETL Informatica developer/Data Analyst

Responsibilities:

Worked extensively with data modelers, data analysts, and business users to gather, verify, and validate various business requirements.

Identified various source systems, connectivity, and tables to ensure data availability before starting the ETL process.

Created design documents for source-to-target mappings; created workflows, tasks, database connections, and FTP connections using Workflow Manager.

Extensively developed various Mappings using different Transformations like Source Qualifier, Expression, Lookup (connected and unconnected), Update Strategy, Aggregator, Filter, Router, Joiner etc.,

Used Workflow Manager for creating, validating, testing, and running workflows and sessions, and for scheduling them to run at specified times.

Created pre-session and post-session shell commands for performing various operations like sending an email to the business notifying them about any new dealer branches.

Provided architecture/domain knowledge to report developers for the creation of dashboards and reports.

Performed Fine tuning of SQL overrides for performance enhancements and tuned Informatica mappings and sessions for optimum performance.

Created source definitions and flat-file target definitions using the Informatica Designer.

Used UNIX commands (vi editor) to perform the DB2 load operations.

Created detailed unit test plans and performed error checking and testing of the ETL procedures using SQL queries, filtering the missing rows into flat files at the mapping level (an illustrative check follows). Scheduled the ETL jobs using the TWS scheduler.
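As an illustration of this kind of SQL-based check, a sketch that finds source keys missing from the target and writes them to a flat file; the table and key names are placeholders, and the connection can be any DB-API connection (Oracle, DB2, etc.):

import csv


def write_missing_rows(conn, source_table, target_table, key_column, out_path):
    # Select keys present in the source but absent from the target,
    # then dump them to a flat file for review.
    sql = (
        f"SELECT s.{key_column} FROM {source_table} s "
        f"LEFT JOIN {target_table} t ON s.{key_column} = t.{key_column} "
        f"WHERE t.{key_column} IS NULL"
    )
    cur = conn.cursor()
    cur.execute(sql)
    missing = cur.fetchall()
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow([key_column])
        writer.writerows(missing)
    return len(missing)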

Environment: Informatica PowerCenter 8.6.1/8.1.1, PowerExchange 9.1, IDQ 9, Oracle 11g, TOAD, Cognos 9, Tableau, UNIX, Autosys, SQL*Loader, IBM DB2, flat files, SQL, PuTTY, UltraEdit-32, shell programming, Quest Central.

Farmers Insurance, Los Angeles, California May’16 - Sep’18

Role: ETL Informatica developer

Responsibilities:

Involved in the requirements definition and analysis in support of Data Warehousing efforts.

Worked on ETL design, creation of the Informatica source to target mappings, sessions and workflows to implement the Business Logic.

Used various Transformations like Joiner, Aggregate, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedures, and Router etc. to implement complex logics while coding a Mapping.

Designed and developed Informatica Mapplets using different transformations like Address validator, matching, consolidation, rules etc. for data loads and data cleansing.

Used the Integration service in Power Center 10.0 to start multiple instances of the same workflow.

Tested different tasks in workflows, including Session, Command, E-mail, and Event-Wait tasks.

Scheduled and Run Workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.

Extracted data from Flat files, SQL Server and Oracle and loaded them into Teradata.


Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger Wizard.

Worked on Autosys Scheduler to automate the Workflows.

Tested all the mappings and sessions in the Development and UAT environments and migrated them into the Production environment after successful runs.

Environment: Informatica PowerCenter 8.6.1, PowerExchange 8.6, Informatica Data Quality (IDQ 8.6.1), SQL Server, Oracle 11g, PL/SQL, flat files, MySQL, WinSCP, Notepad++, Toad, Quest Central, UNIX scripting, Windows NT.


