
Data Warehouse ETL Developer

Location:
Rogers, MN, 55374
Salary:
120000
Posted:
December 03, 2024


Resume:

Anvesh Kumar

ETL Developer

Email: *********@*****.***

Phone: 510-***-****

Professional Summary:

•Enthusiastic and business-savvy IT professional with 10+ years of IT experience as an Informatica developer, with specialized capability in building Enterprise Data Warehouses, Data Marts, and Operational Data Stores in industry segments such as Health and Insurance, using Informatica Power Center 10.5.2, 10.4.1, 10.3.1, 10.2.0, 10.1/9.6.1/9.5.1/8.6.1/8.1.1.

•Experience in the handling, configuration, and administration of MySQL and NoSQL databases.

•Over six years of experience with Health Rules.

•Experience in Tableau Desktop 10.x/9.x/8.x, Tableau Reader, and Tableau Prep; experienced in the analysis, modeling, design, and development of Tableau reports and dashboards for analytics.

•Good experience with Snowflake Cloud Data Warehouse and AWS S3.

•Good experience with SnowSQL, the Snowflake command-line utility.

•Experience with the Alteryx platform, including data preparation, data blending, and the creation of data models and data sets.

•Extensively worked on the HealthEdge (HealthRules Payor) application to review mismatched amounts, run claims and capitation payment runs, and return Medicare and Medicaid 835 files to providers.

•Experienced with the full software development life cycle (Planning, Analysis, Design, Deployment, Testing, Integration, and Support).

•Involved in error handling, capturing missing/error records and loading them into an exception table.

•Knowledge of FACETS.

•Extensively worked on EDIFECS to re-process the .DAT 835 and 837 files for NaviNet, 27X, and claims payment & capitation runs.

•Experience in SQL coding and tuning.

•Expertise in Informatica Data Quality (IDQ) using the IDQ Developer tool.

•Expertise in the whole life cycle of DWH (Data Warehouse) projects: analysis, modeling review, ETL design, performance improvement, unit/system testing, and support.

•Strong understanding of dimensional modeling: Star and Snowflake schemas, and identifying facts and dimensions.

•Expertise in working with the Oracle 11g/10g/9i, Teradata 14x/15x, SQL Server 2005/2008, DB2, MySQL, and Sybase databases.

•Extensive experience creating stored procedures, functions, views, triggers, and complex SQL queries.

•Extensively worked on the IDMC (Intelligent Data Management Cloud) tool to load data into Snowflake.

•Extensive experience writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.

•Worked extensively with CDC (Change Data Capture) and SCD (Slowly Changing Dimensions).

•Expertise in implementing performance-tuning techniques at both the ETL and database level.

•Experience working in both Waterfall and Agile methodologies.

•Good communication skills, with a strong ability to interact with end users, clients, and team members.

TECHNICAL SKILLS:

Operating Systems: Windows, Unix, Linux, MS-DOS

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio

Databases: Oracle SQL, MySQL, IBM Data Studio

RDBMS: Oracle 11g/10g/9i, Teradata 14/12, DB2, SQL Server 2005/2008, MySQL, Sybase, IBM Data Studio, Oracle SQL Developer

QA Tools: HP Quality Center

ETL Tools: Informatica Power Center 10.5.2, 10.4.1, 10.3.1, 10.2.0, 10.1/9.6.1/9.5.1/8.6.1, Informatica Data Quality (IDQ) Developer 10, IDMC (Intelligent Data Management Cloud)

Reporting Tools: Cognos, Business Objects, Tableau

Languages: XML, UNIX Shell Scripting, SQL, PL/SQL

Methodologies: Data Modeling (Logical/Physical/Dimensional), Star/Snowflake, ETL, OLAP, complete software development life cycle

Cloud Technologies: Snowflake, SnowSQL

Client: Medica, MN - Sr. ETL Informatica and Snowflake Developer

06/01/2023 – 06/30/2024

Responsibilities:

Designed and developed ETL processes using the Informatica 10.5.2 and 10.4 tools to load data from a wide range of sources, such as Oracle, flat files, and Snowflake.

Used pushdown optimization for faster source reads and to reduce workflow run times.

Developed and supported multiple projects, such as HR CATE Fraud Shield PRE and POST SHIELD, Adhere Health Eligibility, and Pharmacy RX claims, making sure data flows across multiple systems per the business needs.

Involved in multiple projects as the developer for the MEDE Analytics, Adhere Health, and Fraud Shield PRE and POST SHIELD projects.

Used the IDMC (Intelligent Data Management Cloud) tool to load data into Snowflake.

Used the Hierarchy Builder transformation to generate XML as the target.

Created a Snowpipe for continuous data loading.
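A minimal sketch of such a pipe, with a hypothetical stage (@vendor_stage) and target table (claims_raw); AUTO_INGEST additionally assumes an external stage with event notifications configured:

    -- Hypothetical names; continuous micro-batch load from staged files
    CREATE OR REPLACE PIPE claims_pipe
      AUTO_INGEST = TRUE
    AS
      COPY INTO claims_raw
      FROM @vendor_stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);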

Created UNIX shell scripts to pull data from vendors and drop it into the Informatica environment using an FTP process.

Involved in designing, developing, and deploying reports in the MS SQL Server environment using SSRS 2008 and SSIS in Business Intelligence Development Studio (BIDS).

Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming, and loading data into target tables.

Extensively used SSIS transformations such as Script Component, Pivot, Unpivot, Multicast, Derived Column, Data Conversion, Row Count, and the Send Mail task.

Experienced in using SSIS performance-tuning counters, error handling, event handling, and rerunning failed SSIS packages using checkpoints.

Bulk-loaded and unloaded data into and out of Snowflake tables using the COPY command.
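For illustration, a hedged sketch of both directions (stage and table names are hypothetical):

    -- Bulk load: stage -> table
    COPY INTO member_claims
    FROM @etl_stage/claims/
    FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE';

    -- Bulk unload: table -> stage (Snowflake appends file numbers to the prefix)
    COPY INTO @etl_stage/exports/member_claims_
    FROM member_claims
    FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
    OVERWRITE = TRUE;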

Created the DWH, databases, schemas, and tables, and wrote complex SQL queries against Snowflake.

Used the Email, Control, Event Wait, and Command tasks in Informatica workflows.

Integrated and automated data workloads into the Snowflake warehouse.

Ensured ETL/ELT jobs succeeded and loaded data successfully into the Snowflake DB.

Created test cases for Unit Test, System Integration Test, and UAT to check the data.

Used partitions in sessions to improve database load performance.

Followed Agile methodology.

Created mappings in IDMC and used multiple complex transformations as per the business logic.

Automated the jobs using the UC4 Automic scheduling tool.

Extensively worked on ETL performance tuning to tune the data loads, and worked with DBAs on SQL query tuning.

Extensively created dynamic parameter files by developing an Informatica mapping driven by a control table.
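A minimal sketch of the control-table approach (table and column names are hypothetical); a mapping reads rows like these and writes them out in Informatica's parameter-file format:

    -- Hypothetical control table driving the dynamic parameter file
    CREATE TABLE etl_control (
      workflow_name VARCHAR2(100),
      param_name    VARCHAR2(100),
      param_value   VARCHAR2(400)
    );

    -- Emit one $$PARAM=value line per row for a given workflow
    SELECT '$$' || param_name || '=' || param_value AS param_line
    FROM etl_control
    WHERE workflow_name = 'wf_claims_load';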

Environment: Informatica 10.5.2, 10.4.1, CDC, Oracle Exadata, Flat Files, Oracle SQL Developer, Snowflake, Snowpipe, Web Services, TOAD, WinSCP, UNIX.

Client: UCare, MN - Sr. ETL Informatica/Snowflake/IDQ Developer

07/26/2018–5/30/2023

Responsibilities:

Involved in extracting data from SQL Server and flat files using Informatica Power Center.

Used the Informatica IDQ Address Doctor transformation for address cleansing.

Created dynamic stored procedures to move the data from the load-ready tables to the DW tables.
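A hedged sketch of such a dynamic procedure in Oracle-style PL/SQL (the ldr_/dw_ table-name prefixes are hypothetical):

    CREATE OR REPLACE PROCEDURE move_to_dw(p_table IN VARCHAR2) AS
    BEGIN
      -- Build the INSERT at run time so one procedure serves every table pair
      EXECUTE IMMEDIATE
        'INSERT INTO dw_' || p_table || ' SELECT * FROM ldr_' || p_table;
      COMMIT;
    END;
    /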

Worked extensively on Change Data Capture to capture incremental loads based on the CT tables or views in the Raw layer and load them into the IDL layer.
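As a sketch, the incremental pull typically filters the CT view on a high-water mark kept in an audit table (all names here are hypothetical):

    -- Load only rows changed since the last successful run
    INSERT INTO idl.member_eligibility
    SELECT ct.*
    FROM raw.ct_member_eligibility ct
    WHERE ct.change_ts > (SELECT MAX(loaded_through_ts)
                          FROM idl.load_audit
                          WHERE table_name = 'MEMBER_ELIGIBILITY');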

Proposed retiring the legacy system and moving all the Medicaid enrollment details to the new CURAM system.

Generated Medicaid card and eligibility flat files to send to MMIS for Medicaid benefits.

Extracted source data from flat files, DB2 databases, and external feeds.

Discussed the business requirements with State resources for determining Medicaid eligibility.

Developed Informatica mappings, sessions, worklets, and workflows to extract Human Demographics.

Created ETL mappings to migrate data from the legacy ANSWER system to the CURAM system.

Implemented a one-time data migration of multi-level data from SQL Server to Snowflake using SnowSQL.
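A minimal SnowSQL sketch of the pattern (paths, stage, and table names are hypothetical): exported SQL Server extracts are PUT to an internal stage, then COPY'd into the target table:

    -- Run from the SnowSQL client; PUT is a client-side command
    PUT file:///data/export/members_*.csv @~/migration AUTO_COMPRESS = TRUE;

    COPY INTO analytics.public.members
    FROM @~/migration
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    PURGE = TRUE;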

Staged the API and Kafka data (in JSON file format) into the Snowflake DB, flattening it for the different functional services.
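For illustration, a hedged sketch of the flattening step (column and attribute names are hypothetical): the raw JSON lands in a VARIANT column and LATERAL FLATTEN shreds the nested array:

    CREATE TABLE raw_events (payload VARIANT);

    SELECT payload:memberId::STRING        AS member_id,
           svc.value:code::STRING          AS service_code,
           svc.value:amount::NUMBER(12,2)  AS amount
    FROM raw_events,
         LATERAL FLATTEN(input => payload:services) svc;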

Involved in production support on a weekly rotational shift basis.

Implemented an ETL process in Alteryx to extract data from multiple sources (SQL Server, XML, Excel, CSV) and scheduled workflows.

Worked on Tivoli Workload Scheduler (TWS) for scheduling the Informatica jobs.

Created tables, views, secure views, and user-defined functions in the Snowflake Cloud Data Warehouse.
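A brief sketch of both object types (names are hypothetical); a secure view hides its definition and underlying data from non-owners:

    CREATE OR REPLACE SECURE VIEW v_active_members AS
    SELECT member_id, plan_code, eligibility_start, eligibility_end
    FROM member_eligibility
    WHERE is_active = TRUE;

    -- Simple SQL UDF
    CREATE OR REPLACE FUNCTION age_in_years(dob DATE)
      RETURNS NUMBER
      AS 'DATEDIFF(year, dob, CURRENT_DATE())';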

Extracted and loaded CSV and JSON file data from AWS S3 into the Snowflake Cloud Data Warehouse.
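A hedged sketch of the S3 path (bucket, integration, and table names are hypothetical; the storage integration is assumed to be pre-created by an admin):

    CREATE OR REPLACE STAGE s3_landing
      URL = 's3://example-bucket/landing/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = 'JSON');

    -- Target table has a single VARIANT column for the raw JSON
    COPY INTO raw_json_landing
    FROM @s3_landing;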

Installed hot fixes and patches and conducted performance tuning activities.

Migrated Oracle and SQL Server database tables into the Snowflake Cloud Data Warehouse.

Managed end-to-end complex data migrations, conversions, and data modeling (using Alteryx and SQL), and created visualizations using Tableau to develop high-quality dashboards.

Processed data in Alteryx to create TDEs (Tableau Data Extracts) for Tableau reporting.

Monitored production jobs per the on-call rotation.

Informed stakeholders of any failure or SLA miss and involved the appropriate team to debug and fix the issue.

Provided enhancements to the development team.

Involved in release deployment and validation.

Responsible for the verification and validation of inbound HL7 messages (NEW, STR, END, PRD, DRG, ALM, TMS); these message types and segments are customized HL7 messages.

Mapped inbound HL7 messages to the hospital's EHR systems.

Mapped patient vital-sign information to the outbound OBX segment (HL7 v2.x).

Involved in creating the XML target files for the Data Eco System downstream extracts.

Actively involved in data analysis and data profiling.

Used the HealthEdge (HealthRules Payor) application to review mismatched amounts, run claims and capitation payment runs, and return Medicare and Medicaid 835 files to providers.

Used EDIFECS to re-process the .DAT 835 and 837 files for NaviNet, 27X, and claims payment runs.

Created sessions and batches and performed unit testing of the mappings, sessions, and batches.

Used pre- and post-session emails to track session failures and successes, and scheduled jobs for daily loads.

Received 834/835 and 820 premium files are processed via EDIFECS and converted to XML for loading into the HR integration layer; 820 remittance files are loaded into the gateway.

834 daily and monthly files: the daily file carries delta (incremental) updates and the monthly file is a reconciliation file; any updates to member information are shared back to DHS (Department of Human Services) via the EVS PCP data file.

Involved in creating the deployment groups to migrate code from the DEV environment to TEST and from TEST to STAGE repositories.

Involved in writing batch scripts to handle the file-list creations.

Involved in creating the jobs and job plans using UC4.

Provided production support based on the on-call shifts.

Subject matter expert in processing the DELEGATES data, which deals with Dental, Fulcrum, and Chiro data, processing the .csv files into Guiding Care.

Worked in TFS to update the assigned stories and tasks.

Environment: Informatica Power Center 10.4.1, 10.3.1, 10.2.0, 10.1, Snowflake, Oracle 11g, MS SQL Server 2012, DB2, Flat Files, XML files, Automic UC4, Putty, Autosys, UNIX, Windows XP, MS Office Suite, TFS, ServiceNow, Informatica IDQ, TWS

Client: Anthem, Inc., Atlanta, GA - IDQ Developer & Informatica Power Center Developer

07/22/2017 – 07/25/2018

Responsibilities:

•Created POC (proof of concept) for IDQ (Informatica Data Quality).

•Created LDOs (Logical Data Objects), PDOs (Physical Data Objects), and profiles for each JIRA story.

•Created scorecards in Informatica Analyst to validate the valid and invalid rows.

•Monitored the scorecard timings in the Informatica monitoring tool.

•Worked on data trends to maintain data quality maturity, data preparation activities, and overall data quality responsibility, using Excel and SQL.

•Performed data remediation by cleansing, organizing, and migrating data so it is fit for purpose, and detected and corrected (or removed) corrupt or inaccurate records by replacing, modifying, or deleting the “dirty” data.

•Worked on IBM Data Studio to pull the valid and invalid records.

•Created various mappings in LDOs using transformations such as Union, Sorter, Joiner, Expression, and Aggregator.

•Involved in extracting data from SAP using the Informatica SAP R/3 connector and Informatica BCI, and from diverse data sources (flat files, relational tables, XML) using Informatica Power Center.

•Used the Informatica IDQ Address Doctor transformation for address cleansing.

•Created a SOAP Web Service for Address Doctor in the Informatica Developer tool.

•Tuned SQL queries for speedy data extraction.

•Used pre- and post-session emails to track session failures and successes, and scheduled jobs for daily loads.

•Involved in creating the jobs and job plans using UC4.

•Wrote stored procedures, stored functions, and packages, and used them in many Forms and Reports.

Environment: Informatica Power Center 10.1, 9.6.1/9.5.1, Oracle 11g, DB2, Teradata, Flat Files, Erwin 4.1.2, SQL Assistant, Toad, WinSCP, Putty, Autosys, UNIX.

Client: State Farm Insurance Company (Troy, MI) - Informatica Developer

Work Location: SCube Soft Solutions

06/12/2012 – 08/13/2015

Responsibilities:

•Involved in analyzing the mapping specification document.

•Involved in analyzing the data between source and target.

•Prepared solution design and component specifications as per the business requirements.

•Involved in all the development activities, from design through deploying the code into the formal environment.

•Involved in creating Data Maps, extracting CDC data from Mainframe sources, and updating repository tables.

•Implemented CDC (Change Data Capture) using variables and creating job-tracker tables.

•Prepared unit test cases and executed them in the development environment.

•Prepared validation SQL equivalent to the Informatica mappings to make sure the application works correctly (see the sketch after this list).

•Involved in developing the UNIX scripts to trigger the Informatica jobs.

•Developed Informatica mappings and tuned them when necessary.

•Experienced in using scheduling tools such as Autosys.

•Optimized query performance and session performance.

•Prepared integration test cases for specific components as per the business requirements.

•Designed and developed database views and stored procedures.

•Used SQL to extract the data from the database.

•Worked with Development Life Cycle teams.

•Wrote stored procedures, stored functions, and packages, and used them in many Forms and Reports.

•Wrote database triggers to automatically update tables and views (see the trigger sketch below).
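The sketches referenced above, with hypothetical table and column names. First, mapping-equivalent validation SQL: the row counts must match, and a MINUS in each direction must return no rows:

    -- Row-count comparison
    SELECT COUNT(*) FROM src_policy
    UNION ALL
    SELECT COUNT(*) FROM tgt_policy;

    -- Any row returned here is a source row missing or altered in the target
    SELECT policy_id, premium_amt FROM src_policy
    MINUS
    SELECT policy_id, premium_amt FROM tgt_policy;

Second, a minimal Oracle trigger of the kind described, stamping an audit column on update (the last_updated column is assumed):

    CREATE OR REPLACE TRIGGER trg_policy_upd
    BEFORE UPDATE ON policy
    FOR EACH ROW
    BEGIN
      :NEW.last_updated := SYSDATE;
    END;
    /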

Environment: Informatica Power Center 9.5/9.1.1/8.6.1/8.1.1/7.1.1, Oracle 11g, SQL, PL/SQL, Flat Files, SQL Assistant, Windows NT, Windows XP, UNIX

EDUCATIONAL QUALIFICATIONS:

Bachelor’s in Information Technology from JNTUH, 2008–2012

Master’s in Information Assurance from Wilmington University, 2015–2017


