
Data Engineer Solutions

Location: Irvine, CA
Salary: negotiable
Posted: June 04, 2025


Resume:

KOEL MOHANTY

San Clemente, CA *****

********@*****.*** / 707-***-****, LinkedIn: www.linkedin.com/in/koel-mohanty-12008116

OBJECTIVE

Results-driven ETL engineer with 20+ years of experience designing & implementing transactional databases & data warehouses, both on-premises & in the cloud, across industries including Healthcare, Pharmaceuticals, Finance, Retail, Automobiles and Insurance, delivering end-to-end solutions that meet present and future needs. Skilled in Informatica data management, Toad/Foundry data modeling, data quality, data integration, and the design and development of data solutions on-premises and on the AWS & GCP platforms. Adept at optimizing performance and leading cross-functional teams to deliver business solutions on time.

SUMMARY

Enterprise Architecture/ Modeling: Experienced in ETL frameworks, delivering architecture designs, value stream mapping & aligning technology initiatives with operational efficiency goals. Applies dimensional data modeling and Star/Snowflake schemas using ERwin. Ensures adherence to Agile and DevSecOps methodologies

ETL/ ELT Expertise: Deep hands-on experience with INFORMATICA. Architects/designs ETL processes to support the DWH using Informatica, ODI, OBIEE & Foundry Workshop; delivers files over various SFTP connections for marketing analysis & integration via REST APIs across multiple domains. Performed & supported query optimization

Technical Leadership: ETL, OLAP, MDM & report delivery methods; created Fact and Dimension tables in Oracle & MS SQL databases. Experienced with SQL and PL/SQL, loading financial data to Oracle/SQL Server, PostgreSQL & BigQuery

Business Process Modeling/Optimization: Reduced operational costs, streamlined processes & enhanced system performance. Leveraged DevOps solutions using CI/CD tools: Jenkins, GitHub, IntelliJ, SourceTree, Bitbucket

Agile Practitioner: Skilled in Agile project management using Scrum, leveraging tools like JIRA and Kanban boards. Adept at leading cross-functional teams and collaborating with senior leadership.

Familiar with Agile ceremonies (sprint planning, daily standups, sprint reviews & retrospectives) and product backlog management

Strategic Planning: Strong communication, project management, decision-making and critical-thinking skills; builds mockups. Defines & communicates the technical vision & roadmap for digital health products

SKILLS

ETL/ELT: Informatica Power Center, IDQ, IDMC, IICS, DAC, SSIS, ODI, ADT, ActiveBatch, Palantir Foundry

Servers/Cloud: Oracle, Oracle SQL Developer, SQL Server 2019 Management Studio, REST API, AWS RDS, GCP, DBT

Data Modeling: Logical/Physical/Dimensional, OLAP, ERwin, PowerDesigner, ER/Studio, Toad, SQLWorkbench, SCT

Database: Oracle 8i/9i/10g/11g/12c/19c, EXADATA, MS SQL Server, Teradata, Netezza, PostgreSQL, BigQuery

Application/PM Tools: Office 365, SharePoint, Visual Studio, SourceTree, Bitbucket, Git, IntelliJ, Confluence, Productboard

OLAP/Reporting Tools: Business Objects 6x/XI, WEBI, Siebel Analytics, Power BI, Oracle BI, OBIEE 11x, Excel

Languages/Ticketing: PL/SQL, SQL, T-SQL, XML, SAP ABAP/4, Daptiv, ServiceNow, Cherwell, ZEMA, JIRA, Kanban

EXPERIENCE

Sr. Data Engineer / CTC Global, Inc. - Irvine, USA 2023 - 2024

Developed a proof of concept for R&D using fiber optic sensing technology, measuring temperature, strain, and vibration directly on the line at millimeter increments. Conducted geospatial analysis, calculating amperage from temperature, strain, vibration & frequency (GHz) data. Familiar with company products, OS & the client's business processes

Built an Ontology-backed time-series analysis & monitoring product in Palantir Foundry per business requirements

Managed data from multiple sources and maintained ETL processes. Assisted with data loads for analytics use cases

Ingested CSV files containing measurement data on transmission lines, rearranged the information, converted it into time-series data, and processed and presented the data in the required fashion with geospatial location
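
For illustration only, a minimal sketch of that reshaping step in SQL (PostgreSQL-style syntax; the staging table and column names here are hypothetical, not the actual Foundry pipeline):

    -- Unpivot wide per-sensor columns into long time-series rows,
    -- keeping the geospatial location with every reading.
    -- All table and column names are illustrative.
    SELECT
        span_id,
        latitude,
        longitude,
        reading_ts,
        m.metric_name,
        m.metric_value
    FROM staging_line_measurements
    CROSS JOIN LATERAL (
        VALUES ('temperature_c', temperature_c),
               ('strain_ue',     strain_ue),
               ('vibration_hz',  vibration_hz)
    ) AS m(metric_name, metric_value)
    ORDER BY span_id, reading_ts;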

Developed ETL/ELT pipelines to flow data from multiple sources into the staging database and applied business logic to populate normalized and denormalized data structures. Worked with the Data Architect to design the ETL data flow

Provided team support across territories, offering insights into new analytical tools like Contour in Foundry

Took the initiative in addressing gaps in the data, identifying solutions, and improving reliability to establish a competitive edge, cementing data quality as a key differentiator of the platform

Worked in Foundry's Ontology and Workshop modules and on data pipelines to prepare datasets and build object tables; connected them to filters and embedded object views, and created widgets and metric cards used as parameters, configured with the inputs, outputs, display options & actions customers required

Sr. Data Engineer / DaVita, Inc. - Remote, USA 2020 - 2023

Worked on DaVita's Patient Interoperability Data Sharing project, which comprised patient consent, data curation, FHIR, EPIC & CMS integration, a framework accessible to all parties, identity support, and data standardization/validation using INFORMATICA, Google BigQuery and DBT. Gathered & organized data from EHRs (EPIC), CMS, patient surveys and financial systems

Moved golden, de-duplicated data from the Transient to the Curated SQL Server layer. Assisted with ETL batch job issues
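
A minimal sketch of that golden-record selection in T-SQL, assuming a hypothetical schema where the most recently updated record wins (table and column names are illustrative, not the actual data model):

    -- Keep one golden row per patient, preferring the most
    -- recently updated record. Names are illustrative.
    WITH ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY patient_id
                   ORDER BY last_updated DESC
               ) AS rn
        FROM transient.patient_record
    )
    INSERT INTO curated.patient_record (patient_id, full_name, dob, last_updated)
    SELECT patient_id, full_name, dob, last_updated
    FROM ranked
    WHERE rn = 1;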

Developed and implemented data models, database structures, and strategies for data storage and access

Developed Informatica ETL mappings to load data from various sources into BigQuery. Profiled & cleansed data to improve quality.

Performed data analysis and data profiling using complex SQL queries and window functions on source systems
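
An illustrative sketch of that profiling pattern, with hypothetical source tables and columns:

    -- Per-column completeness profile of a source table.
    SELECT
        COUNT(*)                                          AS row_count,
        COUNT(DISTINCT member_id)                         AS distinct_members,
        SUM(CASE WHEN dob IS NULL THEN 1 ELSE 0 END)      AS null_dob,
        AVG(CASE WHEN email IS NULL THEN 1.0 ELSE 0 END)  AS email_null_rate
    FROM src.member;

    -- Window function flags duplicate encounters per member/date.
    SELECT member_id, encounter_date,
           COUNT(*) OVER (PARTITION BY member_id, encounter_date) AS dup_count
    FROM src.encounter;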

Involved in analysis to resolve complex data issues. Worked on quality issues & dynamic parameterization

Checked OID completeness and de-duplication, and set up data completeness rules: parent-child relationships, mandatory fields, and relationships to other data concepts, such as encounter to observation
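
One such parent-child rule, sketched with hypothetical staging tables: every observation must reference an existing encounter:

    -- Orphan check: observations whose parent encounter is missing.
    -- Table and column names are illustrative.
    SELECT o.observation_id, o.encounter_id
    FROM stg.observation AS o
    LEFT JOIN stg.encounter AS e
           ON e.encounter_id = o.encounter_id
    WHERE e.encounter_id IS NULL;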

Implemented automated error handling & recoverability using a control mechanism in Informatica batch loads
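
A hedged sketch, in generic SQL, of a common control-table pattern behind that kind of recoverability (the schema is hypothetical, not the actual Informatica implementation): each batch records a high-water mark so a failed load restarts where it left off:

    -- Control table tracking batch runs; a restart resumes from
    -- the last successful high-water mark. Names are illustrative.
    CREATE TABLE etl_batch_control (
        batch_id      INT          PRIMARY KEY,
        source_name   VARCHAR(100) NOT NULL,
        high_water_ts TIMESTAMP    NOT NULL,
        status        VARCHAR(20)  NOT NULL  -- RUNNING / SUCCESS / FAILED
    );

    -- Incremental extract picks up after the last successful load.
    SELECT t.*
    FROM src.transactions AS t
    WHERE t.updated_ts > (SELECT MAX(c.high_water_ts)
                          FROM etl_batch_control AS c
                          WHERE c.source_name = 'transactions'
                            AND c.status = 'SUCCESS');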

Used Google Cloud Platform to optimize and maintain BigQuery views for better reporting performance
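
As one hedged example of the kind of optimization involved (dataset and table names are hypothetical): materializing a date-partitioned, clustered table behind the reporting view so dashboard queries prune partitions:

    -- BigQuery: partition by date and cluster by a common filter
    -- column so reporting queries scan less data. Names illustrative.
    CREATE OR REPLACE TABLE reporting.claims_daily
    PARTITION BY DATE(claim_ts)
    CLUSTER BY facility_id AS
    SELECT claim_id, facility_id, claim_ts, amount
    FROM raw_layer.claims;

    CREATE OR REPLACE VIEW reporting.v_claims_last_90d AS
    SELECT *
    FROM reporting.claims_daily
    WHERE DATE(claim_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY);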

Used Bitbucket, IntelliJ, Jenkins, Git, GitHub, SourceTree and SDLC pipeline manifests for deployment

Participated in sprint planning, backlog grooming, and sprint reviews to align with clinical business priorities.

Validated technical implementation through UAT, working closely with QA teams and healthcare professionals

Implemented continuous improvements & feature updates based on user feedback & emerging healthcare trends

Created & documented data architecture blueprints that align with business objectives & technical requirements

Worked closely with software engineers, cloud architects & DevOps teams to design scalable & resilient solutions.

Collaborated with data teams & healthcare interoperability platforms (HL7, FHIR APIs) for seamless data exchange

Sr. ETL Developer / Technical Product Owner / Jenny Craig, Inc. - Carlsbad, USA 2008 - 2020

As a Lead ETL Developer with strong business insight, held functional/technical responsibility for the ongoing maintenance & rolling enhancement releases required to support the company's INFORMATICA and OBIEE implementation. Worked in PL/SQL and ETL interfaces dealing with mixed data: ODS, OLTP, OLAP, JDE, SQL Server, Oracle, Teradata & PostgreSQL. As a Product Owner, balanced vision, execution and communication to ensure the team built what users needed.

Responsible for communicating a product vision aligned with customer needs & for the design, development, test, debug, implementation & support activities of the DW/BI team: defining work packages, setting timelines, managing ETL and code migration in Git, diagnosing ETL issues, and identifying/mitigating risks to ensure successful delivery of ETL analytics

Engaged with clients to gather feedback on key challenges & used these insights to shape the product roadmap

For analytics solutions, involved in end-to-end product management from inception to implementation to scaling

Worked with the Informatica REST API to access information; create, update and delete connections and schedules; start jobs; and import and export objects. Kept the team focused and on track with production issues and root cause analysis

Developed the ETL strategy & improvements to enhance business capabilities & documented DWH process flows

Created ETL workflows with error/exception handling. Used ERwin to design conceptual, logical & physical data models

Delivered ETL solutions using relational database modeling & designed the data integration framework in DW/BI

Worked with the existing team of consultants to conduct knowledge transfer on the new cloud implementation and to create test plans for integration and UAT. Helped with the design, planning and timely delivery of end-user training

Assisted the business in identifying priorities where technology could be leveraged to provide effective support

Involved in SQL tuning & enhancement of SQL & Informatica processes; migrated data from the legacy JDE system
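
For illustration, one typical tuning pattern (hypothetical tables): replacing a per-row correlated subquery with a set-based join the optimizer can plan efficiently:

    -- Before (illustrative): correlated subquery runs once per row.
    SELECT o.order_id,
           (SELECT c.customer_name
            FROM customers AS c
            WHERE c.customer_id = o.customer_id) AS customer_name
    FROM orders AS o;

    -- After: a single join, typically far cheaper at scale.
    SELECT o.order_id, c.customer_name
    FROM orders AS o
    JOIN customers AS c
      ON c.customer_id = o.customer_id;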

Troubleshot and resolved technical/functional incidents as needed. Interpreted BRDs and created TDDs

Created packages using Data Flow transformations (Data Conversion, Export Column, Merge Join, Sort, Union All, Conditional Split & more) for existing/new packages & CSV file imports from different sources

Involved in data migrations: designing, planning and developing programs to optimally extract, transform and load data from disparate data sources to PostgreSQL, Oracle, Teradata and SQL Server database target systems

Monitored data quality metrics. Facilitated smooth data migration during system upgrades or transitions

Helped the team move to the cloud: securely migrated an Oracle DB to a PostgreSQL DB on AWS using DMS.

Took responsibility for the integrity & delivery of the larger solution, moving EDW applications to the AWS cloud for potential cost savings and converting OLTP/OLAP schemas from Oracle DB to Amazon RDS using AWS SCT

In a POC project, evaluated Snowflake to ensure it aligned with the target architecture. Used AWS SCT to convert Oracle DDL into Snowflake-compatible SQL. Ran counts & checksums and ran parallel queries on both platforms to validate outputs.
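
A hedged sketch of that validation pattern: the equivalent query is run on Oracle and on Snowflake and the results diffed (table and column names are hypothetical; platform-specific hash functions such as Oracle's ORA_HASH or Snowflake's HASH_AGG can strengthen the checksum):

    -- Run on both source and target, then compare the outputs:
    -- row count plus order-independent aggregates over key columns.
    SELECT COUNT(*)        AS row_count,
           SUM(amount)     AS amount_total,
           MIN(order_date) AS min_date,
           MAX(order_date) AS max_date
    FROM sales.orders;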

Sr. ETL Developer / Insurance Company of the West, Inc. - San Diego, USA 2007 - 2008

The ICW Group is a multi-line property/casualty insurance group in San Diego offering a range of insurance products: commercial property insurance, workers' compensation, surety bonds and automobile insurance

Responsible for the analysis, design, development, implementation & maintenance of software applications to meet clients' needs in the areas of Finance and Accounting. Monitored and documented the end-to-end ETL architecture

Oversaw the mapping of data sources, data movement, interfaces, and analytics to ensure data quality

Experienced in ETL of data into the DWH using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange and Power Connect on Oracle, DB2 & SQL databases

Involved in data validation between source data, ICW/MIS data and EDW data to improve data quality & integrity. Converted complex data flows & multiple ODI ETL layers into a single ODI ETL layer

Technical Data Steward / Sempra Energy - San Diego, USA 2006 - 2007

Supporting Sempra's mission to deliver energy, ensuring data is managed responsibly/effectively across the organization

Ensured compliance with data privacy law by supporting data governance policies and data protection & privacy activities

Built batch data pipelines to ensure robust data flows & optimized storage & query performance in MS SQL server

Implemented and maintained data quality rules and processes to ensure data accuracy and consistency.

Participated in the development, implementation & maintenance of an Information Governance process framework

Consultant of Keane / KIA Motors America, Inc. - Irvine, USA 2005 - 2006

Consultant of Keane / Pfizer Inc., San Diego, CA, USA 2004 - 2005

Created mappings using Informatica Mapping Designer with various transformations per the business logic, extracting data from heterogeneous sources and loading target tables in different databases.

Developed new ETL jobs & enhancements and modified existing code using INFORMATICA Power Center.

Migrated production data to DWH, ODS, Finance DM, downstream views and documented business processes

Involved in UAT, systems integration, performance tuning, error handling & modeling; documented TDDs & FDDs.

Worked with DBA and Architect to provide system design that is dependable, scalable & maintainable

Developed Unix scripts for ETL processes, such as starting/stopping Informatica services, server reboots, creating date-stamped files, moving files based on the date in a filename, and appending a new row to a CSV file daily

Delivered standards & best practices & performed performance tuning at both the mapping & database levels

Informatica Developer / 3Com, Inc. - Santa Clara, USA 2003 - 2004

EDUCATION

MS: Electrical Engineering, GPA: 3.7, West Coast University - Los Angeles, CA, USA

TRAINING & CERTIFICATION

Informatica Power Center 9x

CSPO (Certified Scrum Product Owner)

CCNA 1 & 2 certification, ITIL v2

Palantir Foundry Foundations

AWS re:Invent conference focused on cloud computing and AWS services like Amazon EC2, S3 and Amazon RDS


