SIJO KANNADAN VARGHESE +610*********/+1-214-***-****/ ********@*****.***
Current Location: Texas
DATA ENGINEER / DATA SPECIALIST
PROFILE SUMMARY
Experienced data management specialist with strong expertise in data warehouse and data lakehouse solutions.
Proven track record delivering data solutions across finance, banking, insurance, fleet management, and telecom industries.
Established best practices, standards, ETL frameworks, automation, release management, and guidelines for design, development, deployment, and support in data analytics.
Designed and implemented solutions for acquiring, storing, and analyzing structured and semi-structured data from both on-premises and cloud services, including data governance and security frameworks.
Technical leadership and hands-on experience across data access, integration, modeling, visualization, design, and implementation.
Skilled in infrastructure, technical/data architecture, problem-solving, performance tuning, and large-scale data integration and engineering.
Extensive expertise in Informatica (installation, configuration, administration), along with proficiency in Python, Power BI, SQL, dbt, Snowflake, Databricks, and related technologies.
Strong focus on continuous improvement, modular design, readability, maintainability, supportability and scalability.
Solid finance domain knowledge.
Optimized large-scale ETL and analytics workloads using Databricks and Snowflake, improving query performance and cost efficiency.
TECHNICAL SKILLS
Data Engineering: Informatica, Python, dbt, SQL, PL/SQL
BI, Languages & Tools: VB, Tableau, Excel, Hadoop/HDFS/Hive/Spark, Attunity, Actifio, Power BI, Alteryx
Databases: Teradata, Oracle, Sybase, MS SQL Server, DB2, Greenplum, MySQL, Netezza, Exadata, PostgreSQL
Operating Systems: UNIX, Linux, Windows
Cloud & platforms: Databricks, Snowflake, BigQuery, AWS, Azure, GCP
Data modelling: 3NF, Star schema, Data Vault
Methodologies: Medallion Architecture, Big Data, Data warehouse, Data Integration/migration, Client Server
Domain: Investment Banking/Capital Markets (market risk, credit risk, prime service), Group Finance, Insurance, Retail Banking, Leasing, Telecom
Management Tools: ClearCase, DB Artisan, SVN, VSS, Confluence, GitHub
Scheduling Tools: Control-M, built-in scheduling tools
Secondary Skill Set: VBScript, JavaScript, Crystal Reports, MS Office, Perl, Selenium
Other: Release/change management, automation, performance tuning, continuous improvement
CERTIFICATIONS
Data Architecture for Data Engineers: Practical Approaches
Fundamentals of the Databricks Lakehouse Platform
Introduction to Python for Data Science and Data Engineering
Get started with Data Analysis on Databricks
PROFESSIONAL EXPERIENCE
Client: AT&T USA Nov 2025 – Apr 2026
Data Engineer
Description: Worked with Snowflake, Databricks, and DBeaver on the data sphere project. AT&T is an American multinational telecommunications holding company headquartered in Dallas.
Responsibilities:
Designed and developed data pipelines using Databricks (an illustrative sketch follows this role's environment line)
Managed data storage in Snowflake
Analyzed source systems and data; prepared SQL, data mappings, and data pipelines
Environment: Snowflake, Databricks, notebooks, Azure, AWS, DBeaver
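A minimal sketch of the kind of Databricks-to-Snowflake pipeline described above, assuming the Snowflake Spark connector is available on the cluster; the source path, business key, table names, and credentials are hypothetical placeholders:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_load").getOrCreate()

    # Read semi-structured source data (hypothetical landing path).
    raw = spark.read.json("/mnt/landing/events/")
    clean = (raw
             .dropDuplicates(["event_id"])          # hypothetical business key
             .withColumn("load_ts", F.current_timestamp()))

    # Write to Snowflake via the Spark connector (placeholder credentials).
    sf_options = {
        "sfURL": "<account>.snowflakecomputing.com",
        "sfUser": "<user>", "sfPassword": "<password>",
        "sfDatabase": "ANALYTICS", "sfSchema": "STAGING",
        "sfWarehouse": "LOAD_WH",
    }
    (clean.write.format("snowflake")
          .options(**sf_options)
          .option("dbtable", "EVENTS_STG")
          .mode("append")
          .save())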
Client: DataDivers Australia Feb 2025 – May 2025
Data Engineer
Description: Worked with Snowflake, Power BI, AWS, and PostgreSQL on a data platform hardening project. DataDivers is a technology-consulting organization focused on data, insights, and AI.
Responsibilities:
Designed and developed a custom monitoring dashboard and auditing report using Power BI
Created a data model (star schema) and data pipeline to populate application data into the Snowflake DWH
Built data pipelines and performed analysis using Snowflake notebooks
Medallion architecture: implemented the Medallion framework, encompassing bronze, silver, and gold layers, to structure and enhance data management (a layering sketch follows this role's environment line)
Environment: AWS, Snowflake, Power BI, PostgreSQL, notebooks
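A minimal sketch of the bronze/silver/gold layering described above, shown here with PySpark and Delta tables for brevity; in a Snowflake setting the same layering maps onto one schema per layer. Paths, keys, and column names are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("medallion_demo").getOrCreate()

    # Bronze: raw ingest, kept as-is plus load metadata.
    bronze = (spark.read.option("header", True).csv("/mnt/landing/orders/")
              .withColumn("_ingested_at", F.current_timestamp()))
    bronze.write.mode("append").format("delta").save("/mnt/bronze/orders")

    # Silver: deduplicated, cleansed, and typed.
    silver = (spark.read.format("delta").load("/mnt/bronze/orders")
              .dropDuplicates(["order_id"])
              .filter(F.col("order_id").isNotNull())
              .withColumn("amount", F.col("amount").cast("double")))
    silver.write.mode("overwrite").format("delta").save("/mnt/silver/orders")

    # Gold: business-level aggregate ready for reporting.
    gold = silver.groupBy("customer_id").agg(F.sum("amount").alias("total_spend"))
    gold.write.mode("overwrite").format("delta").save("/mnt/gold/customer_spend")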
Client: NCS Australia Apr 2023 – Feb 2024
Data Consultant
Description: Queensland Rail's data platform upgrade was delayed and running over budget; I delivered the upgrade, overcoming multiple challenges. NCS is a leading APAC-based technology services firm.
Responsibilities:
Upgraded the Informatica data platform to a newer version for Queensland Rail
Participated in pre-sales demonstrations showcasing technical capabilities and data solutions
Designed & implemented PoC using Python, Databricks, dbt, Power BI, PostgreSQL, AWS Lambda
Designed and developed highly reliable and scalable data pipelines and data platforms
Designed and developed Power BI for data visualization, building a self-serve platform to empower senior executives to make informed, data-driven decisions
Designed and developed Python- and dbt-based applications to automate data loading and analysis processes (an automation sketch follows this role's environment line)
Data pipelines: designed and built batch and ad-hoc data pipelines
Data Architecture/Data Modeling: Designed and implemented conceptual and physical data models for marketing team
Folder structure: Established structured folder hierarchy within the data lake, enhancing the organization and efficiency of data storage
Data Analysis: Hands-on experience in data profiling, transformation, and visualization to support business insights
Model Engineering: Exposure to training, fine-tuning, and optimizing ML/AI models for analytics and decision-making
Prompt Engineering: Practiced designing effective prompts and chains for improving LLM outputs in business scenarios
Environment: AWS, Linux, Informatica 10.5, dbt, Python 3.11, Excel, Databricks, Power BI, Alteryx
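A minimal sketch of the Python/dbt loading automation described above, assuming the dbt CLI is installed and a dbt project is configured; the landing directory and model selector are hypothetical:

    import pathlib
    import subprocess

    LANDING = pathlib.Path("/data/landing")  # hypothetical landing directory

    def stage_and_transform() -> None:
        """Pick up new file drops, then run and test the dbt project."""
        new_files = sorted(LANDING.glob("*.csv"))
        if not new_files:
            return  # nothing to do this cycle
        # Warehouse staging step elided; files were loaded to a staging schema here.
        for cmd in (["dbt", "run", "--select", "staging+"],
                    ["dbt", "test", "--select", "staging+"]):
            subprocess.run(cmd, check=True)  # fail fast so the scheduler alerts

    if __name__ == "__main__":
        stage_and_transform()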
Client: CustomFleet Australia Apr 2019 – Apr 2023
Data Specialist
Description: As part of high-level design, I contributed to solution design, security, installation, process improvement, vendor management, and technical/data architecture. CustomFleet is Australia and New Zealand's premier fleet management organization.
Responsibilities:
Participated in Snowflake & AWS PoC.
As member of architecture team, contributed to solution design, security, installation, configuration, process improvement, vendor management and data application architecture
Reduced license costs by 50% by cleaning up Control-M and ETL jobs
Installed and upgraded Informatica to 10.5
Created the ETL framework and application architecture
Installed and configured Actifio (Data provisioning tool)
Introduced AD Authentication and access controls for data tools
Fine-tuned the performance of Informatica mappings, cleaned up environments, and removed orphan mappings
Provided data engineering practice consultancy across platforms, products, and delivery teams
Identified and drove opportunities for continuous improvement and simplification
Delivered high-quality, sustainable solutions to meet business requirements, leveraging approved delivery frameworks and applying industry best practices
Developed reusable frameworks and tooling to accelerate data engineering maturity
Took responsibility for data technologies and data assets, creating roadmaps and setting technical direction
Provided thought leadership, mentorship, and up-skilling for the data engineering team
Team Management: Led many teams of offshore consultants specializing in data engineering
Subject Matter Expertise (SME) and Support: Provided SME support to multiple teams engaged in data modeling and engineering projects
Quality control and testing framework: developed a quality-control and unit-testing framework to ensure data quality and system reliability, significantly reducing deployment errors (a testing sketch follows this role's environment line)
Environment: Windows, Informatica 10.5, Oracle 19c, Control-M, Attunity, Actifio, Snowflake
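A minimal sketch of the unit-testing style used in the quality-control framework above, expressed with pandas for brevity; the file name, business key, and column names are hypothetical:

    import pandas as pd

    def nulls_in(df: pd.DataFrame, cols: list[str]) -> list[str]:
        """Return the columns that unexpectedly contain nulls."""
        return [c for c in cols if df[c].isna().any()]

    def key_is_unique(df: pd.DataFrame, key: str) -> bool:
        """True when the business key is unique across the extract."""
        return df[key].is_unique

    def row_count_ok(df: pd.DataFrame, minimum: int) -> bool:
        """Guard against silently empty or truncated loads."""
        return len(df) >= minimum

    if __name__ == "__main__":
        extract = pd.read_csv("daily_extract.csv")  # hypothetical extract
        assert row_count_ok(extract, 1), "extract is empty"
        assert key_is_unique(extract, "vehicle_id"), "duplicate business keys"
        assert not nulls_in(extract, ["vehicle_id", "lease_start"]), \
            "nulls in mandatory columns"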
Client: Accenture Singapore Feb 2017 – Apr 2018
Technical Architect
Description: Accenture is a global professional services firm; I was part of the Advanced Technology Architecture (ATA) team.
Responsibilities:
As a Technical Architect I participated in sizing, coordination, and implementation.
Installation, configuration, release management and administration of Informatica.
Established and administered end to end data platform.
Technical documentation.
Collaborated with business data managers, data stewards, various technical teams, and cross-functional representatives from multiple lines of business to gather regulatory-reporting metadata management requirements.
Subject Matter Expertise: Provided SME support to data modeling and engineering projects.
Environment: Informatica 9.6, Oracle 12C, Linux, Netezza
Client: Deutsche Bank Singapore May 2013 – Sep 2016
Data Consultant
Description: Worked as a data consultant for multiple clients in Singapore.
Responsibilities:
Worked as a data consultant across global financial firms (Deutsche Bank, UBS, etc.)
Developed and fine-tuned data pipelines.
Completed a delayed data platform upgrade within budget and on schedule.
Led the Informatica migration team upgrading the platform from version 8.6.1 to 9.5.1.
Team Management: Managed a team of 6 consultants from a vendor.
Established data pipelines using an ETL framework, integrating various source systems.
Involved in Requirement Gathering, Data Analysis (Source to Target mapping spec) and development.
Worked extensively on performance enhancements for Informatica workflows, reducing batch run times.
Onshore Coordinator – Led a team of 5-8 developers.
Extensive experience in ETL processes covering data sourcing, mapping, transformation, and loading.
Competence in data warehousing (DWH) operations, including backups, scheduling, restoring, user administration, and environment provisioning.
Proficiency in the deployment and configuration of DWH solutions and tools.
Defect analysis and bug fixing: analyzed and resolved defects and bugs, ensuring overall project reliability.
Environment: Informatica 9, Oracle 11g, Unix, Greenplum, Control-M
Client: Credit Suisse Singapore Apr 2007 – Mar 2012
Data Consultant
Description: Credit Suisse was a global investment bank and financial services firm (now part of UBS); I worked on both the prime services and risk teams.
Responsibilities:
Migrated Java based data integration to Informatica.
Established a release management process that significantly reduced release-related issues; experienced in change and release management.
Fine-tuned many long running ETL jobs.
Fixed complex data quality issues and bugs.
Maintenance, optimization, fine-tuning, and enhancement of Informatica mappings.
Involved in Requirement gathering.
Experience in database design, entity-relationship modelling, and star and snowflake schemas using Kimball methodology.
Expertise in source-to-target field-level mapping, data cleansing and data loading
Comprehensive experience across all stages of the software development life cycle (SDLC), including analysis, design, development, unit testing, and integration testing.
Data analysis: conducted thorough data analysis to gain insights into data structures and requirements.
Version control and automation.
Environment: Informatica 8, Oracle, Sybase
Client: DBS Singapore Sep 2004 - Jan 2006
Data warehouse Consultant
Description: DBS is Singapore’s leading consumer bank. I was responsible for populating DBS Hong Kong trade finance and other application data into the central DWH (Teradata).
Responsibilities:
Source system requirement analysis.
Data modelling and profiling.
Source to target mapping based on NCR FS-LDM framework.
Data pipeline development, testing and deployment.
Environment: Unix, Informatica, Teradata
Client: NIIT Technologies India Apr 2004 – Aug 2004
Programmer
Description: NIIT Technologies (now Coforge) is a multinational IT services company; I worked there as a data engineer.
Responsibilities:
Fine-tuned many long running ETL jobs
Enhancement of Informatica data mappings
Involved in Data Extraction from Oracle and Flat Files, designed and developed mappings using Informatica PowerCenter
Familiarity with configuring the Informatica data platform
Environment: Windows, Informatica, Oracle
Client: GEFA India Dec 2000 – Apr 2004
Programmer
Description: GE Financial Assurance was the insurance and investment arm of General Electric (GE) and part of GE Capital, GE's financial services division.
Responsibilities:
Development and enhancement of data mappings
Development and enhancement of client-server applications using VB
Development of reports using Business Objects and Crystal Reports
Competence in front-end application development utilizing Visual Basic
Knowledge in database design, development, optimization, and data analysis/data profiling
Environment: Unix, Business Objects, Visual Basic 6.0, SQL Server 7.0, Crystal Reports 7.0, SQL, Oracle 8.0, DB2, Informatica PowerCenter 1.7, ASP, VBScript, JavaScript, XML
EDUCATION
Diploma in Computer Engineering, Board of Technical Education, Kerala, India, April 1995.