ABHIJIT GHADGE
***************@*****.*** +1-312-***-****
Lead Software Engineer / Snowflake Cloud Data Architect
PROFESSIONAL SUMMARY
Lead Software Engineer and Cloud Data Architect with 20+ years of experience designing, developing, and leading enterprise-scale data solutions across finance, healthcare, and manufacturing. Proven track record of delivering cloud-native data platforms using Snowflake, AWS, and ETL/ELT patterns. Led end-to-end, large-scale migrations, optimized data architectures, and integrated AI/LLM capabilities into the data stack. SnowPro Core Certified, with hands-on experience in dimensional and Data Vault data modeling and multi-tenant architecture.
TECHNICAL SKILLS
Cloud Platforms: Snowflake, AWS (S3, EMR), Azure
ETL / ELT Tools: Informatica, SSIS, Coalesce, dbt, HVR, IICS (exposure)
Programming: SQL, PL/SQL, Shell Scripting, Python, JavaScript
Data Modeling: PowerDesigner, Erwin, Toad Data Modeler
Scheduling Tools: ActiveBatch, Autosys, UC4
Databases: Oracle, SQL Server, DB2, Sybase
Big Data Tools: Hadoop, Kafka, Hue, BigQuery
AI / LLM: Snowflake Cortex, LangChain, Retrieval-Augmented Generation (RAG) prototyping
CERTIFICATIONS
Snowflake SnowPro Core Certified, Informatica Certified, Six Sigma Green Belt
PROFESSIONAL EXPERIENCE
Lead Data Engineer / Cloud Data Architect at LTCG / Illumifin — May 2022 – Present
● Led Snowflake-based modernization of SQL Server workloads using Data Vault and Dimensional Modeling.
● Designed and implemented multi-tenant data architecture using RBAC, masking, and row access policies.
● Automated end-to-end ETL/ELT pipelines with SSIS, HVR, and dbt for real-time and batch processing.
● Integrated AI/LLM tools including Snowflake Cortex and LangChain to prototype intelligent data workflows.
● Collaborated with cross-functional teams across Azure, Snowflake, and reporting layers.
Tech Stack: Snowflake, SQL Server, HVR, SSIS, dbt, PowerDesigner, Python, Azure
Lead Software Engineer / Data Architect at CME Group — Aug 2012 – Apr 2022
● Migrated on-prem Hadoop applications and large-scale data sets into AWS and Snowflake, improving performance and reducing operational costs.
● Designed and maintained regulatory compliance pipelines (e.g., Messaging Practices, TAG50, Block Trade), processing up to 8 billion records daily.
● Architected data models and pipelines using Informatica, Kafka, Snowflake, and AWS.
Tech Stack: Informatica, Oracle, Snowflake, AWS (S3, EMR), Kafka, Erwin; POC on BigQuery
ETL Technical Lead at Wells Fargo Advisors (Jan 2011 – Aug 2012)
● Executed brokerage data integration projects, including advanced data modeling and ETL process enhancements.
● Managed offshore teams and code reviews.
● Environment: Informatica, DB2, TOAD.
ETL Developer at RR Donnelley — Jan 2010 – Dec 2010
● Created ETL workflows and test cases, and supported offshore teams on integration and reporting projects.
Tools: Informatica, Oracle, Shell Scripting
Earlier Roles (2003–2009):
● HPD NYC: Designed staging/data mart environments using dimensional modeling.
● GE: Backend Developer for expatriate system.
● Motorola: Developed Oracle-based solutions for sales data.
EDUCATION
Master’s Degree in Computer Science
Authorized to work in the U.S. without sponsorship