
Warehouse Architect / ETL Developer

Rockville, MD

Jayakumar Kuppusamy

adxoug@r.postjobfree.com

281-***-****

Snowflake’s SnowPro Core Certification,

Snowflake Advanced Training Certificate

Summary - Enterprise Database Management, Cloud Data Warehousing, Cloud Migration & Integration SME

● Highly focused and experienced specialist in ETL Development, Data Warehousing, Migration, Integration, Data Analysis, Data Science, and Business Intelligence. Able to work effectively as part of a team and independently.

● Exceptional proficiency in Oracle, Snowflake, and MS SQL Server Business Intelligence tools (SSIS, SSAS, SSRS), and ETL tools (DataStage, ODI, Talend, SnapLogic, Matillion ETL/ELT, AWS Glue).

● Snowflake SME and complete solution: security roles and user access control; Time Travel for querying and restoring data and Fail-safe for disaster recovery; Continuous Data Protection; a unified warehouse for both structured and unstructured data; data sharing within an account, across regions, and to third-party reader accounts; federated authentication, SSO, and MFA. Account administration: cloud, storage, credit usage and billing; usage control with resource monitors.
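
A minimal sketch of a few of these features in Snowflake SQL; the warehouse, table, share, and account names are illustrative, not from any engagement:

-- Resource monitor to control credit usage
CREATE RESOURCE MONITOR monthly_quota WITH
  CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 90 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_quota;

-- Time Travel: query a table as it was an hour ago, or restore a dropped one
SELECT * FROM sales.orders AT (OFFSET => -3600);
UNDROP TABLE sales.orders;

-- Secure data sharing to a consumer or reader account
CREATE SHARE orders_share;
GRANT USAGE ON DATABASE sales TO SHARE orders_share;
GRANT USAGE ON SCHEMA sales.public TO SHARE orders_share;
GRANT SELECT ON TABLE sales.public.orders TO SHARE orders_share;
ALTER SHARE orders_share ADD ACCOUNTS = partner_org.partner_account;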

● Data Analytics and Management: data development; data analysis (descriptive, predictive, and prescriptive); Extract, Transform, Load (ETL/ELT); Business Intelligence (BI); data visualization; MPP databases and concurrent processing; Snowflake virtual data warehouses / Snowflake administration (access control, credit and data usage monitoring, network policy, security/MFA, etc.).

● Strong experience in ETL pipeline tools: IBM DataStage design/admin, Informatica PowerCenter, ODI, Pentaho Data Integration, Talend Open Studio, and cloud-native Matillion ETL/ELT for Snowflake (working knowledge of the Redshift and BigQuery versions).

● Worked on a Kafka Connect cluster to read data from Kafka topics and write the data into Snowflake tables. Implemented reading of log files and loading into the production DB for analytics; data integration, migration, and analytics using Spark, Kafka, pipelines, and Snowpipe (a Snowpipe sketch follows).
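
A minimal Snowpipe sketch for this kind of continuous log ingestion; the stage, pipe, table, and integration names are assumptions for illustration:

-- External stage over the S3 location the log producers write to
CREATE STAGE raw.log_stage
  URL = 's3://example-bucket/logs/'
  STORAGE_INTEGRATION = s3_int;  -- assumes an existing storage integration

-- Pipe that auto-ingests new log files as they land in S3
CREATE PIPE raw.logs_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw.app_logs
  FROM @raw.log_stage
  FILE_FORMAT = (TYPE = JSON);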

● Experienced in data mart, OLTP, and OLAP implementations covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, production support, Informatica mappings, SQL and PL/SQL scripts, and compliance frameworks (PCI, SOX, SOC 2, ISO 27001).

● Extensive experience with SaaS cloud applications (Salesforce, NetSuite ERP, Marketo, ServiceNow, etc.) and on-premises enterprise applications (Oracle EBS, PeopleSoft HCM); prior database experience with Oracle, IBM DB2, Netezza, Teradata, etc.

● Strong knowledge of SDLC methodologies (Waterfall, Agile, Scrum).

● Architected and executed the complete process of migrating on-premises Oracle database structures to a Redshift MPP data warehouse using Postgres, S3, and AWS Data Pipeline.

Key Competencies

● Enterprise Applications Support, Database Administration: Snowflake, Netezza Mako server, IBM PureData System for Hadoop, Aginity Workbench, Oracle, Sybase, Ingres, PostgreSQL, MongoDB, MySQL, Data Modeler, Netezza database, IBM Integrated Analytics System, and appliances for high-performance data warehousing and advanced analytics.

● EMPAC Enterprise Asset Management: Discoverer BI 10g - created the End User Layer (EUL) for all financial modules; administration, access management, and data security.

● OBIEE 11g - Physical, Business Model and Mapping, and Presentation layers. BI Admin console to manage users, groups, and repository objects. JDeveloper with Oracle Applications Framework (OAF), Reports and Forms development, compliance frameworks. Experienced in web-based applications using Oracle APEX (HTML DB), ASP, and HTML.

● Database Integration, Migration, Data Warehouse, and ETL tools: Snowflake data warehouse, Matillion ETL/ELT mapping in an AWS cloud environment. Complete data stream process: Kafka (topic, consumer, producer) -> S3 -> Snowflake -> Matillion ETL -> Analytics DB.

● Strong understanding of data warehouse ETL/ELT processes: IBM DataStage 9.2, Talend, Pentaho, SnapLogic. Informatica PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager). PowerCenter and Informatica Data Quality (DQ); creation of complex parallel loads and dependencies using workflows.

● Informatica Intelligent Cloud Services (IICS): very good knowledge of Cloud Services Data Integration, Data Quality, Profiling, API Manager, SaaS Application Integration, Application Integration Console, Master Data Management Cloud, etc.

● Data Modeling: logical data models (LDM), Data Element Dictionaries (DED), physical data models (PDM), Interface Control Documents (ICD), data format specifications, version release documentation, data harmonization and mappings. Erwin 9.1/8.6.1/8.5/8.1/7.1, Toad Data Modeler, and LucidChart.

● Extensive experience with T-SQL constructing triggers and tables and implementing stored procedures, functions, views, user profiles, data dictionaries, and data integrity.

● Reporting/Analytical tools: Crystal Reports writer, Actuate eReport, Logi Analytics, Tableau, IBM Cognos Analytics, MicroStrategy, Qlik Sense, Looker, Business Intelligence tools (SSIS, SSAS, SSRS), Azure Databricks, SageMaker ML/QuickSight.

● Tools/Programming Languages/Operating Systems/Development and Security Analysis: C and C++, Python/SQL (PyPL, NumPy, Pandas, SciPy, Matplotlib), COBOL, HTML, XML, AngularJS, CSS, JavaScript, jQuery.

● Development tools: SnowSQL, TOAD SQL editor, pgAdmin, Visual Studio IDE, Xcode, Ionic cross-platform mobile apps.

● Security Analysis tools: Burp Suite, OWASP ZAP, Carbon Black Defense sensor (Cb), Splunk, Rapid7.

Education & Certifications

MS in Computer Applications (M.C.A.), National Institute of Technology Calicut (NIT), India

BSc Physics - Applied Electronics, Madras University, Chennai, India

● Data Science for Business certification from Harvard Business School (Online)

● Snowflake Advanced training completion certificate

● Snowflake's SnowPro Core Certification

Issuing authority: Snowflake. Issued Jul 2020, expires Jul 2022. Credential ID: https://www.youracclaim.com/users/jayakumar-kuppusamy

● Master Data Management

Credential ID UC-1XS22870

● AWS Analytics

Credential ID UC-4N23C42L

● AWS Certified Solutions Architect - Associate 2020

Credential ID UC-EO4PJNHT

● Python for Data Analysis and Visualization

Credential ID UC-FS3LHPJC

● Snowflake Data Warehouse & Cloud Analytics

Credential ID UC-BVI6CVPO

● Oracle Certified Professional

DBA Backup and Recovery workshop, SQL and PL/SQL, DBA - Oracle Discoverer 3.0 for Administrators; training from Oracle Corporation.

● Certificate of training - AWS Analytics: Athena, Kinesis, Redshift, QuickSight, Glue (PySpark) and pipeline, Spark Streaming, Kafka Streams, Master Data Management (MDM).

Employment History

GCOM Information Technology Inc - Mar 2019 - Current (Full time)

Inovalon, Bowie, MD - Data-Driven Healthcare Solutions Company

Software Development Engineer II (Snowflake) - Consultant, Jun 2022 – May 2023

(Healthcare payer data integration/migration to Snowflake)

● Developed pipelines using SnowSQL stored procedures to unload data from different source databases to S3. The process performs incremental unloads of the data from daily ingestion (a sketch follows).
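
A minimal sketch of such an incremental unload; the stage, table, and column names are illustrative:

-- Unload only the rows ingested since the last daily run to an S3 stage
COPY INTO @export_stage/claims/
FROM (
    SELECT *
    FROM analytics.claims
    WHERE updated_at > DATEADD(day, -1, CURRENT_TIMESTAMP())
)
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
HEADER = TRUE;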

● Developed complete logic to batch the unload process for higher performance, lower latency, and parallelism using Snowflake's concurrent-processing capability.

● Used the Snowflake task scheduler to trigger the daily run (a sketch follows); worked on monitoring needs and email notification (a preview feature in Snowflake at the time). Data is consumed by a Lambda service, followed by an analytics run in a Kubernetes container.
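
A minimal sketch of the daily task trigger; the task name, warehouse, schedule, and called procedure are hypothetical:

-- Scheduled task that calls the unload procedure every day at 2 AM
CREATE TASK daily_unload_task
  WAREHOUSE = etl_wh
  SCHEDULE = 'USING CRON 0 2 * * * America/New_York'
AS
  CALL run_daily_unload();  -- hypothetical stored procedure

ALTER TASK daily_unload_task RESUME;  -- tasks are created suspended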

● Users access the data from web applications and run extract jobs for the required specifications (client/project specific): run, preview, and export (in different formats) with secure data transfer methods.

● Focused on Snowflake data ingestion features, performance best practices, and cost analysis, benchmarking the current Aurora Postgres and MS SQL process against the Snowflake warehouse.

● Worked on payer and member-risk data sets across the Medicare, Medicaid, and commercial lines of business.

● Regulatory reports are exported from the analytics system and shared securely with clients, state DOH departments, and auditing agencies per business requirements (CMS, CHIP, NCQA, and state-DOH-specific plan metrics).

● Worked on healthcare risk-management data and electronic reports covering incident, intervention, response, and review.

Environment: Aurora Postgres, MS SQL Server, Snowflake, Azure DevOps Board, CI/CD pipeline, .NET, C#, Snowflake SnowSQL, Python, RabbitMQ, JavaScript, Stonebranch (task/job orchestration and automation), XML report exports, Inovalon ONE platform (HEDIS-certified QSI-XL, Risk Management).

Co-op Solutions, Rancho Cucamonga, CA

Data Architect (Snowflake) - Consultant, Jan 2022 – Apr 2022

● Architected and designed a pipeline to ingest data from MS SQL DB -> Azure Blob -> Snowflake (a sketch follows).
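
A minimal sketch of the Blob-to-Snowflake leg of such a pipeline; the integration, container, stage, and table names are assumptions:

-- Storage integration and external stage over the Blob container
CREATE STORAGE INTEGRATION azure_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = 'my-tenant-id'
  STORAGE_ALLOWED_LOCATIONS = ('azure://myaccount.blob.core.windows.net/ingest/');

CREATE STAGE staging.blob_stage
  STORAGE_INTEGRATION = azure_int
  URL = 'azure://myaccount.blob.core.windows.net/ingest/'
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Bulk load from the stage into a staging table
COPY INTO staging.transactions FROM @staging.blob_stage/transactions/;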

● Implemented Snowflake data sharing for the credit unions and third parties.

● Designed the transaction system data using a STAR-schema dimensional model; developed JavaScript stored procedures to load data into the DIM and FACT tables of the final Enterprise Data Warehouse (EDW).
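
The procedures were JavaScript; the core upsert they apply to a dimension table looks like this plain-SQL sketch, with illustrative table and column names:

MERGE INTO edw.dim_member AS d
USING staging.member_delta AS s
  ON d.member_id = s.member_id
WHEN MATCHED THEN UPDATE SET
  d.member_name = s.member_name,
  d.updated_at  = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (member_id, member_name, updated_at)
  VALUES (s.member_id, s.member_name, CURRENT_TIMESTAMP());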

● Involved in designing and defining the enterprise JAMS schedule workflows.

● Integrated Azure Boards and Azure DevOps Services with the GitHub repo; the CI/CD pipeline moves deployments from DEV to BETA and PROD.

● Helped the platform team configure the RBAC access-hierarchy framework to establish role-based access and privileges (a sketch follows).
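
A minimal sketch of such an RBAC hierarchy; the role, schema, and user names are hypothetical:

-- Read-only role with future grants, rolled up under SYSADMIN
CREATE ROLE analyst_ro;
GRANT USAGE ON DATABASE edw TO ROLE analyst_ro;
GRANT USAGE ON SCHEMA edw.core TO ROLE analyst_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA edw.core TO ROLE analyst_ro;
GRANT SELECT ON FUTURE TABLES IN SCHEMA edw.core TO ROLE analyst_ro;
GRANT ROLE analyst_ro TO ROLE sysadmin;
GRANT ROLE analyst_ro TO USER report_user;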

● Developed executive dashboards and detailed reports for executives, accounting, and the tax department.

● Developed a POC source-data-load framework; supported the QA team with validation in lower environments.

Environment: Snowflake, MS SQL Server DB, Azure Blob Storage, MS Teams collaboration, project management (Agile, Scrum), ServiceNow, JavaScript pipeline to load data from Blob to Snowflake, JAMS job scheduler. SSRS reports: parameterized, drill-down, drill-through, matrix, tabular, and sub-reports.

Cox Automotive Inc, Atlanta, GA

Sr. Data Warehouse Developer/Architect - Consultant, Dec 2020 – Nov 2021

● Created conceptual/physical data models for the car-resale system. Created dimensional data models: star schema, snowflake schema, and hybrid schema.

● Worked with the Data Engineering/Integration team to create a data pipeline: Postgres DB -> DMS -> Ignite (S3) data lake -> Snowflake.

● Configured SQS and Snowpipe for continuous data flow from Ignite (S3) to Snowflake.

● Created storage integrations, stages, streams, and tasks to stage the raw data (a sketch follows).
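
A minimal stream-plus-task sketch of this staging pattern; the object names and schedule are assumptions:

-- Change stream over the raw table, consumed by a scheduled task
CREATE STREAM raw.events_stream ON TABLE raw.events;

CREATE TASK raw.consume_events
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW.EVENTS_STREAM')
AS
  INSERT INTO core.events
  SELECT event_id, payload
  FROM raw.events_stream;

ALTER TASK raw.consume_events RESUME;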

● Developed stored procedures to transform the JSON data into fact and dimension tables and views using LATERAL joins and the FLATTEN function (a sketch follows).
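
A minimal FLATTEN sketch of that JSON-to-relational transformation; the table, variant column, and field names are illustrative:

-- Explode a JSON array of line items into one row per item
SELECT
  o.payload:order_id::NUMBER AS order_id,
  i.value:sku::STRING        AS sku,
  i.value:qty::NUMBER        AS qty
FROM staging.orders_json o,
     LATERAL FLATTEN(input => o.payload:items) i;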

● Designed access and security based on RBAC.

● Supported the pipeline in the DEV, QA, UAT, SUPP, and Prod environments.

● Worked on best practices, warehouse monitoring, query optimization using query plans, data caching, secure data sharing, and zero-copy cloning (a sketch follows).
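
A minimal zero-copy clone sketch; the database and table names are hypothetical:

-- A clone shares the underlying micro-partitions, so no data is copied
CREATE TABLE qa.claims CLONE prod.claims;

-- Clones can be combined with Time Travel, e.g. as of one hour ago
CREATE DATABASE dev_db CLONE prod_db AT (OFFSET => -3600);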

● Supported the MicroStrategy BI team with data needs and performance tuning in data ingestion, transformation, and report queries.

● Snowflake performance optimization: query profiling, cache management, and cluster keys. Access and security architecture: RBAC and row-level and column-level access with dynamic data masking policies (a sketch follows).
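
A minimal dynamic-data-masking sketch; the policy, role, table, and column names are assumptions:

-- Only a privileged role sees the full value; everyone else sees a mask
CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() IN ('COMPLIANCE_ADMIN') THEN val
    ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

ALTER TABLE edw.dim_member MODIFY COLUMN ssn
  SET MASKING POLICY ssn_mask;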

Environment: Snowflake, AWS platform, MicroStrategy BI, MS Teams collaboration, Rally project management (Agile, Scrum), ServiceNow, Collibra common data center, AWS services (Glue, Lambda (Python) pipeline to load data to S3, CloudWatch, Athena, Redshift Spectrum, IAM). Pipeline: Postgres DB -> DMS -> S3 (enterprise data lake) -> Snowflake staging -> data warehouse layer.

Toyota Financial Services, Plano, TX

Consultant - TCS - Data & Analytics team

Sr. Data Warehouse Architect (Snowflake) - Consultant, Feb 2020 – Nov 2020

● Defi/Sagent auto-lending service system as the data source.

● Migrated IBM Netezza data warehouse and other ERP data to Snowflake.

● Data migration using Talend Cloud Integration ETL to the Snowflake data warehouse on the AWS platform.

● Developed data visualizations using MicroStrategy BI/Looker from the Snowflake data warehouse schemas for the Account, Lease, Loan, and Tax departments.

● Integrated auto insurance data from the source OLTP system into the warehouse OLAP system.

● Scheduled Talend ETL jobs (AutoSys) to run from raw data-lake data to the DW.

● Led the development team in an onshore/offshore communication model.

● Snowflake best practices and Object-Level Security design.

● Management of day-to-day application data operations: data storage management, quality, transformations, and transfers.

● Digital modernization and data analytics on the Snowflake/AWS cloud platform.

● Implementation of DQ and MDM solutions to manage data issues.

Environment: AWS Data Lake, source system APIs, Talend ETL, Snowflake on the AWS platform, AutoSys scheduling tool, MS Teams collaboration, JIRA project management (Agile, Scrum), ServiceNow request management system. AWS Postgres DB -> DMS -> S3 -> Snowflake staging -> data layer.

CISCO – Duo Security, Ann Arbor, MI - Consultant, Aug 2019 – Jan 2020

Cloud Data Warehouse Architect (Snowflake) / Matillion ELT/ETL

● Heavily involved in data integration, database management, and data warehouse design; architected dimensions and facts based on metrics requirements; development from Salesforce objects.

● Loaded structured, heterogeneous data into Snowflake staging tables using Matillion S3 load/unload components.

● Widely used Python interfaces to load unstructured data into S3.

● Snowflake data warehousing/architecture from SaaS applications (Salesforce, NetSuite, Recurly, Zendesk, Marketo) through their APIs.

● Managed performance issues in a Snowflake cluster by scaling warehouses up or down (a sketch follows).
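
A minimal sketch of both scaling directions; the warehouse name and sizes are illustrative:

-- Scale up for heavier queries
ALTER WAREHOUSE reporting_wh SET WAREHOUSE_SIZE = 'LARGE';

-- Scale out (multi-cluster) for query concurrency
ALTER WAREHOUSE reporting_wh SET
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3
  SCALING_POLICY = 'STANDARD';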

● The Matillion ETL tool uses the Salesforce Query component and Salesforce API to load data into target staging tables, and the S3 staging area into Snowflake.

● Involved in orchestration jobs run manually or via the scheduler.

● Worked with Salesforce API properties for batch size and timeout. Matillion jobs are scheduled to run sequentially or concurrently.

● Transformation jobs developed to create dimension and fact tables for visualization development.

● Snowflake is directly connected to Tableau and other analytical tools, utilizing the power and scalability of warehouses. The complete data stream is performed using Matillion jobs via ETL and ELT approaches. Business intelligence solutions are derived from visualization tools such as Tableau and Looker.

Environment: AWS, Matillion ETL/ELT, Snowflake, Tableau, Looker (source applications Salesforce, NetSuite, Recurly, Zendesk, Marketo via API)

Capital One, McLean, VA Mar 2019 – Jul 2019

Data Warehouse Architect/Database Engineer

● Validated Snowflake warehouse data from the data lake; developed dashboard reports and data analysis for executives in the Commercial Bank Credit Solutions department.

● Worked on testing Hadoop extract scripts for calculations and column mappings against defined metrics. Worked on application system data for loan processing and portfolio management.

● Designed data warehouses per the metrics needed by business users; created fact and dimension objects.

● Used Tableau to develop dashboards and analytical reports from Snowflake databases and other sources.

Environment: Hadoop, Tableau/Server, Data Lake, Snowflake, AWS

Henry M. Jackson Foundation, Bethesda, MD Aug 2008 – Feb 2019

Sr Database Engineer - BI/ETL Developer

● Development, enhancement, implementation, and support for internal HJF departments and external customers/programs from the Oracle E-Business Suite R12 and PeopleSoft HCM 9.2 human resources applications. Reporting tools used for development and deployment: Oracle BI Discoverer, Actuate eReport Designer, Crystal Reports writer, a web-based Logi Analytics business intelligence system, and Tableau Desktop/Server.

● Data warehouses created from the PeopleSoft financial system and Oracle E-Business Suite R12 are the data sources for BI report development. Database packages/procedures are created and scheduled to run periodically.

● Worked heavily with Oracle Business Intelligence Enterprise Edition (OBIEE) reports and dashboards and troubleshooting issues. Involved in data validation/testing using SQL.

● Oracle Data Integrator (ODI) Studio: ETL (Extract, Transform, Load) data mappings, monitoring, and repository management. Made extensive use of knowledge modules.

● Worked on Master Data Management (MDM) concepts and Informatica MDM tool -Informatica MDM Hub Console, Hierarchy Manager (HM) and Informatica Data Director (IDD).

● Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.

● Experienced in data mart, OLTP, and OLAP implementations, teamed with project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.

● Used the Logi Analytics studio predictive-analysis reporting tool against the data warehouse; created analytical reports, data visualizations, and dashboards.

● Migrated Oracle on-premises to AWS cloud Postgres DB (RDS) using AWS services such as VPC, EMR, S3, and DMS.

● Rewrote several slow-performing, incompatible SQL queries for Postgres to run cost-efficiently post-migration.

● Analyzed the jobs in detail to determine the right instance/cluster type (storage-, I/O-, compute-, or memory-optimized) for cost-efficient performance.

Environment: Oracle Database, Discoverer BI/OBIEE, EBS financial application, PeopleSoft HCM 9.2, Logi Analytics, Tableau, Crystal Reports writer, compliance frameworks (PCI, SOX, SOC 2, ISO 27001)

EPCO Inc., Houston, TX Dec 2007 – Aug 2008

Oracle Technical Consultant

● Involved in process improvements for Procurement, Accounts Payable, and Master Data Management. The proposed solution enables efficient, automated procure-to-pay processes across the P2P project and its suppliers.

● Provided web-development-based solutions to automate processes within the enterprise and collaborate with the supplier ecosystem.

● Responsible for writing high level technical documents and reviewing with technical and functional teams.

● Responsible for database design, data analysis, mapping of columns between the systems.

● Supported the legacy EMPAC asset management system's interfaces to the Oracle financial system: AP, Purchase Vendor Site, GL, PA, etc.

Environment: Oracle Database, Discoverer BI/OBIEE, EBS financial application, EMPAC asset management system.

Bahrain Petroleum Company (BAPCO), Bahrain Feb 1999 - Nov 2007

Senior Oracle Developer / Analyst / ETL Developer

● Involved in the Indus EMPAC (Enterprise Maintenance Planning and Control) interface to Oracle Financials 11i (AP, GL, AR, and PA); PL/SQL development.

● Developed and provided support for the EMPAC asset management system across functional, technical, and report requirements.

● Developed Key Performance Indicators (KPIs) for the Material and Maintenance departments from the EMPAC system by building a data warehouse using DataStage Designer, Director, and Manager.

● Created a data warehouse for stores and project staging materials to produce inventory and stock-movement reports using Oracle Reports/Crystal Reports writer.

● In-house Stock Replenishment Project (SRP) to provide additional reporting facilities for EMPAC; developed forms to run the APIs.

● Developed interfaces using the Meridium Interface Manager tool; jobs are scheduled to run via the scheduler.

● Responsible for database repository (EUL) creation; performed ETL using DataStage Manager, Director, and Designer.

● Microsoft SQL Server/Sybase/Oracle administration for database performance tuning and capacity management planning.

Environment: Oracle Database, IBM DB2, Discoverer BI/OBIEE, EBS financial application, PeopleSoft HCM 9.2, Logi Analytics, Tableau, Crystal Reports writer, IBM DataStage ETL, T-SQL, Sybase


