
Software Engineer Data

Location:
VasanthaNagar, Karnataka, 560001, India
Salary:
80
Posted:
February 09, 2024


Resume:

V S Ravikiran

Sr. Data Engineer

ETL Data Warehousing Business Intelligence Data Engineering

+1-518-***-****

ad3h5p@r.postjobfree.com

TECHNICAL SKILLS

ETL Tools: Azure Data Factory, Azure Databricks, Azure Synapse, SnapLogic, Informatica, SSIS, ODI

BI Tools: Power BI, Tableau, QlikView, OBIEE

Databases: Snowflake, Azure SQL, Oracle ADW, SQL Server, MySQL

Cloud Suite: Azure, AWS

Streaming: Kafka, Spark Streaming

Source Applications: Oracle EBS, PeopleSoft, Siebel CRM

Scheduling: Dollar Universe, DAC

Programming/Scripting: Python, SQL, PL/SQL, Unix, PowerShell

Version Control: VFS, Git

Ticketing Tools: VersionOne, JIRA, ServiceNow

EDUCATION

Master of Computer Applications, Kakatiya University, Andhra Pradesh, 2007 - 2010

WORK EXPERIENCE

Sr. Software Engineer - Lead IT Consulting - Jun 2020 to present

Tech Lead - Projects - Cognizant - Dec 2014 - Jun 2020

Software Engineer - TEK Systems - Feb 2011 - Dec 2014

CERTIFICATIONS

Oracle PL SQL Developer Associate

PROFILE SUMMARY

•IT professional with 13+ years of expertise in Data Warehousing, Data Integration and BI technologies such as Azure Data Factory, Azure Databricks, SnapLogic, Informatica, ODI, Snowflake, Oracle, Power BI, Tableau & OBIEE.

•Expertise in Snowflake Cloud DWH and strong experience in other cloud technologies like Azure and AWS.

•Expertise in working with Snowflake Multi-Cluster Warehouses, Snowpipes, Internal/External Stages, Stored Procedures, Clones, Tasks and Streams.

•Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading data into Snowflake tables.

•Involved in the end-to-end migration of SQL Server and Oracle objects such as tables, views, functions and procedures to Snowflake.

•Working experience with the Snowflake architecture and its unique concepts such as Zero-Copy Cloning, Time Travel, data sharing and data replication.

•Expertise in creating data pipelines using SnapLogic and Azure Data Factory.

•Expertise in developing frameworks for cloud orchestration tools to capture CDC from traditional ERP systems, REST APIs and web services.

•Expertise in loading external tables with Parquet files, streams and tasks.

•Implemented complex pipelines in ADF and SnapLogic to extract data from REST APIs, handling pagination and both structured & unstructured data.

•Implemented CI/CD pipelines for Azure SQL, Azure Data Factory using Azure DevOps.

•Experienced in Agile methodology, from design, implementation, solutioning and testing through delivery, maintenance and support.

•Handled the Azure DevOps branching strategy using Sourcetree: created and merged release branches into the master branch, and reviewed and approved code pushed by developers.

•Implemented VPD at the DB level to restrict PII/PHI information from unauthorized users.

•Delivered technical training on OBIEE & ODI to technical and end users at onshore locations.

•Expertise in installation, development, configuration, validation, deployment & reporting of OBIA 11.1.1.10.2 (ODI 12c & OBIEE 12c) on Linux machines.

•Mentored Junior Engineers and prepared knowledge base documents on Snowflake.

•Involved in Snowflake knowledge sharing sessions within the organization.

•Adaptable to new environments with a knack for learning new skills.

PROJECT DETAILS

Lead IT Corporation; Duration: May'20 – Present

As Technical Lead Projects

Enterprise DWH Implementation (Oct'21 – Present)

Client: Humana Inc.

Environment: Snowflake, Snap logic, Informatica, MS SQL Server, Oracle

Requirement Gathering

Involved in gathering new requirements and enhancements to existing ones.

Enterprise Datawarehouse Initiative

Reengineered the solutions developed for existing on-prem source systems to the ServiceNow Cloud.

Developed mappings to extract data from the qTest webhook to overcome the limitations of the qTest REST APIs.

Developed complex routines to extract data from REST APIs.

Developed nested SnapLogic pipelines to extract data from REST APIs.

Implemented pagination techniques to overcome the limitations of the APIs.
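The pagination pattern above can be sketched in Python. This is an illustrative stub, not the actual pipeline: the hypothetical fetch_page stands in for one REST call, and the loop stops when a short page signals the end of the data.

```python
def fetch_page(records, offset, limit):
    """Stand-in for a single REST call: returns one page of source records."""
    return records[offset:offset + limit]

def fetch_all(records, limit=100):
    """Page through the source until a short page signals the end of data."""
    out, offset = [], 0
    while True:
        page = fetch_page(records, offset, limit)
        out.extend(page)
        if len(page) < limit:  # last page reached
            return out
        offset += limit

# Pretend the source lives behind a paginated REST API.
rows = fetch_all(list(range(250)), limit=100)
```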

Developed reusable pre- and post-workflow snaps to optimize the development framework.

Developed a framework to overcome SnapLogic's limitation around restarting a job from the failed pipe.
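One way such a restart framework can work is a checkpoint recording which pipes completed, so a rerun skips straight to the failed step. The sketch below is a hypothetical simplification, not the SnapLogic implementation itself.

```python
import json
import os
import tempfile

# Hypothetical checkpoint file; a real framework would scope this per job run.
CHECKPOINT = os.path.join(tempfile.gettempdir(), "pipeline_checkpoint.json")

def run_pipeline(steps):
    """steps: list of (name, callable). On rerun, completed steps are
    skipped, so execution resumes from the pipe that failed."""
    done = []
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            done = json.load(f)["done"]
    for name, fn in steps:
        if name in done:
            continue  # finished on a previous run
        fn()  # may raise; the checkpoint written so far survives
        done.append(name)
        with open(CHECKPOINT, "w") as f:
            json.dump({"done": done}, f)
    if os.path.exists(CHECKPOINT):
        os.remove(CHECKPOINT)  # clean finish: next run starts fresh
    return done
```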

Extensively used JSON snaps to flatten unstructured data.
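The flattening those snaps perform is essentially this recursive collapse of nested objects into dotted column names suitable for a relational target (a generic sketch, not SnapLogic's own logic):

```python
def flatten(obj, prefix=""):
    """Collapse nested dicts into dotted keys usable as relational columns."""
    flat = {}
    for key, val in obj.items():
        name = f"{prefix}{key}"
        if isinstance(val, dict):
            flat.update(flatten(val, name + "."))  # recurse into nested object
        else:
            flat[name] = val
    return flat

row = flatten({"id": 1, "member": {"name": "A", "plan": {"tier": "gold"}}})
```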

Introduced a reporting layer to remove the burden on the ETL Layer and reduce the Report development effort.

Converted the ODI and Informatica mappings developed for legacy systems to SnapLogic.

Performance Tuning

Involved in tuning ETL jobs in Informatica, SnapLogic and ODI, optimizing the ETL run time.

Introduced a reporting schema in Snowflake to optimize the report development phase.

Security

•Implemented Data Level and Object Level security to safeguard the data in Snowflake

Team Management

Daily interaction with team to ensure deliverables are met on time

As Senior Data Engineer

T-Mobile (May'20 – Oct'21)

Tools: Snowflake, SQL Server Management Studio, Oracle, ADF, Informatica PowerCenter, Power BI.

Requirement Gathering

Experienced in requirement gathering and gap analysis.

Marketing DataMart Initiative

Created internal and external stages and transformed data during load.

Created tables, views, materialized views and secure views for loading data from external sources.

Created Snowpipes to auto-ingest JSON and CSV files from an AWS S3 bucket into Snowflake stages.

Implemented Tasks and Streams in Snowflake to capture the change data and transform/process the data into Snowflake tables.
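Conceptually, a stream exposes only the rows that changed between reads. The toy Python sketch below illustrates that idea by diffing two snapshots of a table; it is an analogy for the change-capture step, not Snowflake internals.

```python
def capture_changes(previous, current):
    """Both arguments map primary key -> row; returns (inserts, updates)."""
    inserts = {k: v for k, v in current.items() if k not in previous}
    updates = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    return inserts, updates

# Two snapshots of a hypothetical source table.
prev = {1: {"status": "new"}, 2: {"status": "open"}}
curr = {1: {"status": "new"}, 2: {"status": "closed"}, 3: {"status": "new"}}
inserts, updates = capture_changes(prev, curr)
```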

Created Stored Procedures in Snowflake to Cleanse and Transform the data as per the client requirements.

Designed Staging, Integration, Information and Archive areas on S3

Used temporary and transient tables on different datasets.

Shared sample data with the customer for UAT by granting access.

Created S3 events for job triggering.

Performance tuning of Big Data workloads.

Developed incremental Spark jobs to extract data from source databases into staging tables created on S3.
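The incremental logic behind such jobs is a high-water mark: only rows modified after the last successful load are extracted, and the watermark advances with each batch. A hypothetical plain-Python sketch (the real jobs ran on Spark):

```python
def extract_incremental(source_rows, last_watermark):
    """Filter the source to rows changed since the last load; return the
    batch together with the new watermark for the next run."""
    batch = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in batch),
                        default=last_watermark)
    return batch, new_watermark

rows = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 25}]
batch, watermark = extract_incremental(rows, 10)
```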

Developed ETL jobs with transformations per the existing job designs in Snowflake and SnowSQL, and loaded Dimension, Fact and Fact Summary tables.

Worked on Snowflake streams to process incremental records.

Designed the update and insert strategy for merging changes into existing Dimension and Fact tables.
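That update/insert (merge) strategy reduces to: match incoming rows on the business key, update the matches, insert the rest. A minimal sketch with hypothetical column names:

```python
def merge_dimension(dim_rows, incoming, key="id"):
    """Upsert incoming rows into the dimension on its business key."""
    by_key = {row[key]: row for row in dim_rows}
    for row in incoming:
        # Merge over the existing row (update) or start fresh (insert).
        by_key[row[key]] = {**by_key.get(row[key], {}), **row}
    return list(by_key.values())

dim = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
merged = merge_dimension(dim, [{"id": 2, "name": "Robert"},
                               {"id": 3, "name": "Cara"}])
```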

Unloaded data from Snowflake to send to downstream systems.

Performance Tuning

Involved in tuning ETL jobs in ADF and optimizing the ETL run time.

Introduced a reporting schema in Snowflake to optimize the report development phase.

Security

•Implemented Data Level and Object Level security to safeguard the data in Snowflake

Team Management

Managed an offshore team, gathered new requirements and shared them across the team.

Daily interaction with team to ensure deliverables are met on time

Cognizant Technology Solutions; Duration: Dec'14 – May'17 (India), May'17 – Jun'20 (USA)

As Technical Lead Projects

Humana Inc. – Enterprise DWH Implementation (Jul'18 – Apr'20)

Environment: Snowflake, SnapLogic, Informatica, ADF, MS SQL Server, Oracle

Requirement Gathering

Involved in gathering new requirements and enhancements to existing ones.

Enterprise Datawarehouse Initiative

As part of the EDW program, developed datamarts for 13 different source systems with storage technologies like Oracle, SQL Server, REST APIs and flat files.

Reengineered the existing stored procedures, functions, views and agent jobs in SQL Server and Oracle and converted them to Snowflake.

Created external tables pointing to Azure Blob storage (JSON/Parquet files) and pushed data to permanent tables using stored procedures.

Reengineered SSIS, Informatica and ODI jobs to SnapLogic. Developed a SnapLogic framework to handle incremental and full loads.

Developed reusable workflows

Implemented a Snowpipe solution to move data from on-premises systems to Snowflake using Informatica PowerCenter and Python scripts.

Identified PII columns and implemented dynamic data masking policies for them.
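The effect of such a masking policy can be sketched as a role check: authorized roles see the raw value, everyone else a redacted form. This is illustrative only; Snowflake enforces this declaratively via masking policies attached to columns.

```python
def mask_pii(value, role, authorized=("PII_READER",)):
    """Return the raw value for authorized roles, a masked form otherwise.
    Role and policy names here are hypothetical."""
    if role in authorized:
        return value
    # Keep only the last two characters of longer values.
    return ("***" + value[-2:]) if len(value) > 2 else "***"

masked = mask_pii("123-45-6789", "ANALYST")       # redacted for this role
unmasked = mask_pii("123-45-6789", "PII_READER")  # full value
```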

Created procedures in Snowflake to validate all views after production deployments.

Created view lineage using built-in Snowflake functions.

Performance Tuning

Involved in tuning ETL jobs in Informatica, SnapLogic and ODI, optimizing the ETL run time.

Introduced a reporting schema in Snowflake to optimize the report development phase.

Security

•Implemented Data Level and Object Level security to safeguard the data in Snowflake

Team Management

Daily interaction with team to ensure deliverables are met on time

As Associate Projects

Humana Inc. – OBIA & Custom DWH Implementation (Mar’15 – Jun’18)

•Moved to the US in 2017 to drive the custom DWH implementation, cloud migration activities and future OBIA upgrades.

•Responsible for analysing the customer requirements for the HR, TRECS, Allocadia, Spend sights, PET and OBIA systems.

•Single-handedly delivered the OOTB Analytics for Finance, Projects and Procurement & Spend for the Oracle BI Applications 11.1.1.8.1 product.

•Developed ETL framework and Reporting layer schema to optimize the Reporting development time.

•Performance tuned the ETL and Reporting layer to optimize the runtime.

Yankee Candle (Dec’14 – Feb’15)

•Responsible for performing gap analysis between the current and new system.

•Converted the Discoverer reports to BI Publisher reports for the AP, AR and GL subject areas.

Frontline Consulting Services (TEK Systems), India; Duration: Feb’11 – Nov’14

As Software Engineer

Fujitsu (Jan’14 – Nov’14)

•Implemented OOTB OBIA Analytics for Finance, Projects and Procurement & Spend for the Oracle BI Applications 11.1.1.7.1 product.

•Customized the ETL & Reporting Layer as per the customer requirements.

•Among the very first teams to implement the new OBIA product with its new architecture.

Oracle Corporation (Jan’13 – Dec’13)

•Developed ETL & Reporting Layer for Process Manufacturing and Enterprise Asset Management Analytics for Oracle BI Applications 11.1.1.8.1 product.

As Associate Software Engineer

Arthrex Inc. (Apr’12 – Dec’12)

•Implemented OBIA 7.9.6.3 + Extension pack to enable Data Warehouse Solution for Manufacturing Analytics from EBS Source System.

•Responsible for gap analysis, extensively customized the DWH to meet customer requirements, and converted legacy Discoverer and BIP reports to OBIEE analytics.

UMUC (University of Maryland) (Sep’11 – Mar’12)

•Responsible for implementing an Enterprise Data Warehouse solution for the Student Analytics module from the PeopleSoft source system, using ODI 11g and OBIEE as the ETL and Reporting layers respectively.

As Trainee Engineer

GXS (Feb’11 – Sep’11)

•Implemented B2B mappings for EDI standards such as ANSI X12 and EDIFACT for various customers, including Panasonic, Cargill, Sony & Home Depot.

•Developed inbound and outbound mappings for the following documents: 810, 835, 837, 850, 856, 860, 867, 865.

Trainings

•Trained junior engineers on the GXS AI workbench.

•Provided training to peers on ODI 11g and BI Apps 11g configuration and customization.

•Trained peers and juniors on SnapLogic.


