VENKY VISHNUBHATLA
Phone: 646-***-**** Email: ******.************@*****.***
A Senior Data Engineer with 19 years of extensive experience in Business Intelligence/ETL/Data Warehouse/AWS services, programming, QA, and leadership, with a comprehensive understanding of all stages of the project lifecycle. Possesses excellent communication and interpersonal skills with the ability to work as a leader as well as an independent resource.

Education:
M.S., Electrical Engineering, University of Texas at El Paso, Texas, Dec 2003

Summary:
ETL - Microsoft database suite across OLTP, OLAP, Data Warehouse, EDW, Data Modelling, ETL Architecture, and Data Integration.
AWS - building data pipelines, data ingestion, and analysis, both from on-prem and within the cloud.
Managing & Leading - cross-functional teams, project planning and deliverables, onsite resource management, vendor management, and process improvement.
Experience:
Rocket Central Mortgage, Detroit, MI  Nov 13 – Present
Current Role: Data Engineer
Data Modelling for OLAP systems: both logical and physical data modeling, implementing star schemas for various business solutions in the EDW data warehouse.
Designing and building SSIS packages, SSAS cubes, and SQL Agent jobs for multiple types of source systems such as Data Lake, Hive, AWS S3, SQL Server, flat files, and Salesforce.
T-SQL development: implementing stored procedures, functions, views, and data integrity constraints.
Extensive experience in performance tuning and query optimization using execution plans, Query Store, Extended Events, Performance Monitor, and DMVs.
Attunity Replicate data ingestion for CDC from SQL sources to Data Lake/AWS S3.
Develop data ingestion pipelines and workflows using AWS Glue, Spark, Lambda functions, API Gateway, DynamoDB, S3, SQS, EMR, CloudWatch, and other AWS services.
Extensively develop databases, tables, and queries using Presto SQL for analysis in AWS Athena.
Coordinate with DevOps on Terraform/infrastructure deployments and update IAM roles for the services/applications involved.
Use Git for source control, work in an Agile environment, and mentor other team members.

Environment: MS SQL Server 2016/2012/2008 R2, Visual Studio 2017, SSIS 2008/2012/2016, SSAS, Tableau 9.0, Power BI, C#, .NET Framework 4.0, YARN, AWS Services, Adobe Analytics, Python, Team Foundation Server, Confluence, REST API

Client: Ally Financial, Kaiser Permanente, Detroit, MI  Oct 09 – July 13
Role: Lead Programmer Analyst
Responsibilities:
Data Modelling and design
Develop Informatica mappings using transformations such as Aggregator and Lookup, source filters in Source Qualifiers, and data flow management into multiple targets to populate star schemas.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.
API/Web services Testing using internally developed tool.
Work on multiple projects simultaneously; develop and maintain project timelines, tasks (WBS), estimates (LOE), schedules, and deliverables for each.
Creation and maintenance of the Master Test Plan and test strategy.
Manage project resources.
Lead defect triage calls discussing status, progress, bugs, and defects, and ascertaining priority/severity during project execution cycles.
Environment: Java/J2EE, JSP, Oracle 10g, Unix, MS Project, Quality Center, QTP, Performance Center, Python, Informatica PowerCenter 8.x
OppenheimerFunds Inc., Manhattan, NY  Apr 07 – Aug 09
Role: Sr. Technical QA Analyst
Consolidated Data Repository is a data warehouse/business intelligence project for the entire firm, enabling the Sales and Marketing team to create and execute reports in Siebel Analytics to support decisions on the sales, marketing, and distribution of the various mutual fund products offered by the company. Sales and marketing data from Siebel CRM and mainframe sources is populated along different dimensions in the data warehouse, and analysis/reporting is done via Siebel Analytics.

Responsibilities:
Create the Master Test Plan from the Requirements Specifications and develop test procedures from the best practices.
Develop Test cases using PL/SQL Queries and Stored Procedures for data analysis, manipulation and testing data in the Oracle data warehouse.
Analyze and validate the Informatica mappings for source data, and populate tables in the lower environments by executing workflows from the Workflow Manager of Informatica PowerMart 6.2 and PowerCenter 8.1.1.
Execute and Monitor Autosys Jobs to test various data loads.
Develop, debug, execute, and maintain test automation scripts using QTP 9.0.

Environment: Siebel 7.7, Siebel Analytics, Informatica PowerMart 6.2.1, Informatica PowerCenter 8.1, Cognos 8.x, Oracle 9.1, SQL, PL/SQL, VBScript, Embarcadero Rapid SQL, Quality Center 9.2, QTP 9.0

Client: United Bank of Switzerland, Wells Fargo FMG  July 04 – Apr 07
Role: Programmer Analyst
Responsibilities:
Perform Business Requirements and Gap Analysis.
Functional regression testing, UAT, and BAT (business acceptance testing) using automation scripts developed in QTP 9.0 and Perl scripting.
Performance testing using Load Runner
Used Test Director 7.6 for defect management.
SQL: Create and maintain test environments/test data in SIT/UAT/PROD.

Environment: Java/J2EE, RapidSQL, Oracle, MS SQL Server 2000, 3270 Emulator, QTP 8.2, Quality Center 8.2, LoadRunner 8.0, Caliber RM 2000, PVCS Version Manager, PVCS Tracker