
Integrations Lead

Location:
San Francisco, CA
Posted:
June 10, 2025

Resume:

BALAJYOTHI KOTHAGUNDLA LEAD INTEGRATIONS/SOLUTION ARCHITECT

Dublin, California

******.***@*****.***

Contact: 408-***-****

LinkedIn Profile: www.linkedin.com/in/jyothi-kothagundla-65a93512

OBJECTIVE:

Accomplished IT professional with extensive experience as a Developer, Solutions Consultant, Solutions Architect, and Data Engineer, specializing in Digital Transformation, Data Integration, and Business Intelligence. Proven expertise in data capture, analysis, ETL, and data warehousing to support end-to-end solution design, application development, and enterprise architecture. Seeking a challenging role as a Lead Data Engineer or Solutions Architect to drive innovative, scalable, and data-driven solutions aligned with business goals.

WORK AUTHORIZATION: US Green Card Holder.

PROFESSIONAL SUMMARY:

●A results-driven Data Engineer with demonstrated success in the ETL/ELT design, development, and deployment of Enterprise Data Warehouse applications built on cloud and on-premises platforms.

●Over 15 years of experience in Technical Lead and Solutions Architect roles spanning Professional Services, Data Warehousing, Business Intelligence, databases, reporting, and ETL/ELT processes.

●Over 7 years of expertise in Customer Success, Technical Consulting, Technical Leadership, Project Management, solving complex integration problems, and Subject Matter Expertise on data logistics products.

●Over 5 years of experience working at a startup, collaborating with Marketing, Engineering, and Sales teams.

●Delivered and implemented Professional Services engagements, pre-sales/post-sales support, and technical POCs for Data Analytics and BI projects.

●Experience with Customer Success, product evaluations, testing, and researching and learning new technical products.

●Extensive experience with traditional ETL tools (Talend, IBM DataStage, Informatica) and dbt.

●Results-driven Solutions Architect with demonstrated success across industries including Supply Chain, Manufacturing, Retail, Aviation, and E-Commerce.

●Experience working with various integration platforms, including ERP, legacy, cloud, and e-commerce systems.

SKILLS:

●Integration Architecture: API, ESB, ETL, ELT

●Cloud Platforms: AWS, GCP

●Solution Design and Technical Leadership

●Enterprise Systems Integration: ERP, CRM (HubSpot, Oracle NetSuite), Salesforce, QuickBooks.

●HRIS and ATS Integrations: BambooHR, Achievers, SurveyMonkey, Motus, iCIMS

●Data Modeling and Analytics: dbt, Snowflake

●Integration Platforms: IBM DataStage, Talend, Hulft/Data Spider, Informatica.

●Operating Systems: UNIX, Linux, Windows, Mainframes (AS/400).

●Programming Languages: Python, SQL, PL/SQL, C, Java, Visual Basic.

●Scripting Languages: UNIX Shell Scripting, HTML, DHTML, and JavaScript.

●Cloud Databases/Data Warehouses: PostgreSQL, SQL Server, Snowflake

●RDBMS Databases: Oracle, SQL Server, DB2 UDB, IBM DB2, Teradata, PostgreSQL.

●Methodologies: Kimball, Inmon, 3NF, Data Modeling, Dimensional Modeling

●Scheduling Tools: Control-M Desktop/Enterprise Manager, Autosys, Talend Qlik TMC.

●BI Tools: Oracle Business Intelligence Suite Enterprise Edition (OBIEE), Power BI, Tableau

EXPERIENCE:

Integrations Lead, Nevro

San Mateo, CA

Oct’2023-Present

●Providing data integration solutions for a large-scale enterprise system built for clinical trial and implant data.

●Leading the engineering team at Nevro through continuous delivery of quality projects.

●Working closely with business stakeholders to understand the full project delivery life cycle.

●Mentoring the team on best practices for both data and analytics solutions.

●Working with cross-functional teams to serve the data needs of applications (iCIMS, Navex, SurveyMonkey, ETQ, SFDC, QAD).

●Worked on various integrations and built scalable ELT/ETL models in Talend and dbt.

●Developing transformations and models in dbt that feed BI reporting metrics.

●Writing and maintaining Python scripts for integrations supporting HR systems such as BambooHR.

●Consistently meeting goals while improving planning and insight for IT and HRIS projects.

●Worked on product integration projects for SIJ and SCS involving quality, BOM, invoicing, and testing.

●Consolidating data silos into a single data warehouse that serves analytics for customers and internal business stakeholders.

●Understanding the overall application landscape and working effectively on retiring and replacing legacy systems.

●Worked on end-to-end retrieval of raw data from sales reps’ devices, converting it from Amazon S3 into Snowflake via Snowpipe and dbt models.

●Worked on design, solutioning, and documentation for ETL/ELT processes built on the Talend/Qlik tool, with a modern data warehousing architecture on Snowflake.

●Worked on data modeling for the Turns Report and Programmers data, moving raw sales-rep data into dimensional models in the data warehouse.

●Worked on the Talend migration project, upgrading cloud/remote engines from 7.3.x to 8.x.

●Drove process improvement and collaboration to help the team build more effective ETL/ELT and data integration solutions.

●Developing and maintaining various processes on cloud databases (Snowflake, SQL Server) for complex data integration problems.

●Working on high-priority issues, analyzing cause and effect, and mitigating risk.

Solutions Architect, Hulft Inc. (ETL/ELT data logistics e-commerce platform, on-premises and cloud)

San Mateo, CA

May’2018-Feb’2023

●Meeting with potential and existing customers, as well as business partners who in turn promote our solutions, to explain and recommend our offerings.

●Preparing presentation materials and conducting sales seminars; providing remote and onsite demonstrations, training, and consulting for customers and partners.

●Understanding customers' technical requirements, analyzing their needs, and determining the functional and technical specifications that meet each customer’s particular business needs in order to propose specific solutions.

●Evaluating and researching various products; analyzing and preparing comparison reports for the COO and internal Hulft board members.

●Preparing product demos, POCs, project planning, project scope, and system design for external clients.

●Concurrently managing customer projects and internal projects and coordinating with the engineering teams for the product feature requests and enhancements.

●Developed various solutions for Inventory Management, Warehouse Reconciliation, and PO/PL Comparison on AWS Cloud Services, which customers in turn used as mobile apps.

●Worked with various data sources, including structured and semi-structured formats such as XML and JSON.

●Worked on various data validation and quality projects with JSON files and Postman.

●Worked on a complex project that automates the complete aviation invoice and shipment process using Hulft Integrate (ETL/ELT) and OCR technology.

●Worked on various integration projects that bring data from multiple web applications into the native BI tool.

●Extensive use of APIs and REST services with OAuth for data extraction from various systems.

●Developed Power BI reports for various BI projects serving both internal and external customers.

●Supported and researched native and third-party products and cloud-based adapters, investigating various technologies and always looking for scalable solutions.

Application Systems Engineer, UCOP, Oakland CA

May’2015 – May,2018

Environment:

IBM Information Server DataStage 8.1/11.3, IBM DB2 Database, Cognos, SQL Server 2008, Jira, Unix, SQL Developer, Control-M, ServiceNow.

Responsibilities:

●Analyzing business requirements and writing mapping specifications to design complex processes for the HRB, CRM, Financial Aid, Summer Enrollment, and Undergraduate Admissions modules.

●Creating data mappings and analyzing field-to-field requirements for incoming UC XML files.

●Working on cross-functional data profiling to match the legacy system requirements.

●Creating functional and technical documents for the UC Apply process, UAD, and Enrollment data to feed the database tables from XML and fixed-format files.

●Working on the data mart and star schema, built on the Kimball methodology, for the data warehouse hosted on DB2.

●POC testing, planning, and execution of the DataStage migration from 8.1 to 11.3.

●Creating project estimates for Data Warehouse DSS projects.

●Worked on migrating the vsftp file-encryption mechanism to GoAnywhere.

●Designed and developed DataStage jobs for the end-to-end ETL process flow.

●QA testing and validation sign-off as part of integration and regression testing.

●Developing new documentation, departmental technical procedures, and user guides.

●Developing and testing Unix scripts for the Control-M batch flow.

●Working in an Agile environment and using Jira actively.

●Analyzing business requirements for new ETL projects and preparing ETL sizing, functional design documentation, mapping specifications, testing plans, and deployment plans.

●Working in a multi-project environment on BI projects covering data from all University of California campuses.

●Production and Operations support for the ETL batch processes on CRM/HRB/DSS systems.

Sr. ETL Developer, Wells Fargo, Fremont, CA

May’2013 – May’2015

The Wholesale Loan Services (WLS) data mart is an infrastructure that provides operational-level reporting to various managers in WLS. The scope of this project was to capture data from the various application databases, in file and database-table formats, and feed the data mart.

Environment:

IBM Information Server DataStage 8.1, Informatica PowerCenter 8.1, Oracle 11g, Cognos, SQL Server 2008, Jira, Unix, SQL Developer, Autosys 11.3, Pac 2000.

Sr. ETL Developer, Sephora, San Francisco, CA

Oct’2012 – March’2013 & Nov’2014 –Jan’2015

Provided accurate gift card reporting (sales and total redemptions) for Sephora USA and Sephora Canada across all channels (Stores, Dotcom, Mobile), sold directly by Sephora and through 3rd-party channels, at a weekly, MTD, and YTD cadence.

Environment:

DataStage 9.1, Tableau 8.0, Mainframes (AS/400), SQL Server, Oracle 10g, SSRS, SQuirreL SQL, SQL Developer, SQL Server Management Studio, Jira ticketing system.

Technical Lead, Gap Inc, San Francisco, CA.

Oct’ 2011 – Dec’2012.

GLV is the Global Logistics Visibility system, from which logistics-related data is imported into the PROMPT system. To keep PROMPT updated with the latest logistics data from GLV, input feed files are provided, and GLV data is loaded into PROMPT through separate inbound batches.

Environment:

Ascential DataStage 7.5.1 (Parallel Jobs), IBM Information Server 8.5, Informatica Power Center 8.1, Message Broker, OBIEE Reports/Dashboards, Mainframe DB2, Mainframe ESP, UNIX AIX, MS Visio, Serena Dimensions, Unix Shell scripting.

Business Intelligence Consultant, Blackhawk Networks, Pleasanton, CA.

May’ 2011 – Sep’ 2011.

This project covered Blackhawk core-system data and reporting maintenance: data centralization and data integration of the interfacing systems TRACS, IMS, BLISS, PAY GO, Galileo, and e-funds.

EDW is the Enterprise Data Warehouse, which stores Blackhawk enterprise data for reporting and analysis. This system extracts Sales, Supply Chain, and Issuance data from various interfacing systems (TRACS, IMS, BLISS, PAY GO, Galileo, and e-funds), either directly or through external data-file loads. The EDW massages the data it pulls and stores it as facts and dimensions.

Environment:

IBM DataStage Information Server 8x (Parallel Jobs/Server Jobs), IBM DB2 UDB 8.0 database, Microstrategy Business Intelligence Tool 9.0, AquaData Studio 9.0.13, Windows XP, Linux, Unix Scripting, Tidal.

BI Consultant, University of Arizona, Tucson, AZ.

Jan’2010-April’ 2011.

Mosaic is the name of The University of Arizona's Enterprise System Replacement Project (ESRP). The purpose of this project was to update and augment the University’s aging core administrative systems. It is a multi-phase project with five key areas: Student Administration, Financials (Kuali Financial System, KFS), Human Resources/Payroll, Research Administration (Kuali Coeus, KC), and Business Intelligence.

Environment:

IBM DataStage Information Server 8x (Parallel Jobs), Ascential Datastage 7.5.2(Server Jobs), Fast track, Datanamic Dezign for Databases V5, Talend, Control-M Desktop/Enterprise Manager, Oracle Business Intelligence Suite Enterprise Edition (OBIEE), PeopleSoft, Application Designer, Oracle 10g, SQL Developer 2.1.1, Windows XP, KC 3.0, Linux.


BI/Datawarehouse Consultant, Astellas Pharma US, Deerfield, IL.

Sep’09-Dec’09

The purpose of the project was to implement the EDGE Continuous Loop Promotion (CLP) system for Astellas. The vision of the EDGE solution is to allow Astellas Sales Representatives to interact with their customers through the delivery of targeted messages to Health Care Professionals, enabling Astellas to control and utilize its sales force in the most effective manner.

Environment:

Informatica Power Center 8.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), Oracle Business Intelligence Enterprise Edition (Answers, Dashboards), Oracle 10g, Toad, Sql* plus, MS Visio, Erwin 7.3, Pl/Sql, SQL Server 2005, Windows XP.

Research Assistant (BI Applications), College of Health Sciences, University of Texas at El Paso, El Paso, TX.

Sep’07-Dec’07

System Analyst, Computer Applications & Learning Center (CALC), College of Business,

University of Texas at El Paso, El Paso, TX.

May’07-Aug’07

Teaching Assistant, Department of Information and Decision Sciences, College of Business,

University of Texas at El Paso, El Paso, TX.

Jan’07-Mar’07

EDUCATION:

Master’s in Information Technology, University of Texas at El Paso, Texas, USA. Graduation Date: Dec’08

Bachelor of Technology, Computer Science, J.N.T University, Hyderabad, India. Graduation Date: May’05
