
BALAJYOTHI KOTHAGUNDLA
LEAD INTEGRATIONS / SOLUTION ARCHITECT

Dublin, California

******.***@*****.***

Contact: 408-***-****

LinkedIn Profile: www.linkedin.com/in/jyothi-kothagundla-65a93512

OBJECTIVE:

Accomplished IT professional with extensive experience as a Developer, Solutions Consultant, Solutions Architect, and Data Engineer, specializing in Digital Transformation, Data Integration, and Business Intelligence. Proven expertise in data capture, analysis, ETL, and data warehousing to support end-to-end solution design, application development, and enterprise architecture. Seeking a challenging role as a Lead Data Engineer or Solutions Architect to drive innovative, scalable, and data-driven solutions aligned with business goals.

WORK AUTHORIZATION: US Green Card Holder.

PROFESSIONAL SUMMARY:

●Over 15 years of experience as a Technical Lead and Solutions Architect across professional services, data warehousing, business intelligence, databases, reporting, and ETL/ELT processes.

●Over 7 years of expertise in customer success, technical consulting, technical leadership, project management, and solving complex integration problems, with subject-matter expertise in data logistics products.

●Over 5 years of experience working at a startup, collaborating with Marketing, Engineering, and Sales teams.

●Delivered and implemented professional services, pre-sales/post-sales engagements, and technical POCs for data analytics and BI projects.

●Extensive experience with traditional ETL tools (Talend, IBM DataStage, Informatica) and modern ETL/ELT tools (dbt).

●Experience with reporting/data visualization tools (Tableau, Power BI).

●Results-driven Solutions Architect with demonstrated success across industries including supply chain, manufacturing, medical/healthcare, retail, aviation, and e-commerce.

●Experience working with various integration platforms, including ERP, legacy, cloud, and e-commerce systems.

SKILLS:

●Operating Systems: UNIX, Linux, Windows, IBM AS/400.

●Programming Languages: Python, SQL, PL/SQL, C, Java, Visual Basic.

●Scripting Languages: UNIX shell scripting, HTML, DHTML, and JavaScript.

●Cloud Databases: PostgreSQL, SQL Server.

●Cloud Data Warehouses: Snowflake.

●Cloud ETL/ELT Tools: Stitch, Talend, Qlik Data Integration Platform, dbt.

●RDBMS: Oracle, SQL Server, IBM DB2 (UDB), Teradata, PostgreSQL.

●ETL Tools: DataStage, Talend, HULFT/DataSpider, Informatica.

●Methodologies: Kimball, Inmon, 3NF, Data Modeling, Dimensional Modeling.

●Scheduling Tools: Control-M Desktop/Enterprise Manager, Autosys, Talend/Qlik TMC.

●BI Tools: Oracle Business Intelligence Suite Enterprise Edition (OBIEE), Power BI, Tableau, SSRS, SSAS.

EXPERIENCE:

Lead Integration Developer, Nevro

San Mateo, CA

Oct 2023 - Jun 2025

●Provided data integration solutions for a large-scale enterprise system built for clinical trial and implant data.

●Partnered with HR Operations and the People team to reconcile employee data for disclosures and regulatory compliance.

●Worked with the People team and an external vendor to onboard Nevro's employees to Achievers (a rewards platform).

●Automated data quality checks and dimensional models to harmonize BambooHR (ATS) extracts and MOTUS datasets, improving data consistency and compliance.

●Wrote and maintained Python scripts for integrations supporting HR systems such as BambooHR (ATS).

●Consistently met project goals and improved planning and visibility for IT and HRIS projects.

●Applied dimensional modeling and data warehousing practices to harmonize clinical, implant, and HR data into Snowflake for analytics.

●Worked on cross-functional teams to serve the data needs of enterprise-wide applications (iCIMS, NAVEX, SurveyMonkey, ETQ, SFDC, QAD).

●Delivered Tableau dashboards and advanced analytics for clinical, financial, and regulatory stakeholders, providing visibility into device utilization, patient outcomes, and billing trends.

●Led the engineering team at Nevro, continuously delivering quality projects.

●Worked closely with business stakeholders to understand the full project delivery life cycle.

●Mentored the team on best practices for both data solutions and analytical solutions.

●Worked on various integrations and built more scalable ETL/ELT models in Talend and dbt.

●Developed transformations and models in dbt that feed BI reporting metrics.

●Worked on product integration projects for SIJ and SCS involving quality, BOM, invoicing, and testing.

●Consolidated data silos into a single data warehouse that serves analytics for customers and internal business stakeholders.

●Built an understanding of the overall application landscape and worked effectively on mitigating and replacing legacy systems.

●Worked on end-to-end data retrieval from sales reps' devices, converting raw data landed in Amazon S3 into Snowflake via Snowpipe and dbt models (a sketch of this pattern follows this list).

●Worked on design, solutioning, and documentation for ETL/ELT processes built on Talend/Qlik with a modern data warehousing architecture on Snowflake.

●Performed data modeling for the Turns report and programmer data, taking sales rep data into the data warehouse from raw data to dimensional models.

●Developed the Turns report in Tableau to present COGS metrics using date functions, time functions, conditional formatting, filters, and joins.

●Built data pipelines in dbt to integrate patient data, rep data, and account information from Salesforce Lightning into Snowflake.

●Worked on the acquisition/merger data mining project, migrating ETL processes from DataStage 11.7 to the Snowflake data warehouse.

●Performed data analysis for the acquisition/merger project, consolidating data from sources such as Amazon S3, Oracle NetSuite, QuickBooks, and CSV files.

●Migrated ETL processes built in SQL Server/Talend to dbt models, covering the end-to-end solution, architecture, and deployment.

●Worked on the Talend upgrade/migration project, moving cloud and remote engines from 7.3.x to 8.x.

●Drove process improvement and collaboration, helping the team build more effective ETL/ELT and data integration solutions.

●Developed and maintained processes on cloud databases (Snowflake, SQL Server) for complex data integration problems.

●Worked high-priority issues, performing cause-effect analysis and mitigating risk.
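
Below is a minimal sketch of the S3-to-Snowflake ingestion pattern described above, using the snowflake-connector-python package. All connection parameters and object names (stage, pipe, table) are hypothetical placeholders; in the production setup Snowpipe auto-ingests new files and dbt models handle the downstream transformation.

# Minimal sketch: land raw device files from S3 in Snowflake via a stage and
# pipe, then let dbt models transform them. All names here are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount",     # hypothetical account identifier
    user="etl_user",
    password="...",          # use a secrets manager in practice
    warehouse="ETL_WH",
    database="RAW",
    schema="DEVICES",
)
cur = conn.cursor()

# One-time DDL: an external stage over the S3 bucket where device files land.
cur.execute("""
    CREATE STAGE IF NOT EXISTS device_stage
      URL = 's3://example-bucket/device-exports/'
      CREDENTIALS = (AWS_KEY_ID = '...' AWS_SECRET_KEY = '...')
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# A pipe wraps the COPY so Snowpipe can auto-ingest new files as they arrive
# (AUTO_INGEST also requires S3 event notifications to be configured).
cur.execute("""
    CREATE PIPE IF NOT EXISTS device_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw_device_readings FROM @device_stage
""")

# For manual backfills, the same COPY can be run directly.
cur.execute("COPY INTO raw_device_readings FROM @device_stage")
conn.close()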

Solutions Architect, HULFT, Inc. (on-premises and cloud ETL/ELT data logistics and e-commerce platform)

San Mateo, CA

May 2018 - Feb 2023

●Met with potential and existing customers, as well as business partners who in turn promote our solutions, to explain and recommend our offerings.

●Prepared presentation materials and conducted sales seminars; provided remote and onsite demonstrations, training, and consulting for customers and partners.

●Understood customers' technical requirements, analyzed their needs, and offered new solutions; determined functional and technical specifications meeting each customer's particular business and functional needs in order to propose specific solutions.

●Evaluated and researched various products; analyzed and prepared comparison reports for the COO and the internal HULFT board.

●Prepared product demos, POCs, project plans, project scopes, and system designs for external clients.

●Concurrently managed customer and internal projects, coordinating with engineering teams on product feature requests and enhancements.

●Led a data migration project from IBM DataStage 11.3 to the HULFT Data Platform, leveraging AWS cloud infrastructure for enhanced scalability, performance, and maintainability.

●Developed solutions for inventory management, warehouse reconciliation, and PO/PL comparison on AWS cloud services, which the customer consumed as mobile apps.

●Worked with various data sources, including structured and semi-structured formats such as XML and JSON.

●Worked on data validation, data conversion, and quality projects involving JSON files and Postman.

●Provided data solutions for a retail client on a Teradata server.

●Worked on a complex project automating the complete aviation invoice and shipment process using HULFT Integrate (ETL/ELT) and OCR technology.

●Worked on integration projects integrating data from various web applications into the native BI tool.

●Made extensive use of REST APIs with Google Cloud Platform OAuth for data extraction from various systems (a sketch of this pattern follows this list).

●Developed Power BI reports for BI projects serving both internal and external customers.

●Created Power BI dashboards calculating revenue metrics for sales team opportunities.

●Supported and researched native and third-party products and cloud-based adapters, investigating various technologies while always looking for a scalable solution.
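
Below is a minimal sketch of the OAuth-protected REST extraction pattern referenced above, using the google-auth and requests libraries. The endpoint URL, scope, and response fields are hypothetical placeholders.

# Minimal sketch: call a REST endpoint secured with Google Cloud Platform
# OAuth 2.0 and pull records for downstream ETL. The endpoint and field
# names are hypothetical placeholders.
import google.auth
import google.auth.transport.requests
import requests

# Obtain application-default credentials and mint a fresh access token.
credentials, _project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
credentials.refresh(google.auth.transport.requests.Request())

# Call the source system's REST endpoint with a bearer token.
response = requests.get(
    "https://example-system.example.com/api/v1/orders",  # hypothetical endpoint
    headers={"Authorization": f"Bearer {credentials.token}"},
    params={"updated_since": "2022-01-01"},
    timeout=30,
)
response.raise_for_status()

# Hand the raw JSON records to the downstream integration flow.
for record in response.json().get("orders", []):
    print(record.get("id"), record.get("status"))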

Application Systems Engineer, UCOP, Oakland, CA

May 2015 - May 2018

Environment:

IBM Information Server DataStage 8.1/11.7, IBM DB2, Teradata, Cognos, SQL Server 2008, Jira, Unix, SQL Developer, Control-M, ServiceNow.

Responsibilities:

●Analyzed business requirements and designed and wrote mapping specifications for complex processes across the HRB, CRM, Financial Aid, Summer Enrollment, and Undergraduate Admissions modules.

●Created data mappings and analyzed field-to-field requirements for incoming UC XML files.

●Performed cross-functional data profiling and data conversion to match legacy system requirements.

●Created functional and technical documents for the UC Apply process, UAD, and enrollment data feeding database tables from XML and fixed-format files.

●Worked on the data mart and star schema, built on the Kimball methodology, for the data warehouse on DB2.

●Performed POC testing, planning, and execution of the DataStage migration project from 8.1 to 11.7.

●Created project estimates for data warehouse DSS projects.

●Built DataStage jobs loading data to Teradata using the FastLoad, TPump, and MultiLoad utilities (a sketch of a comparable scripted load follows this list).

●Worked on migrating the vsftp file encryption mechanism to GoAnywhere.

●Designed and developed DataStage 11.7 jobs for the end-to-end ETL process flow.

●Performed QA testing and validation sign-off as part of integration and regression testing.
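
Below is a minimal sketch of a scripted Teradata batch load comparable to the DataStage jobs above, using the teradatasql Python driver. Host, credentials, file, and table names are hypothetical placeholders; the production jobs used DataStage's Teradata stages with the FastLoad, TPump, and MultiLoad utilities.

# Minimal sketch: batch-insert a feed file into a Teradata staging table,
# analogous to the DataStage FastLoad/MultiLoad jobs described above.
# Host, table, and file names are hypothetical placeholders.
import csv
import teradatasql

rows = []
with open("enrollment_feed.csv", newline="") as f:  # hypothetical feed file
    for rec in csv.reader(f):
        rows.append((rec[0], rec[1], rec[2]))

with teradatasql.connect(host="tdhost", user="etl_user", password="...") as con:
    with con.cursor() as cur:
        # executemany batches the inserts; for very large feeds the driver's
        # FastLoad escape or the standalone FastLoad utility would be used.
        cur.executemany(
            "INSERT INTO edw.enrollment_stg (student_id, term, status) VALUES (?, ?, ?)",
            rows,
        )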

Sr. ETL Developer, Wells Fargo, Fremont, CA

May 2013 - May 2015

The Wholesale Loan Services (WLS) data mart is an infrastructure that provides operational-level reporting to managers across WLS. The scope of this project was to capture data from various application databases, in both file and database table formats, and feed the data mart.

Environment:

IBM Information Server DataStage 8.1, Informatica PowerCenter 8.1, Oracle 11g, Cognos, SQL Server 2008, Jira, Unix, SQL Developer, Autosys 11.3, Pac 2000.

Responsibilities:

●Analyzed highly complex business requirements and designed and wrote technical specifications to design or redesign complex computer platforms and applications.

●Verified program logic by overseeing the preparation of test data and the testing and debugging of programs.

●Created functional design specifications and mapping documents for the end-to-end ETL process flow.

●Developed new documentation, departmental technical procedures, and user guides.

●Analyzed business requirements for new ETL projects and prepared ETL sizing, functional design documentation, mapping specifications, test plans, and deployment plans.

●Designed and developed ETL solutions for new enhancements across various application flows.

●Extensively used the SQL Override option in the Informatica Source Qualifier transformation.

Sr. ETL/Data Analyst, Sephora, San Francisco, CA

Oct 2012 - Mar 2013 & Nov 2014 - Jan 2015

Provided accurate gift card reporting (sales and total redemptions) for Sephora USA and Sephora Canada across all channels (stores, dotcom, mobile), covering cards sold directly by Sephora and through third-party channels, at weekly, MTD, and YTD cadences.

Environment:

Ascential DataStage 7.5.1 (Server Jobs), DataStage 9.1, Tableau 8.0, IBM AS/400, SQL Server, Oracle 10g, SSRS, SQuirreL SQL, SQL Developer, SQL Server Management Studio, Jira ticketing system.

ETL Lead, Gap Inc., San Francisco, CA

Oct 2011 - Dec 2012

GLV is the Global Logistics Visibility system, from which logistics-related data is imported into the PROMPT system. To update PROMPT with the latest logistics data from GLV, input feed files are provided and loaded into PROMPT through separate inbound batches.

Environment:

Ascential DataStage 7.5.1 (Parallel Jobs), IBM Information Server 8.5, Informatica PowerCenter 8.1, Message Broker, OBIEE Reports/Dashboards, Mainframe DB2, Mainframe ESP, UNIX AIX, MS Visio, Serena Dimensions, Unix shell scripting.

Business Intelligence Consultant, Blackhawk Network, Pleasanton, CA

May 2011 - Sep 2011

This project covered Blackhawk core system data and reporting maintenance: data centralization and data integration of the interfacing systems TRACS, IMS, BLISS, PAY GO, Galileo, and eFunds.

The EDW (Enterprise Data Warehouse) stores Blackhawk enterprise data for reporting and analysis. This system extracts sales, supply chain, and issuance data from the various interfacing systems (TRACS, IMS, BLISS, PAY GO, Galileo, and eFunds) directly or through external data file loads. The EDW massages the data it pulls and stores it as facts and dimensions.

Environment:

IBM DataStage Information Server 8.x (Parallel Jobs/Server Jobs), IBM DB2 UDB 8.0 database, MicroStrategy Business Intelligence Tool 9.0, Aqua Data Studio 9.0.13, Windows XP, Linux, Unix scripting, Tidal.

DataStage/BI Consultant, University of Arizona, Tucson, AZ.

Jan 2010 - Apr 2011

Mosaic is the name of The University of Arizona's Enterprise System Replacement Project (ESRP). The purpose of this project was to update and augment the University's aging core administrative systems. It was a multi-phase project with five key areas: Student Administration; Financials (Kuali Financial System, KFS); Human Resources/Payroll; Research Administration (Kuali Coeus, KC); and Business Intelligence.

Environment:

IBM DataStage Information Server 8.x (Parallel Jobs), Ascential DataStage 7.5.2 (Server Jobs), FastTrack, Datanamic DeZign for Databases V5, Talend, Control-M Desktop/Enterprise Manager, Oracle Business Intelligence Suite Enterprise Edition (OBIEE), PeopleSoft, Application Designer, Oracle 10g, SQL Developer 2.1.1, Windows XP, KC 3.0, Linux.

BI/Datawarehouse Consultant, Astellas Pharma US, Deerfield, IL.

Sep 2009 - Dec 2009

The purpose of the project was to implement the EDGE Continuous Loop Promotion (CLP) system for Astellas. The vision of the EDGE solution was to allow Astellas sales representatives to interact with their customers through the delivery of targeted messages to healthcare professionals, and to help Astellas control and utilize its sales force in the most effective manner.

Environment:

Informatica PowerCenter 8.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer), Oracle Business Intelligence Enterprise Edition (Answers, Dashboards), Oracle 10g, Toad, SQL*Plus, MS Visio, Erwin 7.3, PL/SQL, SQL Server 2005, Windows XP.

ETL Developer/Analyst, Software Research Group, Santa Clara, CA.

Sep 2008 - Aug 2009

ETL Developer, Infosmart Systems, Dallas, TX.

Jan 2008 - Jul 2008

Research Assistant (BI Applications), College of Health Sciences, University of Texas at El Paso, El Paso, TX.

Sep 2007 - Dec 2007

System Analyst, Computer Applications & Learning Center (CALC), College of Business, University of Texas at El Paso, El Paso, TX.

May 2007 - Aug 2007

Teaching Assistant, Department of Information and Decision Sciences, College of Business, University of Texas at El Paso, El Paso, TX.

Jan 2007 - Mar 2007

EDUCATION:

Master's in Information Technology, University of Texas at El Paso, Texas, USA. Graduation Date: Dec 2008

Bachelor of Technology in Computer Science, J.N.T. University, Hyderabad, India. Graduation Date: May 2005


