Seeking a Python Developer with strong expertise in ETL pipelines and data engineering to join a high-performing development team. The role focuses primarily on backend development and data processing, with a secondary emphasis on management and collaboration. The ideal candidate will bring a strong command of Python, JavaScript, and API integration, along with experience in data systems, AI-driven analytics, and cloud data environments.
Key Responsibilities:
Develop and maintain complex ETL pipelines in Python
Design and manage data workflows between databases, APIs, and third-party applications
Debug, monitor, and ensure quality in data transformation processes
Collaborate closely with two other senior Python developers
Work with large-scale data sets, leveraging data frames, statistics, and predictive analytics
Integrate with tools and platforms such as Maximo, Oracle, SQL, Tableau, SharePoint, Smartsheet, and Grafana
Contribute to AI/ML initiatives using data lakes, warehouses, and lakehouses
Support reporting teams with data integration and backend support for visualizations
Ensure proper data handling from source to endpoint for downstream visualization (primarily database-side)
Participate in an efficient and engaging hiring process: 1.5 rounds, including conversations with leadership and a peer developer
Required Skills:
Extensive Python development experience, especially for data transformation and ETL
Strong proficiency in JavaScript, particularly in working with multi-dimensional arrays and tabular data transformations
Deep understanding of API usage, including consuming, integrating, and managing APIs from external systems
Familiarity with SQL and Oracle databases
Experience working with data frames, forecasting models, and statistical computations
Prior experience with data reporting and analytics tools (Tableau, Grafana, SharePoint, Smartsheet)
Understanding of data architecture, including data lakes, warehouses, and lakehouses
Experience with tools like Splunk is a plus
Exposure to AI/ML analytics is a strong plus
Nice to Have:
Experience with Maximo for resource data extraction
Background in AI/ML reporting analytics
Knowledge of data visualization tools (though backend/data manipulation skills are more critical)
Work Environment:
Primarily onsite (4 days per week), with flexibility for additional remote work as needed
Collaborative environment focused on backend development (80%) with some management and collaboration (20%)
Team is based in both California and Florida