Python (Django or Flask) + Angular or React + AWS (bonuses: big data/Spark, Airflow, ML, GraphQL APIs)
Team/Space: Market Regulation
JD
We are seeking a Full Stack Engineer to work on innovative projects aimed at enhancing user experience and business efficiency. This role involves end-to-end development, from designing intuitive front-end interfaces to architecting scalable and secure back-end systems. You will collaborate closely with product managers, data scientists, and engineers to deliver high-performance systems that power market surveillance and regulatory workflows.
Job Responsibilities:
· Design and implement scalable, modular architectures that align with evolving business needs.
· Build and maintain responsive front-end applications using Angular or React for internal business users.
· Develop robust RESTful APIs and back-end services using Python (Django or Flask); a brief illustrative sketch follows this list.
· Integrate with enterprise data platforms, data lakes, and cloud-based storage systems.
· Deploy and manage applications on AWS (EC2, Lambda, S3, RDS).
· Set up and maintain CI/CD pipelines using Jenkins or GitHub Actions.
· Write and maintain comprehensive automated tests, including unit, integration, and end-to-end coverage.
· Participate in Agile ceremonies, collaborate with cross-functional teams, and contribute to iterative development cycles.
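For a concrete sense of the back-end work described above, a minimal sketch of a Flask REST endpoint in this stack might look like the following. The routes, fields, and in-memory data are hypothetical and for illustration only; a production service would sit behind AWS infrastructure and a real data store such as RDS.

```python
# Illustrative sketch only: a minimal Flask service of the kind this role builds.
# Routes, fields, and the in-memory ALERTS list are hypothetical placeholders.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for an RDS- or data-lake-backed store of surveillance alerts.
ALERTS = [
    {"id": 1, "symbol": "ABC", "rule": "wash-trade", "status": "open"},
    {"id": 2, "symbol": "XYZ", "rule": "spoofing", "status": "closed"},
]

@app.route("/health")
def health():
    # Lightweight liveness check, useful behind a load balancer.
    return jsonify({"status": "ok"})

@app.route("/alerts")
def list_alerts():
    # Optional ?status= query parameter filters the alert listing.
    status = request.args.get("status")
    items = [a for a in ALERTS if status is None or a["status"] == status]
    return jsonify(items)

if __name__ == "__main__":
    app.run(debug=True)
```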
Qualifications:
· Bachelor’s degree in Computer Science, Software Engineering, or a related field.
· 7+ years of hands-on experience developing production-grade web applications.
· Strong expertise in Python (Django or Flask) for back-end development.
· 5+ years of experience with JavaScript frameworks (React or Angular).
· Solid understanding of both SQL (e.g., PostgreSQL, MySQL) and NoSQL (e.g., MongoDB, DynamoDB) databases.
· Proficient with AWS cloud services and infrastructure.
· Experience implementing CI/CD workflows and managing deployments.
· Familiarity with automated testing frameworks and a test-driven development mindset (an example test follows this list).
· Proven ability to thrive in Agile/Scrum teams and collaborate effectively.
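As an illustration of the test-driven mindset mentioned above, a minimal pytest-style unit test might look like the sketch below; the function under test and its behavior are hypothetical.

```python
# Illustrative sketch only: a small pytest unit test of a hypothetical helper.
import pytest

def normalize_symbol(raw: str) -> str:
    """Strip whitespace and uppercase an exchange ticker symbol."""
    if not raw or not raw.strip():
        raise ValueError("empty symbol")
    return raw.strip().upper()

def test_normalize_symbol_strips_and_uppercases():
    assert normalize_symbol("  abc ") == "ABC"

def test_normalize_symbol_rejects_empty_input():
    with pytest.raises(ValueError):
        normalize_symbol("   ")
```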
Nice to have:
· Experience with big data technologies and data pipeline development using Apache Spark.
· Knowledge of integrating machine learning models into production applications.
· Familiarity with Apache Airflow for orchestrating workflows (an example DAG sketch follows this list).
· Experience working with GraphQL APIs.
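As an illustration of the workflow orchestration mentioned above, a minimal Airflow DAG might look like the sketch below. It assumes Airflow 2.4+ (for the `schedule` argument); the DAG id, tasks, and schedule are hypothetical.

```python
# Illustrative sketch only: a minimal Airflow 2.4+ DAG for a hypothetical
# nightly surveillance data pipeline. DAG id, tasks, and schedule are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder for pulling trade data from an enterprise data platform.
    print("extracting trades")

def transform():
    # Placeholder for a Spark or pandas transformation step.
    print("transforming trades")

with DAG(
    dag_id="nightly_trade_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task
```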