Job Title: Application Analyst (Python / PyFlink / Data Engineering)
Location: Charlotte, NC
Duration: 6- to 9-month contract
Work Environment: Hybrid (local candidates preferred)
Position Overview
We are seeking an experienced Application Analyst / Data Engineer to support the development of an Enterprise Protective Services (EPS) Data Mart. This role will focus on building and supporting data pipelines, streaming solutions, and containerized applications within a complex enterprise environment.
The ideal candidate brings strong experience in Python, PyFlink/Flink, and Docker, along with a solid understanding of data integration and application development lifecycles.
Key Responsibilities
Design, develop, and support scalable data and application solutions
Build and maintain data pipelines and streaming solutions using PyFlink/Flink and related technologies
Translate complex business requirements into user stories and technical solutions
Break down large initiatives into manageable components and provide effort estimates
Support testing, code migration, and deployment processes across environments
Ensure adherence to coding standards, design principles, and source control practices
Collaborate with product owners, business users, and cross-functional teams
Assist with application integration and data architecture design
Conduct code reviews and participate in design walkthroughs
Transfer knowledge to support teams and provide ongoing technical guidance
Ensure compliance with regulatory and enterprise standards
Required Qualifications
Bachelor's degree in Computer Science or related field
5-10+ years of experience in application development and support
Strong experience with Python
Experience with PyFlink / Apache Flink
Hands-on experience with Docker (containerized environments)
Strong SQL and database knowledge
Experience working within the software development lifecycle (SDLC)
Strong analytical, problem-solving, and communication skills
Ability to manage multiple priorities and work on concurrent user stories
Desired Qualifications
Experience with Kafka or streaming technologies
Experience with data integration architectures (ODS, Data Warehouse, Data Mart)
Experience with API and application integrations
Familiarity with modern source code management tools and processes
Experience working in regulated or enterprise environments
Experience working with distributed or remote teams
Technical Environment
PyFlink / Apache Flink
Python
Docker (Swarm mode preferred)
SQL / Data platforms
Kafka (optional)
Data warehousing and integration tools
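To give a flavor of the streaming work this stack supports: a minimal sketch of a keyed running aggregation in plain Python (no Flink dependency required to run it). In a PyFlink pipeline the same pattern would typically be expressed with a keyed stream and a reduce/aggregate operator; all names and data below are purely illustrative.

```python
from collections import defaultdict
from typing import Iterable, Iterator, Tuple

# Illustrative event type: (sensor_id, reading) pairs.
Event = Tuple[str, float]

def source() -> Iterator[Event]:
    # Stand-in for a streaming source (e.g. a Kafka topic); values are made up.
    yield from [("a", 1.0), ("b", 2.5), ("a", 3.0), ("b", 0.5), ("a", 2.0)]

def keyed_running_sum(events: Iterable[Event]) -> Iterator[Tuple[str, float]]:
    # Keyed stateful aggregation: maintains one running total per key and
    # emits the updated total for each incoming event, analogous to a
    # key-by-then-reduce step in a stream-processing framework.
    state: dict = defaultdict(float)
    for key, value in events:
        state[key] += value
        yield key, state[key]

if __name__ == "__main__":
    for key, total in keyed_running_sum(source()):
        print(key, total)
```

The generator-based structure mirrors how streaming operators process events one at a time while carrying per-key state, which is the core idea behind the pipelines described above.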
What We're Looking For
A strong data engineering / application development background
Comfort working with streaming and data integration technologies
The ability to design and build solutions end-to-end
Strong communication skills and the ability to collaborate across technical and business teams