Software Engineer, Data Engineer
Job in Durham, Durham County, North Carolina, 27703, USA
Listing for: Cyberhill Partners
Full-time position. Listed on 2026-02-16.
Job specializations:
- Software Development
- Data Engineer
Job Description
Overview
Location: Austin, TX; Durham, NC
Full-time position.
We’re looking for a Software Engineer with strong Python skills to join our data and AI engineering team.
Responsibilities
- Data Ingestion & Processing
  - Develop and maintain robust Python-based systems for ingesting unstructured data (text, PDFs, images, web, etc.); a minimal sketch follows this list.
  - Design scalable pipelines to clean, normalize, and prepare data for downstream ML/analytics use.
- API Integration & Connectors
  - Build and manage integrations with third-party APIs, internal systems, and external data providers (see the connector sketch after this list).
  - Develop reusable connectors and services to support dynamic data environments.
- Software Development
  - Write clean, modular, production-ready Python code following best practices.
  - Work with cloud infrastructure (AWS, Azure, or GCP) to deploy scalable data services.
- Collaboration
  - Partner with data scientists to operationalize models and support real-time and batch inference.
  - Work cross-functionally with architects and product teams to translate business needs into technical solutions.
- Monitoring & Optimization
  - Implement logging, testing, and monitoring tools to ensure data reliability and system performance.
  - Continuously optimize code and systems for performance, scalability, and maintainability.
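To give a concrete flavor of the ingestion and normalization work described above, here is a minimal illustrative Python sketch. It is not Cyberhill's actual codebase; the directory layout, field names, and cleaning rules are assumptions made for the example, and a real pipeline would add PDF/image parsers, schema validation, and error handling.

```python
import json
import re
from pathlib import Path

RAW_DIR = Path("data/raw")                    # hypothetical location of unstructured text files
OUT_PATH = Path("data/clean/records.jsonl")   # hypothetical normalized output

def normalize_text(text: str) -> str:
    """Strip control characters and collapse whitespace before downstream use."""
    text = re.sub(r"[\x00-\x08\x0b\x0c\x0e-\x1f]", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def ingest(raw_dir: Path, out_path: Path) -> int:
    """Read every .txt file under raw_dir and write one normalized JSON record per file."""
    out_path.parent.mkdir(parents=True, exist_ok=True)
    count = 0
    with out_path.open("w", encoding="utf-8") as out:
        for path in sorted(raw_dir.rglob("*.txt")):
            record = {
                "source": str(path),
                "text": normalize_text(path.read_text(encoding="utf-8", errors="replace")),
            }
            out.write(json.dumps(record, ensure_ascii=False) + "\n")
            count += 1
    return count

if __name__ == "__main__":
    print(f"Ingested {ingest(RAW_DIR, OUT_PATH)} documents")
```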
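Likewise, the reusable-connector responsibility could look roughly like the sketch below: a small client that wraps a third-party REST API with authentication, timeouts, and simple retries. The base URL, endpoint, and retry policy are placeholders rather than a real provider.

```python
import time
import requests  # widely used HTTP client; any equivalent library would do

class ApiConnector:
    """Minimal reusable connector for a hypothetical JSON-over-HTTP data provider."""

    def __init__(self, base_url: str, api_key: str, retries: int = 3, timeout: float = 10.0):
        self.base_url = base_url.rstrip("/")
        self.retries = retries
        self.timeout = timeout
        self.session = requests.Session()
        self.session.headers.update({"Authorization": f"Bearer {api_key}"})

    def get(self, endpoint: str, **params) -> dict:
        """GET an endpoint, retrying transient failures with exponential backoff."""
        url = f"{self.base_url}/{endpoint.lstrip('/')}"
        for attempt in range(self.retries):
            try:
                resp = self.session.get(url, params=params, timeout=self.timeout)
                resp.raise_for_status()
                return resp.json()
            except requests.RequestException:
                if attempt == self.retries - 1:
                    raise
                time.sleep(2 ** attempt)  # back off: 1s, 2s, 4s, ...

# Example usage (placeholder URL and key):
# connector = ApiConnector("https://api.example.com/v1", api_key="...")
# documents = connector.get("documents", updated_since="2026-01-01")
```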
Requirements
- 3–6 years of experience as a software engineer with a focus on Python.
- Strong experience working with unstructured data (e.g., documents, text, media).
- Experience building and consuming RESTful APIs and data connectors.
- Familiarity with data engineering tools and patterns (e.g., ETL, stream processing).
- Working knowledge of cloud platforms (AWS, GCP, or Azure).
- Experience with Git, CI/CD pipelines, and containerized environments (e.g., Docker).
- Excellent problem-solving skills and ability to work independently in a fast-paced environment.
- Experience with libraries/tools such as Pandas, PySpark, FastAPI, Apache Airflow, or similar.
- Background in supporting machine learning workflows or MLOps pipelines.
- Exposure to NLP or document intelligence use cases.
- Familiarity with data storage formats and databases (e.g., JSON, Parquet, SQL, NoSQL); a brief Pandas/Parquet snippet appears after this list.
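As a small illustration of the storage-format point above, the snippet below reads newline-delimited JSON (such as the output of an ingestion job), applies a light normalization pass with Pandas, and persists the result as columnar Parquet. File names are placeholders, and writing Parquet with Pandas requires a pyarrow or fastparquet backend.

```python
import pandas as pd  # Parquet output also needs pyarrow or fastparquet installed

# Load newline-delimited JSON records, normalize one text column,
# and persist them in a columnar format for analytics use.
df = pd.read_json("data/clean/records.jsonl", lines=True)
df["text"] = df["text"].str.strip().str.lower()
df.to_parquet("data/clean/records.parquet", index=False)

print(df.head())
```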
What We Offer
- Competitive salary: $120,000 – $180,000/year (based on experience)
- Exciting projects at the intersection of AI, data science, and software engineering
- Flexible remote/hybrid work environment
- Growth and learning opportunities in a high-impact, client-focused tech firm
- Potential equity and performance-based bonuses