Company Description
Mars Devs partners with startups and small-to-medium-sized businesses (SMBs) to transform innovative ideas into high-quality software products. With a holistic approach that blends product strategy, top-notch engineering, and intuitive design, Mars Devs specializes in delivering scalable and impactful digital solutions. Our expertise includes custom web and mobile app development, AI-driven solutions, blockchain integrations, and cloud modernization. Founded by experienced entrepreneurs and engineers, Mars Devs has supported over 100 businesses across the Middle East, Europe, and India.
We prioritize long-term partnerships while aligning technology with business objectives to drive growth and success.
Role Description
This is a full-time remote opportunity for an experienced Python Data Engineer with 4+ years of professional experience. The Python Data Engineer will be responsible for designing, building, and maintaining data pipelines, developing scalable data solutions, and managing data workflows. Daily responsibilities include implementing efficient ETL processes, optimizing data storage systems, and working collaboratively with cross-functional teams to extract insights from data.
The role emphasizes best practices in data architecture and ensures the integrity and security of data systems.
Roles & Responsibilities
Design and build scalable Python backend services for data-heavy workloads
Develop and maintain ETL / ELT data pipelines
Integrate and work extensively with Snowflake
Write clean, efficient, and optimized SQL queries
Build reusable and modular data processing components
Create and manage workflow orchestration using Prefect (DAGs)
Define task dependencies, retries, and failure handling in pipelines (see the sketch after this list)
Monitor, debug, and optimize pipeline reliability and performance
Perform large-scale data transformations using Pandas, NumPy, Polars, and GraphDB
Handle aggregations, joins, validations, and data cleaning
Improve query performance and control data processing costs
Build backend APIs for data processing using FastAPI
Implement logging, monitoring, and alerting
Write unit and integration tests for data pipelines and services
Ensure system reliability, reproducibility, and observability
Collaborate with data analysts and backend engineers
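For context on the orchestration responsibilities above, a minimal sketch of a Prefect flow with task dependencies, retries, and basic failure handling might look like the following. All flow, task, and data names here are illustrative assumptions for this posting, not part of any actual Mars Devs codebase:

```python
from prefect import flow, task


@task(retries=3, retry_delay_seconds=30)
def extract() -> list[dict]:
    # Stand-in for a Snowflake or API read; transient failures
    # are retried automatically up to 3 times.
    return [{"id": 1, "amount": "42.0"}, {"id": 2, "amount": None}]


@task
def transform(records: list[dict]) -> list[dict]:
    # Validation / cleaning step: drop invalid rows, normalise types.
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in records
        if r.get("amount") is not None
    ]


@task(retries=2, retry_delay_seconds=60)
def load(records: list[dict]) -> None:
    # Stand-in for a warehouse write; a failure here also triggers retries.
    print(f"loaded {len(records)} records")


@flow(name="daily-etl")  # hypothetical flow name, for illustration only
def daily_etl() -> None:
    # Prefect infers the DAG from the data passed between task calls,
    # so extract -> transform -> load run in dependency order.
    load(transform(extract()))


if __name__ == "__main__":
    daily_etl()
```

Because the task graph is derived from data dependencies, a failure in extract or load is retried per its task settings before the flow run is marked failed, which is the retry and failure-handling pattern this role works with day to day.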
Requirements
Must-Have
3+ years of strong hands-on experience with Python
Experience with Prefect, Airflow, or another DAG-based orchestration tool
Hands-on experience with Snowflake or a modern cloud data warehouse
Strong SQL skills
Experience with Pandas and NumPy for data manipulation
A backend engineering mindset with a focus on scalability and reliability
Experience building and running production-grade data pipelines
Good to Have
Experience with Docker
CI/CD using GitHub Actions or similar tools
Working knowledge of PostgreSQL
Understanding of data modeling concepts
Experience optimizing large datasets and long-running queries
Exposure to async Python or multiprocessing
Brownie Points
Frontend or React experience
Web UI development exposure
LLMOps / MLOps experience
NLP or AI model development
DevOps / SRE-level production experience
Position Requirements
3+ years of work experience