
Senior Python Developer - Data Pipelines - Full Remote - Contractor in USD

Remote / Online - Candidates ideally in 600001, Chennai, Tamil Nadu, India
Listing for: All European Careers
Full Time, Contract, Remote/Work from Home position
Listed on 2026-02-14
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst
Job Description & How to Apply Below
For an international project in Chennai, we are urgently looking for a full-remote Senior Python Developer to build and maintain reliable data pipelines and analytics workflows that ingest data from multiple internal and external sources, transform it into clean, analysis-ready datasets, and validate quality end-to-end. The work supports reporting, monitoring, and advanced analytics.

We are looking for a motivated contractor. Candidates need to be fluent in English.

Tasks and responsibilities:

- Design, build, and maintain data ingestion pipelines to collect information from APIs, databases, cloud storage, web services, and other internal/external data sources;
- Develop data transformation processes for cleaning, transforming, aggregating, and harmonizing data into analytics-ready formats;
- Implement data validation and quality assurance checks to ensure data accuracy, completeness, and consistency;
- Automate data extraction and loading (ETL/ELT) processes using Python and SQL-based frameworks (a minimal end-to-end sketch follows this list);
- Collaborate with analysts, report developers, product owners, and business users to define data requirements and ensure smooth integration with analytics platforms;
- Optimize data pipelines for scalability, reliability, and performance;
- Maintain data documentation, metadata, and version control for all data assets.
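
For illustration only, a minimal Python sketch of the kind of ingest-transform-validate-load flow described above; the API endpoint, connection string, table, and column names are hypothetical placeholders, not details of this project:

# Minimal illustrative ETL sketch; the endpoint, credentials, and column
# names below are hypothetical placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

API_URL = "https://example.com/api/orders"                  # hypothetical source API
DB_URL = "postgresql://user:password@localhost/analytics"   # hypothetical target database

def extract(url: str) -> pd.DataFrame:
    # Pull raw records from a REST API into a DataFrame.
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return pd.DataFrame(response.json())

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Clean and harmonize: drop duplicates, parse dates, enforce numeric types.
    df = df.drop_duplicates()
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    return df.dropna(subset=["order_date", "amount"])

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # Basic quality checks: completeness and value ranges.
    if df.empty:
        raise ValueError("no rows survived cleaning")
    if (df["amount"] < 0).any():
        raise ValueError("negative amounts found")
    return df

def load(df: pd.DataFrame, db_url: str) -> None:
    # Write the analysis-ready dataset to a warehouse table.
    engine = create_engine(db_url)
    df.to_sql("orders_clean", engine, if_exists="replace", index=False)

if __name__ == "__main__":
    load(validate(transform(extract(API_URL))), DB_URL)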

Profile:

- Bachelor's or Master's degree;
- 6+ years of relevant experience as a Python developer;
- Strong proficiency in Python, including libraries such as pandas, numpy, sqlalchemy, and requests;
- Expertise in data extraction and integration (REST APIs, SQL/NoSQL databases, file systems, cloud data sources);
- Solid understanding of data transformation, cleaning, and validation techniques;
- Strong hands-on SQL skills, including query optimization and performance tuning;
- Experience with ETL orchestration tools (e.g., Airflow, ADF) and workflow automation (see the illustrative DAG sketch after this list);
- Familiarity with data warehouses (Oracle, Azure Lakehouse, Iceberg, PostgreSQL);
- Experience implementing data validation and quality checks (e.g., dbt tests);
- Experience with version control systems (Git) and CI/CD pipelines;
- Knowledge of containerization (Docker) and cloud platforms (Azure);
- Knowledge of Dremio and PyArrow;
- Knowledge of ADF and Databricks using PySpark;
- Fluent in English.
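
For illustration only, a minimal orchestration sketch of the kind of Airflow DAG referenced above, assuming a recent Airflow 2.x installation; the DAG id, schedule, and task bodies are hypothetical placeholders:

# Illustrative Airflow DAG skeleton; dag_id, schedule, and task bodies are
# hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull data from source systems (APIs, databases, files).
    pass

def transform():
    # Placeholder: clean and aggregate into analysis-ready tables.
    pass

def validate():
    # Placeholder: run completeness and consistency checks before publishing.
    pass

with DAG(
    dag_id="daily_analytics_pipeline",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)

    t_extract >> t_transform >> t_validate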
Position Requirements:
10+ years of work experience.