Data Engineer
Location: Bangalore
Experience: 1–3 years
Compensation: ₹8–12 LPA
Role Type: Hands-on individual contributor (IC)
About the Role
We process millions of phone numbers, business profiles, and campaign interactions.
We need a Data Engineer who can build and scale data pipelines to support AI models and enrich SME data.
This role is a great fit for someone who enjoys:
Web scraping
ETL
Data cleaning
Workflow automation
Fast execution
What You Will Build (First 180 Days)
1. Data Ingestion Pipelines
Scrape open directories
Parse public business information
Integrate partner APIs
Build clean datasets
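As a rough sketch of the directory-parsing step (the markup, class names, business names, and phone numbers here are all hypothetical; in practice requests plus Beautiful Soup would fetch and parse real listing pages):

```python
from html.parser import HTMLParser

# Hypothetical sample of a public business-directory listing page.
SAMPLE_HTML = """
<div class="listing"><span class="name">Acme Traders</span>
<span class="phone">+91 80 1234 5678</span></div>
<div class="listing"><span class="name">Bharat Foods</span>
<span class="phone">080-8765-4321</span></div>
"""

class ListingParser(HTMLParser):
    """Collects (name, phone) pairs from directory listing markup."""

    def __init__(self):
        super().__init__()
        self.records = []
        self._field = None      # which span class we are currently inside
        self._current = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "phone"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            self._field = None
            if len(self._current) == 2:   # both fields seen: emit one record
                self.records.append(
                    (self._current["name"], self._current["phone"])
                )
                self._current = {}

parser = ListingParser()
parser.feed(SAMPLE_HTML)
print(parser.records)
```

This uses only the standard library's `html.parser` so it stays self-contained; a production scraper would add retries, rate limiting, and robots.txt checks.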
2. Data Cleaning + Normalization
Remove duplicates
Standardize phone number formats
Tag location, industry, business type
Build category mapping files
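A minimal sketch of the deduplication and phone-standardization step. The country-code default, trunk-prefix handling, and record shape are illustrative assumptions, not a spec:

```python
import re

def normalize_phone(raw: str, default_cc: str = "91") -> str:
    """Normalize a phone number to a +<country code><digits> key.

    Assumes 10-digit national numbers with an optional trunk '0'
    (a sketch assumption, not a full E.164 implementation).
    """
    digits = re.sub(r"\D", "", raw)      # keep digits only
    if digits.startswith("0"):           # drop the trunk prefix
        digits = digits[1:]
    if len(digits) == 10:                # bare national number
        digits = default_cc + digits
    return "+" + digits

def dedupe(records):
    """Drop records whose normalized phone number was already seen."""
    seen, out = set(), []
    for rec in records:
        key = normalize_phone(rec["phone"])
        if key not in seen:
            seen.add(key)
            out.append({**rec, "phone": key})
    return out

rows = [
    {"name": "Acme Traders", "phone": "080-1234-5678"},
    {"name": "ACME Traders", "phone": "+91 8012345678"},
]
cleaned = dedupe(rows)
print(cleaned)
```

Both input rows normalize to the same key, so the second is dropped as a duplicate.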
3. Enrichment Pipelines
Infer missing business attributes (e.g. industry, business type)
Build connection graphs
Support the Data Scientist with clean feature-ready data
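One simple way to read "connection graphs" is linking businesses that share an enriched attribute. The `industry` key and the sample data below are assumptions for illustration:

```python
from collections import defaultdict
from itertools import combinations

def build_connection_graph(businesses):
    """Link businesses that share an attribute value (here: industry).

    Returns an adjacency dict {name: set of connected names}.
    """
    by_industry = defaultdict(list)
    for b in businesses:
        by_industry[b["industry"]].append(b["name"])

    graph = defaultdict(set)
    for names in by_industry.values():
        for a, b in combinations(names, 2):  # connect every pair in a group
            graph[a].add(b)
            graph[b].add(a)
    return dict(graph)

data = [
    {"name": "Acme Traders", "industry": "retail"},
    {"name": "Bharat Foods", "industry": "food"},
    {"name": "City Mart", "industry": "retail"},
]
graph = build_connection_graph(data)
print(graph)
```

The two retail businesses end up connected; the lone food business has no edges, so it gets no entry.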
Skills Required
Must Have
Python (requests, Beautiful Soup; Selenium optional)
SQL
ETL tools or custom scripts
Experience with large CSV/JSON processing
API integration
Good to Have
Kafka / Airflow basics
Scrapy
AWS / GCP
Basic ML familiarity (not mandatory)
Responsibilities
Build a scalable scraping system
Maintain clean, up-to-date datasets
Create ETL pipelines for ML models
Ensure high data quality
Debug pipeline failures
Support the Data Scientist and CTO with data requirements