Job Summary
We are looking for a skilled Data Integration / ETL Developer who can extract data from multiple source systems, transform it as required, and load it into a database efficiently and accurately. The ideal candidate will have strong experience working with diverse data formats and building reliable data pipelines. The role focuses on structured and unstructured data ingestion, secure storage of sensitive candidate information, transcript management, reporting, and data lifecycle management.
Key Responsibilities
Read and extract data from multiple sources such as files, APIs, databases, and third-party systems
Design, develop, and maintain ETL/ELT pipelines to load data into relational or cloud databases
Perform data cleansing, validation, and transformation to ensure data quality and consistency
Implement storage solutions for transcripts (text-only) and enable analytics and report generation on transcript data
Design encrypted-at-rest databases for sensitive candidate data, including photo storage, leveraging enterprise-grade encryption and key management
Optimize data loading processes for performance and scalability
Monitor data pipelines and troubleshoot failures or data issues
Collaborate with business and technical teams to understand data requirements
Document data flows, transformations, and loading procedures
Ensure compliance with data privacy and security standards (GDPR, SOC 2, ISO 27001)
Required Skills & Qualifications
Strong experience in reading data from various sources (CSV, JSON, XML, APIs, databases, etc.)
Hands-on experience with databases (Oracle, SQL Server, PostgreSQL, MySQL, or similar)
Experience with BLOB storage or secure object storage for sensitive media files
Proficiency in SQL for data loading, transformation, and validation
Experience with ETL tools or scripting languages (Python, PL/SQL, Shell, or similar)
Understanding of data modeling and data quality best practices
Ability to analyze and troubleshoot data-related issues
Nice to Have
Experience with cloud data platforms (AWS, Azure, GCP)
Exposure to AI/ML pipelines, especially interview analytics
Prior experience working on AI-driven platforms and familiarity with start-up culture
Education & Experience
Bachelor’s degree in Computer Science, Engineering, or a related field
2–4 years of experience in data engineering, database architecture/development, or platform engineering
Proven experience working with regulated data environments