Big Data Developer
Alpharetta, Fulton County, Georgia, 30239, USA
Listed on 2026-02-16
Listing for: San R&D Business Solutions LLC
Full Time position
Job specializations:
- Software Development
- Data Engineer
Job Description & How to Apply Below
San R&D Business Solutions LLC | Full time
Alpharetta, United States | Posted on 02/11/2026
Job Title: Big Data Developer (GCP / Dataflow)
Work Type: Hybrid
Employment Type: Contract (C2C)
Visa Requirement: USC / GC only
Open to local GA candidates only
Job Summary
We are seeking an experienced Big Data Developer with strong expertise in Google Cloud Platform (GCP) and Dataflow to design, develop, and maintain scalable data processing systems. The ideal candidate will have 8+ years of experience in Big Data technologies and a strong background in building high-performance data pipelines and distributed systems.
Key Responsibilities
- Design and develop scalable Big Data solutions using GCP services.
- Build and optimize data pipelines using Google Cloud Dataflow (Apache Beam); see the example sketch after this list.
- Work with large datasets in distributed environments.
- Develop batch and real-time data processing workflows.
- Integrate data from multiple sources (structured and unstructured).
- Optimize data processing performance and cost efficiency in GCP.
- Collaborate with data engineers, architects, and business stakeholders.
- Implement data quality checks, monitoring, and logging frameworks.
- Ensure data security, governance, and compliance standards are met.
- Troubleshoot and resolve performance bottlenecks.
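For illustration only (not part of the original posting): a minimal sketch of the kind of Dataflow batch pipeline this role describes, written with the Apache Beam Python SDK. The bucket, project, table, and field names are hypothetical placeholders; a streaming variant would read from Pub/Sub and add windowing.

```python
# Hypothetical example: count events per type per hour and load the results
# into BigQuery. All resource names below are placeholders, not details from
# this listing.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str):
    """Parse one JSON line; yield nothing if the record is malformed."""
    try:
        event = json.loads(line)
        yield {"event_type": event["type"], "hour": event["ts"][:13]}
    except (json.JSONDecodeError, KeyError):
        # Drop bad records; a production pipeline would route these
        # to a dead-letter sink instead of silently discarding them.
        pass


def run():
    # Pass --runner=DataflowRunner --project=... --region=... to run on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "Parse" >> beam.FlatMap(parse_event)
            | "KeyByTypeHour" >> beam.Map(lambda e: ((e["event_type"], e["hour"]), 1))
            | "Count" >> beam.CombinePerKey(sum)
            | "Format" >> beam.Map(
                lambda kv: {"event_type": kv[0][0], "hour": kv[0][1], "n": kv[1]}
            )
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.event_counts",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```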
Skills & Qualifications
- 8+ years of experience in Big Data development.
- Strong hands-on experience with:
  - BigQuery
  - Pub/Sub
- Strong programming skills in Java or Python.
- Experience with Apache Spark, Hadoop, Hive, or similar Big Data frameworks.
- Experience building ETL/ELT pipelines.
- Knowledge of SQL and NoSQL databases.
- Understanding of data modeling and data warehousing concepts.
- Experience with CI/CD pipelines and DevOps practices.
- Strong problem-solving and analytical skills.
- GCP certifications (Professional Data Engineer preferred).
- Experience with Kubernetes and containerization.
- Experience in streaming technologies.
- Knowledge of Airflow or other workflow orchestration tools.
- Experience working in Agile/Scrum environments.
- Ability to work independently and in a team environment.
- Strong documentation and stakeholder management skills.