Data Engineer (W2 only; no C2C)
Location: Richmond, Henrico County, Virginia, 23214, USA
Listed on 2026-01-01
Listing for: Cliff Services Inc
Full Time position
Job specializations:
- Software Development
- Data Engineer
Job Description & How to Apply Below
Job Title: Data Engineer
Type: Onsite (Hybrid, 3 to 4 days in office)
Interview: In Person
Locations: McLean VA; Richmond VA; Dallas TX
Job Description: A Data Engineer with Python, PySpark, and AWS expertise is responsible for designing, building, and maintaining scalable, efficient data pipelines in a cloud environment.
Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines using Python and PySpark for data ingestion, transformation, and processing.
- Work extensively with AWS cloud services such as S3, Glue, EMR, Lambda, Redshift, Athena, and DynamoDB for data storage, processing, and warehousing.
- Build and optimize data ingestion and processing frameworks for large-scale data sets, ensuring data quality, consistency, and accuracy.
- Collaborate with data architects, data scientists, and business intelligence teams to understand data requirements and deliver effective data solutions.
- Implement data governance, lineage, and security best practices within data pipelines and infrastructure.
- Automate data workflows and improve data pipeline performance through optimization and tuning.
- Develop and maintain documentation for data solutions, including data dictionaries, lineage, and technical specifications.
- Participate in code reviews, contribute to continuous improvement initiatives, and troubleshoot complex data and pipeline issues.
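The ingest, transform, and quality-check flow described in the responsibilities above can be sketched in miniature. This is a plain-Python illustration only; the record schema, field names, and quality rules are invented for the example, and in the role itself this logic would typically run in PySpark against data landed in S3:

```python
# Minimal sketch of an ingest -> transform -> validate pipeline step.
# The schema and quality rules are illustrative assumptions, not taken
# from the posting; a production version would use PySpark DataFrames.
import csv
import io

RAW_CSV = """order_id,amount,currency
1001,250.00,USD
1002,,USD
1003,99.50,usd
"""

def ingest(raw: str) -> list[dict]:
    """Parse raw CSV into records (analogous to spark.read.csv)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> list[dict]:
    """Normalize types and drop rows failing basic quality checks."""
    cleaned = []
    for r in records:
        if not r["amount"]:  # data-quality rule: amount is required
            continue
        cleaned.append({
            "order_id": int(r["order_id"]),
            "amount": float(r["amount"]),
            "currency": r["currency"].upper(),  # consistency rule
        })
    return cleaned

rows = transform(ingest(RAW_CSV))
print(rows)  # row 1002 dropped; currency casing normalized
```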
Requirements:
- Strong programming proficiency in Python, including libraries like Pandas, and extensive experience with PySpark for distributed data processing.
- Solid understanding and practical experience with Apache Spark/PySpark for large-scale data transformations.
- Demonstrated experience with AWS data services, including S3, Glue, EMR, Lambda, Redshift, and Athena.
- Proficiency in SQL and a strong understanding of data modeling, schema design, and data warehousing concepts.
- Experience with workflow orchestration tools such as Apache Airflow or AWS Step Functions.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Excellent problem-solving, analytical, and communication skills, with the ability to work effectively in a team environment.
- Experience with streaming frameworks like Kafka or Kinesis.
- Knowledge of other data warehousing solutions like Snowflake.
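The orchestration tools named above (Apache Airflow, AWS Step Functions) share one core idea: a pipeline is a directed acyclic graph of tasks executed in dependency order. A stdlib sketch of that idea, with hypothetical task names (a real Airflow DAG would declare these as operators wired together with `>>` dependencies):

```python
# DAG-style task ordering, the core concept behind orchestrators such
# as Apache Airflow or AWS Step Functions. Task names are hypothetical.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load_redshift": {"quality_check"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load_redshift']
```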
Seniority level: Mid-Senior
Employment type: Contract
Job function: Analyst
Industries: Banking
Contact: K Hemanth | Recruitment Specialist