
Cloud AWS Engineer

Job in Richmond, Henrico County, Virginia, 23214, USA
Listing for: Stefanini Group
Full-time position
Listed on 2026-02-15
Job specializations:
  • IT/Tech
    Data Engineer, Cloud Computing
Salary/Wage Range or Industry Benchmark: USD 80,000 - 100,000 per year
Job Description

Cloud AWS Engineer – Remote/Hybrid

Responsibilities
  • Understand the technology vision and strategic direction of the business
  • Understand our current data model and infrastructure; proactively identify gaps and areas for improvement, and make architectural recommendations with a focus on performance and accessibility.
  • Partner across engineering teams to design, build, and support the next generation of our analytics systems.
  • Partner with business and analytics teams to understand specific requirements for data systems to support both development and deployment of data workloads ranging from Tableau reports to ad hoc analyses.
  • Own and develop architecture supporting the translation of analytical questions into effective reports that drive business action.
  • Automate and optimize existing data processing workloads by recognizing patterns of data and technology usage and implementing solutions.
  • Apply a solid grasp of the intersection between analytics and engineering, taking a proactive approach to ensure solutions demonstrate high levels of performance, privacy, security, scalability, and reliability upon deployment.
  • Provide guidance to partners on effective use of the database management systems (DBMS) platform through collaboration, documentation, and associated standard methodologies.
  • Design and build end-to-end automation to support and maintain software currency
  • Create automation services for builds using Terraform, Python, and OS shell scripts (see the sketch after this list).
  • Develop validation and certification processes through automation tools
  • Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products
  • Participate in developing solutions by incorporating cloud-native and third-party vendor products
  • Participate in research and perform POCs (proofs of concept) with emerging technologies and adopt industry best practices in the data space for advancing the cloud data platform.
  • Develop data streaming, migration, and replication solutions
  • Demonstrate leadership, collaboration, and exceptional communication, negotiation, strategic, and influencing skills to gain consensus and produce the best solutions.
  • Engage with senior leadership and business leaders at the Federal Reserve and the Board to share the business value.
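
As a concrete illustration of the build automation described above, here is a minimal Python sketch that wraps a Terraform plan/apply cycle; the working directory ("infra/analytics") and workspace name ("dev") are hypothetical placeholders, not tooling specified by this posting.

    #!/usr/bin/env python3
    """Hypothetical sketch: a small Python wrapper around a Terraform
    plan/apply cycle, illustrating the kind of build automation this
    role describes. All paths and names are illustrative."""

    import subprocess
    import sys

    def run(cmd: list[str], cwd: str) -> None:
        """Run a command in the given directory; abort on non-zero exit."""
        print("+", " ".join(cmd))
        result = subprocess.run(cmd, cwd=cwd)
        if result.returncode != 0:
            sys.exit(result.returncode)

    def terraform_deploy(workdir: str, workspace: str) -> None:
        """Initialize, select a workspace, then plan and apply."""
        run(["terraform", "init", "-input=false"], workdir)
        # The -or-create flag requires Terraform 1.4+
        run(["terraform", "workspace", "select", "-or-create", workspace], workdir)
        run(["terraform", "plan", "-input=false", "-out=tfplan"], workdir)
        run(["terraform", "apply", "-input=false", "tfplan"], workdir)

    if __name__ == "__main__":
        # "infra/analytics" and "dev" are hypothetical; adjust to your layout.
        terraform_deploy(workdir="infra/analytics", workspace="dev")

Applying a saved plan file keeps the applied change identical to the reviewed plan, which supports the validation and certification steps mentioned above.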
Top Skills
  • Enterprise data services: building the cloud platform that other teams build their platforms on top of; the focus is on infrastructure, and hands-on Terraform is required (this is an infrastructure engineer role, not an architect role)
  • Understanding of Data Mesh
  • Python, Terraform (hands-on), AWS, AWS GovCloud, APIs
Qualifications
  • Demonstrates mutual respect, embraces diversity, and acts with authenticity
  • Bachelor's degree in Computer Science, Management Information Systems, Computer Engineering, or a related field, or equivalent work experience; advanced degree preferred
  • Seven or more years of experience designing and building large-scale solutions in an enterprise setting
  • Three or more years designing and building solutions in the cloud
  • Expertise in building and managing cloud databases such as AWS RDS, DynamoDB, DocumentDB, or analogous architectures
  • Expertise in building Cloud Database Management Systems in Databricks Lakehouse or analogous architectures
  • Expertise in cloud data warehouses such as Redshift, BigQuery, or analogous architectures a plus
  • Deep SQL expertise, data modeling, and experience with data governance in relational databases
  • Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (Spark SQL, Hadoop, Kafka) distributed technologies
  • Refined skills using one or more scripting languages (e.g., Python, bash)
  • Experience using ETL/ELT tools and technologies such as Talend or Informatica a plus
  • Embrace data platform thinking; design and develop data pipelines with security, scale, uptime, and reliability in mind
  • Expertise in relational and dimensional data modeling
  • UNIX admin and general server administration experience required
  • Experience with Presto, Hive, Spark SQL, Cassandra, Solr, or other Big Data query and transformation technologies a plus
  • Experience using Spark, Kafka, Hadoop, or similar distributed data technologies a plus
  • Able to expertly express the benefits and constraints of technology solutions to technology partners, business partners, and team members
  • Experience with leveraging CI/CD pipelines
  • Experience with Agile methodologies and the ability to work in an Agile manner preferred
  • One or more cloud certifications.