Systems Architect
Listed on 2025-12-12
IT/Tech
Data Engineer, Cloud Computing
Systems Architect – Bixal
This is a full‑time position contingent on a contract award, with a defined performance period of one year and two one‑year option periods.
At Bixal, we want to ensure a transparent and secure application process for all candidates. Official communication will come only from an official Bixal email address. Messages from other sources may be fraudulent, so please exercise care and avoid clicking any links or opening any attachments they include.
Bixal will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment.
Need Assistance or a Reasonable Accommodation?
If you need assistance or a reasonable accommodation to complete your application, we’re here to help. Please reach out to us and let us know how we can support you. You do not need to share personal details or disclose the nature of your request. You can expect a response from a team member within 24 hours during the regular work week, or on the next business day during weekends and holidays.
Why Bixal?
Bixal is a consulting company headquartered in Fairfax, VA, working alongside governments and organizations to help them deliver better services and experiences to the communities they serve. Using evidence‑based knowledge and technology, Bixal empowers clients to deliver on their missions more effectively by fostering a culture of learning and continuous improvement.
Our values:
- People‑First: Emphasizing the importance of people in all aspects of work.
- Collaboration and Transparency: Valuing teamwork and open communication.
- Growth Mindset: Encouraging innovation and continuous improvement.
- Creating Lasting Impact: Focusing on meaningful outcomes and positive change.
The Systems Architect plays a central role in designing, implementing, and optimizing the cloud data solutions that power a large data lakehouse. The Systems Architect will work directly with engineers, data analysts, and stakeholders to develop solutions in Databricks and AWS that are scalable, compliant, and efficient. This position is hands‑on, focused on delivering practical, secure, and maintainable architectures that support data ingestion, transformation, and visualization.
Compensation
The salary range for this role is $135,000 – $155,000. In the spirit of transparency, most offers tend to land near the midpoint of the range. We make compensation decisions thoughtfully, considering your experience, the skills you bring, and our commitment to internal equity.
Responsibilities
- Design and document end‑to‑end data solutions in Databricks and AWS, including ingestion pipelines, transformations, and storage patterns.
- Define technical standards and configurations that ensure performance, reliability, and security.
- Develop and maintain architecture diagrams, schemas, and documentation for engineering teams.
- Review requirements and propose efficient, cost‑effective cloud solutions that align with client standards.
- Support integration of Databricks with AWS services, Amazon QuickSight, and other client systems.
- Other relevant duties as assigned, consistent with qualifications and training.
- Work closely with data engineers to design and implement pipelines using Spark, Delta Lake, and Databricks Workflows.
- Optimize cluster configurations, job performance, and data access for large‑scale workloads.
- Support automation of deployment and monitoring processes using CI/CD and Infrastructure as Code tools.
- Troubleshoot and resolve technical issues across the Databricks and AWS environments.
- Design, deploy, and operate secure hosting for custom and proprietary models using Amazon Bedrock, including model access controls, network isolation, and MLOps‑ready deployment pipelines.
- Collaborate with CMS stakeholders to assess data maturity and establish governance, quality, and transformation approaches that produce AI‑ready datasets within the lakehouse.
- Partner with the client to review cloud usage metrics and cost forecasts for Databricks and AWS services.
- Work with the Security and DevOps…