
VDOT Cloud ETL Developer

Job in Richmond, Henrico County, Virginia, 23214, USA
Listing for: MetaSense Inc
Full Time position
Listed on 2025-12-02
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Data Warehousing, Database Administrator
Job Description & How to Apply Below
Position: VDOT Cloud ETL Developer (752580)

Overview

Join to apply for the VDOT Cloud ETL Developer (752580) role at MetaSense Inc.

Responsibilities
  • Develop ETL to extract business data and spatial data and load it into a data warehousing environment.
  • Design and test the performance of the system.
  • Consult with various teams to understand the company's data storage needs and develop data warehousing options.
  • Apply deep knowledge of coding languages such as Python, Java, XML, and SQL.
  • Work with the Project team members and business stakeholders to understand business processes and pain points.
  • Develop expertise in source system datasets and data lifecycle.
  • Profile source data which may contain a spatial component; review source data and compare content and structure to dataset requirements; identify conflicts and determine recommendations for resolution.
  • Conduct entity resolution to identify record matches, merge duplicates, and resolve semantic conflicts.
  • Elicit, record, and manage metadata.
  • Diagram current processes and proposed modifications using process flows, context diagrams and data flow diagrams.
  • Decompose requirements into Epics and Features and create clear and concise user stories that are easy to understand and implement by technical staff.
  • Utilize progressive elaboration; map stories to data models and architectures to be used by internal staff to facilitate master data management.
  • Identify and group related user stories into themes, document dependencies and associated business processes.
  • Assist Product Owner in maintaining the product backlog.
  • Create conceptual prototypes and mock-ups.
  • Collaborate with staff, vendors, consultants, and contractors as engaged on tasks to formulate, detail and test potential and implemented solutions.
  • Perform Quality Analyst functions such as defining test objectives, test plans, and test cases, and executing test cases.
  • Coordinate and facilitate User Acceptance Testing with Business and ensure Project Managers/Scrum Masters are informed of progress.
  • Design and develop systems for the maintenance of the Data Asset Program (Data Hub), ETL processes, ETL processes for spatial data, and business intelligence.
  • Develop new data engineering processes that leverage a new cloud architecture, extending or migrating existing data pipelines to that architecture as needed.
  • Design and support the DW database and table schemas for new and existing data sources for the data hub and warehouse; design and develop Data Marts.
  • Work closely with data analysts, data scientists, and other data consumers within the business to populate data hub and data warehouse table structures optimized for reporting.
  • Partner with the Data Modeler and Data Architect to refine the business's data requirements for building and maintaining Data Assets.
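To make the ETL responsibilities above concrete, here is a minimal sketch of an extract-transform-load flow over business data with a spatial component. All names (`dw_assets`, the sample records, the WKT point handling) are illustrative assumptions, not VDOT's actual pipeline; SQLite stands in for the warehouse target.

```python
import sqlite3

def extract():
    # Hypothetical source records with a spatial component (WKT points).
    return [
        {"asset_id": 1, "name": "Bridge A", "location": "POINT(-77.43 37.54)"},
        {"asset_id": 2, "name": "Ramp B", "location": "POINT(-77.46 37.55)"},
        {"asset_id": 3, "name": "Sign C", "location": ""},
    ]

def transform(rows):
    # Profile/validate: drop rows missing a spatial component,
    # and split the WKT point into numeric lon/lat columns.
    out = []
    for r in rows:
        if not r.get("location", "").startswith("POINT("):
            continue
        lon, lat = r["location"][6:-1].split()
        out.append((r["asset_id"], r["name"], float(lon), float(lat)))
    return out

def load(rows, conn):
    # Load into a warehouse-style target table (SQLite as a stand-in DW).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dw_assets "
        "(asset_id INTEGER PRIMARY KEY, name TEXT, lon REAL, lat REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO dw_assets VALUES (?, ?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT COUNT(*) FROM dw_assets").fetchone()[0])
```

In practice the extract step would read from the source systems described above (and spatial handling would use a real geometry library or database), but the profile-validate-load shape is the same.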
Qualifications
  • Minimum of 10 years of experience delivering business data analysis artifacts.
  • 5+ years of experience as an Agile Business Analyst; strong understanding of Scrum concepts and methodology.
  • Experience organizing and maintaining Product and Sprint backlogs.
  • Experience translating client and product strategy requirements into dataset requirements and user stories.
  • Proficient with defining acceptance criteria and managing acceptance process.
  • Extensive experience writing complex SQL queries for SQL Server and Oracle.
  • Experience with Azure Databricks, Azure Data Factory, Snowflake.
  • Experience with ESRI ArcGIS.
  • Experience with enterprise data management.
  • Excellent written and oral communication skills and ability to work with diverse peers and customers.
  • Experience with reporting systems, operational data stores, data warehouses, data lakes, data marts.
  • Advanced understanding of data integrations and database architectures.
  • Strong analytical and problem-solving skills; ability to build relationships internally and externally; ability to negotiate and resolve conflicts; ability to prioritize and manage multiple tasks and projects.
  • Desire to learn, innovate and evolve technology.
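"Complex SQL" in qualifications like the above typically means CTEs and window functions. A small illustration, using Python's built-in sqlite3 as a stand-in for SQL Server or Oracle (the `trips` table and its data are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trips (route TEXT, trip_date TEXT, volume INTEGER);
INSERT INTO trips VALUES
  ('I-95', '2024-01-01', 100),
  ('I-95', '2024-01-02', 140),
  ('I-64', '2024-01-01', 80),
  ('I-64', '2024-01-02', 60);
""")

# A CTE plus a window function: per-route running total of traffic volume.
query = """
WITH ordered AS (
  SELECT route, trip_date, volume FROM trips
)
SELECT route, trip_date,
       SUM(volume) OVER (PARTITION BY route ORDER BY trip_date) AS running_volume
FROM ordered
ORDER BY route, trip_date;
"""

for row in conn.execute(query):
    print(row)
```

The same query shape ports to SQL Server and Oracle, both of which support CTEs and the `SUM(...) OVER (...)` windowing syntax.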
Technologies Required
  • Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse
  • IBM Data Stage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, Azure SQL Data Warehouse.
  • Operating System Environments (Windows, Unix).
  • Scripting experience with Windows, Python, and/or Linux shell scripting.
Seniority level
  • Entry level
Employment type
  • Contract
Job function
  • Information Technology
  • Industries:
    Human Resources Services