Senior Data Engineer

Job in Carmel, Hamilton County, Indiana, 46033, USA
Listing for: Byrider Franchising, LLC.
Full Time position
Listed on 2026-02-14
Job specializations:
  • IT/Tech
    Data Engineer
  • Engineering
    Data Engineer
Salary/Wage Range: 90,000 - 120,000 USD yearly
Job Description

Senior Data Engineer - Byrider Corporate - 12802 Hamilton Crossing Blvd., Carmel, IN 46032

Rewards for Senior Data Engineer:
  • Competitive starting salary
  • Annual bonus
  • Great benefits & paid time off
  • Growing national company in business for 37 years
  • Nice office with plenty of parking
  • Hybrid work schedule
Job Summary:

We seek a highly skilled and experienced Senior Data Engineer to join our dynamic team. As a Senior Data Engineer, you will play a critical role in designing, developing, and maintaining our data infrastructure, ensuring the availability, reliability, and performance of our data systems. You will drive data engineering initiatives and collaborate closely with cross‑functional teams to support data‑driven decision‑making within the organization.

You must also be skilled at finding solutions to problems while keeping the environment well‑structured, stable, and secure. This position reports to the Director of Solutions Engineering.

Specific Responsibilities:
  • Data Architecture:
    Lead the design and development of complex data pipelines, data models, and data integration solutions that meet business requirements and performance standards
  • Data Ingestion:
    Architect and oversee the development of efficient and scalable data ingestion processes from various sources, including databases, APIs, and external data providers (AWS Redshift, AWS DynamoDB, AWS S3, MS SQL, etc.)
  • Data Transformation:
    Develop advanced data transformation pipelines and ETL processes to convert raw data into structured and meaningful formats
  • Data Quality:
    Establish and enforce best practices for data quality, validation, cleaning, and error handling
  • Data Storage:
    Manage and optimize data storage solutions, including databases, data lakes, and cloud storage services
  • Data Security:
    Implement advanced data security measures and access controls to protect sensitive data
  • Performance Optimization:
    Continuously monitor and optimize the performance of data pipelines and databases, implementing best practices to ensure efficient data processing
  • Collaboration:
    Collaborate closely with leadership, analysts, and business stakeholders to understand their data requirements and deliver advanced data solutions that meet their evolving needs
  • Innovation:
    Stay at the forefront of emerging data engineering technologies, industry trends, and best practices, and drive innovation within the data engineering domain
  • Documentation:
    Maintain comprehensive documentation of data pipelines, data models, and processes, and ensure knowledge sharing within the team
Products and Stacks:
Languages
  • C#
  • TypeScript
  • Python
  • T-SQL
  • Java
  • Scala
Frameworks
  • .NET Core
  • .NET Framework
  • ASP.NET
Database
  • MS SQL
  • DynamoDB
  • Redshift
  • PostgreSQL
Amazon Web Services
  • ECS
  • CloudWatch
  • Lambda
  • S3
  • Secrets Manager
  • DynamoDB
  • API Gateway
  • SNS
  • SQS
  • SES
  • CloudFormation
  • Glue
Azure Cloud Services
  • Service Fabric Clusters
  • Service Bus
  • App Services
  • Application Insights
  • Key Vaults
  • Databricks
  • Function Apps
Tools and Platforms
  • Docker
  • Bitbucket
  • Azure DevOps
  • Jira
  • Confluence
  • Lucidspark
  • Looker
Skills:
  • Proven experience as a Senior Data Engineer, with a strong understanding of data security and data access controls
  • Proficiency in programming languages such as Python, Java, or Scala
  • Strong SQL skills and deep experience with database technologies (e.g., SQL, NoSQL)
  • Expertise in data warehousing and ETL tools (e.g., Kodda, AWS Glue)
  • Extensive knowledge of data modeling techniques and data warehouse design
  • Excellent problem‑solving and communication skills
  • Ability to work collaboratively in a team, adapt to a fast‑paced, evolving environment, and work individually to accomplish goals
  • Experience with cloud platforms (e.g., AWS, Azure) and containerization (e.g., Docker, Kubernetes)
  • Strong commitment to data accuracy, data quality, and data security
  • Ability to work with minimal supervision.
Qualities:
  • Strong teamwork is a must
  • Resilience and resourcefulness
  • Strong customer service focus
  • High energy with self‑motivation
  • Ability and eagerness to solve problems
Educational Requirements:
  • Bachelor's degree in Computer Science, Information Technology, or equivalent experience
  • Technology‑related certifications are a plus
Experience Required:
  • 3+ years of Python experience
  • 3+ years of SQL experience
  • 3+ years of experience with data modeling and data warehousing
  • Experience with frameworks and products described in the “Products and Stacks” section
  • Experience with standard tools such as the Atlassian product suite, AWS, and Databricks
Position Requirements
10+ years of work experience