Senior Data Engineer México
Job available in:
59058, Guadalajara, Michoacán de Ocampo, México
Posted on 2026-03-10
Company:
Belcan
Full-time
Job Specializations:
IT/Technology
Data Engineer, Data Warehousing
Job Description
The Senior Data Engineer position is available as a hybrid role for candidates in the Mexico City, Guadalajara, or Monterrey, México areas. The Data Engineer will be responsible for designing and developing robust and scalable data warehousing solutions based on business requirements. Data solutions may involve retrieval, transformation, storage, and delivery of data while following standards and best practices, writing code, and providing production support for the enterprise data warehouse.
Our ideal candidate is a skillful data wrangler who enjoys building data solutions from the ground up and optimizing their performance.
Job Duties:
Data Architecture & Infrastructure Development
- Designs and implements robust, scalable, and high-performance data solutions using Snowflake, dbt, and Python
- Builds and maintains the organization's data infrastructure
- Champions the data warehouse by creating denormalized data foundation layers and normalized data marts
- Works on all aspects of the data warehouse/BI environment, including architecture, design, development, automation, caching, and performance tuning
- Builds infrastructure for optimal extraction, transformation, and loading (ETL) of data from various sources using SQL and cloud data platforms such as Snowflake
- Defines strategies to capture all data sources and assesses the impact of business process changes on data inputs
Data Platform Management & Optimization
- Leads the migration of existing data platforms to Snowflake, ensuring minimal disruption to business operations
- Manages the full lifecycle of data within Snowflake, from ingestion and storage to analytics and reporting
- Conducts performance tuning and troubleshooting of the Snowflake environment to ensure optimal efficiency
- Identifies, designs, and implements internal process improvements, such as re-architecting for scalability, optimizing data delivery, and automating manual processes
Cross-functional Collaboration & Stakeholder Support
- Collaborates with systems analysts and cross-functional partners to understand data requirements
- Works with stakeholders, including Executive, Product, Data, and Design teams, to support data infrastructure needs and resolve data-related technical issues
Innovation & Continuous Improvement
- Continually explores emerging technologies such as Big Data, Artificial Intelligence, Generative AI, Machine Learning, and Predictive Data Modeling to enhance data capabilities
- Performs other duties that may be assigned from time to time
Job Requirements:
· 8+ years of professional experience in the data engineering field
· Polyglot programming expertise; hands-on, current Python experience is a must-have
· Experience with marketing channel data automation, pipeline monitoring, and data delivery is strongly preferred
· Extensive experience designing and developing on the Snowflake Cloud Data Platform
· Proficiency in multi-cloud platforms such as Azure, AWS, and/or GCP
· Proficiency in designing and implementing data pipelines using diverse data sources including databases, APIs, external data providers, and streaming sources
· Demonstrated history of designing efficient data models using Medallion Architecture
· Deep understanding of and experience with relational (SQL Server, Oracle, Postgres, and MySQL) and NoSQL databases
· Experience building and supporting REST APIs for both inbound and outbound data workflows
· Solid grasp of distributed-system concepts to design scalable and fault-tolerant data architectures
· Excellent critical thinking to perform root-cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
· Excellent analytical skills for working with structured and unstructured datasets
· Ability to build processes that support data transformation, workload management, data structures, dependency management, and metadata
· Ability to build and optimize data sets, "big data" pipelines, and architectures
· Ability to understand and tell the story embedded in the data at the core of our business
· Ability to communicate with non-technical audience from a variety of business functions
· Strong knowledge of coding…
Position Requirements
Work experience: 10+ years