Job purpose
As part of the Global IT organization, you will be responsible for leading and supporting large-scale implementation projects and building robust data ingestion pipelines from ERPs, cloud-based SaaS systems, and other environments into our central data warehouse (Snowflake). You will develop reusable and scalable data models and data products that form the foundation for governed self-service BI & Analytics.
Main selling points of the job
We are seeking a skilled and motivated Data Engineer to join our BI & Analytics team at our Pune office. In this role, you will lead the design and implementation of scalable, reusable, and optimized end-to-end data pipelines. You will collaborate closely with both technical and business stakeholders to gather requirements, develop data models, and deliver high-quality solutions that drive insight and performance.
Main accountabilities and tasks
- Leading and supporting large-scale implementation projects, managing internal and external resources
- Identifying relevant data sources based on business needs and system architecture
- Building robust data ingestion pipelines from ERP / cloud-based SaaS systems and other environments into our central data warehouse (Snowflake)
- Designing and implementing transformation pipelines to deliver clean, structured data models for BI reporting tools
- Defining reusable, scalable data models (e.g., star/snowflake schemas)
- Enhancing and extending our common data integration framework
- Debugging and optimizing existing pipelines for performance and reliability
- Automating testing procedures to ensure consistent data quality
Desired experience and qualifications
Work experience: 6+ years
Education: Master's degree in Computer Science
Languages: English
Skills and qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field
- Minimum 5 years of hands-on experience in:
  - Data Management (Snowflake, Azure Data Lake Store, dbt)
  - Data Modeling (Inmon, Kimball, Data Vault)
  - Data Ingestion & Processing (Azure Data Factory, Azure Functions, Stream Analytics, API Management)
  - DevOps practices (CI/CD, test automation, Azure DevOps, Git/GitHub)
- Strong business acumen and proven ability to translate data into actionable insights
- Experience with enterprise systems such as SAP (ECC/S4), Microsoft Dynamics 365, Salesforce, and others across multiple business domains such as Sales, Production, Materials Management, or Finance
- Project management skills: certifications in PM, ITSM, or ITIL are a plus
- Familiarity with agile methodologies (Scrum, SAFe, etc.)
- Strategic thinker with a detail-oriented mindset and a positive, proactive attitude
- Excellent communication skills across all organizational levels
- Demonstrated success in delivering results within a global matrix organization
- Fluency in English is a must
Position Requirements: 10+ years of work experience