Associate Vice President - Senior Lead Data Engineer T500-23254

Job in 500001, Hyderabad, Telangana, India
Listing for: Deutsche Börse
Full Time position
Listed on 2026-02-17
Job specializations:
  • IT/Tech
    Data Engineer, Data Analyst, Cloud Computing
Job Description
About Deutsche Börse Group:
Headquartered in Frankfurt, Germany, Deutsche Börse Group is a leading international exchange organization and market infrastructure provider. It empowers investors, financial institutions, and companies by facilitating access to global capital markets.
The Group's India centre, located in Hyderabad, serves as a key strategic hub and draws on India's top-tier tech talent. It focuses on crafting advanced IT solutions that elevate market infrastructure and services. Deutsche Börse Group in India is composed of a team of capital market engineers forming the backbone of financial markets worldwide.

Corporate IT of Deutsche Börse Group is in charge of the end-user workplace experience, voice & communication, and application development and operations for all group processes such as Financial Core, Customer Care, and Control & Corporate Processes, as well as Deutsche Börse Group's Reference Data Platform. We also develop and operate the group's Enterprise Analytics solutions, which form the core of sharing and measuring our group's success.

Our mission is simple – Make IT Run! As a member of the Enterprise Analytics team, you must be an experienced and inspiring specialist in the dimensions of technology and analytics, with a proven record in strategic or operational execution. You are a role model in a team of junior and senior talents with diversity in experience, background, and location. Translating business processes into numbers and KPIs, fully digital and automated, must be in your DNA. You must embrace transparency and simple access to data from diverse ecosystems (SAP and non-SAP) via the Data Mesh, so that decisions can be made faster based on reliable data.

The perfect candidate for this role will have a “can do” positive attitude. If you strive to take ownership and develop creative solutions, are fascinated by technology, and like to work in a challenging and fast-paced environment – then you are exactly the person we are looking for.

Tasks / Responsibilities:
Conception and implementation of innovative data analytics solutions and projects for our business partners on the Enterprise Analytics & Reporting Platform.
Design and Development:
Create and manage data models and data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Data Fusion, and Cloud Storage (a minimal illustrative sketch follows this list).
Data Integration:
Integrate data from various sources, ensuring data quality and consistency.
Optimization:
Optimize data processing workflows for performance and cost-efficiency.
Security and Compliance:
Implement data security measures and ensure compliance with relevant regulations and best practices.
Monitoring and Maintenance:
Monitor data pipelines and troubleshoot issues to ensure smooth operation.
Collaborate with various stakeholders to analyse, define, and prioritize business requirements and translate them into technical specifications.
Build, strengthen, and maintain close relationships with our main stakeholders.
Foster and drive the analytics and reporting culture across the whole organization.
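
By way of illustration only, here is a minimal sketch of the kind of Cloud Composer (Airflow) pipeline this role covers: it loads raw files from Cloud Storage into BigQuery and then runs a SQL transform. All project, bucket, dataset, and table names are hypothetical placeholders, not part of the actual platform.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_trades_load",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw CSV files from a GCS landing bucket into a raw BigQuery table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_trades",
        bucket="example-landing-bucket",            # placeholder bucket
        source_objects=["trades/{{ ds }}/*.csv"],   # files for the run date
        destination_project_dataset_table="example-project.raw.trades",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the raw rows into a curated reporting table with standard SQL.
    transform = BigQueryInsertJobOperator(
        task_id="build_daily_kpis",
        configuration={
            "query": {
                "query": """
                    SELECT trade_date, COUNT(*) AS trade_count
                    FROM `example-project.raw.trades`
                    GROUP BY trade_date
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "curated",
                    "tableId": "daily_trade_kpis",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform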

Qualifications / Required skills:

Technical or University Degree in Business Informatics, Business Management, Information Technology or a similar field paired with a passion for data
Professional experience in data engineering (5+ years)
Distinguished capabilities in ETL/ELT development and in deploying data processing pipelines using CI/CD.
Background in cloud technologies, preferably Google Cloud and its associated services (BigQuery, BigQuery ML, Dataflow, Data Fusion, Cloud Composer, Cloud Run); Azure, AWS, and others would be an advantage (see the streaming pipeline sketch after this list).

Experience with Infrastructure as Code (IaC) tools like Terraform for deploying and managing GCP resources, with a basic understanding of the Linux operating system.
Very good knowledge of programming languages such as SQL, Python, and Java, with experience in Java Spring Boot.
Experience with setting up VMs and VPCs.
Experience with Cloud Workstations and Cloud Shell.
Experience with the Cortex Framework.
Experience with GitHub repository setup.
Experience with containerized applications and tools such as Docker and Kubernetes.
Ability to effectively explain relevant analytics concepts and technologies, with a strong passion for analyzing business needs together with the various stakeholders.
Understanding of modern analytics tools (e.g. SAP Analytics Cloud, Power BI, Looker Studio)
Proven expertise in using an agile project methodology (Scrum)
Good communication with peers, technical teams, and business representatives
Fluent English is a must
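
For orientation only, a minimal Apache Beam sketch of the kind of streaming pipeline that would run on Dataflow: it reads JSON events from a Pub/Sub subscription and appends them to a BigQuery table. The subscription, table, and schema names are hypothetical placeholders.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run() -> None:
    # Streaming mode; launch on Dataflow with --runner=DataflowRunner.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Consume raw messages from a (hypothetical) Pub/Sub subscription.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/trades-sub"
            )
            # Each message is assumed to be a JSON-encoded trade event.
            | "ParseJson" >> beam.Map(json.loads)
            # Stream the parsed rows into a (hypothetical) BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="example-project:market_data.trades",
                schema="trade_id:STRING,price:FLOAT,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()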
Position Requirements
10+ years of work experience