GCP Data Engineer/Data Architect
Job in 87100 Cosenza, Calabria, Italy
Listed on 2026-01-05
Listing for: Altro
Full Time position
Job specializations:
- IT/Tech: Data Engineer, Big Data
Job Description
A Data Architect is an IT expert who enables data-driven decision making by collecting, transforming, and publishing data. At NTT Data, a Data Architect should be able to design, build, operationalize, secure, and monitor data processing systems, with particular emphasis on security and compliance, scalability and efficiency, reliability and fidelity, and flexibility and portability. The main mission of a Data Architect is to turn raw data into information, creating insight and business value.
Build large-scale batch and real-time data pipelines with data processing frameworks on the GCP cloud platform
Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
Work with the team to evaluate business needs and priorities, liaise with key business partners, and address team needs related to data systems and management.
Participate in project planning: identify milestones, deliverables, and resource requirements; track activities and task execution
Required Skills
Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field
At least 5-10 years’ experience in a data engineering role
Expertise in software engineering using Scala, Java, or Python
Advanced SQL skills, with a preference for BigQuery
Good knowledge of Google-managed services such as Cloud Storage, BigQuery, Dataflow, Dataproc, and Data Fusion
Experience using workflow management tools
Good understanding of GCP architecture for batch and streaming
Strong knowledge of data technologies and data modeling
Expertise in building modern, cloud-native data pipelines and operations with an ELT philosophy
Experience with data migration and data warehouses
An intuitive sense of how to organize, normalize, and store complex data, enabling both ETL and end users
Passion for mapping and designing ingestion and transformation of data from multiple sources, creating a cohesive data asset
Good understanding of developer tools, CI/CD, etc.
Excellent communication skills; empathetic with end users and internal customers.
Nice-to-have:
Experience with the big data ecosystem: Hadoop, Hive, HDFS, HBase
Experience with Agile methodologies and DevOps principles