- Company
- Dacomat srl
- Position
- Senior Data Engineer (Rome)
- Last updated
- 10/10/2025
- Work location
- Rome
- Commitment
- Full-time
- Gross compensation
- To be agreed
- Positions available
- 1
- Employment contract
- To be determined
Job offer description
Senior Data Engineer – Job Description
Location: Rome / Hybrid
Industry: Telecommunications / Technology
Reporting Line: Big Data General Manager
Overview
Our client, a leading multinational group in the telecommunications and technology sector, is seeking a highly skilled Senior Data Engineer to join its Big Data division. This strategic role is embedded within a high-impact team responsible for designing and operating scalable data platforms that support advanced analytics across global operations.
The ideal candidate will have extensive hands-on experience in designing, implementing, and managing robust data pipelines across batch, streaming, and real-time integration paradigms. The candidate will have 10+ years’ professional experience in the Information Technology sector and at least 5 years of proven experience on Data Engineering projects or in Data Governance teams operating in large corporations; experience in the Telecommunications or Finance industry is highly preferable.
These pipelines will operate within Data Warehouse, Data Lake, and Data Lakehouse environments, leveraging technologies such as Teradata Vantage, Databricks, and Azure Cloud Data Services.
A strong command of Relational and Multidimensional Data Modelling is essential, along with advanced programming skills in Python, PySpark, and SQL. Experience with Geospatial data processing and Graph Databases is highly desirable.
The candidate should also demonstrate knowledge of core computer science principles, including Algorithms and Data Structures, Computer Architecture, Operating Systems, Programming Languages, Compilers and Interpreters, and the Theory of Computation.
This role offers the opportunity to contribute to the evolution of a modern data ecosystem, enabling real-time insights and data-driven decision-making at scale.
Key Responsibilities
- Design and implement reliable and efficient data pipelines on Vantage, Databricks, and Azure Data Services (see the illustrative sketch after this list).
- Integrate and transform data from diverse sources, including relational databases, cloud object storage, Event Stream Processing platforms, and Web Services.
- Develop and maintain scalable data architectures to support business intelligence, machine learning and Agentic AI initiatives.
- Model complex datasets using both relational and multidimensional approaches (e.g., star/snowflake schemas).
- Work with geospatial datasets and graph-based data structures to support advanced analytics use cases.
- Collaborate with cross-functional teams including Data Scientists, Analysts, and Business Stakeholders.
- Ensure data quality, governance, and security across all engineering processes.
- Monitor and optimize performance of distributed systems and data workflows.
- Contribute to the evolution of the company’s data strategy and platform architecture.
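For illustration only, here is a minimal PySpark sketch of the kind of batch pipeline the responsibilities above describe; it assumes a Databricks/Delta environment, and the storage path, column names, and target table are hypothetical placeholders, not details from this posting.

```python
# Minimal batch-pipeline sketch (illustrative; all names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_batch_pipeline").getOrCreate()

# Ingest raw CSV files from cloud object storage (hypothetical path).
raw = (
    spark.read
    .option("header", True)
    .csv("abfss://landing@example.dfs.core.windows.net/orders/")
)

# Basic cleansing and typing before publication.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Persist to a Delta table for downstream analytics (Lakehouse pattern).
clean.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```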
Education
- Master’s or PhD in Computer Science, Engineering, Statistics, Physics, or Mathematics.
Required Technical Skills
- Data platforms: Databricks (Apache Spark), Azure Cloud Data Services (Fabric, Synapse, Functions, etc.), Teradata Vantage
- Languages: Python (including PySpark), SQL, Unix Scripting
- Relational databases: Teradata Vantage, SQL Server, PostgreSQL, MySQL
- Geospatial data processing: GeoPandas, PostGIS, SQL Geospatial
- Graph Databases: Neo4j, Cosmos DB with Gremlin
- Data Modelling: Relational Modeling (3NF), Multidimensional Modeling (star/snowflake schemas, slowly changing dimensions, etc.); a minimal star-schema sketch follows this list.
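As a purely illustrative sketch of the multidimensional modelling mentioned above, the following PySpark snippet queries a hypothetical star schema (one central fact table joined to its dimensions); the table and column names are assumptions, not part of this posting.

```python
# Star-schema query sketch (illustrative; all tables/columns are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_demo").getOrCreate()

revenue_by_region = spark.sql("""
    SELECT d.calendar_month,
           c.region,
           SUM(f.amount) AS total_revenue
    FROM   fact_sales   f                              -- fact table
    JOIN   dim_date     d ON f.date_key     = d.date_key
    JOIN   dim_customer c ON f.customer_key = c.customer_key
    GROUP  BY d.calendar_month, c.region
""")
revenue_by_region.show()
```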
Soft Skills & Leadership Competencies
- Strong analytical thinking and problem-solving capabilities.
- Excellent communication skills, with the ability to engage both technical and non-technical stakeholders.
- Collaborative mindset and ability to work effectively in cross-functional teams.
- High level of initiative, autonomy, and ability to manage multiple priorities.
- Adaptability to fast-paced environments and emerging technologies.
- Technical leadership and mentoring capabilities.
- Strategic vision and business-oriented approach to data solutions.
Preferred Qualifications
- Certifications in Azure, Databricks and Teradata.
- Experience with CI/CD pipelines, Git, and DevOps practices.
- Familiarity with data governance frameworks and GDPR compliance.
Required skills
If you are interested, please send your CV to m.dangelo@dacomat.com, specifying:
- your availability (notice period, if any)
- your salary expectations (gross annual salary, desired net monthly salary / daily rate)
This announcement is addressed to candidates of both sexes, pursuant to Italian Laws 903/77 and 125/91, and to people of all ages and nationalities, pursuant to Legislative Decrees 215/03 and 216/03.