Data Scientist - (AD-L6.13) at Zutari
Role overview
Zutari's Digital Advisory team is expanding its data practice, delivering innovative solutions across energy, transport, water, and buildings. Working at the intersection of engineering and technology, the team partners with BIM and IoT specialists to connect static 3D and asset data with real-time information, bringing digital twins to life.
Role responsibilities
Collaborate with cross-functional teams to deliver multiple concurrent data and analytics projects.
Ingest and consolidate raw data from diverse sources, including Building Information Modelling (BIM), IoT sensors, and ERP systems.
Design, build, and maintain scalable data architectures and end-to-end data pipelines.
Perform data analysis and develop BI dashboards to surface actionable insights for stakeholders.
Research, prototype, and validate machine learning and AI proof-of-concepts to address business challenges.
Minimum requirements
A "can-do" attitude and genuine passion for applying data in engineering and infrastructure domains
Bachelor's degree in Engineering, Computer Science, or a related field
At least 4 years' experience in data engineering and/or data science
Fluent in Python, including object-oriented design, unit testing, and Git version control
Hands-on experience with SQL databases, data warehousing, and data pipeline design
Experience with Docker for containerisation and deployment of data applications
Experience working with time-series data and IoT data streams
Ability to communicate complex technical findings clearly to non-technical engineering stakeholders - both in writing and in presentations
Comfortable leading technical decisions on a workstream and collaborating across multidisciplinary teams (BIM, IoT, engineering)
Advantageous
Master's or higher qualification
Experience in infrastructure sectors (energy, water, transport, or buildings)
Understanding of asset lifecycle data and digital twin concepts, including semantic data modelling or ontologies
Experience with BIM data formats (IFC) or integration with BIM platforms
Data engineering certification
Software development and engineering best practices
Experience leveraging AI coding assistants such as Claude Code (or similar) to increase productivity
Proven experience building and deploying ML/AI solutions beyond proof-of-concept
Cloud and data platform experience: Snowflake, Microsoft Fabric, Databricks, BigQuery, or similar
Streaming and real-time pipeline experience (Kafka, Spark Streaming, or similar)
MQTT / SCADA / industrial IoT protocols (e.g. Sparkplug B, OPC-UA, Modbus)
Familiarity with geospatial data and tools (e.g. GeoPandas, PostGIS, or similar)
Time-series database experience (InfluxDB, TimescaleDB, or similar)