1. Introduction
The integration of geotechnical/geological data with BIM (GeoBIM) is critical for managing natural calamities such as earthquakes, landslides, and avalanches, as well as hydrogeological disasters (e.g., debris flows, floods). Accordingly, the use of digital information models is of paramount importance in the analysis and study of future possible scenarios. Climate change, which is affecting the frequency of natural events and their impact on the landscape, has made this requirement more pressing in recent years. As a result, the focus of digitalisation in the Built Environment should be on how to integrate and manage heterogeneous data from different data sources. However, such integration becomes difficult in the case of geotechnical and geological digitisation. In this publication, GeoBIM [1] refers to an environment in which subsoil layer information is used to define a digital information model that may interact with and play a role in Building Information Modelling (BIM) processes.
The initial research on the integration of geodata with BIM was conducted on the merging of BIM and GIS, establishing a basis for collaboration between the two systems [2,3]. Several unresolved issues were identified in those studies, including differences in their modelling methods, the heterogeneous levels of detail and development of BIM and GIS at different scales, and the different uses of BIM and GIS [4,5,6]. Several authors have analysed the current status and prospects of using BIM, machine learning and computer vision techniques as support tools to reinforce the efficient and effective planning, development and management of underground construction [7,8,9,10,11].
All of these studies aimed to create communication between such systems by employing different approaches to integration [12,13,14,15,16,17,18]. One of the key aspects of this topic is related to the analysis of different representation methods for 3D geological models, such as BIM-based, voxel-based and hybrid models [19,20]. Due to the flexibility offered by parametric design, other approaches that involve the use of Visual Programming Languages (VPLs) and dynamic 3D modelling of geology for the design and execution of large-scale infrastructure projects have recently been developed [21]. To provide sufficient integration, several actions that manage/modify the subsoil and its geo-related data and hydrogeological and geotechnical objects and properties must be performed [22,23]. This type of integration would make relevant data available at any stage and drastically decrease data loss between the two domains. Some studies have analysed the role of BIM in the different phases of geotechnical investigation [24] and proposed methods for optimising the procedures of geotechnical surveys [25,26]. In some studies, investigations for the design and management of civil infrastructures have been performed that incorporate the use of non-destructive survey data, such as Mobile Laser Scanner (MLS), Ground-Penetrating Radar (GPR) and satellite-based information [27]. Most of this research has been performed in support of the development of innovative modelling methods for tunnelling projects, where the information related to the subsoil is even more relevant [28,29,30].
The core of GeoBIM is a versatile and comprehensive database [1] that is capable of handling geotechnical data, which are collected digitally and seamlessly transferred throughout the whole design process of underground infrastructure projects, with no manual transformations. This process requires the geo-model to have an “unbroken digital information supply chain”, and its elements need to be directed towards the design process in a way that designers can readily use [1]. Setting up a proper database for an efficient data exchange process is crucial in this approach.
The process begins with the collection of geotechnical data. This phase aims to gather relevant information in order to create suitable soil models and provide appropriate design solutions (e.g., the selection of an appropriate foundation system for the structure, the choice of the soil improvement technique to be used, etc.). A geotechnical model is, for all intents and purposes, a “computational model” in which each element is characterised by parameters that allow the definition of an appropriate soil behaviour model (e.g., thickness of layers, geo-structural features, mechanical and hydraulic parameters). Building a soil model requires a preliminary interpretation of in situ and laboratory tests, as well as the collection of these data in a database. To provide comprehensive interoperability and integration with the superstructures, the database should not only allow the construction of the geometrical model, but also the extraction of all of the information and data in each layer. During this phase, the key challenge is to combine the results of the tests performed in order to identify the stratigraphic model and hydromechanical parameters. This process is rather complex, especially when the site is characterised by high lateral variability or when stratigraphic profiles must be reconstructed by merging the results of several types of tests (e.g., boreholes, CPTs—Cone Penetration Tests). To select an adequate representation of the soil model based on proper soil classification, a comprehensive 3D stratigraphic model will involve the combination, correlation, and simplification of all tests [31]. This is an important step that must be completed while keeping in mind that the 3D model is derived via interpolation methods. As a result, each option must be chosen carefully to generate a simple yet complete representative model of the soil without losing crucial strata or information. In this context, the term classification is employed to describe the procedure of sorting soil into distinct classes, with the assumption that materials within the same class will exhibit similar behaviour [32]. It is worth noting that several classification systems are currently in use: (i) the Unified Soil Classification System [33], adopted by the AGI (Italian Geotechnical Association); and (ii) the HRB-AASHTO system (American Association of State Highway and Transportation Officials), codified in Italy by UNI EN 13242 [34].
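As an illustration of how a classification code can be attached to each record during data collection, the following sketch applies a deliberately simplified subset of USCS-style rules (fines content and Casagrande's A-line only; the full standard also uses gradation and the No. 4 sieve split, which are omitted here). The function and field names are hypothetical and not taken from any of the tools discussed.

```python
# Simplified, illustrative sketch of a USCS-style classification step.
# Assumed inputs: percentage of fines, liquid limit (LL), plasticity index (PI).

def classify_soil(fines_pct: float, liquid_limit: float, plasticity_index: float) -> str:
    """Return a coarse USCS-like group symbol for a soil sample."""
    if fines_pct >= 50:
        # Fine-grained soil: compare PI against Casagrande's A-line, PI = 0.73 * (LL - 20)
        a_line_pi = 0.73 * (liquid_limit - 20)
        if plasticity_index > a_line_pi:
            return "CH" if liquid_limit >= 50 else "CL"   # clays
        return "MH" if liquid_limit >= 50 else "ML"       # silts
    # Coarse-grained soil: distinguishing sand from gravel requires sieve #4 data,
    # so only the generic coarse class is reported in this sketch.
    return "coarse-grained (S/G, sieve data required)"

# Each borehole/laboratory record can then carry its class code into the database:
sample = {"borehole": "S1", "depth_m": 12.5, "fines_pct": 62, "LL": 41, "PI": 22}
code = classify_soil(sample["fines_pct"], sample["LL"], sample["PI"])
```

Attaching the resulting code to every sample record during pre-processing makes the later grouping of layers into geotechnical units far more systematic than manual labelling.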
Another crucial part of the integration of geo-related data and BIM is the information interchange and processing standard, which allows successful management of the built environment and connectivity between BIM and geo-models. Several research groups have attempted to define appropriate standards for the incorporation of geological models into BIM processes. buildingSMART International (bSI) has already introduced an open standard for describing the built environment, known as IFC (Industry Foundation Classes). The “buildingSMART rooms” [35] research group suggested updating the IFC open standard by including geological models for planning as use cases for the standard “IFC-Tunnel”. This standard aims to expand the IFC data model to allow for the precise description of the semantics and geometries of the different elements that make up tunnels, such as geotechnical subsoil conditions and treatments, civil engineering components, and the functional systems that equip them [36]. Geotechnical analysis tools already support IFC import, but this data format does not allow further analysis once the features have been imported. Furthermore, only a few IFC classes are implemented in some tools, severely limiting the options within the exchange scenario.
Other extant digital spatial data and information standards related to 3D geometries include those developed by the Open Geospatial Consortium (OGC): (i) CityGML, an open data model and XML-based format for storing and exchanging virtual 3D city models; and (ii) GeoSciML and GroundWaterML2, which are targeted towards geology and hydrogeology, respectively. GeoSciML is a logical model with GML/XML encoding rules for exchanging geological map data, geological time scales, boreholes, and metadata for laboratory analyses [37]. GroundWaterML2 introduces extra concepts including hydrogeologic units, fluid bodies, discharge, and recharge [36]. OGC also developed the LandInfra/InfraGML format, which focuses on land use, topographic modelling, and infrastructure parts such as roads and rainwater management works [38]. Despite the maturity of these standards, actual integration between geomodelling and BIM systems is still lacking.
Due to the limitations outlined, integration between geotechnical and geological modelling and BIM procedures is currently specified in different workflows. Each of them is characterised by distinct tools and standards, which frequently impede the effectiveness of the integration process. The present study aims to examine several feasible workflows for integrating and managing geotechnical and environmental project data within BIM procedures. This integration is designed to provide a final federated model, which is also archived in the as-built documentation. Four alternative processes were selected for this purpose to analyse the specific types of data and transformations required in the process. The methodologies were applied to a real case study. The originality of the paper lies in the definition of a framework for assessing several GeoBIM integration methods, considering specific parameters for evaluation. Each process was evaluated by assigning different levels of interoperability based on data integration. The Analytical Hierarchy Process (AHP), one of the Multi-Criteria Decision Approaches (MCDAs), was used to conduct the evaluation in order to use both quantitative and qualitative indicators for the assessment. This enabled the effective coordination and interchange of data for the purposes of information integration, which were then used to construct a “multidisciplinary model” of the case study, incorporating data from many sources.
2. The Case Study
The present study was established with the aim of creating a GeoBIM model for the underground area in Turin (Italy) where the new Headquarters of the Piedmont Region was built and where the new “Parco della Salute” hospital will be located.
The project site was included in the requalification and urban development plan for the ex-industrial site Nizza Millefonti, located in the southeast part of Turin. The whole area covers 313,725 m² and is divided into three lots (Figure 1). The first lot will be converted into the “Parco della Salute, della Ricerca e dell’Innovazione”, where a hospital including several healthcare facilities and research centres will be built. Lots 2 and 3 were selected for the construction of the new Headquarters of the Piedmont Region, which was officially opened on 14 October 2022. The skyscraper, which stands 205 m tall and has 42 storeys, serves as the Headquarters of the Piedmont Region. The BIM model of this tower, developed in previous years by the drawingTOthefuture lab at the Polytechnic of Turin [39], is integrated with the federated model of the case study in the present research.
Many investigation campaigns have been performed over the years, making it possible to collect several sets of subsoil data, including hydro-mechanical properties.
Figure 1 shows the localisation of the in situ tests. Specifically, two different investigation campaigns were performed on lots 1, 2, and 3: an environmental survey on the first two lots and a geotechnical investigation on lot 3. Due to their classification as ex-industrial sites, the investigations carried out on lots 1 and 2 in 2004–2005 aimed to identify chemical pollutants (electromagnetic investigations, georadar surveys, electrical tomographies, piezometric surveys, and water and soil analyses). The depth of the boreholes used in this area was limited to 15–17 m due to the nature of the site investigation. In contrast, the survey campaign performed on lot 3 in 2007–2008 was carried out to establish a geotechnical model for the design of the tower. Several geotechnical tests were executed, both in situ (SPT: standard penetration tests; PMT: pressuremeter tests; Lefranc permeability tests; cross-hole tests; and measurement of piezometric levels) and in the laboratory (grain-size distribution tests, Atterberg limits, triaxial tests, and oedometer tests). Boreholes in this lot were drilled to two different depths: up to 80 m in the zone where the skyscraper was built, and up to 40 m in the nearby area.
The following factors influenced the selection of the case study: (i) the presence of many lots with specific characteristics; (ii) investigation campaigns carried out for different purposes and at various depths; and (iii) the broad availability of geotechnical and geo-environmental data from heterogeneous tests. All these factors made the site appropriate not just for testing various geotechnical data and GeoBIM model integration approaches, but also for performing interoperability tests. Lot 3 was subjected to the methodology and the interoperability tests outlined below. Once the appropriate tools in terms of interoperability were identified, data from the investigations of lots 1 and 2 were processed and integrated with those from lot 3 to create a federated model of the whole underground area, as detailed in Section 4.3.
3. Methodology
3.1. Geo-Modelling–BIM Integration Methods
There are currently several workflows specifying methods for integrating geo-modelling and BIM, each based on a combination of tools and software that frequently hinders effective integration. Different approaches can be selected in accordance with an ideal methodological framework that is capable of achieving a high degree of integration, which may be developed starting from the semantic concepts established in the IFC-Tunnel project [17]. A specific taxonomy was defined in this project to clarify the context in which the geological/geotechnical data and models should be exchanged. The taxonomy can be divided into two main categories:
- (a)
GeoDocu (Factual or Base Data), including data related to site investigation, laboratory/in situ tests, borehole data, and tunnel documentation;
- (b)
Interpreted models (GeoModel, HydroModel, and GeotechModel), providing input for further analysis and application. Each taxonomy element requires geometric representations, which can be divided into the following dimensions: 0D point, 1D line, 2D area/surface, 2.5D elevation grid or 3D surface, and 3D volume [36].
For instance, in the case of interpreted models, the element “geological/geotechnical units” must be represented as 3D solid models or 3D surfaces, with a top and/or a bottom layer, such as Faceted BRep, NURBS, or TriangulatedFaceSet [36].
Based on this semantic concept, methodological frameworks are defined, considering the specific types of data required at each stage of the process. These methods are developed with the following objectives in mind: (i) analysing the data to be used as the “input” in the first phase, to better understand how to organise the existing data sets; (ii) processing the data by using different methods and tools for the definition of soil modelling; and (iii) determining the results in terms of information models that can be integrated within a BIM environment, serving as a repository of information and a database for the geotechnical/geological domain that may also interact with and be transformed into BIM features.
In detail, the data input phase includes the following major activities: (i) data pre-processing to adopt the appropriate classification codes chosen for the collected information; (ii) data digitisation, which is helpful for further data elaboration; and (iii) data setup, to manage the import phase. It is important to note that an efficient workflow involves information about the points (e.g., Cartesian points and annotations), surfaces (e.g., triangulated and parametric surfaces), and volumetric representations (e.g., faceted BRep and NURBS).
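The point, surface, and volume information mentioned above can be sketched as minimal data structures for the import phase. The record layouts below are purely illustrative assumptions, not drawn from AGS, GeoSciML, or the IFC-Tunnel proposal:

```python
from dataclasses import dataclass, field

# Hypothetical minimal records for the data input phase: each borehole is a
# point-anchored "factual" object, while interpreted layer surfaces are
# triangulated (TriangulatedFaceSet-like). All field names are illustrative.

@dataclass
class BoreholeLog:
    borehole_id: str
    x: float
    y: float
    z_top: float                                # Cartesian location of borehole head
    layers: list = field(default_factory=list)  # (depth_from, depth_to, class_code)

@dataclass
class LayerSurface:
    unit_name: str      # e.g. a geotechnical unit identified during interpretation
    vertices: list      # [(x, y, z), ...]
    triangles: list     # vertex-index triples describing the triangulation

def setup_import(logs):
    """Data setup: index logs by id so the modelling tool can resolve references."""
    return {log.borehole_id: log for log in logs}

s1 = BoreholeLog("S1", 395200.0, 4989100.0, 232.0,
                 layers=[(0.0, 4.0, "fill"), (4.0, 18.5, "GW"), (18.5, 40.0, "SM")])
index = setup_import([s1])
```

Keeping the classification code inside each layer tuple is what makes the later parameterisation step (linking information content to modelled objects) a straightforward lookup rather than a manual re-entry.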
As stated in the IFC-Tunnel project [36], the data processing step involves the use of factual data to generate interpreted models. This phase can be divided into four major activities: (i) modelling, which requires the use of specific customised commands; (ii) parameterisation, which involves linking information contents to the modelled objects; (iii) data visualisation, filters, and customised views, for organising the data hierarchy; and (iv) export, which necessitates proper management of the associated settings. The content of factual data in this context is highly dependent on existing data and standards (e.g., AGS or GeoSciML) and, possibly, new project-specific site investigation results. In this respect, it is important to highlight that the accuracy of data gathering is largely dependent on the activities of the previous phases. This emphasises the significance of the data pre-processing described in Section 4.1.1. Interpreted models describe the predicted ground conditions, including uncertainty, and may operate as the basis for the design, structural analysis, and definition of construction measures, as well as the representation of contract-relevant predictions of ground conditions [36].
Finally, the data output phase includes the following activities: (i) visualisation of stratigraphies, along with their properties; (ii) model production; (iii) extraction of views in the form of 3D views, cross-sections, etc.; (iv) extraction of information contents related to the properties and characteristics of objects; and (v) setting up the model to coordinate it with other discipline-related models with the goal of creating a data repository. At this stage, the results serve as the input for subsequent analysis and/or design processes, thanks to the creation of a digital repository collecting all of the necessary information produced to ensure integration with BIM processes. Following this methodological framework, several workflows combining many tools and software solutions for geo-modelling–BIM integration can be developed. These integration approaches must ensure an adequate level of interoperability, which can be evaluated throughout all data processing steps. Several tests can be adopted to assess the interoperability of the different approaches. The method described by Fjeldsted et al. [5] was employed in this study.
3.2. Assessing Workflow Alternatives Using the AHP Approach
Different integration approaches can be characterised on the basis of their advantages and disadvantages. Especially within the context of the present research, where different approaches have been developed in recent years, it is of paramount importance to provide a tool to support decision making regarding the selection of one method over another. The innovative scope of the present study is to define the best methods for achieving the aim of integrating and managing geotechnical and environmental project data within BIM processes and to develop a data repository for such information. In this respect, integration procedures must guarantee: (i) time saving, ensuring rapid replicability for application in different contexts; (ii) high levels of interoperability in terms of the type and number of file formats that can be imported/exported, in order to make the procedure flexible; (iii) exchange of information content, i.e., “a set of information organized according to a specific scope for a systematic communication of a set of knowledge within a process” [40], that is effective and efficient; and (iv) a good level of process customisation according to the different characteristics of the specific application. These characteristics define metrics that make the definition of an “ideal” process possible, providing appropriate indications regarding the type of data to be collected, the type of integration that should take place, and the possible outputs.
Based on these metrics, an original framework for assessing integration processes is proposed here. The interoperability tests performed by applying the selected modelling workflows enable a database of ‘values of performance’ to be defined, which is necessary for the development of a framework for assessment. In order to evaluate the performance of the different workflows towards the research goal, a set of Key Performance Indicators (KPIs) was identified and evaluated on the basis of the previously performed interoperability tests.
Since the use of KPIs to assess the efficiency of workflows for modelling purposes represents a novelty, the authors identified specific indicators for their evaluation, providing for each KPI a unit of measurement, a description, and a measurement method, as follows:
- (a)
Workflow execution time [min]: time required to execute the whole workflow. Measurement method: Quantitative.
- (b)
Number of formats supported for import/export (Data exchange) [n.]: sum of the total number of file formats allowed for import and export of data. Measurement method: Quantitative.
- (c)
Information content share (Modelling) [%]: percentage of information preserved through appropriate levels of interoperability. Measurement method: Qualitative.
- (d)
Customisation share [%]: percentage of the workflow with the ability to be customised according to specific user requirements (e.g., the ability to produce different views—3D, cross-sections, etc.—or the ability to achieve a given objective simply by combining different tools). Measurement method: Qualitative.
The values of the first two indicators are determined by measuring, on a computer, the time required to execute the integration approach and the interoperability tests; the values of the final two are determined on the basis of the subjective experience of the operator.
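Because the four KPIs mix units (minutes, counts, percentages), a preliminary normalisation step is convenient before any comparison. The sketch below uses min-max normalisation with direction handling (execution time is a cost-type indicator, where lower is better); this choice and all numeric values are illustrative assumptions, not figures from the study:

```python
# Hypothetical KPI table for four workflows; values are illustrative only.
kpis = {
    "time_min":     [120, 90, 60, 45],          # cost-type: lower is better
    "formats_n":    [4, 6, 8, 12],              # benefit-type: higher is better
    "info_share":   [0.60, 0.75, 0.85, 0.90],   # benefit-type
    "custom_share": [0.50, 0.60, 0.70, 0.80],   # benefit-type
}
cost_type = {"time_min"}

def normalise(values, is_cost):
    """Min-max normalisation onto [0, 1], with 1 = best performance."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    # Flip cost-type indicators so that the lowest raw value maps to 1.
    return [(hi - v) / span if is_cost else (v - lo) / span for v in values]

scores = {k: normalise(v, k in cost_type) for k, v in kpis.items()}
```

Once all indicators share the same scale and direction, they can be combined with criterion weights such as those produced by the AHP described below.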
These indicators can be used to perform a quantitative comparison of different approaches and to determine which one is superior based on the parameters considered. The Analytical Hierarchy Process (AHP), one of the Multiple-Criteria Analysis (MCA) decision-making techniques, is used as the assessment method in the present study. Multi-criteria analysis first appeared in the 1960s as a decision-making method for evaluating alternative projects by comparing and assessing many criteria simultaneously. Several evaluation methods are available, but the AHP method [41] appears to be the best suited for this study owing to its ease of application and its ability to compare multiple options by assigning priority ratings to the distinct criteria/sub-criteria used for comparison.
The approach is divided into four stages: (i) top-to-bottom breakdown, creating a structure with unidirectional hierarchical links across levels; (ii) pairwise comparison, in which the decision elements are compared in pairs in terms of their importance with respect to their control criterion, based on Saaty’s Fundamental Scale, a 9-point scale that determines the relative importance of one choice when compared to another; (iii) judgment synthesis, to define weights for each decision element; and (iv) evaluation and checking of judgment consistency.
Pairwise comparison matrices are formed on the basis of the numerical judgments established at each level of the hierarchy: if n is the number of criteria at a given level and m is the number of options, a square n × n matrix is constructed to compare the criteria, and a square m × m matrix is constructed for each criterion to compare the options.
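The pairwise-comparison, judgment-synthesis, and consistency-checking stages can be sketched as follows. The geometric-mean approximation of the principal eigenvector is used for simplicity, and the 4 × 4 judgment matrix shown is illustrative rather than the matrix actually adopted in the study:

```python
import math

def ahp_weights(A):
    """Geometric-mean approximation of the AHP priority vector (sums to 1)."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A, w):
    """Saaty's CR = CI / RI, with CI = (lambda_max - n) / (n - 1)."""
    n = len(A)
    # lambda_max estimated by comparing A.w row-wise against w
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random-index values
    return ci / ri

# Illustrative 4x4 judgment matrix over four criteria (Saaty's 1-9 scale);
# entry A[i][j] states how much more important criterion i is than criterion j.
A = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]

w = ahp_weights(A)             # priority weights for the criteria
cr = consistency_ratio(A, w)   # judgments conventionally acceptable if CR < 0.10
```

A matrix with CR below 0.10 is conventionally considered acceptably consistent; otherwise the pairwise judgments should be revised before synthesising the final priorities.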
5. Conclusions
A study on the interoperability between geotechnical and environmental data and BIM-based processes with a view to their integration was presented. The research was tested on a case study in a Turin area involving a requalification plan. Using currently available tools and software, four distinct workflow approaches for the integration of geological/geotechnical data in a BIM environment were defined.
Interoperability tests were performed to assess the advantages and disadvantages of each approach. These tests helped the researchers to evaluate each procedure based on the three primary steps that characterise the integration process: data input, data processing, and data output. The application of the integration procedures to the case study revealed that all of them required an initial phase in which the geotechnical and geological information was thoroughly analysed to enhance its utility in producing GeoBIM models.
The results highlighted that it is not possible to determine a priori which method will provide the best outcome, because the requirements and objectives of a GeoBIM model are defined on the basis of the specific project. In this respect, the adoption of a customisable method allowed the performance of the approaches to be assessed. KPIs were selected, and one of the multi-criteria analysis tools was adopted to define the best alternative workflow. The AHP was applied to the case study, and the results showed the great adaptability of such an assessment approach. Indeed, these indicators not only allowed a quantitative analysis to be performed in which a metric for qualitative parameters was defined, but also provided a framework through which a classification can be drawn up and a weight assigned to each comparison criterion. The weights can be assigned so as to emphasise the importance of some criteria over others, depending on the finality of the GeoBIM model.
Accordingly, as explained in Section 4.2, the authors designed a pairwise comparison in which weights were assigned to each criterion to reflect its relative importance to the decision. As the main objective of this evaluation was to assess the interoperability performance and the different levels of integration of the various approaches with BIM-based procedures, data exchange, measured using the indicator “Number of formats supported for import/export”, was taken as the most important criterion in the approach comparison. The calculation of the final priorities showed that workflow n. 4 obtained a higher result (0.441) than workflows n. 1 (0.022), n. 2 (0.198) and n. 3 (0.338). Workflow n. 4 was therefore selected as the optimum approach; this outcome, however, depends on the weighting decided by the authors within the pairwise comparison.
For this reason, no “ideal” workflow exists that perfectly suits every case, and the most important step when performing the assessment is to decide the priorities of the criteria in order to establish the most efficient workflow based on the given priorities. The results of the assessment could therefore differ greatly when priority is given to some criteria instead of others. This means that the most efficient workflow for achieving a specific aim depends on many factors, which must be defined and sorted in order of importance before performing the assessment itself. Due to its flexibility and scalability, the proposed framework serves as a guide for practitioners in need of a clear and straightforward identification of the best-performing workflows.
Finally, a coordinated model of the case study was obtained. This represents a data repository model that collects heterogeneous data and demonstrates how the integration of geotechnical–geological modelling in a BIM environment is crucial to the increasing requirement to create Digital Twins not just for structures or infrastructures, but also for larger areas in territorial contexts. This is critical, for example, in the context of Risk Management for the prevention of floods, fires, and other natural disasters, as well as for defining territorial resilience. Geothermal energy is another sector in which the integration of geo-models and BIM may be quite beneficial (e.g., the GeoFIT project, funded under Horizon 2020). It should be noted that the integration of heterogeneous data from different disciplines into BIM also opens new challenges within the field of Artificial Intelligence; appropriate solutions might incorporate machine learning (ML) algorithms to address complex issues that require the simulation of different scenarios.
Although the findings of this study demonstrate significant benefits for data integration and provide a highly replicable method for multiple contexts, interoperability remains a challenge. One limitation of this study is that the data collected via the interoperability tests could change over time and are strictly dependent on the user who performs the tests. This requires constant updating of the assessment and necessitates the creation of a clear procedure for performing interoperability tests in order to avoid interpretation mistakes in the process. Furthermore, issues such as the definition of classifications and standards for integration remain to be solved. In terms of future research, it would be useful to extend the current findings by addressing these gaps, decreasing data losses in terms of attributes, and enhancing interoperability between the two systems and among specialists.