The old miner’s saying ‘All is dark ahead of the pick’ captures the uncertainty of geological and geotechnical conditions in mining as well as in modern-day tunneling. Rapid technological advances in machine manufacturing allow tunnel boring machines (TBMs) to be used in ever more challenging ground conditions. The new generations of TBMs have extensive sensory systems built in, with the collected data available for analysis in near real time. Data collection and processing provide increasing opportunities for analyzing the tunneling process, inferring the face conditions encountered during tunneling, optimizing performance, and providing and maintaining building information modeling (BIM) models for use over the design life of the underground structure. This article offers a snapshot of the current state of data collection and processing in tunneling for a variety of end uses, along with some thoughts on emerging technologies and technology transfers to mechanized tunneling projects that have the potential to impact tunneling practice and be a disruptive force in the industry. The discussion provides a brief background on how big data is changing tunneling and how these emerging technologies are being implemented on different projects.
The basic requirement for success in any tunneling project is sufficient data on the geological and geotechnical conditions along the alignment. Geotechnical exploration provides this data through characterization and testing of a limited amount of sampled material and from field testing and monitoring of instrumentation installed specifically for the project. Existing geological maps and geotechnical archive data supplement the database used for predicting the subsoil conditions that will be encountered during tunneling. The design objectives and boundary conditions to be considered, including the estimated interaction of geotechnical conditions, the tunnel and existing structures, go into the planning and design process and result in the documents that become the basis for the bid process, selection of the tunneling contractor and execution of construction.
The construction process itself is the source of huge amounts of data from various sources that provide the opportunity to add to the project database more detailed information on subsoil conditions within and beyond the tunnel envelope, the behavior of the various geotechnical units in response to tunneling and the interaction of the tunnel and surrounding ground. This data can serve to add to the understanding of the subsoil conditions, thereby either verifying the contractually defined subsurface conditions or identifying differing site conditions (DSC), the latter allowing the contractor to consider a DSC claim and request modifications of the cost and time schedule if a material impact on the tunneling process is determined. Ideally, the data collected during the construction phase is used by all project stakeholders to advance the project in the most effective and efficient way. However, shielded tunneling methods do not allow direct observation of the tunnel walls and, in the case of pressurized-face units, provide limited to no access to observe the face, and thereby require some interpretation of ground conditions. This creates the opportunity for either correct interpretation or misinterpretation of the actual tunnel face conditions encountered.
Construction process monitoring data, TBM operational parameters, instrumentation data on subsurface deformation and hydrostatic head variations, and survey data including InSAR satellite data on ground surface movement add to the database for analyzing cause-effect relationships. Sharing the data between the project parties constitutes the commonly available set of information and increases the probability of establishing the facts and perhaps achieving agreement regarding impacts on cost and schedule.
The collection of data does not end with completion of the tunneling process. Data collection on postconstruction displacement trends of surface structures, or those in close proximity to the tunnel, changes of the groundwater table, and — last but not least — tunnel liner conditions over the service life of the structure provides information that adds to the overall project database. This includes any rehabilitation work on the tunnel during the service life.
An overview of the various sources of relevant data during planning and design, construction and service life of a tunnel is provided in Fig. 1.
Data is increasingly acquired in digital form, e.g., with the use of sensors or by field crews entering observations on tablets during geotechnical exploration and construction monitoring. The resulting electronic databases can be easily shared among project parties, with process controls determining when a file reaches the revision level at which sharing is permitted. Examples include geotechnical logs being shared after validation of field observations by a certain amount of geotechnical laboratory index testing, or tunnel inspectors’ reports being shared after review by the resident engineer. Instrumentation data are often acquired digitally and automatically by total stations that monitor survey points on existing infrastructure within the zone of influence of the tunnel construction. The data from these sources are typically displayed in geographic information system (GIS) software with time and spatial references relatable to one another and made accessible to users online.
For tunneling projects in larger metropolitan areas such as Seattle, WA, where government agencies make archives of geotechnical reports and exploration logs linked to a GIS system available to the public via an online geologic information portal, these archives may gain in importance as a data source, especially where past projects, tunneling or otherwise, required deep explorations and methodical characterization of subsoil conditions. Such archives are an invaluable source of information during the preliminary stages of a site investigation and can result in substantial cost savings for owners.
Data processing (2020)
The design phase of a TBM tunneling project provides a representation of the planned tunnel structure using computer-aided design (CAD) tools and a model of the subsoil conditions the TBM will need to be operated in to excavate the tunnel and build the tunnel liner while minimizing deformations that could impact third parties. These are the basic requirements for conforming to the limits specified in the construction contract. The main components of the geotechnical data in the design and bidding process are compiled in a geotechnical data report (GDR) and the summary interpretation, which allows for a risk sharing program, is contractually described in a geotechnical baseline report (GBR). The GBR is based on analytical evaluations or perhaps numerical modeling to develop the design of tunnel structure and shafts, cross passages and other subsystems and functional components, and combined with the specifications, defines the limits for deformations and surface settlement. Ideally, the geological and geotechnical data are consolidated in a ground model using 3D geologic modeling software.
Although not yet common in mechanized tunneling, a BIM model can already be generated during the design phase with the CAD data imported to BIM software such as Autodesk Revit and others. The basis for BIM is typically a 3D geometrical representation of the tunnel system with various levels of detail. As an example, the segmental liner of a TBM tunnel can be shown as a surface with no thickness dimension at a low level of detail, and with an accurate three-dimensional depiction of individual liner segments and their relation to each other at a higher level of detail. The model can be visualized at any angle in 3D, which can be useful for a better understanding of complex geometries of building elements. However, the actual value of BIM lies in linking the geometrically depicted building elements with information specific to a given element. To use tunnel-liner segments as an example again, this information may include structural reinforcement per design, the as-built position, TBM operational parameters such as thrust forces exerted on the liner ring as recorded during construction, and any damage features observed during tunnel inspections conducted over the service life of the tunnel. Figure 2 shows an example of a BIM representation of a tunnel built by a TBM and lined using concrete segments, with thrust forces exerted on individual elements during the ring-building process depicted using color coding.
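As a conceptual sketch of this element-level linkage, a liner segment can be modeled as a record that carries design, as-built and inspection data together. The attribute names below are illustrative assumptions; an actual BIM schema would be software-specific:

```python
from dataclasses import dataclass, field

@dataclass
class LinerSegment:
    """One precast segment in the BIM model, keyed by ring and position."""
    ring_number: int
    position: str                  # e.g., "A1"; naming is illustrative
    design_reinforcement: str      # per design documents
    as_built_xyz: tuple            # surveyed position (m)
    thrust_force_kN: float         # recorded during the ring build
    inspection_notes: list = field(default_factory=list)

# Linking construction records and a later inspection finding to a segment:
seg = LinerSegment(
    ring_number=412,
    position="A1",
    design_reinforcement="welded wire mesh, type W-2",
    as_built_xyz=(1204.31, 887.95, -41.20),
    thrust_force_kN=2350.0,
)
seg.inspection_notes.append("2030-05: hairline crack at intrados, monitored")
```

A query over such records, e.g., all segments with high recorded thrust and later crack observations, is the kind of cause-effect lookup the BIM linkage enables.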
After the tunnel design and the ground model developed from the geotechnical data gathered during the exploration phase are set, the data collected from the TBM as the tool for tunnel construction is the next key component to be considered for further detailing the BIM model during the construction phase. With modern TBMs generating 200 to 1,000 sensor readings every two to 10 seconds, the large data volume requires filtering to make it manageable. Only the data relevant for the specific evaluation at hand are used and visualized, allowing easy recognition of relevant values and trends for quick decision making as the construction processes proceed.
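The filtering step described above can be sketched in a few lines. The channel name, logging rate, values and alarm band below are illustrative assumptions, not data from any specific TBM:

```python
import numpy as np
import pandas as pd

# Hypothetical 10-minute excerpt of one TBM channel (face pressure, bar)
# logged every 2 s; values are synthetic.
rng = pd.date_range("2020-03-01 08:00", periods=300, freq="2s")
face_pressure = pd.Series(
    2.1 + 0.05 * np.random.default_rng(42).standard_normal(300), index=rng
)

# Downsample to 1-minute means and keep only excursions beyond an assumed
# alarm band, so the dashboard shows trends and outliers rather than raw noise.
per_minute = face_pressure.resample("1min").mean()
alarms = per_minute[(per_minute < 1.9) | (per_minute > 2.3)]
```

The same pattern, downsample, then filter against target bands, scales to hundreds of channels streamed into a process-control database.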
Most modern-day TBM projects utilize process-control software such as PROCON to provide graphic presentations of TBM data in user-configurable diagrams combined in dashboards to the project parties — contractor, owner, engineer — in real time at any location with internet connectivity. A dashboard example of process-controlling software is presented in Fig. 3.
In addition to the automatically recorded data generated by the TBM, the process-control software integrates relevant project documents such as the CAD drawings and the ground model, which requires using a project-wide uniform space and time reference system. Other sources providing relevant data of the interaction between the ground, the tunneling process and the completed structure during and after the construction phase are integrated as well. These data sources include survey data and geotechnical instrumentation data (e.g., surface deformations above the tunnel alignment). These measurements are often digitally acquired and processed and are typically displayed in a GIS environment, where users can easily locate instrument or survey point locations and access the associated measurement data. Additional data sources are interventions for cutterhead maintenance (tool wear), shift reports and other construction reporting generated by user interface or by other form of data compilation. A presentation of the various data sources integrated into PROCON used here as an example for process-controlling software is provided in Fig. 4.
The integrated database allows users to evaluate TBM performance and identify inefficiencies, detect deviations from the predicted ground conditions, conduct downtime analysis, evaluate tool-wear cause-effect relationships and generally analyze the influence of ground conditions on TBM operational aspects (Maidl and Stascheit, 2014).
With completion of the tunneling process, BIM databases — if maintained — will support asset management over the service life of the tunnel.
Emerging and disruptive technologies
The impact that technological developments in data collection and processing may have on a specific industry such as tunneling is difficult to predict; however, the following basic trends seem to be evident:
- Technological advances in sensor technology, including image sensors and digital image processing software, have resulted in increased data collection options over time at lower cost for monitoring systems and physical processes. This includes the use of hyperspectral cameras that can track features not readily visible, such as thermal patterns, gas/liquid traces and surface finishes.
- Commercial use of advanced surveying and remote sensing technologies (laser scanning, laser tracking, LIDAR, InSAR) has increased as new infrastructure projects in dense urban environments require increasing focus on the impact on existing infrastructure.
- Computing power increases exponentially over time as proven to date by the continuing validity of the observation that the number of transistors on integrated circuits doubles approximately every two years (known as Moore’s law). Although Moore’s law cannot continue forever and most semiconductor industry experts expect it to reach its physical limit at some point in this decade (2020s), other technological developments may sustain the general trend of increasing computing power (e.g., a breakthrough in quantum computing and its application).
- The decrease of data storage cost per gigabyte of data over time has in the past been exponential. This includes hardware as well as web services introduced in recent years that offer cloud data storage as a secure and scalable form of data storage on a flexible and accessible platform.
- Similarly, data transfer speeds have increased over time and more diversity in network topologies (star, tree, mesh networking) generally increases the ability to share data.
- Over the past decade, powerful analytics engines for big data processing have emerged. Among them, Apache Spark, a general-purpose cluster-computing framework developed in 2009 by University of California Berkeley’s AMPLab and open-sourced in 2010, has become the basis for a multitude of commercial applications and will continue to be so, with its key use case being the ability to process streaming data and perform data analytics in real time in conjunction with machine learning (ML) capabilities. ML, a branch of artificial intelligence (AI), is a data-analysis method that automates analytical model building based on the idea that systems can learn from data, identify patterns and make decisions with no or minimal human intervention.
- Internet connectivity, open-source databases and computing tools/frameworks, and commoditized services are great equalizers for innovators. Historically, innovation has been driven by high-cost, high-risk processes involving substantial organizational infrastructure. Now, individuals and small-scale enterprises can create small experiments that result in innovative and marketable solutions.
These trends have resulted in a number of emerging technologies that have disrupted certain industries and changed our daily routines. Prominent examples at this time (2020) are the Internet of Things; autonomously operating (a.k.a. self-driving) cars; virtual, augmented, mixed and enhanced reality (Microsoft’s HoloLens 2 becoming a commodity for industrial processes); use of AI-enabled cameras for face recognition, site surveillance and inspection work; and use of unmanned aerial vehicles (UAVs, or drones) for photogrammetric survey, LIDAR survey and geologic mapping (e.g., rock face mapping using the Structure from Motion (SfM) photogrammetric range imaging technique). Recently, drones combined with AI-enabled cameras were used for inspecting mine tailings dams in Brazil in the wake of a dam breach that resulted in hundreds of fatalities.
On the individual level, products such as AI-enabled cameras and UAVs have become toys for the amateur tinkerer. At the Ignite Seattle 2019 short-form speaking event, an Amazon product engineer presented the problem of his housecat bringing half-dead prey into the house in the middle of the night, along with the solution he engineered: a cat-flap lock controlled by an AI-enabled camera (Hamm, 2019). After installing the camera at the cat-flap entry path and taking 23,000 photos of his cat in various states of advancing and leaving, with and without prey in its mouth, he trained a model using SageMaker, a platform for developing and deploying machine learning algorithms, and services for AI-powered transcription and translation. He then constructed an Arduino-powered locking mechanism that keeps the cat flap locked for 15 minutes once the camera identifies the cat approaching the entry with prey in its mouth. The presentation illustrated the everyday utility of AI. These examples are scalable to a variety of industries and can equally be used in tunneling, where monitoring of the muck can identify issues at the face and monitoring of personnel and equipment can improve work safety at the jobsite.
Considerations for future TBM projects — an outlook
Big data analytics can find actionable insights in data. A simple example comes from grocery shopping: if cereal and bananas appear in the same customer’s shopping basket, then milk is also likely to appear in that basket, with a confidence level determined from the dataset used for training. This approach can readily be transferred to the mechanized tunneling process for detecting patterns between TBM operation, ground conditions and surface deformations.
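The basket example can be made concrete with a few lines computing the confidence of an association rule; the baskets are toy data:

```python
# Toy shopping baskets; in a tunneling dataset the "items" could instead be
# discretized TBM operating states, ground classes and settlement responses.
baskets = [
    {"cereal", "bananas", "milk"},
    {"cereal", "bananas", "milk", "bread"},
    {"cereal", "bananas"},
    {"milk", "bread"},
]
antecedent = {"cereal", "bananas"}
consequent = {"milk"}

# Confidence of the rule {cereal, bananas} -> {milk}: of the baskets that
# contain the antecedent, the fraction that also contains the consequent.
support_a = sum(1 for b in baskets if antecedent <= b)
support_ac = sum(1 for b in baskets if (antecedent | consequent) <= b)
confidence = support_ac / support_a  # 2 of the 3 qualifying baskets
```

At production scale the same measure is computed by association-rule miners over millions of records rather than by a loop.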
Machine learning has been applied in tunneling before, however mostly in hard-rock tunneling, where a smaller number of parameters describes face conditions and tunnel excavation processes compared to pressurized-face soft-ground TBM tunneling, resulting in simpler models. Prediction of hard-rock TBM penetration rates based on parameters describing rock strength and joint systems is one example (Benardos, 2008; Martins and Miranda, 2013; Salimi et al., 2015). Use of a trained model to better forecast rock quality based on probe holes and historical probe hole data in drill-and-blast tunneling is another example (Allende Valdes et al., 2019). Several approaches have used trained models to predict TBM tunneling-induced surface settlements, and research is ongoing using numerical simulation models for training and for settlement prognosis in near-real time as the TBM is advancing (Stascheit et al., 2018). Recently, machine learning has also been applied for earth pressure balance (EPB) TBM performance analytics of the Northgate Link tunnel drives in Seattle’s complex and variable glacial geology (Mooney et al., 2018). The data evaluation reportedly identified or confirmed the TBM driving strategy, which was adjusted after a first drive to use less thrust at a lower advance rate in a second drive, minimizing tool wear and improving overall performance by requiring fewer tool maintenance stops.
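As a minimal illustration of the kind of penetration-rate model cited above, a least-squares fit can relate two rock-mass parameters to penetration rate. The training values are synthetic and stand in for the trained models of the cited studies, which use richer parameter sets and algorithms:

```python
import numpy as np

# Synthetic training set: [UCS (MPa), joint spacing (m)] -> penetration
# rate (mm/rev). Values are illustrative only, not from the cited studies.
X = np.array([[120, 0.3], [80, 0.6], [200, 0.2], [60, 1.0], [150, 0.4]])
y = np.array([6.0, 9.5, 3.5, 12.0, 5.0])

# Fit y ~ a*UCS + b*spacing + c by least squares.
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_penetration(ucs_mpa, joint_spacing_m):
    """Predicted penetration rate (mm/rev) for given rock-mass parameters."""
    return coef[0] * ucs_mpa + coef[1] * joint_spacing_m + coef[2]
```

The fitted model reproduces the expected trend: weaker, more closely jointed rock yields higher predicted penetration per cutterhead revolution.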
Building on these applications and assuming the continuing validity of the aforementioned trends, it seems relatively safe to predict an increase of AI applications in mechanized tunneling. The near-future goals may be improved performance forecasting and determination of TBM driving strategies that optimize overall performance, assisting the TBM operator in the decision-making process. This process would shift more and more either to an automated system in which AI data analysis drives the TBM, or to a hybrid system in which the data are analyzed and prepared by an expert system for the end user or operator to make the final decision on implementing the recommendations. New sensor technology provides additional valuable data sources (e.g., recent developments in tool wear monitoring such as recording disc cutter rotation speed and tool temperature sensor data (Mosavat, 2017)).
The earlier reference to new product innovation by individuals, such as the AI-controlled cat-flap lock, indicates that AI applications may be developed for individual TBM projects, tailored to their specific boundary conditions and needs. A possible application of an AI-enabled camera could be monitoring muck flows — either at conveyor belts of EPB TBM operations or at separation plants of slurry TBM operations — with the objective of inferring face conditions in highly variable geology for comparison with the baselined conditions per GBR (Gwildis et al., 2009). Low-hanging fruit in this regard would be identifying rock shards as indicators of encountering coarse components such as cobbles and boulders during a TBM drive through glacial and interglacial deposits. A conceptual sketch is provided in Fig. 5.
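The flagging logic such a camera system could feed might look as follows. This assumes an upstream vision model (not shown) has already reduced each conveyor image to a hypothetical per-ring "shard score"; the threshold and scores are illustrative:

```python
# Assumed alarm level for the per-image shard score delivered by an
# upstream vision model; in practice it would be calibrated against
# the baselined conditions per GBR.
SHARD_SCORE_THRESHOLD = 0.15

def flag_coarse_ground(ring_scores):
    """Return ring numbers whose conveyor imagery suggests rock shards,
    i.e., likely cobbles or boulders encountered at the face."""
    return [ring for ring, score in ring_scores
            if score > SHARD_SCORE_THRESHOLD]

# Hypothetical scores for four consecutive rings:
flagged = flag_coarse_ground(
    [(410, 0.04), (411, 0.22), (412, 0.31), (413, 0.07)]
)
```

Flagged rings could then be cross-checked against the GBR baseline and the TBM operational record for the same chainage.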
Each BIM database of a completed tunnel may be of value for a future tunneling project (e.g., one in similar ground conditions). After all, the computer does not forget as long as data access is maintained. Cloud-based data storage services seem to provide the infrastructure to do so, and these databases may even become a commodity. The value for asset management seems obvious: maintenance inspection data can be added to the design and construction phase data; the performance of the structure can be evaluated against design expectations; unexpected system behavior or deterioration can be linked to relevant information for cause-effect evaluation; and questions, concerns or claims by third parties, such as unanticipated settlement of structures within the area of influence of the tunnel alignment, can be addressed based on data.
TBM projects in urban areas with a high density of existing infrastructure have in the past included involvement of infrastructure owners as third parties to the project in monitoring the operation and its effects. Data were collected and analyzed by the third party to protect its assets. The results of material flow reconciliation, such as the ratio of the theoretical weight of the excavated soil volume versus the combined weight of spoils, conditioners and grout measured during the tunneling and subsequent ring-building processes at a specific ring location, have been mapped to identify lower-risk versus higher-risk areas of possible future settlement (Fig. 6).
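The reconciliation ratio described above can be sketched per ring as follows; the tonnages and the tolerance band are illustrative assumptions:

```python
def reconciliation_ratio(theoretical_t, spoils_t, conditioner_t, grout_t):
    """Theoretical excavated weight over the combined measured weight of
    spoils, conditioners and grout (all in tonnes per ring)."""
    return theoretical_t / (spoils_t + conditioner_t + grout_t)

def risk_label(ratio, band=0.05):
    """Classify a ring; the 5 percent tolerance band is an assumed value."""
    if ratio < 1.0 - band:
        return "higher risk"  # more material removed than theory predicts
    return "lower risk"

# Illustrative tonnages for one ring:
r = reconciliation_ratio(theoretical_t=310.0, spoils_t=305.0,
                         conditioner_t=12.0, grout_t=18.0)
```

Mapping `risk_label` along the alignment produces the kind of lower-risk/higher-risk zoning shown in Fig. 6.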
With further densification of urban areas, the focus on data from both old and new construction will only increase.
Discussion of implications
An increasing amount of data will be generated and processed during all phases of future TBM projects, from the early-planning-phase cradle to the post-service-life grave. The current trend seems to be that all this data will be integrated in a BIM database of the specific tunnel project. Assuming that machine learning will be increasingly used for big data analytics to detect patterns and trends and derive actionable insights from the data base, what will be the implications? The following statements are meant to start the thought process and discussion on disruptive impact on the tunneling industry and do not claim to predict the future.
- Use of geotechnical archive data in metropolitan areas will increase and may result in a reduction of project-specific geotechnical exploration over time. This is especially true as manufacturers get closer to designing the “Universal TBM,” a more flexible system with ample capabilities to identify issues ahead of the face and adapt to upcoming ground conditions. This does not negate the need for understanding the ground conditions; rather, that understanding could come from the actual tunneling operation itself instead of relying only on site investigations.
- The accuracy of tracking face conditions during pressurized-face TBM drives may increase, but the transparency and traceability by the human mind of the underlying analytics may not.
- TBM project databases may become commodities similar to research papers.
- Data analytics of past TBM drives may be increasingly used for pre-bid decisions on TBM selection, drive strategy and operational target parameters.
- Real-time data analytics by AI processes may result in TBM operation assistance and ultimately in autonomously driven TBMs.
- Future tunnel construction contracts may require submission of the contractor’s tunneling plan to include TBM driving strategy and performance modeling results, which may then be compared to independent modeling.
- DSC claims may be based on the difference between actual TBM performance and pre-drive modeling, considering the ground conditions along the tunnel derived from the interpretation of site-related data during TBM advance.
- In densely built metropolitan areas, the focus on the extent to which existing infrastructure is impacted by a new TBM project will increase, hence the need for a better understanding of the ground conditions and impact of machine operation on the surrounding structures. Improved TBM performance as reflected in more efficient operations with minimum impact on the surrounding environment will be the ultimate goal of applying the data analytics in the tunneling industry.
Allende Valdes, M., Merello, J.P., Cofre, P., 2019. Artificial Intelligence Technique for Geomechanical Forecasting. RETC Proceedings. Chicago, IL, June 2019.
Benardos, A., 2008. Artificial Intelligence in Underground Development: A Study of TBM Performance. Underground Spaces, WITPress, 2008.
Gwildis, U.G., Maday, L.E., Newby, J.E., 2009. Actual vs. Baseline Tracking during TBM Tunneling in highly variable Glacial Geology. RETC Proceedings. Las Vegas, NV, June 2009.
Hamm, B., 2019. Cats, Rats, AI, Oh My. Ignite Seattle, June 6, 2019.
Maidl, U., Stascheit, J., 2014. Real Time Process Controlling for EPB Shields – Echtzeit-Prozesscontrolling bei Erddruckschilden. Geomechanics and Tunnelling 7, 2014.
Martins, F.F., Miranda, T.F.S., 2013. Prediction of Hard Rock TBM Penetration Rate based on Data Mining Techniques. 18th International Conference on Soil Mechanics and Geotechnical Engineering Proceedings. Paris, September 2013.
Mooney, M., Yu, H., Mokhtari, S., Zhang, X., Zhou, X., Alavi, E., Smiley, L., Hodder, W., 2018. EPB TBM Performance on the University Link U230 Project. NAT Proceedings. Washington, D.C., June 2018.
Mosavat, K., 2017. A Smart Disc Cutter Monitoring System Using Cutter Instrumentation Technology. RETC Proceedings. San Diego, CA, June 2017.
Salimi, A., Moormann, C., Singh, T.N., Jain, P., 2015. TBM Performance Prediction in Rock Tunneling Using Various Artificial Intelligence Algorithms. 11th Iranian and 2nd Regional Tunnelling Conference Proceedings, November 2015.
Stascheit, J., Ninic, J., Meschke, G., Hegemann, F., Maidl, U., 2018. Building Information Modelling in Mechanised Shield Tunnelling – A Practitioner’s Outlook to the Near Future / Building Information Modelling im maschinellen Schildvortrieb – Ein praxisorientierter Blick in die naehere Zukunft, Geomechanics and Tunneling 11, 2018.