Accuracy is a key concern when commissioning or creating a point cloud survey. Surveyors are often bound by job specifications that limit error across the finished point cloud to within a few millimetres. Particularly in construction, this level of accuracy is often the very reason a point cloud survey is commissioned at all.
There are four ways in which inaccuracies find their way into point clouds. The first relates to the inherent capabilities of the scanner. Even the best laser scanners are not perfect. All scanners have a maximum inherent accuracy that is tied to distance, typically hovering around +/- 1mm at 10m, although that figure can vary significantly between makes and models.
The second and third sources of error are scene distortions and failures of registration. These two issues often come hand-in-hand. However, they can be avoided through due diligence in the field and the use of high-quality processing software. Verification technology can then confirm that these issues have been overcome, which is why verification is an important step in point cloud processing.
The fourth issue surveyors face is ‘propagation error’. Unlike scene distortions and failures of registration, propagation error is an inherent issue that stems from the other source of inherent error — scanner limitations.
Propagation error describes how errors compound as scans are combined. When building a composite point cloud, scans are effectively built on top of each other, all rooting back to a single ‘home’ scan. The more ‘steps’ between any given scan and its ‘home’ scan, the greater the propagation error it will suffer.
For example, imagine aligning a dataset in which each scan has an error of +/- 1mm. Your home scan will have an error of +/- 1mm. But each step you take away from that scan adds another source of error, and independent errors combine as the square root of the sum of their squares. Therefore, although an adjacent scan has an inherent baseline error of +/- 1mm, its error within the composite cloud will be roughly +/- 1.4mm, because its placement is also affected by the existing error in the home scan. The error keeps growing as you move further away, according to the equation total error = √(e₁² + e₂² + … + eₙ²) [where ‘e’ is the error of each scan in the chain].
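To make the arithmetic concrete, here is a minimal sketch (in Python, assuming an error of +/- 1mm per scan) of how the combined error grows with each registration step away from the home scan, using the root-sum-square combination described above.

```python
import math

# A minimal sketch of the propagation calculation described above, assuming
# each scan in the chain contributes an independent error of +/- 1 mm and
# that independent errors combine as the root of the sum of squares.
def chain_error(per_scan_errors_mm):
    """Combined error for a chain of scans leading back to the home scan."""
    return math.sqrt(sum(e ** 2 for e in per_scan_errors_mm))

# Error within the composite cloud for scans 0, 1, 2, ... steps from home.
for steps in range(4):
    chain = [1.0] * (steps + 1)   # the home scan plus each intermediate step
    print(f"{steps} step(s) from home: +/- {chain_error(chain):.2f} mm")
# 0 step(s) from home: +/- 1.00 mm
# 1 step(s) from home: +/- 1.41 mm
# 2 step(s) from home: +/- 1.73 mm
# 3 step(s) from home: +/- 2.00 mm
```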
Propagation error is an issue all surveyors must surmount when building composite point clouds of any substantial size. This article explains how to do so when scanning for cloud-to-cloud registration, and why the issue makes targetless registration an even more critical tool for surveyors looking to generate point clouds more efficiently.
Propagation error is sometimes considered a larger issue for cloud-to-cloud registration. Broadly speaking, this is not true. Although targets are the answer to error propagation, traditional registration targets identified within the scan field solely by the laser scanner will not help mitigate the problem. The position of a registration target suffers the same inherent inaccuracy as every other point within the scene, because that error rate is set by the fundamental capabilities of the scanner.
Scans aligned using artificial targets and natural features are equally affected by propagation error, relative to the number of scans and the baseline error rate within each scan. However, traditional cloud-to-cloud registration software can require greater scan overlap to create a robust alignment, sometimes verging on 60%. Registration using artificial targets can limit overlap requirements to 30%-40%. If more scans are needed to cover the same sized area, propagation error will become a larger problem more quickly.
Although this is a real concern, its impact is minimised by quality processing software. For example, advanced, vector-based, multi-stage processing software can deliver cloud-to-cloud alignments with overlaps in the 30% range. Software choices are critical to effective cloud-to-cloud registration for a number of reasons. This is one of several aspects that will impact your ability to take a fully targetless approach. Fundamentally, however, once a scene becomes large enough, there is no way to gain sufficient coverage without producing so many scans that an alternative solution for propagation error is necessary, no matter how little overlap you can get away with generating.
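As a rough illustration of why overlap requirements matter, the sketch below estimates how many scans end up in a single registration chain at different overlap levels. The 200m corridor and 20m effective scan footprint are purely hypothetical figures; the only point is that higher overlap means more scans, and therefore more propagation steps, for the same coverage.

```python
import math

# Rough, illustrative estimate (not a field-planning tool) of how the overlap
# a registration method demands drives the number of scans in a chain covering
# the same corridor. Footprint and corridor length are assumed figures.
def scans_needed(corridor_m: float, footprint_m: float, overlap: float) -> int:
    new_ground_per_scan = footprint_m * (1.0 - overlap)   # ground gained per extra scan
    return math.ceil((corridor_m - footprint_m) / new_ground_per_scan) + 1

for overlap in (0.30, 0.40, 0.60):
    print(f"{overlap:.0%} overlap -> {scans_needed(200.0, 20.0, overlap)} scans")
# 30% overlap -> 14 scans
# 40% overlap -> 16 scans
# 60% overlap -> 24 scans
```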
Propagation error can never be removed. The compounding of errors is an inherent aspect of physical measurement. However, it can be minimised. What surveyors need is the ability to fix scans within a wider and more precise context, limiting the number of steps between any single scan and its ‘home’ scan, no matter how large the dataset.
This is where targets come back into play. The only way to minimise propagation error is to use a total station to build a site grid. This involves placing targets throughout the scan field to create a coordinate system, which should be scaled to meet the maximum error threshold of the project.
This can cause confusion because the process is fundamentally different from target placement for point cloud registration, even though it also involves targets. As explained, targets identified solely by a laser scanner suffer the same inherent placement inaccuracy as any other point within the point cloud. Targets used for a site grid are instead measured and placed relative to one another using a total station, which is capable of measurements orders of magnitude more precise than those deliverable by a laser scanner.
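For illustration, here is a minimal sketch (in Python) of one common way to tie a scan into a total-station control frame: a least-squares rigid transform, computed with the SVD-based Kabsch method, fitted between target centres detected in the scan and the same targets’ surveyed site-grid coordinates. The coordinates, names and workflow here are hypothetical, not those of any particular package.

```python
import numpy as np

def fit_rigid_transform(scan_pts: np.ndarray, control_pts: np.ndarray):
    """Best-fit rotation R and translation t mapping scan_pts onto control_pts
    (least squares, SVD-based Kabsch method)."""
    scan_c = scan_pts.mean(axis=0)
    ctrl_c = control_pts.mean(axis=0)
    H = (scan_pts - scan_c).T @ (control_pts - ctrl_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ctrl_c - R @ scan_c
    return R, t

# Hypothetical target centres: as detected in the scan (scanner coordinates)
# and as surveyed with the total station in site-grid coordinates.
scan_targets = np.array([[1.2, 0.4, 0.1], [4.8, 0.5, 0.2], [3.1, 3.9, 1.5]])
grid_targets = np.array([[101.3, 50.2, 10.1], [104.9, 50.4, 10.2], [103.0, 53.8, 11.5]])

R, t = fit_rigid_transform(scan_targets, grid_targets)
residuals = (scan_targets @ R.T + t) - grid_targets
print("per-target residuals (m):", np.linalg.norm(residuals, axis=1))
```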
The result is a coordinate system that (although subject to its own internal propagation error) remains more precise than the error rate within any single scan. Scans that pick up these grid targets can be fixed within a control frame, giving their global placement nearly the same precision that they possess internally. During processing, surveyors can then create multiple ‘home’ scans, building ‘scan trees’ only as large as the accuracy requirements of the job allow. This basic strategy remains unaltered whether or not the scans themselves are aligned using targets, and the spacing of the site grid is likewise unaffected by the alignment strategy.
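To picture what ‘scan trees only as large as the accuracy requirements allow’ means in practice, the sketch below works out how many scans can be chained from a grid-controlled home scan before the combined root-sum-square error exceeds a project tolerance. The tolerance, per-scan error and grid error figures are illustrative assumptions only.

```python
import math

# A minimal sketch, under assumed numbers, of sizing a 'scan tree': how many
# scans can be chained from a grid-controlled home scan before the combined
# (root-sum-square) error exceeds the project tolerance.
def max_chain_length(per_scan_mm: float, grid_mm: float, budget_mm: float) -> int:
    """Largest number of scans in a chain whose combined error, together with
    the site-grid error, stays within the budget."""
    usable = budget_mm ** 2 - grid_mm ** 2
    return max(0, int(usable // per_scan_mm ** 2))

# Assumed figures: +/- 1 mm per scan, +/- 0.5 mm grid error, 3 mm tolerance.
print(max_chain_length(per_scan_mm=1.0, grid_mm=0.5, budget_mm=3.0))   # -> 8
```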
This might seem counterintuitive. If building a site grid requires placing targets, why not place targets in every scan anyway?
The reason propagation error increases the utility of targetless registration is that building a site grid is time-consuming. This additional step amplifies the value of creating efficiencies elsewhere. Although you do have to place targets to build a site grid, they aren’t the same targets you would use to align scans, nor are they measured or placed in the same way. Rather than placing targets in areas of scan overlap, you use a total station to create a coordinate system that encompasses the entire job site. Placing additional targets for scan alignment gains you nothing; it simply adds another manual process on top of the site grid that already has to be created for a robust end product.
Is a targetless approach actually worth it, then? For a long time, the answer has been ‘probably not’. Although time has always been saved in the field by avoiding target placement, processing that data for cloud-to-cloud alignment took so much longer (including additional manual steps in the office) that it wasn’t worthwhile. That, however, has genuinely changed. There is still processing software on the market that won’t deliver when it comes to targetless registration, but there are now real options.
Some of the best software uses multi-stage vector analysis to deliver faster, automated and more robust outcomes. Here, the positional data of the scanner is used to convert each point into a directional vector. This allows an entire scan to be collapsed around a single origin, creating a ‘vector sphere’. The density and directional characteristics of the vectors retain the unique identity of the point cloud. This representation allows for straightforward rotational alignment by placing adjacent vector spheres within one another.
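As a rough sketch of the idea (not any vendor’s actual implementation), the code below reduces a scanner-centred point cloud to unit direction vectors and bins them into a coarse spherical histogram, a simple stand-in for the ‘vector sphere’ signature described above. Bin counts are arbitrary assumptions.

```python
import numpy as np

# Simplified stand-in for a 'vector sphere': each point (in scanner-centred
# coordinates, scanner at the origin) becomes a unit direction vector, and the
# directions are binned into a coarse spherical histogram that acts as a
# rotation signature for the scan.
def vector_sphere(points: np.ndarray, bins: int = 36) -> np.ndarray:
    """Spherical histogram of point directions (azimuth x elevation bins)."""
    directions = points / np.linalg.norm(points, axis=1, keepdims=True)
    azimuth = np.arctan2(directions[:, 1], directions[:, 0])        # -pi .. pi
    elevation = np.arcsin(np.clip(directions[:, 2], -1.0, 1.0))     # -pi/2 .. pi/2
    hist, _, _ = np.histogram2d(
        azimuth, elevation, bins=[bins, bins // 2],
        range=[[-np.pi, np.pi], [-np.pi / 2, np.pi / 2]])
    return hist / hist.sum()                                        # density signature
```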
Once rotational alignment is achieved, horizontal and vertical alignments can be made using rapid 2D point density techniques. By approaching alignment in three stages, rather than holistically, these programmes deliver more robust outcomes at speeds 40%-80% faster, depending on the size of the dataset. More critically, the need to set scan parameters is diminished, and the need to cross-check alignments throughout preprocessing is removed, relegating all manual involvement in processing to the beginning and end of the procedure. Scans can be queued up for hands-off processing, removing any time cost of approaching registration without targets and even creating a more efficient processing workflow than traditional software can deliver for registration using artificial targets.
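Continuing the sketch, the final translation stage described above might look something like this: project each rotationally aligned scan onto a horizontal grid of point densities, then find the XY shift that best matches the two grids via FFT-based cross-correlation. The grid size, cell size and wrap-around simplification are all assumptions made for brevity.

```python
import numpy as np

def density_image(points: np.ndarray, cell: float = 0.1, size: int = 256) -> np.ndarray:
    """2D histogram of point density in the horizontal (XY) plane."""
    extent = cell * size / 2.0
    img, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=size,
                               range=[[-extent, extent], [-extent, extent]])
    return img

def best_xy_shift(img_a: np.ndarray, img_b: np.ndarray, cell: float = 0.1):
    """Approximate XY offset (in metres) to apply to the second scan so its
    density image lines up with the first. Uses circular cross-correlation,
    so edge effects are ignored for brevity."""
    corr = np.fft.ifft2(np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))).real
    ix, iy = np.unravel_index(np.argmax(corr), corr.shape)
    size = img_a.shape[0]

    def signed(i: int) -> int:
        # Wrap FFT indices back to signed shifts in grid cells.
        return i - size if i > size // 2 else i

    return signed(ix) * cell, signed(iy) * cell
```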
Fundamentally, these new types of processing software have set a new industry standard when it comes to registration procedures for all types of point cloud creation. The fact that they deliver a more efficient workflow makes them all the more critical when it is necessary to account for propagation error in large datasets.
The only way to account for propagation error within point clouds is to create a site grid that can act as a control frame for the global placement of scans. To do this, that grid must be developed to a level of precision at least one order of magnitude greater than what your laser scanner is capable of producing. That means using a total station.
This is a critical component of producing an accurate point cloud of any significant size, and the necessity is not affected by the registration method you use to align scans within your dataset. However, the time-consuming nature of developing a site grid makes it even more important to generate efficiencies in other aspects of point cloud creation. That means using quality processing software that removes the historical issues with targetless registration and delivers a streamlined, efficient process for actually aligning your scans. Taking a targetless approach frees you to invest the time needed to create a robust site grid and to deliver the high-quality data that you and your clients want and need.