CloudComPy API tutorial: using an internal API for calculating distances

Tutorial, November 21, 2022

CloudComPy is a wrapper around the CloudCompare API for operations on point-cloud data. It makes it possible to work from Python without calling the CloudCompare CLI.

To get started with CloudComPy, follow the instructions for preparing its environment:

CloudComPy instructions for environment

This tutorial uses the pre-compiled Windows 10 binary variant.

CloudComPy works with multiple file formats (at the time of writing, .las was not supported out of the box). The format used here is LAS version 1.3, which contains the information needed to calculate scalar fields and, eventually, the differences between two point clouds using the M3C2 plugin from CloudCompare.

To use the functions from the library, a few Python modules must be imported first:

```python
import os
import math
import numpy as np
import laspy
import tqdm
from pathlib import Path
from gendata import getSampleCloud
from multiprocessing import cpu_count
import datetime
from time import perf_counter, time
import cloudComPy as cc
```

Setting up the paths and initializing the CloudComPy/CloudCompare module (when dealing with .las files instead of .txt files, see the solution at the end of this tutorial):

```python
path1 = r"C:\Users\szinp\Desktop\lazorScan_project\testData\txtFiles\1E.txt"
path2 = r"C:\Users\szinp\Desktop\lazorScan_project\testData\txtFiles\2E.txt"
cc.initCC()

cloud1 = cc.loadPointCloud(path1)
cloud2 = cc.loadPointCloud(path2)
#? You can also generate point clouds directly from the CloudComPy API
# cloud1 = cc.loadPointCloud(getSampleCloud(1.0))
# cloud2 = cc.loadPointCloud(getSampleCloud(2.0))
```
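As a quick check (not in the original post), one can verify that both clouds loaded correctly before going further. A minimal sketch, assuming loadPointCloud returns None on failure and that size() reports the number of points:

```python
# Sanity check: make sure the files were actually read (assumed behaviour:
# cc.loadPointCloud returns None when the file cannot be loaded).
for name, cloud in (("cloud1", cloud1), ("cloud2", cloud2)):
    if cloud is None:
        raise RuntimeError(f"Failed to load {name}")
    print(f"{name}: {cloud.size()} points")
```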

To distinguish the clouds clearly when visualising them, we give them names:

```python
cloud1.setName("cloud1")
cloud2.setName("cloud2")
```

When the two point clouds differ in position (which is not always the case), an alignment must be performed first. Here this is done with ICP registration:

```python
res = cc.ICP(data=cloud2, model=cloud1, minRMSDecrease=1.e-5,
             randomSamplingLimit=50000, maxIterationCount=40,
             removeFarthestPoints=False,
             method=cc.CONVERGENCE_TYPE.MAX_ERROR_CONVERGENCE,
             adjustScale=True, finalOverlapRatio=0.90, maxThreadCount=0)
tr2 = res.transMat
cloud2ICP = res.aligned
cloud2ICP.applyRigidTransformation(tr2)
cloud2ICP.setName("cloud2_transformed_afterICP")
```
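Before moving on, it can be worth checking the quality of the registration. A minimal sketch, not from the original post; finalRMS on the ICP result and toString() on the transformation matrix are assumptions that may vary between CloudComPy versions:

```python
# Inspect the ICP result (attribute names are assumptions, check your CloudComPy version).
print("ICP transformation matrix:\n", tr2.toString())
print("Final RMS:", res.finalRMS)
```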

Once the point clouds are registered, it is possible to run the main plugin responsible for calculating distances between two point clouds: the M3C2 plugin. First, however, a set of configuration options must be specified for it. At the time of writing, CloudComPy39 offered no option for calculating these parameters automatically.
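Since the scales have to be picked by hand, a rough starting point can be derived from the cloud itself. The sketch below is purely illustrative and not part of the original workflow: it assumes toNpArrayCopy() is available on the cloud and uses a common rule of thumb (normal scale on the order of 20-25 times the mean point spacing); the fixed values in the dictionary below were chosen for this particular dataset.

```python
# Very coarse estimate of the mean point spacing, to help choose M3C2 scales.
pts = cloud1.toNpArrayCopy()                  # Nx3 numpy array of coordinates (assumed API)
extents = pts.max(axis=0) - pts.min(axis=0)
footprint = np.prod(np.sort(extents)[-2:])    # area from the two largest bounding-box extents
mean_spacing = math.sqrt(footprint / len(pts))
suggested_normal_scale = 25 * mean_spacing    # rule-of-thumb multiplier, adjust to your data
print(f"mean spacing ~ {mean_spacing:.4f}, suggested NormalScale ~ {suggested_normal_scale:.4f}")
```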

Configuration File:

```python
m3c2_params_dic = {}
m3c2_params_dic["ExportDensityAtProjScale"] = "false"
m3c2_params_dic["ExportStdDevInfo"] = "false"
m3c2_params_dic["M3C2VER"] = 1
m3c2_params_dic["MaxThreadCount"] = cpu_count()
m3c2_params_dic["MinPoints4Stat"] = 5
m3c2_params_dic["NormalMaxScale"] = 0.283607
m3c2_params_dic["NormalMinScale"] = 0.070902
m3c2_params_dic["NormalMode"] = 0
m3c2_params_dic["NormalPreferedOri"] = 4
m3c2_params_dic["NormalScale"] = 0.141803
m3c2_params_dic["NormalStep"] = 0.070902
m3c2_params_dic["NormalUseCorePoints"] = "false"
m3c2_params_dic["PM1Scale"] = 1
m3c2_params_dic["PM2Scale"] = 1
m3c2_params_dic["PositiveSearchOnly"] = "false"
m3c2_params_dic["ProjDestIndex"] = 1
m3c2_params_dic["RegistrationError"] = 0
m3c2_params_dic["RegistrationErrorEnabled"] = "false"
m3c2_params_dic["SearchDepth"] = 0.709017
m3c2_params_dic["SearchScale"] = 0.141803
m3c2_params_dic["SubsampleEnabled"] = "true"
m3c2_params_dic["SubsampleRadius"] = 0.070902
m3c2_params_dic["UseMedian"] = "false"
m3c2_params_dic["UseMinPoints4Stat"] = "false"
m3c2_params_dic["UseOriginalCloud"] = "false"
m3c2_params_dic["UsePrecisionMaps"] = "false"
m3c2_params_dic["UseSinglePass4Depth"] = "false"
```

```python
#? Here you can specify where the configuration file is saved
paramFilename = os.path.abspath("") + r"\dataFolder\PythonParameters\m3c2_params.txt"
print(f"Writing parameters to {paramFilename}")
with open(paramFilename, 'w') as f:  # write the dictionary as key=value pairs under a [General] section, the format used in CloudComPy's M3C2 examples
    f.writelines(["[General]\n"] + [f"{k}={v}\n" for k, v in m3c2_params_dic.items()])
assert os.path.isfile(paramFilename), "File does not exist"
```

Finally, the M3C2 plugin can be called:

```python
if cc.isPluginM3C2():
    import cloudComPy.M3C2
    print("M3C2 plugin is loaded, attempting to run M3C2")
    CloudAfterM3C2 = cc.M3C2.computeM3C2([cloud1, cloud2ICP], paramFilename)
    print("M3C2 finished")
    print("type(CloudAfterM3C2)", type(CloudAfterM3C2))
    if CloudAfterM3C2 is None:
        raise RuntimeError("M3C2 returned no cloud")
    if CloudAfterM3C2.getNumberOfScalarFields() < 3:
        raise RuntimeError("M3C2 output has fewer scalar fields than expected")
    dic = CloudAfterM3C2.getScalarFieldDic()
    sf = CloudAfterM3C2.getScalarField(dic['M3C2 distance'])
    if sf is None:
        raise RuntimeError("'M3C2 distance' scalar field not found")
    pathToSaveOutput = os.path.abspath("") + "\\dataFolder\\pythonScripts_testOutputs\\" + "bigData_test.las"
    print(f"Saving results to {pathToSaveOutput}")
    cc.SavePointCloud(CloudAfterM3C2, pathToSaveOutput) # OK
    assert os.path.isfile(pathToSaveOutput)
```

In the end, at {pathToSaveOutput} we get a .las file with the calculated scalar field indicating where the differences are. You can load that file in CloudCompare and visualize it there, or use Potree for the same purpose.
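The distances can also be inspected directly in Python before opening the file elsewhere. A minimal sketch, assuming the scalar field exposes toNpArray() as in recent CloudComPy releases:

```python
# Pull the M3C2 distances into numpy for quick statistics (toNpArray() is assumed).
distances = sf.toNpArray()
valid = distances[~np.isnan(distances)]       # M3C2 leaves NaN where no distance could be computed
print(f"points with a valid distance: {len(valid)} / {len(distances)}")
print(f"mean: {valid.mean():.4f}  min: {valid.min():.4f}  max: {valid.max():.4f}")
```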

This procedure is tedious, so at Gter an internal API was written on top of these commands to simplify the steps.

Link to repository

The ComputeClouds library uses higher-order functions to streamline this procedure and also allows quick conversion from .las to .txt files.

List of functions:

1. convertLasTxt() - Converts .las files to .txt (a sketch of such a conversion is shown after this list).
1. createShpereFull() - Creates a point-cloud sphere whose volume is filled with points.
1. createSpherePoints() - Creates a sphere with no points inside (surface only).
1. createSampleCloud() - Creates two sample point clouds that look like a teardrop hitting the water.
1. writeParamsFile() - Creates a file with the parameters for the simulation and returns the path to the file.
1. overwrite_parameters() - Whenever the configuration changes via the dictionary of parameters, this function can be called to overwrite the file.
1. calcM3C2() - Takes two file paths, {firstCd} and {secondCd}, pointing to point clouds and runs M3C2. Loads both .txt and .las files.
1. boundingBox() - Calculates the bounding box of a cloud.
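To give an idea of what the .las to .txt conversion involves, here is a minimal sketch using laspy (laspy 2.x is assumed; the actual convertLasTxt() in the repository may differ):

```python
import laspy
import numpy as np

def convert_las_to_txt(las_path: str, txt_path: str) -> None:
    """Illustrative .las -> .txt conversion writing one 'x y z' line per point."""
    las = laspy.read(las_path)                    # laspy >= 2.0 API
    xyz = np.vstack((las.x, las.y, las.z)).T      # scaled, georeferenced coordinates
    np.savetxt(txt_path, xyz, fmt="%.3f")

# convert_las_to_txt("1E.las", "1E.txt")
```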

Extras:

  • Setting up the environment variables automatically for the Anaconda environment, so there is no need to run envCloudComPy.bat: link

Szymon Zinkowicz

 Date: November 21, 2022
 Tags:  CloudComPy QGIS cloudCompare opensource Potree
