Cesium for Unity 1.15.2
Options for adjusting how point clouds are rendered using 3D Tiles.
Properties

bool attenuation [get, set]
    Whether or not to perform point attenuation.
float geometricErrorScale [get, set]
    The scale to be applied to the tile's geometric error before it is used to compute attenuation.
float maximumAttenuation [get, set]
    The maximum point attenuation in pixels.
float baseResolution [get, set]
    The average base resolution for the dataset in meters.
Options for adjusting how point clouds are rendered using 3D Tiles.
Definition at line 10 of file CesiumPointCloudShading.cs.
Property Documentation

bool attenuation [get, set]
Whether or not to perform point attenuation.
Attenuation controls the size of the points rendered based on the geometric error of their tile.
Definition at line 19 of file CesiumPointCloudShading.cs.
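A minimal sketch of enabling attenuation at runtime. It assumes these options are exposed through a `pointCloudShading` property on the `Cesium3DTileset` component, which is how the Cesium for Unity API surfaces them; verify the property name against your installed version.

```csharp
using UnityEngine;
using CesiumForUnity;

// Enables point attenuation so rendered point size is driven by
// each tile's geometric error.
public class EnablePointAttenuation : MonoBehaviour
{
    void Start()
    {
        Cesium3DTileset tileset = GetComponent<Cesium3DTileset>();
        tileset.pointCloudShading.attenuation = true;
    }
}
```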
float baseResolution [get, set]
The average base resolution of the dataset, in meters.
For example, a base resolution of 0.05 assumes an original capture resolution of 5 centimeters between neighboring points.
This is used in place of geometric error when the tile's geometric error is 0. If this value is zero, each tile with a geometric error of 0 will have its geometric error approximated instead.
Definition at line 70 of file CesiumPointCloudShading.cs.
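For a dataset captured at roughly 5 centimeters between neighboring points, the base resolution could be set as follows. This is a sketch assuming the same `pointCloudShading` accessor on `Cesium3DTileset` as above.

```csharp
using UnityEngine;
using CesiumForUnity;

public class ConfigureBaseResolution : MonoBehaviour
{
    void Start()
    {
        Cesium3DTileset tileset = GetComponent<Cesium3DTileset>();
        // 0.05 m = 5 cm spacing in the original capture; consulted
        // only when a tile reports a geometric error of 0.
        tileset.pointCloudShading.baseResolution = 0.05f;
    }
}
```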
float geometricErrorScale [get, set]
The scale to be applied to the tile's geometric error before it is used to compute attenuation.
Larger values will result in larger points.
Definition at line 32 of file CesiumPointCloudShading.cs.
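A sketch of scaling the geometric error before attenuation is computed, again assuming the `pointCloudShading` accessor on `Cesium3DTileset`:

```csharp
using UnityEngine;
using CesiumForUnity;

public class ScaleGeometricError : MonoBehaviour
{
    void Start()
    {
        Cesium3DTileset tileset = GetComponent<Cesium3DTileset>();
        // Values above 1.0 inflate each tile's geometric error,
        // producing larger rendered points.
        tileset.pointCloudShading.geometricErrorScale = 2.0f;
    }
}
```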
float maximumAttenuation [get, set]
The maximum point attenuation in pixels.
If this is zero, the Cesium3DTileset's maximumScreenSpaceError will be used as the maximum point attenuation.
Definition at line 48 of file CesiumPointCloudShading.cs.
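A sketch of capping point size explicitly rather than relying on the fallback to `maximumScreenSpaceError`, under the same `pointCloudShading` assumption as the examples above:

```csharp
using UnityEngine;
using CesiumForUnity;

public class CapPointSize : MonoBehaviour
{
    void Start()
    {
        Cesium3DTileset tileset = GetComponent<Cesium3DTileset>();
        // Cap attenuated points at 4 pixels; leaving this at 0 would
        // instead use the tileset's maximumScreenSpaceError as the cap.
        tileset.pointCloudShading.maximumAttenuation = 4.0f;
    }
}
```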