fix(core): Auto-convert uint8 buffers to uint16 #2486
felixpalmer merged 2 commits into 9.2-release
Conversation
Thanks @felixpalmer, looks really promising. I suspect that this is related to my issue. The problem seems most severe in areas where the ground resolution is low or the tiles would have been simplified a lot (water areas, rural areas, etc.). I'm guessing these are areas with a lot of simple tiles.
charlieforward9
left a comment
First time poking around in luma. I appreciate the attention on this issue and exposure to a lower level.
chrisgervang
left a comment
Does this need to be mentioned in the docs, or are there no consequences to a user?
Done: 068be39. In general I don't expect the user will notice; this is more about removing a gotcha.
This appears to have fixed my issue. Thanks!
Background
WebGPU doesn't support Uint8Array index buffers, but the GLTF loader can produce them (as could any app). Many small 3D tiles end up with a Uint8Array index buffer, which causes luma to throw an error:
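For context, the WebGPU spec only defines two index formats, `'uint16'` and `'uint32'`, so there is no way to hand a `Uint8Array` to the GPU as index data. A minimal sketch of the kind of check that rejects such a buffer (the function name `indexFormatFor` is hypothetical, not luma.gl's API):

```typescript
// Hypothetical sketch: map a typed array of indices to a WebGPU
// GPUIndexFormat. WebGPU only defines 'uint16' and 'uint32', so a
// Uint8Array has no valid index format and must be rejected (or
// converted, as this PR does).
function indexFormatFor(
  indices: Uint8Array | Uint16Array | Uint32Array
): 'uint16' | 'uint32' {
  if (indices instanceof Uint32Array) return 'uint32';
  if (indices instanceof Uint16Array) return 'uint16';
  // No 'uint8' index format exists in WebGPU
  throw new Error('Uint8Array index buffers are not supported by WebGPU');
}
```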
As a result the tile doesn't load and there is a hole in the terrain:
To resolve this, the PR automatically converts uint8 buffers to uint16. Since these buffers are typically small, the extra memory isn't significant.
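The conversion itself is straightforward: the `Uint16Array` constructor accepts another typed array and copies it element by element, zero-extending each value. A sketch of the idea (the helper name is illustrative, not the actual luma.gl internal):

```typescript
// Illustrative sketch: widen a uint8 index buffer to uint16 so it
// matches one of WebGPU's supported index formats.
function toUint16Indices(indices: Uint8Array): Uint16Array {
  // new Uint16Array(typedArray) copies and zero-extends each element.
  // Memory use doubles, but index buffers of this kind are small.
  return new Uint16Array(indices);
}
```

Every uint8 value fits in uint16 without loss, so the conversion cannot change the geometry, only the storage format.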
After this PR:
Change List