r.GPUSkin.Support16BitBoneIndex

#Overview

name: r.GPUSkin.Support16BitBoneIndex

The value of this variable can be defined or overridden in .ini config files. It is referenced in 2 .ini config files.

This variable is created as a Console Variable (cvar).

It is referenced in 3 C++ source files.

#Summary

The purpose of r.GPUSkin.Support16BitBoneIndex is to control the bone index format used for GPU skinning in Unreal Engine 5. It determines whether newly imported skeletal meshes use 8-bit or 16-bit bone indices for rendering, depending on the number of bones in the skeleton.

This setting variable is primarily used by the GPU skinning system, which is part of the rendering subsystem in Unreal Engine. It’s also referenced in the Mutable plugin, which is used for runtime mesh customization.

The value of this variable is set through the console variable system. Because it is declared with the ECVF_ReadOnly flag, its value is read from config at engine startup and cannot be changed at runtime.

The GCVarSupport16BitBoneIndex variable interacts with this console variable, storing its value for quick access in the GPU skinning code.

Developers should be aware that:

  1. This setting affects newly imported meshes, not existing ones.
  2. It’s a read-only variable, so it can’t be changed during gameplay.
  3. The choice between 8-bit and 16-bit indices is automatic based on the number of bones (<=256 bones use 8-bit, >256 bones use 16-bit).

Best practices when using this variable include:

  1. Set it appropriately before importing new meshes to ensure optimal memory usage and performance.
  2. Consider the trade-off between memory usage (8-bit indices use less memory) and the maximum number of bones supported (16-bit indices allow for more than 256 bones).
  3. Be consistent with its usage across your project to avoid unexpected behavior with different meshes.
  4. Remember that changing this setting may require reimporting meshes to take effect.
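As the INI references below indicate, Lyra sets this variable in DefaultEngine.ini under the [/Script/Engine.RendererSettings] section. A typical entry looks like the following (the value shown is illustrative):

```ini
[/Script/Engine.RendererSettings]
r.GPUSkin.Support16BitBoneIndex=True
```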

#Setting Variables

#References in INI files

Location: <Workspace>/Projects/Lyra/Config/DefaultEngine.ini:107, section: [/Script/Engine.RendererSettings]

Location: <Workspace>/Projects/Lyra/Config/DefaultEngine.ini:127, section: [/Script/Engine.RendererSettings]

#References in C++ code

#Callsites

This variable is referenced in the following C++ source code:

#Loc: <Workspace>/Engine/Source/Runtime/Engine/Private/GPUSkinVertexFactory.cpp:35

Scope: file

Source code excerpt:

static int32 GCVarSupport16BitBoneIndex = 0;
static FAutoConsoleVariableRef CVarSupport16BitBoneIndex(
	TEXT("r.GPUSkin.Support16BitBoneIndex"),
	GCVarSupport16BitBoneIndex,
	TEXT("If enabled, a new mesh imported will use 8 bit (if <=256 bones) or 16 bit (if > 256 bones) bone indices for rendering."),
	ECVF_ReadOnly);

// Whether to use 2 bones influence instead of default 4 for GPU skinning
// Changing this causes a full shader recompile

#Loc: <Workspace>/Engine/Plugins/Experimental/Mutable/Source/CustomizableObject/Private/MuCO/CustomizableObjectSystem.cpp:571

Scope (from outer to inner):

file
function     void UCustomizableObjectSystem::InitSystem

Source code excerpt:


	// This CVar is constant for the lifespan of the program. Read its value once. 
	const IConsoleVariable* CVarSupport16BitBoneIndex = IConsoleManager::Get().FindConsoleVariable(TEXT("r.GPUSkin.Support16BitBoneIndex"));
	Private->bSupport16BitBoneIndex = CVarSupport16BitBoneIndex ? CVarSupport16BitBoneIndex->GetBool() : false;

	// Read non-constant CVars and do work if required.
	CVarMutableSinkFunction();

	Private->OnMutableEnabledChanged();

#Loc: <Workspace>/Engine/Source/Editor/UnrealEd/Classes/Editor/AssetGuideline.h:15

Scope: file

Source code excerpt:

	FString Section;

	/** From .ini. Ex: r.GPUSkin.Support16BitBoneIndex */
	UPROPERTY(EditAnywhere, Category = General)
	FString Key;

	/** From .ini. Ex: True */
	UPROPERTY(EditAnywhere, Category = General)
	FString Value;