Interface IJobEntityBatchWithIndex
IJobEntityBatchWithIndex is a variant of [IJobEntityBatch] that passes an additional indexOfFirstEntityInQuery parameter to Execute. This parameter is the index of the batch's first entity within the full set of entities selected by the query, computed as the running sum of the entity counts of all preceding batches.
Namespace: Unity.Entities
Syntax
[JobProducerType(typeof(JobEntityBatchIndexExtensions.JobEntityBatchIndexProducer<>))]
public interface IJobEntityBatchWithIndex
Remarks
Schedule or run an IJobEntityBatchWithIndex job inside the OnUpdate() function of a SystemBase implementation. When the system schedules or runs an IJobEntityBatchWithIndex job, it uses the specified EntityQuery to select a set of chunks. These selected chunks are divided into batches of entities. A batch is a contiguous set of entities, always stored in the same chunk. The job struct's Execute function is called once for each batch.
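As a sketch of this pattern, a job struct implementing the interface might look like the following. The job, field, and array names are hypothetical; Translation comes from Unity.Transforms.

```csharp
using Unity.Collections;
using Unity.Entities;
using Unity.Mathematics;
using Unity.Transforms;

// Hypothetical job that copies every selected entity's position into a
// flat output array, using indexOfFirstEntityInQuery to compute each
// entity's slot within the whole query.
public struct CopyPositionsJob : IJobEntityBatchWithIndex
{
    [ReadOnly] public ComponentTypeHandle<Translation> TranslationTypeHandle;

    [NativeDisableParallelForRestriction]
    public NativeArray<float3> Positions; // length = query.CalculateEntityCount()

    public void Execute(ArchetypeChunk batchInChunk, int batchIndex, int indexOfFirstEntityInQuery)
    {
        NativeArray<Translation> translations =
            batchInChunk.GetNativeArray(TranslationTypeHandle);
        for (int i = 0; i < batchInChunk.Count; i++)
        {
            // Each batch writes a disjoint range of the output array,
            // so parallel execution is safe.
            Positions[indexOfFirstEntityInQuery + i] = translations[i].Value;
        }
    }
}
```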
When you schedule or run the job with one of the following methods:
- ScheduleSingle<T>(T, EntityQuery, JobHandle)
- ScheduleParallel<T>(T, EntityQuery, JobHandle)
- Run<T>(T, EntityQuery)

all the entities of each chunk are processed as a single batch. The ArchetypeChunk object passed to the Execute function of your job struct provides access to the components of all the entities in the chunk.
Use ScheduleParallelBatched<T>(T, EntityQuery, Int32, JobHandle) to divide each chunk selected by your query into (approximately) equal batches of contiguous entities. For example, if you use a batch count of two, one batch provides access to the first half of the component arrays in a chunk and the other provides access to the second half. When you use batching, the ArchetypeChunk object only provides access to the components in the current batch of entities -- not those of all entities in a chunk.
In general, processing whole chunks at a time (setting batch count to one) is the most efficient. However, in cases where the algorithm itself is relatively expensive for each entity, executing smaller batches in parallel can provide better overall performance, especially when the entities are contained in a small number of chunks. As always, you should profile your job to find the best arrangement for your specific application.
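For example, a system could split each matching chunk into two batches like this (a sketch only; ExpensivePerEntityJob and the query variable are hypothetical names):

```csharp
// Inside a SystemBase.OnUpdate(): split each matching chunk into two
// batches so expensive per-entity work spreads across more worker threads.
var job = new ExpensivePerEntityJob { /* set job fields here */ };
int batchesPerChunk = 2;
Dependency = job.ScheduleParallelBatched(query, batchesPerChunk, Dependency);
```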
To pass data to your Execute function (beyond the Execute parameters), add public fields to the IJobEntityBatchWithIndex struct declaration and set those fields immediately before scheduling the job. You must always pass the component type information for any components that the job reads or writes using a field of type ComponentTypeHandle<T>. Get this type information by calling the appropriate GetComponentTypeHandle<T>(Boolean) function for the type of component.
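Putting this together, a system might set the job's fields just before scheduling. This is a sketch: CopyPositionsSystem is a hypothetical name, and CopyPositionsJob stands in for any IJobEntityBatchWithIndex struct with a ComponentTypeHandle<Translation> field and a NativeArray<float3> output.

```csharp
using Unity.Collections;
using Unity.Entities;
using Unity.Transforms;

public class CopyPositionsSystem : SystemBase
{
    EntityQuery query;

    protected override void OnCreate()
    {
        query = GetEntityQuery(ComponentType.ReadOnly<Translation>());
    }

    protected override void OnUpdate()
    {
        var positions = new NativeArray<Unity.Mathematics.float3>(
            query.CalculateEntityCount(), Allocator.TempJob);

        var job = new CopyPositionsJob
        {
            // Pass type information for every component the job reads or writes.
            TranslationTypeHandle = GetComponentTypeHandle<Translation>(isReadOnly: true),
            Positions = positions
        };
        Dependency = job.ScheduleParallel(query, Dependency);

        // Dispose the output array once the job completes.
        Dependency = positions.Dispose(Dependency);
    }
}
```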
For more information see Using IJobEntityBatch.
Methods
Execute(ArchetypeChunk, Int32, Int32)
Implement the Execute function to perform a unit of work on an ArchetypeChunk representing a contiguous batch of entities within a chunk.
Declaration
void Execute(ArchetypeChunk batchInChunk, int batchIndex, int indexOfFirstEntityInQuery)
Parameters
Type | Name | Description |
---|---|---|
ArchetypeChunk | batchInChunk | An object providing access to a batch of entities within a chunk. |
Int32 | batchIndex | The index of the current batch within the list of all batches in all chunks found by the job's EntityQuery. If the batch count is one, this list contains one entry for each selected chunk; if the batch count is two, the list contains two entries per chunk; and so on. Note that batches are not processed in index order, except by chance. |
Int32 | indexOfFirstEntityInQuery | The index of the first entity in the current batch within the list of all entities in all the chunks found by the job's EntityQuery. |
Remarks
The chunks selected by the EntityQuery used to schedule the job are the input to your Execute function. If you use ScheduleParallelBatched<T>(T, EntityQuery, Int32, JobHandle) to schedule the job, the entities in each matching chunk are partitioned into contiguous batches based on the batchesInChunk parameter, and the Execute function is called once for each batch. When you use one of the other scheduling or run methods, the Execute function is called once per matching chunk (in other words, the batch count is one).
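To illustrate the difference between the two index parameters, the following sketch uses batchIndex (rather than indexOfFirstEntityInQuery) to record per-batch results; the job and field names are hypothetical.

```csharp
using Unity.Collections;
using Unity.Entities;

// Hypothetical job that records how many entities each batch processed.
// batchIndex is unique per batch, so parallel writes do not collide.
public struct CountPerBatchJob : IJobEntityBatchWithIndex
{
    [NativeDisableParallelForRestriction]
    public NativeArray<int> EntityCountPerBatch; // length = total batch count

    public void Execute(ArchetypeChunk batchInChunk, int batchIndex, int indexOfFirstEntityInQuery)
    {
        EntityCountPerBatch[batchIndex] = batchInChunk.Count;
    }
}
```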