Description
I was wondering about the performance of removing and adding entities, as these feel like they could be quite expensive operations. If destroying an entity simply punches multiple holes in a chunk (one per fragment array), iterating over all entities in that chunk becomes non-trivial. It could also hurt cache efficiency, since skipping holes most likely means loading cache lines whose contents you then discard. Then again, preventing holes by moving lots of entities around is quite inefficient as well. I can't think of any other way of dealing with these issues; do you know how Epic implemented it?
Have you done any benchmarks so far, or do you have any advice for scenarios where entities are constantly destroyed and created? I thought about adding some sort of "IsDead" tag instead of destroying entities, and, when a new entity is needed, first querying for dead entities and "reviving" one by removing the tag again (and resetting some fragment values). But this would be quite inefficient as well, since every time a new entity should be created, one must first iterate over all entities (or at least a subset) looking for that specific tag. Furthermore, how does the engine even handle people adding tags to entities at arbitrary locations in a chunk? This would constantly "invalidate" existing queries, which massively impacts cache efficiency.
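The scan-for-a-tag cost in the pooling idea above can be avoided with a plain free-list: destroyed slots are pushed onto a stack and popped on creation, so reuse is O(1) with no query at all. Again, this is a generic sketch under my own naming, not something I've verified Mass does internally:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical entity pool using a free-list. Freed slot indices go onto a
// stack; allocation pops the stack before growing, so reviving an entity
// never requires scanning for an "IsDead" tag.
class EntityPool {
public:
    // Returns a slot index, reusing a previously freed slot if available.
    uint32_t Allocate() {
        if (!FreeSlots.empty()) {
            const uint32_t Slot = FreeSlots.back();
            FreeSlots.pop_back();
            Alive[Slot] = true;
            return Slot;
        }
        Alive.push_back(true);
        return static_cast<uint32_t>(Alive.size() - 1);
    }

    void Free(uint32_t Slot) {
        Alive[Slot] = false;
        FreeSlots.push_back(Slot);
    }

    bool IsAlive(uint32_t Slot) const { return Alive[Slot]; }

private:
    std::vector<bool>     Alive;      // liveness flag per slot
    std::vector<uint32_t> FreeSlots;  // stack of reusable slot indices
};
```

A free-list also sidesteps the query-invalidation concern, because slots never change archetype on death; the trade-off is that "dead" entities still occupy chunk memory and must be masked out of processing.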
It's quite difficult to grasp how this whole Mass system can be efficient, because it has so few restrictions. It's hard to believe it still performs well when people add tags or fragments in arbitrary places (potentially creating new archetypes each time), constantly destroy and create entities, and generally mess with entities all over the place.