Hello, and thank you for the detailed questions! There are several parts to the answers, as they touch on existing features, experimental features, and features still on the roadmap.
TLDR:
- Landscape generation - Experimental
- Procedural BP placement - Yes
- AI/Navmesh - Yes (Unreal)
- Object state - Yes
- Ease of use - Varies
More detail:
Landscapes have always been a long-term goal for Apparance, and recently I’ve been lucky to be able to bring some of the experimental work in this area into play (see the Garden Generation prototype). Specifically, it uses some image-processing operators to provide masks for scattered object placement (flowers/grass), e.g. a noise mask is applied to mix flower types in the beds. This ability to drive placement and generation from images is part of the plan to support landscape generation (heightmaps in particular). There is a lot more I want to do (procedural textures, parameterised materials, extended image operators, etc.) that I hope to roll out over time. The next update will include some of these operators in experimental form (i.e. for trying out; they may change).
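To make the mask-driven scattering idea concrete, here’s a minimal sketch of the logic in plain C++ (this is not how Apparance procedures are authored, and SampleNoise is just a stand-in for whatever mask or image operator you feed in):

```cpp
#include <cmath>
#include <cstdlib>
#include <vector>

struct Placement { float X, Y; int FlowerType; };

// Placeholder for a noise/image mask sampled in bed space, returning 0..1.
float SampleNoise(float X, float Y)
{
    return 0.5f + 0.5f * std::sin(X * 0.13f) * std::cos(Y * 0.17f);
}

// Scatter Count points over a flower bed and let the mask decide the mix of types.
std::vector<Placement> ScatterFlowers(int Count, float BedWidth, float BedHeight)
{
    std::vector<Placement> Out;
    for (int i = 0; i < Count; ++i)
    {
        const float X = BedWidth  * (std::rand() / float(RAND_MAX));
        const float Y = BedHeight * (std::rand() / float(RAND_MAX));
        // Low mask values place flower type 0, high values type 1.
        const int Type = SampleNoise(X, Y) < 0.5f ? 0 : 1;
        Out.push_back({ X, Y, Type });
    }
    return Out;
}
```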
Landscapes as heightfields are suitable for various 2D landscape features, but 3D features (like caves) are trickier and probably need a voxel-based solution (which is not on the roadmap yet). Having said that, because you have a high level of control over the generated geometry, with some clever generation techniques you could integrate features like caves or openings in the landscape that are built using other methods (e.g. an underground room/dungeon generator with its door integrated into a hole in the heightfield). This will probably require further extension of the operator library and some experimentation.
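As a rough illustration of the hole-in-the-heightfield approach (game-side C++, not an Apparance operator), you could flag the heightfield cells under an entrance footprint so the landscape mesher skips them and the room geometry plugs the gap:

```cpp
#include <vector>

struct Heightfield
{
    int Width = 0, Height = 0;
    std::vector<float> Heights;  // Width * Height samples
    std::vector<bool>  Carved;   // Width * Height flags; true = skip when meshing
};

// Mark the cells under an entrance footprint so the landscape mesh leaves a hole
// there; the dungeon/room generator then builds its own geometry into the gap.
void CarveEntrance(Heightfield& HF, int MinX, int MinY, int MaxX, int MaxY)
{
    for (int Y = MinY; Y <= MaxY; ++Y)
        for (int X = MinX; X <= MaxX; ++X)
            if (X >= 0 && X < HF.Width && Y >= 0 && Y < HF.Height)
                HF.Carved[Y * HF.Width + X] = true;
}
```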
Apparance is fully runtime, so you can build patches of landscape as you need them. There are limits to how much you can instance at once, which you will find by experimentation, as Unreal tends to stall a little when, for example, you create lots of mesh instances in one go; however, there is ongoing work to improve this. You will need to implement a system to manage patches of land: placing, generating, and showing/hiding them as needed. Similarly, if you need to generate at different detail levels you will need to manage and switch those in/out as the player moves around. I will be providing examples of more sophisticated use cases like this in the future.
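A minimal sketch of the kind of patch management I mean (your own game code, not something Apparance provides): keep a set of generated patches keyed by grid coordinate, request new ones around the player, and release ones that fall out of range. RequestPatch/ReleasePatch are placeholders for however you spawn and tear down your procedural entities.

```cpp
#include <cmath>
#include <cstdint>
#include <unordered_set>

using PatchKey = std::uint64_t;

static PatchKey MakeKey(int X, int Y)
{
    return (PatchKey(std::uint32_t(X)) << 32) | std::uint32_t(Y);
}

class PatchManager
{
public:
    // Call each frame (or on a timer) with the player's world position.
    void Update(float PlayerX, float PlayerY)
    {
        const int CX = int(std::floor(PlayerX / PatchSize));
        const int CY = int(std::floor(PlayerY / PatchSize));

        // Work out which patches should exist around the player right now.
        std::unordered_set<PatchKey> Wanted;
        for (int Y = CY - Radius; Y <= CY + Radius; ++Y)
            for (int X = CX - Radius; X <= CX + Radius; ++X)
                Wanted.insert(MakeKey(X, Y));

        // Spawn patches that have come into range.
        for (PatchKey Key : Wanted)
            if (!Active.count(Key))
                { RequestPatch(Key); Active.insert(Key); }

        // Release patches that have drifted out of range.
        for (auto It = Active.begin(); It != Active.end(); )
            if (!Wanted.count(*It)) { ReleasePatch(*It); It = Active.erase(It); }
            else ++It;
    }

private:
    void RequestPatch(PatchKey /*Key*/) { /* place + generate the patch entity here */ }
    void ReleasePatch(PatchKey /*Key*/) { /* hide or destroy the patch entity here */ }

    float PatchSize = 10000.0f;  // world units per patch
    int   Radius    = 2;         // patches kept around the player in each direction
    std::unordered_set<PatchKey> Active;
};
```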
You can place BP actors into the scene with Apparance as part of the procedure operations, and these can themselves be procedural entities if needed, each running its own set of procedures fed parameters from the placement logic in the landscape generator. So yes, you could place buildings on your generated terrain.
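For illustration only (ordinary Unreal C++, not the Apparance placement API, and the class and parameter names are made up), this is the kind of actor a landscape-level placement pass might spawn and parameterise:

```cpp
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "ProceduralBuilding.generated.h"

// Illustrative only: an ordinary Unreal actor exposing the parameters a
// landscape-level placement pass might fill in when it spawns buildings.
UCLASS()
class AProceduralBuilding : public AActor
{
    GENERATED_BODY()

public:
    // Parameters supplied by the placement logic.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category="Building")
    int32 Seed = 0;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category="Building")
    FVector2D Footprint = FVector2D(800.f, 600.f);

    // Rebuild this building's visuals from the current parameters.
    UFUNCTION(BlueprintCallable, Category="Building")
    void Rebuild() { /* run the building procedure / spawn meshes here */ }
};
```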
Unreal provides a nav-mesh builder which understands the geometry you’ve generated and can provide your AI with navigation around the procedural landscape. This can be updated at runtime based on the meshes you generate.
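If you’re spawning your generated meshes from C++, a sketch like the one below can nudge the navmesh to refresh. It assumes “Runtime Generation” is set to Dynamic in Project Settings > Navigation Mesh, and that your engine version has the UpdateActorInNavOctree helper; in many setups the dynamic navmesh picks up new collision automatically as components register, so treat this as optional.

```cpp
#include "NavigationSystem.h"
#include "GameFramework/Actor.h"

// After spawning or rebuilding procedural geometry, re-register the actor's
// collision in the nav octree so the dynamic navmesh rebuilds the tiles it touches.
void RefreshNavmeshFor(AActor* GeneratedActor)
{
    if (GeneratedActor)
    {
        UNavigationSystemV1::UpdateActorInNavOctree(*GeneratedActor);
    }
}
```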
For objects that have persistent state, you will need to manage that yourself, but it could be as simple as your procedural tree Blueprint using its world position to look up/store some state about itself. This state can drive a parameter on the procedure which controls whether it generates (or places) a mesh for a complete tree, just the tree stump, or stages in between. Interacting with the BP changes the state, rebuilds the visuals to reflect that change, and then stores the new state for later. If you navigate away from the area (and it is destroyed) and come back, when the tree generates again the BP can read the state to see what it should look like. Would that work?
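To sketch that idea out (game-side code, not something Apparance provides, and names like GrowthStage are purely illustrative), the state store could look something like this, with the returned stage fed into the tree procedure as a parameter:

```cpp
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <unordered_map>

// Key a tree's state on its (quantised) world position so the same tree maps
// to the same entry every time the area regenerates.
struct GridKey
{
    std::int32_t X, Y, Z;
    bool operator==(const GridKey& O) const { return X == O.X && Y == O.Y && Z == O.Z; }
};

struct GridKeyHash
{
    std::size_t operator()(const GridKey& K) const
    {
        return std::size_t((std::int64_t(K.X) * 73856093) ^
                           (std::int64_t(K.Y) * 19349663) ^
                           (std::int64_t(K.Z) * 83492791));
    }
};

class TreeStateStore
{
public:
    static GridKey KeyFor(float WX, float WY, float WZ)
    {
        const float Cell = 10.f;  // snap positions to a coarse grid
        return { std::int32_t(std::floor(WX / Cell)),
                 std::int32_t(std::floor(WY / Cell)),
                 std::int32_t(std::floor(WZ / Cell)) };
    }

    // 0 = stump ... 3 = full tree; unknown trees default to fully grown.
    int GetGrowthStage(const GridKey& Key) const
    {
        auto It = Stages.find(Key);
        return It != Stages.end() ? It->second : 3;
    }

    void SetGrowthStage(const GridKey& Key, int Stage) { Stages[Key] = Stage; }

private:
    std::unordered_map<GridKey, int, GridKeyHash> Stages;
};
```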
Apparance is a very flexible system, providing many low-level operators from which to build up your procedure library and generation systems. Whilst this means you do need to build a lot yourself, it gives you complete control over how things operate, with all the logic and decisions you need. The ability to customise and parameterise at a low level means you are building exactly the object you need, and you can get quite intricate about how inputs affect the output. This does mean it’s quite a technical process, and node-graph procedure construction is very close to a functional programming language in operation and method. However, I believe this is more than made up for by the visual and fully interactive nature of the editing environment, which allows instant feedback as you create and tweak, and makes it simple to drop in visualisations of the internal logic you build.
I would recommend you spend some time exploring the examples, tinkering with them to see what does what, and then building some test procedures of your own. At the moment there are only a few basic tutorials and examples, but this will be expanded in the future. If you need to discuss anything, from how to approach things down to how a particular feature works, I encourage you to join the Apparance Forums, where other users and I will be able to help you out.
As with many software tools, their application in real-world settings helps shape them more than any theoretical premises they are built on. This is especially true for Apparance: first in Unity with the Gloomhaven project, then through various other projects along the way, to the Unreal version, which is just starting to stretch its legs. The more projects that use it, and the more I get involved with helping, the more mature the tool gets. This is especially true of the more experimental areas being explored; the projects it is used on drive the innovation. I hope you will be able to join in with this process so that together we can make amazing things.
Hopefully this is a helpful insight into Apparance and goes some way to answering your questions.
Sam