Q: What is a true space experience?
A true space experience projects computer-generated objects and imagery into the camera's view, with no trigger required.
With true space, the camera only needs to recognize a nearby horizontal surface. Virtual content can then be anchored to that surface - such as a computer-generated chair sitting on the floor, or a coffee mug on a tabletop.
AR Designer makes use of ARKit on iPhones and other iOS devices, and ARCore on Android devices.
Q: Does the size of my model matter for a true space experience?
Because your content will appear to exist in real space in front of the user, you need to be mindful of its size and scale. If, for example, you want to depict a bear standing in a room, you need to be sure the bear is modeled and exported at real-world scale (5 to 8 feet, depending on the species).
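Both ARKit and ARCore treat one unit as one meter, so a model exported in other units will appear at the wrong size. A quick sketch of the conversion, using the bear heights from above (the function name is just for illustration):

```python
# Convert a real-world height in feet to meters, the unit ARKit and ARCore expect.
FEET_TO_METERS = 0.3048

def height_in_meters(feet):
    return feet * FEET_TO_METERS

# A bear standing 5 to 8 feet tall should be exported at roughly:
print(round(height_in_meters(5), 2))  # 1.52
print(round(height_in_meters(8), 2))  # 2.44
```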
The true space user will tap on the floor as displayed on the phone's screen, and the bear (or whatever content you've placed) will appear on the floor at that spot.
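Under the hood, tap-to-place typically works by casting a ray from the camera through the tapped screen point and intersecting it with the detected floor plane. A minimal sketch of that math, assuming the floor sits at y = 0 (the function and values are illustrative, not AR Designer's actual implementation):

```python
# Intersect a ray from the camera with the floor plane (y = 0) to find
# where tapped content should be placed.
def place_on_floor(ray_origin, ray_direction):
    ox, oy, oz = ray_origin
    dx, dy, dz = ray_direction
    if dy >= 0:
        return None  # ray points away from the floor; nothing to place
    t = -oy / dy  # distance along the ray to the plane
    return (ox + t * dx, 0.0, oz + t * dz)

# Camera held 1.5 m above the floor, looking down and forward:
print(place_on_floor((0.0, 1.5, 0.0), (0.0, -1.0, -1.0)))  # (0.0, 0.0, -1.5)
```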
You also have the option to let the user of your experience resize the content once it appears, using a pinch gesture on the phone screen.
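Pinch-to-resize usually amounts to multiplying the model's current scale by the gesture's scale factor, clamped to sensible limits. A hypothetical sketch (the limits are illustrative, not AR Designer's actual values):

```python
# Apply a pinch gesture's scale factor to the placed model, clamped so the
# user can't shrink it to nothing or enlarge it absurdly.
MIN_SCALE, MAX_SCALE = 0.25, 4.0  # illustrative limits only

def apply_pinch(current_scale, pinch_factor):
    new_scale = current_scale * pinch_factor
    return max(MIN_SCALE, min(MAX_SCALE, new_scale))

print(apply_pinch(1.0, 1.5))  # 1.5
print(apply_pinch(3.0, 2.0))  # 4.0 (clamped at the maximum)
```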
Q: Does orientation matter when setting up a true space experience?
You will also want to make sure the bear is facing the right direction when the user taps on that spot. The bear won't be nearly as terrifying if it's facing the other way!
You do have the option to let the AR user rotate the model to face another direction.
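A rotate gesture like this is normally a rotation about the vertical (y) axis. A small sketch of that math, with a made-up "nose" point to show the effect:

```python
import math

# Rotate a point about the vertical (y) axis by the given angle in degrees,
# as a user's rotate gesture would turn a placed model.
def rotate_y(point, degrees):
    r = math.radians(degrees)
    x, y, z = point
    return (x * math.cos(r) + z * math.sin(r), y, -x * math.sin(r) + z * math.cos(r))

# The bear's nose points along +z; a 180-degree turn flips it to -z.
x, y, z = rotate_y((0.0, 1.0, 1.0), 180)
print(round(x, 6), y, round(z, 6))  # 0.0 1.0 -1.0
```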
Q: How can I use the 3D workspace to know if I have the scale and orientation correct?
The 3D workspace features a floor grid with 1-meter spacing. You can reference this to get a sense of how big your model is once it is imported.
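One way to make that comparison concrete is to compute your model's bounding-box dimensions from its vertex positions and read them against the 1 m grid. An illustrative sketch (the vertex data is made up):

```python
# Compute a model's bounding-box size (width, height, depth) from its
# vertex positions, for comparison against the 1 m floor grid.
def bounding_box_size(vertices):
    xs, ys, zs = zip(*vertices)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

# A bear roughly 0.8 m wide, 2.2 m tall, and 1.0 m deep:
bear = [(-0.4, 0.0, -0.5), (0.4, 2.2, 0.5)]
w, h, d = bounding_box_size(bear)
print(h)  # 2.2 -> a bit over two grid squares tall
```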
Regarding orientation, the default workspace point of view (when you open the experience in AR Designer) is approximately where the user's camera will be when they open the experience on their phone.
In the very near future, we'll provide more help regarding scale and point-of-view. Stay tuned!