For my graduation project I am working on a few Augmented Reality apps with AR Foundation.
In order to get started I followed the tutorial from Elin Höhler:
I followed the exact same steps, except for the build settings, since for now I am building to my iPhone XR running iOS 12.
However, I noticed my touch input wasn't working; or rather, no touches are being registered at all.
To test this I stripped out all the nodes so that I only listen for basic touch input.
Once a touch is registered, I change my UI text from 'No touch detected!' to 'Touch detected!' and change its color to green.
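In plain C#, my stripped-down graph would be roughly equivalent to the script below (statusText is just a placeholder name for the UI Text I'm updating; in my project this is all done with Bolt units rather than a script):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal touch test, roughly mirroring the stripped-down Bolt graph.
public class TouchTest : MonoBehaviour
{
    // Placeholder reference to the UI Text that shows the status message.
    public Text statusText;

    void Update()
    {
        // Same check the Bolt graph performs: is there at least one active touch?
        if (Input.touchCount > 0)
        {
            statusText.text = "Touch detected!";
            statusText.color = Color.green;
        }
    }
}
```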
See the image below:
When I build this to my iPhone it just doesn't work. However, when I connect my iPhone to my laptop and test through the Unity Remote app, it does work. But when I build it to my phone it just doesn't.
Also, when I create a new project without the ARKit and AR Foundation components and build it to my phone, it does work and the text changes when I touch my screen.
Note: when I build the AR Foundation project to my phone it neatly scans surfaces and shows me planes where I can place my game objects, so AR Foundation and ARKit are functioning properly; I just need to get touch input working.
I am really stuck at this point and have already tried placing my Flow Machine and variables on different objects without any luck: on the AR Session Origin, on the camera inside the AR Session Origin, on the AR Session itself, and on an empty GameObject that I renamed Game Manager.
If anyone knows how to fix this or can point me in the right direction, please let me know, as I can't wait to use Bolt in my work.
- iPhone XR (iOS 12)
- MacBook Pro Late 2015 (macOS Mojave)