
-- SOLVED -- Touch input not working when using AR Foundation

InfusedNL 3 years ago updated by Lazlo Bonin (Lead Developer) 3 years ago

Hi there,

For my graduation I am working on a few Augmented Reality apps with AR Foundation.

In order to get started I followed the AR Foundation tutorial from Elin Höhler.

I have followed all the exact same steps except for the build settings, since for now I am building to my iPhone XR running iOS 12.

However, I noticed my touch input wasn't working; or rather, no touches are being registered at all.

So in order to test this, I stripped out all the nodes so that I only listen for basic touch input.

Once a touch is registered, I change my UI text from 'No touch detected!' to 'Touch detected!' and change its color to green.

See the image below:
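In plain C#, what that stripped-down graph does is roughly this (just a sketch for comparison, not code from my project; 'statusText' is whatever UI Text you assign in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Rough C# equivalent of the stripped-down Bolt graph:
// flip the label as soon as any touch is registered.
public class TouchTest : MonoBehaviour
{
    public Text statusText; // assign the 'No touch detected!' Text here

    void Update()
    {
        if (Input.touchCount > 0)
        {
            statusText.text = "Touch detected!";
            statusText.color = Color.green;
        }
    }
}
```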

When I build this to my iPhone it just doesn't work. However, when I connect my iPhone to my laptop and test through the Unity Remote app, it does work. It's only the actual build on my phone that doesn't.

Also, when I create a new project without the ARKit and AR Foundation packages and build it to my phone, it does work: the text changes when I touch the screen.

Note: when I build the AR Foundation project to my phone, it neatly scans surfaces and shows me planes where I can put my game objects. So AR Foundation and ARKit are functioning properly; I just need to get my touch input working.
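(For context, touch-to-place against detected planes looks roughly like this in C#. Fair warning: this sketch uses ARRaycastManager, which comes from a newer AR Foundation version than the preview package I have; in the early previews the raycast API lived on the ARSessionOrigin instead.)

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: on the first frame of a touch, raycast against
// detected planes and place an object on the hit pose.
public class TapToPlace : MonoBehaviour
{
    public ARRaycastManager raycastManager; // lives on the AR Session Origin
    public GameObject objectToPlace;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            Instantiate(objectToPlace, pose.position, pose.rotation);
        }
    }
}
```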

I am really stuck at this point and have already tried placing my Flow Machine and variables on different objects, without any luck: on the AR Session Origin, on the camera inside the AR Session Origin, on the AR Session itself, and on an empty GameObject which I renamed Game Manager.

If there's someone who knows how to fix this, or who can point me in the right direction, then please let me know, as I can't wait to use Bolt in my work.

My setup:

- iPhone XR (iOS12)
- MacBook Pro Late 2015 (macOS Mojave)

Best regards,

Christiaan

Bolt Version:
1.4.0f11
Unity Version:
2018.2.7f1
Platform(s):
iOS

I did another test in what was at that point the latest Unity version: 2018.2.18f1 Personal.

I built a simple project with the only asset imported being Bolt (so no ARKit or AR Foundation). I have the same node setup, with the text changing when I touch my phone's screen. However, the build doesn't register any input in this version either.

In this test I also used the latest Bolt version, 1.4.0f11. While my phone is connected through the Unity Remote app it works perfectly fine, but again, when I build to my iPhone XR running iOS 12 it just doesn't recognize any touches.

Here is a screenshot of my node setup, which works when testing through Unity Remote but fails in the build on my iPhone:


- - - UPDATE - - -

I fixed my issue by following the steps on this page: https://ludiq.io/bolt/manual/advanced/aot

What I had to do before building to my phone was go to:

1. Tools --> Ludiq --> AOT Pre-Build

2. A new popup appears that contains a button with the text: Pre-Build

3. Click the 'Pre-Build' button every time BEFORE you build to your device.

Since I build to iOS, I follow the above steps each time before I create my Xcode build, and now all my touches are being registered. :)
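If you're worried about forgetting the manual step, a small editor script can at least pop the pre-build window open before every build (a sketch, untested; the menu path is taken from the steps above and may differ between Bolt versions, and I don't know whether Bolt exposes an API to run the pre-build directly):

```csharp
using UnityEditor;
using UnityEditor.Build;
using UnityEditor.Build.Reporting;

// Editor-folder script: opens Bolt's AOT pre-build window
// before every player build so the step can't be forgotten.
// Assumption: the menu path matches your Bolt version.
class BoltAotPreBuildReminder : IPreprocessBuildWithReport
{
    public int callbackOrder { get { return 0; } }

    public void OnPreprocessBuild(BuildReport report)
    {
        EditorApplication.ExecuteMenuItem("Tools/Ludiq/AOT Pre-Build");
    }
}
```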

I hope this may be of use to anyone else struggling to get AOT (Ahead-Of-Time) compilation to work. As far as I understand it, iOS doesn't allow just-in-time compilation, so Bolt has to generate stub code ahead of time for everything the graphs call via reflection; without the pre-build step, those code paths simply don't exist in the device build.

Cheers!

Answered

Glad you figured it out :) I keep trying to make that page pop up more, and hopefully all of this will be integrated automatically in the build process in Bolt 2.