Since Delphi XE5 I had been trying to use sensors, but it never worked. With Delphi DX I found new hope, thanks to the many bug fixes since then. So I tried to build an ambitious project with augmented reality.
The first step was to compile the official gyroscope demo and run it on Android 6.0.1.
And the nightmare began...
I will not recount how awful it was to gather some insider information and to discover one bug after another. Instead, I will describe a way to fix some of those problems.
This explanation reflects my individual experience, taken from several tests and code analysis. So if I'm completely wrong, let me know.
For my tests I used the LG Nexus 5X with Android 6.0.1.
2.) The Effort
To build an augmented reality app, where you can rotate a 3D camera inside a 3D environment with the device-camera stream in the background, we first need sensors to detect device movement.
In detail: we need some kind of gyroscope. That's why I started with the gyroscope demo by Embarcadero.
Take a look at this official Embarcadero Youtube video:
Android itself offers some sensors (hardware and logical sensors) for solving this task:
- usage of raw Gyroscope sensor data
- fusion of 3 sensors: Accelerometer, Magnetometer and Gyroscope
- Orientation Sensor
- Rotation Vector, Game Rotation Vector and Geomagnetic Rotation Vector Sensor
After reading lots of blogs and documentation, this list shrinks. Take a look at the official Android documentation:
- Using raw sensor data is definitely a bad idea, because no matter which hardware sensor you choose, each and every one is NOT PERFECT.
- So the way to go is to fuse different sensors. Up to Android API 18 we have to do this completely on our own!
- Until Android 2.2 a simple Orientation Sensor existed, but it is deprecated, so I will no longer refer to it here.
- Since API 18 Android offers the Rotation Vector, Geomagnetic Rotation Vector and Game Rotation Vector Sensors. The difference lies in the hardware components used for fusion. All three types are logical sensors, which means they use specific hardware sensors and combine their data with complex software algorithms. The mainly recommended one is the Rotation Vector Sensor.
So far, so good ...
Following this recommendation, I chose the Rotation Vector Sensor.
I expected it to deliver already-smoothed rotation values.
I thought so ...
In Delphi it can be accessed via the TOrientationSensor component and the Inclinometer3D sensor type.
When implementing a TOrientationSensor in Delphi, we can access TiltX, TiltY and TiltZ, which should return the device rotation information.
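In code this looks roughly like the following. A minimal sketch only: OrientationSensor1 and the FX/FY/FZ fields are assumed names, not part of the official demo.

```pascal
uses
  System.Sensors, System.Sensors.Components;

procedure TForm1.ReadTilt;
begin
  // The sensor must be activated before it delivers values
  OrientationSensor1.Active := True;

  // Each of these property reads goes through its own getter call
  // (which is exactly the problem described below)
  FX := OrientationSensor1.Sensor.TiltX;
  FY := OrientationSensor1.Sensor.TiltY;
  FZ := OrientationSensor1.Sensor.TiltZ;
end;
```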
But if we start the demo on Android 6.0, the result is not satisfying at all. The cube goes completely crazy; it's flickering and unstable.
4.1.) Problem #1: Bug inside the Delphi sensor framework
In my opinion the implementation of Android sensors is not correct!
Besides that, the Game Rotation Vector and Geomagnetic Rotation Vector Sensors are not implemented at all.
Back to the Rotation Vector Sensor implementation: when accessing e.g. the TiltX property, a getter function requests multiple ASensorEvents to get the latest event and finally retrieves the TiltX value.
function TNativeSensor.LastValue: ASensorEvent;
var
  SensorEvent: ASensorEvent;
begin
  // Drain the native event queue, keeping only the newest event
  while ASensorEventQueue_getEvents(FNativeEventQueue, @SensorEvent, 1) > 0 do
    FLastSensorEvent := SensorEvent;
  Result := FLastSensorEvent;
end;
Because this happens on every single property access (TiltX, TiltY, TiltZ), the x, y, z values may come from different sensor events, which leads to inconsistent data.
Between the property accesses the device is still moving, so pitch, roll and yaw keep changing. The correct way would be to gather all three coordinates at once, from one single event.
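A possible fix is to drain the queue exactly once and hand out all three values together. This is only a sketch; the ASensorEvent field layout (`vector.x` etc.) is an assumption based on the NDK headers:

```pascal
procedure TNativeSensor.GetTilt(out X, Y, Z: Single);
var
  Event: ASensorEvent;
begin
  Event := LastValue;   // drain the event queue exactly once
  X := Event.vector.x;  // all three components come from the same
  Y := Event.vector.y;  // ASensorEvent, so they describe one single
  Z := Event.vector.z;  // moment in time and stay consistent
end;
```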
Another inaccuracy: the Rotation Vector Sensor does not deliver a vector (array) of 3 float values. Instead it returns a quaternion of 5 float values: x, y, z, w + c.
For a simple rotation, x, y, z should be enough. Maybe Embarcadero thought so...
But when combining those values with other sensor values, like those from the gyroscope, we need the additional information.
Alright, we could recalculate the missing parts, but why waste performance on what we've already got?
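For completeness: if only the x, y, z components of a unit quaternion were available, the scalar part w could be recovered from the unit-length constraint |q| = 1. Reading w directly from the event simply saves this extra work. A sketch of that recalculation:

```pascal
uses
  System.Math;

// Recover the scalar part of a unit quaternion from its vector part,
// using w = sqrt(1 - (x^2 + y^2 + z^2)).
function RecoverQuaternionW(const X, Y, Z: Single): Single;
begin
  // Max() clamps tiny negative values caused by floating-point rounding
  Result := Sqrt(Max(0.0, 1.0 - (X * X + Y * Y + Z * Z)));
end;
```

Note that this only yields the magnitude of w, not its sign, which is one more reason to take the value the sensor already delivers.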
4.2.) Problem #2: Incompatible SDK & NDK version
Delphi DX Seattle compiles Android apps with older SDK and NDK versions: the SDK for Android 5.1 and NDK 9c.
Most apps compile and run on Android 6+, but for sensors those versions are not compatible.
For Android 6.0 we need at least SDK 24.4.1 and NDK 10e. (I've also tried NDK 11a & 11b, but those are not runnable on my device.)
When creating an SDK version in your Delphi IDE, take care to link the NDK paths to the highest API version (not android-14!).
Tools > Options > SDK-Manager
"android-sdk-windows-current" is a secondary directory, where I check out the current official Android version.
To get the 23.0.2 build tools select the specific package in the SDK Manager of Android and download it.
Take care of the red marked parts. When creating a new SDK Version in Delphi this is not linked correctly by the IDE.
I compiled with the latest Java version, but it should also work with an older one:
- C:\Program Files (x86)\Java\jdk1.8.0_74\bin\KeyTool.exe
- C:\Program Files (x86)\Java\jdk1.8.0_74\bin\JarSigner.exe
4.3.) Problem #3: Android Update Interval
Sensors have an update interval that determines how often values are requested from the hardware.
Android 6+ only supports values LARGER THAN 18019 microseconds here.
So we need to set the UpdateInterval property of every sensor to at least 18020 microseconds.
Inside Delphi we have another inaccuracy: this property should be an integer type instead of a double, but maybe that's just me being meticulous. Very often I see people entering a milliseconds value, as is typical in Delphi, but it has to be microseconds!
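So a safe initialization looks like this (OrientationSensor1 is an assumed component name):

```pascal
// 18020 µs is just above the 18019 µs lower bound on Android 6+.
// UpdateInterval is in MICROSECONDS, not the milliseconds Delphi
// developers usually expect!
OrientationSensor1.UpdateInterval := 18020;
OrientationSensor1.Active := True;
```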
After eliminating problem #1 by redesigning the Android sensor implementation, we get much more stable rotation values.
4.4.) Problem #4: Rotation Vector Sensor still has gyro drift
Nevertheless, against all expectations, the delivered Rotation Vector Sensor data still shows some gyro drift, which leads to unsatisfying results.
Shake your device a little and you will see what I mean.
Here we get back to the fusion approach described above: we need to fuse those values with another hardware sensor.
Alexander Pacha also recognized this inaccuracy and built an open source Java Android app:
He defined two new external logical sensors:
- Improved Orientation Sensor 1 (Sensor fusion of Android Rotation Vector and Calibrated Gyroscope - less stable but more accurate)
- Improved Orientation Sensor 2 (Sensor fusion of Android Rotation Vector and Calibrated Gyroscope - more stable but less accurate)
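The core idea behind such a fusion is a complementary filter: integrate the fast but drifting gyroscope and continuously pull the result towards the slow but drift-free rotation vector. A minimal one-axis sketch of that idea, not Pacha's actual algorithm; the 0.98 coefficient is a typical choice, not a fixed constant:

```pascal
// One-axis complementary filter sketch.
// PrevAngle:   previously fused angle (rad)
// GyroRate:    angular velocity from the gyroscope (rad/s)
// RotVecAngle: absolute angle derived from the Rotation Vector Sensor (rad)
// Dt:          time since the last sample (s)
function FuseAngle(PrevAngle, GyroRate, RotVecAngle, Dt: Single): Single;
const
  Alpha = 0.98; // trust the gyro short-term, the rotation vector long-term
begin
  Result := Alpha * (PrevAngle + GyroRate * Dt) + (1 - Alpha) * RotVecAngle;
end;
```

The gyro term keeps the output responsive and smooth, while the small rotation-vector term slowly corrects the accumulated drift.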
5.) Conclusion
I hope this helped some of you struggling with the same difficulties. Maybe Embarcadero is about to change its libraries, so that no workaround or new sensor library will be needed.
(*) These links are being provided as a convenience and for informational purposes only; they do not constitute an endorsement or an approval by this blog of any of the products, services or opinions of the corporation or organization or individual. This blog bears no responsibility for the accuracy, legality or content of the external site or for that of subsequent links. Contact the external site for answers to questions regarding its content.