
AppGameKit Classic Chat / Detect Movement using Accelerometer

Jack
Posted: 1st Nov 2015 20:28 Edited at: 8th Dec 2015 16:41
Hello folks!

My goal is to use the accelerometer to track the device's movement.
The idea behind this is very powerful: controls for VR apps, or using the smartphone as a 3D mouse paired with software running on the computer.
I thought the idea was worth trying out.
There were some threads around with the same question, but without results. I did some research on the accelerometer feed inside AppGameKit combined with a timer to get the difference,
but that experiment has failed so far. I am still not able to calculate the linear acceleration.

I tried to translate the code from here:
https://androidcookbook.com/Recipe.seam?recipeId=529





The main concept looks like this (a rough sketch follows the steps):

[ Get Accelerometer data ]
[ Get Timer data ]
[ Convert rotation Accelerometer data to movement data ]
[ Calculate distance from movement data ]
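
A rough sketch of that loop in AGK Tier 1 (untested, shown for the X axis only; the raw values still contain gravity, so this drifts immediately, which is exactly the problem the rest of this thread deals with):

// integrate the raw acceleration twice to estimate movement along X
velX# = 0.0
posX# = 0.0
lastTime# = Timer()

do
    dt# = Timer() - lastTime#
    lastTime# = Timer()

    accX# = GetRawAccelX()        // raw accelerometer value, gravity included

    velX# = velX# + accX# * dt#   // acceleration -> velocity
    posX# = posX# + velX# * dt#   // velocity -> distance

    Print("accX: " + Str(accX#))
    Print("posX: " + Str(posX#))
    Sync()
loop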


Any ideas on that one?

EDIT:

The code so far:





Resources:
Stackoverflow code ideas
Accelerometer - Wikipedia

Jack
Posted: 27th Nov 2015 21:58 Edited at: 28th Nov 2015 04:17
After some hard weeks of learning the theory, I am going to try to push this forward.

First, there are "virtual sensors" like the linear acceleration sensor. They can be used from API level 11 (afaik) and calculate the linear acceleration by removing gravity from the raw acceleration data.
So the idea behind this is:

raw acceleration - gravity => linear acceleration

There are better approaches based on sensor fusion, though. The problem with linear acceleration estimation is that
this process no longer gives correct data after about 20 seconds. The error bias is just too great to get an accurate device offset.
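
For reference, the simplest form of that raw-minus-gravity idea in AGK Tier 1 (untested sketch; a low-pass filter estimates the gravity part, and alpha# is just a guess that needs tuning per device):

// low-pass the accelerometer to estimate gravity, subtract it to get linear acceleration
gravX# = 0.0
gravY# = 0.0
gravZ# = 0.0
alpha# = 0.8    // low-pass weight, needs tuning

do
    accX# = GetRawAccelX()
    accY# = GetRawAccelY()
    accZ# = GetRawAccelZ()

    // gravity follows the slow part of the signal
    gravX# = alpha# * gravX# + (1.0 - alpha#) * accX#
    gravY# = alpha# * gravY# + (1.0 - alpha#) * accY#
    gravZ# = alpha# * gravZ# + (1.0 - alpha#) * accZ#

    // what is left over is (roughly) the linear acceleration
    linX# = accX# - gravX#
    linY# = accY# - gravY#
    linZ# = accZ# - gravZ#

    Print("linear: " + Str(linX#) + ", " + Str(linY#) + ", " + Str(linZ#))
    Sync()
loop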

My first attempt is to create some graphs to compare different sensors. I found some interesting code with a high-pass filter.

Compare, "RawRotationVector" and "fused Orientation"




Some fancy experiment:



It does work, but it has some vibration problems, which I am trying to compensate for somehow. My idea would be to add a sequence of sensor outputs together and divide the sum by the number of data points, i.e. a moving average (sketched below).
That should work for my needs, but I would be glad to hear of a better method.
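
Roughly what I mean, as an untested AGK Tier 1 sketch (a plain moving average over the last WINDOW samples, X axis only; the window size is a guess):

#constant WINDOW 10

// ring buffer of the last WINDOW accelerometer samples
dim samples#[WINDOW]
index = 0
count = 0

do
    samples#[index] = GetRawAccelX()
    index = Mod(index + 1, WINDOW)
    if count < WINDOW then count = count + 1

    // add the stored samples together and divide by the number of points
    sum# = 0.0
    for i = 0 to count - 1
        sum# = sum# + samples#[i]
    next i
    smoothX# = sum# / count

    Print("smoothed X: " + Str(smoothX#))
    Sync()
loop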

Jack
Posted: 30th Nov 2015 18:27 Edited at: 30th Nov 2015 21:56
There is still some vibration, but this one works quite well:

Current V1:



Sensitive version:

mrradd
Posted: 30th Nov 2015 22:56
@Jack I just copy-pasted the code, and it was pretty fun to play with.
-mrradd-
Jack
Posted: 7th Dec 2015 11:33 Edited at: 8th Dec 2015 16:55
Thank you mrradd! I will try to include more examples and visualisations.

The road so far:

I am pretty happy with my linear acceleration integration algorithm (I have a new one!), but there is still a problem. I searched for different methods to calculate the gravity vector and finally realised that only sensor fusion can help me out of this.
It's very hard to keep an eye on the data, so I designed a 3D graph for the accelerometer and magnetometer.






The main problem will be to get the green line above the red line.




I still have no idea exactly how to fuse the magnetic sensor with the accelerometer; the usual construction is sketched below, but I have not tried it yet.
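
The textbook construction (the same one Android's getRotationMatrix uses): take the accelerometer vector as "down", cross the magnetic vector with it to get "east", cross those to get "north", and the heading falls out of the two. Untested AGK Tier 1 sketch, and I am assuming magnetometer commands named GetRawMagneticX/Y/Z here, so check the sensor command list for the real names:

do
    // gravity estimate straight from the accelerometer (only valid while moving slowly)
    gx# = GetRawAccelX()
    gy# = GetRawAccelY()
    gz# = GetRawAccelZ()

    // magnetometer reading (command names assumed, see note above)
    mx# = GetRawMagneticX()
    my# = GetRawMagneticY()
    mz# = GetRawMagneticZ()

    // east = magnetic x gravity (cross product)
    ex# = my# * gz# - mz# * gy#
    ey# = mz# * gx# - mx# * gz#
    ez# = mx# * gy# - my# * gx#

    // north = gravity x east
    nx# = gy# * ez# - gz# * ey#
    ny# = gz# * ex# - gx# * ez#
    nz# = gx# * ey# - gy# * ex#

    // normalise both, then read the azimuth from the Y components (like Android's getOrientation)
    el# = Sqrt(ex# * ex# + ey# * ey# + ez# * ez#)
    nl# = Sqrt(nx# * nx# + ny# * ny# + nz# * nz#)
    if el# > 0 and nl# > 0
        heading# = ATanFull(ey# / el#, ny# / nl#)   // degrees
        Print("heading: " + Str(heading#))
    endif
    Sync()
loop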



EDIT: To spice this thread up:


Jack
Posted: 7th Dec 2015 17:27 Edited at: 7th Dec 2015 19:29
This example shows the relation between the magnetic sensor data and the accelerometer data:

The code is a basic visual example of the sensor data. The next step would be sensor fusion.









Edit:
an even better version:



But this still does not work very well with the magnetic sensor. The gyro can be calibrated to the accelerometer, but it drifts over time.
Are there any solutions for that?


CJB
Posted: 8th Dec 2015 00:27
You want to implement something like a Kalman filter (see this for a simple(?) explanation). The "fused" sensor commands that are available in newer Android releases perform very differently across different handset models, so they are useless for our creative needs. I think we'll either have to code our own sensor fusion commands or nag Paul into writing some for us as built-in commands.

SENSOR_TYPE_GAME_ROTATION_VECTOR gives us fused Accelerometer and Gyroscope, but without also fusing the magnetometer as a stabiliser, it suffers badly from drift (it also suffers from that awful sideways wobble when you tilt quickly up and down which needs the implementation of a filter to sort).
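
A cheap stand-in until then is a complementary filter: integrate the gyro for responsiveness, then pull the result slowly towards the accelerometer angle so the drift can't build up. Untested AGK sketch for one axis; the k# weight, the axis pairing and the radians-to-degrees conversion are all guesses that would need checking on a real handset:

// complementary filter on one axis (pitch): gyro for speed, accelerometer for stability
pitch# = 0.0
k# = 0.98    // how much to trust the gyro each frame, needs tuning

do
    dt# = GetFrameTime()

    // gyro rate around X - assuming AGK reports radians per second, so convert to degrees
    rate# = GetRawGyroVelocityX() * 180.0 / 3.14159

    // absolute pitch from gravity alone (noisy, but it never drifts)
    accPitch# = ATanFull(GetRawAccelZ(), GetRawAccelY())

    // integrate the gyro, then nudge the result towards the accelerometer angle
    pitch# = k# * (pitch# + rate# * dt#) + (1.0 - k#) * accPitch#

    Print("pitch: " + Str(pitch#))
    Sync()
loop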

I'm going to keep an eye on this thread as I had a go at doing this a few months back and gave up due to lack of available time. Would be great to see someone crack it! Keep it up Jack!

V2 T1 (Mostly)
Phone Tap!
Uzmadesign
Jack
Posted: 8th Dec 2015 07:42 Edited at: 8th Dec 2015 16:46
Hello CJB,

It's good to hear that this problem is more common than I thought.
The magnetic sensor could be a very good addition to our VR portfolio. SENSOR_TYPE_GAME_ROTATION_VECTOR is a good idea to start with. I will try to use the rotation vector for a gravity calculation attempt.

I have to create software that can measure some kind of linear offset relative to the first measurement. The cool thing about this work is: we can detect steps!
Imagine a sports hall filled with VR players running around - multiplayer!
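
Step detection itself does not even need the full fusion; a crude version is just a spike detector on the acceleration magnitude. Untested sketch (the threshold and the 0.3 second lock-out are guesses):

// count steps as spikes in the overall acceleration magnitude
steps = 0
lastStep# = 0.0
threshold# = 1.3    // assumes GetRawAccel* reads about 1.0 at rest; scale up if the values are in m/s^2

do
    ax# = GetRawAccelX()
    ay# = GetRawAccelY()
    az# = GetRawAccelZ()
    mag# = Sqrt(ax# * ax# + ay# * ay# + az# * az#)

    // a spike counts as a step, but not more than one step every 0.3 seconds
    if mag# > threshold# and Timer() - lastStep# > 0.3
        steps = steps + 1
        lastStep# = Timer()
    endif

    Print("steps: " + Str(steps))
    Sync()
loop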



I am very sure that the magnetic sensor can be used to remove the gyro drift. But the magnetic sensor does not offset exactly like the acceleration vector; the example below explains it. However, if you calibrate the sensors
on a table, then move and rotate the device and return it to the same place, the vectors will still be identical.
My idea is to add some maths to the offset of the vector. The behaviour of the inconsistent offset reminds me of a cos().







Recent code:
(magnetic sensor can be calibrated)


Target: All lines should move exactly the same.


EDIT: I have added the object and camera rotation vector




Edit: To give some more insight, I'll post some screenshots:
(my computer has no sensors, so... 0.0..^^)


Jack
Posted: 8th Dec 2015 16:47 Edited at: 10th Dec 2015 06:49
I have updated the first post with the concept.

The clearest code for the 3d visualisation can be found here:




The following code only uses two axes of the magnetic sensor to compare them with the fused orientation vector.
The interesting part is that the magnetic sensor and the orientation vector are the same once the device is at a 90 degree angle to the ground.
Does the fused orientation (GetRawRotationVector) use the magnetic sensor at least once? I think so.
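
One way to check it is to turn the fused rotation vector into a yaw angle and plot it next to the magnetic heading. Untested sketch, assuming GetRawRotationVectorX/Y/Z/W return quaternion components (which is what the Android sensor underneath delivers):

// convert the fused rotation-vector quaternion to a yaw angle in degrees
do
    qx# = GetRawRotationVectorX()
    qy# = GetRawRotationVectorY()
    qz# = GetRawRotationVectorZ()
    qw# = GetRawRotationVectorW()

    // standard quaternion-to-yaw formula; sign and zero point may differ per device orientation
    yaw# = ATanFull(2.0 * (qw# * qz# + qx# * qy#), 1.0 - 2.0 * (qy# * qy# + qz# * qz#))

    Print("fused yaw: " + Str(yaw#))
    Sync()
loop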

x,z - Magnetic sensor:



