
Waveform-Guide Transformation of IMU Measurements for Smartphone-Based Localization

Citation Author(s):
Kyuwon
Han
Yonsei University
Seung Min
Yu
Korea Railroad Research Institute
Seung-Woo
Ko
Inha University
Seong-Lyun
Kim
Yonsei University
Submitted by:
Kyuwon Han
Last updated:
Tue, 04/19/2022 - 00:11
DOI:
10.21227/x9s8-6558

Abstract 

We collect IMU measurements under three different carrying patterns: fixing the smartphone in front of the chest (chest), swinging the smartphone in the hand (swing), and putting the smartphone in a pocket (pocket). We use a Google Pixel 3XL for the chest pattern and a Google Pixel 3a for the swing and pocket patterns. The sampling frequency of each measurement is fixed at 15 Hz. We collect measurements for 111 paths in total, categorized into 4 types, and partition them into 84 training paths and 27 testing paths. It took 10 hours to collect all datasets.

Instructions: 

The raw data consist of three carrying patterns, holding (chest), swing, and pocket, and each pattern includes the following path types (a small helper sketch follows this list):

 - type 1 : repeat 10-step walking & turn left 360 degrees

 - type 2 : repeat 10-step walking & turn left 180 degrees

 - type 3 : repeat 10-step walking & turn left 275 degrees

 - type 4 : repeat 10-step walking & turn left 90 degrees

 - type 5 : repeat 10-step walking & turn left 315 degrees

 - type 6 : repeat 10-step walking & turn left 225 degrees

 - type 7 : repeat 10-step walking & turn left 135 degrees

 - type 8 : repeat 10-step walking & turn left 45 degrees

 - type 12 : repeat 10-step walking & turn left 180 and 360 degrees

 - type 11 : repeat 10-step walking & turn right 360 degrees

 - type 22 : repeat 10-step walking & turn right 180 degrees

 - type 33 : repeat 10-step walking & turn right 275 degrees

 - type 44 : repeat 10-step walking & turn right 90 degrees

 - type 55 : repeat 10-step walking & turn right 315 degrees

 - type 66 : repeat 10-step walking & turn right 225 degrees

 - type 77 : repeat 10-step walking & turn right 135 degrees

 - type 88 : repeat 10-step walking & turn right 45 degrees
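
The single-digit codes (1-8) are left turns and the doubled-digit codes (11-88) are the corresponding right turns. A minimal MATLAB sketch of a hypothetical helper (not part of the dataset scripts) that maps a type code to its turn direction and angle:

function [direction, angleDeg] = typeToTurn(typeCode)
    % Map a path-type code to its turn direction and angle, following the list above.
    angles = [360 180 275 90 315 225 135 45];       % angles for base types 1..8
    if typeCode == 12                               % type 12 mixes 180- and 360-degree left turns
        direction = 'left';  angleDeg = [180 360];
    elseif typeCode >= 11                           % doubled digits (11, 22, ..., 88): right turns
        direction = 'right'; angleDeg = angles(typeCode / 11);
    else                                            % single digits (1..8): left turns
        direction = 'left';  angleDeg = angles(typeCode);
    end
end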

 

-------------------------------------------------------------------------------------------------------

The raw data are first time-aligned (the accelerometer and gyroscope samples are not synchronized) and then passed through a high-pass and a low-pass filter.
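
A minimal MATLAB sketch of these two steps, assuming acc_time/acc_xyz and gyr_time/gyr_xyz hold the raw timestamps and samples; the high-pass step and its window are assumptions, while the moving-average windows follow the [2, 6] defaults listed below.

t       = acc_time;                                  % use the accelerometer clock as the common time base
gyr_xyz = interp1(gyr_time, gyr_xyz, t, 'linear');   % time-align the gyroscope samples to t
acc_hp  = acc_xyz - movmean(acc_xyz, 15);            % simple high-pass: subtract the slowly varying (gravity) component
acc_lp  = movmean(acc_hp, 2);                        % moving-average LPF, window 2 (accelerometer default)
gyr_lp  = movmean(gyr_xyz, 6);                       % moving-average LPF, window 6 (gyroscope default)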

The train and test data contain the following fields (a loading sketch follows the field descriptions):

 

{'time'; 'Accm_x'; 'Accm_y'; 'Accm_z'; 'Accm_mag'; 'Gyrm_x'; 'Gyrm_y'; 'Gyrm_z'; 'Gyrm_mag'; ...
 'Accm_Out'; 'Gyrm_Out'; 'heading_change'; 'length_diff'; 'theta'; ...
 'Accm_x_raw'; 'Accm_y_raw'; 'Accm_z_raw'; 'Gyrm_x_raw'; 'Gyrm_y_raw'; 'Gyrm_z_raw'}

 

'time' is the measurement timestamp.

'Accm_x', 'Accm_y', 'Accm_z', 'Gyrm_x', 'Gyrm_y', and 'Gyrm_z' are the pre-processed data described in [1].

'Accm_mag' and 'Gyrm_mag' are the magnitudes of the accelerometer and gyroscope measurements, e.g., Accm_mag = sqrt(Accm_x_raw.^2 + Accm_y_raw.^2 + Accm_z_raw.^2).

'Accm_Out' and 'Gyrm_Out' are the reference (ground-truth) waveforms.

'heading_change' is the heading difference between consecutive time steps.

'length_diff' is the length difference between consecutive time steps.

'theta' is the heading direction estimated for the holding pattern using the method of [2].

'Accm_x,y,z_raw' and 'Gyrm_x,y,z_raw' are the time-synchronized raw measurements.
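
As an illustration, a minimal MATLAB sketch of reading one processed file and re-computing the magnitude channels; the file name and .mat layout are assumptions, so adapt them to the actual data files.

S = load('train_type2_path1.mat');                                     % hypothetical file name
Accm_mag = sqrt(S.Accm_x_raw.^2 + S.Accm_y_raw.^2 + S.Accm_z_raw.^2);  % per-sample accelerometer magnitude
Gyrm_mag = sqrt(S.Gyrm_x_raw.^2 + S.Gyrm_y_raw.^2 + S.Gyrm_z_raw.^2);  % per-sample gyroscope magnitude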

 

-------------------------------------------------------------------------------------------------------

The train data set is as follows (both splits are also summarized as MATLAB index vectors after the test list):

 - type 2: path 1,2 #2

 - type 4: path 1:3, 14:19 #9 

 - type 7: path 1:3, 5:12 #11

 - type 8: path 2,3, 5:13 #11

 - type 12: path 1:5, 7,8, 12 #8

 - type 22: path 1:9, 11 #10

 - type 44: path 5,6, 8:12 #6

 - type 77: path 1:12 #12

 - type 88: path 1:15 #15

The total # is 84.

 

The test data set is

 - type 2: path 4 #1

 - type 4: path 4, 20,21 #3

 - type 7: path 13:16 #4

 - type 8: path 4, 14:16 #4

 - type 22: path 10, 12:14 #4

 - type 44: path 7, 13 #2

 - type 77: path 13:16 #4

 - type 88: path 17, 19:21, 23 #5

The total # is 27.
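
The same split expressed as MATLAB index vectors (the variable and field names are illustrative only):

trainPaths = struct( ...
    'type2',  [1 2], ...
    'type4',  [1:3 14:19], ...
    'type7',  [1:3 5:12], ...
    'type8',  [2 3 5:13], ...
    'type12', [1:5 7 8 12], ...
    'type22', [1:9 11], ...
    'type44', [5 6 8:12], ...
    'type77', 1:12, ...
    'type88', 1:15);

testPaths = struct( ...
    'type2',  4, ...
    'type4',  [4 20 21], ...
    'type7',  13:16, ...
    'type8',  [4 14:16], ...
    'type22', [10 12:14], ...
    'type44', [7 13], ...
    'type77', 13:16, ...
    'type88', [17 19:21 23]);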

-------------------------------------------------------------------------------------------------------

Conversion of raw data to train and test data

 

1) Open the MATLAB file 'DataAcquisition_Kinetic_v2_1_full_raw' in the raw_data folder.

2) Set the parameters and file paths (a parameter sketch follows step 3):

 - Train stride 2, Test stride 6.

 - isTest = true for test data generation.

 - isCategory = true for train data generation.

 - window size is the LPF parameter for the accelerometer and gyroscope. default: [2, 6]

 - isRRM = true for train data generation.

3) Set the path and type for data generation.
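
A minimal sketch of the parameter block in step 2; the variable names follow this read-me, and the exact names inside the script may differ.

isTest     = false;       % true: generate test data (use stride 6)
isCategory = true;        % true: generate train data
isRRM      = true;        % true: generate train data
stride     = 2;           % 2 for train data, 6 for test data
window     = [2 6];       % LPF window sizes for the accelerometer and gyroscope (default)
dataPath   = 'raw_data';  % folder containing the raw measurements (assumed layout)
dataType   = 2;           % path type to convert (see the type list above)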

 

-------------------------------------------------------------------------------------------------------

The deep learning models have the following structure (a usage sketch of the two proposed models follows):

 1. 1d_resnet accm (Proposed model 1)

 - input (Accm_mag, Gyrm_mag): 2x80

 - output (1d_resnet_accm_out): 1x80

 - loss: MSE (Accm_Out, 1d_resnet_accm_out)

 

2. 1d_resnet gyrm (Proposed model 2)

 - input (Accm_x,y,z,mag, Gyrm_x,y,z,mag, 1d_resnet_accm_out): 9x80

 - output (1d_resnet_gyrm_out): 1x80

 - loss: MSE (Gyrm_Out, 1d_resnet_gyrm_out)

 

3. PDRnet [3] (Comparison model)

 - input (Accm_x,y,z_raw, Gyrm_x,y,z_raw): 6x80

 - output (heading_change, length_diff): 2x80
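
A minimal MATLAB sketch (not the authors' code) of how the two proposed models chain together on one 80-sample window; resnetAccm and resnetGyrm are placeholders for the trained 1-D ResNets, and all channels are assumed to be 1 x N row vectors.

idx  = 1:80;                                                      % one 80-sample window
x1   = [Accm_mag(idx); Gyrm_mag(idx)];                            % 2 x 80 input to proposed model 1
yAcc = resnetAccm(x1);                                            % 1 x 80 estimated accelerometer waveform
x2   = [Accm_x(idx); Accm_y(idx); Accm_z(idx); Accm_mag(idx); ...
        Gyrm_x(idx); Gyrm_y(idx); Gyrm_z(idx); Gyrm_mag(idx); yAcc];  % 9 x 80 input to proposed model 2
yGyr = resnetGyrm(x2);                                            % 1 x 80 estimated gyroscope waveform
lossAcc = mean((Accm_Out(idx) - yAcc).^2);                        % MSE against the reference waveforms
lossGyr = mean((Gyrm_Out(idx) - yGyr).^2);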

-------------------------------------------------------------------------------------------------------

Reference

[1] K. Han, S. M. Yu, S.-W. Ko, and S.-L. Kim, "Waveform-Guide Transformation of IMU Measurements for Smartphone-Based Localization," submitted to the IEEE for possible publication.

[2] W. Kang and Y. Han, "SmartPDR: Smartphone-based pedestrian dead reckoning for indoor localization," IEEE Sensors Journal, vol. 15, no. 5, pp. 2906-2916, May 2015.

[3] O. Asraf, F. Shama, and I. Klein, "PDRNet: A Deep-Learning Pedestrian Dead Reckoning Framework," IEEE Sensors Journal, vol. 22, no. 6, pp. 4932-4939, March 2022, doi: 10.1109/JSEN.2021.3066840.

Documentation

Attachment: read me.txt (5.39 KB)