This is Part 6 of our series on the new award-winning Nordic Semiconductor nRF54LM20-DK IoT development kit, available at DigiKey.
This article picks up where the Part 5 gesture recognition demo left off. It illustrates how to build the firmware to collect data from the accelerometer and gyroscope sensors, which is typically needed to capture data for training new models. To capture data in the minicom serial terminal, modify the prj.conf file to add the following parameters:
CONFIG_DATA_COLLECTION_MODE=y
CONFIG_BLE_MODE_NONE=y
This prj.conf file is located here:
edge-ai/applications/gesture_recognition/configuration/nrf54lm20dk_nrf54lm20b_cpuapp/prj.conf
Now build and flash the application as illustrated in Part 5. If the process completes properly, a separate minicom terminal should display the following:
*** Booting MCUboot v2.3.0-dev-3dfaa012cf34 ***
*** Using nRF Connect SDK v3.3.0-preview2-ede152ec210b ***
*** Using Zephyr OS v4.3.99-4b6df5ff11b1 ***
I: Starting bootloader
I: Primary image: magic=unset, swap_type=0x1, copy_done=0x3, image_ok=0x3
I: Secondary image: magic=unset, swap_type=0x1, copy_done=0x3, image_ok=0x3
I: Boot source: none
I: Image index: 0, Swap type: none
I: Bootloader chainload address offset: 0x10000
I: Image version: v0.0.0
I: Jumping to the first image slot
*** Booting MCUboot v2.3.0-dev-3dfaa012cf34 ***
*** Using nRF Connect SDK v3.3.0-preview2-ede152ec210b ***
*** Using Zephyr OS v4.3.99-4b6df5ff11b1 ***
I: Starting bootloader
I: Primary image: magic=unset, swap_type=0x1, copy_done=0x3, image_ok=0x3
I: Secondary image: magic=unset, swap_type=0x1, copy_done=0x3, image_ok=0x3
I: Boot source: none
I: Image index: 0, Swap type: none
I: Bootloader chainload address offset: 0x10000
I: Image version: v0.0.0
I: Jumping to the first image slot
*** Booting nRF Connect SDK v3.3.0-preview2-ede152ec210b ***
*** Using Zephyr OS v4.3.99-4b6df5ff11b1 ***
[00:00:00.048,787] <inf> main: nRF Edge AI Gestures Recognition Demo:
[00:00:00.048,798] <inf> main: nRF Edge AI Runtime Version: 2.2.0
[00:00:00.048,816] <inf> main: nRF Edge AI Lab Solution id: 36038
0,0,0,0,0,0
0,0,0,0,0,0
0,0,0,0,0,0
5904,-351,-7637,0,0,0
5953,-332,-7722,39,42,-17
5948,-326,-7720,35,33,-17
5912,-313,-7714,29,29,-11
5912,-314,-7703,15,23,-7
5935,-302,-7705,2,13,-4
5937,-314,-7709,-10,4,-1
5934,-335,-7708,-23,-4,2
5921,-365,-7705,-30,-15,6
5937,-384,-7699,-35,-21,11
5953,-408,-7695,-33,-29,13
5950,-426,-7702,-29,-37,13
5955,-430,-7696,-22,-42,20
5978,-444,-7697,-14,-44,29
5990,-429,-7698,-5,-44,30
6004,-408,-7692,2,-44,27
6015,-409,-7689,9,-39,23
6029,-387,-7695,14,-31,20
6033,-368,-7688,14,-22,13
6029,-350,-7676,13,-14,7
6029,-347,-7698,12,-5,4
6020,-343,-7692,6,6,4
6035,-348,-7694,1,17,1
6034,-347,-7690,-2,24,-3
6009,-355,-7689,-7,33,-11
6001,-373,-7698,-7,39,-20
5983,-386,-7716,-6,44,-28
5967,-408,-7697,-2,44,-35
5961,-396,-7691,2,42,-30
5916,-339,-7680,4,37,-13
5984,-286,-7771,-9,20,29
5903,-330,-7692,-35,39,37
5983,-337,-7664,-24,5,-10
5925,-360,-7721,-24,21,-31
5930,-402,-7674,-30,2,-16
5938,-411,-7744,-30,2,8
5936,-432,-7677,-28,-15,16
5941,-421,-7719,-12,-22,10
5950,-429,-7703,-3,-31,13
...
These are the raw 3-axis accelerometer and 3-axis gyroscope IMU sensor values, collected at 100 Hz (ignore the first 4 samples), in the format:
<acc_x>,<acc_y>,<acc_z>,<gyro_x>,<gyro_y>,<gyro_z>
The values come from the Bosch BMI270 IMU on the SparkFun evaluation board, available at DigiKey.
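For readers who want to inspect the captured samples on the host, here is a minimal Python sketch that parses one of these CSV lines and converts the raw counts to physical units. The full-scale ranges below are assumptions for illustration only (±2 g accelerometer, ±2000 °/s gyroscope); the BMI270 ranges are configurable, so check the firmware's actual sensor configuration before relying on these scale factors.

```python
# Parse one CSV line from the serial log and convert the raw signed
# 16-bit counts to physical units. The full-scale values below are
# ASSUMPTIONS for illustration; match them to the firmware's BMI270
# configuration.

ACC_FULL_SCALE_G = 2.0        # assumed accelerometer range: +/-2 g
GYRO_FULL_SCALE_DPS = 2000.0  # assumed gyroscope range: +/-2000 deg/s
INT16_COUNTS = 32768.0        # raw samples are signed 16-bit values

def parse_imu_line(line: str):
    """Split '<acc_x>,...,<gyro_z>' into six integers."""
    fields = [int(v) for v in line.strip().split(",")]
    if len(fields) != 6:
        raise ValueError(f"expected 6 fields, got {len(fields)}")
    return fields

def to_physical(raw):
    """Convert raw counts to [g, g, g, dps, dps, dps]."""
    ax, ay, az, gx, gy, gz = raw
    acc = [v * ACC_FULL_SCALE_G / INT16_COUNTS for v in (ax, ay, az)]
    gyro = [v * GYRO_FULL_SCALE_DPS / INT16_COUNTS for v in (gx, gy, gz)]
    return acc + gyro

# Example: the first non-zero sample from the log above.
print(to_physical(parse_imu_line("5904,-351,-7637,0,0,0")))
```

With the assumed ±2 g range, a raw value of 16384 counts maps to exactly 1 g, which is a quick sanity check for whichever range the firmware actually uses.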
These Zephyr RTOS prj.conf parameters activate the send_imu_data(input_data_i16) function, which streams these values to the serial terminal as shown above. Bluetooth LE is also disabled in this mode.
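Rather than copying samples out of minicom by hand, the stream can be captured to a CSV file for training. The sketch below uses the third-party pyserial package; the port name, baud rate, and output filename are examples only and must be adjusted to your setup. The filter keeps only lines of exactly six comma-separated integers, discarding the MCUboot banners and log messages.

```python
# Sketch: capture the IMU CSV stream from the board's serial port to a
# file for model training. Port name, baud rate, and filename are
# EXAMPLES -- adjust them to your setup.

def is_imu_line(line: str) -> bool:
    """True only for lines of exactly six comma-separated integers,
    filtering out boot banners and <inf> log output."""
    parts = line.strip().split(",")
    if len(parts) != 6:
        return False
    try:
        [int(p) for p in parts]
    except ValueError:
        return False
    return True

if __name__ == "__main__":
    import serial  # third-party: pip install pyserial

    # Example port/baud -- match your board's virtual COM port.
    with serial.Serial("/dev/ttyACM1", 115200, timeout=1) as port, \
            open("gesture_samples.csv", "w") as out:
        while True:
            line = port.readline().decode(errors="ignore")
            if is_imu_line(line):
                out.write(line.strip() + "\n")
```

Stop the capture with Ctrl-C once enough repetitions of the gesture have been recorded; each run produces one CSV file of raw samples.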
This article has presented how to collect training data for the gesture demo, which runs on the new Axon Neural Processing Unit (NPU) using the Axon NPU driver directly on this innovative Nordic IoT platform. Please stay tuned for the next article in this series. The Embedded World 2026 award-winning Nordic nRF54LM20-DK development kit is available at DigiKey.
Have a great day!
This article is available in Spanish here.

