Category Archives: Tutorials


A Practical Introduction to the Raspberry Pi Pico

tags: Raspberry Pi, RaspberryPi, Raspberry Pi Pico, RaspberryPiPico, MicroPython

Introduction

My name is Salvatore and I am a Development Engineer at 株式会社Rist.

  • Robotics team, responsible for Robotics R&D.
  • Robot design, control, motion, perception and manipulation.
  • Image analysis, recognition and classification.
  • AI algorithms design and implementation.
  • UI/UX design and implementation.

About this article

This article aims to give a quick introduction to the Raspberry Pi Pico, with simple step-by-step setup instructions and some code examples in MicroPython.

※ No soldering or external components are required to follow this tutorial.

What is the Raspberry Pi Pico

The Raspberry Pi Pico is a microcontroller board that can acquire various inputs and provide outputs through a series of GPIO pins, similar to a Raspberry Pi computer [1][2].
Microcontrollers are best suited to embedded applications, or any situation in which a full computer running an operating system would be unnecessary.

Environment Setup

~

User group

In order to have read/write permission to the serial port created upon connecting the Pico, the current user must belong to the dialout group.
Verify the current groups the user belongs to with:

$ groups

If the user does not belong to the dialout group, add them as follows:

$ sudo usermod -a -G dialout $USER

For the change to take effect, log out and log back in.

※ If logging out and back in is not sufficient, a reboot might be needed.

Thonny and MicroPython

Proceed with the installation of the Thonny Python IDE [3].
As of this writing, the easiest way to get the latest version of the IDE on a variety of Linux distributions is through pip.

$ pip install thonny

※ For more information and other platforms support, visit thonny.org.

Once installed, launch the Thonny IDE.

In order to control the Pico, it is necessary to switch the Python interpreter to MicroPython (Raspberry Pi Pico).

Click the interpreter name in the bottom right corner of the window to switch it:
Python 3.7.9 > Configure interpreter...
Then select MicroPython (Raspberry Pi Pico) from the combo box list.

Or alternatively, from the Thonny menu, select:
Run > Select interpreter... > Interpreter > MicroPython (Raspberry Pi Pico)

Connecting the Pico to the PC

Out of the box, the Pico needs to be initialised by installing the MicroPython firmware.
This can be done easily from the Thonny IDE itself.

To connect the Pico to the computer:

  1. Press and hold the BOOTSEL button on the Pico
  2. Connect the Pico to the computer via micro USB
  3. Release the BOOTSEL button

The Pico is now connected as a USB mass storage device.

MicroPython firmware installation

Click the Stop/Restart button on the Thonny toolbar; the IDE will connect to the Pico and prompt for the MicroPython firmware installation.

Proceed with the installation.

At this point, the environment setup is complete.

Code

As mentioned in the introduction, no external components are required for this tutorial.
The provided sample code will make use of the on-board LED (GPIO 25) and the embedded temperature sensor.
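
Both are accessible through MicroPython's machine module. The following is a minimal orientation sketch, for reference only; the full examples follow below.

from machine import Pin, ADC

led = Pin(25, Pin.OUT)    # on-board LED, wired to GPIO 25
sensor = ADC(4)           # ADC channel 4 reads the internal temperature sensor

led.value(1)              # switch the LED on
print(sensor.read_u16())  # print one raw 16-bit reading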

LED test

Let's skip the basic LED on/off/blink tests and try the PWM controlled brightness test instead.

Create a new file and paste the following code [1]:

from machine import Pin, PWM
from time import sleep

pwm = PWM(Pin(25))  # on-board LED (GPIO 25) driven by PWM
pwm.freq(1000)      # 1 kHz PWM frequency

while True:
    # Ramp the duty cycle up (LED gets brighter)...
    for duty in range(1024, 65025, 2):
        pwm.duty_u16(duty)
        sleep(0.0001)
    # ...then ramp it back down (LED dims)
    for duty in range(65025, 1024, -4):
        pwm.duty_u16(duty)
        sleep(0.0005)

By running this code, the on-board LED will pulse smoothly, according to the duty-cycle ranges and sleep intervals specified.

※ Duty values passed to duty_u16() range from 0 to 65535; 65025 (= 255²) is close to that maximum and corresponds to the LED's (nearly) full brightness.
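
As a rough back-of-the-envelope check, the sleep() calls alone give the following sweep durations. This is only a lower bound; MicroPython's per-iteration overhead on the Pico adds noticeably to these.

# Approximate sweep durations from the sleep() calls alone
rise = len(range(1024, 65025, 2)) * 0.0001   # 32001 steps * 0.1 ms ≈ 3.2 s
fall = len(range(65025, 1024, -4)) * 0.0005  # 16001 steps * 0.5 ms ≈ 8.0 s
print(rise, fall)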

Temperature test

For the temperature test, use the following code [4]:

import machine
from time import sleep

sensor_temp = machine.ADC(4)     # ADC channel 4 reads the internal temperature sensor
conversion_factor = 3.3 / 65535  # 16-bit reading -> voltage (3.3 V reference)

while True:
    reading = sensor_temp.read_u16() * conversion_factor
    temperature = 27 - (reading - 0.706) / 0.001721
    print(temperature)
    sleep(2)

The temperature measured by the sensor will be printed to the Shell every 2 seconds.
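
The constants in the conversion come from the RP2040 datasheet: the internal sensor outputs about 0.706 V at 27 °C, decreasing by roughly 1.721 mV per degree. For reference, the same arithmetic written as a commented helper function:

def adc_to_celsius(raw_u16):
    # Raw 16-bit ADC reading (channel 4) -> degrees Celsius.
    # Per the RP2040 datasheet: ~0.706 V at 27 °C, slope ~ -1.721 mV/°C.
    volts = raw_u16 * 3.3 / 65535            # reading -> voltage (3.3 V reference)
    return 27 - (volts - 0.706) / 0.001721   # voltage -> temperature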

Putting it together

The following code will make the on-board LED pulse with variable speed.

from machine import Pin, PWM, ADC
from time import sleep

sensor_temp = ADC(4)             # internal temperature sensor
conversion_factor = 3.3 / 65535

pwm = PWM(Pin(25))               # on-board LED
pwm.freq(1000)

while True:
    reading = sensor_temp.read_u16() * conversion_factor

    temperature = 27 - (reading - 0.706) / 0.001721
    print(temperature)

    if temperature < 19:  # Lower threshold
        steps = 16
    elif temperature > 23:  # Higher threshold
        steps = 64
    else:
        steps = temperature

    for duty in range(0, 65025, int(steps**2)):
        pwm.duty_u16(duty)
        sleep(0.001)
    for duty in range(65025, 0, -int(steps**3)):
        pwm.duty_u16(duty)
        sleep(0.05)

The LED will blink in three different ways, depending on the ambient temperature measured by the on-board sensor:

  • Temperature below the lower threshold:
    • The LED pulses at a slow pace
  • Temperature within the thresholds interval:
    • The LED pulses at a pace proportional to the temperature
  • Temperature above the upper threshold:
    • The LED pulses at a fast pace

Adjust the Lower and Higher thresholds to best suit the local environment temperature.
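
When tuning, it can also help to keep the mapping in one place. The following is only a sketch of a hypothetical alternative to the steps = temperature branch: it clamps the temperature into the threshold interval and maps it linearly onto the same 16-64 step range used above, so the pulse speed varies smoothly between the slow and fast extremes.

T_LOW, T_HIGH = 19, 23   # lower / higher temperature thresholds (adjust as needed)
S_MIN, S_MAX = 16, 64    # step sizes used above for slow / fast pulsing

def temperature_to_steps(t):
    t = min(max(t, T_LOW), T_HIGH)  # clamp into [T_LOW, T_HIGH]
    return S_MIN + (t - T_LOW) * (S_MAX - S_MIN) / (T_HIGH - T_LOW)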

Stand-alone operation

It is now time to store the program on the Pico.

File > Save as...
Then select Raspberry Pi Pico in the dialog window that appears.

In order to make the code run upon power up, the file must be named main.py.

※ When powering the Pico stand-alone, safe supply voltages (on the VSYS pin) are between 1.8V and 5.5V.
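
For reference, a minimal main.py skeleton is sketched below; only the file name matters, and the start-up blink is just an optional visual confirmation, since print() output is not visible without a connected shell.

# main.py - runs automatically when the Pico is powered up
from machine import Pin
from time import sleep

led = Pin(25, Pin.OUT)

# Optional: blink three times at power-up to confirm the script started
for _ in range(3):
    led.value(1)
    sleep(0.1)
    led.value(0)
    sleep(0.1)

# ...the temperature-controlled pulsing loop from the previous section goes here...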

Alternatives

Recently, various boards have started taking advantage of the RP2040 chip to provide custom solutions. One of them is the Seeed Studio XIAO RP2040, currently available through a Free Shipping campaign in 221 countries and counting.
More on this and other tiny boards soon! Please check it out if you like.

References

[G] https://github.com/slabua/RaspberryPiPico
[Q] https://qiita.com/slabua/items/ed0a49cd587d0c8103b8
[1] https://www.raspberrypi.org/documentation/pico/getting-started/
[2] https://projects.raspberrypi.org/en/projects/getting-started-with-the-pico
[3] https://hsmag.cc/picobook
[4] https://tkrel.com/14899


Testing ROS Bridge on the TurtleBot3

ROS Bridge

~


Overview

~

Introduction

My name is Salvatore and I am a Development Engineer at 株式会社Rist.

About this article

This article is meant to be a quick and direct introduction to the ROS Bridge, based on the official documentation.
This is the first article of a series related to my work at Rist.

About me

Present

Joined Rist in February 2020, currently part of the Development Team.
Responsible for Robotics R&D and, among other things, primarily working on the following topics:

  • Robot design, control, motion, perception and manipulation.
  • Image analysis, recognition and classification.
  • AI algorithms design and implementation.
  • UI/UX design and implementation.

Background

Graduated cum laude with a master's degree in Computer Science Engineering for Intelligent Systems at University of Palermo, Italy, presenting a Thesis in Robotics on a Brain-Computer Interface Framework.
Previously graduated with the grade of 110/110 in Computer Science Engineering, presenting a Thesis in Artificial Intelligence / Latent Semantic Analysis.

ROS Bridge

~

What is ROS Bridge

The ROS Bridge is a ROS 2 package that makes it possible to seamlessly exchange messages between ROS 1 and ROS 2 on shared topics. Once running, the bridge taps into both the ROS 1 and ROS 2 topics and, by default, establishes a communication bridge between existing topics that bear the same name.
It is also possible to bridge any topics explicitly, regardless of whether they are currently present in both environments (for example with the dynamic_bridge --bridge-all-topics option).

Why it is needed

Currently, the ROS 2 core packages are still under heavy development, and most third-party packages still rely on ROS 1.
By using the ROS Bridge, it is therefore possible to make use of packages that do not yet run natively on ROS 2.

Installation

~

ROS 2 machine

As previously stated, the ROS Bridge is a ROS 2 package, and it is available in the official repositories.
In order to run the ROS Bridge, the machine running ROS 2 must also have a functional ROS 1 environment.

Environment

HW: Raspberry Pi 3 Model B+
(used as the computational unit for TurtleBot3 Burger/Waffle Pi)

Software   Version
OS         Ubuntu 18.04.4 LTS (Bionic Beaver)
Python     Python 2.7.17 (default), Python 3.6.9
ROS        ROS 1 (Melodic), ROS 2 (Dashing)

For information on setting up a ROS 2 Dashing environment on a machine running Ubuntu, refer to the official documentation.

  • To proceed with the Bridge installation, execute:
$ sudo apt install ros-dashing-ros1-bridge

Bash

I have slightly customised the Bash configuration file, adding automatic IP detection and convenient aliases for sourcing the ROS environments.

  • Add the following lines at the end of the ~/.bashrc file:
# ROS
ROS_MASTER_IP=192.168.0.100  # IP of the ROS 1 master machine
WIFI_INTERFACE=$(iw dev | awk '$1=="Interface"{print $2}')
WIFI_IP=$(/sbin/ip -o -4 addr list $WIFI_INTERFACE | head -1 | awk '{print $4}' | cut -d/ -f1)
export TB3_MODEL=burger
export TURTLEBOT3_MODEL=${TB3_MODEL}

# ROS1
export ROS_IP=$WIFI_IP
export ROS_HOSTNAME=$WIFI_IP
export ROS_MASTER_URI=http://$ROS_MASTER_IP:11311
alias source-ros1='source /opt/ros/melodic/setup.bash'

# ROS2
alias source-ros2='source /opt/ros/dashing/setup.bash; source ~/turtlebot3_ws/install/setup.bash; export ROS_DOMAIN_ID=30 #TURTLEBOT3'

Note: In order to retrieve the interface name, the iw package must be present in the system.
If it is not already available, proceed with its installation:

$ sudo apt install iw

ROS 1 machine

The use of the ROS Bridge is completely transparent to the ROS 1 machine; therefore, the machine running ROS 1 only needs to be configured with a ROS 1 environment and its ROS 1 related packages.

Environment

HW: Desktop PC

Software   Version
OS         Ubuntu 18.04.4 LTS (Bionic Beaver)
Python     Python 2.7.17, Python 3.6.9 (default)
ROS        ROS 1 (Melodic)

For information on setting up a ROS 1 Melodic environment on a machine running Ubuntu, refer to the official documentation.

Bash

  • Similarly, add the following lines at the end of the ~/.bashrc file:
# ROS
WIFI_INTERFACE=$(iw dev | awk '$1=="Interface"{print $2}')
WIFI_IP=$(/sbin/ip -o -4 addr list $WIFI_INTERFACE | head -1 | awk '{print $4}' | cut -d/ -f1)
export TB3_MODEL=burger
export TURTLEBOT3_MODEL=${TB3_MODEL}

# ROS1
export ROS_IP=$WIFI_IP
export ROS_HOSTNAME=$WIFI_IP
export ROS_MASTER_URI=http://$WIFI_IP:11311
alias source-ros1='source /opt/ros/melodic/setup.bash > /dev/null && source ~/catkin_ws/devel/setup.bash > /dev/null'

Running the demo example


To verify the successful installation of the Bridge, it is possible to run a simple talker/listener example between the machine running ROS 1 and the machine running ROS 2, and vice versa.

ROS 2 machine

  • Install the demo examples:
$ sudo apt install ros-dashing-demo-nodes-py

Shell 1: bridge

  • Launch the bridge:
$ . /opt/ros/melodic/setup.bash
$ . /opt/ros/dashing/setup.bash
$ ros2 run ros1_bridge dynamic_bridge

Shell 2: listener

  • Launch the listener:
$ . /opt/ros/dashing/setup.bash
$ ros2 run demo_nodes_py listener

ROS 1 machine

  • Install the demo examples:
$ sudo apt install ros-melodic-rospy-tutorials

Shell 1: roscore

  • Launch roscore:
$ . /opt/ros/melodic/setup.bash
$ roscore

Shell 2: talker

  • Launch the talker:
$ . /opt/ros/melodic/setup.bash
$ rosrun rospy_tutorials talker

Terminals

[Screenshot: terminal windows running the bridge and listener (ROS 2 machine), and roscore and the talker (ROS 1 machine)]

Switching roles

To switch the roles of the two machines, it is also possible to run the listener on the ROS 1 machine and the talker on the ROS 2 machine.

ROS 2 machine: talker

  • Launch the talker:
$ ros2 run demo_nodes_py talker

ROS 1 machine: listener

  • Launch the listener:
$ rosrun rospy_tutorials listener
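
The talker and listener used above are the stock demo nodes. For reference only, a minimal rclpy publisher that behaves roughly like the demo talker could look like the following sketch (node, file and variable names are arbitrary):

# minimal_talker.py - publishes a std_msgs/String on 'chatter' once per second
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class MinimalTalker(Node):
    def __init__(self):
        super().__init__('minimal_talker')
        self.pub = self.create_publisher(String, 'chatter', 10)
        self.count = 0
        self.timer = self.create_timer(1.0, self.tick)

    def tick(self):
        msg = String()
        msg.data = 'Hello World: %d' % self.count
        self.pub.publish(msg)
        self.get_logger().info('Publishing: "%s"' % msg.data)
        self.count += 1


def main():
    rclpy.init()
    rclpy.spin(MinimalTalker())


if __name__ == '__main__':
    main()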

Verify the USB camera stream

~

ROS 2 machine

  • Install the image_tools package:
    This package allows the ROS 2 machine to use a regular USB camera and publish a stream of frames of the specified dimensions on a chosen topic.
$ sudo apt install ros-dashing-image-tools

Shell 1: bridge

  • Launch the bridge:
$ . /opt/ros/melodic/setup.bash
$ . /opt/ros/dashing/setup.bash
$ ros2 run ros1_bridge dynamic_bridge

Shell 2: cam2image

  • Obtain the camera stream:
$ . /opt/ros/dashing/setup.bash
$ ros2 run image_tools cam2image -s 0 -x 320 -y 240 -t /image_raw
  • For more information about the usage of the available parameters:
$ ros2 run image_tools cam2image -h

ROS 1 machine

Shell 1: roscore

  • Launch roscore:
$ . /opt/ros/melodic/setup.bash
$ roscore

Shell 2: rqt_image_view

  • Verify the camera stream:
$ . /opt/ros/melodic/setup.bash
$ rqt_image_view /image_raw

Terminals

[Screenshot: terminal windows running the bridge and cam2image (ROS 2 machine), and roscore and rqt_image_view (ROS 1 machine)]
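
Beyond rqt_image_view, a ROS 1 node can also consume the bridged stream directly. Below is only a rough rospy sketch, assuming the cv_bridge and OpenCV Python packages are installed; the node and file names are arbitrary.

# view_bridged_image.py - display frames from the bridged /image_raw topic
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def callback(msg):
    # Convert the ROS Image message into an OpenCV BGR frame and show it
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    cv2.imshow('image_raw (bridged)', frame)
    cv2.waitKey(1)

rospy.init_node('view_bridged_image')
rospy.Subscriber('/image_raw', Image, callback)
rospy.spin()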

References