RGS

Robocup 2026

  • Dog robot (onstage)
    • Walking (gait)
      • For research, you can search for “quadruped gait”
      • Some useful links:
        • Real Animals (…nice videos with slow-mo)
        • Commercial toy robot (…the video and some of the description are nice, but the diagram and code are useless)
        • Generic description of a quadruped robot gait (…the timing diagram is good)
        • If you pay close attention to the leg movement in the three sources, you’ll realize that they take almost entirely different approaches. It’s useful to think carefully about each one and understand how it works.
      • When analyzing the movement of the legs, you’ll realize that it is typically expressed in terms of the X and Y coordinates of the feet (eg. the timing diagram in the third link); a simple foot-trajectory sketch follows this list. But our motors are controlled by rotation / angle, so you’ll need to use inverse kinematics (IK) to convert the Cartesian (XY) coordinates to joint angles. You don’t need to understand everything in the IK link, but you should at least try to understand the section on “Analytical IK in 2D”; an IK sketch also follows this list. If you’re interested in learning the more general numerical methods, the above link can be rather difficult to follow, but I can go through one (relatively) simple method with you (…many methods exist).
    • Object detection (detecting people)
    • Speech recognition
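
To make the gait timing concrete, here is a minimal foot-trajectory sketch in Python. It is only an illustration: the trot phase offsets, step length, step height, and stance height are assumed values, not taken from any of the links above. Each leg runs the same cycle shifted by its phase offset; during stance the foot sweeps backward along the ground, and during swing it lifts and returns forward.

    import math

    # Phase offsets for a trot gait (assumed): diagonal legs move together.
    PHASE_OFFSET = {"front_left": 0.0, "rear_right": 0.0,
                    "front_right": 0.5, "rear_left": 0.5}

    STEP_LENGTH = 40.0    # mm the foot travels horizontally each cycle (assumed)
    STEP_HEIGHT = 20.0    # mm the foot lifts during swing (assumed)
    STANCE_Y    = -120.0  # nominal foot height below the hip, in mm (assumed)
    DUTY        = 0.5     # fraction of the cycle spent on the ground

    def foot_xy(phase):
        """Foot position (x, y) relative to the hip for a cycle phase in [0, 1)."""
        if phase < DUTY:                     # stance: foot on the ground
            t = phase / DUTY
            x = STEP_LENGTH * (0.5 - t)      # sweep from front to back
            y = STANCE_Y
        else:                                # swing: foot in the air
            t = (phase - DUTY) / (1.0 - DUTY)
            x = STEP_LENGTH * (t - 0.5)      # return from back to front
            y = STANCE_Y + STEP_HEIGHT * math.sin(math.pi * t)
        return x, y

    def leg_phase(global_phase, leg):
        """Shift the global cycle phase by the leg's offset."""
        return (global_phase + PHASE_OFFSET[leg]) % 1.0

Plotting foot_xy over one cycle for each leg reproduces the kind of timing diagram described above, and changing the phase offsets gives other gaits (eg. a walk instead of a trot).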
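
And here is a sketch of the “Analytical IK in 2D” idea for a two-link leg (hip and knee rotating in the same plane). Again, this is illustrative: the joint conventions are an assumption, and the link lengths must be measured from your own robot.

    import math

    def leg_ik_2d(x, y, l1, l2):
        """Analytical 2-link IK: foot position (x, y) relative to the hip,
        with upper/lower leg lengths l1 and l2.
        Returns (hip_angle, knee_angle) in radians, or None if unreachable."""
        d_sq = x * x + y * y
        d = math.sqrt(d_sq)
        if d > l1 + l2 or d < abs(l1 - l2):
            return None                      # target outside the reachable range
        # Law of cosines gives the knee bend (0 = leg fully extended)
        cos_knee = (d_sq - l1 * l1 - l2 * l2) / (2 * l1 * l2)
        knee = math.acos(max(-1.0, min(1.0, cos_knee)))
        # Hip angle = direction to the target, minus the offset caused by the bent knee
        hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                            l1 + l2 * math.cos(knee))
        return hip, knee                     # one of the two mirror solutions

Chaining the two sketches (foot_xy, then leg_ik_2d) gives the hip and knee angles to command at each time step; you’ll still need to map those angles onto your servos’ zero positions and directions.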

Sec 1

CoSpace

  • For beginners
  • Set up compilation in VS Code
    1. Open your CoSpace C file in VS Code.
    2. Press Ctrl+Shift+B. You should see a message “No build task to run found. Configure build task”. Click on that message.
    3. Click on “Create tasks.json file from template”.
    4. Click on “Others”. This should open a new “tasks.json” file.
    5. Delete everything in your “tasks.json” and replace it with the content of this link. Save your “tasks.json” (Ctrl+S).
    6. Go back to your C file and press Ctrl+Shift+B. Your file should now be compiled into a DLL.

Robocup Rescue Line (OpenMV Cam)

  • If you are using the OpenMV Cam, you’ll need to…
    • Detect the black line and green box. Install the OpenMV IDE. Open the IDE and load the multi-color blob tracking example. Try it out, and tune it to detect the black line.
    • When tuning, set the preview to LAB mode, select a region that you’re interested in (eg. black line), and look at the histogram to determine a suitable threshold. Note that in LAB color space, the L represents brightness, and should be low for black.
    • By default, the example uses the entire frame for detection. That’s probably too large. Use the crop function to reduce the detection area (a detection sketch follows at the end of this list).
    • If you’re using the Spike Prime, you should use PUPremote to communicate between the OpenMV Cam and the Spike Prime. Download the Python files from this GitHub page. pupremote.py needs to be uploaded to your OpenMV Cam, while pupremote_hub.py needs to be uploaded to your Spike Prime (via the Pybricks software).
    • Other examples.
      • The OpenMV IDE contains an example of line-following code (Image Processing -> Black grayscale line following).
      • Anton’s Mindstorms provides another example of line-following code here.
      • Neither is recommended, because… 1) you should write your own code, and 2) both are inadequate for Robocup; you’ll need to modify them, so it’s better to start with simple code that you can understand well (a minimal starting point is sketched after this list).
      • But feel free to read through and understand how these examples work.
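
Here is a minimal sketch of what the tuned detection could look like (MicroPython on the OpenMV Cam). The LAB thresholds and the ROI below are placeholders that you must replace with values from your own histogram tuning; it also uses the roi argument of find_blobs to limit the detection area, which is an alternative to cropping the image.

    import sensor, time

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QQVGA)      # 160x120
    sensor.skip_frames(time=2000)
    sensor.set_auto_gain(False)             # keep colors stable for tracking
    sensor.set_auto_whitebal(False)

    # LAB thresholds: (L_min, L_max, A_min, A_max, B_min, B_max). Placeholders!
    BLACK = (0, 30, -20, 20, -20, 20)       # low L for the black line
    GREEN = (20, 70, -60, -20, -10, 40)     # green box

    ROI = (0, 80, 160, 40)                  # bottom band of the frame (x, y, w, h)

    clock = time.clock()
    while True:
        clock.tick()
        img = sensor.snapshot()
        for blob in img.find_blobs([BLACK, GREEN], roi=ROI,
                                   pixels_threshold=50, area_threshold=50,
                                   merge=True):
            # blob.code() is a bitmask of which thresholds matched (1 = black, 2 = green)
            img.draw_rectangle(blob.rect())
            img.draw_cross(blob.cx(), blob.cy())
        print(clock.fps())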
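
Building on the same detection, one simple starting point for line following is to compute a steering error from how far the line’s centroid sits from the center of the ROI. The sketch below only computes and prints that error; on a Spike Prime build you would send it to the hub (eg. via PUPremote) and turn it into motor commands there. The constants are again placeholders.

    import sensor

    sensor.reset()
    sensor.set_pixformat(sensor.RGB565)
    sensor.set_framesize(sensor.QQVGA)              # 160x120
    sensor.skip_frames(time=2000)

    BLACK = (0, 30, -20, 20, -20, 20)               # placeholder; tune with the histogram
    ROI = (0, 80, 160, 40)                          # bottom band of the frame (x, y, w, h)
    ROI_CENTER_X = ROI[0] + ROI[2] // 2

    while True:
        img = sensor.snapshot()
        blobs = img.find_blobs([BLACK], roi=ROI,
                               pixels_threshold=50, area_threshold=50, merge=True)
        if blobs:
            line = max(blobs, key=lambda b: b.pixels())  # largest black blob
            error = line.cx() - ROI_CENTER_X             # <0: line is to the left, >0: right
            print("steering error:", error)
        else:
            print("line lost")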

Robocup Rescue Line (EV3)

Robocup Rescue Line (Generic stuff)

These slides are old, and the sample code is based on the old EV3 software. You won’t be able to use the code directly, but the concepts and approach remain the same.

IoTy

IoTy is a platform for programming the ESP32 using blocks or Python. This is useful for OnStage, Robocup Rescue Line (…if you’re building non-Lego robots), and for general electronics projects (eg. for WRO open category).

Others