Robocup 2026
- Dog robot (onstage)
- Walking (gait)
- For research, you can search for “quadruped gait”
- Some useful links:
- Real Animals (…nice videos with slow-mo)
- Commercial toy robot (…the video and some of the description are nice, but the diagram and code are useless)
- Generic description of a quadruped robot gait (…the timing diagram is good)
- If you pay close attention to the leg movement in the three sources, you’ll realize that they are all almost entirely different from each other. They demonstrate different approaches, and it’s useful to think carefully about each of them and understand how each works.
- When analyzing the movement of the legs, you’ll realize that it is typically expressed in terms of the X and Y coordinates of the feet (eg. the timing diagram in the third link). But our motors are controlled by rotation / angle, so you’ll need inverse kinematics (IK) to convert the Cartesian (XY) coordinates into joint angles. You don’t need to understand everything in the IK link, but you should at least try to understand the section on “Analytical IK in 2D”; a minimal sketch of that case is shown below. If you’re interested in learning the more general numerical methods, the link above can be rather difficult to follow, but I can go through one (relatively) simple method with you (…many methods exist).
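- For reference, here’s a minimal sketch of the 2D analytical case, assuming a simple two-segment leg with the hip joint at the origin: l1 is the hip-to-knee length, l2 the knee-to-foot length, and angles are in radians. Your leg geometry, zero positions, and angle directions will differ, so treat this as an illustration of the math, not drop-in code.
```python
import math

def leg_ik_2d(x, y, l1, l2):
    """Analytical 2-link IK: foot position (x, y) -> (hip_angle, knee_angle).

    Assumes the hip joint is at the origin, l1 is hip-to-knee, l2 is
    knee-to-foot, and angles are in radians (knee = 0 means a straight leg).
    """
    # Law of cosines gives the knee angle from the hip-to-foot distance.
    d2 = x * x + y * y                        # squared distance hip -> foot
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    cos_knee = max(-1.0, min(1.0, cos_knee))  # clamp against rounding / unreachable targets
    knee = math.acos(cos_knee)

    # Hip angle = direction to the foot, corrected for the knee bend.
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# Example: foot 10 cm forward and 12 cm below the hip, with 8 cm leg segments.
print(leg_ik_2d(0.10, -0.12, 0.08, 0.08))
```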
- Object detection (detecting people)
- This should give you a decent starting point. Make sure to read the docs!
- You can also search for “YOLO python” or “YOLO opencv”.
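- As an illustration of what the code can look like, here’s a minimal person-detection sketch using the ultralytics package (an assumption; it’s one common way to run YOLO from Python and may not be the library in the link above). The model file name, webcam index, and COCO class index 0 for “person” are generic defaults, not anything specific to our setup.
```python
# pip install ultralytics opencv-python
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")   # small COCO-pretrained model; downloads on first use
cap = cv2.VideoCapture(0)    # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # classes=[0] keeps only "person" detections (class 0 in COCO)
    results = model(frame, classes=[0], verbose=False)
    for box in results[0].boxes:
        x1, y1, x2, y2 = map(int, box.xyxy[0])
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imshow("people", frame)
    if cv2.waitKey(1) == 27:   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```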
- Speech recognition
- You can use this library. Note that you’ll need to use an offline engine for onstage.
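- Since the library link isn’t reproduced here, the sketch below is only an illustration: it assumes the popular SpeechRecognition package with its offline PocketSphinx engine. Whatever library you end up using, the important part is picking an engine that works without internet access.
```python
# pip install SpeechRecognition pocketsphinx pyaudio
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as mic:
    recognizer.adjust_for_ambient_noise(mic, duration=1)  # calibrate for background noise
    print("Say something...")
    audio = recognizer.listen(mic)

try:
    # recognize_sphinx() runs fully offline (unlike eg. recognize_google())
    text = recognizer.recognize_sphinx(audio)
    print("Heard:", text)
except sr.UnknownValueError:
    print("Could not understand the audio")
```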
Sec 1
- Please try this tutorial
- Pybricks Documentation
CoSpace
- For beginners
- Basic tutorial on C coding http://learn-c.org (you only need to do “Learn the Basics”)
- Design Patterns (ODP / PDF)
- Handling Durations (ODP / PDF)
- Set up compilation in vscode
- Open your CoSpace C file in vscode.
- Press “Ctrl+Shift+B”. You should see a message “No build task to run found. Configure build task”. Click on that message.
- Click on “Create tasks.json file from template”
- Click on “Others”. This should open a new “tasks.json” file.
- Delete everything in your “tasks.json” and replace it with the content of this link. Save your “tasks.json” (Ctrl+S).
- Go back to your C file, and press “Ctrl+Shift+B”. Your file should now be compiled into a DLL.
Robocup Rescue Line (OpenMV Cam)
- If you are using the OpenMV Cam, you’ll need to…
- Detect the black line and green box. Install the OpenMV IDE. Open the IDE and load the multi-color blob tracking example. Try it out, and tune it to detect the black line.
- When tuning, set the preview to LAB mode, select a region that you’re interested in (eg. black line), and look at the histogram to determine a suitable threshold. Note that in LAB color space, L represents lightness, and should be low for black.
- By default, the example uses the entire frame for detection. That’s probably too large. Use the crop function to reduce the detection area.
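- To make the above concrete, here’s a minimal sketch of what a tuned result might look like. The LAB threshold and the region of interest are placeholders (find your own values with the histogram), and it restricts detection by passing a roi to find_blobs, which is one way to limit the search area (cropping the image is another).
```python
# OpenMV MicroPython sketch: detect the black line in a limited region.
import sensor, time

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)     # 320x240
sensor.skip_frames(time=2000)

# LAB threshold (L_min, L_max, A_min, A_max, B_min, B_max).
# L is low for black; these numbers are placeholders -- tune with the histogram.
BLACK = (0, 30, -20, 20, -20, 20)

# Only look at the bottom strip of the frame (x, y, w, h) -- also a placeholder.
ROI = (0, 160, 320, 80)

clock = time.clock()
while True:
    clock.tick()
    img = sensor.snapshot()
    blobs = img.find_blobs([BLACK], roi=ROI, pixels_threshold=100,
                           area_threshold=100, merge=True)
    if blobs:
        line = max(blobs, key=lambda b: b.pixels())   # biggest black blob
        img.draw_rectangle(line.rect())
        img.draw_cross(line.cx(), line.cy())
    print(clock.fps())
```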
- If you’re using the Spike Prime, you should use PUPRemote to communicate between the OpenMV Cam and the Spike Prime. Download the Python files from this GitHub page. pupremote.py needs to be uploaded to your OpenMV Cam, while pupremote_hub.py needs to be uploaded to your Spike Prime (via the Pybricks software).
- Other examples.
- OpenMV IDE contains an example of line-following code (Image Processing -> Black grayscale line following).
- Anton’s Mindstorms provides another line-following example here.
- Neither is recommended, because… 1) You should write your own code. 2) Both are inadequate for Robocup; you’ll need to modify them, so it’s better to start with simple code that you understand well.
- But feel free to read through and understand how these examples work.
Robocup Rescue Line (EV3)
- Tutorial on different methods of reading from an OpenMV Cam or ESP32 without blocking
- Use this when you have an EV3 or PC reading data from the serial port.
- If the reader is a MicroPython device (eg. ESP32), UART reads are non-blocking by default.
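- For example, on a PC (or an EV3 running ev3dev), one common approach (not necessarily the one in the tutorial) is pyserial with timeout=0 so that reads never block; the port name and baud rate below are placeholders.
```python
# pip install pyserial
import serial

# Port name and baud rate are placeholders -- match them to your setup.
ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=0)   # timeout=0 -> non-blocking

buffer = b""
while True:
    if ser.in_waiting:                  # only read when bytes have already arrived
        buffer += ser.read(ser.in_waiting)
        while b"\n" in buffer:          # process complete lines only
            line, buffer = buffer.split(b"\n", 1)
            print("got:", line.decode().strip())
    # ...the rest of your robot loop keeps running here...
```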
- Docs for line sensor
Robocup Rescue Line (Generic stuff)
These slides are old, so the sample code is based on the old EV3 software. You won’t be able to use it directly, but the concepts and approach remain the same. A minimal up-to-date line-following sketch is included after the slide list below.
- Briefing Slides (ODP / PDF)
- Single Line Follower (ODP / PDF)
- Double Sensors Line Following (ODP / PDF)
- Ending the Loop (ODP / PDF)
- Turn on Green (ODP / PDF)
- Obstacles Avoidance (ODP / PDF)
- Handling Inclines (ODP / PDF)
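- To illustrate the idea from the single-sensor slides in current software, here’s a minimal proportional line follower in Pybricks EV3 MicroPython. The ports, wheel dimensions, target reflection, gain, and speed are all placeholders to tune for your own robot.
```python
# Minimal single-sensor proportional line follower (Pybricks EV3 MicroPython).
from pybricks.ev3devices import Motor, ColorSensor
from pybricks.parameters import Port
from pybricks.robotics import DriveBase
from pybricks.tools import wait

left_motor = Motor(Port.B)
right_motor = Motor(Port.C)
line_sensor = ColorSensor(Port.S3)

# Wheel diameter and axle track in mm -- placeholders, measure your robot.
robot = DriveBase(left_motor, right_motor, wheel_diameter=56, axle_track=120)

TARGET = 50     # reflection on the line edge, between black and white -- tune this
GAIN = 1.2      # proportional gain -- tune this
SPEED = 100     # forward speed in mm/s

while True:
    error = line_sensor.reflection() - TARGET
    robot.drive(SPEED, GAIN * error)   # turn rate proportional to the error
    wait(10)
```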
- Tiles package for printing on A3 paper
- Video of a common and successful design
- Video of an unconventional design
- Difficult Lines Videos
IoTy
IoTy is a platform for programming the ESP32 using blocks or Python. This is useful for OnStage, Robocup Rescue Line (…if you’re building non-Lego robots), and for general electronics projects (eg. for WRO open category).
- Link to IoTy
- Intro to ESP32 and IoTy
- Connecting
- Working with Breadboards
- Analog Output
- Digital Input
- Ultrasonic Sensor
- Neopixel
- HSV
- IoT with IoTy
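- As a small taste of what ESP32 Python code looks like, here’s a sketch that cycles a Neopixel strip through the HSV hue wheel. It assumes the standard MicroPython machine and neopixel modules are available (check the IoTy docs), and the data pin and LED count are placeholders.
```python
# Cycle a NeoPixel strip through the hue wheel (standard ESP32 MicroPython).
import time
from machine import Pin
from neopixel import NeoPixel

NUM_LEDS = 8                       # placeholder -- match your strip
np = NeoPixel(Pin(13), NUM_LEDS)   # data pin 13 is a placeholder

def hsv_to_rgb(h, s, v):
    """Convert hue (0-359), saturation and value (0-1) to an RGB tuple (0-255)."""
    c = v * s
    x = c * (1 - abs((h / 60) % 2 - 1))
    m = v - c
    r, g, b = [(c, x, 0), (x, c, 0), (0, c, x),
               (0, x, c), (x, 0, c), (c, 0, x)][int(h // 60) % 6]
    return int((r + m) * 255), int((g + m) * 255), int((b + m) * 255)

hue = 0
while True:
    for i in range(NUM_LEDS):
        np[i] = hsv_to_rgb((hue + i * 20) % 360, 1, 0.2)  # dim rainbow
    np.write()
    hue = (hue + 5) % 360
    time.sleep_ms(50)
```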

