WRO 2025
- Future Innovator
- Drone guided garbage collector
- Documentation for AprilTag in OpenMV: the OpenMV IDE contains example code that you can use for an easy start (a minimal sketch of the same pattern follows after this list).
- Use this to generate AprilTags for printing
- AprilTag official website
- Nice summary of how AprilTags work
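- A minimal AprilTag detection sketch for the OpenMV Cam, following the pattern of the IDE's AprilTag examples. Treat it as a starting point only; detection works best at low resolutions such as QQVGA.
import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)   # AprilTag detection is slow at higher resolutions
sensor.skip_frames(time=2000)

while True:
    img = sensor.snapshot()
    for tag in img.find_apriltags():   # defaults to the TAG36H11 family
        print("id:", tag.id(), "center:", tag.cx(), tag.cy())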
- Emotional Support Robot
- Object recognition using Google’s Vision API
- Sample code
- When sending the request using curl, use this (replace YOUR_API_KEY with your own API key):
curl -X POST -H "X-goog-api-key: YOUR_API_KEY" -H "Content-Type: application/json; charset=utf-8" -d @request.json "https://vision.googleapis.com/v1/images:annotate"
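- The same request can also be sent from Python instead of curl. This is a minimal sketch that assumes the requests library is installed, an image file named image.jpg, and the LABEL_DETECTION feature; adjust the feature list to whatever you need.
import base64
import requests

API_KEY = "YOUR_API_KEY"   # replace with your own key

# Build the same request body that would otherwise go into request.json
with open("image.jpg", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

body = {
    "requests": [{
        "image": {"content": content},
        "features": [{"type": "LABEL_DETECTION", "maxResults": 5}],
    }]
}

resp = requests.post(
    "https://vision.googleapis.com/v1/images:annotate",
    headers={"X-goog-api-key": API_KEY},
    json=body,
)
print(resp.json())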
- Text-to-Speech: If using a microcontroller, you’ll need hardware for this, either https://vi.aliexpress.com/item/1005008596064970.html or https://vi.aliexpress.com/item/1005008572550921.html
- Text-to-Speech alternative: If using a computer, you can do text-to-speech on the computer itself with no added hardware (see the sketch below).
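- For example, one option is the pyttsx3 library (pip install pyttsx3), which uses the operating system's built-in voices; cloud TTS APIs are another option.
import pyttsx3

engine = pyttsx3.init()                     # uses the OS's built-in speech engine
engine.say("Hello, I am your emotional support robot.")
engine.runAndWait()                         # blocks until speech is finished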
- Convert images to suitable format for CYD
- Reading materials for soft robot
- Snail Eliminator
- Drone guided garbage collector
Sec 1
- Please try this tutorial
- Pybricks Documentation
CoSpace
- For beginners
- Basic tutorial on C coding: http://learn-c.org (you only need to do "Learn the Basics")
- Design Patterns (ODP / PDF)
- Handling Durations (ODP / PDF)
- Set up compilation in VS Code
- Open your CoSpace C file in VS Code.
- Press "Ctrl+Shift+B". You should see a message "No build task to run found. Configure build task". Click on that message.
- Click on “Create tasks.json file from template”
- Click on “Others”. This should open a new “tasks.json” file.
- Delete everything in your "tasks.json" and replace it with the content of this link. Save your "tasks.json" (Ctrl+S).
- Go back to your C file and press "Ctrl+Shift+B". Your file should now be compiled into a DLL.
Robocup OnStage
- Detecting the balls
- OpenCV (computer vision library). You'll need to install this on a laptop; the Python version is recommended.
- Complete color blob detection tutorial (a minimal sketch also follows below).
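- A minimal color blob detection sketch in Python with OpenCV, assuming a webcam at index 0; the HSV range below is a rough placeholder and must be tuned for your balls and lighting.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                 # default webcam
lower = np.array([5, 120, 120])           # HSV lower bound (tune this)
upper = np.array([20, 255, 255])          # HSV upper bound (tune this)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)                 # white where the color matches
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:                      # ignore tiny specks
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("balls", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):                 # press q to quit
        break

cap.release()
cv2.destroyAllWindows()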
- Sending commands
- When sending data from an OpenMV Cam to an ESP32, the easiest way is to use a UART connection.
- To wire the UART connection, connect the TX pin of one device to the RX pin of the other device (and vice versa). Also connect the GND pins of the two devices together.
- OpenMV Cam. Read the docs to learn how to send data. You can find the TX and RX pin positions here (use UART3; avoid UART1). When sending data, it's usually easiest to send strings. Be sure to add a '\n' at the end of each line.
- ESP32. Read this page on initializing the UART and the TX/RX pin numbers, and this page on how to use readline. Do not use UART0 (UART2 is fine, and you can use its default pins). A minimal sketch of both sides follows below.
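- A minimal sketch of both sides in MicroPython. These are two separate scripts (one per device); the baud rate is arbitrary as long as both sides match, and the ESP32 default pins noted below are an assumption for common dev boards.
# Script 1: runs on the OpenMV Cam (UART3, as recommended above)
from pyb import UART

uart = UART(3, 115200)
uart.write("ball,57,103\n")        # send readings as a string, terminated by '\n'

# Script 2: runs on the ESP32 (UART2 with its default pins, typically TX=17 / RX=16)
from machine import UART

uart = UART(2, baudrate=115200)
line = uart.readline()             # returns None if no data is waiting
if line:
    print(line.decode().strip())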
Robocup Rescue Line (OpenMV Cam)
- If you are using the OpenMV Cam, you’ll need to…
- Detect the black line and green box. Install the OpenMV IDE. Open the IDE and load the multi-color blob tracking example. Try it out, and tune it to detect the black line.
- When tuning, set the preview to LAB mode, select a region that you're interested in (eg. the black line), and look at the histogram to determine a suitable threshold. Note that in LAB color space, L represents lightness, and should be low for black.
- By default, the example uses the entire frame for detection. That's probably too large; use the crop function (or the roi argument of find_blobs) to reduce the detection area. A minimal sketch follows below.
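- A minimal sketch of multi-color blob detection restricted to a region of interest. The LAB thresholds and ROI below are placeholders; tune them with the IDE's threshold editor as described above.
import sensor

sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)          # 160x120
sensor.skip_frames(time=2000)

BLACK = (0, 40, -30, 30, -30, 30)           # low L for the black line (tune)
GREEN = (30, 70, -80, -20, 10, 60)          # negative A for the green box (tune)
ROI = (0, 80, 160, 40)                      # x, y, w, h: bottom strip of the frame

while True:
    img = sensor.snapshot()
    for blob in img.find_blobs([BLACK, GREEN], roi=ROI,
                               pixels_threshold=50, area_threshold=50, merge=True):
        # blob.code() is a bitmask showing which of the two thresholds matched
        print(blob.code(), blob.cx(), blob.cy(), blob.pixels())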
- If you're using the Spike Prime, you should use PUPRemote to communicate between the OpenMV Cam and the Spike Prime. Download the Python files from this GitHub page. pupremote.py needs to be uploaded to your OpenMV Cam, while pupremote_hub.py needs to be uploaded to your Spike Prime (via the Pybricks software).
- Other examples.
- The OpenMV IDE contains an example of line following code (Image Processing -> Black grayscale line following).
- Anton's Mindstorms provides another example of line following code here.
- Neither is recommended, because: 1) you should write your own code, and 2) both of these are inadequate for Robocup; you'll need to modify them, so it's better to start with simple code that you can understand well.
- But feel free to read through and understand how these examples work.
Robocup Rescue Line (EV3)
- Tutorial on different methods of reading from an OpenMV Cam or ESP32 without blocking
- Use this when you have an EV3 or PC reading data from the serial port.
- Not needed if the reader is a MicroPython device (eg. an ESP32), as that uses UART read, which is non-blocking by default. A minimal sketch for the PC/EV3 case follows below.
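- One common non-blocking pattern on a PC or EV3 (ev3dev), sketched below, is to open the port with pyserial using a zero timeout so reads return immediately, then buffer the bytes until a full '\n'-terminated line arrives. The port name is an assumption, and the linked tutorial may use a different approach.
import serial

ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=0)   # timeout=0: reads never block
buffer = ""

while True:
    data = ser.read(1024)                  # returns immediately, possibly empty
    if data:
        buffer += data.decode(errors="ignore")
        while "\n" in buffer:
            line, buffer = buffer.split("\n", 1)
            print("received:", line.strip())
    # ...the rest of the robot loop keeps running without waiting here...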
- Docs for line sensor
Robocup Rescue Line (Generic stuff)
These slides are old, so the sample code is based on the old EV3 software. You won't be able to use it directly, but the concepts and approach remain the same.
- Briefing Slides (ODP / PDF)
- Single Line Follower (ODP / PDF)
- Double Sensors Line Following (ODP / PDF)
- Ending the Loop (ODP / PDF)
- Turn on Green (ODP / PDF)
- Obstacles Avoidance (ODP / PDF)
- Handling Inclines (ODP / PDF)
- Tiles package for printing on A3 paper
- Video of a common and successful design
- Video of an unconventional design
- Difficult Lines Videos
IoTy
IoTy is a platform for programming the ESP32 using blocks or Python. This is useful for OnStage, Robocup Rescue Line (if you're building non-Lego robots), and for general electronics projects (eg. for the WRO Open Category).
- Link to IoTy
- Intro to ESP32 and IoTy
- Connecting
- Working with Breadboards
- Analog Output
- Digital Input
- Ultrasonic Sensor
- Neopixel
- HSV
- IoT with IoTy