Micro Swarm
Last updated
Micro Swarm is a project for the Micro Challenge Course in the Master in Design for Emergent Futures (Year 1, 2024-2025) that tests 'swarm intelligence' using a series of Barduino (ESP32) devices. With input from a human user (via a vibration sensor), a note is played on a single 'Reference' device. Using OpenAI, a harmonizing note is determined and relayed to the 'Cell' devices via MQTT. The Cell devices then play the note to create "swarm music".
Carlos Silveira: Code for playing notes, OpenAI integration, and modeling for the carapace
Flavio Grimaldi: Code for playing notes, sensing environment, and OpenAI integration
Lucretia Field: Code for MQTT protocol and integration with OpenAI code
The proposed system is composed of a single "Mother" or "Reference" device that reads input from a sensor and sends out a signal to the "Child" or "Cell" nodes to make certain noises in response to the stimuli.
The Reference node processes the input from a sensor to determine the frequency of a note to play on the onboard piezo buzzer, while simultaneously sending a query through the OpenAI API to a ChatGPT prompt that determines harmonizing frequencies for the desired note. The harmonizing note is then conveyed to the Cell nodes using the Message Queuing Telemetry Transport (MQTT) publish-subscribe protocol on a local WiFi network through an existing MQTT broker server. When the Cells receive the frequency, they activate their onboard buzzers to play the sound, creating the harmonious din of cicadas.
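As a rough sketch of the note-to-frequency step: if a note is represented as an index into the 88 keys of a piano (1 = A0, 49 = A4), the buzzer frequency follows from the standard equal-temperament formula. This is an illustration of the arithmetic in Python, not the project's actual firmware (which runs as an Arduino sketch on the Barduino):

```python
def note_index_to_frequency(n: int) -> float:
    # Twelve-tone equal temperament over the 88 piano keys,
    # anchored at A4 (key 49) = 440 Hz.
    return 440.0 * 2 ** ((n - 49) / 12)

print(round(note_index_to_frequency(49)))      # A4 -> 440
print(round(note_index_to_frequency(40), 2))   # middle C (key 40) -> 261.63
```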
When a predator makes a vibration close to the Mother node, the node sends out a "danger" signal to alert the Child nodes. A predator's approach is detected through a piezo vibration sensor on the Reference device. When the sensed vibration falls within an acceptable range (filtering out sensor noise), the value is mapped to one of the indices of the 88 notes, which is passed to the OpenAI prompt. The prompt requests that three harmonizing notes be determined from the given note and returned as an array.
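The filter-and-map step and the shape of the prompt can be sketched as follows. The noise threshold, the ESP32's 12-bit ADC range, and the exact prompt wording are illustrative assumptions, not the project's real values:

```python
import json

NOISE_FLOOR = 200   # assumed threshold: readings below this are sensor noise
ADC_MAX = 4095      # assumed full-scale reading of the ESP32's 12-bit ADC

def map_range(x, in_min, in_max, out_min, out_max):
    # Integer remap, equivalent to Arduino's map()
    return (x - in_min) * (out_max - out_min) // (in_max - in_min) + out_min

def vibration_to_note_index(reading):
    # Discard readings outside the acceptable range, then map to keys 1..88
    if reading < NOISE_FLOOR or reading > ADC_MAX:
        return None
    return map_range(reading, NOISE_FLOOR, ADC_MAX, 1, 88)

# Hypothetical prompt: the model is asked for three harmonizing key
# indices returned as a plain JSON array, which is trivial to parse.
PROMPT = ("Given piano key index {n} (1-88), return exactly three "
          "harmonizing key indices as a JSON array, e.g. [40, 44, 47].")

def parse_harmony(reply):
    notes = json.loads(reply)
    return notes if len(notes) == 3 else None
```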
The Reference node then relays a different frequency to each of the Cells so that they can respond individually, making sure all of the children have heard the call. This is done by publishing each desired frequency to a different 'topic', to which each Cell subscribes to receive only the frequency meant for its buzzer.
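The per-Cell fan-out reduces to pairing each harmonizing frequency with one Cell's own topic. A minimal sketch, assuming a hypothetical topic scheme (`microswarm/cell/<id>`); with a real MQTT client such as paho-mqtt, each pair would be sent with `client.publish(topic, payload)` against the existing broker:

```python
def fan_out(frequencies, base_topic="microswarm/cell"):
    # Assign each harmonizing frequency to one Cell's topic so that
    # every Cell receives only the note meant for its buzzer.
    return [(f"{base_topic}/{i}", str(freq))
            for i, freq in enumerate(frequencies, start=1)]

print(fan_out([440.0, 523.25, 659.25]))
```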
The code for the Cells is much simpler: each Cell subscribes to a topic, and when a new message comes in from the Reference, it plugs that value into the buzzer frequency and plays the note.
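The Cell-side logic amounts to a subscribe callback: when a message arrives, parse the payload as a frequency and drive the buzzer. Sketched here in Python with a stub buzzer; on the Barduino this would be the MQTT client's message callback calling `tone()` on the piezo pin:

```python
class Buzzer:
    # Stand-in for the piezo buzzer; on the device this would call tone().
    def __init__(self):
        self.frequency = 0.0
    def play(self, freq):
        self.frequency = freq

buzzer = Buzzer()

def on_message(topic, payload):
    # Any message on the Cell's subscribed topic carries the frequency to play.
    buzzer.play(float(payload))

on_message("microswarm/cell/1", "440.0")
print(buzzer.frequency)
```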
The carapace has an insect-like appearance and is designed to be adaptable, with internal fittings that speed up assembly of the electronic components for both the Mother and the Cell devices. It was intentionally shaped to reverberate the acoustics from one chamber to the other and amplify the sound emitted by the buzzer inside the artificial insect. The structure is divided into two parts: a flexible silicone base with four legs, each ending in a suction cup so the device can be attached to vertical or horizontal surfaces, and an external shell, 3D printed in PLA, that covers the Barduino board, battery, and piezo sensor. The shell protects the components from external interference and maximizes the sound through its oval shapes and seven holes on top that let air flow from the chamber to the outside.
To produce the carapace, a Bambu Lab 3D printer was used for the shell, a Creality Ender printed the mold for the silicone bottom layer, and Blender and Adobe Illustrator were used to define the shapes and forms. It is worth highlighting the several changes made during the project: the original shape evolved into a new version that accounts for the batteries that could be used with the Barduino and how they would be distributed inside the carapace without interfering with each other, which resulted in a more anatomical shape designed specifically for the board and components of this Micro Challenge.
Lucretia: From the beginning we were interested in exploring different inputs to the system. For a future iteration of the project, we think it would be very cool for the input to come from a MIDI controller, where a human could provide a note or series of notes for the 'cicadas' to harmonize with. Instead of mimicking a danger sense in animals in nature, this functionality could allow humans and insects to work together to create a musical atmosphere.
Carlos: Considering the logical nature of the communication between different individuals in a swarm, this project could be used to make humans aware of human and insect environmental issues, as something camouflaged in nature giving signs to humans and other animals. One of the possibilities I would like to explore is mimicking the frequencies of specific living beings to communicate and create an artificial-biological relationship that triggers actions. This would have true potential for environmental measurement and care, and could be used in different biomes with easy adaptability. In addition, it could provide a glimpse of the possibilities for communicating with different individuals from one reference of control, something that could truly be applied to my research in biohybrid robots. Lastly, it would be delightful to improve the robots and give AI autonomy to each one of them.
Flavio: Future developments of this project will explore how ethnocognition influences the way individuals and communities synchronize, adapt, and co-create shared rhythms. By expanding the system to allow real-time behavioral adaptation, the devices could dynamically adjust to human presence, movement, or cultural soundscapes, reflecting how people negotiate space and interaction in urban environments. By integrating machine learning, the system could evolve its harmonization patterns based on community engagement, turning public spaces into interactive sound ecosystems. This could serve as a metaphor for cultural integration, where different backgrounds contribute to a fluid, ever-evolving collective identity through shared sensory experiences.