Future Robots.
In collaboration with DeepMind, first-year students on the Central Saint Martins MA Material Futures course imagined how technologies could be deployed to tackle some of the most complex and pressing issues that we face today and in the future. The outcomes, developed by pairs of students, were exhibited at the college’s Lethaby Gallery in King’s Cross in March 2022.
As part of the UAL Sustainability Alumni Network, I was able to go along to a private tour of the exhibition led by MA Material Futures course leader Kieran Jones. I don’t particularly like the guided headphone tours you can get at galleries, mostly because I will easily skip past something that doesn’t catch my eye, so this was an almost forced opportunity to really dig into the bones of each project. They were all fascinating in their own right, though I did have questions I would have put to the student pairings, which I will share along with each project overview.
“Acclaimed Waste” ⇾ (Above, middle image). There is no universal classification framework for plastic types, but machines can learn to identify them from their molecular signature, shape and colour (a rough sketch of the idea follows below). That capability could enable hyperlocal recycling, where councils would have more control, for instance by implementing systems suited to local materials. This in turn could create a local vernacular: orange plastic transformed into orange yarn for orange school uniforms, for example. The students worked with refugees to understand the items that are missing from food bags, and produced them using this technology.
◦ I liked the local vernacular aspect of this, making a connection between places and waste, though no authority is going to allow all waste to stay in one place. The link to refugees was interesting, and gave a practical commercial value to the technology, while raising questions: why should refugees receive products made only from waste anyway, and why aren’t the items they need already included in supply bags?
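To make the classification idea concrete, here is a minimal, hypothetical sketch, not the students’ actual system: a stock scikit-learn classifier trained on invented features (a near-infrared spectral peak, hue and roundness) standing in for the molecular, shape and colour signals described above.

```python
# A minimal sketch of the idea behind "Acclaimed Waste": a classifier that
# learns plastic types from simple measurable features. The feature values
# and labels here are hypothetical placeholders, not the students' model.
from sklearn.ensemble import RandomForestClassifier

# Hypothetical training data: each row is one scanned item, described by
# [near-infrared peak (nm), hue (0-360), roundness (0-1)].
features = [
    [1660, 30, 0.8],   # orange PET bottle
    [1730, 120, 0.3],  # green HDPE fragment
    [1660, 35, 0.7],   # orange PET lid
    [1690, 210, 0.5],  # blue PP container
]
labels = ["PET", "HDPE", "PET", "PP"]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(features, labels)

# A council's sorting line could then route new items to hyperlocal streams.
print(model.predict([[1665, 32, 0.75]]))  # likely "PET"
```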
“Redline” ⇾ (Above, right image). This addressed how facial expressions could be used to advertise to us more effectively, for example by detecting soft trauma and deciding what to market off the back of it. However, the student duo concluded that the technology was in fact too dangerous, with dire consequences, and chose not to finish the project, exhibiting only their research and concerns instead. Technology can be democratic, but what methodologies or frameworks must be implemented to protect the democratic process?
◦ I appreciate that using our expressions to sell us products goes too far. At what point do we lose all control? Do we even really know what happens when we “accept cookies”? What information are we giving, and to whom, if we can’t choose which emotions we give away?
“Precision Farming Blimp” ⇾ (Left image). This is a drone disguised as a weather balloon, designed to gather agricultural data without feeling invasive. The balloon shares its data with other farms to create collective knowledge: if Farm A has experienced blight, for instance, Farm B can prepare (see the sketch after my note below).
◦ As a grower I can imagine this working in practice, and can appreciate that the redesign from drone to weather balloon would ease suspicion. For parts of the world that are very rural, or where telecoms are unreliable, this could have a great impact in building resilient communities. It’s a neutral data collection device, and makes for a neutral, easily transferable model.
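As a toy illustration of that collective-knowledge loop, here is a hedged Python sketch, with invented farm names, coordinates and alert radius: one farm reports blight, and any neighbour within range is notified.

```python
# A toy sketch of the blimp's "collective knowledge" idea: one farm reports
# an observation and nearby farms are alerted so they can prepare. Names,
# coordinates and the distance threshold are invented for illustration.
import math

farms = {"Farm A": (51.50, -0.12), "Farm B": (51.52, -0.10)}

def neighbours_within(origin, km):
    """Rough distance filter using an equirectangular approximation."""
    lat0, lon0 = farms[origin]
    for name, (lat, lon) in farms.items():
        if name == origin:
            continue
        dx = (lon - lon0) * 111 * math.cos(math.radians(lat0))  # km east-west
        dy = (lat - lat0) * 111                                 # km north-south
        if math.hypot(dx, dy) <= km:
            yield name

def report(origin, observation, radius_km=25):
    for farm in neighbours_within(origin, radius_km):
        print(f"ALERT to {farm}: {origin} reports {observation}")

report("Farm A", "potato blight detected")  # Farm B can now prepare
```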
“City Aid” ⇾ (Right image). This student pairing reimagined drones as a fifth emergency service. The drones share real-time air quality data to highlight the danger associated with air pollution. Low traffic neighbourhoods have been established in certain areas of London (with bus gates preventing ordinary vehicles from passing through, admitting only buses and bikes), creating noticeable build-ups of traffic elsewhere, primarily in low-income areas, because that is where the main thoroughfares are. Drones acting as an emergency service could provide oversight of traffic jams, and therefore of air quality, making traffic more democratic (a small sketch follows my note below).
◦ I could see the starting point, and remember nodding along when the situation with low traffic neighbourhoods was mentioned; it is something I hadn’t previously considered as a campaigner raising the profile of such strategies in Hackney. Speaking later to those living in low traffic neighbourhoods, I realised that there is a need to monitor the wider impact of such spontaneous policies.
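For illustration only, a small Python sketch of how pooled drone readings might flag a build-up per neighbourhood; the readings, area names and alert threshold are all invented, not real measurements or official limits.

```python
# A small sketch of the "City Aid" idea: drone readings pooled per
# neighbourhood, with a flag raised where pollution builds up.
from collections import defaultdict
from statistics import mean

# (neighbourhood, NO2 in micrograms per cubic metre) from passing drones
readings = [
    ("Hackney", 38.0), ("Hackney", 41.5), ("Hackney", 44.2),
    ("Islington", 18.3), ("Islington", 21.0),
]

THRESHOLD = 25.0  # illustrative alert level, not an official limit

by_area = defaultdict(list)
for area, value in readings:
    by_area[area].append(value)

for area, values in by_area.items():
    avg = mean(values)
    status = "ALERT: traffic build-up likely" if avg > THRESHOLD else "ok"
    print(f"{area}: mean NO2 {avg:.1f} ug/m3 -> {status}")
```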
“Smacksystem” ⇾ (Left image). Coral bleaching happens when waters become too warm and corals expel the algae living in their tissues, turning completely white. Although reversible, bleaching displaces ecosystems while the coral recovers. This project considered how technology could be used to transplant bacteria via a fermentation vessel dropped by a drone.
◦ As with a few of the projects, it seemed like too late a fix. Coral bleaching doesn’t need to happen, so rather than focussing on technology that goes in after the fact, focus on tech that can ensure corals aren’t bleached in the first place. Implanting bacteria could still upset the ecosystem, and might not take.
“Let’s Talk Dirty (Air)” ⇾ (Right image). Essentially an anti-social air pollution device. Worn by people in urban environments, the suit acts as a physical barrier to others while simultaneously collecting data about the air pollution around the wearer. It is interesting that this emerged during Covid-19.
◦ This felt neutral: a nice fashion-based project that could hold space in the wearable tech realm without being outwardly aggressive in its use of machine learning. A data collection device that was also fun to wear and use.
“Anthem” ⇾ (Left image). This is a ‘meadow musician’ that plays the optimal music for its season, location and ecosystem in order to attract pollinators. It can also learn the sounds being played around it and change what it subsequently plays. It questions what a robot looks like, considering that the robot here is a log and a biomaterial flower.
◦ My concern with this concept was that it would be disrupting something unknown; unless you immerse yourself in the environment first to pick up all of its sounds, how would you know that what you play isn’t going to do more harm than good?
“Future Foraging” ⇾ (Right image). A simulated foraging pig, in the form of a foraging basket, directs you to truffles (or other such foraged goodies) through its senses. It then maps ecosystems as a sort of environmental steward. The basket is designed with natural materials so that it fits well within the art of foraging, but I find the whole premise of using tools to forage unruly.
◦ We’ve all discovered how plant ID apps need cross-checking, so how can an AI fully learn all of the external indicators for foraging? And though it suggests stewardship, I would argue that it takes the excitement out of foraging something yourself, with your own skills and judgement calls. This was the one project I wasn’t convinced by.
“Synaesthesia Translator” ⇾ (Left image). How does a person with synaesthesia translate their language for others who don’t have it? Each box contained a different material; the person with synaesthesia described the colours and patterns they saw upon feeling them, and machine learning was used to digitise those visions (a loose sketch of the idea follows below). This formed the basis of a language so that others might understand.
◦ This project was utterly fascinating, as I know nothing about how this neurological condition works, yet the final outcome explained in practical terms how someone with synaesthesia and someone without might communicate. How could one person truly understand what someone else was sensing? Beautifully and simply presented.
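A loose sketch of how such a translation might be learned; the tactile features and reported descriptions below are invented stand-ins, not the students’ data or method.

```python
# A sketch of the "Synaesthesia Translator" idea: learn a mapping from
# tactile material properties to the colours one synaesthete reported,
# then "translate" a new material. All values here are invented.
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical tactile features per box: [roughness 0-1, hardness 0-1, warmth 0-1]
materials = [
    [0.9, 0.8, 0.2],  # coarse stone
    [0.1, 0.2, 0.8],  # soft wool
    [0.4, 0.9, 0.3],  # smooth metal
]
reported = ["jagged red", "swirling amber", "cool blue lines"]

translator = KNeighborsClassifier(n_neighbors=1).fit(materials, reported)

# Someone without synaesthesia feels a new material; the model offers the
# nearest learned description as a shared vocabulary.
print(translator.predict([[0.85, 0.75, 0.25]]))  # -> ["jagged red"]
```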
“Independent Phonic Operator” ⇾ (Right image - note, I didn’t have one, so took this from the exhibition trailer). This is a cool machine learning musician that jams with you. You play the instrument as normal (it had strings and a drum), the recording is interpreted by a machine that learns your tastes, and it starts to play music with the same vibe. They presented a live show of this in action.
◦ It’s a fairly neutral use of AI and felt safe; it wasn’t using AI for anything dramatic, and yet it showcased how humans and AI could potentially collaborate democratically, giving each a voice.
Want updates on blogs like these? Sign up to my publication on Substack.