post obj.
Animal Voice Translator_
Bridging the human and non-human interconnection
FUTURE BAZAAR
Future Bazaar is a design fiction workshop developed by Situation Lab and BBC and hosted by Søren Rosenbak at Umeå Institute of Design. The guiding question of the workshop was: ‘What will life and systems look like in 2048?’ Post obj. is shaped as an entity in this world using a future-backcasting method.
POST OBJ
Post obj. is a maker space that works as an open platform
_recrafting technology through collective knowledge
ANIMAL VOICE TRANSLATOR_INTERFACE
The Animal Voice Translator is an object created through post obj.’s open-source platform. The aim is to build an understanding of the animal world and to create a bond with nature in 2048.
The interface consists of three parts: listening_non-human, speaking_human, and the results of the dialogue, expressed as emotional and physical responses.
AUDIOVISUALS_BRIDGING SOUND AND EMOTIONS
I used TouchDesigner to turn the voice into an emotion map. Randomised shapes and selected color ranges, driven by the embedded sounds, form a poetic bridge between sound and emotion.
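A minimal sketch of this kind of sound-to-visual mapping, written in Python (the scripting language TouchDesigner exposes) rather than as the actual node network used in the project: loudness drives the size of the randomised shapes, while spectral brightness picks a point in the color range. All names and thresholds are illustrative assumptions.

```python
# Illustrative sketch only, not the original TouchDesigner network.
# Reduces a mono audio buffer to a few visual parameters.
import numpy as np

def emotion_map(samples: np.ndarray, sample_rate: int) -> dict:
    """Map an audio buffer to shape and color parameters."""
    # Loudness (RMS) controls the size of the randomised shapes.
    rms = float(np.sqrt(np.mean(samples ** 2)))

    # Spectral centroid (brightness of the sound) selects a hue
    # within the chosen color range: low = warm, high = cold.
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))

    hue = float(np.interp(centroid, [100.0, 4000.0], [0.08, 0.62]))
    return {
        "shape_scale": float(np.clip(rms * 10.0, 0.1, 1.0)),   # louder -> bigger
        "shape_count": int(np.clip(centroid / 200.0, 3, 40)),  # brighter -> busier
        "hue": hue,
    }

# Example: one second of a synthetic "animal voice" at 44.1 kHz.
if __name__ == "__main__":
    sr = 44100
    t = np.linspace(0, 1, sr, endpoint=False)
    voice = 0.3 * np.sin(2 * np.pi * 440 * t) * np.exp(-3 * t)
    print(emotion_map(voice, sr))
```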
POST-BRANDING
The post obj. identity is designed using post-branding, an approach that empowers better public communication design for civic and activist groups.
During the project, I gained valuable experience by designing the physical product and its digital interface together. This also allowed me to question the possible visualisations for a specific product. Moreover, the future-backcasting method laid a strong foundation for a future brand that stems from today’s open-source and grassroots community movements.
I used TouchDesigner, a node-based visual programming language, for the first time to create audio-reactive visuals. Iterating with its various audio nodes allowed me to try different visualisations of non-verbal communication.