
Machine Visions



Performance Series at Osage Gallery Hong Kong

Peter Nelson

Roberto Alonso Trillo

Marek Poliks

19 November 2022 - 14 January 2023




This exhibition explores how machine learning tools are being integrated into artistic practice. The works on show are the result of a two-year exploration of how machine learning can be used to synthesise music and 3D objects. In the time elapsed since the commencement of this project, online visual culture has reacted to and absorbed a host of new techniques, from image recognition to style transfer, to natural language synthesis and, more recently, the text-to-image synthesis pipelines offered by tools such as Midjourney and DALL-E. Beneath these rapidly evolving creative toolkits lies a common computational approach: a dataset, a neural network, and a newly synthesised output based on the features the network can extract from the original dataset. As the utility of these tools and the quality of their results improve, they have spawned cultural debates, such as who 'owns' the collective cultural databases on which these systems are trained, and who therefore owns the works these systems generate? Is there a tipping point at which the balance of human creative input to automated machine output shifts so far that we no longer consider the human to be the author of the work? In her overview of modern visual communication, Johanna Drucker notes that representational strategies evolve historically with changes in technological production, from the relationship between 16th-century developments in optics and Renaissance painting to mechanised assembly lines and the industrial geometric abstractions of modernist artists such as Paul Klee and Wassily Kandinsky. Considered in this broader trajectory, what we are witnessing is human creativity once again adapting to a paradigm shift, namely that of automation and artificial intelligence.
It would be difficult to produce a definitive exhibition of how machine learning is changing the creative process, simply because these techniques are being integrated so quickly and across such a wide range of applications and tools. Instead, this exhibition presents a bespoke exploration of three techniques: synthesising three-dimensional shapes, synthesising music, and synthesising human motion. We present artworks, sound installations, and musical performances made with these tools, alongside educational panels explaining the machine learning approaches behind them. We hope this exhibition makes a modest contribution to the rapidly evolving conversation about machine learning, artificial intelligence, and creativity.


Peter Nelson and Roberto Alonso Trillo, 2022.


Performance Series

Performance 1

RSVP to Performance #1 Here

Performance 2

RSVP to Performance #2 Here

Performance 3

RSVP to Performance #3 Here

Performance #3, the second performance of the Archon system, brings together musicians from Spain, Hong Kong, and the USA, and will also include a special panel discussion led by Peter Nelson and Roberto Alonso, with artists Marek Poliks, Karen Yu, and Angus Lee, in partnership with the Australia Council for the Arts Engaging Influencer Series. Following the Archon performance, artists and curators will discuss the impact of machine learning on musical performance and speculate more broadly on how machine learning will alter the shape of the creative landscape in East Asia. Special guests to be announced soon!

Performance 4

RSVP to Performance #4 Here

Performance #4 is a special addition to our program, in which artist Peter Nelson, musician Roberto Alonso, and dancer Sudhee Liao will share a number of art and technology performances they have been developing over the past two years, combined into a special 30-minute composition.


A performance in two parts, the Debris project is a series of commissions of new music for fixed electronic media that exclusively explore, through the application of sound transformation techniques (DSP, fluid corpus manipulation/granular synthesis, physical modelling synthesis, etc.), the materials found in the Demiurge’s Debris database. The project has been shaped by an interdisciplinary committee: Marek Poliks (Berklee), Roberto Alonso (HKBU), Pablo Coello (Vertixe), Ramón Souto (Vertixe), Ángel Faraldo (Phonos), and Jaime Reis (DME).


Performance 5

RSVP to Performance #5 Here