This is the very first test result of ‘AI-Enhanced Cyber-Physical Drone Art: Humachine’, a project we've been working on for a while.
*All visuals in the project are generated in real time using the drones’ spatial/motion-analysis data.
Sixty years ago, American computer scientist J.C.R. Licklider proposed a prescient vision of computing devices coupled with human brains.
That vision has inspired generations of scientists and engineers, and it is essentially the basis of our artistic approach today. The work is inspired
by Licklider's vision of a complementary relationship between humans and machines; in other words, the idea of technology as an enabler of human potential.
This experiment aimed to use artificial intelligence as an extension of the human body and to practice it as a tool of indirect self-expression.
A new interface developed by our team enabled us to create a hybrid connection between the micro drones, the performer, and a computer.
The micro drones were interconnected through a drone-swarm system; however, it was a real challenge to let the drones know their exact positions relative to each other and
to the performer, since the project was held indoors and GPS tracking was impossible. We trained the micro drones' tracking system using mocap
technology and a local positioning system, which we built from spatial/motion analysis, three different programming languages, and a game engine. This
human-machine symbiosis made a previously unattained form of expression possible.
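The core of such an indoor setup can be sketched in a few lines. This is a minimal, hypothetical illustration (not the project's actual code): it assumes a mocap system streams 3-D positions for the performer and each drone, and shows how a drone's position can be expressed in a performer-centred frame and how a follow target at a fixed offset from the performer could be computed. All names (`Pose`, `relative_position`, `follow_setpoint`) are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A 3-D position in the mocap coordinate frame (metres)."""
    x: float
    y: float
    z: float

def relative_position(drone: Pose, performer: Pose) -> Pose:
    """Express a drone's mocap position relative to the performer."""
    return Pose(drone.x - performer.x, drone.y - performer.y, drone.z - performer.z)

def follow_setpoint(performer: Pose, offset: Pose) -> Pose:
    """Target position keeping the drone at a fixed offset from the performer."""
    return Pose(performer.x + offset.x, performer.y + offset.y, performer.z + offset.z)

# Example: performer at (1, 2, 0); drone should hover 1.5 m above them.
performer = Pose(1.0, 2.0, 0.0)
target = follow_setpoint(performer, Pose(0.0, 0.0, 1.5))
```

In a real system, each frame of mocap data would update `performer`, and the resulting setpoints would be streamed to the swarm's flight controllers.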
It is very motivating to be able to take an existing drone-swarm system to another, bespoke level for our specific needs. As a first outcome, human
abilities are enhanced and a new form of artistic expression is attained. We have begun to see human and machine as full partners in artistic creation through
a symbiotic relationship, and we're very excited to explore the opportunities that abound!