These are instruments because they are designed for live video performance, either in front of an audience or as part of the development and creative process of a visual narrative.

Originally they were conceived as tools for writing and quickly prototyping interactive narratives. In my experience it is really cumbersome to write, shoot, edit, and program a visual narrative just to see how the interactive montage works and what semantic relations are triggered. That was my main reason for building these instruments. After working with them, I discovered a lot of other potential uses :D

Mu_Table was made in Max/MSP with some JavaScript.

YeiTlahuiMeca is being made in Quartz Composer and Xcode.