Although the notion of choreography has not disappeared from interactive performance and virtual art, it is being re-evaluated in terms of how bodily movement produces data, and how a performer/immersant engages with an interface environment that is dynamic, programmable and networked – and thus open to unpredictable and emergent states. Unpredictable states and “affective computing”? What kind of data does computation handle, and how? For dancers working with wearable sensors or camera-based motion-sensing environments, real-time processing means surrendering much of the kinesthetic and expressive quality of experiencing movement in-space-over-time to the (limited) ways in which programming parameters can “read” incoming data or map it back to visual or audio output. What are the mappings of somatic affect? That depends, of course, on what someone wants to get out of interactive constellations, and on what programmers are satisfied with. From the dancer’s perspective, relying on system feedback is not illuminating, since it cannot easily be re-internalized in the way physical processes are in reality. Does VR, then, offer a more pertinent context? The articulation of sound and light (images) in augmented reality, through “kinaesonic” gestures, can create tactile feedback in the projected environment outside the performer’s body. Yet a disjuncture remains: the data acquired from bodies (turning bodies into instruments or biophysical objects to be played, or extracted from) drive other temporal objects in the environment without being subjectively felt or “known” by the dancer. Nor has the machine learnt much about gender, race, age, and differing abilities.
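To make concrete how limited such parameter mappings can be, the following is a minimal, purely illustrative sketch (all function names, ranges and thresholds are hypothetical, not drawn from any particular performance system): a stream of wearable-sensor samples is collapsed into one scalar “energy” value, which then drives a pitch and a brightness parameter.

```python
# Hypothetical sketch: how an interactive system "reads" a dancer's movement.
# A rich gesture arrives as (x, y, z) accelerometer samples and is collapsed
# into scalar control parameters for audio and visual output.
import math

def magnitude(sample):
    """Overall acceleration magnitude of one (x, y, z) sensor sample."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linear mapping from an input range to an output range, clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def gesture_to_parameters(samples):
    """Reduce a whole gesture to two numbers driving sound and light."""
    energy = sum(magnitude(s) for s in samples) / len(samples)
    pitch_hz = map_range(energy, 0.0, 3.0, 110.0, 880.0)   # audio parameter
    brightness = map_range(energy, 0.0, 3.0, 0.0, 1.0)     # visual parameter
    return pitch_hz, brightness

# Note what the reduction discards: a slow, sustained reach and a sharp
# staccato accent with the same average energy yield identical output.
```

The point of the sketch is exactly the disjuncture described above: whatever kinesthetic nuance the gesture carried, the system only “knows” the scalars that survive the mapping.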