This journal article describes the original technology and aesthetic of an audiovisual duo who have been collaborating for four years, including discussion of mapping strategies, real-time audiovisual object segmentation, and audiovisual feedback. Rather than founding analysis primarily on the visual domain, or relying on gross measures of an audio signal, the authors consider the close synchronicity of audiovisual objects through message passing between modalities. Further, live audio analysis and processing can be used to gain video processing capabilities essentially for free: a novel technique in the paper leverages real-time audio segmentation to tag frames of the video stream as well, and thus to collect databases of audiovisual objects from captured audio and video on-the-fly.

Whilst 'klipp av' is an artistic collaboration, the academic content of this article was almost entirely written up by Nick Collins. Olofsson helped draft part of the 'klipp av mappings' section on pp. 14-16, and cleared the article for release. Where audio analysis and algorithmic cut-up research underlie the audiovisual technology, this is again solely Nick Collins's work.
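The article itself does not reproduce its implementation here, but the core idea of reusing audio segmentation to tag video can be illustrated with a minimal sketch. The following Python fragment is an assumption-laden outline, not klipp av's code: it stands in a crude energy-ratio onset detector for the real audio analysis, assumes fixed sample and frame rates, and invents the names AVObject, detect_onset and segment_stream purely for illustration. Each detected onset closes an audio segment, and the video frame indices falling inside the same time interval are stored alongside it, so audiovisual objects accumulate into a database as the streams are captured.

```python
# Illustrative sketch only; names and parameters are hypothetical,
# not the published klipp av implementation.
import numpy as np
from dataclasses import dataclass

SR = 44100     # audio sample rate (assumed)
FPS = 25       # video frame rate (assumed)
BLOCK = 512    # audio analysis block size

@dataclass
class AVObject:
    audio: np.ndarray    # audio samples between two onsets
    frame_indices: list  # video frames captured in the same interval

def detect_onset(block, prev_energy, threshold=4.0):
    """Crude energy-ratio onset detector standing in for the real analysis."""
    energy = float(np.mean(block ** 2)) + 1e-12
    return energy > threshold * prev_energy, energy

def segment_stream(audio, threshold=4.0):
    """Segment audio at onsets and tag the co-occurring video frames."""
    database = []
    prev_energy = 1e-12
    seg_start = 0
    for start in range(0, len(audio) - BLOCK, BLOCK):
        block = audio[start:start + BLOCK]
        onset, prev_energy = detect_onset(block, prev_energy, threshold)
        if onset and start > seg_start:
            t0, t1 = seg_start / SR, start / SR
            frames = list(range(int(t0 * FPS), int(t1 * FPS)))
            database.append(AVObject(audio[seg_start:start], frames))
            seg_start = start
    return database

if __name__ == "__main__":
    # Synthetic test signal: silence punctuated by noise bursts.
    rng = np.random.default_rng(0)
    audio = np.zeros(SR * 4)
    for sec in (0.5, 1.5, 2.5):
        i = int(sec * SR)
        audio[i:i + SR // 4] = rng.normal(0, 0.5, SR // 4)
    for obj in segment_stream(audio):
        print(len(obj.audio), "samples,", len(obj.frame_indices), "video frames")
```

In a live setting the same boundary messages would simply be passed from the audio analysis to the video capture process as they occur, rather than computed over a buffered signal as in this offline sketch.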