One goal of ubiquitous computing is to let users access and manipulate information anywhere in their environment. To display visual information in our instrumented room, we use a steerable projector and camera unit (Fluid Beam) that can project "virtual displays" at arbitrary locations; acoustic information is presented through a spatial audio system (SAFIR) that creates virtual sound sources in 3D. In my talk I will present two devices that enable 3D interaction with virtual objects (images and sounds) in the environment. The first is the SCIPIO Gesture Band, a wearable input device that integrates several sensors into a wristband to capture the user's arm position and movements; the second is SHAKE, a small PDA extension that provides movement and orientation sensing as well as vibrotactile feedback. The use of these interaction techniques will be illustrated in a "virtual mailbox" scenario.