Within the autonomous driving community, millimetre-wave frequency-modulated continuous-wave (FMCW) radars are not used to their full potential. Classical, hand-designed target-detection algorithms are applied early in the signal-processing chain, and the rich contextual information in the raw returns is discarded. This early loss of information limits the algorithms that can be applied further downstream. In contrast to object detection in camera images, radar has therefore been unable to benefit fully from data-driven methods. This work seeks to bridge this gap by providing the community with a diverse, minimally processed FMCW radar dataset that is not only RGB-D (colour and depth) aligned but also synchronized with inertial measurement unit (IMU) measurements in the presence of ego-motion. Moreover, time-synchronized measurements allow for verification and automated or assisted labelling of the radar data, and open the door to novel methods of fusing data from a variety of sensors. We present a system that can be built from accessible, off-the-shelf components within a $1000 budget, along with an accompanying dataset of diverse scenes spanning indoor, urban, and highway driving. Finally, we demonstrate the ability to go beyond classical radar object detection with our dataset, achieving a classification accuracy of 85.1% using the low-level radar signals captured by our system, supporting our argument that there is value in retaining the information discarded by current radar pipelines.