To some extent, what we see depends on what we are trying to do. How perception is affected by the behavioural task being performed is determined by top-down attention. Previously, visual top-down attention was thought only to change perceptual sensitivity, in particular to improve sensitivity to an attended feature or spatial location. However, recent psychophysical experiments have shown that attention can also introduce perceptual biases, effectively altering the appearance of visual stimuli. Here, we propose a series of psychophysical experiments designed to investigate in more detail how attending to a particular visual feature, such as orientation, biases perception. Using a simple computational model, we will attempt to relate these biases to the changes in the tuning curves of sensory neurons that are measured during feature-based attention, asking specifically whether the observed perceptual biases can be explained by a fixed decoder that is 'unaware' of attention-dependent changes in the activity of sensory neurons. In further modelling work, we will attempt to provide a functional explanation for the perceptual changes that accompany top-down visual attention, by assuming that attention acts to optimize visual processing for the behavioural task being performed. Within this framework, we will argue that perceptual biases arise when the observer is asked to perform a secondary task (such as orientation estimation) that differs from the task for which visual processing is being optimized.
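To make the 'unaware decoder' idea concrete, the sketch below implements a deliberately simplified version of this kind of model, not the proposal's actual model: a homogeneous population of orientation-tuned neurons with von Mises tuning curves, a multiplicative feature-similarity gain peaked at the attended orientation, and a fixed Poisson maximum-likelihood decoder built from the unattended tuning curves. All parameter values, the gain profile, and the function names are illustrative assumptions. In this toy version, the decoded orientation is pulled towards the attended orientation; the sign and size of the bias depend on the assumed gain profile and decoder.

```python
import numpy as np

# Preferred orientations tile 0-180 deg (orientation has period 180 deg).
prefs = np.linspace(0.0, 180.0, 64, endpoint=False)

def tuning(theta, prefs, kappa=2.0, peak_rate=10.0):
    """Von Mises tuning curves on the orientation circle (angle doubled)."""
    return peak_rate * np.exp(kappa * (np.cos(np.deg2rad(2.0 * (theta - prefs))) - 1.0))

def attention_gain(prefs, attended, strength=0.5, kappa=2.0):
    """Multiplicative feature-similarity gain, peaked at the attended orientation
    (an illustrative assumption, not a fitted physiological profile)."""
    return 1.0 + strength * np.exp(kappa * (np.cos(np.deg2rad(2.0 * (prefs - attended))) - 1.0))

def decode_unaware(rates, prefs, grid=None):
    """Fixed Poisson maximum-likelihood decoder built from the UNattended
    tuning curves; it does not know about the attentional gain change."""
    if grid is None:
        grid = np.arange(0.0, 180.0, 0.05)
    f = tuning(grid[:, None], prefs[None, :])        # (n_grid, n_neurons)
    loglik = rates @ np.log(f).T - f.sum(axis=1)     # Poisson log-likelihood
    return grid[np.argmax(loglik)]

stimulus, attended = 100.0, 90.0

# Without attention, the matched decoder recovers the stimulus orientation.
baseline = decode_unaware(tuning(stimulus, prefs), prefs)

# With attention, mean responses are rescaled but the decoder is unchanged,
# so the estimate is biased towards the attended orientation.
rates = attention_gain(prefs, attended) * tuning(stimulus, prefs)
estimate = decode_unaware(rates, prefs)
```

Using noiseless mean responses keeps the demonstration deterministic; adding Poisson spike counts and repeating the decode over trials would give the full psychometric prediction.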