To gain insight into how vision guides eye movements, monkeys were trained to make a single saccade to a specified target stimulus during feature and conjunction search with stimuli discriminated by color and shape. Monkeys performed both tasks at levels well above chance. The latencies of saccades to the target in conjunction search exhibited shallow positive slopes as a function of set size, comparable to the slopes of human reaction times during target present/absent judgments but significantly different from the slopes in feature search. Properties of the selection process were revealed by the occasional saccades to distracters. During feature search, errant saccades were directed more often to a distracter near the target than to a distracter at any other location. In contrast, during conjunction search, saccades to distracters were guided more by similarity to the target than by proximity; monkeys were significantly more likely to shift gaze to a distracter that shared one of the target features than to a distracter that shared none. Overall, color and shape information were used to similar degrees in the search for the conjunction target. However, in single sessions we observed an increased tendency to make saccades to a distracter that had been the target in the previous experimental session. The establishment of this tendency across sessions at least a day apart and its persistence throughout a session distinguish this phenomenon from the short-term (<10 trials) perceptual priming observed in this and earlier studies of feature visual search. Our findings support the hypothesis that the target in at least some conjunction visual searches can be detected efficiently on the basis of visual similarity, most likely through parallel processing of the individual features that define the stimuli. These observations guide the interpretation of neurophysiological data and constrain the development of computational models.
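To make the set-size analysis concrete, the sketch below illustrates how a search slope (ms per additional item) can be estimated by fitting saccade latency as a linear function of set size, the quantity compared above between feature and conjunction search. The set sizes and latency values are invented placeholders for illustration only, not data from this study.

```python
# Minimal sketch of a search-slope estimate: least-squares fit of saccade
# latency versus set size. All numbers are hypothetical placeholders.
import numpy as np

set_sizes = np.array([2, 4, 8])  # assumed numbers of stimuli in the search array

# Hypothetical median saccade latencies (ms) to the target at each set size.
feature_latency = np.array([182.0, 184.0, 185.0])      # near-flat: efficient search
conjunction_latency = np.array([195.0, 205.0, 226.0])  # shallow positive slope

for label, latency in [("feature", feature_latency),
                       ("conjunction", conjunction_latency)]:
    slope, intercept = np.polyfit(set_sizes, latency, 1)  # linear fit: slope in ms/item
    print(f"{label} search: slope = {slope:.1f} ms/item, intercept = {intercept:.0f} ms")
```

A shallow positive slope from such a fit indicates that adding distracters costs little additional search time, the signature of efficient, largely parallel selection described above.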