There is considerable debate regarding whether mnemonic precision can be traded for capacity in working memory (WM), with some studies reporting evidence consistent with such a trade-off and others suggesting it may not be possible. Most studies addressing this question have used a standard approach to analysing continuous recall data, in which individual-subject data from each experimental condition are fitted with a probabilistic model of choice. Estimated parameter values relating to different aspects of WM (e.g., the capacity and precision of stored items) are then compared using statistical tests to determine whether hypothesized differences between experimental conditions are present. However, recent research has suggested that the standard approach is flawed in several respects. In this study, we presented participants with behavioural pre-cues informing them of the upcoming number of to-be-remembered items (high- vs. low-load) with the goal of inducing a trade-off between capacity and precision. The data were then analysed using both the standard analytical approach and a more rigorous Bayesian model comparison (BMC) approach. The latter involved generating a set of probabilistic models whose priors reflect different hypotheses regarding the effect of our key experimental manipulations on behaviour. The two approaches produced notably different results. More specifically, the standard analysis revealed that a high- versus a low-load cue resulted in higher capacity and lower precision parameter estimates, suggesting the presence of a trade-off between capacity and precision. However, the more rigorous BMC analysis revealed that it was very unlikely that participants employed a behavioural strategy in which they sacrificed mnemonic precision to achieve higher storage capacity. In light of these differences, we advocate for a more stringent approach to model selection and hypothesis testing in studies implementing mixture modelling.
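To make the "standard approach" concrete, the sketch below (a minimal illustration, not the authors' analysis code) fits a two-component mixture of the kind commonly applied to continuous recall errors: a von Mises component for responses based on a stored item plus a uniform guessing component, fitted per condition by maximum likelihood. The simulated data, starting values, and parameter names (p_mem, kappa) are illustrative assumptions.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def neg_log_lik(params, errors):
    """Negative log-likelihood of a two-component mixture:
    p_mem * vonMises(error; kappa) + (1 - p_mem) * uniform on [-pi, pi]."""
    p_mem, kappa = params
    like = (p_mem * stats.vonmises.pdf(errors, kappa)
            + (1.0 - p_mem) / (2.0 * np.pi))
    return -np.sum(np.log(like))

def fit_mixture(errors):
    """Fit p_mem (storage probability, a capacity proxy) and
    kappa (precision) by maximum likelihood."""
    res = minimize(neg_log_lik, x0=[0.8, 5.0], args=(errors,),
                   bounds=[(1e-3, 1 - 1e-3), (1e-3, 100.0)])
    return res.x  # [p_mem, kappa]

# Illustrative use: simulate one subject's recall errors per condition.
rng = np.random.default_rng(0)
# Low load: precise responses, almost no guessing.
low_load = stats.vonmises.rvs(10.0, size=200, random_state=rng)
# High load: 60% coarser stored responses, 40% uniform guesses.
high_load = np.where(rng.random(200) < 0.6,
                     stats.vonmises.rvs(4.0, size=200, random_state=rng),
                     rng.uniform(-np.pi, np.pi, 200))
for name, errs in [("low load", low_load), ("high load", high_load)]:
    p_mem, kappa = fit_mixture(errs)
    print(f"{name}: p_mem={p_mem:.2f}, kappa={kappa:.2f}")
# In the standard approach, per-subject estimates would then be compared
# across conditions with a statistical test, e.g. stats.ttest_rel.
```

The BMC approach described above differs in that, rather than testing point estimates of such parameters, it compares the evidence for a set of models whose priors encode the competing hypotheses about the experimental manipulation.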