Data assimilation has been widely tested for flood forecasting, but its use in operational systems is largely limited to simple statistical error correction. This may be due to the complexity involved in making more advanced, formal assumptions about the nature of the model and measurement errors. Recent advances in the characterization of rating curve uncertainty make it possible to estimate a flow measurement error that includes both aleatory and epistemic uncertainties more explicitly and rigorously than in current practice. The aim of this study is to understand the effect that such a more rigorous definition of the flow measurement error has on real-time data assimilation and forecasting. The study therefore develops a comprehensive probabilistic framework that accounts for uncertainty in the model forcing data, the model structure, and the flow observations. Three common data assimilation techniques are evaluated and applied at two locations in the flood-prone Oria catchment in the Basque Country, northern Spain: (1) autoregressive error correction, (2) the ensemble Kalman filter, and (3) the regularized particle filter. The results show that, although the more rigorous error model improves the match between the uncertain forecasted flows and the uncertain true flows, the threshold exceedances used to issue flood warnings show little sensitivity to it. This suggests that a standard flow measurement error model, with a spread set to a fixed fraction of the flow, represents a reasonable trade-off between complexity and realism. Standard error models are therefore recommended for operational flood forecasting at sites whose stage-discharge curves are well defined and based on a wide range of flow observations.
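
To make the "standard" error model concrete, the sketch below shows how a flow measurement error with a spread set to a fixed fraction of the observed flow enters one analysis step of a stochastic ensemble Kalman filter, one of the three techniques named above. This is a minimal illustration under stated assumptions, not the study's implementation: the `error_fraction` value, the state layout, and the function name are hypothetical.

```python
import numpy as np

def enkf_update(ensemble, q_obs, error_fraction=0.1, rng=None):
    """One stochastic EnKF analysis step for a scalar flow observation.

    Assumes a 'standard' flow measurement error model whose standard
    deviation is a fixed fraction of the observed discharge; the 10%
    default is purely illustrative.

    ensemble : (n_members, n_states) array of forecast states, where the
               first state variable is taken to be simulated discharge.
    q_obs    : observed discharge at the gauging station.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_members = ensemble.shape[0]

    # Observation error spread proportional to the measured flow
    sigma_obs = error_fraction * q_obs

    # Perturb the observation for each member (stochastic EnKF)
    perturbed_obs = q_obs + rng.normal(0.0, sigma_obs, size=n_members)

    # Simulated observations: discharge is the first state variable
    hx = ensemble[:, 0]

    # Sample covariances between the states and simulated discharge
    state_anom = ensemble - ensemble.mean(axis=0)
    hx_anom = hx - hx.mean()
    cov_xy = state_anom.T @ hx_anom / (n_members - 1)   # (n_states,)
    var_y = hx_anom @ hx_anom / (n_members - 1)         # scalar

    # Kalman gain and analysis update for each ensemble member
    gain = cov_xy / (var_y + sigma_obs**2)              # (n_states,)
    innovations = perturbed_obs - hx                    # (n_members,)
    return ensemble + innovations[:, None] * gain[None, :]
```

A more rigorous rating-curve-based error model would replace the single `sigma_obs = error_fraction * q_obs` line with a flow-dependent (and possibly asymmetric) error distribution; the abstract's finding is that, for warning-threshold exceedances, this substitution makes little practical difference at well-gauged sites.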