In this paper, which to some extent summarizes , we present a novel approach to the modelling of multivariable time series. The model class consists of linear systems, i.e., the solution sets of linear difference equations. Restricting the model order, the aim is to determine a model whose behavior has minimal l2-distance from the observed time series. We propose an iterative algorithm, based on isometric state representations, for the nonlinear problem of identifying optimal models. Attractive aspects of the proposed method are that the model error is measured globally, that it applies to multi-input, multi-output systems, and that no prior distinction between inputs and outputs is required. We also describe the link between isometric state representations and normalized coprime factorizations, and make some remarks on model uncertainty.
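To make the optimality criterion concrete, the following sketch computes the global l2-misfit of an observed time series with respect to one fixed candidate model, given here as the solution set of a first-order linear difference equation R0*w(t) + R1*w(t+1) = 0. This is only an illustration of the distance measure, not the paper's identification algorithm; the function name, the use of an SVD-based null-space projection, and the specific equation are assumptions introduced for the example.

```python
import numpy as np

def l2_misfit(w, R0, R1):
    """Distance from an observed series w (shape (T, q)) to the behavior
    {w : R0*w(t) + R1*w(t+1) = 0} over a finite horizon.

    Returns (misfit, w_hat), where w_hat is the l2-closest series in the
    behavior. Illustrative sketch, not the paper's algorithm.
    """
    T_len, q = w.shape
    p = R0.shape[0]
    # Stack the difference equation over all time steps into a
    # block-Toeplitz matrix M; the behavior is the null space of M.
    M = np.zeros((p * (T_len - 1), q * T_len))
    for t in range(T_len - 1):
        M[p * t:p * (t + 1), q * t:q * (t + 1)] = R0
        M[p * t:p * (t + 1), q * (t + 1):q * (t + 2)] = R1
    # Orthonormal basis of the null space via SVD, then project w onto it.
    _, s, Vt = np.linalg.svd(M)
    rank = int(np.sum(s > 1e-10))
    null_basis = Vt[rank:].T            # columns span the behavior
    w_vec = w.reshape(-1)
    w_hat = null_basis @ (null_basis.T @ w_vec)
    return np.linalg.norm(w_vec - w_hat), w_hat.reshape(T_len, q)
```

Note that the misfit is global: the whole observed trajectory is projected onto the model's behavior at once, rather than accumulating one-step-ahead prediction errors, and the variables of w are not split into inputs and outputs.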