Nguyen, Thi-hang; Nabney, Ian T. (2010)
Languages: English
Types: Article
This paper presents a novel methodology for inferring the parameters of probabilistic models whose output noise follows a Student-t distribution. The method extends earlier work on models that are linear in their parameters to nonlinear multi-layer perceptrons (MLPs). It combines an EM algorithm with a variational approximation, the evidence procedure, and an optimisation algorithm. The technique was tested on two regression applications: the first uses a synthetic dataset and the second uses gas forward contract price data from the UK energy market. The results show that forecasting accuracy is significantly improved by using Student-t noise models.
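The core idea of a Student-t noise model can be illustrated with a much simpler setting than the paper's variational MLP: treating the Student-t as a scale mixture of Gaussians and running EM on a model that is linear in its parameters. The sketch below is an assumption-laden simplification (fixed degrees of freedom, linear model, hand-rolled EM) rather than the authors' method; it shows only why heavy-tailed noise models are robust — the E-step down-weights outlying points via a per-point precision.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: linear signal corrupted by heavy-tailed Student-t noise
N = 200
x = np.linspace(-1, 1, N)
X = np.column_stack([x, np.ones(N)])        # design matrix with bias term
true_w = np.array([2.0, -0.5])
y = X @ true_w + 0.1 * rng.standard_t(df=3, size=N)

nu = 3.0          # degrees of freedom, held fixed here for simplicity
w = np.zeros(2)   # regression weights
sigma2 = 1.0      # noise scale parameter

for _ in range(50):
    # E-step: expected precision of each point under the scale-mixture view;
    # points with large residuals get small u, i.e. they are down-weighted
    r2 = (y - X @ w) ** 2
    u = (nu + 1.0) / (nu + r2 / sigma2)
    # M-step: weighted least squares for w, then update the noise scale
    Xu = X * u[:, None]
    w = np.linalg.solve(X.T @ Xu, Xu.T @ y)
    sigma2 = np.mean(u * (y - X @ w) ** 2)

print(w)          # close to true_w despite the heavy-tailed noise
```

Replacing the linear map `X @ w` with an MLP, and the point estimates with variational posteriors over the latent precisions and network weights, is roughly the step the paper takes.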