Discovery Projects - Grant ID: DP130100364

ARC | Discovery Projects


  • Estimation of Gaussian process regression model using probability distance measures

    Hong, Xia; Gao, Junbin; Jiang, Xinwei; Harris, Chris J. (2014)
    Projects: ARC | Discovery Projects - Grant ID: DP130100364 (DP130100364)
    A new class of parameter estimation algorithms is introduced for Gaussian process regression (GPR) models. It is shown that the integration of the GPR model with probability distance measures of (i) the integrated square error and (ii) Kullback–Leibler (K–L) divergence are analytically tractable. An efficient coordinate descent algorithm is proposed to iteratively estimate the kernel width using golden section search which includes a fast gradient descent algorithm as an inner loop to estimat...
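The abstract above names a golden-section line search for the kernel width. A minimal sketch of that step, assuming a 1-D RBF kernel and using the standard GP negative log marginal likelihood as the objective (the paper's probability-distance criteria are not reproduced here, and all function names are mine):

```python
import numpy as np

def gp_neg_log_marginal_likelihood(width, X, y, noise=0.1):
    # Negative log marginal likelihood of a 1-D GP with an RBF kernel.
    d2 = (X[:, None] - X[None, :]) ** 2
    K = np.exp(-d2 / (2.0 * width ** 2)) + noise ** 2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum()

def golden_section(f, a, b, tol=1e-5):
    # Standard golden-section search for a 1-D minimiser on [a, b].
    gr = (np.sqrt(5.0) - 1.0) / 2.0
    c, d = b - gr * (b - a), a + gr * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - gr * (b - a)
        else:
            a, c = c, d
            d = a + gr * (b - a)
    return 0.5 * (a + b)

# Tiny synthetic example: pick the kernel width that best explains noisy sine data.
X = np.linspace(0.0, 2.0 * np.pi, 20)
y = np.sin(X)
best_width = golden_section(lambda w: gp_neg_log_marginal_likelihood(w, X, y), 0.05, 5.0)
```

In the paper's coordinate-descent scheme this 1-D search would be one coordinate update, alternated with the inner gradient-descent step the abstract mentions.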

  • Sparse density estimation on the multinomial manifold

    Hong, Xia; Gao, Junbin; Chen, Sheng; Zia, Tanveer (2015)
    Projects: ARC | Discovery Projects - Grant ID: DP130100364 (DP130100364)
    A new sparse kernel density estimator is introduced based on the minimum integrated square error criterion for the finite mixture model. Since the constraint on the mixing coefficients of the finite mixture model is on the multinomial manifold, we use the well-known Riemannian trust-region (RTR) algorithm for solving this problem. The first- and second-order Riemannian geometry of the multinomial manifold are derived and utilized in the RTR algorithm. Numerical examples are employed to ...
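The mixing-coefficient constraint above is the probability simplex. The paper handles it with a Riemannian trust-region solver; as a simpler, self-contained illustration of enforcing the same constraint, here is the sort-based Euclidean projection onto the simplex (the function name is hypothetical):

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto the probability simplex
    {w : w_i >= 0, sum_i w_i = 1}, via the standard sort-based algorithm."""
    u = np.sort(v)[::-1]               # sort descending
    css = np.cumsum(u)
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * idx > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

w = project_to_simplex(np.array([0.8, 0.6, -0.2]))   # -> valid mixing coefficients
```

A projected-gradient loop built on this would be a cruder alternative to the RTR machinery the abstract describes, which exploits the manifold's curvature.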

  • Tensor LRR and sparse coding-based subspace clustering

    Fu, Yifan; Gao, Junbin; Tien, David; Lin, Zhouchen; Hong, Xia (2016)
    Projects: ARC | Discovery Projects - Grant ID: DP130100364 (DP130100364)
    Subspace clustering groups a set of samples from a union of several linear subspaces into clusters, so that the samples in the same cluster are drawn from the same linear subspace. In the majority of the existing work on subspace clustering, clusters are built based on feature information, while sample correlations in their original spatial structure are simply ignored. Besides, original high-dimensional feature vector contains noisy/redundant information, and the time complexity grows expone...
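Methods in this family typically finish by turning a learned representation matrix into an affinity graph and spectral-clustering it. A sketch of that final step only, not of the Tensor LRR optimisation itself (the names and the block-diagonal toy matrix are mine):

```python
import numpy as np

def spectral_embedding(Z, k):
    # Symmetrise the representation matrix into an affinity, then embed each
    # sample using the k eigenvectors of the graph Laplacian with the
    # smallest eigenvalues; rows of the result are then clustered (e.g. k-means).
    W = np.abs(Z) + np.abs(Z).T
    L = np.diag(W.sum(axis=1)) - W      # unnormalised graph Laplacian
    _, vecs = np.linalg.eigh(L)         # eigh returns ascending eigenvalues
    return vecs[:, :k]                  # one embedding row per sample

# A block-diagonal Z yields two disconnected components, so the embedding
# rows within each block coincide.
Z = np.zeros((4, 4))
Z[:2, :2] = 1.0
Z[2:, 2:] = 1.0
E = spectral_embedding(Z, 2)
```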

  • Relations among Some Low Rank Subspace Recovery Models

    Zhang, Hongyang; Lin, Zhouchen; Zhang, Chao; Gao, Junbin (2014)
    Projects: ARC | Discovery Projects - Grant ID: DP130100364 (DP130100364)
    Recovering intrinsic low dimensional subspaces from data distributed on them is a key preprocessing step to many applications. In recent years, there has been a lot of work that models subspace recovery as low rank minimization problems. We find that some representative models, such as Robust Principal Component Analysis (R-PCA), Robust Low Rank Representation (R-LRR), and Robust Latent Low Rank Representation (R-LatLRR), are actually deeply connected. More specifically, we discover that once...
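R-PCA, the first model related above, splits a data matrix into a low-rank part plus sparse corruptions. A hedged sketch of the widely used inexact-ALM iteration for it (the parameter choices here are illustrative, not the paper's):

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft_threshold(M, tau):
    # Elementwise shrinkage: proximal operator of the l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def rpca(D, lam=None, mu=1.0, rho=1.1, iters=200):
    # Inexact ALM sketch for  min ||L||_* + lam*||S||_1  s.t.  D = L + S.
    if lam is None:
        lam = 1.0 / np.sqrt(max(D.shape))
    L = np.zeros_like(D); S = np.zeros_like(D); Y = np.zeros_like(D)
    for _ in range(iters):
        L = svt(D - S + Y / mu, 1.0 / mu)
        S = soft_threshold(D - L + Y / mu, lam / mu)
        Y += mu * (D - L - S)       # dual ascent on the constraint
        mu *= rho                   # increase penalty to drive feasibility
    return L, S

# Rank-1 matrix plus one sparse spike.
A = np.outer(np.arange(1.0, 5.0), np.ones(4))
E = np.zeros((4, 4))
E[0, 0] = 10.0
D = A + E
L, S = rpca(D)
```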

  • Tensor regression based on linked multiway parameter analysis

    Fu, Yifan; Gao, Junbin; Hong, Xia; Tien, David (2014)
    Projects: ARC | Discovery Projects - Grant ID: DP130100364 (DP130100364)
    Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems on covariates of more complex form such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising solution to reduce the mode...
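Tensor regression models are commonly written with mode-n products between the covariate tensor and mode-wise parameter matrices. Below is a minimal mode-n product via numpy.tensordot; this is the generic operation, not the paper's specific linked multiway model:

```python
import numpy as np

def mode_n_product(T, M, n):
    # Multiply tensor T by matrix M along mode n: contracts T's n-th axis
    # (size k) with M's columns (M has shape (j, k)), so mode n becomes size j.
    out = np.tensordot(T, M, axes=(n, 1))   # contracted result lands on last axis
    return np.moveaxis(out, -1, n)          # move it back to position n

T = np.arange(24.0).reshape(2, 3, 4)
M = np.arange(15.0).reshape(5, 3)   # maps mode-1 from size 3 to size 5
Y = mode_n_product(T, M, 1)         # shape (2, 5, 4)
```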

  • Joint multiple dictionary learning for tensor sparse coding

    Fu, Yifan; Gao, Junbin; Sun, Yanfeng; Hong, Xia (2014)
    Projects: ARC | Discovery Projects - Grant ID: DP130100364 (DP130100364)
    Traditional dictionary learning algorithms are used for finding a sparse representation on high dimensional data by transforming samples into a one-dimensional (1D) vector. This 1D model loses the inherent spatial structure property of data. An alternative solution is to employ Tensor Decomposition for dictionary learning on their original structural form —a tensor— by learning multiple dictionaries along each mode and the corresponding sparse representation in respect to the Kronecker ...
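The Kronecker relationship mentioned above is the standard identity vec(D1 B D2^T) = (D2 ⊗ D1) vec(B): coding a 2-D sample with per-mode dictionaries D1 and D2 is equivalent to coding its vectorisation against one large Kronecker dictionary. A quick numerical check (matrix sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
D1 = rng.standard_normal((4, 3))   # mode-1 dictionary
D2 = rng.standard_normal((5, 2))   # mode-2 dictionary
B = rng.standard_normal((3, 2))    # code matrix (dense here for brevity)

X = D1 @ B @ D2.T                              # sample built from mode dictionaries
lhs = np.kron(D2, D1) @ B.flatten(order="F")   # Kronecker dictionary on vec(B)
# lhs equals X.flatten(order="F") up to floating-point error
```

Learning D1 and D2 separately keeps the dictionary parameter count at 4*3 + 5*2 rather than the 20*6 of the flat Kronecker dictionary, which is the efficiency the 1D-versus-tensor comparison in the abstract is pointing at.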
  • No project research data found
  • Scientific Results

    [Charts not captured in this snapshot; one panel is titled "Publications in Repositories"]
