Future Fellowships - Grant ID: FT130101105
ARC | Future Fellowships


  • Joint Emotion Analysis via Multi-task Gaussian Processes

    Beck, D.; Cohn, T.; Specia, L. (2014)
    Projects: ARC | Future Fellowships - Grant ID: FT130101105 (FT130101105)
    We propose a model for jointly predicting multiple emotions in natural language sentences. Our model is based on a low-rank coregionalisation approach, which combines a vector-valued Gaussian Process with a rich parameterisation scheme. We show that our approach is able to learn correlations and anti-correlations between emotions on a news headlines dataset. The proposed model outperforms both single-task baselines and other multi-task approaches.
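The coregionalisation idea in this abstract can be sketched in a few lines: the joint covariance over tasks (emotions) and inputs is the Kronecker product of a low-rank task matrix B = WWᵀ + diag(κ) with an ordinary input kernel. A minimal NumPy illustration — the dimensions, kernel choice, and values are invented for the example, not taken from the paper:

```python
import numpy as np

def rbf_kernel(X, lengthscale=1.0):
    """Squared-exponential kernel matrix over inputs X (n x d)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def coregionalised_kernel(X, W, kappa, lengthscale=1.0):
    """Low-rank coregionalisation: task covariance B = W W^T + diag(kappa);
    the joint covariance is the Kronecker product B ⊗ k(X, X)."""
    B = W @ W.T + np.diag(kappa)      # T x T task-covariance matrix
    K_in = rbf_kernel(X, lengthscale) # n x n input covariance
    return np.kron(B, K_in)           # (T*n) x (T*n) joint covariance

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))           # 5 sentences, 3 toy features
W = rng.normal(size=(6, 2))           # 6 emotions, rank-2 mixing
kappa = np.full(6, 0.1)               # per-emotion independent variance
K = coregionalised_kernel(X, W, kappa)
print(K.shape)                        # (30, 30)
```

The off-diagonal entries of B are what let the model express correlations and anti-correlations between emotions.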

    Fast, Small and Exact: Infinite-order Language Modelling with Compressed Suffix Trees

    Shareghi, Ehsan; Petri, Matthias; Haffari, Gholamreza; Cohn, Trevor (2016)
    Projects: ARC | Future Fellowships - Grant ID: FT130101105 (FT130101105)
    Efficient methods for storing and querying are critical for scaling high-order n-gram language models to large corpora. We propose a language model based on compressed suffix trees, a representation that is highly compact and can be easily held in memory, while supporting queries needed in computing language model probabilities on-the-fly. We present several optimisations which improve query runtimes up to 2500x, despite only incurring a modest increase in construction time and memory usage. ...
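The kind of on-the-fly count query such a language model needs can be illustrated with a plain (uncompressed) suffix array over a tokenised corpus; the paper's contribution is answering these queries compactly with compressed suffix trees, which this toy version does not attempt:

```python
import bisect

def build_suffix_array(tokens):
    """All suffix start positions, sorted lexicographically.
    O(n^2 log n) toy construction; real implementations are far better."""
    return sorted(range(len(tokens)), key=lambda i: tokens[i:])

def count_ngram(tokens, sa, ngram):
    """Count occurrences of `ngram` by binary search over sorted suffixes.
    (A compressed structure would search without materialising this list.)"""
    prefixes = [tokens[i:i + len(ngram)] for i in sa]  # sorted k-prefixes
    lo = bisect.bisect_left(prefixes, list(ngram))
    hi = bisect.bisect_right(prefixes, list(ngram))
    return hi - lo

corpus = "the cat sat on the mat the cat ran".split()
sa = build_suffix_array(corpus)
print(count_ngram(corpus, sa, ("the", "cat")))   # 2
```

A smoothed probability such as Kneser–Ney can then be assembled on the fly from counts of the history h and of h extended by each word w.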

    Learning Robust Representations of Text

    Li, Yitong; Cohn, Trevor; Baldwin, Timothy (2016)
    Projects: ARC | Future Fellowships - Grant ID: FT130101105 (FT130101105)
    Deep neural networks have achieved remarkable results across many language processing tasks, however these methods are highly sensitive to noise and adversarial attacks. We present a regularization based method for limiting network sensitivity to its inputs, inspired by ideas from computer vision, thus learning models that are more robust. Empirical evaluation over a range of sentiment datasets with a convolutional neural network shows that, compared to a baseline model and the dropout method...
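The input-sensitivity penalty described here can be sketched for a toy logistic-regression "network" (the paper uses a CNN; the model, data, and λ below are illustrative only): the training loss gains a term proportional to the squared norm of the gradient of the output with respect to the input.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def robust_loss(w, b, X, y, lam=0.1):
    """Cross-entropy plus an input-sensitivity penalty.
    For f(x) = sigmoid(w.x + b), the input gradient is
    df/dx = f(1 - f) * w, so ||df/dx||^2 = (f(1-f))^2 * ||w||^2."""
    p = sigmoid(X @ w + b)
    ce = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    sensitivity = np.mean((p * (1 - p))**2) * np.sum(w**2)
    return ce + lam * sensitivity

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 4))
y = (X[:, 0] > 0).astype(float)
w, b = rng.normal(size=4), 0.0
print(robust_loss(w, b, X, y))
```

Minimising this objective trades a little training fit for predictions that move less when the input is perturbed, which is the sense of robustness the abstract targets.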

    Model Transfer for Tagging Low-resource Languages using a Bilingual Dictionary

    Fang, Meng; Cohn, Trevor (2017)
    Projects: ARC | Future Fellowships - Grant ID: FT130101105 (FT130101105)
    Cross-lingual model transfer is a compelling and popular method for predicting annotations in a low-resource language, whereby parallel corpora provide a bridge to a high-resource language and its associated annotated corpora. However, parallel data is not readily available for many languages, limiting the applicability of these approaches. We address these drawbacks in our framework which takes advantage of cross-lingual word embeddings trained solely on a high coverage bilingual dictionary....
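One standard building block for dictionary-based embedding transfer — a plausible illustration, not necessarily the paper's exact method — is to learn a linear map between pre-trained monolingual spaces from dictionary word pairs, e.g. by orthogonal Procrustes:

```python
import numpy as np

def learn_mapping(src_vecs, tgt_vecs):
    """Orthogonal Procrustes: the rotation W minimising ||src W - tgt||_F
    over dictionary pairs (rows are aligned source/target word vectors)."""
    U, _, Vt = np.linalg.svd(src_vecs.T @ tgt_vecs)
    return U @ Vt

rng = np.random.default_rng(2)
true_W, _ = np.linalg.qr(rng.normal(size=(5, 5)))   # hidden rotation
src = rng.normal(size=(50, 5))                      # source-language vectors
tgt = src @ true_W                                  # dictionary translations
W = learn_mapping(src, tgt)
print(np.allclose(src @ W, tgt, atol=1e-8))         # True
```

Once the spaces are aligned, a tagger trained on high-resource-language embeddings can be applied to mapped low-resource-language embeddings directly.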

    Learning when to trust distant supervision: An application to low-resource POS tagging using cross-lingual projection

    Fang, Meng; Cohn, Trevor (2016)
    Projects: ARC | Future Fellowships - Grant ID: FT130101105 (FT130101105)
    Cross-lingual projection of linguistic annotation suffers from many sources of bias and noise, leading to unreliable annotations that cannot be used directly. In this paper, we introduce a novel approach to sequence tagging that learns to correct the errors from cross-lingual projection using an explicit debiasing layer. This is framed as joint learning over two corpora, one tagged with gold standard and the other with projected tags. We evaluated with only 1,000 tokens tagged with gold stand...
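The idea of an explicit debiasing layer can be illustrated as a confusion matrix applied on top of the tagger's output distribution. Everything below — the tag set, the matrix values, the probabilities — is a hypothetical sketch, not taken from the paper:

```python
import numpy as np

# Tagger's distribution over 3 true tags for one token (say NOUN, VERB, ADJ).
clean = np.array([0.7, 0.2, 0.1])

# Debiasing layer: row i gives P(projected tag = j | true tag = i),
# modelling the systematic noise introduced by cross-lingual projection.
confusion = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.85, 0.05],
    [0.20, 0.10, 0.70],
])

# On the projected (noisy) corpus the loss is computed against `noisy`;
# on the gold corpus it is computed against `clean` directly.
noisy = clean @ confusion
print(noisy)
```

Training jointly on both corpora lets the confusion matrix absorb the projection bias while the underlying tagger stays clean.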

    Exploring Prediction Uncertainty in Machine Translation Quality Estimation

    Beck, Daniel; Specia, Lucia; Cohn, Trevor (2016)
    Projects: ARC | Future Fellowships - Grant ID: FT130101105 (FT130101105), EC | QT21 (645452)
    Machine Translation Quality Estimation is a notoriously difficult task, which lessens its usefulness in real-world translation environments. Such scenarios can be improved if quality predictions are accompanied by a measure of uncertainty. However, models in this task are traditionally evaluated only in terms of point estimate metrics, which do not take prediction uncertainty into account. We investigate probabilistic methods for Quality Estimation that can provide well-calibrated uncertainty...
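Predictive uncertainty of the kind discussed here falls out naturally from Gaussian Process regression, whose posterior returns a variance alongside every point estimate. A generic one-dimensional sketch — not the paper's QE features, kernels, or data:

```python
import numpy as np

def gp_predict(X, y, Xs, lengthscale=1.0, noise=0.1):
    """GP regression posterior: predictive mean and variance at Xs,
    so each quality estimate carries its own uncertainty."""
    def k(A, B):
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-0.5 * d2 / lengthscale**2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks, Kss = k(X, Xs), k(Xs, Xs)
    mean = Ks.T @ np.linalg.solve(K, y)
    var = np.diag(Kss - Ks.T @ np.linalg.solve(K, Ks))
    return mean, var

rng = np.random.default_rng(3)
X = rng.uniform(-2, 2, size=(15, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=15)
Xs = np.array([[0.0], [5.0]])          # near vs. far from training data
mean, var = gp_predict(X, y, Xs)
print(var[0] < var[1])                 # uncertainty grows away from data
```

Well-calibrated variances like these are what let a downstream translation workflow decide when a quality prediction can be trusted.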

    Hawkes processes for continuous time sequence classification: an application to rumour stance classification in Twitter

    Lukasik, Michal; Srijith, P. K.; Vu, Duy; Bontcheva, Kalina; Zubiaga, Arkaitz; Cohn, Trevor
    Projects: EC | PHEME (611233), ARC | Future Fellowships - Grant ID: FT130101105 (FT130101105)
    Classification of temporal textual data sequences is a common task in various domains such as social media and the Web. In this paper we propose to use Hawkes Processes for classifying sequences of temporal textual data, which exploit both temporal and textual information. Our experiments on rumour stance classification on four Twitter datasets show the importance of using the temporal information of tweets along with the textual content.
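A univariate Hawkes intensity with an exponential kernel — one common parameterisation, with illustrative parameter values — is easy to write down: a base rate plus a self-excitation term that decays after every past event.

```python
import numpy as np

def hawkes_intensity(t, event_times, mu=0.2, alpha=0.5, beta=1.0):
    """Self-exciting intensity: base rate mu plus an exponentially
    decaying bump of height alpha*beta after each earlier event."""
    past = event_times[event_times < t]
    return mu + alpha * beta * np.sum(np.exp(-beta * (t - past)))

events = np.array([1.0, 1.2, 1.3, 4.0])   # a burst of tweets, then a straggler
burst = hawkes_intensity(1.5, events)      # just after the burst
quiet = hawkes_intensity(3.5, events)      # long after it
print(burst > quiet)                       # True
```

Features derived from such intensities capture the temporal clustering of tweets, which the abstract reports is informative alongside their textual content for stance classification.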
  • No project research data found
  • Scientific Results

    [Charts not captured: Publications in Repositories]