Semi-supervised Learning

Semi-supervised learning is an approach to machine learning that combines a small amount of labeled data with a large amount of unlabeled data during training. It falls between unsupervised learning (no labeled training data) and supervised learning (only labeled training data).
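The interplay between the two data regimes can be illustrated with a minimal self-training (pseudo-labeling) loop, one of the simplest semi-supervised strategies: train on the few labels available, then iteratively adopt the model's most confident predictions on unlabeled points as new labels. The dataset, confidence threshold, and round count below are illustrative assumptions, not taken from any paper listed here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic data: pretend only 25 of 500 examples come with labels.
X, y = make_classification(n_samples=500, random_state=0)
rng = np.random.default_rng(0)
labeled = rng.choice(len(y), size=25, replace=False)
mask = np.zeros(len(y), dtype=bool)
mask[labeled] = True

clf = LogisticRegression(max_iter=1000)
for _ in range(5):  # a few self-training rounds
    clf.fit(X[mask], y[mask])
    proba = clf.predict_proba(X[~mask])
    confident = proba.max(axis=1) > 0.95  # only adopt confident predictions
    if not confident.any():
        break
    idx = np.flatnonzero(~mask)[confident]
    y[idx] = clf.predict(X[idx])  # pseudo-labels (may be wrong!)
    mask[idx] = True

print(f"labeled examples after self-training: {mask.sum()}")
```

The confidence threshold is the key knob: set too low, wrong pseudo-labels reinforce themselves; set too high, the unlabeled data is never used.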
NeurIPS, (2019): 5050-5060
We introduced MixMatch, a semi-supervised learning method which combines ideas and components from the current dominant paradigms for SSL
Cited by 317
IJCAI, (2019): 3635-3641
Machine learning is having a transformative impact on diverse areas, yet its application is often limited by the amount of available labeled data
Cited by 73
AAAI, (2019)
Automated machine learning attempts to build an appropriate machine learning model for an unseen data set in an automatic manner
Cited by 4
IEEE Transactions on Pattern Analysis and Machine Intelligence, no. 8 (2018): 1979-1993
The results of our experiments on three benchmark datasets, MNIST, Street View House Numbers, and CIFAR-10, indicate that virtual adversarial training is an effective method for both supervised and semi-supervised learning
Cited by 541
CVPR, (2018)
In addition, we developed an architecture that improves landmark estimation using auxiliary attributes such as class labels by backpropagating errors through the landmark localization components of the model
Cited by 82
AAAI, (2018)
The experiments showed that generalization performance is improved by applying our adversarial dropout
Cited by 69
ICML, (2018): 2464-2473
We present a novel cost function for semi-supervised learning of neural networks that encourages compact clustering of the latent space to facilitate separation
Cited by 35
ECCV, pp.275-291, (2018)
We present a novel Memory-Assisted Deep Neural Network to enable semi-supervised deep learning on sparsely labelled and abundant unlabelled training data
Cited by 19
Inf. Sci., no. C (2017): 484-497
In this paper, we designed a new supervised learning algorithm for improving classifier performance on intrusion detection datasets by investigating a divide-and-conquer strategy in which unlabeled samples with their predicted labels are categorized according to the magni...
Cited by 306
Advances in Neural Information Processing Systems 30 (NIPS 2017): 6510-6520
Our proposed methods consistently improve the performance upon feature matching
Cited by 253
Philip Häusser, Alexander Mordvintsev, Daniel Cremers
CVPR, (2017)
We have proposed a novel semi-supervised training scheme that is fully differentiable and easy to add to existing end-to-end settings
Cited by 63
ICLR, (2017)
Our training still operates on a single network, but the predictions made on different epochs correspond to an ensemble prediction of a large number of individual sub-networks because of dropout regularization
Cited by 4
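The core of the ICLR 2017 entry above (temporal ensembling) can be sketched as accumulating an exponential moving average of per-epoch predictions, so the training target for each example reflects an ensemble of many dropout sub-networks seen across training. In this NumPy sketch the actual network is replaced by random softmax outputs; the EMA update and the bias correction follow the published recipe, but the shapes and the decay rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_examples, n_classes, n_epochs, alpha = 4, 3, 10, 0.6

ema = np.zeros((n_examples, n_classes))  # accumulated ensemble predictions Z
for epoch in range(1, n_epochs + 1):
    # Stand-in for one epoch's dropout-perturbed network outputs (softmax).
    logits = rng.normal(size=(n_examples, n_classes))
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    ema = alpha * ema + (1 - alpha) * probs      # Z <- a*Z + (1-a)*z
    targets = ema / (1 - alpha ** epoch)         # startup bias correction

print(targets.sum(axis=1))  # each row remains a proper distribution
```

The bias correction divides out the missing mass of the geometric series so that targets are valid probability distributions even in early epochs, when the EMA has seen few predictions.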
ACL, (2016)
We have presented a semi-supervised approach to training bidirectional neural machine translation models
Cited by 139
arXiv: Machine Learning, (2016)
The experiments in this paper were conducted with https://github.com/DoctorTeeth/supergan, which borrows heavily from https://github.com/carpedm20/DCGAN-tensorflow and contains more details about the experimental setup
Cited by 1
Annual Conference on Neural Information Processing Systems, pp.3546-3554, (2015)
We showed how a simultaneous unsupervised learning task improves convolutional neural networks and multi-layer perceptron networks, reaching the state of the art in various semi-supervised learning tasks
Cited by 800
ICLR, (2015)
In this paper we present a method for learning a discriminative classifier from unlabeled or partially labeled data
Cited by 464
CoRR, (2015)
Various approaches have been tried over the years, but according to the results on the challenging Pascal VOC 2012 segmentation benchmark, the best performing methods all use some kind of Deep Convolutional Neural Network
Cited by 349
We have developed new models for semi-supervised learning that allow us to improve the quality of prediction by exploiting information in the data density using generative models
Cited by 1706
Neural Networks, (2014): 110-119
We showed that the class ratios estimated by the proposed method are more accurate than competing methods, which can be translated into better classification accuracy
Cited by 98
ACL, pp.676-686, (2014)
We presented an approach that can expand a translation model extracted from a sentence-aligned, bilingual corpus using a large amount of unstructured, monolingual data in both source and target languages, which leads to improvements of 1.4 and 1.2 BLEU points over strong baseline...
Cited by 31
Keywords
Semi-Supervised Learning, Learning (Artificial Intelligence), Supervised Learning, Machine Learning, Data Mining, Unlabeled Data, Satisfiability, Feature Extraction, Classification, Classification Performance
Authors
Friedhelm Schwenker (5 papers)
Mikhail Belkin (4 papers)
Zhihua Zhou (4 papers)
Partha Niyogi (3 papers)
Masashi Sugiyama (3 papers)
Gustavo Batista (2 papers)
Yasemin Altun (2 papers)
Changshui Zhang (2 papers)
Rie Kubota Ando (2 papers)