Theoretical Foundation SNN, University of Nijmegen, SN2

Prof. dr. C.C.A.M. Gielen: probabilistic knowledge representation; continuous learning
dr. H.J. Kappen: probabilistic knowledge representation; Boltzmann machines
dr. T.M. Heskes: statistical embedding of learning methods; generalization for perceptron-type neural networks; continuous learning
dr. W.A.J.J. Wiegerinck: target application, medical diagnosis
dr. J.J. Torres: probabilistic knowledge representation; Boltzmann machines
dr. S. Stroeve: target application, medical diagnosis
A.T. Cemgil, M.Sc.: target application, Bayesian networks
M. Leisink, M.Sc.: probabilistic knowledge representation; Bayesian networks
Probabilistic knowledge representation
Traditional rule-based systems based on pure
logic cannot handle uncertain (imprecise, incomplete, or
inconsistent) data. This is especially problematic for real-world
computing applications, where complete knowledge is unattainable
except in trivial situations. The probabilistic approach can in
principle solve this problem.
However, probabilistic methods are usually too slow for practical
applications. The main problem is therefore to design robust systems
that are both semantically correct and computationally efficient.
The research aims at the design of algorithms that enable learning and
reasoning with up to the order of 1000 variables. This allows for
applications an order of magnitude larger than currently possible.
This project addresses essential aspects of Real World Intelligence and is of
crucial importance for large-scale applications in self-organizing
information databases, human-machine dialogue systems, reasoning in
large knowledge domains, and robotics. We therefore expect that the
results will make significant contributions to the results of RWI as a
whole.

In the last few years we have developed several approximate methods for
reasoning and learning in probabilistic networks. These methods are based on
mean-field theory, a well-known approximation method in statistical physics.
Whereas thermodynamic systems are very large (on the order of 10^23 elements),
probabilistic models are more modest and contain at most on the order of
10,000 elements. Because of these finite-size effects, correlations must be
treated with care, and they strongly affect the quality of the approximation.
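As a minimal sketch of the mean-field idea (not the project's actual algorithms; the network size, weights, and biases below are invented for illustration), the mean-field equations for a Boltzmann machine with units s_i in {-1, +1} read m_i = tanh(sum_j w_ij m_j + theta_i) and can be iterated to a fixed point:

```python
import math

def mean_field_magnetizations(w, theta, iters=200, damping=0.5):
    """Iterate the mean-field equations m_i = tanh(sum_j w[i][j]*m[j] + theta[i])
    to a fixed point; damping mixes old and new estimates for stability."""
    n = len(theta)
    m = [0.0] * n
    for _ in range(iters):
        new = [math.tanh(sum(w[i][j] * m[j] for j in range(n)) + theta[i])
               for i in range(n)]
        m = [damping * mi + (1 - damping) * ni for mi, ni in zip(m, new)]
    return m

# Hypothetical 3-unit network (symmetric weights, zero diagonal)
w = [[0.0, 0.5, 0.2],
     [0.5, 0.0, -0.3],
     [0.2, -0.3, 0.0]]
theta = [0.1, -0.2, 0.4]
m = mean_field_magnetizations(w, theta)
```

The m_i approximate the expected values of the units; the cost of one sweep is quadratic in the number of variables, rather than exponential as for exact inference.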
Statistical embedding of learning methods
Neural networks have been applied in many problem domains for regression
problems (fitting continuous outputs) and classification tasks (finding
the proper class). In several application domains, neural networks
regularly outperform competing algorithms. An often-heard
disadvantage of neural networks, which seems to hamper their
widespread use, is their presumed opacity.
In this research project, we aim to open up the neural black box and
develop statistical methods for quantifying the confidence of the
network's solutions.

In the last few years we have developed robust statistical methods for
pruning and weight elimination in neural networks. These methods allow
automatic selection of the network structure and show improved generalization.
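The statistical pruning criteria are not spelled out here; as a generic illustration of the idea, the sketch below uses plain weight magnitude as a stand-in saliency measure and removes the smallest fraction of weights (the function name and the threshold rule are illustrative, not the group's method):

```python
def prune_weights(weights, fraction=0.2):
    """Set the smallest-magnitude fraction of weights to zero.
    Statistical pruning methods replace |w| with a proper saliency test;
    magnitude is used here only as a simple stand-in."""
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(fraction * len(flat))
    threshold = flat[k] if k < len(flat) else float("inf")
    return [[0.0 if abs(w) < threshold else w for w in row]
            for row in weights]

# Prune half of the weights of a hypothetical 2x2 weight matrix
pruned = prune_weights([[0.05, 1.0], [-0.02, 0.8]], fraction=0.5)
```

Weights set to zero correspond to removed connections, so pruning selects a sparser network structure automatically.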
Medical diagnosis
We evaluate the probabilistic and statistical methods in practice.
A medical diagnostic system is currently being built. The system is
based on a probabilistic model and features inference with missing values,
reasoning with multiple causes,
optimal selection of actions, and
active decision support to assist the diagnostic process. The system
is developed and evaluated in close collaboration with the
Department of Internal Medicine of the University Hospital Utrecht.

The prototype medical diagnostic system will be evaluated by target
users, such as physicians. The system currently consists of 100 variables.
Preliminary evaluation shows that this approach is received with enthusiasm
by medical experts in the Netherlands.
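To illustrate the kind of inference such a system performs (reasoning with multiple causes and with missing values), here is a toy Bayesian network, not the actual diagnostic model; all variables and probabilities are invented. Two independent diseases can both cause one symptom through a noisy-OR, and a missing observation is simply marginalized out:

```python
from itertools import product

# Hypothetical toy network: diseases D1, D2 and symptom S, with
# P(S=1 | d1, d2) given by a noisy-OR. All numbers are illustrative.
p_d1, p_d2 = 0.01, 0.05          # disease priors
leak, q1, q2 = 0.001, 0.8, 0.6   # noisy-OR leak and cause strengths

def p_s_given(d1, d2):
    """Noisy-OR: S stays off only if the leak and each active cause fail."""
    p_off = (1 - leak) * (1 - q1) ** d1 * (1 - q2) ** d2
    return 1 - p_off

def posterior_d1(s=None):
    """P(D1=1 | S=s) by enumeration; s=None means the symptom is
    unobserved (missing) and is summed out of the joint."""
    num = den = 0.0
    for d1, d2, sv in product((0, 1), repeat=3):
        if s is not None and sv != s:
            continue  # inconsistent with the observed evidence
        p = (p_d1 if d1 else 1 - p_d1) * (p_d2 if d2 else 1 - p_d2)
        p *= p_s_given(d1, d2) if sv else 1 - p_s_given(d1, d2)
        den += p
        if d1:
            num += p
    return num / den
```

With the symptom missing, the posterior for D1 falls back to its prior; observing the symptom raises it. Exact enumeration like this is exponential in the number of variables, which is why the approximate methods described above are needed at the scale of 100 variables and beyond.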