The Boltzmann Machine is a highly recurrent and versatile neural network. It can be applied to practically all supervised, unsupervised, or pattern-completion problems. A Boltzmann Machine is capable of learning the underlying constraints that characterize a domain simply by being shown examples from that domain. The network modifies the strengths of its connections so as to construct an internal generative model that produces examples with the same probability distribution as the examples it is shown. Then, when shown any particular example, the network can “interpret” it by finding values of the variables in the internal model that would generate the example. When shown a partial example, the network can complete it by finding internal variable values that generate the partial example and using them to generate the remainder.
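The weight modification described above is classically done by comparing unit co-activation statistics in two phases: one with the training examples clamped onto the network, and one with the network running freely. A minimal sketch of that update (the function name, learning rate, and matrix representation are illustrative choices, not from the original text):

```python
def update_weights(weights, p_clamped, p_free, lr=0.1):
    """One step of the classic Boltzmann learning rule.

    p_clamped[i][j] and p_free[i][j] are the co-activation
    probabilities <s_i * s_j> measured at thermal equilibrium with
    the data clamped and free-running, respectively (in practice
    these are estimated by sampling). Weights move so that the
    free-running distribution matches the clamped one.
    """
    n = len(weights)
    for i in range(n):
        for j in range(n):
            if i != j:
                # Raise w_ij if units i and j co-fire more often
                # under the data than under the model, lower it otherwise.
                weights[i][j] += lr * (p_clamped[i][j] - p_free[i][j])
    return weights
```

When the two sets of statistics agree, the update is zero: the internal model already generates examples with the data's distribution.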
One of the problems cited with gradient descent is that of falling into local minima. This has been identified as a deficiency of back propagation and its derivatives. The training philosophy adopted in Boltzmann learning, although based upon gradient descent, is theoretically more likely to reach a global minimum than back propagation. It is able to do this because the machine occasionally accepts moves that increase the energy, effectively performing gradient ascent some of the time, which allows it to escape local minima.
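This occasional uphill movement comes from the stochastic unit update: at nonzero temperature, a unit turns on with a probability given by the logistic function of its energy gap, so energy-increasing states are sometimes accepted. A minimal sketch of one such update (the function name and the list-of-lists weight representation are illustrative assumptions):

```python
import math
import random

def update_unit(i, state, weights, thresholds, T):
    """Stochastically update binary unit i of a Boltzmann machine.

    delta_e is the energy gap for turning unit i on. The unit turns
    on with probability sigmoid(delta_e / T); at high temperature T
    this is close to 0.5 regardless of delta_e, so energy-increasing
    moves are frequently accepted and local minima can be escaped.
    """
    n = len(state)
    delta_e = sum(weights[i][j] * state[j] for j in range(n) if j != i) - thresholds[i]
    p_on = 1.0 / (1.0 + math.exp(-delta_e / T))
    state[i] = 1 if random.random() < p_on else 0
    return state
```

As T is lowered toward zero, the rule becomes nearly deterministic and behaves like plain downhill descent; the stochasticity at higher temperatures is what distinguishes it from back propagation.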
- Classification problems
- Image recognition
- Financial correlations
- Define weights, outcomes and thresholds
- Run in a clamped or un-clamped input mode
- Calculate the weights or find the equilibrium state of the system
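The steps above can be sketched as a single annealing run: known units are clamped (or none, for the free-running mode) and the remaining units are repeatedly updated while the temperature is lowered, until the network settles near an equilibrium state. The function name, annealing schedule, and sweep count here are illustrative assumptions:

```python
import math
import random

def run_to_equilibrium(state, weights, thresholds, clamped, schedule, sweeps_per_temp=10):
    """Anneal a Boltzmann machine toward an approximate equilibrium state.

    clamped  -- set of unit indices held fixed (the clamped input mode);
                pass an empty set for the un-clamped, free-running mode.
    schedule -- a decreasing sequence of temperatures.
    """
    n = len(state)
    free = [i for i in range(n) if i not in clamped]
    for T in schedule:
        for _ in range(sweeps_per_temp):
            # Update the free units in a random order each sweep.
            for i in random.sample(free, len(free)):
                delta_e = sum(weights[i][j] * state[j]
                              for j in range(n) if j != i) - thresholds[i]
                p_on = 1.0 / (1.0 + math.exp(-delta_e / T))
                state[i] = 1 if random.random() < p_on else 0
    return state
```

Pattern completion falls out of the same procedure: clamp the units corresponding to the known part of an example, run to equilibrium, and read the completed pattern off the free units.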