This is a short, popular account of some current work in machine learning, introduced with a description of convergence in perceptron-like networks. The author discusses Hopfield nets, which settle into “minimum energy” patterns in which the active units maximally reinforce one another, and his own Boltzmann Machines, in which unit activation is probabilistic and the network settles to a “thermal equilibrium” in the presence of this background noise. The reference list contains a couple of more substantial papers.
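For readers unfamiliar with the two models, a minimal sketch in Python (not taken from the reviewed work; the network size, random weights, and temperature here are arbitrary illustrative choices) may make the contrast concrete: the Hopfield update is deterministic and only ever lowers the energy, while the Boltzmann update flips units probabilistically, so at a fixed temperature the network samples states from an equilibrium distribution rather than freezing into the nearest local minimum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric weights with zero diagonal; unit states in {-1, +1}.
n = 8
W = rng.normal(size=(n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)
s = rng.choice([-1, 1], size=n)

def energy(W, s):
    """Hopfield energy: E = -1/2 * s^T W s (lower = more mutual reinforcement)."""
    return -0.5 * s @ W @ s

def hopfield_step(W, s, i):
    """Deterministic update: unit i aligns with its net input.
    With symmetric W and zero diagonal this never increases the energy."""
    s[i] = 1 if W[i] @ s >= 0 else -1

def boltzmann_step(W, s, i, T):
    """Stochastic update: unit i turns on with a sigmoid probability
    of the energy gap scaled by the temperature T (the 'noise')."""
    gap = 2 * (W[i] @ s)  # energy difference between s_i = -1 and s_i = +1
    p_on = 1.0 / (1.0 + np.exp(-gap / T))
    s[i] = 1 if rng.random() < p_on else -1

# Hopfield settling: sweep the units; the energy decreases monotonically.
for sweep in range(20):
    for i in range(n):
        hopfield_step(W, s, i)
print("Hopfield energy after settling:", energy(W, s))

# Boltzmann settling: at fixed T the visited states approach thermal
# equilibrium, with p(s) proportional to exp(-E(s) / T).
T = 1.0
for sweep in range(1000):
    for i in range(n):
        boltzmann_step(W, s, i, T)
```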