Optimality Theory  
 
Paul Smolensky (Johns Hopkins University): Parallel Distributed Symbol Processing: Well-formedness optimization and discretization in cognition

Does Optimality Theory provide a satisfactory basis for cognitive modeling of online processing? The description of OT as a competence theory—in which potentially infinitely many candidates are each evaluated by all constraints and then compared—is often mistaken for a performance model. But it is a basic characteristic of computation theory that an efficient algorithm (a processing theory) rarely corresponds in any direct way to the most insightful characterization (a competence theory) of the function computed by that algorithm. OT computation was originally derived from a connectionist-grounded cognitive architecture, and in this talk I will describe how (continuous) connectionist networks can compute the (discrete) outputs of OT grammars. These network computations bear no connection whatever to the sequential evaluation, by a sequence of constraints, of an infinite sequence of symbolic candidates. In fact, the only symbolic candidate ever represented in the processing system is the final output. Examples will be given illustrating gradient performance effects resulting from OT competence grammars in phonological production and syntactic comprehension. Outside the domain of language, an application of OT to cross-cultural variation in moral systems—who sleeps with whom—will also be briefly described.
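The competence-level evaluation described above—each candidate scored against ranked constraints, then compared—can be sketched as lexicographic minimization over violation profiles. This is a minimal illustrative sketch of textbook OT EVAL, not the connectionist processing model the talk describes; the constraints (ONSET, NOCODA) and candidates are standard textbook examples, and faithfulness constraints are omitted for brevity.

```python
# Hedged sketch of OT's competence-level EVAL: each candidate carries a
# tuple of violation counts, ordered from highest- to lowest-ranked
# constraint, and the optimal candidate is the one whose tuple is
# lexicographically minimal. Tableau contents are illustrative only.

def eval_ot(tableau):
    """tableau: dict mapping candidate -> tuple of violation counts,
    ordered from highest-ranked to lowest-ranked constraint.
    Python compares tuples lexicographically, which matches OT's
    strict-domination ranking."""
    return min(tableau, key=tableau.get)

# Toy tableau for an input like /VC/, with ranking ONSET >> NOCODA
# (faithfulness constraints omitted for brevity):
tableau = {
    "VC": (1, 1),   # onsetless and closed: violates both constraints
    "CVC": (0, 1),  # epenthetic onset: still violates NOCODA
    "CV": (0, 0),   # epenthetic onset, deleted coda: violates neither
}
print(eval_ot(tableau))  # -> CV
```

Note that this sequential tableau comparison is exactly the competence characterization the talk contrasts with network computation, in which no such candidate-by-candidate evaluation occurs.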
