
Random Boolean nets and features of language


Author: Hurford, R. (Dept. of Linguistics, Edinburgh Univ., UK)

Describes an attempt to cast several abstract properties of natural languages in the framework of Kauffman's (1993, 1995) random Boolean nets (RBNs). The properties are complexity, interconnectedness, stability, diversity, and underdeterminedness. A language is modeled as a Boolean net attractor. (Groups of) net nodes are linguistic principles or parameters as posited by Chomskyan theory, according to which the language learner sets parameters to appropriate values on the basis of very limited experience of the language. The setting of one parameter can have a complex effect on the settings of others. An RBN is generated and run to find an attractor. A state from this attractor is degraded, representing the degenerate language input available to the learner, and this degraded state is then fed into a net with the same connectivity and activation functions as the original to see whether it converges on the same attractor. Many nets degenerate into attractors representing complete uncertainty. Others settle at intermediate levels of uncertainty, and some manage to overcome the incompleteness of the input and converge on attractors identical to that from which the original inputs were (de)generated. Finally, an attempt was made to select a population of such successful nets using a genetic algorithm in which fitness was correlated with the ability to acquire several different languages faithfully. This has so far proved impossible, supporting the Chomskyan suggestion that the human language acquisition capacity is not the outcome of natural selection.
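The core experiment the abstract describes (generate an RBN, run it to an attractor, degrade one attractor state, and test whether the same net re-converges on the same attractor) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation; the network size, connectivity `k`, degradation fraction, and all function names are assumptions for the sketch.

```python
import random

def make_rbn(n, k, seed=0):
    """Kauffman-style random Boolean net: each of n nodes reads k
    randomly chosen inputs through a random Boolean function,
    stored here as a truth table of 2**k output bits."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronously update every node from its input nodes."""
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]
        new.append(tables[i][idx])
    return tuple(new)

def find_attractor(state, inputs, tables, max_steps=10_000):
    """Iterate until a previously seen state recurs; return the cycle."""
    seen, trajectory = {}, []
    while state not in seen and len(trajectory) < max_steps:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = step(state, inputs, tables)
    if state not in seen:
        return None
    return trajectory[seen[state]:]  # states forming the attractor cycle

def degrade(state, frac, seed=0):
    """Flip a fraction of bits, modeling the learner's degenerate input."""
    rng = random.Random(seed)
    s = list(state)
    for i in rng.sample(range(len(s)), max(1, int(frac * len(s)))):
        s[i] ^= 1
    return tuple(s)

# Hypothetical parameters for illustration only.
n, k = 12, 2
inputs, tables = make_rbn(n, k, seed=1)
start = tuple(random.Random(2).choices([0, 1], k=n))
original = find_attractor(start, inputs, tables)

# Degrade a state from the attractor and see if the same net recovers it.
noisy = degrade(original[0], frac=0.25, seed=3)
recovered = find_attractor(noisy, inputs, tables)
print("recovered original attractor:", set(original) == set(recovered))
```

The paper's further step, evolving a population of such nets with a genetic algorithm whose fitness rewards faithful acquisition of several languages, would wrap the convergence test above in a fitness function over many (attractor, degraded-state) pairs.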

Published in: IEEE Transactions on Evolutionary Computation (Volume 5, Issue 2)