Monkey language models are defined for Chinese Phrase Networks, and their scale-free features are uncovered. It is pointed out that the ratio of the average degree to the total number of nodes (⟨k⟩/N) is close to a constant. Simulation of the evolution of phrase networks indicates that one of the important causes of the power-law distributions is the word-selection frequency, which, when tuned appropriately, makes the monkey language exhibit statistical traits similar to those of natural languages. Power-law tails emerge at large k, with an exponent of about 6. Comparison between the monkey model and natural language shows that humans use Chinese word resources in more effective and compact ways to express their intentions. All the results demonstrate the important fact that the least-effort principle is the basis of Chinese Phrase Networks.
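To illustrate the kind of simulation described above, the following is a minimal sketch (not the paper's actual procedure): words are drawn from a fixed vocabulary with a Zipf-like selection frequency whose exponent S plays the role of the tunable word-selection frequency, adjacent words are linked to form a phrase network, and ⟨k⟩/N and the degree distribution are then measured. The vocabulary size V, exponent S, and text length N_WORDS are illustrative assumptions, not values from the paper.

```python
import random
from collections import Counter, defaultdict

# Hypothetical parameters; the paper's actual settings are not given in this excerpt.
V = 5_000          # vocabulary size (assumption)
S = 1.0            # word-selection frequency exponent (the "tuning" knob)
N_WORDS = 200_000  # length of the generated monkey text (assumption)

rng = random.Random(0)
vocab = list(range(V))
# Zipf-like selection frequency: lower-ranked words are chosen more often.
weights = [1.0 / (rank + 1) ** S for rank in range(V)]
words = rng.choices(vocab, weights=weights, k=N_WORDS)

# Undirected phrase network: an edge joins words that appear adjacently in the text.
neighbors = defaultdict(set)
for a, b in zip(words, words[1:]):
    if a != b:
        neighbors[a].add(b)
        neighbors[b].add(a)

degrees = [len(nbrs) for nbrs in neighbors.values()]
N = len(neighbors)
avg_k = sum(degrees) / N
print(f"N = {N}, <k> = {avg_k:.2f}, <k>/N = {avg_k / N:.4f}")

# Degree distribution P(k); a power-law tail appears roughly linear on log-log axes.
hist = Counter(degrees)
for k in sorted(hist)[:10]:
    print(k, hist[k])
```

Varying S in such a sketch changes the shape of the degree distribution, which is the sense in which the word-selection frequency can be "tuned" to reproduce natural-language-like statistics.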