2016 IEEE/ACIS 15th International Conference on Computer and Information Science (ICIS)

Abstract

This study proposes an Interactive Evolutionary Computation method that creates sound contents for multiple users. Sound contents, including music pieces and sign sounds, are often used in daily life to create a common atmosphere and to convey a certain message to everyone. The proposed method is based on a parallel distributed Interactive Genetic Algorithm (IGA) that creates visual media contents commonly suited to multiple users. In this method, each user carries out the general IGA process by evaluating solution candidates of the IGA. As a distinctive property of the method, in some generations, solution candidates are exchanged between the users. Through this exchange, each user is affected by the other users' impressions, and a solution that all of the users rate as the best is expected to be obtained. Based on this concept, we constructed an IGA system to fundamentally investigate the efficiency of the proposed method. The aim of the IGA system is to create a short melody that commonly affords a bright image. The pitch of each note was treated as a gene of the IGA, and a chord progression known as the Canon chord progression is attached to the melody when it is presented to the subjects. Ten persons participated in the experiment as subjects, and in the IGA system two users took part in the evaluation process simultaneously. Experimental results showed an almost continuous increase in the mean and maximum fitness values across the subjects. To clarify the effect of the exchange in the proposed method, a further comparative experiment including conditions with and without the exchange is needed.
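To illustrate the idea of a parallel distributed IGA with candidate exchange, the following is a minimal Python sketch. It is not the authors' implementation: the population size, melody length, exchange interval, selection scheme, and the rate_fn_a/rate_fn_b rating callbacks (which stand in for the users' subjective evaluations) are all assumptions introduced for illustration.

import random

# Hypothetical parameters; the paper does not report these values.
POP_SIZE = 8                 # melodies shown to each user per generation
MELODY_LEN = 8               # number of notes per melody
NOTE_RANGE = range(60, 73)   # MIDI note numbers, roughly one octave from middle C
EXCHANGE_EVERY = 3           # generations between candidate exchanges
GENERATIONS = 9

def random_melody():
    return [random.choice(NOTE_RANGE) for _ in range(MELODY_LEN)]

def crossover(a, b):
    point = random.randint(1, MELODY_LEN - 1)
    return a[:point] + b[point:]

def mutate(melody, rate=0.1):
    return [random.choice(NOTE_RANGE) if random.random() < rate else n for n in melody]

def evolve(population, ratings):
    # Rank candidates by the user's rating and breed children from the top half.
    ranked = [m for _, m in sorted(zip(ratings, population), key=lambda p: -p[0])]
    parents = ranked[: POP_SIZE // 2]
    return [mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(POP_SIZE)]

def run_two_user_iga(rate_fn_a, rate_fn_b):
    # rate_fn_a and rate_fn_b are placeholders for each user's interactive evaluation.
    pop_a = [random_melody() for _ in range(POP_SIZE)]
    pop_b = [random_melody() for _ in range(POP_SIZE)]
    for gen in range(1, GENERATIONS + 1):
        ratings_a = [rate_fn_a(m) for m in pop_a]   # user A's subjective scores
        ratings_b = [rate_fn_b(m) for m in pop_b]   # user B's subjective scores
        pop_a = evolve(pop_a, ratings_a)
        pop_b = evolve(pop_b, ratings_b)
        if gen % EXCHANGE_EVERY == 0:
            # Exchange: swap some candidates so each user is influenced by the other.
            k = POP_SIZE // 4
            pop_a[:k], pop_b[:k] = pop_b[:k], pop_a[:k]
    return pop_a, pop_b

In practice the rating callbacks would be replaced by the real interactive step in which each subject listens to the melodies (with the chord progression attached) and assigns a fitness score.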
