
… expressions[24]. In this system, the operation of the actuators in the head can be reflected on the robot in real time, so fine changes in facial expression, which were difficult to render perfectly in CG, could be made directly. This system brings out the expressive ability of HRP-4C's movements.

4.3 Method of using motion capture

In CG character animation, a method in which the actions of an actual person are incorporated by motion capture is also widely used. A method for applying captured human whole-body motion to the whole-body motion of a bipedal humanoid was developed[25]. Using this method, a whole-body performance of the Japanese folk dance Aizu Bandaisan was carried out by HRP-2[6].

Comparing the motion capture method with the method developed in this research, the former is, of course, better suited to recreating human actions. However, it must be noted that, due to the limitations of the method and of the robot's movements, human actions cannot be completely reproduced on the robot. On the other hand, to express actions unique to a robot, or to create high-quality actions within the limits of the robot's motion capacity, the latter method, which choreographs the robot directly, is more suitable. Moreover, the former requires a skilled performer who can move in the required way, as well as specialized equipment and a studio, while the latter needs no such equipment and can be done easily on a PC.

Considering these characteristics, for the objective of creating new content with the humanoid and spreading its use, the method of this research will be used as the base. It is highly significant that this was realized for the first time. On the other hand, the motion capture method can also be useful, and integrating the two methods is one of the future topics.

5 Voice expression support technology

The voice of the robot is an important expressive element of the content. Research has been done on speech mechanisms that simulate the human vocal cords[26]. However, such a mechanism is large, including the lungs, and cannot currently be installed in a humanoid like HRP-4C. Therefore, producing a voice source through a speaker is adequate as the source of speech. In this case, to make it look as if the robot is speaking, the robot's mouth must move according to the voice source (lip synching). To obtain the voice source, there are methods of using human speech or of using voice synthesis technology. The difference in the characteristics of the two is similar to the difference between the two methods for creating actions mentioned in subchapter 4.3. In that sense, the use of voice synthesis is more appropriate for our purpose.

From the above considerations, the issue for voice expression is to enable diverse speech and singing expressions with voice synthesis technology, in linkage with the mouth movements.

To solve this issue, we developed a system that uses VOCALOID[27], the singing voice synthesis technology of Yamaha Corporation, with whom we worked jointly, as the voice expression of HRP-4C[28]. VOCALOID was developed to synthesize singing, and it generates singing voices very close to those of humans.
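As a rough illustration of the lip synching described above, the sketch below derives a mouth-opening command from the short-time amplitude envelope of a synthesized voice track. It is a hypothetical minimal example, not the implementation of the system in reference [28]; the file format, control rate, jaw-angle limit, and the send_mouth_angle() stub are all assumptions made for illustration.

```python
# Minimal lip-sync sketch: map the amplitude envelope of a synthesized
# voice track to a mouth-opening angle command. Assumes a mono, 16-bit
# PCM WAV file; the joint limit and command interface are illustrative
# placeholders, not the actual HRP-4C / VOCALOID interface.
import wave
import numpy as np

MAX_MOUTH_ANGLE_DEG = 12.0   # assumed mechanical limit of the jaw joint
FRAME_MS = 20                # update period for mouth commands (50 Hz)

def amplitude_envelope(path: str, frame_ms: int = FRAME_MS) -> np.ndarray:
    """Return one RMS amplitude value (0..1) per control frame."""
    with wave.open(path, "rb") as w:
        rate = w.getframerate()
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    samples = samples.astype(np.float64) / 32768.0
    hop = int(rate * frame_ms / 1000)
    frames = [samples[i:i + hop] for i in range(0, len(samples), hop)]
    rms = np.array([np.sqrt(np.mean(f ** 2)) if len(f) else 0.0 for f in frames])
    peak = rms.max() if len(rms) else 1.0
    return rms / peak if peak > 0 else rms

def mouth_angles(path: str) -> np.ndarray:
    """Convert the envelope into jaw-opening angles in degrees."""
    return amplitude_envelope(path) * MAX_MOUTH_ANGLE_DEG

# Usage (hypothetical): stream one angle per 20 ms frame to the head controller.
# for angle in mouth_angles("vocaloid_line.wav"):
#     send_mouth_angle(angle)   # placeholder for the robot's command interface
```

A phoneme-timed approach, as used with synthesized voice data that carries explicit timing information, would give more natural mouth shapes than this amplitude-only sketch.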
The VOCALOID-flex technology, which applies this synthesis technology to produce natural speech with rich intonation, is also available, so diverse voice expressions are possible. The system we developed allows the robot to lip synch the voice data of VOCALOID, and this enables easy creation of natural speech and singing performances by the robot.

6 Integrated interface

For the implementation of the whole-body motion choreography system described in subchapter 4.2, it was necessary to implement various functions in a coordinated form, in addition to the essential key pose processing: the management of various data, the display and manipulation of 3D models, the sequential display of key poses, and dynamic simulation. To create integrated expressions for the robot and to have the robot perform them, it is also necessary to link the choreography system and the voice expression support technology to the robot hardware, and to provide an interface that the user can operate easily. Moreover, the usefulness as a content technology will increase further if information and media technologies for robot expression, both existing ones and those developed in the future, can be used in combination. The motion capture technology mentioned in subchapter 4.3 is an example of such a useful technology.

To realize the above, we developed the "Choreonoid framework", a software framework for the development of the integrated interface. The interface for the technology developed and selected in this study was implemented on …

Fig. 6 Examples of expression creation: (a) Smile, (b) Surprise, (c) Anger
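To make the key pose processing at the core of such a choreography tool more concrete, the following is a minimal sketch that interpolates whole-body joint angles between user-specified key poses. The data structures, smoothstep easing, and sampling period are assumptions for illustration only and do not reproduce Choreonoid's actual implementation.

```python
# Simplified key-pose interpolation sketch: given key poses (time stamps and
# joint-angle vectors), produce a smooth joint trajectory sampled at a fixed
# control period. Illustrative only; not Choreonoid's real data structures
# or interpolation scheme.
from dataclasses import dataclass
import numpy as np

@dataclass
class KeyPose:
    time: float              # seconds from the start of the motion
    angles: np.ndarray       # one angle per joint, in radians

def interpolate(key_poses: list[KeyPose], dt: float = 0.005) -> np.ndarray:
    """Smoothstep interpolation between consecutive key poses, so each
    joint starts and stops with zero velocity at every key pose."""
    key_poses = sorted(key_poses, key=lambda k: k.time)
    times = np.arange(0.0, key_poses[-1].time + dt, dt)
    trajectory = []
    for t in times:
        for a, b in zip(key_poses, key_poses[1:]):
            if a.time <= t <= b.time:           # segment containing t
                s = (t - a.time) / (b.time - a.time)
                s = 3 * s**2 - 2 * s**3          # smoothstep easing
                trajectory.append((1 - s) * a.angles + s * b.angles)
                break
        else:
            trajectory.append(key_poses[-1].angles)
    return np.array(trajectory)

# Usage (illustrative): two key poses for a 3-joint chain, sampled at 200 Hz.
poses = [KeyPose(0.0, np.zeros(3)), KeyPose(2.0, np.array([0.5, -0.3, 1.0]))]
trajectory = interpolate(poses)   # one row of joint angles per 5 ms sample
```

In an actual choreography system the interpolated trajectory would additionally be checked against joint limits, self-collision, and balance constraints through the dynamic simulation mentioned above before being sent to the robot.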
