in Fig. 13 right, the volume filled by the mounted device is about one-third of the volume of the cover. However, it does not stand out since the design is well balanced overall. We even received the question, "Where is the computer?" (3) To show that this project is an AIST effort, the AIST logo is included in the design.

5 Evaluations

5.1 Experiments using the prototype

Figure 14 shows the basic obstacle detection experiment. (1)~(4) show four scenes in chronological order. For each scene, the direction of the joystick operated by the rider is shown at the top right (up is the forward direction), the omni-directional image captured by the stereo omni-directional camera at the bottom left (rendered as a sphere with the rider at the center), and the screen of the information display interface described in section 3.7 at the bottom right. In (1), the wheelchair approaches the obstacle (a chair) ahead and automatically enters deceleration mode. Since the rider keeps pushing the joystick forward, the wheelchair stops automatically in (2), just before colliding with the obstacle. In (3), the rider pulls the joystick back to reverse, but a pedestrian approaches from behind, outside the rider's view, and the wheelchair stops automatically in (4) because of the danger of collision.

Figure 15 is an example of an automatic stop after detecting a staircase. Since the wheelchair detects level differences and descending stairs as well as obstacles on the street, it can prevent falls in advance.

Figures 16 and 17 show examples of the gesture and posture detection function. In Fig. 16, the wheelchair makes an emergency stop because it detects that the rider's posture differs greatly from the posture registered in advance. When this situation lasts longer than a preset time, the system can automatically call for assistance by cell phone. Figure 17 is an example in which the gesture detection and risk detection functions are used at the same time. When the rider reaches out from the electric wheelchair to grab something or to press an elevator button but cannot reach the target, the gesture of extending the arm can be used as a trigger to advance the wheelchair automatically toward the target while checking safety. Specifically, in (1) the rider extends his arm for 3 s or more to grab a PET bottle, assistance begins in (2), and the electric wheelchair advances slowly. It stops automatically when the arm is retracted or just before colliding with an obstacle (the table in this example), and in (3) the rider succeeds in grabbing the PET bottle.

This gesture recognition function determines the posture and gesture by a simple comparison of the pattern registered in advance with the 3D shape pattern roughly quantized into voxels, as explained in section 3.6. It can therefore be used only to detect relatively large movements, as in the examples of Figs. 16 and 17. Some users with arm disabilities have asked, "Can a slight movement of the shoulder be recognized as a gesture?" In the future, we shall consider accurate recognition of fine movements by introducing a machine learning approach.
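As a minimal illustration only (not the implementation used in the prototype), the following Python sketch shows how a coarsely voxelized 3D shape could be compared against a registered pattern; the voxel size, grid dimensions, similarity measure, and abnormality threshold are illustrative assumptions, not values from this work.

import numpy as np

def voxelize(points, origin, voxel_size=0.10, grid_shape=(20, 20, 20)):
    # Quantize a 3D point cloud (N x 3, metres) into a coarse occupancy grid.
    # The 0.10 m voxel size and 20^3 grid are assumed values for illustration.
    grid = np.zeros(grid_shape, dtype=bool)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    grid[tuple(idx[inside].T)] = True
    return grid

def pattern_similarity(observed, registered):
    # Voxel-overlap (Jaccard) score as one possible "simple comparison"
    # between the observed shape and a registered pattern.
    inter = np.logical_and(observed, registered).sum()
    union = np.logical_or(observed, registered).sum()
    return inter / union if union else 0.0

def posture_is_abnormal(observed_grid, registered_grids, threshold=0.5):
    # Flag an abnormal posture when the observed grid is far from every
    # registered "normal" posture; the 0.5 threshold is an assumption.
    best = max(pattern_similarity(observed_grid, g) for g in registered_grids)
    return best < threshold

In this sketch, an abnormal-posture flag that persists longer than a preset time would then trigger the emergency stop and the assistance call described above.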
The function shown in Fig. 17 was requested by an actual wheelchair user, and we investigated how to realize it. Much experience is required to position an electric wheelchair precisely using the joystick. In particular, approaching a table to grab something or approaching a wall to press an elevator button carries a large risk of collision with the table or wall. To avoid such problems, human assistance may be sought whenever fine positioning is necessary. However, such assistance is needed dozens or even several hundred times a day, which makes the rider hesitant to ask and may keep him or her from going outdoors. Can machines support such seemingly minor but high-frequency assistance? That was the users' request, and we investigated it.

These are the basic functions of the prototype, and we implemented several functions as extensions of them. In Fig. 18, the wheelchair recognizes the nearest person and automatically tracks that person face to face at a distance of 1 m. Since all directions are monitored constantly, it does not lose track even if the person moves suddenly. In the future, we are considering a function that automatically tracks a specific person (such as an assistant) using face recognition technology. Figure 19 shows an experiment on automatic route selection in a crowd. In Fig. 19(2), the wheelchair is surrounded by several people, but because it observes all directions at once it instantly decides on a direction it can take and escapes automatically in (4). In a crowd, the environment changes dynamically and constantly, and the situation may change if time is taken to gather information. Since the stereo omni-directional camera gathers information simultaneously in all directions, control is possible using the latest information at all times for all

Fig. 15 Detection of descending stairs. Descending stairs and bumps/dips are detected; the wheelchair automatically decelerates or stops if it determines that the situation is dangerous.
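As a minimal sketch of the decelerate-then-stop behavior described for Figs. 14 and 15 (not the control law actually used in the prototype), the following Python function limits the commanded speed according to the nearest obstacle or drop-off distance measured in the direction of travel; the distance thresholds and the linear speed scaling are assumptions introduced here for illustration.

def limit_speed(commanded_speed, obstacle_distance,
                stop_dist=0.5, slow_dist=1.5):
    # commanded_speed: rider's joystick command (m/s), sign gives direction
    # obstacle_distance: nearest obstacle or drop-off in that direction (m)
    # stop_dist, slow_dist: assumed thresholds, not values from the paper
    if obstacle_distance <= stop_dist:
        return 0.0                      # automatic stop before collision
    if obstacle_distance < slow_dist:
        # linear deceleration between the two thresholds
        scale = (obstacle_distance - stop_dist) / (slow_dist - stop_dist)
        return commanded_speed * scale
    return commanded_speed              # free space: follow the joystick

Because the stereo omni-directional camera covers all directions at once, the same check can be applied to backward motion (the pedestrian approaching from behind in Fig. 14) simply by evaluating the free distance in whatever direction the joystick commands.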