image are measured to detect the peak. Once the distance to the image surface in focus is obtained, the distance to the object can be calculated using Gauss' lens law.

Whether a point is in focus can be determined by conducting a local spatial frequency analysis around the observation point of the image while moving the focal distance f, the object distance l, or the image distance l′. The point at which the local spatial frequency is greatest is the point in focus. This method is often used in automatic focus mechanisms, and one can see intuitively that out-of-focus areas contain low frequencies while in-focus areas contain high frequencies. Basically, the focus of the lens is moved with a variable focus mechanism, the images are captured one by one, and a local spatial frequency analysis is conducted around every pixel of each image. For each pixel, the frequency peak, that is, the image in which that pixel is in focus, is picked out, and the in-focus pixels are pasted together into a single image to obtain an all-in-focus image. The 3D data can also be obtained from the focal distance and the image distance at each point.

[Fig. 3 Depth from focus method (lens, object, imaging surface, focal distance, in-focus and out-of-focus regions, local frequency).]

There are many ways to assess the degree of focus of an image, such as observing the change in brightness while the focal distance is varied. In this paper, with the final product realization in mind, we define the Image Quality Measure (IQM) by the following equation, which assesses the local spatial frequency around each pixel through the spatial dispersion of the image brightness values; we chose this measure because its image-processing algorithm can easily be implemented in hardware. The IQM was originally defined as one of the indices that indicate the clarity of an image, not as a criterion for whether an image is in focus. However, we adopted it because, assuming that the image is digitized and processed digitally, the processing algorithm can easily be adapted to higher speed in the future.

IQM = \frac{1}{D} \sum_{x=x_i}^{x_f} \sum_{y=y_i}^{y_f} \left\{ \sum_{p=-L_c}^{L_c} \sum_{q=-L_r}^{L_r} \left| I(x,y) - I(x+p,y+q) \right| \right\}

Here, (−L_c, −L_r)−(L_c, L_r) and (x_i, y_i)−(x_f, y_f) are the small regions over which the dispersion assessment and the smoothing are conducted, respectively, and D is the total number of pixels evaluated, used for normalization per pixel. The IQM value is assessed for each pixel or region while the focal distance is moved, the peak of the IQM value is detected, the object distance l is calculated from the focal distance f and the image distance l′, and this distance is stored in the matrix element corresponding to each pixel position to create the 3D data of the object.
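To make the procedure concrete, the following is a minimal Python/NumPy sketch of the IQM and of the depth-from-focus peak search at a single observation point. It is not the authors' implementation: the function names, window sizes, the wrap-around border handling, and the use of floating-point arithmetic are illustrative assumptions, and the real-time system described below computes the same quantities quite differently.

```python
import numpy as np

def iqm(patch, lc=1, lr=1):
    """Image Quality Measure of a grey-level patch: absolute brightness
    differences between each pixel and its neighbours in the window
    (-lc..lc, -lr..lr), summed over the patch and normalized by the
    number of pixels D.  Borders wrap around here purely for brevity."""
    img = patch.astype(np.float64)
    total = 0.0
    for p in range(-lc, lc + 1):
        for q in range(-lr, lr + 1):
            shifted = np.roll(np.roll(img, -q, axis=0), -p, axis=1)
            total += np.abs(img - shifted).sum()
    return total / img.size                      # D = number of pixels

def depth_at_point(stack, image_distances, focal_length, y, x, half=4):
    """Depth from focus at one observation point (assumed to lie at
    least `half` pixels from the border): evaluate the IQM of a small
    window around (y, x) in every image of the focus stack, take the
    image distance l' at the IQM peak, and recover the object distance
    l from Gauss' lens law 1/f = 1/l + 1/l'."""
    scores = [iqm(frame[y - half:y + half + 1, x - half:x + half + 1])
              for frame in stack]
    l_img = image_distances[int(np.argmax(scores))]   # l' at the peak
    return focal_length * l_img / (l_img - focal_length)
```

With a stack of about 30 images of 256 × 256 pixels, as in the text, evaluating this at every pixel is exactly the workload that motivates the sequential, hardware-oriented configuration described next.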
3.2 Configuration of sequential processing – for reduced load on memory

As mentioned before, it is theoretically possible to obtain both the all-in-focus image and the depth image simultaneously using the depth from focus method. However, in 2000 when we started this development, computing the IQM values required about 2 Mbytes of image memory, and obtaining one all-in-focus image and one depth image in real time from about 30 images of 256 × 256 pixels meant capturing and processing 30 shots/sec. × 30 frames = 900 images per second; a PC of 2000 needed about 3 min. for this.

If N images focused at different depths must be captured to produce an all-in-focus image and a depth image at 30 frames/sec., an image capture device with high dynamic range that can shoot at 30 × N frames/sec. is required. Moreover, a high-speed processing system is required to process and display this volume of image data.

In an automatic focus camera, the IQM value needs to be calculated only at one or a few points, and the focus is moved according to that value. To obtain an all-in-focus image, however, the calculation must be done efficiently for every pixel within 33 ms.

The method the authors devised for optimizing the memory usage through the structure of the algorithm, and thereby overcoming this hardware limitation, is described below. In the following chapter, the configuration that optimizes the processing speed by exploiting the hardware characteristics is explained.

When this IQM processing is conducted at all pixel points, it is inefficient to temporarily store the whole set of images taken at different focal distances. We therefore constructed a configuration using the sequential algorithm with steps (1)-(7) shown below. Figures 4 and 5 show the main system configuration diagram and the flow chart, respectively.
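Purely to illustrate the memory-saving idea of the sequential algorithm (not the actual steps (1)-(7) or the hardware of Figs. 4 and 5), the sketch below, under the same assumptions as the previous one, folds each newly captured frame into three frame-sized running buffers (best IQM so far, all-in-focus image, depth map) instead of storing the whole stack of differently focused images.

```python
import numpy as np

def iqm_map(frame, lc=1, lr=1):
    """Per-pixel IQM: absolute brightness dispersion over the
    (2*lc+1) x (2*lr+1) neighbourhood of every pixel (the additional
    smoothing over (x_i, y_i)-(x_f, y_f) is omitted for brevity)."""
    img = frame.astype(np.float64)
    score = np.zeros_like(img)
    for p in range(-lc, lc + 1):
        for q in range(-lr, lr + 1):
            shifted = np.roll(np.roll(img, -q, axis=0), -p, axis=1)
            score += np.abs(img - shifted)
    return score

class SequentialAllInFocus:
    """Keeps only three frame-sized buffers instead of the whole stack
    of N differently focused images."""
    def __init__(self, shape):
        self.best_iqm = np.full(shape, -np.inf)
        self.all_in_focus = np.zeros(shape)
        self.depth = np.zeros(shape)           # object distance per pixel

    def update(self, frame, object_distance):
        """Fold one newly captured frame, taken at a known focal setting
        (hence a known in-focus object distance), into the running
        all-in-focus image and depth map."""
        score = iqm_map(frame)
        sharper = score > self.best_iqm        # pixels sharper than before
        self.best_iqm[sharper] = score[sharper]
        self.all_in_focus[sharper] = frame[sharper]
        self.depth[sharper] = object_distance
        return self.all_in_focus, self.depth
```

After the last of the N focal settings has been folded in, the two output buffers are the all-in-focus image and the depth image; the per-frame update is the work that has to fit within the 33 ms budget mentioned above.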
