

Journal of Information Science and Engineering, Vol. 31, No. 3, pp. 1071-1084


Clip Space Sample Culling for Motion Blur and Defocus Blur


YI-JENG WU1, DER-LOR WAY2, YU-TING TSAI3 AND ZEN-CHUNG SHIH1 
1Institute of Multimedia Engineering 
National Chiao Tung University 
Hsinchu, 300 Taiwan 
2Department of New Media Art 
Taipei National University of Arts 
Taipei, 112 Taiwan 
3Department of Computer Science and Engineering 
Yuan Ze University 
Chungli, Taoyuan, 320 Taiwan


    Motion blur and defocus blur are two common visual effects for rendering realistic camera images. This paper presents a novel clip space culling method for stochastic rasterization that renders motion blur and defocus blur effects. The proposed algorithm reduces sample coverage using clip space information in the camera lens domain (UV) and the time domain (T). In stage I, samples outside the camera lens bounds are culled using the linear relationship between the lens position and the vertex position. In stage II, samples outside the time bounds are culled using triangle similarity in clip space to find the intersection time. The two linear bounds are computed only once per pixel. The method achieves high sample test efficiency at low computational cost for real-time stochastic rasterization. Finally, the proposed method is evaluated in various experiments and compared with previous works; it handles both blur effects simultaneously and outperforms prior approaches.
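    To make the two-stage idea concrete, the C++ fragment below sketches the per-sample test that such culling implies. The Sample, LensBounds, and TimeBounds structures and the surviveCulling helper are illustrative assumptions; deriving the actual UV and T bounds per pixel in clip space is the paper's contribution and is not reproduced here.

// Hypothetical sketch: per-sample culling against precomputed lens (UV)
// and time (T) bounds before the full stochastic coverage test.

struct Sample {            // one stochastic sample within a pixel
    float x, y;            // subpixel position
    float u, v;            // camera lens coordinates (UV domain)
    float t;               // shutter time in [0, 1] (T domain)
};

struct LensBounds { float uMin, uMax, vMin, vMax; };  // stage I bounds (assumed form)
struct TimeBounds { float tMin, tMax; };              // stage II bounds (assumed form)

// Returns true only if the sample survives both culling stages and must
// still be handed to the full (expensive) triangle coverage test.
bool surviveCulling(const Sample& s, const LensBounds& lens, const TimeBounds& time)
{
    // Stage I: cull samples whose lens position cannot see the triangle.
    if (s.u < lens.uMin || s.u > lens.uMax ||
        s.v < lens.vMin || s.v > lens.vMax)
        return false;

    // Stage II: cull samples whose shutter time lies outside the interval
    // during which the moving triangle can overlap this pixel.
    if (s.t < time.tMin || s.t > time.tMax)
        return false;

    return true;
}

    Samples rejected here never reach the coverage test, which is what raises sample test efficiency (STE); the bounds themselves are computed once per pixel, as stated in the abstract.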


Keywords: motion blur, defocus blur, stochastic rasterization, sample test efficiency (STE), focal depth
