

Vol. 24, No. 18 | 5 Sep 2016 | OPTICS EXPRESS 20253

Real-time 3-D shape measurement with composite phase-shifting fringes and multi-view system
Tianyang Tao,1,2,3 Qian Chen,2,4 Jian Da,1,2 Shijie Feng,1,2 Yan Hu,1,2 and Chao Zuo1,2,*

1 Smart Computational Imaging (SCI) Laboratory, Nanjing University of Science and Technology, Nanjing, Jiangsu Province 210094, China
2 Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense, Nanjing University of Science and Technology, Nanjing, Jiangsu Province 210094, China
3 wowin110@163.com
4 chenqian@njust.edu.cn
* surpasszuo@163.com

Abstract: In recent years, fringe projection has become an established and essential method for dynamic three-dimensional (3-D) shape measurement in fields such as online inspection and real-time quality control. Numerous high-speed 3-D shape measurement methods have been developed by employing high-speed hardware, by minimizing the number of projected patterns, or both. However, dynamic 3-D shape measurement of arbitrarily shaped objects at full sensor resolution, without the need for additional pattern projections, remains a major challenge. In this work, we introduce a high-speed 3-D shape measurement technique based on composite phase-shifting fringes and a multi-view system. A geometry constraint is adopted to search for corresponding points independently, without additional images. Meanwhile, by analysing the 3-D position and the main wrapped phase of each corresponding point, pairs with an incorrect 3-D position or a considerable phase difference are effectively rejected. All of the qualified corresponding points are then corrected, and the unique one, along with its period order, is selected through the embedded triangular wave. Finally, considering that some points can be captured by only one of the cameras due to occlusions, and may therefore have different fringe orders in the two views, a left-right consistency check is employed to eliminate the erroneous period orders in this case. Several experiments on both static and dynamic scenes are performed, verifying that our method can achieve a speed of 120 frames per second (fps) with 25-period fringe patterns for fast, dense, and accurate 3-D measurement.
© 2016 Optical Society of America
OCIS codes: (120.0120) Instrumentation, measurement and metrology; (150.6910) Three-dimensional sensing;
(120.5050) Phase measurement; (150.0150) Machine vision.

References and links


1. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: whither we are?” Opt. Eng. 48, 133–140 (2010).
2. X. Su and Q. Zhang, “Dynamic 3-D shape measurement method: a review,” Opt. Lasers Eng. 48(2), 191–204 (2010).
3. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt.
Lasers Eng. 48(2), 149–158 (2010).
4. S. Van der Jeught and J. J. J. Dirckx, “Real-time structured light profilometry: a review,” Opt. Lasers Eng. (2016, in press).
5. Y. Gong and S. Zhang, “Ultrafast 3-D shape measurement with an off-the-shelf DLP projector,” Opt. Express 18(19),
19743–19754 (2010).
6. K. Liu, Y. Wang, D. L. Lau, Q. Hao, and L. G. Hassebrook, “Dual-frequency pattern scheme for high-speed 3-D
shape measurement,” Opt. Express 18(5), 5229–5244 (2010).
7. C. Zuo, Q. Chen, G. Gu, S. Feng, and F. Feng, “High-speed three-dimensional profilometry for multiple objects with
complex shapes,” Opt. Express 20(17), 19493–19510 (2012).
8. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,”
Appl. Opt. 22(24), 3977–3982 (1983).
9. Q. Zhang and X. Su, “High-speed optical measurement for the drumhead vibration,” Opt. Express 13(8), 3110–3116
(2005).

#267382 http://dx.doi.org/10.1364/OE.24.020253
Journal © 2016 Received 31 May 2016; revised 12 Aug 2016; accepted 14 Aug 2016; published 24 Aug 2016

10. X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001).
11. L. Guo, X. Su, and J. Li, “Improved Fourier transform profilometry for the automatic measurement of 3D object
shapes,” Opt. Eng. 29(12), 1439–1444 (1990).
12. V. Srinivasan, H. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Appl.
Opt. 23(18), 3105–3108 (1984).
13. J. Li, L. G. Hassebrook, and C. Guan, “Optimized two-frequency phase-measuring-profilometry light-sensor temporal-
noise sensitivity,” J. Opt. Soc. Am. A 20(1), 106–115 (2003).
14. X. Su, G. Von Bally, and D. Vukicevic, “Phase-stepping grating profilometry: utilization of intensity modulation analysis in complex objects evaluation,” Opt. Commun. 98(1-3), 141–150 (1993).
15. H. S. Abdul-Rahman, M. A. Gdeisat, D. R. Burton, M. J. Lalor, F. Lilley, and C. J. Moore, “Fast and robust
three-dimensional best path phase unwrapping algorithm,” Appl. Opt. 46(26), 6623–6635 (2007).
16. G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of gray-code and
phase-shift light projection: analysis and compensation of the systematic errors,” Appl. Opt. 38(31), 6565–6573
(1999).
17. Y. Wang and S. Zhang, “Superfast multifrequency phase-shifting technique with optimal pulse width modulation,”
Opt. Express 19(6), 5149–5155 (2011).
18. Y. Zhang, Z. Xiong, and F. Wu, “Unambiguous 3D measurement from speckle-embedded fringe,” Appl. Opt. 52(32),
7797–7805 (2013).
19. Y. Wang, K. Liu, Q. Hao, D. L. Lau, and L. G. Hassebrook, “Period coded phase shifting strategy for real-time 3-D structured light illumination,” IEEE Trans. Image Process. 20(11), 3001–3013 (2011).
20. T. Weise, B. Leibe, and L. Van Gool, “Fast 3d scanning with automatic motion compensation,” in 2007 IEEE Conference on Computer Vision and Pattern Recognition, pp. 1–8 (2007).
21. D. Li, H. Zhao, and H. Jiang, “Fast phase-based stereo matching method for 3D shape measurement,” in 2010 International Symposium on Optomechatronic Technologies (ISOT), pp. 1–5 (2010).
22. C. Bräuer-Burchardt, C. Munkelt, M. Heinze, P. Kühmstedt, and G. Notni, “Using geometric constraints to solve the point correspondence problem in fringe projection based 3D measuring systems,” in International Conference on Image Analysis and Processing, pp. 265–274 (2011).
23. Z. Li, K. Zhong, Y. Li, X. Zhou, and Y. Shi, “Multiview phase shifting: a full-resolution and high-speed 3D
measurement framework for arbitrary shape dynamic objects,” Opt. Lett. 38(9), 1389–1391 (2013).
24. P. Fua, “A parallel stereo algorithm that produces dense depth maps and preserves image features,” Mach. Vis. Appl. 6(1), 35–49 (1993).
25. S. Zhang and P. S. Huang, “Phase error compensation for a 3-D shape measurement system based on the phase-shifting
method,” Opt. Eng. 46(6), 063601 (2007).
26. C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: a comparative review,” Opt. Lasers Eng. 85, 84–103 (2016).
27. S. Zhang and P. S. Huang, “Novel method for structured light system calibration,” Opt. Eng. 45(8), 083601 (2006).
28. S. Feng, Q. Chen, and C. Zuo, “Graphics processing unit–assisted real-time three-dimensional measurement using
speckle-embedded fringe,” Appl. Opt. 54(22), 6865–6873 (2015).
29. V. I. Gushov and Y. N. Solodkin, “Automatic processing of fringe patterns in integer interferometers,” Opt. Lasers Eng. 14(4-5), 311–324 (1991).
30. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, “Frequency-multiplex Fourier-transform profilometry:
a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface
isolations,” Appl. Opt. 36(22), 5347–5354 (1997).
31. J. Zhong and Y. Zhang, “Absolute phase-measurement technique based on number theory in multifrequency grating
projection profilometry,” Appl. Opt. 40(4), 492–500 (2001).
32. K. H. Rosen, Elementary Number Theory and its Applications (Pearson Education, 2005).
33. C. Zuo, Q. Chen, G. Gu, S. Feng, F. Feng, R. Li, and G. Shen, “High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection,” Opt. Lasers Eng. 51(8), 953–960 (2013).

1. Introduction
The dynamic measurement of three-dimensional (3-D) scenes has become increasingly important in fields such as industrial quality control and human-machine interaction [1]. Due to its non-contact, full-field, and high-resolution nature, fringe projection profilometry (FPP) has proven to be one of the most promising techniques for measuring the motion or deformation of dynamic objects. Recently, numerous dynamic 3-D measurement techniques have been developed based on fringe projection [2–4]. Unlike applications with static objects, the primary problem in this field is to reliably recover the 3-D information of moving objects or dynamic scenes with a minimized total acquisition time, so as to reduce potential motion artifacts.

Generally, this problem can be approached from two directions. The first is to increase the speed of pattern projection and image acquisition, which can be realized by using digital-light-processing (DLP) technology and high-frame-rate hardware, as demonstrated in [5–7]. The other direction focuses on minimizing the number of projected patterns. However, the unambiguous measurement of 3-D dynamic scenes in the presence of large discontinuities and spatially isolated surfaces, with full sensor resolution and minimal projected patterns, is still a challenge.
To deal with this problem, some researchers have resorted to Fourier transform profilometry (FTP) [8, 9], in which a single fringe pattern is sufficient to retrieve the phase. However, due to the global character of the Fourier transform, the phase calculated at an arbitrary pixel depends on the whole recorded fringe pattern, so FTP methods suffer from the frequency-band overlapping problem caused by variations in object texture and excessive object slopes [10]. Although this problem can be remedied by the π phase-shifting FTP [11], the demand for an additional fringe pattern compromises the single-shot nature of FTP to some extent. Besides, the measurement accuracy of FTP is seriously limited by the noise and the bandwidth of the modulating signal of the pattern being analyzed.
Another well-known method used in 3-D measurement is phase-shifting profilometry (PSP) [12]. Compared to FTP, PSP offers higher accuracy, higher resolution, and greater insensitivity to ambient light [13, 14]. Theoretically, at least three fringe patterns are required
to get the wrapped phase in PSP. Abdul-Rahman et al. [15] utilized a spatial phase constraint to unwrap the phase, employing only three patterns. However, this method rests on the assumption that the height jumps of the measured object are smaller than one fringe period, so it is not suited to discontinuous or step-like surfaces whose phase jumps exceed 2π, which are common in practical measurements. In fact, extra information or constraints are necessary to resolve such irregular jumps in the period order. In [16, 17], at least three more fringe patterns
are projected to get extra phase information which is used to help recover the absolute phase
of an arbitrary surface. However, these two methods do not make full use of the redundancy in PSP, with which the number of additional fringe patterns could be reduced to fewer than three. Liu et al. [6] and Zuo et al. [7] presented the dual-frequency pattern scheme and the 2+2 phase-shifting method respectively, where only five or four fringe patterns are adequate to retrieve a complex phase map under the presumption that the average intensity and even the modulation remain constant within a certain period of time. Nevertheless, the scheme with five or four fringe patterns is not the most efficient way to unwrap the phase. Moreover, the additional patterns are not desirable for
high-speed measurement. To further reduce the sensitivity to dynamic scenes, several methods
were presented which can calculate the absolute phase of a complex surface using only three
fringe patterns. These techniques can be divided into two categories. The first is the pattern-embedded method, in which a special signal such as a speckle [18] or a triangular wave [19] is embedded into the original fringe patterns, providing extra information for phase unwrapping. However, the algorithm in the speckle-embedded scheme [18] is based on time-consuming image correlation, which limits the measurement speed in time-critical conditions. Where the encoding signal is a triangular wave [19], the method is sensitive to noise, so the measurement precision cannot be guaranteed. In the second category, a multi-camera system is employed to eliminate the ambiguity of the period order with a geometry constraint [20–23]. Generally, only rough period orders are available from this geometry constraint, so an optimization algorithm or an accurate depth constraint is necessary
to refine the results in this method. In [20], graph cuts and loopy belief propagation are used for optimization, but this results in very high computation costs. Following this work, Bräuer-Burchardt et al. [22] and Li et al. [23] utilized the measurement volume to restrict the correspondence search range on the epipolar line. Without the complex optimization, a high-speed measurement rate can still be obtained given a priori knowledge of the measurement volume. However, to keep the choice of the correct correspondence robust, the measurement volume must be set small enough when

using dense fringes.


In this work, a multi-view phase unwrapping method using three composite fringes is proposed. The composite patterns are generated by embedding the triangular wave into the original phase-shifting fringes under the guidance of number theory, which ensures uniqueness for each point. Benefiting from the geometry constraint, each pixel can explore its corresponding points independently through the wrapped phase map; in this process, by analysing the 3-D position of each corresponding point, any correspondence whose 3-D position exceeds the boundaries of the geometry constraint is ruled out. Meanwhile, correspondences with a considerable difference in the wrapped phase of the sinusoidal wave are also eliminated. After these steps, few candidates remain, and we select the one with the closest value in the triangular wave as the optimal correspondence. However, in a multi-view system, not all points can be captured by both cameras because of shaded areas, which means some points have no correct correspondence. Since the final period order of such a point in the first camera and that of its corresponding point in the second camera always differ, a left-right consistency check is well suited to eliminating the period order in this case [24]. Compared with traditional techniques, this system has several merits: (1) the sensitivity to movement decreases because no additional patterns are required; (2) the robustness of phase unwrapping improves, which is useful for high-precision measurement; (3) no complex algorithms are involved, which ensures the high speed of our system. These features make our method suitable for high-speed 3-D measurement with high precision. To accelerate the calculation, all these steps are implemented on a GPU, and the 3-D reconstruction rate reaches 120 fps at a resolution of 644 × 484.

2. Principle
2.1. Three-step phase-shifting algorithm
Phase-shifting profilometry (PSP) is a well-known fringe projection method for retrieving 3-D information. A set of phase-shifted sinusoidal patterns is projected, and the phase is calculated at each point. The minimum number of images is three, but more images improve the accuracy of the reconstructed phase. However, considering the sensitivity to dynamic scenes, we use three images, which also allows the patterns to be projected at high speed by the modified DLP projector. Assuming a linear projector, a linear camera, constant lighting, and a static object during the recording interval, the intensities at each pixel of the three images can be described by the following formulas [7, 12, 25]

I₁ = r(I_dc + I_am1) + I_am2 + r·I_mod·cos(φ − θ),
I₂ = r(I_dc + I_am1) + I_am2 + r·I_mod·cos(φ),   (1)
I₃ = r(I_dc + I_am1) + I_am2 + r·I_mod·cos(φ + θ),

where I₁, I₂, and I₃ are the recorded intensities, r is the reflectivity, I_dc is the DC component, I_am1 and I_am2 are the ambient light with and without reflection respectively, I_mod is the signal amplitude, φ is the phase, and θ is the constant phase shift. The phase φ corresponds to the projector coordinate and is computed as

φ = 2πN·x_p / w,   (2)

where x_p is the projector x coordinate, w the horizontal resolution of the projection pattern, and N the number of periods of the sinusoidal fringes. This means that if the phase φ is known, the 3-D position can be calculated using the calibration parameters between camera and projector [27]. The wrapped phase φ′ ∈ [0, 2π) can be calculated as follows [7, 12, 25]

φ′ = arctan[ tan(θ/2) · (I₁ − I₃) / (2I₂ − I₁ − I₃) ].   (3)

In our system, we use a phase-shift offset of θ = 2π/3, which gives the following final expressions for the phase φ′:

φ′ = arctan[ √3(I₁ − I₃) / (2I₂ − I₁ − I₃) ],
r(I_dc + I_am1) + I_am2 = (I₁ + I₂ + I₃)/3,   (4)
r·I_mod = √[ (I₃ − I₁)²/3 + (2I₂ − I₁ − I₃)²/9 ],
where r·I_mod is calculated in order to remove unreliable points with weak reflectivity. The φ′ derived from Eq. (4) is known as the wrapped phase, and the relationship between the unwrapped phase φ and φ′ can be written as

φ = φ′ + 2kπ,   k ∈ [0, N − 1],   (5)
where k, an integer, is the so-called period order. If k is known, φ can always be computed, even if some of the wrapped phase information is missing. However, since spatial approaches fail for discontinuous targets, k is not available from just three high-frequency fringe patterns in traditional PSP methods. In order to establish an accurate and reliable correspondence between camera and projector without ambiguity, a phase unwrapping algorithm is needed to construct the continuous phase map [26]. In this system, phase unwrapping is based on the multi-view system and composite fringe patterns. These patterns are designed by embedding an auxiliary signal into the original phase-shifting fringe patterns, so the intensities captured by the camera are rewritten as
I₁ = r(I_dc + I_am1 + I_e) + I_am2 + r·I_mod·cos(φ − θ),
I₂ = r(I_dc + I_am1 + I_e) + I_am2 + r·I_mod·cos(φ),   (6)
I₃ = r(I_dc + I_am1 + I_e) + I_am2 + r·I_mod·cos(φ + θ),

where I_e is the embedded signal. Then Eq. (4) is updated as

φ′ = arctan[ √3(I₁ − I₃) / (2I₂ − I₁ − I₃) ],
r(I_dc + I_am1 + I_e) + I_am2 = (I₁ + I₂ + I₃)/3,   (7)
r·I_mod = √[ (I₃ − I₁)²/3 + (2I₂ − I₁ − I₃)²/9 ].
The exact I_e cannot be derived from Eq. (7), but considering that both I_am1 and I_am2 are much smaller than I_e, and that I_dc is a constant, a substitute value I_e′ for I_e is used in this paper. Meanwhile, in order to avoid extra calculation and eliminate the effect of reflectivity, I_e′ is computed as

I_e′ = [ r(I_dc + I_am1 + I_e) + I_am2 ] / (r·I_mod).   (8)

The intensity of the embedded signal is represented by I_e′ in the following sections.
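To make the decoding steps concrete, the following minimal NumPy sketch implements Eqs. (7) and (8) for three 2π/3 phase-shifted composite images. The function name and the modulation threshold are illustrative choices, not part of the original method:

```python
import numpy as np

def decode_composite(I1, I2, I3, mod_thresh=5.0):
    """Decode three 2*pi/3 phase-shifted composite fringe images (Eqs. (7)-(8)).

    Returns the wrapped phase phi' in [0, 2*pi), the modulation r*I_mod
    (used to mask unreliable low-reflectivity pixels), and the normalized
    embedded signal Ie'.
    """
    I1, I2, I3 = (np.asarray(I, dtype=np.float64) for I in (I1, I2, I3))

    # Wrapped phase of the sinusoidal carrier, Eq. (7), mapped to [0, 2*pi).
    phi = np.mod(np.arctan2(np.sqrt(3.0) * (I1 - I3),
                            2.0 * I2 - I1 - I3), 2.0 * np.pi)

    # Mean term r*(Idc + Iam1 + Ie) + Iam2 and modulation r*I_mod, Eq. (7).
    mean_term = (I1 + I2 + I3) / 3.0
    modulation = np.sqrt((I3 - I1) ** 2 / 3.0
                         + (2.0 * I2 - I1 - I3) ** 2 / 9.0)

    # Normalized embedded signal, Eq. (8); the reflectivity r cancels
    # in the ratio. Pixels with weak modulation are masked to zero.
    valid = modulation > mod_thresh
    Ie = np.where(valid, mean_term / np.maximum(modulation, 1e-9), 0.0)
    return phi, modulation, Ie
```

With noiseless synthetic fringes the wrapped phase and modulation are recovered exactly, which is a quick sanity check of the sign conventions in Eq. (7).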

2.2. Choice of encoding signals


The speckle signal and the triangular wave are the mainstream auxiliary signals in 3-D measurement [18, 19, 28]. In this section, the performance of these two signals is re-evaluated in combination with the multi-view system to determine which one is suitable for our system.

Image distortion, including stretch, skew, and rotation, caused by perspective transformation is a common phenomenon in multi-camera systems, and its impact on the embedded signal must be taken into account. As shown in Fig. 1, p and p_corr are a pair of corresponding points in the left



Fig. 1. Illustration of distortion and deviation in multi-view system. (a) Speckle pattern
derived from the composite patterns in the first camera; (b) Speckle pattern derived from
the composite patterns in the second camera; (c) Subimage around point p in (a); (d)-(e)
Subimages around corresponding points of p.

image and the right image, but the same-size sub-images shown in Figs. 1(c) and 1(d), centered on these two points, contain some differences, which negatively affects the correlation algorithm in Eq. (9):

corr = Σ (I_e′ − Ī_e′)(I_e′(p_corr) − Ī_e′(p_corr)) / { √[ Σ (I_e′ − Ī_e′)² ] · √[ Σ (I_e′(p_corr) − Ī_e′(p_corr))² ] },   (9)

where the sums run over the neighborhoods s and s_corr, I_e′ is the intensity of a point in the neighborhood s around p, and I_e′(p_corr) that in the neighborhood s_corr around p_corr; Ī_e′ and Ī_e′(p_corr) denote their average intensities respectively. On the other hand, the p_corr calculated with the calibration parameters deviates from the ideal position when the system error is considered, as does the neighborhood s_corr, which is also shown in Figs. 1(c) and 1(d). Because both the distortion caused by the perspective transformation and the deviation resulting from system error prevent s from matching s_corr correctly, the correlation result in Eq. (9) is no longer ideal. As shown in Figs. 1(c) and 1(d), although these two sub-images correspond to each other, the corr is only 0.0484, lower than the 0.0996 of Figs. 1(c) and 1(e), which form a false correspondence. From Eq. (9), it is easily seen that Ī_e′, Ī_e′(p_corr), and corr are calculated over all pixels in s and s_corr, and in particular a square-root operation is involved. Besides, since Figs. 1(a) and 1(b) can vary over time, this computation cannot be precomputed in a look-up table (LUT). Although the calculation of corr for different pixels can be accelerated on a GPU owing to its parallel nature, the calculation for a single pixel is still time-consuming. The above discussion reveals that a composite pattern encoded with speckle is ill-suited to the multi-view system because of its weak recognition ability and intensive computation.
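For reference, Eq. (9) is the standard zero-normalized cross-correlation; a minimal sketch (function name ours) makes the per-candidate cost of the speckle approach explicit:

```python
import numpy as np

def zncc(patch_a, patch_b):
    """Zero-normalized cross-correlation between two equal-sized patches,
    as in Eq. (9). Returns a value in [-1, 1]; 1 means a perfect match."""
    a = np.asarray(patch_a, dtype=np.float64).ravel()
    b = np.asarray(patch_b, dtype=np.float64).ravel()
    a = a - a.mean()          # subtract the neighborhood means
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum()) * np.sqrt((b ** 2).sum())
    if denom == 0.0:
        return 0.0            # flat patch: correlation undefined
    return float((a * b).sum() / denom)
```

Every candidate requires two means, two square roots, and a full sum over both neighborhoods, which is exactly the per-pixel cost the text argues against.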
Different from the random nature of the speckle pattern, the triangular wave is periodic, so if the encoding signal is a triangular wave, an extra task is to select

the appropriate period numbers to guarantee unique information for each point. The basic principle of number-theoretical phase unwrapping [29–31] can act as a guideline for designing the composite pattern embedded with the triangular wave. In this composite signal, the main high-frequency phase in Eq. (7) and the phase of the triangular wave should satisfy the relations

φ = φ′ + 2kπ,
φ_e = φ_e′ + 2k_e·π,   (10)

where k and k_e are the period orders of the main signal and the triangular wave respectively. Letting n and n_e stand for the total numbers of fringes of the corresponding signals, Eq. (11) is obtained:

n_e·φ = n·φ_e.   (11)
Combining Eqs. (10) and (11) yields

(n·φ_e′ − n_e·φ′)/2π = k·n_e − k_e·n.   (12)

The key goal of the composite pattern design is to ensure the uniqueness of (n·φ_e′ − n_e·φ′) for each pixel in the phase map. Generally, the numbers of fringes are integers, so the right-hand side of Eq. (12) is an integer, and therefore the left-hand side must be the same integer. The values of the left-hand side can be used as entries to determine these two integers uniquely if n and n_e are coprime [26, 31–33]. For simplicity of explanation, we consider an example with n = 5 and n_e = 3. Figures 2(a) and 2(c) show the changes of the two wrapped phase maps along the absolute phase axis, where the first wrapped phase map has n = 5 and the second has n_e = 3. The values of (5φ_e′ − 3φ′)/2π are plotted in Fig. 2(e), from which it is easily observed that there is a one-to-one relationship between (k, k_e) and the value of (5φ_e′ − 3φ′)/2π. Theoretically, an absolute phase map could be derived from these composite patterns without any other assistance, but from [31] we know that if the maximal phase error is larger than π/(n + n_e), mistakes will occur in determining the fringe orders. So it must be guaranteed that the relationship n + n_e < π/∆φ_max holds. In this composite pattern scheme, the effect of ambient light is unavoidable, which undoubtedly increases ∆φ_max and requires n + n_e to be small enough for robust phase unwrapping. Paradoxically, to guarantee high precision, n should be as large as possible. In this case, the relationship n + n_e < π/∆φ_max no longer holds, and neither does the one-to-one relationship between (k, k_e) and the value of (n·φ_e′ − n_e·φ′)/2π. As shown in Figs. 2(b), 2(d), and 2(f), the larger n + n_e is, the more errors appear in (n·φ_e′ − n_e·φ′)/2π once ∆φ_max breaks through the boundary. This contradiction is alleviated from two perspectives in traditional methods: the first is reducing ∆φ_max by projecting the triangular wave separately [31], and the other is limiting the number of fringes to decrease n + n_e [19]. However, these methods are not satisfactory because they sacrifice either the anti-motion ability or the high-precision property of the measurement. Note that small changes in (n·φ_e′ − n_e·φ′)/2π (e.g., ±1) typically result in large changes in the fringe order k [26]; that is, the ambiguities are very likely to emerge in non-adjacent periods. Given this point, the geometry constraint is an effective tool for eliminating ambiguities of the period order when n + n_e < π/∆φ_max does not hold, since this constraint can easily reject those period orders far from the correct one. In this paper, a multi-view system is introduced to address the paradox without the defects of the traditional methods. Several further reasons explain the feasibility of this scheme: (1) the composite patterns designed with number theory theoretically ensure the uniqueness of (φ′, φ_e′); (2) assuming the wavelength of the embedded signal is 100 pixels, the rate of phase change is only about 0.06 rad/pixel, so compared to the effect of noise, the influence of deviation or distortion in the multi-view system is marginal; (3) only simple computation is involved in this scheme, which guarantees the speed of phase unwrapping.
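The number-theoretical argument around Eq. (12) can be checked numerically. The sketch below (helper names ours) builds the map from v = k·n_e − k_e·n to (k, k_e) for the coprime pair n = 5, n_e = 3 and decodes the period order from the two wrapped phases; with a non-coprime pair such as (6, 3), the construction fails because two pairs collide on the same v:

```python
import numpy as np

def order_lookup(n, ne):
    """Build the map v -> (k, ke) with v = k*ne - ke*n (Eq. (12)).

    A pixel at normalized position x in [0, 1) has period orders
    k = floor(n*x) and ke = floor(ne*x); when n and ne are coprime, every
    such pair yields a distinct v, so the map is invertible. A collision
    (possible only for non-coprime n, ne) raises ValueError.
    """
    lut = {}
    for x in np.linspace(0.0, 1.0, 10000, endpoint=False):
        k, ke = int(n * x), int(ne * x)
        v = k * ne - ke * n
        if v in lut and lut[v] != (k, ke):
            raise ValueError(f"ambiguous pair for v={v}")
        lut[v] = (k, ke)
    return lut

def decode_order(phi, phie, n, ne, lut):
    """Recover k from the two wrapped phases via Eq. (12):
    (n*phie - ne*phi) / (2*pi) = k*ne - ke*n."""
    v = round((n * phie - ne * phi) / (2.0 * np.pi))
    return lut[v][0]
```

The rounding step in `decode_order` is exactly where a phase error larger than π/(n + n_e) flips v to a wrong integer, reproducing the failure mode discussed above.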

[Figure panels (a)–(f): the wrapped phase maps and the values of (5φ_e′ − 3φ′)/2π, (9φ_e′ − 7φ′)/2π, and (21φ_e′ − 19φ′)/2π along the absolute phase axis.]

Fig. 2. Illustration of the influence of noise in composite patterns embedded with the triangular wave.

2.3. Excluding false period orders using geometry constraint and the difference of the
wrapped phase
As mentioned above, only the wrapped phase φ′ can be calculated, and Eq. (5) shows that N possibilities exist for each pixel. This means that the recorded phase can originate from exactly N different positions in the projector image (see Eq. (2)), and thus from N possible 3-D positions. For each possible period order k, the 3-D position is calculated using the calibration parameters between the first camera and the projector. From Fig. 3, it can be seen that only the area inside the red

Zmax

...
Candidates
Usable area
Visible area for
two cameras

Zmin
...
P

Baseline

Camera 1 Camera 2

Fig. 3. Illustration of the geometry constraint.

dotted lines can be captured by both cameras, which means that a possible 3-D position beyond this area is not reasonable and the corresponding period order k is not true. From [23], we know that a smaller disparity range gives a shorter search line, and in that work, to keep the choice of the correct correspondence robust, the measurement volume had to be set small when using dense fringes. In this paper, a much freer disparity range is allowed, as long as the fringe modulation of the captured images lies within a usable range. With this depth constraint, some of the candidates are rejected in this step without any other computation. Note that both the projector and the cameras are calibrated internally as well as externally, so each resulting 3-D point between the red solid lines in Fig. 3 is projected into the second camera, as shown in Fig. 4(b). The range of candidate periods is thus limited to a small interval before the next optimization. For an efficient implementation on the GPU, several LUTs are employed to calculate the 3-D positions and the corresponding projection points in the second camera.
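The candidate-generation step described above can be sketched as follows; `triangulate` and `project_cam2` stand in for the calibrated triangulation and reprojection (implemented as LUTs in the paper) and are hypothetical placeholders here:

```python
import numpy as np

def candidate_points(phi_wrapped, N, w, z_min, z_max, triangulate, project_cam2):
    """Generate and filter period-order candidates (Section 2.3).

    For each possible period order k, the absolute phase (Eq. (5)) gives a
    projector column (inverting Eq. (2)), hence one 3-D position; candidates
    outside the depth range [z_min, z_max] are rejected, and the survivors
    are reprojected into the second camera.
    """
    candidates = []
    for k in range(N):
        phi_abs = phi_wrapped + 2.0 * np.pi * k       # Eq. (5)
        x_p = phi_abs * w / (2.0 * np.pi * N)         # invert Eq. (2)
        P = triangulate(x_p)                          # 3-D point for this k
        if z_min <= P[2] <= z_max:                    # geometry (depth) constraint
            candidates.append((k, project_cam2(P)))   # keep k and its pixel in cam 2
    return candidates
```

In the real system the triangulation depends on the full calibration; with any toy geometry, the filtering behaviour is easy to verify.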
It is well known that the wrapped phase φ′ in Eq. (7) of p ought to be highly similar to that of the correctly matched point in the second camera. Theoretically, the correspondence with the minimal wrapped-phase difference is the correct one, but noise as well as system error makes this hypothesis unreliable, so all candidates with a similar wrapped phase should be retained for prudence. In other words, a candidate is ruled out only if the wrapped-phase difference between p and its corresponding point exceeds a threshold. The value of the threshold depends on the maximum offset caused by system error in the wrapped phase of the sinusoidal signal, and in this paper it is 0.6 rad. Besides, considering


Fig. 4. Information of an arbitrary point and its corresponding points in the other camera. (a) A point in the first camera; (b) the corresponding projection points in the second camera; (c) wrapped phase of the sinusoidal signal and intensity of the triangular wave of a point in the first camera; (d)-(i) wrapped phase of the sinusoidal signal and intensity of the triangular wave of the projection points in periods 10, 11, 12, 13, 14, and 15.

the phase jumps in discontinuous regions of the wrapped phase, another threshold of 5.7 rad is set to avoid mistakes in that case. Finally, the candidates satisfying

d(p_corr(k)) = abs[ φ′ − φ′(p_corr(k)) ],   d(p_corr(k)) < th₁ or d(p_corr(k)) > th₂,   (13)

will be retained, where p_corr(k) stands for the corresponding point of p when the period order is k, and φ′(p_corr(k)) for the wrapped phase of p_corr(k); th₁ and th₂ represent the thresholds, with values 0.6 rad and 5.7 rad respectively. As can be seen in Figs. 4(c)–4(i), the candidates calculated with the period orders 11, 12, and 13 are retained according to Eq. (13). The further algorithmic step is therefore aimed at confirming the final correspondence among the remaining candidates.

2.4. Confirming final period order with the triangular wave


Because of system error, the projection point of p in the second camera has a minor deviation from the ideal one. This deviation had only a slight effect in the last step, since the suitable thresholds leave some margin for system error. However, to determine the unique correspondence with the minimum difference in the triangular-wave map, it is necessary to correct the positions of the remaining candidates, and this is done in the wrapped phase of the sinusoidal wave. Referring to Eq. (13), it is not difficult to search for the wrapped phase closest to φ′ within a 5 × 5 neighborhood of each candidate. Once the closest wrapped phase is found, the point corresponding to this phase replaces the original candidate in the following steps.
Supposing p_corr(k)′ is the corrected point in the second camera and I_e′(p_corr(k)′) the corresponding intensity of the triangular wave, then, by the same principle as before, Eq. (13) can be rewritten as

f(p_corr(k)′) = abs[ I_e′ − I_e′(p_corr(k)′) ].   (14)

As a result of the previous optimization, only a few candidates remain in this step, which increases the
possibility of selecting the correct k value even when the relationship $n + n_e < \pi/\Delta\varphi_{max}$ in Section 2.2
does not hold. Certainly, in order to get a more stable result, using the mean value in a 5 ×
10-sized neighborhood s to replace the center point may suppress the noise to some extent. Then
Eq. (14) is updated as

$$f\left(p_{corr}(k)'\right) = \left|\sum_{s} I_e' - \sum_{s} I_e'\left(p_{corr}(k)'\right)\right|. \tag{15}$$

However, considering the symmetry of the triangular wave, when p and $p_{corr}(k)'$ are located
at symmetric positions of the triangular wave, Eq. (15) is unable to distinguish these two
points. The first method that occurred to us was to compute a gradient map indicating a direction for
each point, so as to remove the ambiguity of symmetric positions. Unfortunately, noise makes the
gradient map unreliable. In this method, the neighborhood in Eq. (15) is instead split into left and right
components, and the formula is refined as

$$f\left(p_{corr}(k)'\right) = \left|\sum_{left} I_e' - \sum_{left} I_e'\left(p_{corr}(k)'\right)\right| + \left|\sum_{right} I_e' - \sum_{right} I_e'\left(p_{corr}(k)'\right)\right|. \tag{16}$$
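Eq. (16) can be sketched as follows; the neighborhoods are simplified to one-dimensional row segments of triangular-wave intensity, which is enough to show why the left/right split resolves the mirror ambiguity of the symmetric triangular wave:

```python
import numpy as np

def lr_cost(patch_p, patch_c):
    """Eq. (16): compare the left-half and right-half intensity sums of the
    triangular-wave neighborhoods separately; mirror-symmetric positions give
    equal total sums (Eq. (15)) but very different half-sums."""
    patch_p = np.asarray(patch_p, dtype=float)
    patch_c = np.asarray(patch_c, dtype=float)
    m = patch_p.size // 2
    return (abs(patch_p[:m].sum() - patch_c[:m].sum())
            + abs(patch_p[m:].sum() - patch_c[m:].sum()))

# A rising flank of the triangular wave and its mirror image: the plain sum
# of Eq. (15) is identical for the two, whereas lr_cost separates them.
rising = [10, 20, 30, 40, 50]
falling = rising[::-1]
```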

As shown in Fig. 5, it is nearly impossible for the traditional method to distinguish the points on the three
vertical lines from each other because of the ambiguity resulting from the similar wrapped phases
at these positions. On the contrary, confirming the correct period order is completely feasible
through the several steps of the proposed method, as shown in Figs. 4(c)-4(i).

Fig. 5. Wrapped phase of the sinusoidal signal and the intensity of the triangular wave in line 330
of the first camera.
All of the previous discussion is based on the general assumption that the point p can be captured
by both cameras. However, in a multi-view system, not all points can be captured by the
two cameras owing to the existence of shaded areas, which means some points have no correct
correspondence, so period-order errors emerge in areas visible to only a single camera.
Since the final period order of such a point in the first camera and that of its corresponding
point in the second camera always have different values, a left-right consistency check is qualified
to eliminate the period orders in this case [24].
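The left-right consistency check of [24] can be sketched as below (a one-dimensional toy version with an assumed correspondence map and sentinel value; in practice the maps are two-dimensional):

```python
import numpy as np

def lr_check(k1, k2, corr, invalid=-1):
    """Keep the period order of a pixel in the first camera only when it
    equals the order of its corresponding pixel corr[i] in the second camera;
    occluded points, seen by one camera only, fail and are marked invalid."""
    k1, k2, corr = np.asarray(k1), np.asarray(k2), np.asarray(corr)
    return np.where(k1 == k2[corr], k1, invalid)
```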

3. Experiments
From the analysis of the correspondence method, the procedures required to determine the period
order uniquely and accurately are summarized. The method mainly includes three steps:
Step 1: Preparation work before the measurement. It includes obtaining the calibration param-
eters of this system, setting the threshold of the geometry constraint, and generating composite
fringe patterns.
Step 2: Initial elimination. The geometry constraint is used to explore the corresponding points
and eliminate unreasonable candidates, after which two or three candidates are retained
according to their degree of similarity in the wrapped phase of the sinusoidal signal.
Step 3: Correspondence determination and error elimination. In this step, the remaining corresponding
points are corrected within a 5 × 5-sized neighborhood, then the final corresponding point is
distinguished from the other candidates through the intensity of the triangular wave, and a left-right
consistency check is utilized to reduce the errors caused by the shaded areas [24].
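Step 1's pattern generation can be illustrated as follows. This is only a sketch under assumed parameters: the intensity offset, the reduced sinusoid amplitude, the triangular-wave amplitude, and the choice of five triangular periods against sixteen fringe periods are ours; the paper only requires the periods of the two signals to be coprime.

```python
import numpy as np

def composite_patterns(width=912, periods=16, n_shift=3,
                       a=128.0, b=80.0, c=40.0, tri_periods=5):
    """Generate three 8-bit composite patterns: phase-shifted sinusoids of
    reduced amplitude b with a triangular wave of amplitude c added on top."""
    x = np.arange(width)
    t = tri_periods * x / width
    tri = 2.0 * np.abs(t - np.floor(t + 0.5))  # triangular wave in [0, 1]
    pats = []
    for n in range(n_shift):
        sine = np.cos(2 * np.pi * periods * x / width + 2 * np.pi * n / n_shift)
        pats.append(np.clip(a + b * sine + c * tri, 0, 255).astype(np.uint8))
    return pats
```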
Step 2 and Step 3 are detailed in Fig. 6.
An experimental system is developed to validate the actual performance of the proposed
strategy. This system includes a DLP projector (Light Crafter 4500) and two digital CMOS

Fig. 6. The process of confirming the period order for point p before the left-right consistency check.

cameras (AVT GigE Mako G-030B), and the two cameras are synchronized with the DLP
projector. The camera resolution is 644 × 484 with a maximum frame rate of 309 fps, and the
projector has a resolution of 912 × 1140.
Firstly, a ceramic plate was measured to present the performance of Step 2 and Step 3, and to
validate the precision along the way. Then the accuracy was analyzed with several static objects,
including isolated objects and complex surfaces. Finally, deforming objects were measured to
evaluate our method in real-time 3-D measurements. Note that the measurement
range of a multi-view system is not infinite, so in these experiments a depth threshold of
-200 mm to 200 mm is set to guarantee that the fringe modulation of the captured images stays within a
usable range.
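This depth threshold amounts to a simple candidate filter (illustrative only; the paper applies it per corresponding point rather than to a point list):

```python
import numpy as np

def depth_filter(points_xyz, zmin=-200.0, zmax=200.0):
    """Discard reconstructed candidates whose Z coordinate (in mm) lies
    outside the calibrated measurement volume of the multi-view system."""
    pts = np.asarray(points_xyz, dtype=float)
    return pts[(pts[:, 2] >= zmin) & (pts[:, 2] <= zmax)]
```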

3.1. Precision analysis


A ceramic plate was measured, and the performance of the related steps is shown in Fig. 7.
Although the geometry constraint removes quite a lot of erroneous correspondences, as shown in
Figs. 7(a) and 7(c), there is still a great possibility that a false value is selected according to the
wrapped phase in Step 2. In Step 3, another optimization based on the encoded signal is employed
to determine the final period order, and Figs. 7(b) and 7(d) present the final result. The blue
region in Fig. 7(b) is the area that cannot be captured by both cameras, so the period orders in
this region are unreliable and can be removed by the left-right consistency check. The same ceramic
plate is also used to validate the precision of the proposed method; Fig. 8 shows the measurement
result. The MSE between the acquired points and the approximating plane, which measured
approximately 0.0683 mm, is largely due to the random noise of the system.
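The paper does not spell out how the 0.0683 mm figure is computed; one plausible reading is a least-squares plane fit to the reconstructed plate followed by a root-mean-square residual, which could be sketched as:

```python
import numpy as np

def plane_fit_rms(points_xyz):
    """Fit z = a*x + b*y + c to the point cloud by least squares and return
    the RMS of the residuals, a common precision metric for a flat target."""
    pts = np.asarray(points_xyz, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coef, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    resid = pts[:, 2] - A @ coef
    return float(np.sqrt(np.mean(resid ** 2)))
```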

3.2. Static scenes measurement


Further experiments were carried out to verify the accuracy of our system on objects of arbitrary
shape. A number of different objects were measured, including a desk fan, a tooth model,
a Doraemon model, and a statue of David. The 3-D results obtained by our method are

Fig. 7. Performance of Step 2 and Step 3. (a) The period order after Step 2; (b) The period
order after Step 3; (c) Line 333 of (a); (d) Line 333 of (b).

Fig. 8. Measurement result of the ceramic plate. (a) 3-D reconstruction; (b) Height along
column 300.

shown in Fig. 9.

Fig. 9. Measurement results of several objects. (a) Desk fan; (b) Statue of David; (c) Tooth
model; (d) Doraemon model.

The fan blades are separated from each other, so this object is suitable for testing whether
discontinuous surfaces can be reconstructed correctly, and Fig. 9(a) shows the
3-D reconstruction. In order to prove that our method can also be applied to complex shapes,
the statue of David was adopted, and the result is shown in Fig. 9(b). The tooth model and the
Doraemon model, shown in Figs. 9(c) and 9(d), were employed to enhance the credibility of the
accuracy analysis. Quantitative accuracy results for these objects are shown in Table 1, in which
the number of points is the total number of points before the left-right consistency check. To ensure

Table 1. Accuracy results of the proposed method


Object Correctness(%) Error(%) Missing(%) Number of points
Doraemon 96.66 0.035 3.3 79253
Tablet 98.76 0.038 1.2 135051
David 93.13 0.066 6.8 74917
Fan 97.10 0.220 2.7 44289

the objectivity of the accuracy analysis, these objects were also measured by traditional multi-frequency
PSP using the same high-frequency fringes as in our method. The correctness and
error ratios are then easily obtained by comparing the two results after all optimizations, including the
left-right consistency check. The missing ratio is the proportion of points that fail the
left-right consistency check; most of them are captured by a single camera, so there is not
sufficient information to make an accurate judgement. From Table 1, it can be

seen that the error ratio of our method is lower than 0.3%. When the measured surface is flat,
such as the tablet, the missing ratio is also at a low level. As the surface becomes more complex, there
are more shaded areas, and the missing ratio becomes larger, as expected. Although shaded
areas also cause trouble in traditional methods, they are expanded when another camera is
introduced, as in our technique. Fortunately, the mistakes in these areas can be removed by the left-right
consistency check in the multi-view system. These results indicate that the correspondence method
can measure surfaces of arbitrary shape, such as discontinuous surfaces and complex surfaces, with a
high correctness ratio.
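The three ratios of Table 1 can be computed as below (an illustrative sketch: we assume the multi-frequency PSP result serves as the reference period-order map and that pixels rejected by the left-right consistency check carry a sentinel value):

```python
import numpy as np

def accuracy_ratios(k_test, k_ref, missing_mark=-1):
    """Return (correctness, error, missing) ratios over all evaluated points:
    'missing' points were discarded by the left-right consistency check; the
    rest count as correct or erroneous against the reference period orders."""
    k_test, k_ref = np.asarray(k_test), np.asarray(k_ref)
    n = k_test.size
    missing = k_test == missing_mark
    correct = (~missing) & (k_test == k_ref)
    error = (~missing) & (k_test != k_ref)
    return correct.sum() / n, error.sum() / n, missing.sum() / n
```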

3.3. Real-time measurement


After the precision and accuracy tests, two further experiments were carried out to verify the real-time
performance of our system. The three phase-shifted composite fringes are fused into a 24-bit
pattern stored in the Light Crafter 4500, where each of the R, G, and B digital channels represents an 8-bit
composite fringe and is projected within 8500 µs. Meanwhile, the two cameras,
synchronized by the trigger signals from the Light Crafter 4500, capture each projected pattern with
an exposure of 9000 µs. The captured images are then processed according to the aforementioned
steps, which are implemented on the GPU. The computer used in this system is a custom-built machine with
an Intel i7-4790K CPU and an NVIDIA GeForce GTX 970 graphics card; the graphics card has
1664 CUDA cores and 4 GB of memory with a maximum bandwidth of 224 GB/s.
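The fusion of three composite fringes into one 24-bit pattern amounts to packing them into the R, G, and B planes of a single image (a sketch of the packing step only; uploading the result to the Light Crafter 4500 and configuring its pattern sequence is done through the projector's own software):

```python
import numpy as np

def pack_rgb(fringe_r, fringe_g, fringe_b):
    """Fuse three 8-bit composite fringe patterns into one 24-bit RGB image;
    the DLP then plays the three colour planes as three consecutive 8-bit
    grayscale patterns."""
    return np.stack([fringe_r, fringe_g, fringe_b], axis=-1).astype(np.uint8)
```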
We first measured a moving palm to demonstrate the 400 mm measurement depth of our system and
its low sensitivity to motion. In order to better display the 3-D reconstruction, the point cloud is
converted into a surface representation with lighting and rendering. Fig. 10(a) shows several
results for the palm, and more details are presented in Visualization 1 and Visualization 2.
Then facial expressions were measured to verify the performance of our system on
a complex surface under dynamic conditions; the related 3-D reconstruction results are shown in Fig.
10(b) as well as in Visualization 3 and Visualization 4. These two experiments testify to the ability of
our system to realize fast 3-D acquisition of dynamic scenes.

Fig. 10. Dynamic measurement results (associated with Visualization 1, Visualization 2, Visualization 3,
and Visualization 4). (a) Palm measurement; (b) Facial expressions.

4. Conclusion
In this work, a novel high-speed 3-D measurement technique for recovering the absolute phase and
measuring spatially isolated objects with full resolution under dynamic conditions has been
presented. This paper combines composite phase-shifting fringes with a multi-view system to strengthen
the robustness of phase unwrapping. Compared to conventional temporal phase unwrapping
methods, only three composite fringe patterns are sufficient to obtain the absolute phase of a
complex surface, which makes our method less sensitive to dynamic scenes. On the other hand,
differing from traditional composite-pattern schemes, the multi-view system is introduced
to address the contradiction between precision and accuracy in high-speed 3-D measurement.
Meanwhile, no strict depth constraint or computationally expensive algorithm is required with the
assistance of the composite patterns. Experimental results have demonstrated the success of our
proposed method in its ability to produce fast, accurate, and precise depth information for
complex scenes.
Finally, it should also be mentioned that several defects exist in this method. One example is
that the amplitude of the high-frequency fringe needs to be reduced to embed the extra signal,
so the SNR of the phase-shifting measurement is lowered. In the process of embedding the triangular wave into
the original fringe, only the coprime requirement between the two signals is analyzed; the optimal
scheme is not presented in this paper. Besides, the discussion of embedded signals is limited
to speckle and the triangular wave, which is not comprehensive; this is another defect. In the future,
we will explore other schemes to design a better composite pattern for the multi-view system.

Funding
National Natural Science Fund of China (NSFC) (11574152, 61505081, 61377003); ‘Six Talent
Peaks’ project (2015-DZXX-009); ‘333 Engineering’ research project (BRA2015294); Funda-
mental Research Funds for the Central Universities (30915011318); Open Research Fund of
Jiangsu Key Laboratory of Spectral Imaging & Intelligent Sense (3092014012200417).

Acknowledgments
C. Zuo thanks the support of the ‘Zijin Star’ program of Nanjing University of Science and
Technology.
