## Abstract

The measuring technique combining a phase-shifting algorithm and Gray-code light has been widely used in three-dimensional (3D) shape measurement for static scenes because of its high robustness and anti-noise ability. However, in high-speed measurement, phase unwrapping errors occur easily on the boundaries of adjacent Gray-code words because of the defocus of the projector, the motion of the objects and the non-uniform reflectivity of the surface. To overcome this challenge, a high-speed 3D shape measurement method based on shifting Gray-code light is proposed in this paper. Firstly, the average intensity of the three captured phase-shifting fringe images is used as a pixel-wise threshold to binarize the Gray codes and to eliminate most phase unwrapping errors caused by the non-uniform reflectivity, ambient light variations, and the defocus of the projector. Then, the shifting Gray-code (SGC) coding strategy is proposed to avoid the remaining phase unwrapping errors on the edges of the code words. In this strategy, no additional patterns are projected, and two sets of decoding words with staggered boundaries are built in the temporal sequences for one wrapped phase. Finally, simple, efficient, and robust phase unwrapping can be achieved in high-speed dynamic measurement. The proposed method has been applied to reconstruct the 3D shape of randomly collapsing objects in a large depth range, and the experimental results demonstrate that it can reliably obtain high-quality shape and texture information at 310 frames per second.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

## 1. Introduction

Optical three-dimensional (3D) profilometry has been widely used in machine vision, biomedical engineering, intelligent manufacturing and so on [1–3]. Among all measuring methods, fringe projection profilometry is one of the research hotspots because of its high accuracy and flexibility. Recently, with rapid advancements in hardware such as the high-speed camera and the digital light processing (DLP) projector [4–7], high-speed and high-accuracy 3D shape measurement has been increasingly sought by scholars.

Among established methods, Fourier transform profilometry (FTP) [8–10] is well suited for high-speed 3D measurement because it uses only one fringe pattern to reconstruct one 3D result. An optical projection system based on a physical grating can be designed for high-speed measurement; theoretically, the reconstruction rate depends only on the maximum recording speed. To our knowledge, Zhang and Su first applied FTP to high-speed measurement [11]. They then obtained the 3D shape and deformation of rotating blades using stroboscopic structured illumination [12] and achieved 3D reconstruction of drumhead vibration at a rate of 1000 Hz [13]. Based on FTP, Zuo et al. demonstrated micro Fourier transform profilometry [14], which can realize an acquisition rate of up to 10,000 3D frames per second (fps). However, due to the limitation of the band-pass filtering, FTP has difficulty measuring objects with sharp edges, abrupt changes or non-uniform reflectivity. Compared with FTP, phase-shifting profilometry (PSP) is used more widely in optical metrology because of its high precision, high robustness and easy implementation under computer control. In addition, with the development of DLP technology, DLP platforms (e.g., DLP Discovery, DLP Light Commander and DLP LightCrafter) can project binary images at a much faster rate (e.g., 20 kHz for DLP Discovery 4100). Therefore, combined with the binary defocusing technique [15], high-speed 3D shape measurement based on PSP has been a hot research topic over the past few years.

In the PSP method, the use of the arctangent function produces a wrapped phase with 2π discontinuities. So, temporal phase unwrapping (TPU) algorithms [16] are applied to eliminate the phase ambiguity and to measure multiple isolated objects or objects with abrupt changes. Typical TPU approaches used in dynamic 3D shape measurement can be classified into two categories: multi-frequency (or multi-wavelength) approaches [17–20] and Gray-code approaches [21,22]. Multi-frequency approaches resolve the wrapped phase by projecting additional shifting fringe patterns with other frequencies. According to the principle used to eliminate the phase ambiguity, multi-frequency approaches can be further classified into the hierarchical approach [17,18], the heterodyne approach [19,20] and the number-theoretical approach [23,24]. In high-speed 3D shape measurement, Wang and Zhang proposed a multi-frequency (heterodyne) phase-shifting technique with optimal pulse width modulation [25] to develop a 556 Hz measuring system. However, it required 9 patterns to reconstruct one result, which is relatively inefficient compared with two-frequency methods for high-speed measurement. Therefore, they applied a two-frequency (hierarchical) technique [26] to measure the shape of live rabbit hearts. Then, Zuo et al. proposed a high-speed measuring method using bi-frequency tripolar pulse-width modulation and a number-theoretical TPU approach to achieve 1250 Hz 3D measurement [27]. All the traditional two-frequency TPU approaches are prone to phase-unwrapping errors when the high frequency is chosen far higher than the low frequency [16]. To increase the period number of the high-frequency fringe pattern, Yin et al. applied the depth constraint in the number-theoretical approach and improved the measuring accuracy at the cost of a limited measuring depth [28].

Gray-code approaches eliminate the phase ambiguity by projecting a set of binary Gray-code patterns, and *N* patterns can uniquely label 2^*N* stripe periods. Gray-code-based TPU approaches have been widely used in 3D shape measurement for static scenes owing to their high robustness and anti-noise ability. However, it is still challenging for this method to realize high-speed shape measurement. Because of the object motion and the defocus of the projector, phase unwrapping errors occur easily on the boundaries of adjacent Gray-code words. Wang et al. combined conventional spatial phase unwrapping with the Gray-code method to solve the motion-induced phase unwrapping errors [29]. This framework works well, but spatial phase unwrapping is time-consuming and not suitable for parallel computation. Laughner et al. projected an additional white image (all “1”) and a black image (all “0”) to binarize the Gray codes and restored accurate rabbit cardiac deformation at 200 fps [30], but 10 images are used to recover one 3D frame. Zheng et al. projected an additional binary pattern to construct an image with uniform “0.5” grayscale by utilizing the defocus of the projector [31]; this image is then used to binarize the Gray codes. However, the binarization reduced but did not eliminate the phase unwrapping errors, and further correction such as median filtering was needed to remove the remaining errors [32]. Wu et al. proposed a cyclic complementary Gray-code coding strategy to realize simple and robust phase unwrapping in dynamic 3D shape measurement without projecting additional patterns [33], but its error correction leads to a limited measuring depth range. So, how to realize simple, efficient, robust and depth-unconstrained phase unwrapping based on Gray-code light in dynamic 3D shape measurement remains an open problem.

To this end, a high-speed 3D shape measurement technique based on shifting Gray-code light is proposed. Two essential problems remain in improving the robustness and reliability of the Gray-code-based TPU method in dynamic 3D shape measurement without projecting extra patterns. The first is how to set a reasonable, pixel-wise threshold to binarize the Gray codes. Because of the different reflectivity and ambient light variations, the thresholds at different pixels are non-unique, so a pixel-wise threshold should be obtained to reduce the errors caused by binarization. In this paper, three-step phase-shifting sinusoidal fringes are produced using the dithering technique [34], and the average intensity of the three captured fringe images is used as a pixel-wise threshold to binarize the Gray codes. Consequently, most of the phase unwrapping errors caused by the different reflectivity, ambient light variations and the defocus of the projector can be eliminated. The second problem is how to eliminate or avoid the remaining phase unwrapping errors, mainly caused by motion and noise, on the edges of the code words. In our work, the shifting Gray-code (SGC) coding strategy is proposed. All the traditional Gray codes are shifted by half a period in odd projected pattern sequences, so two sets of decoding words with staggered boundaries can be obtained for one wrapped phase. Different decoding codes are used depending on the range of the phase value, so the codes on the edges are never used and the remaining phase unwrapping errors on the edges can be avoided. Experiments are performed to verify the performance of the proposed method.

The rest of the paper is organized as follows: Section 2 illustrates the principle of the proposed SGC high-speed 3D measuring method. Section 3 presents experimental results that validate the proposed method. Section 4 discusses the strengths and contributions of the proposed method. Section 5 summarizes this paper.

## 2. Principle

#### 2.1. Coding strategy of the shifting Gray-code light

High-speed projectors based on DLP technology can display binary images at a much faster rate than 8-bit grayscale images [5]. So the binary defocusing technique [15] is suitable for high-speed 3D shape measurement and greatly improves the measuring speed. Among the existing methods, the dithering technique [34] works well for producing high-quality sinusoidal patterns; it is therefore applied to produce the sinusoidal fringes shown in Figs. 1(a)-1(c). The phase-shifting algorithm has been extensively adopted in optical metrology due to its high accuracy and insensitivity to surface reflectivity. At least three fringe patterns are required for 3D shape recovery [35], so the three-step phase-shifting algorithm is widely used to reduce measuring time and motion-induced errors in high-speed 3D shape measurement. The three shifting sinusoidal patterns can be described as:

$$I_1(x, y, n) = A(x, y, n) + B(x, y, n)\cos\left[\phi(x, y, n) - 2\pi/3\right], \tag{1}$$

$$I_2(x, y, n) = A(x, y, n) + B(x, y, n)\cos\left[\phi(x, y, n)\right], \tag{2}$$

$$I_3(x, y, n) = A(x, y, n) + B(x, y, n)\cos\left[\phi(x, y, n) + 2\pi/3\right], \tag{3}$$

in which *A*(*x*, *y*, *n*) is the intensity of the background light, and *B*(*x*, *y*, *n*)/*A*(*x*, *y*, *n*) is the fringe contrast at different sequences of projected patterns (3 phase-shifting dithered patterns and 4 Gray-code patterns in one sequence). As shown in Fig. 1(h), *ϕ*(*x*, *y*, *n*) is the wrapped phase of the modulated light field, which can be obtained using the three-step phase-shifting algorithm:

$$\phi(x, y, n) = \arctan\frac{\sqrt{3}\left[I_1(x, y, n) - I_3(x, y, n)\right]}{2I_2(x, y, n) - I_1(x, y, n) - I_3(x, y, n)}. \tag{4}$$
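As a quick numerical sanity check of the three-step relations above, the following sketch synthesizes three shifted fringes and recovers the wrapped phase. The parameters are illustrative, not the measurement system's; `arctan2` is used so the sign of the numerator and denominator resolves the full (−π, π] range:

```python
import numpy as np

# Synthetic check of the three-step phase-shifting retrieval.
phi_true = np.linspace(-np.pi + 0.01, np.pi - 0.01, 256)  # ground-truth wrapped phase
A, B = 0.5, 0.4                                           # background and modulation
I1 = A + B * np.cos(phi_true - 2 * np.pi / 3)
I2 = A + B * np.cos(phi_true)
I3 = A + B * np.cos(phi_true + 2 * np.pi / 3)
# Three-step phase-shifting algorithm: recovers phi_true exactly.
phi = np.arctan2(np.sqrt(3) * (I1 - I3), 2 * I2 - I1 - I3)
```

The identity behind it: I1 − I3 = √3·B·sin ϕ and 2I2 − I1 − I3 = 3B·cos ϕ, so the quotient is tan ϕ independent of A and B.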

The traditional Gray-code-based TPU method is widely used to eliminate the phase ambiguity owing to its high robustness and anti-noise ability in static scenes. However, in dynamic measurement the Gray-code patterns are blurred by the binary defocusing technique. To avoid the phase unwrapping errors caused by the resulting inaccurate binarization, the coding strategy of shifting Gray-code light is proposed. It is well known that measuring accuracy improves as the number of stripe periods increases; however, in the traditional Gray-code-based TPU method, *N* Gray-code patterns can label at most 2^*N* stripe periods. So *N* is chosen as 4 to balance measuring accuracy and speed. The four Gray-code patterns are shown in Figs. 1(d)-1(g). To build complementary decoding words in adjacent pattern sequences, all the traditional Gray codes are shifted by half a period in odd pattern sequences, as shown in Figs. 1(l)-1(o), so two sets of decoding orders with staggered boundaries can be obtained. Then, for the wrapped phase *ϕ*(*n* + 1), the two sets of decoding orders *k*(*n* + 1) and *k*(*n*) are used to eliminate the phase ambiguity according to the range of the phase value. The codes on the edges are thus never used, and the phase unwrapping errors on the edges can be avoided.
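The coding strategy above can be sketched programmatically. The following snippet is a hedged illustration (the function name and array layout are our own): it generates the four binary-reflected Gray-code bit planes labeling 16 stripes of 70 pixels each, plus the half-period-shifted set used in odd pattern sequences:

```python
import numpy as np

def gray_code_bitplanes(num_periods=16, period_px=70):
    """Return the Gray-code bit-plane patterns (one row each) that label
    `num_periods` fringe stripes, plus the half-period-shifted copy used
    in odd pattern sequences.  Assumes num_periods is a power of two."""
    width = num_periods * period_px
    stripe = np.arange(width) // period_px             # stripe index 0..15
    gray = stripe ^ (stripe >> 1)                      # binary-reflected Gray code
    n_bits = int(num_periods).bit_length() - 1         # 4 bits label 16 stripes
    planes = np.stack([(gray >> (n_bits - 1 - b)) & 1 for b in range(n_bits)])
    shifted = np.roll(planes, period_px // 2, axis=1)  # shift by half a period
    return planes, shifted

planes, shifted = gray_code_bitplanes()
```

With the paper's parameters (16 periods of 70 pixels) the patterns are 1120 pixels wide, matching the 912 × 1120 projector area used in the experiments; adjacent code words differ in exactly one bit, which is what makes Gray codes robust to defocus blur at stripe boundaries.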

#### 2.2. Phase unwrapping based on shifting Gray-code light

In the previous subsection, the design of the shifting Gray codes has been illustrated. The coding strategy of shifting Gray-code light does not increase the number of projected patterns for the same number of marked stripes, so it is suitable for dynamic measurement. In order to unwrap the wrapped phase, a robust decoding process is crucial; how to realize robust phase unwrapping is therefore the main concern of this subsection.

The SGC TPU algorithm is proposed in this subsection. The pixel-wise threshold and the staggered decoding numbers are used in the decoding process to ensure the robustness of the phase unwrapping. And the details of the algorithm are described as follows:

**Step 1. Binarize the Gray-code patterns using a pixel-wise threshold.** All the Gray-code patterns need to be binarized before the decoding numbers are calculated. Because of the different reflectivity of the tested object, the thresholds at different pixels are non-unique, so a pixel-wise threshold should be obtained to reduce the errors caused by binarization. However, no additional patterns should be projected, in order to guarantee the measuring speed. Owing to the high-speed projection and capture, the motion between the three continuous sinusoidal patterns is very small. So, the average intensity of the three shifting patterns can be calculated as a suitable pixel-wise threshold *T*(*x*, *y*, *n*) to binarize the Gray-code patterns using Eq. (5).
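Step 1 can be sketched as follows (a minimal illustration; the function name and array conventions are our own, and images are assumed to be float arrays). Averaging the three fringe images cancels the cosine terms of Eqs. (1)-(3), leaving the background intensity *A* as the per-pixel threshold:

```python
import numpy as np

def binarize_gray_codes(fringes, gray_imgs):
    """Binarize captured Gray-code images with a pixel-wise threshold.

    fringes:   the 3 captured phase-shifting fringe images (H x W arrays).
    gray_imgs: the 4 captured Gray-code images (H x W arrays).
    The average of the three fringes approximates the background A(x, y, n)
    (the three cosine terms sum to zero) and serves as the threshold T.
    """
    T = (fringes[0] + fringes[1] + fringes[2]) / 3.0
    return [(g > T).astype(np.uint8) for g in gray_imgs]
```

Because T is computed per pixel from the same exposure conditions as the Gray-code images, it adapts to non-uniform reflectivity and ambient light without any extra projected pattern.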

To verify Eq. (5), the non-uniform reflectivity of the surface is introduced in our simulation, as shown in Fig. 2. Two Gray codes whose code words are reversed and three dithered sinusoidal fringes are all affected by the defocus, the different reflectivity and random noise. The average intensity of the two Gray-code patterns is treated as the reliable threshold [36]. The average intensity of the sinusoidal patterns tracks this threshold closely, so it can also be treated as a reliable pixel-wise threshold in our measurement.

As shown in Fig. 3, the same conclusion can be drawn from the experimental data. Compared with the method in [36], our proposed method does not project the additional (reversed Gray-code) pattern to calculate a pixel-wise threshold, so it is more suitable for dynamic measurement.

**Step 2. Calculate decoding numbers in every pattern sequence.** After the binarization of the Gray-code patterns, the binary code words *GC*_{1}(*n*), *GC*_{2}(*n*), *GC*_{3}(*n*) and *GC*_{4}(*n*) are converted to decimal numbers *V*(*n*) using Eq. (6). Then, the decoding number *i*(*V*(*n*)) can be obtained by looking up the known unique relationship between *V*(*n*) and *i*(*V*(*n*)). Finally, the phase order *k*(*n*) is set equal to the decoding number *i*(*V*(*n*)), as shown in Eq. (7). To explain the decoding process clearly, a decoding example for an even pattern sequence is shown in Fig. 4.
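Step 2 can be sketched as below. Instead of the paper's explicit lookup table between *V*(*n*) and *i*(*V*(*n*)), this illustration uses the standard Gray-to-binary XOR cascade, which is equivalent for binary-reflected Gray codes; the function name is our own:

```python
import numpy as np

def gray_to_order(gc1, gc2, gc3, gc4):
    """Convert four binarized Gray-code bit planes (0/1 scalars or arrays,
    MSB first) directly to the decoding number i(V(n)).  The XOR cascade
    below is the standard Gray-to-binary conversion and stands in for an
    explicit lookup table."""
    b1 = gc1            # MSB passes through unchanged
    b2 = b1 ^ gc2       # each further bit XORs the previous binary bit
    b3 = b2 ^ gc3
    b4 = b3 ^ gc4
    return b1 * 8 + b2 * 4 + b3 * 2 + b4 * 1
```

Applied element-wise to the four binarized images from Step 1, this yields the per-pixel phase order map *k*(*n*) in one vectorized pass.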

**Step 3. Correct the phase orders.** Due to the design of the shifting Gray codes, the phase order “0” appears twice in odd pattern sequences, as shown in Fig. 5. Equation (8) eliminates this phase-order ambiguity by judging the difference of the wrapped phase value.

**Step 4. Unwrap the wrapped phase using complementary decoding numbers.** As shown in Fig. 5, the two continuous decoding phase orders *k*(*n*) and *k*(*n*-1) after correction are staggered and complementary. According to the relationship between the phase orders and the wrapped phase distribution, different phase orders are used to unwrap different parts of the wrapped phase *ϕ*(*n*) using Eq. (9) and Eq. (10); the used parts of the phase orders are labeled in yellow in Fig. 5. In this way, the decoding orders on the edges are not used, so phase unwrapping errors on the edges can be avoided.
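Step 4 can be sketched as follows. Eqs. (9) and (10) are not reproduced here, so the region split below is an assumption in the style of complementary Gray-code methods: the orders `k_a` jump where the wrapped phase jumps (±π) and are trusted mid-period, while the half-period-shifted orders `k_b` jump mid-period and are trusted near the phase jumps:

```python
import numpy as np

def unwrap_with_complementary_orders(phi, k_a, k_b):
    """Unwrap phi (wrapped to (-pi, pi]) with two staggered order maps.

    k_a: orders whose code boundaries coincide with the phase jumps,
         so they are reliable where |phi| < pi/2.
    k_b: half-period-shifted orders (incrementing mid-period), reliable
         near the jumps; a -1 offset where phi >= pi/2 makes the two
         maps agree on the same absolute phase.
    """
    return np.where(np.abs(phi) < np.pi / 2, phi + 2 * np.pi * k_a,
           np.where(phi <= -np.pi / 2, phi + 2 * np.pi * k_b,
                    phi + 2 * np.pi * (k_b - 1)))
```

The key property is that neither order map is ever read within half a period of its own boundaries, which is exactly where binarization and motion errors concentrate.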

#### 2.3. System calibration

In order to obtain the 3D surface information of the object, the absolute phase needs to be converted to height using the phase-to-height algorithm [37]. Equation (11) can be used to reconstruct the height of a measured object:

$$\frac{1}{h(x, y, n)} = u(x, y) + \frac{v(x, y)}{\Delta\phi(x, y, n)} + \frac{w(x, y)}{\Delta\phi^{2}(x, y, n)}, \tag{11}$$

in which Δ*ϕ*(*x*, *y*, *n*) is the absolute phase value of the measured object relative to the reference plane. Four planes with known height distributions are measured; then the three unknown parameters *u*(*x*, *y*), *v*(*x*, *y*) and *w*(*x*, *y*) can be calculated and saved as system parameters for future phase-to-height mapping.
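The calibration step can be sketched as a per-pixel least-squares fit, assuming the common three-parameter inverse mapping 1/h = u + v/Δϕ + w/Δϕ² for Eq. (11) (our assumption about its form); the function name and array layout are illustrative:

```python
import numpy as np

def fit_phase_to_height(dphi_planes, heights):
    """Per-pixel least-squares fit of the mapping parameters u, v, w from
    several reference planes of known height, assuming the three-parameter
    mapping 1/h = u + v/dphi + w/dphi**2.

    dphi_planes: (P, H, W) absolute phase of each of the P known planes,
                 relative to the reference plane.
    heights:     (P,) known plane heights (nonzero).
    """
    P, H, W = dphi_planes.shape
    y = 1.0 / np.asarray(heights)                 # target: 1/h per plane
    d = dphi_planes.reshape(P, -1)                # (P, H*W)
    u = np.empty(H * W); v = np.empty(H * W); w = np.empty(H * W)
    for p in range(H * W):
        # Design matrix per pixel: columns [1, 1/dphi, 1/dphi^2].
        Amat = np.stack([np.ones(P), 1.0 / d[:, p], 1.0 / d[:, p] ** 2], axis=1)
        u[p], v[p], w[p] = np.linalg.lstsq(Amat, y, rcond=None)[0]
    return u.reshape(H, W), v.reshape(H, W), w.reshape(H, W)
```

With four planes and three unknowns the system is overdetermined, so the fit also averages out measurement noise; the pixel loop is kept for clarity and could be vectorized via batched normal equations.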

The camera calibration technique proposed by Zhang [38] is widely adopted and used due to its easy operation and high accuracy with an easily available plane calibration target. This calibration method is also implemented in our developed system.

#### 2.4. Framework of the proposed method

To explain the whole process of the proposed SGC method clearly, its framework is illustrated in Fig. 6(a). Firstly, two sequences of binary patterns (3 phase-shifting patterns and 4 Gray-code patterns per sequence) are designed using the proposed SGC coding strategy, with a sinusoidal period number of 16. All 14 binary images are loaded into the high-speed projector. The projector then projects the binary patterns cyclically at 2170 fps, and the high-speed camera captures images in sync with the projector. The three sinusoidal fringe images in one pattern sequence are used to calculate the wrapped phase using Eq. (4). Next, the SGC TPU algorithm is used to eliminate the phase ambiguity. In this algorithm, the groups of sinusoidal patterns and Gray-code patterns are switched at a high projecting rate, and the Gray-code patterns in every pattern sequence are used twice, in adjacent pattern sequences, to eliminate the phase ambiguity for two wrapped phases. For the wrapped phase in one pattern sequence, the 4 Gray-code patterns in the current sequence and the 4 patterns in the previous adjacent sequence are used for unwrapping. As shown in Fig. 6, two sets of decoding orders *k*(*n*) and *k*(*n* + 1), whose boundaries are staggered, are used to unwrap one wrapped phase *ϕ*(*n* + 1), so the codes on the edges, where phase unwrapping errors would occur, are not used. Finally, the absolute phase is converted into the 3D result through the system calibration. In our proposed method, a new 3D frame can be calculated for every 7 patterns captured, as shown in Fig. 6(b); thus a 3D frame is reconstructed in every pattern sequence, which is more efficient than the traditional method in dynamic measurement.
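The timing in the framework is easy to verify: each sequence holds 7 patterns (3 phase-shifting + 4 Gray-code) projected at 2170 fps, and one 3D frame is produced per sequence, which gives the reconstruction rate quoted in the paper:

```python
# Reconstruction-rate sanity check for the per-sequence pipeline:
# 7 patterns (3 phase-shifting + 4 Gray-code) per sequence at 2170 fps.
projector_rate_fps = 2170
patterns_per_sequence = 3 + 4
reconstruction_rate = projector_rate_fps / patterns_per_sequence
print(round(reconstruction_rate))  # 310
```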

## 3. Experiments and results

Experiments have been conducted to test the performance of our proposed method. A measuring system was developed, including a DLP projector (LightCrafter 4500) and a high-speed camera (Photron FASTCAM Mini UX100). The projector resolution is 912 × 1140 pixels, and the lens mounted on the camera has a focal length of 16 mm and an aperture of f/1.4. In all our experiments, the image refresh rate of the projector was set to 2170 Hz and the period number of the projected sinusoidal fringes is 16; 912 × 1120 pixels of the projector are used to generate the projected fringe patterns, whose period is 70 pixels. The camera resolution was set to 1280 × 1000 pixels, and the camera was synchronized by the trigger signal of the projector.

#### 3.1. Accuracy analysis and measurement on complex static scenes

To quantitatively evaluate the accuracy of the proposed method, a standard ceramic ball with a radius of 12.6994 mm and a standard ceramic flat were measured, as shown in Fig. 7(a); the measurement result is shown in Fig. 7(b). A plane fitted to the reconstructed result of the measured flat was used as the reference value, and the error distribution of the measured flat in the dashed box is shown in Fig. 7(c). The root-mean-square (RMS) error of the flat plane in the dashed box is 0.080 mm. Sphere fitting on the reconstructed 3D geometry of the standard ceramic ball is shown in Fig. 7(d). The measured radius of the ball is 12.682 mm and the RMS error of the ball is 0.102 mm. The error distribution of the ball is shown in Fig. 7(e).

The second experiment further demonstrates the performance of our proposed method in complex scenes with isolated objects. Two sets of objects were measured: a portrait sculpture with a cooling fan (scene I) and a portrait sculpture with a petaloid model (scene II). One of the deformed sinusoidal fringe images for each scene is shown in Figs. 8(a) and 8(b), and the corresponding reconstructed 3D results are shown in Figs. 8(c) and 8(d). The experimental results demonstrate that our proposed method can achieve robust 3D measurement of complex and isolated objects.

#### 3.2. Comparative experiments on a large measuring depth range

To verify the performance of the SGC method over a large measuring depth range, a cooling fan and a swinging simple pendulum were measured. Among existing Gray-code-based methods, the cyclic complementary Gray-code (CCGC) method [33] can realize simple and robust 3D measurement in dynamic scenes without projecting extra patterns, so the CCGC method was used to measure the same dynamic scene for comparison. For a fair comparison, both methods projected 3 phase-shifting dithered sinusoidal patterns with 16 fringe periods and 4 Gray-code patterns in one pattern sequence. The associated Visualization 1 shows the continuously captured images of the two methods at 15 fps. In the experiments, the cooling fan was static as a reference, the simple pendulum fell freely from the same height, and a complete swing was recorded and reconstructed by each method. The restored 3D results of the CCGC and SGC methods at three different moments are shown in Figs. 9(a) and 9(b); the pendulum was at the same height at the corresponding moments in both methods. At the first two moments, 3D results without phase unwrapping errors were obtained by both methods. However, phase unwrapping errors occurred in the result of the CCGC method at the last moment. This is because the CCGC method calculates the lowest-level Gray-code pattern by an exclusive OR operation between second-level Gray-code patterns in adjacent pattern sequences. The edges of adjacent second-level Gray-code patterns cannot match well due to the binarization and the motion of the object, so a few extra phase-order errors occur. To correct these errors, a limited measuring depth range must be calibrated, and the phase difference over the whole measurement volume must be within the product of the period number and 2π [33]. So, in this experiment, a depth range from 0 to 120 mm was calibrated for the CCGC method.
Once the pendulum moves out of the calibrated depth range, phase unwrapping errors appear in the reconstructed results, as shown at the last moment of Fig. 9(a). The exclusive OR operation is not needed in the SGC method: all the errors on the edges of the Gray-code patterns can be avoided simply by using the two complementary phase orders. So there is no limitation on the calibrated depth range in our SGC method, and no errors occurred in its results. The restored results of the complete swing are shown in Visualization 2. The reconstruction rate is 310 fps and the replay rate is 15 fps. The comparative experimental results show that our proposed SGC method has a larger measuring depth range than the CCGC method.

#### 3.3. Measurement on the complex dynamic scene

In the last experiment, the collapsing process of a set of building blocks was measured to demonstrate the performance of the proposed method in a complex, dynamic scene. Five layers of wooden building blocks (each block ~30 × 30 × 30 mm^{3}) were struck by a swinging ball and then collapsed. It was a challenging measuring scene due to the uncertainty of the blocks’ motion, the different reflectivity of the object surfaces and the sharp edges of multiple objects. In addition, the depth range of this scene reaches 200 mm after collapsing, which is too large for the CCGC method to measure. Because of the robustness and large measuring depth range of the proposed SGC method, the whole collapsing process can be reconstructed reliably. The texture map (the average intensity of the phase-shifting fringe images) and the corresponding high-quality results at different moments are shown in Fig. 10 and Visualization 3. The reconstruction rate is 310 fps and the replay rate is 30 fps. The experimental results demonstrate the robustness of our proposed method in a complex, dynamic scene with a large depth range.

To compare the performance of the SGC method and the traditional Gray-code-based method in the complex dynamic scene, the 3D results at one moment (T = 468 ms) were reconstructed using the traditional Gray-code-based method with a unique threshold, the traditional Gray-code-based method with the proposed pixel-wise threshold, and the SGC method. The three results are shown in Fig. 11. The data indicate that using a pixel-wise threshold reduces the phase unwrapping error rate from 6.73% to 0.27% compared with a unique threshold, and the remaining errors can be avoided in the SGC method. These comparative assessments clearly show that the proposed method realizes a more robust measurement than the traditional method with the same number of projected patterns.

## 4. Discussion

Our proposed method has the following advantages compared with other high-speed 3D shape measurement techniques.

- **Assurance of the optimum defocusing degree for the binary dithering technique.** Compared with two-frequency or multi-frequency methods, only three phase-shifting sinusoidal patterns with a single frequency are projected in our method. So the optimum defocusing degree can be guaranteed in the binary dithering technique, and high-quality sinusoidal fringe patterns can be obtained.
- **Simple, efficient and robust phase unwrapping in dynamic 3D shape measurement.** In our proposed method, the Gray-code patterns have only two gray scales, which gives high robustness and anti-noise ability for phase unwrapping. Compared with traditional Gray-code-based methods, the pixel-wise threshold and the coding strategy of the shifting Gray-code light avoid phase unwrapping errors on the boundaries of adjacent Gray-code words caused by the defocus of the projector, the non-uniform reflectivity of the surface, ambient light variations and the motion of the objects. So complex dynamic scenes can be measured without phase unwrapping errors, and since no extra patterns are projected, the method is suitable for high-speed 3D measurement.
- **Absolute 3D shape measurement of multiple randomly moving objects in a large depth range.** As shown in the experiment on collapsing building blocks, our approach is capable of restoring the absolute 3D geometries of many spatially isolated, randomly moving objects with sharp edges. Compared with our previous CCGC method [33], the proposed method has a much larger depth sensing range, as shown in the comparative experiment; the depth sensing range is limited only by the need to maintain a suitable defocus level for measurement accuracy in the binary defocusing technique.

## 5. Conclusion

In this study, a high-speed 3D shape measurement method based on shifting Gray-code light has been proposed. The dithering technique is employed to produce three-step phase-shifting sinusoidal fringes, and the wrapped phase is calculated using the phase-shifting algorithm. Then, the SGC TPU algorithm is proposed to eliminate the phase ambiguity. Firstly, the average intensity of the three captured fringe images is used as a pixel-wise threshold to binarize the Gray codes and to eliminate most of the phase unwrapping errors caused by the non-uniform reflectivity of the tested surface, the background light intensity and the defocus of the projector. Then, two sets of decoding words whose boundaries are staggered in the temporal sequences are built to avoid the remaining phase unwrapping errors on the edges of the code words. In the SGC TPU algorithm, no additional patterns are projected, and simple, efficient and robust phase unwrapping can be achieved in high-speed dynamic measurement. Finally, the 3D results are obtained by system calibration. Comparative experiments verified that the proposed SGC method outperforms the CCGC method [33] owing to its larger measuring depth range. The experimental results on randomly collapsing objects in a large depth range demonstrate that our method can reliably obtain high-quality shape and texture information at a rate of 310 fps.

## Funding

National Natural Science Foundation of China (61675141).

## References

**1. **K. R. Ford, G. D. Myer, and T. E. Hewett, “Reliability of landing 3D motion analysis: implications for longitudinal analyses,” Med. Sci. Sports Exerc. **39**(11), 2021–2028 (2007). [CrossRef] [PubMed]

**2. **E. Malamas, E. Petrakis, M. Zervakis, L. Petit, and J. Legat, “A survey on industrial vision systems, applications and tools,” Image Vis. Comput. **21**(2), 171–188 (2003). [CrossRef]

**3. **F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. **39**(1), 10–22 (2000). [CrossRef]

**4. **S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. **48**(2), 149–158 (2010). [CrossRef]

**5. **S. Zhang, “High-speed 3D shape measurement with structured light methods: A review,” Opt. Lasers Eng. **106**, 119–131 (2018). [CrossRef]

**6. **S. Van der Jeught and J. J. J. Dirckx, “Real-time structured light profilometry: a review,” Opt. Lasers Eng. **87**, 18–31 (2016). [CrossRef]

**7. **C. Zuo, Q. Chen, G. Gu, S. Feng, and F. Feng, “High-speed three-dimensional profilometry for multiple objects with complex shapes,” Opt. Express **20**(17), 19493–19510 (2012). [CrossRef] [PubMed]

**8. **M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. **72**(1), 156–160 (1982). [CrossRef]

**9. **M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. **22**(24), 3977–3982 (1983). [CrossRef] [PubMed]

**10. **X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. **35**(5), 263–284 (2001). [CrossRef]

**11. **Q. Zhang and X. Su, “An optical measurement of vortex shape at a free surface,” Opt. Laser Technol. **34**(2), 107–113 (2002). [CrossRef]

**12. **Q. Zhang, X. Su, Y. Cao, Y. Li, L. Xiang, and W. Chen, “Optical 3D shape and deformation measurement of rotating blades using stroboscopic structured illumination,” Opt. Eng. **44**(11), 113601 (2005). [CrossRef]

**13. **Q. Zhang and X. Su, “High-speed optical measurement for the drumhead vibration,” Opt. Express **13**(8), 3110–3116 (2005). [CrossRef] [PubMed]

**14. **C. Zuo, T. Tao, S. Feng, L. Huang, A. Asundi, and Q. Chen, “Micro Fourier Transform Profilometry (*μ*FTP): 3D shape measurement at 10,000 frames per second,” Opt. Lasers Eng. **102**, 70–91 (2018). [CrossRef]

**15. **S. Lei and S. Zhang, “Flexible 3-D shape measurement using projector defocusing,” Opt. Lett. **34**(20), 3080–3082 (2009). [CrossRef] [PubMed]

**16. **C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, “Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review,” Opt. Lasers Eng. **85**, 84–103 (2016). [CrossRef]

**17. **J. M. Huntley and H. Saldner, “Temporal phase-unwrapping algorithm for automated interferogram analysis,” Appl. Opt. **32**(17), 3047–3052 (1993). [CrossRef] [PubMed]

**18. **H. O. Saldner and J. M. Huntley, “Temporal phase unwrapping: application to surface profiling of discontinuous objects,” Appl. Opt. **36**(13), 2770–2775 (1997). [CrossRef] [PubMed]

**19. **Y. Y. Cheng and J. C. Wyant, “Two-wavelength phase shifting interferometry,” Appl. Opt. **23**(24), 4539–4543 (1984). [CrossRef] [PubMed]

**20. **Y. Y. Cheng and J. C. Wyant, “Multiple-wavelength phase-shifting interferometry,” Appl. Opt. **24**(6), 804–807 (1985). [CrossRef] [PubMed]

**21. **G. Sansoni, S. Corini, S. Lazzari, R. Rodella, and F. Docchio, “Three-dimensional imaging based on Gray-code light projection: characterization of the measuring algorithm and development of a measuring system for industrial applications,” Appl. Opt. **36**(19), 4463–4472 (1997). [CrossRef] [PubMed]

**22. **G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors,” Appl. Opt. **38**(31), 6565–6573 (1999). [CrossRef] [PubMed]

**23. **V. Gushov and Y. Solodkin, “Automatic processing of fringe patterns in integer interferometers,” Opt. Lasers Eng. **14**(4–5), 311–324 (1991). [CrossRef]

**24. **J. Zhong and M. Wang, “Phase unwrapping by a lookup table method: application to phase maps with singular points,” Opt. Eng. **38**(12), 2075–2080 (1999). [CrossRef]

**25. **Y. Wang and S. Zhang, “Superfast multifrequency phase-shifting technique with optimal pulse width modulation,” Opt. Express **19**(6), 5149–5155 (2011). [CrossRef] [PubMed]

**26. **Y. Wang, J. I. Laughner, I. R. Efimov, and S. Zhang, “3D absolute shape measurement of live rabbit hearts with a superfast two-frequency phase-shifting technique,” Opt. Express **21**(5), 5822–5832 (2013). [CrossRef] [PubMed]

**27. **C. Zuo, Q. Chen, G. Gu, S. Feng, F. Feng, R. Li, and G. Shen, “High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection,” Opt. Lasers Eng. **51**(8), 953–960 (2013). [CrossRef]

**28. **W. Yin, C. Zuo, S. Feng, T. Tao, Y. Hu, L. Huang, J. Ma, and Q. Chen, “High-speed three-dimensional shape measurement using geometry-constraint-based number-theoretical phase unwrapping,” Opt. Lasers Eng. **115**, 21–31 (2019). [CrossRef]

**29. **Y. Wang, S. Zhang, and J. H. Oliver, “3D shape measurement technique for multiple rapidly moving objects,” Opt. Express **19**(9), 8539–8545 (2011). [CrossRef] [PubMed]

**30. **J. I. Laughner, S. Zhang, H. Li, C. C. Shao, and I. R. Efimov, “Mapping cardiac surface mechanics with structured light imaging,” Am. J. Physiol. Heart Circ. Physiol. **303**(6), H712–H720 (2012). [PubMed]

**31. **D. Zheng, F. Da, and H. Huang, “Phase unwrapping for fringe projection three-dimensional measurement with projector defocusing,” Opt. Eng. **55**(3), 034107 (2016). [CrossRef]

**32. **D. Zheng, F. Da, Q. Kemao, and H. S. Seah, “Phase-shifting profilometry combined with Gray-code patterns projection: unwrapping error removal by an adaptive median filter,” Opt. Express **25**(5), 4700–4713 (2017). [CrossRef] [PubMed]

**33. **Z. Wu, C. Zuo, W. Guo, T. Tao, and Q. Zhang, “High-speed three-dimensional shape measurement based on cyclic complementary Gray-code light,” Opt. Express **27**(2), 1283–1297 (2019). [CrossRef] [PubMed]

**34. **Y. Wang and S. Zhang, “Three-dimensional shape measurement with binary dithered patterns,” Appl. Opt. **51**(27), 6631–6636 (2012). [CrossRef] [PubMed]

**35. **K. Creath, “Phase-measurement interferometry techniques,” Prog. Opt. **26**, 349–393 (1988). [CrossRef]

**36. **Z. Zeng, Y. Fu, B. Li, and M. Chai, “Complex surface three-dimensional shape measurement method based on defocused gray code plus phase-shifting,” Opt. Rev. **23**(4), 628–636 (2016). [CrossRef]

**37. **W. Li, X. Su, and Z. Liu, “Large-scale three-dimensional object measurement: a practical coordinate mapping and image data-patching method,” Appl. Opt. **40**(20), 3326–3333 (2001). [CrossRef] [PubMed]

**38. **Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Mach. Intell. **22**(11), 1330–1334 (2000). [CrossRef]