Electrical Engineering and Systems Science > Image and Video Processing

Title: Divide-Conquer-and-Merge: Memory- and Time-Efficient Holographic Displays

Abstract: Recently, deep learning-based computer-generated holography (CGH) has demonstrated tremendous potential in three-dimensional (3D) displays and yielded impressive display quality. However, most existing deep learning-based CGH techniques can only generate holograms at 1080p resolution, far from the ultra-high resolution (16K+) required for practical virtual reality (VR) and augmented reality (AR) applications to support a wide field of view and large eye box. One of the major obstacles in current CGH frameworks is the limited memory available on consumer-grade GPUs, which cannot accommodate the generation of higher-definition holograms. To overcome this challenge, we propose a divide-conquer-and-merge strategy to address the scarcity of memory and computational capacity in ultra-high-definition CGH generation. This algorithm empowers existing CGH frameworks to synthesize higher-definition holograms at a faster speed while maintaining high-fidelity image display quality. Both simulations and experiments were conducted to demonstrate the capabilities of the proposed framework. By integrating our strategy into HoloNet and CCNNs, we reduced GPU memory usage during training by 64.3\% and 12.9\%, respectively, and accelerated hologram generation by up to 3$\times$ and 2$\times$, respectively. In particular, we successfully trained and inferred 8K-definition holograms on an NVIDIA GeForce RTX 3090 GPU for the first time in simulations. We also conducted full-color optical experiments to verify the effectiveness of our method. We believe our strategy can provide a novel approach to memory- and time-efficient holographic displays.
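For intuition only, the sketch below illustrates how a divide-conquer-and-merge pipeline for hologram synthesis could be structured in PyTorch: split the high-resolution target into tiles, predict a phase tile for each, and stitch the results back together. The `TinyPhasePredictor` network, the non-overlapping tiling, and the direct stitching in `merge` are illustrative assumptions, not the authors' implementation, which builds on HoloNet and CCNNs and is described in the paper.

```python
# Illustrative sketch (not the authors' code): tile a high-resolution target,
# run a per-tile hologram predictor, and stitch the predicted phase tiles back.
import torch
import torch.nn as nn


class TinyPhasePredictor(nn.Module):
    """Placeholder stand-in for a CGH network (e.g., a HoloNet-style model)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, x):
        # Map an amplitude tile to a phase tile in (-pi, pi).
        return torch.pi * torch.tanh(self.net(x))


def divide(image, tile):
    """Split a (1, 1, H, W) image into non-overlapping (th, tw) tiles."""
    _, _, h, w = image.shape
    th, tw = tile
    return [image[:, :, i:i + th, j:j + tw]
            for i in range(0, h, th) for j in range(0, w, tw)]


def merge(tiles, shape, tile):
    """Stitch per-tile phases back into a full (1, 1, H, W) hologram."""
    h, w = shape
    th, tw = tile
    out = torch.zeros(1, 1, h, w)
    k = 0
    for i in range(0, h, th):
        for j in range(0, w, tw):
            out[:, :, i:i + th, j:j + tw] = tiles[k]
            k += 1
    return out


if __name__ == "__main__":
    model = TinyPhasePredictor()
    target = torch.rand(1, 1, 2160, 3840)   # 4K amplitude target
    tile = (1080, 1920)                     # conquer each quadrant separately
    with torch.no_grad():
        phase_tiles = [model(t) for t in divide(target, tile)]
    hologram = merge(phase_tiles, target.shape[-2:], tile)
    print(hologram.shape)                   # torch.Size([1, 1, 2160, 3840])
```

Because each tile is processed independently, peak GPU memory scales with the tile size rather than the full hologram resolution, which is the intuition behind the reported memory and speed gains; the paper's actual merge step and network integration differ in detail.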
Comments: This paper has been accepted as a conference paper at IEEE VR 2024
Subjects: Image and Video Processing (eess.IV); Graphics (cs.GR); Optics (physics.optics)
Cite as: arXiv:2404.10777 [eess.IV]
  (or arXiv:2404.10777v1 [eess.IV] for this version)

Submission history

From: Yuye Ling [view email]
[v1] Sun, 25 Feb 2024 13:58:03 GMT (10084kb,D)
