VR To Enhance NYU Studio Art

A 2020-2021 exploration of virtual reality's potential to enhance a studio art class.

By David Lasala

In the summer of 2020, the provost for educational technology funded a collaboration to explore virtual reality as part of Projects in Ceramics, a class in Steinhardt's Department of Art and Art Professions taught by Professor Linda Sormin. With contributions from Paula Rondon of Steinhardt's Digital Studio and Claire Menegus of the NYU Usability Lab, the funding provided:

  • Reality Capture software to convert sculptures into 3D models.

  • Sketchfab, a repository for storing/showing 3D content.

  • 10 Quest 2 Virtual Reality Headsets.

[Image: Quest 2 all-in-one VR. Multiple Oculus Quest 2 headsets on a desk.]

[Image: Conversion of mixed media art. 3D scanning software interface.]

[Image: Sketchfab content hosting.]

iTLAB's initial goal, offsetting the impact of the coronavirus lockdown on classes, merged with Linda's goals, which included:

  • Using VR to display and critique student work.

  • Examining a VR collaboration between art and science with Dr. Lloyd Brown, a transplant surgeon from Rutgers.

  • Assessing the potential of virtual reality to enhance sculptural installation art.


In September, after coordinating roles and acquiring hardware and software, we turned to converting the student sculptures for use in a fully virtual class environment. Our method for converting real-world objects into 3D models worked well, and the environments we created not only displayed the converted artwork effectively (see below) but also gave us a great deal of freedom over the “installation” of the objects.

[Images: three 3D digital sculptures.]

During classes, we watched students explore VR actions that are impossible in real life, like instantly scaling objects. Sculptures that are normally a few feet in diameter were scaled to 15+ feet in virtual reality with a simple pinch/zoom gesture. Of course, there were challenges, predictably in the areas of comfort and ease of use.


  • Individual user comfort remains unpredictable. Some students take to VR effortlessly, while others experience issues ranging from feeling “seasick” to claustrophobia.

  • Technical issues arising from the many dependencies of multi-user virtual reality also make the class experience less predictable.
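The two-hand pinch/zoom scaling mentioned above has a simple core idea: the object's scale is multiplied by the ratio of the current distance between the user's controllers to their distance when the gesture began. We don't know how the VR software actually implements it; the sketch below is an illustrative Python version with hypothetical names and clamp values, not the product's code.

```python
import math

def controller_distance(left, right):
    """Euclidean distance between two controller positions (x, y, z) in meters."""
    return math.dist(left, right)

def pinch_scale(start_left, start_right, cur_left, cur_right, base_scale,
                min_scale=0.1, max_scale=20.0):
    """Return the new uniform scale for an object during a two-hand pinch.

    base_scale is the object's scale when the gesture began. The scale
    grows or shrinks by the ratio of the current hand separation to the
    separation at gesture start, clamped to a usable range.
    """
    start_dist = controller_distance(start_left, start_right)
    cur_dist = controller_distance(cur_left, cur_right)
    if start_dist == 0:
        return base_scale  # degenerate gesture; leave the scale unchanged
    new_scale = base_scale * (cur_dist / start_dist)
    return max(min_scale, min(max_scale, new_scale))
```

For example, a sculpture at scale 1.0 grabbed with hands 0.2 m apart grows to scale 5.0 when the hands spread to 1.0 m apart, which is how a few-foot sculpture can reach 15+ feet in a single gesture.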


But these challenges can be addressed with time and continued resource development. Foremost, our development of in-house expertise will support better training for faculty, while in-person monitoring of VR users will allow us to quickly respond to issues. Improvements in the technology will eventually allow users to interact with their own hands, removing the need to learn controller inputs, which is currently a barrier to easier adoption. Also, better lens resolution, refresh rate, and processing power in the virtual reality hardware will contribute to overall user comfort and immersion.