Visual art evaluation methods in future digital works: From virtual reality to metaverse
Article type: Research Article
Authors: Hu, Zhexian; *
Affiliations: Arts and Social Science Faculty, University of Sydney, Sydney, NSW, Australia
Correspondence: [*] Corresponding author. Zhexian Hu. E-mail: h18758434346@163.com.
Abstract: The motivation for this paper is that in recent years the metaverse, as one of the newest and most widely discussed concepts worldwide, has been applied and studied across many industries, including economic management, art and design, and education. However, the academic and scientific communities have not yet reached a consensus on whether the metaverse should be defined as a technology or as an intelligent scene. We argue that the metaverse should be treated as a key concept and an emerging theory for constructing future intelligent fields. The objective of this study is therefore to focus on visual art evaluation in digital works and to propose a visual art quality evaluation method for future metaverse digital works, based on quality function deployment (QFD) theory from marketing and on fuzzy mathematics. The second core contribution of this study is a field framework for the visual art evaluation of future digital works in the metaverse, built by synthesizing current international and domestic understandings of the metaverse concept. In addition, taking visual art quality evaluation as the research object, we construct a visual art quality evaluation index system for digital works in the metaverse context, comprising one first-level index, three second-level indexes, and nine third-level indexes. We also propose a new fuzzy evaluation method, the G1-entropy method, which combines a subjective weighting method (the G1 method) with an objective weighting method (the entropy method) and produces the final rating from the combined G1-entropy weights.
This study fills gaps in the theory of visual art evaluation of future digital works under the metaverse concept, analyzes new concepts while improving established methods, builds a new scenario that organically combines new technology with traditional visual art, and offers new ideas for improving art quality at home and abroad in the future. The contributions of this research include the following three aspects: (1) we construct the metaverse field structure of digital works: by analyzing current international and domestic literature on the application of metaverse technology, especially the metaverse concept in art scenes, we propose a field structure for online visual art that incorporates the metaverse concept, with blockchain technology, artificial intelligence technology, interaction technology, and Internet of Things technology as its four characteristics; (2) methodological contribution: taking visual art quality evaluation as the research object, we construct an index system for the visual art quality evaluation of digital works in the metaverse context and propose the G1-entropy evaluation method, which combines subjective expert weighting (the G1 method) with objective entropy weighting; (3) using the method proposed in (2), we compute and rank the importance of the nine indicators in a practical case and give countermeasures based on the resulting indicator importance. In conclusion, this study realizes the construction of a new application scenario for the concept and a new improvement of the method, and can provide theoretical and practical case support for improving the quality of international and domestic metaverse visual art.
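The combination of subjective and objective weights described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it uses the standard forms of the G1 (ordinal relation) method and the entropy weight method, and a linear combination with an assumed mixing coefficient `alpha`, since the abstract does not specify how the two weight vectors are fused.

```python
import math

def g1_weights(ratios):
    """Subjective G1 (ordinal relation) weights.

    `ratios` = [r_2, ..., r_n], where r_k is the expert's importance
    ratio w_{k-1} / w_k between the criteria ranked (k-1)-th and k-th.
    """
    n = len(ratios) + 1
    # w_n = 1 / (1 + sum_{k=2..n} prod_{i=k..n} r_i)
    total = 0.0
    for k in range(2, n + 1):
        prod = 1.0
        for i in range(k, n + 1):
            prod *= ratios[i - 2]
        total += prod
    w = [0.0] * n
    w[-1] = 1.0 / (1.0 + total)
    for k in range(n, 1, -1):          # back-substitute: w_{k-1} = r_k * w_k
        w[k - 2] = ratios[k - 2] * w[k - 1]
    return w

def entropy_weights(X):
    """Objective entropy weights for an m x n score matrix X (rows = rated items)."""
    m, n = len(X), len(X[0])
    col_sums = [sum(row[j] for row in X) for j in range(n)]
    e = []
    for j in range(n):
        s = 0.0
        for row in X:
            p = row[j] / col_sums[j]   # column-normalized proportion
            if p > 0:
                s += p * math.log(p)
        e.append(-s / math.log(m))     # entropy of criterion j, in [0, 1]
    d = [1.0 - ej for ej in e]         # divergence: higher spread -> more weight
    return [dj / sum(d) for dj in d]

def combined_weights(w_subj, w_obj, alpha=0.5):
    """Linear fusion of the two weight vectors; `alpha` is an assumed
    subjective share, not a value given in the paper."""
    return [alpha * ws + (1 - alpha) * wo for ws, wo in zip(w_subj, w_obj)]
```

For example, with expert ratios `[1.2, 1.5]` over three ranked criteria, `g1_weights` returns a decreasing weight vector summing to 1, while `entropy_weights` assigns zero weight to any criterion whose scores are identical across all items (it carries no discriminating information).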
Keywords: Metaverse, visual art, field architecture, quality function deployment, G1-entropy method
DOI: 10.3233/JIFS-223376
Journal: Journal of Intelligent & Fuzzy Systems, vol. 45, no. 2, pp. 2347-2365, 2023