BEIJING, July 6, 2023 /PRNewswire/ — WiMi Hologram Cloud Inc. (NASDAQ: WIMI) (“WiMi” or the “Company”), a leading global Hologram Augmented Reality (“AR”) Technology provider, today announced that it is researching an interactive driving simulation system based on VR technologies. The system combines 3D virtual reality, information transmission, computer vision recognition, virtual-world interaction, and other functions to realize physical interaction and virtual driving simulation in real time.
The system is based on a rationalized virtual simulation of actual data to visualize virtual urban traffic road scenes. First, it acquires urban road traffic data and builds and optimizes the virtual scene model from the acquired primary road traffic data. It then uses the rendering functions of the virtual simulation engine to render the whole urban traffic road model and generate the entire virtual scene. Next, the results of gesture recognition are imported into the virtual model to realize interaction with the virtual scene. Finally, the user wears a virtual reality headset for an immersive perceptual experience.
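For illustration only, the minimal Python sketch below outlines the acquire-build-render-interact flow described above; the class and function names are assumptions made for this example rather than WiMi’s actual implementation.

```python
# A minimal sketch of the scene-generation pipeline described above.
# All names here are hypothetical and only illustrate the
# acquire -> build -> render -> apply-gesture flow.

from dataclasses import dataclass, field
from typing import List


@dataclass
class RoadSegment:
    start: tuple        # (x, y) of the segment start
    end: tuple          # (x, y) of the segment end
    lanes: int


@dataclass
class UrbanScene:
    roads: List[RoadSegment] = field(default_factory=list)
    buildings: List[str] = field(default_factory=list)


def acquire_road_data() -> List[RoadSegment]:
    """Stand-in for importing primary urban road traffic data."""
    return [RoadSegment(start=(0, 0), end=(100, 0), lanes=2)]


def build_scene(roads: List[RoadSegment]) -> UrbanScene:
    """Build and optimize the virtual scene model from the acquired data."""
    return UrbanScene(roads=roads, buildings=["block_a", "block_b"])


def render(scene: UrbanScene) -> None:
    """Placeholder for the simulation engine's rendering pass."""
    print(f"Rendering {len(scene.roads)} road segments and "
          f"{len(scene.buildings)} buildings")


def apply_gesture(scene: UrbanScene, gesture: str) -> None:
    """Feed a recognized gesture into the virtual model."""
    print(f"Applying gesture '{gesture}' to the scene")


if __name__ == "__main__":
    scene = build_scene(acquire_road_data())
    render(scene)
    apply_gesture(scene, "steer_left")  # the headset display step is omitted here
```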
The interactive driving simulation system based on VR technologies by WiMi consists of three main modules as follows:
(1) Construction of virtual scenes
The system quickly constructs the city roads and terrain and simulates a 3D terrain scene of citywide road traffic. It then models the city buildings and road paving and optimizes the virtual scene, forming a highly realistic three-dimensional virtual scene of urban road traffic. The system also collects vehicle data to give the car’s driving environment and equipment a strong sense of authenticity. It simulates the driving state of the vehicle, for example, the steering wheel and seat inside the car, the lights and tires outside the vehicle, and the changes of traffic signals, mainly presenting the vehicle’s internal equipment, driving conditions, intersection signals, and other key interactive elements so that the user can roam the virtual scene of the city traffic roads.
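As an illustration, the sketch below models the kind of vehicle equipment and traffic-signal state such a scene could expose; the field names and the signal cycle are assumptions made for this example.

```python
# A minimal sketch of interactive vehicle and signal state in the scene.
# The field names and the signal phase cycle are illustrative assumptions.

from dataclasses import dataclass, field
from itertools import cycle


@dataclass
class VehicleState:
    """Interior and exterior equipment shown to the user while roaming."""
    steering_angle_deg: float = 0.0   # steering wheel position
    seat_position: float = 0.5        # 0.0 = fully forward, 1.0 = fully back
    lights_on: bool = False
    tire_rotation_deg: float = 0.0    # animates the wheels as the car moves


@dataclass
class TrafficSignal:
    """A signalized intersection whose phase changes over time."""
    phases: cycle = field(default_factory=lambda: cycle(["red", "green", "yellow"]))
    current: str = "red"

    def advance(self) -> str:
        """Move the signal to its next phase and return it."""
        self.current = next(self.phases)
        return self.current


if __name__ == "__main__":
    car = VehicleState(lights_on=True)
    signal = TrafficSignal()
    print(car.lights_on, signal.advance())
```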
In the virtual driving simulation system, some objects, in addition to ordinary scene objects, have specific interactions with the user, for example, the car tires, the simulated vehicle driving status, vehicle-related equipment, and traffic lights. The system requires detailed models, animations, and other designs for such specific objects, which improves the fidelity of the virtual scene while also enhancing the visualization effect and interactive performance.
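A minimal sketch of how such specific objects could register detailed models, animations, and interaction handlers is shown below; the object names and file paths are hypothetical and do not describe WiMi’s actual assets.

```python
# A minimal sketch of registering specific interactive objects (mirrors,
# traffic lights, and so on); all names and paths are hypothetical.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class InteractableObject:
    name: str
    model_path: str                 # detailed mesh used for close-up viewing
    animation: str                  # clip played when the object is used
    on_interact: Callable[[], str]  # behavior triggered by a user gesture


def build_interactables() -> Dict[str, InteractableObject]:
    """Register the scene's specific interactive objects."""
    return {
        "rear_view_mirror": InteractableObject(
            name="rear_view_mirror",
            model_path="models/mirror_high_detail.fbx",
            animation="mirror_fold",
            on_interact=lambda: "mirror toggled",
        ),
        "traffic_light": InteractableObject(
            name="traffic_light",
            model_path="models/traffic_light.fbx",
            animation="phase_change",
            on_interact=lambda: "signal advanced",
        ),
    }


if __name__ == "__main__":
    objects = build_interactables()
    print(objects["rear_view_mirror"].on_interact())
```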
(2) System design and implementation
Interactive operation is the basis of the virtual simulation interactive scene experience. In the virtual scene, the user can drive the vehicle freely through gestures and control the car from the inside, such as opening and closing the rear-view mirrors or interacting with the traffic lights. The interactive operations of the virtual scene include changes to object geometry, ray detection, collision detection between objects, and the triggering of events.
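For illustration, the sketch below shows two of these primitives, collision detection between axis-aligned bounding boxes and the triggering of an event when the vehicle overlaps a zone; the geometry and event names are assumptions made for this example.

```python
# A minimal sketch of collision detection and event triggering.
# The bounding boxes and event names are illustrative assumptions.

from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class AABB:
    """Axis-aligned bounding box used for simple collision checks."""
    min_x: float
    min_y: float
    max_x: float
    max_y: float

    def intersects(self, other: "AABB") -> bool:
        """Return True if the two boxes overlap."""
        return (self.min_x <= other.max_x and self.max_x >= other.min_x and
                self.min_y <= other.max_y and self.max_y >= other.min_y)


def check_triggers(vehicle: AABB, zones: List[Tuple[str, AABB]],
                   fire: Callable[[str], None]) -> None:
    """Fire an event for every trigger zone the vehicle currently overlaps."""
    for event_name, box in zones:
        if vehicle.intersects(box):
            fire(event_name)


if __name__ == "__main__":
    car = AABB(0, 0, 2, 4)
    stop_line = AABB(1, 3, 5, 5)
    check_triggers(car, [("entered_intersection", stop_line)], print)
```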
Implementing interactive operations between the user and the virtual world requires recognition devices to capture information about the user’s movements. Features are extracted through intelligent algorithms and convolutional neural networks to obtain classification results. The classification information is then input to the corresponding object in the virtual scene through a specific interface, and the related parameters are assigned or modified. The recognition results are fed back directly into vehicle control, so human motion or gesture information can control the vehicle’s driving direction, driving speed, and so on. Take the vehicle driving module and the traffic signal module as an example: the output of a gesture command modifies the tire state in the vehicle driving module, which in turn changes the vehicle’s driving behavior. Virtual reality equipment then reproduces the driving situation inside the car, truly realizing an immersive virtual reality experience.
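The sketch below illustrates this idea with a small convolutional classifier, assuming PyTorch is available, whose predicted gesture class is mapped to a driving command; the network architecture, gesture labels, and command mapping are assumptions made for this example, not WiMi’s actual model.

```python
# A minimal sketch of gesture classification feeding vehicle control.
# The architecture, labels, and command mapping are hypothetical.

import torch
import torch.nn as nn


class GestureCNN(nn.Module):
    """Small convolutional classifier over single-channel gesture images."""

    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))


# Hypothetical mapping from predicted gesture class to a driving command.
GESTURE_TO_COMMAND = {0: "accelerate", 1: "brake", 2: "steer_left", 3: "steer_right"}


def gesture_to_command(model: nn.Module, frame: torch.Tensor) -> str:
    """Classify one gesture frame and return the driving command it triggers."""
    with torch.no_grad():
        logits = model(frame.unsqueeze(0))       # add a batch dimension
        class_idx = int(logits.argmax(dim=1))
    return GESTURE_TO_COMMAND[class_idx]


if __name__ == "__main__":
    model = GestureCNN().eval()
    dummy_frame = torch.rand(1, 64, 64)          # 1-channel 64x64 gesture image
    print(gesture_to_command(model, dummy_frame))
```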
(3) Data management
The system transmits data between modules in real time to ensure that data transmission and the interaction events triggered between modules in the virtual simulation system are handled reliably; the system structure is revised and managed as appropriate to optimize the system’s performance.
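As an illustration, the sketch below passes data between modules through a simple in-process publish/subscribe bus; the topic names and payloads are hypothetical and do not describe WiMi’s actual data management design.

```python
# A minimal sketch of real-time data exchange between modules via a
# publish/subscribe bus; topics and payloads are illustrative assumptions.

from collections import defaultdict
from typing import Any, Callable, DefaultDict, List


class ModuleBus:
    """Routes messages between the scene, interaction, and data modules."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        """Register a handler to be called for every message on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        """Deliver a payload to every handler subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(payload)


if __name__ == "__main__":
    bus = ModuleBus()
    # The driving module listens for gesture commands from the recognition module.
    bus.subscribe("gesture_command", lambda cmd: print(f"driving module got: {cmd}"))
    bus.publish("gesture_command", {"action": "steer_left", "amount": 0.3})
```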
WiMi’s research on VR-based interactive driving simulation systems can provide safe, efficient, and low-cost driving training and evaluation. The virtual-real interactive vehicle driving simulation system has been widely used in automobile manufacturing, driving training, traffic management, and other fields. In the future, with the development of artificial intelligence and autonomous driving technology, the virtual-real interactive vehicle driving simulation system will play an even more important role. At the same time, it is also expected to be applied to automobile design, intelligent transportation systems, virtual city construction, and other fields.
About WIMI Hologram Cloud
WIMI Hologram Cloud, Inc. (NASDAQ:WIMI) is a holographic cloud comprehensive technical solution provider that focuses on professional areas including holographic AR automotive HUD software, 3D holographic pulse LiDAR, head-mounted light field holographic equipment, holographic semiconductor, holographic cloud software, holographic car navigation and others. Its services and holographic AR technologies include holographic AR automotive application, 3D holographic pulse LiDAR technology, holographic vision semiconductor technology, holographic software development, holographic AR advertising technology, holographic AR entertainment technology, holographic ARSDK payment, interactive holographic communication and other holographic AR technologies.
Safe Harbor Statements
This press release contains “forward-looking statements” within the meaning of the Private Securities Litigation Reform Act of 1995. These forward-looking statements can be identified by terminology such as “will,” “expects,” “anticipates,” “future,” “intends,” “plans,” “believes,” “estimates,” and similar statements. Statements that are not historical facts, including statements about the Company’s beliefs and expectations, are forward-looking statements. Among other things, the business outlook and quotations from management in this press release and the Company’s strategic and operational plans contain forward-looking statements. The Company may also make written or oral forward-looking statements in its periodic reports to the US Securities and Exchange Commission (“SEC”) on Forms 20-F and 6-K, in its annual report to shareholders, in press releases, and other written materials, and in oral statements made by its officers, directors or employees to third parties. Forward-looking statements involve inherent risks and uncertainties. Several factors could cause actual results to differ materially from those contained in any forward-looking statement, including but not limited to the following: the Company’s goals and strategies; the Company’s future business development, financial condition, and results of operations; the expected growth of the AR holographic industry; and the Company’s expectations regarding demand for and market acceptance of its products and services.
Further information regarding these and other risks is included in the Company’s annual report on Form 20-F and the current report on Form 6-K and other documents filed with the SEC. All information provided in this press release is as of the date of this press release. The Company does not undertake any obligation to update any forward-looking statement except as required under applicable laws.