BEIJING, Sept. 5, 2023 /PRNewswire/ — WiMi Hologram Cloud Inc. (NASDAQ: WIMI) (“WiMi” or the “Company”), a leading global Hologram Augmented Reality (“AR”) Technology provider, today announced that it has developed an assistive robotic technology and control approach based on a hybrid brain-computer interface (“BCI”). The technology combines an eye-tracker, an EEG recording device, a webcam, and a robotic arm, enabling the user to accurately control the movement of the robotic arm through the hybrid gaze-based BCI.

This assistive robot control technology allows users to control the movement of the robotic arm’s end-effector through the hybrid BCI, enabling more precise and flexible manipulation. The technology was developed to improve the robot’s grasping performance, with a particular focus on the reaching phase so that the grasping task itself can be automated. To achieve this goal, the development team divided the task into three key phases and leveraged natural human visual-motor coordination.

First, the user specifies the target location for the assistive robot using the hybrid BCI in discrete selection mode. A virtual rectangle appears around the target, confirming to the user that the target position has been successfully communicated to the assistive robot. The system then automatically switches to continuous velocity control mode and enters the second phase, in which the user steers the robotic arm’s end-effector with the hybrid BCI while avoiding collisions with obstacles. Once the end-effector enters a pre-specified area directly above the target object, it automatically stops and hovers over the target. Finally, a pre-programmed grasping routine is executed: the end-effector moves downward, adjusts the gripper orientation to match the orientation of the target in the workspace, and grasps the object. This design effectively reduces the number of degrees of freedom the user must command while still allowing the object to be reached in three dimensions.
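The flow described above can be illustrated with a minimal sketch. The code below is a simplified, hypothetical state machine for the three phases; the callback names (get_user_velocity, get_effector_pos, send_velocity, grasp) and the hover radius are illustrative assumptions, not WiMi’s actual interfaces.

```python
# A minimal sketch of the three-phase control flow; interfaces are hypothetical.
from enum import Enum, auto

import numpy as np


class Phase(Enum):
    TARGET_SELECTION = auto()   # discrete selection mode (gaze-based)
    VELOCITY_CONTROL = auto()   # continuous velocity control mode
    AUTO_GRASP = auto()         # pre-programmed grasping routine


HOVER_RADIUS = 0.03  # metres: assumed size of the area directly above the target


def control_loop(target_xy, get_user_velocity, get_effector_pos, send_velocity, grasp):
    """Step through the three phases: select target, steer above it, then grasp."""
    phase = Phase.TARGET_SELECTION
    while phase is not Phase.AUTO_GRASP:
        if phase is Phase.TARGET_SELECTION:
            # Target confirmed (virtual rectangle shown); switch control modes.
            phase = Phase.VELOCITY_CONTROL
        elif phase is Phase.VELOCITY_CONTROL:
            send_velocity(get_user_velocity())           # user steers the end-effector
            pos = np.asarray(get_effector_pos())
            if np.linalg.norm(pos[:2] - np.asarray(target_xy)) < HOVER_RADIUS:
                send_velocity(np.zeros(3))               # stop and hover over the target
                phase = Phase.AUTO_GRASP
    grasp()  # descend, orient the gripper, and close on the object
```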

One of the key points of the technology is the application of the hybrid BCI. It combines gaze tracking and brain-computer interface technologies to control the robot through a discrete selection mode and a continuous velocity control mode. In discrete selection mode, the user inputs the target position by gazing at it; the system then automatically switches to continuous velocity control mode, which moves the robotic arm’s end-effector toward the target position according to the user’s velocity commands.
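As a hedged illustration of the discrete selection mode, the sketch below implements a common dwell-time rule: a target is selected once the gaze rests inside its bounding box for a fixed duration. The stream format, box layout, and one-second threshold are assumptions for illustration, not details taken from WiMi’s system.

```python
# A possible dwell-based selection rule for the discrete mode (illustrative only).
import time

DWELL_SECONDS = 1.0  # how long the gaze must rest on a target to select it (assumed)


def select_target_by_dwell(gaze_stream, target_boxes):
    """Return the index of the first target box the user fixates for DWELL_SECONDS."""
    current, since = None, None
    for x, y in gaze_stream:  # gaze_stream yields (x, y) points at the tracker's rate
        hit = next((i for i, (x0, y0, x1, y1) in enumerate(target_boxes)
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != current:
            current, since = hit, time.monotonic()      # gaze moved to a new region
        elif hit is not None and time.monotonic() - since >= DWELL_SECONDS:
            return hit  # selection confirmed; a virtual rectangle would be drawn here
    return None
```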

The goal of WiMi’s technology is to enable accurate intent perception, efficient motion control, and natural human-computer interaction. The underlying logic combines several technical components and algorithms to ensure the stability, reliability, and performance of the control system. The eye-tracker and the EEG recording device play a key role, capturing the user’s intent and attention by monitoring gaze and EEG signals in real time. The eye-tracker follows the user’s eye movements to determine the gaze point and direction of view, while the EEG recording device records the user’s brain activity and extracts features related to intent and attention through signal processing and analysis algorithms.
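As an illustration of the kind of feature extraction such a pipeline might use, the sketch below computes per-channel EEG band power with Welch’s method, a standard technique; the band definitions, sampling rate, and array layout are assumptions, not details of WiMi’s proprietary processing.

```python
# Band-power feature extraction from a (channels x samples) EEG array (illustrative).
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz (assumed bands)


def band_power_features(eeg, fs=250):
    """Return a flat feature vector of per-channel band powers."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)   # psd has shape (channels, freqs)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, mask].mean(axis=1))  # mean power in the band, per channel
    return np.concatenate(feats)


# Example: 8 channels x 4 seconds of synthetic data -> 24 features (8 channels x 3 bands)
features = band_power_features(np.random.randn(8, 1000))
```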

The eye-movement data and EEG signals must be processed and decoded in real time in order to extract the user’s intent and attentional state. This involves techniques such as machine learning, pattern recognition, and signal processing to recognize and decode what the user intends and where their attention is directed.
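One hedged example of this decoding step is a simple linear classifier mapping feature vectors (such as the band powers sketched above) to intent labels. The training data, label meanings, and the choice of linear discriminant analysis below are placeholders for illustration only.

```python
# A placeholder intent decoder: linear discriminant analysis over EEG features.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# X: one feature vector per EEG window (24 = 8 channels x 3 bands, matching the sketch
# above); y: 0 = "rest", 1 = "move". Both are synthetic placeholders, not real data.
X_train = np.random.rand(200, 24)
y_train = np.random.randint(0, 2, size=200)

decoder = LinearDiscriminantAnalysis()
decoder.fit(X_train, y_train)


def decode_intent(feature_vector):
    """Classify one window of features into an intent label."""
    return int(decoder.predict(feature_vector.reshape(1, -1))[0])
```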

Environment sensing and obstacle avoidance are another important component. The technology uses sensors to perceive the surrounding environment and the location of obstacles, and the resulting data and algorithms enable real-time planning of safe paths and collision avoidance; this information is combined with user commands to ensure the safety and accuracy of the robotic arm during movement. A shared controller fuses user commands with the robot’s autonomous commands to form new control commands that precisely drive the motion of the end-effector. The actuation system then converts these control commands into actual movement of the robotic arm to achieve accurate position control and gripping. This requires motion control algorithms, motion planning, and actuation control strategies to work together so that user intent is communicated precisely and the task is completed accurately.
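A minimal sketch of one possible shared-control rule follows: the user’s velocity command is blended with an autonomous repulsive command whose weight grows as the end-effector approaches an obstacle. The distances and gains are illustrative assumptions; WiMi’s actual fusion strategy is not described in this release.

```python
# A simple shared controller blending user and autonomy velocity commands (illustrative).
import numpy as np

SAFE_DIST = 0.15     # metres: beyond this, the user command passes through unchanged
REPULSE_GAIN = 0.5   # assumed gain on the obstacle-avoidance command


def shared_control(user_vel, effector_pos, obstacle_pos):
    """Fuse the user command with a repulsive autonomy command near obstacles."""
    user_vel = np.asarray(user_vel, dtype=float)
    away = np.asarray(effector_pos, dtype=float) - np.asarray(obstacle_pos, dtype=float)
    dist = np.linalg.norm(away)
    if dist >= SAFE_DIST or dist == 0:
        return user_vel                          # far from obstacles: user has full control
    alpha = 1.0 - dist / SAFE_DIST               # autonomy weight grows as distance shrinks
    avoid_vel = REPULSE_GAIN * away / dist       # push directly away from the obstacle
    return (1 - alpha) * user_vel + alpha * avoid_vel
```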

The visual feedback interface provides intuitive interaction and feedback. A GUI displays a real-time view of the robotic arm’s working area, presenting information such as the target position, obstacles, and the status of the robotic arm, so the user can intuitively understand how the system is operating. Meanwhile, augmented reality can provide enhanced visual feedback, such as displaying virtual rectangles and orientation cues for target objects, to further improve the accuracy and efficiency of operation. Through the visual feedback interface, users monitor the robot’s motion status, the position of the target object, and the system’s responses in real time, and can therefore better understand and control the system’s behavior.
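The augmented feedback described above could, for example, be rendered by drawing a virtual rectangle and a status label onto the webcam frame. The OpenCV-based sketch below is a generic illustration with placeholder coordinates and text, not the Company’s interface.

```python
# Drawing a virtual rectangle and status text on a camera frame (illustrative only).
import cv2
import numpy as np


def draw_feedback(frame, target_box, status="target confirmed"):
    """Overlay a rectangle around the selected target plus a status line."""
    x0, y0, x1, y1 = target_box
    cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 255, 0), 2)        # virtual rectangle
    cv2.putText(frame, status, (x0, max(y0 - 10, 0)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)      # status label
    return frame


# Example with a blank frame standing in for the webcam image
frame = np.zeros((480, 640, 3), dtype=np.uint8)
frame = draw_feedback(frame, (200, 150, 360, 320))
```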

The transfer, processing, and coordination of data also play an important role in the assistive robot control technology. From the eye-tracker and EEG recording device to the shared controller and actuation system, data flows between the components and is processed and parsed at each stage. Reliable data transfer and coordination ensure the real-time performance and stability of the system so that the user’s intent is accurately communicated to the robot and precisely executed. In addition, the underlying logic includes error handling and fault-tolerance mechanisms. The system must be able to detect and handle potential errors or anomalies, such as sensor failures, communication interruptions, or motion errors, and the fault-tolerance mechanisms ensure that it handles and recovers from abnormal situations appropriately, preserving the reliability and safety of the system.
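One simple fault-tolerance pattern consistent with this description is a watchdog that commands a safe stop whenever a sensor stream goes stale. The timeouts and stream names in the sketch below are assumptions for illustration.

```python
# A watchdog that halts the arm if any sensor stream stops updating (illustrative).
import time

TIMEOUT = {"gaze": 0.2, "eeg": 0.5, "camera": 0.5}  # seconds of allowed silence (assumed)


def watchdog(last_update, stop_arm, now=None):
    """Return True if all streams are fresh; otherwise command a safe stop."""
    now = time.monotonic() if now is None else now
    stale = [name for name, limit in TIMEOUT.items()
             if now - last_update.get(name, 0.0) > limit]
    if stale:
        stop_arm()        # zero the velocity command and hold position
        return False
    return True
```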

Through a series of experiments and evaluations, the hybrid BCI-based assistive robot control technology has been verified to perform well in target specification, motion control, and grasping tasks. The system has broad application prospects in industrial production, medical care, education and training, service robotics, and entertainment. Future research directions include further improving the system’s performance and applicability, optimizing the control algorithms and human-machine interface, expanding the system’s application areas, and combining it with other advanced technologies to further enhance its perceptual ability and autonomy.

This assistive robot control technology is of great significance in the field of human-robot interaction and intelligent robotics, providing new ideas and methods for realizing automation solutions and artificial intelligence applications. The development of the assistive robotic technology based on hybrid BCI will facilitate the development of human-machine collaboration, improve productivity and quality of life, and promote scientific and technological innovation and social progress.

About WIMI Hologram Cloud

WIMI Hologram Cloud, Inc. (NASDAQ: WIMI) is a holographic cloud comprehensive technical solution provider that focuses on professional areas including holographic AR automotive HUD software, 3D holographic pulse LiDAR, head-mounted light field holographic equipment, holographic semiconductor, holographic cloud software, holographic car navigation and others. Its services and holographic AR technologies include holographic AR automotive application, 3D holographic pulse LiDAR technology, holographic vision semiconductor technology, holographic software development, holographic AR advertising technology, holographic AR entertainment technology, holographic AR SDK payment, interactive holographic communication and other holographic AR technologies.

Safe Harbor Statements

This press release contains “forward-looking statements” within the meaning of the Private Securities Litigation Reform Act of 1995. These forward-looking statements can be identified by terminology such as “will,” “expects,” “anticipates,” “future,” “intends,” “plans,” “believes,” “estimates,” and similar statements. Statements that are not historical facts, including statements about the Company’s beliefs and expectations, are forward-looking statements. Among other things, the business outlook and quotations from management in this press release and the Company’s strategic and operational plans contain forward-looking statements. The Company may also make written or oral forward-looking statements in its periodic reports to the US Securities and Exchange Commission (“SEC”) on Forms 20-F and 6-K, in its annual report to shareholders, in press releases and other written materials, and in oral statements made by its officers, directors or employees to third parties. Forward-looking statements involve inherent risks and uncertainties. Several factors could cause actual results to differ materially from those contained in any forward-looking statement, including but not limited to the following: the Company’s goals and strategies; the Company’s future business development, financial condition, and results of operations; the expected growth of the AR holographic industry; and the Company’s expectations regarding demand for and market acceptance of its products and services.

Further information regarding these and other risks is included in the Company’s annual report on Form 20-F and the current report on Form 6-K and other documents filed with the SEC. All information provided in this press release is as of the date of this press release. The Company does not undertake any obligation to update any forward-looking statement except as required under applicable laws.