RideVR: Reducing Sickness for In-Car Virtual Reality by Mixed-in Presentation of Motion Flow Information

Humans spend a significant portion of their daily life in cars. In this study, we investigate a method to reduce motion sickness and allow people to use virtual reality (VR) while riding in cars. As the sickness arises primarily from the sensory conflict between visual and actual (or vestibular) motions, the proposed approach attempts to resolve the mismatch by mixing in and visualizing the estimated information of the actual motion, which is sensed by the on-board diagnostics and inertial measurement unit modules attached to the vehicle. We conducted an experiment to validate our approach by comparing the sickness levels across three in-car VR usage conditions: (1) default – using the VR content without modification; (2) transparent wall – using the VR content with its background scene changing depending on the car motion; and (3) particle flow – mixing in the VR content with the estimated motion flow of the car visualized as moving particles. Our experimental results show that motion sickness is reduced significantly (although not eliminated to a negligible level) using our approach.


I. INTRODUCTION
Humans spend a significant portion of their time commuting, e.g., for work and travel. One study showed that adults in California spend an average of 111 minutes per day in transit (approximately 40 minutes of it commuting to work) [1], which constitutes approximately 10% of their daily activities [2]. Naturally, people spend their time in vehicles on activities such as taking a nap, listening to music, and reading books. The advent of smart mobile devices and computers has enabled people to perform more critical and involved activities such as social networking, playing games, writing papers/memos, and online meetings. In this regard, self-driving cars are expected to enable new and more productive or entertaining ways for people to spend their time in vehicles [3]. In fact, future self-driving cars are being designed to accommodate such requirements and new lifestyles, transforming the vehicle space into an office, living room, or entertainment center [4]. In this context, virtual reality (VR) is a promising medium that can be used effectively in vehicles for various purposes, ranging from casual games and entertainment to more serious immersive applications (such as remote business meetings).
However, a major obstacle hindering the wide (consumer-level) adoption of VR is motion sickness. The use of VR in vehicles can exacerbate the situation, as people are susceptible to motion sickness while being transported in vehicles. Upon closer observation, we find that the root cause of motion sickness is the same for both VR and transport, i.e., a sensory mismatch between visual feedback and the vestibular sense [5]. During transport, a person (depending on one's sensitivity) can feel sick if he/she does not peer out (visually) at the moving scenery but, instead, reads a book or uses a smartphone. In other words, the vestibular organ senses the movement and acceleration of the body, whereas the visual feedback is static. In VR, motion sickness typically arises in the reverse manner, where the user's body is stationary but the visual feedback possesses dynamic components, as exemplified in VR roller coasters. This is known as vection and is the most prominent cause of VR sickness [5]. Therefore, by aligning the movement of the vehicle (and thereby that of the user) to that of the VR content, it might be possible to mitigate the effect of vection and reduce motion sickness. This can be achieved by modeling and designing the content in advance such that the vehicle movement exactly matches the user movement in VR, as in the case of VR "on" roller coasters [6]. Although this approach has been shown to significantly reduce motion sickness, as intended, it is inflexible because the content can only be used for the target ride, and the exact replication of its movement in the virtual space is labor intensive.
Herein, we propose two methods to resolve the aforementioned sensory mismatch by mixing in and visualizing the estimated information of the actual motion in a non-invasive manner. First, the motion of the vehicle is sensed by an on-board diagnostics (OBD) module and an inertial measurement unit module attached to the vehicle. In the first method, a "transparent wall" adds a background scene to the original VR content, which changes depending on the sensed motion of the car. The scene is visible over the transparent wall or window, providing the user with the visual motion of the car. For example, the background can be a distant road scene whose pathway is distorted in real time owing to the actual motion of the car; this is a technique developed in our previous study [7]. In the second method, the VR content is mixed in and overlaid with the estimated motion flow of the car as moving particles. Figure 1 shows images of the two methods in addition to the default case, where the VR content is used in the car without modification.
In the next sections, we first review previous studies pertaining to motion/VR sickness and the administration of VR usage in moving vehicles. Because the "transparent wall" method has been explained in our previous publication [7], we only briefly summarize this approach herein. Section 4 outlines the two proposed methods of visualizing the estimated visual motion aligned with that of the car for mitigating the effect of vection and reducing sickness. In Section 5, we present details regarding the validation experiment and compare our methods with the baseline default case, in which users use a VR system without any provision for the sensory mismatch problem, with respect to user-perceived sickness levels. Finally, we present our findings and discuss the implications and possible methods to extend our approach to achieve sickness-free in-vehicle VR usage in the future.

II. RELATED WORK
Motion sickness refers to sickness due to riding in vehicles, including cars, buses, boats, and airplanes. Major symptoms include disorientation, headache, nausea, and ocular strain [8] [9]. Humans perceive motion through several sensory channels, but primarily through their eyes (visual) and the vestibular organs located in the ears. The vestibular organ detects abrupt changes in linear or rotational motion (acceleration) and balance with respect to the direction of gravity. Typically, humans in motion receive sensory stimulation in a manner consistent with both the visual and vestibular senses. The main cause of motion sickness is the "sensory mismatch" between visual and vestibular stimulation [5]. For example, while riding in a car, if a person reads a book instead of peering out the window at the moving scenery, the visual feedback will be static (relatively stationary book pages), whereas the vestibular organ senses the body motion, causing an inconsistent external stimulation and hence the aforementioned symptoms. Furthermore, motion sickness can occur because of the difference between actual and "expected" motion [10] [9]. The driver, as compared with the passenger, is much less likely to experience any motion sickness because he/she has complete control and knowledge of the vehicle in terms of its direction and motion profile (i.e., when to accelerate/decelerate). Although the passenger might know the travel path of the vehicle, he/she is often not aware of the momentary vehicle motion dynamics.

FIGURE 1. (a) Default case where the VR office is shown as being inside the car. (b) Visual motion is observed through the transparent wall in the background. (c) Visual motion is depicted by the approximated optical flow visualized as white flying radial particles.
Cyber or simulator sickness is similar to motion sickness but arises when using a motion simulator (instead of an actual moving vehicle). The motion simulator can be purely visual (with the user being stationary and only visual dynamics offered) or can involve limited user motion (e.g., redirected walking [11], in-place walking [12], or motion platforms). The sickness, in this case, is caused by the same sensory mismatch but in a reverse manner, where the user's body is typically stationary but the visual feedback possesses dynamic components, as exemplified in VR roller coasters. Hence, when riding in a vehicle and consuming VR content, as the visual feedback through the VR display will generally not match the actual user/vehicle motion, motion sickness will likely be experienced. Simulator sickness can also be caused by other factors, such as system (motion-to-photon) latency, accommodation and convergence mismatch in stereoscopic viewing, and distorted imagery from the headset optics [5]. However, in this study, we focused only on reducing the effect of vection.

Extensive efforts have been expended to enable the use of VR in moving vehicles. Holoride [13] is a commercial service framework that provides a VR experience in vehicles without causing motion or simulator sickness. The purpose and approach of Holoride appear to be similar to our proposed technique; however, as a commercial product, the exact methodology is not known. Based on the brief description provided on the Holoride web page, we can only presume that Holoride, similar to our approach, aligns the dynamics of the VR content with that of the vehicle.
Most early approaches to VR usage in vehicles (implicitly with a reduced level of sickness) are based on replicating actual pathways in a virtual space [6] [14] [15] [16] but with added or altered roadside content. For instance, in [17], an in-car VR-based flying simulator was described, where a helicopter flew based on the actual movement of a car. Because the virtual helicopter was flying in the air (free space) instead of traversing a particular pathway, content modeling and replication of the real-world pathway were less constrained. The authors demonstrated that when the simulator was operated with a synchronized motion profile, the user experienced much less sickness (compared with operating it in the stationary mode) as well as a higher level of presence and immersion. Paredes et al. investigated the effect of mindfulness VR content (which often features calm and peaceful scenery) with respect to the kinesthetic congruency between the content and interaction type [18]. The authors discovered that the users were more engaged when the dynamic VR content was accompanied by actual motion, which was manually synchronized and aligned to that of VR [18].
McGill et al. investigated the effects of visual motion cues on sickness in in-car VR [19]. They proposed blending an arbitrary background scene, moving based on the motion of the car, into the main scene to reduce the degree of vection and sickness. However, no definite results were obtained regarding the method's ability to reduce/eliminate in-car VR-induced sickness. Additionally, the same authors introduced a method to subtract (or correct) the motion of the car from the motion sensed by the VR headset such that the VR content can be viewed as unaffected by the car motion. When viewing VR content using the headset, the view direction is controlled by the gyro sensor embedded in the headset that detects the head rotation. In a moving vehicle, this gyro detects not only the user rotation, but also the rotation of the car, resulting in an unintended rotation of the displayed scene. The motion of the vehicle itself can be obtained directly from the sensors in the OBD module [16] [17] [19] [20] or by attaching separate motion sensors to the vehicle, such as a global positioning system (GPS) receiver and an inertial measurement unit (IMU). Haeling et al. suggested infrared-light-based tracking of the user's pose in a car. Such a non-IMU-based approach eliminates the necessity to modify the readings of the gyro sensor [20].
In some of the studies described above, the effect of vection was alleviated by matching or adding in visual feedback, as would be created by the actual motion. This corresponds to replicating the optical or motion flow of the car. In fact, this is a plausible idea, as human visual motion perception is primarily based on the optical flow, as first discovered by Gibson [21] and further confirmed in brain science research [22] [23]. In follow-up studies, methods have been devised to modulate the degree or characteristics of vection by manipulating the optical flow form [24], overlapping different directional/rotation flows [25], and mixing in visual noise [26]. In fact, the human brain can differentiate between the visual motion caused by oneself and those caused by objects in the visual field, particularly when facilitated by the vestibular sense [27]. Both the "transparent wall" and optical flow mimicking "particle flow" introduce additional visual motion information to the main virtual scene to match the motion of the car with that sensed by the user's vestibular sense.

III. ROADVR
Previously, we introduced the concept of RoadVR [7], in which a particular type of VR content was assumed to be a road-level navigation scene and initially modeled as an infinitely straight road (with landmarks). RoadVR distorts the pathway in real time based on the actual movement of the car such that the visual feedback matches the corresponding movement as sensed by the user's vestibular sense (see Figure 2). In its current form, the method is restricted to road/street-level navigational VR content, such as racing or chasing games and driving simulators.
The motion of the vehicle sensed by the GPS/IMU module is reflected by navigating the virtual scene based on the vehicle motion, as well as by distorting the originally straight pathway appropriately. In other words, when turning right, the road is distorted to curve to the right. This consequently aligns the vehicle motion with the user's expectation (when moving right, one would expect the road to curve to the right) as well as with the visual and vestibular senses (in this case, the resulting optical flow is expected to match the vestibular sense). Equations (1) and (2) express the effects of the movement profile of a vehicle on the distortion of a three-dimensional (3D) road-based model.

FIGURE 2. Virtual space distorted in real time (right) to visually match the actual driving pathway and user motion - curving to the right (left). The bottom right corner inset shows the original "infinitely straight" pathway, which is dynamically distorted to match the curving-to-right motion of the vehicle.
p'_i = p_i + δ_i, (1)

δ_i = k_i (r / v_i) d, (2)

where p'_i corresponds to the distorted vertex, p_i the original 3D vertex of the model, δ_i the amount of distortion, k_i an arbitrary scale-adjusting constant, i the x- or y-dimension (note that δ_z = 0), r the rotational velocity, v_i the lateral velocity, and d the depth of the vertex from the viewpoint. At every frame, each vertex in the scene is adjusted and displaced by three factors: the rotational (angular) velocity, the lateral velocity, and the depth of the vertex from the viewpoint.
Only the x- and y-dimensions of the positions are affected, i.e., distortion occurs only in the left-right and top-down directions. The arbitrary constant k_i adjusts the scale of the distortion in the respective dimension (set to k_x = 1.0 and k_y = 0.15 based on trial-and-error). In general, a vertex is distorted more when it is farther from the viewpoint, as well as when the lateral speed is lower. Furthermore, the distortion is proportional to the rotational velocity in the respective directions.
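The per-vertex displacement described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the formula is our reading of Equations (1) and (2) from the variable definitions (distortion proportional to rotational velocity and depth, inversely proportional to lateral velocity), and the function name and the zero-speed guard are our own additions.

```python
def distort_vertex(p, r, v, d, k_x=1.0, k_y=0.15):
    """Displace one road vertex to reflect the car's turning motion.

    p: (x, y, z) original vertex position; r: rotational (angular) velocity;
    v: lateral velocity; d: depth of the vertex from the viewpoint.
    Distortion grows with rotation and depth, shrinks with speed,
    and never touches z (delta_z = 0).
    """
    x, y, z = p
    v = max(abs(v), 1e-3)          # guard against a standstill (our addition)
    delta_x = k_x * (r / v) * d    # reconstructed Eq. (2): delta_i = k_i (r / v) d
    delta_y = k_y * (r / v) * d
    return (x + delta_x, y + delta_y, z)   # Eq. (1): p'_i = p_i + delta_i
```

Applied to every vertex each frame, a right turn (positive r) pushes distant road vertices rightward more than near ones, bending the straight road into a right-curving one.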
In addition to the 3D space distortion, the virtual car movement was scaled to match that in the real space as closely as possible. We used an ad-hoc method to calibrate the scale by matching the real-world metric movement to pixel values using a landmark; the size of the dashboard of the car in the real and virtual spaces was used to derive the approximate movement scale between the real and virtual movements. It is noteworthy that the virtual dashboard also served as a reference object that can potentially reduce motion sickness [28].
The distortion of the road rendered the resulting optical flow field consistent with the user's expectation. Figure 3 shows that the optical flow pattern was similar between the actual road and the corresponding virtual scene after applying the RoadVR technique. The comparative experiment also showed a significant reduction in sickness level when RoadVR was used, although not a complete elimination.

IV. RIDEVR
Although we validated the sickness reduction effect of RoadVR by distorting the navigation path based on the vehicle motion, the application possibilities are still limited. In this section, we present RideVR (an extension of RoadVR), which allows motion synchronization to be applied to relatively static VR content while a person is riding in a vehicle. Examples include working from a virtual office, listening to a 3D immersive online lecture, or socializing in a 3D social metaverse.
The basic idea is to use the metaphor of an "open" camping car, in which the interior is used as the primary operating space (e.g., office, lecture, or social networking space), whereas the visual motion (set by the sensed actual motion) is observed in the distant background over the windows/transparent walls. Hence, the visual motion is partially supplied in the periphery (termed "transparent wall," see Figure 1(b)). As the scene over the window may either attract little attention or be distracting, an alternative idea is to present the visual motion of the vehicle as an approximated optical flow in terms of flying particles or a linear flow (termed "particle flow," see Figure 1(c)) directly over the scene (with no windows). The underlying idea is to partially supply the visual motion based on the vehicle motion while interfering with the main content as little as possible.

Figure 4 shows the overall system configuration of RideVR (essentially the same as that of RoadVR) as well as the typical usage situation. RideVR would be used by a person sitting in the passenger seat and wearing a VR headset (or a non-immersive mobile media device) equipped with an internal gyro sensor for detecting head motion (for view control). In our case, we used the Samsung GearVR [29].

A. SENSING VEHICLE MOTION AND STABILIZATION/CORRECTION
In addition to the user's motion, the vehicle movement is separately detected by a motion sensor module interfaced to the OBD and a separate IMU. The sensor values were filtered by moving averages, and the lateral and rotational velocities and accelerations were computed and sent over the wireless network, at rates of 1 Hz (OBD) and 60 Hz (IMU), respectively, to the user's smartphone (Samsung Galaxy S8) mounted in the Samsung GearVR headset. Subsequently, the values from the vehicle motion sensor module are subtracted from the user's rotational movement as sensed by the smartphone, to isolate the user's rotation with respect to that of the vehicle, such that the user can stably view the virtual space, unaffected by the vehicle rotation. The user's position with respect to the vehicle was assumed to be stationary (seated).
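The filtering and rotation-correction steps above can be illustrated with a short, hypothetical sketch. The class and function names, the window size, and the reduction to a single yaw angle are our own simplifications (the actual system presumably handles full 3-DOF rotation); only the moving-average smoothing and the subtraction of vehicle rotation from headset rotation come from the text.

```python
from collections import deque

class VehicleMotionFilter:
    """Moving-average smoothing of raw vehicle sensor samples (illustrative)."""

    def __init__(self, window=5):
        self.buf = deque(maxlen=window)   # keep only the last `window` samples

    def update(self, sample):
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)

def user_yaw_relative_to_car(headset_yaw_deg, vehicle_yaw_deg):
    """Isolate the user's head rotation.

    The headset gyro senses head rotation plus car rotation, so the
    car's (filtered) yaw is subtracted; the result is wrapped to
    [-180, 180) degrees so the view never jumps across the seam.
    """
    return (headset_yaw_deg - vehicle_yaw_deg + 180.0) % 360.0 - 180.0
```

With this correction, a car turning right under a user holding his/her head still yields a relative yaw near zero, so the virtual office stays put.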

B. OPTICAL FLOW LIKE PATTERN
The visual motion observed over the "transparent wall" was created in the same manner as in the original RoadVR concept (see Section 3). However, the moving road was placed and observed in the background of the front and primarily static scene (such as the virtual office). The transparency value of the window/wall was set empirically to 60%. Hence, the optical-flow-like pattern was conveyed in a limited manner - in the periphery, half transparent, and partly visible (see Figure 1 (middle)). As for the "particle flow," an "expanding" optical-flow-like pattern (for forward movement; in this experiment, the car only moved forward), visualized using radiating particles, was generated and laid over the scene as follows (see Figure 1, bottom): A particle system was installed in the center of the 3D scene (this position was not visible, as the walls of the office were no longer transparent). The particles were continuously emitted along the direction of the road. It is noteworthy that the road itself was not visible because it was outside the main scene and obstructed by walls. However, the particles were rendered over the entire scene and appeared to be moving radially toward the user. As the car (displaying the main scene) changed its direction, the particle positions were distorted, as in RoadVR (see Section 3). Hence, the center of the radial flow appeared to move right or left by an amount scaled by the rotational movement of the vehicle, as illustrated in Figure 5. Various parameters of the particle system, such as the number of particles, birth rate (speed), and dispersion angle, were set by trial-and-error to minimize distraction from the main scene. In the future, we plan to replicate the optical flow more precisely.
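The particle-flow behavior described above can be sketched as follows. This is an illustrative 2D (screen-space) approximation under our own assumptions: the function names, the offset scale constant, and the random outward emission direction are hypothetical stand-ins for the trial-and-error particle-system parameters mentioned in the text.

```python
import math
import random

def flow_center_offset(rot_velocity, scale=2.0):
    """Shift the focus of expansion left/right by an amount scaled by
    the vehicle's rotational velocity (scale is an assumed constant)."""
    return scale * rot_velocity

def emit_particle(center_x, center_y, speed):
    """Spawn one particle at the (possibly shifted) flow center with a
    random outward direction; stepping it each frame yields a radially
    expanding, optical-flow-like pattern."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    return {"x": center_x, "y": center_y,
            "vx": speed * math.cos(angle),
            "vy": speed * math.sin(angle)}

def step(particle, dt=1.0 / 60.0):
    """Advance a particle one frame; straight-line motion approximates
    the expanding flow seen during forward vehicle movement."""
    particle["x"] += particle["vx"] * dt
    particle["y"] += particle["vy"] * dt
    return particle
```

During a right turn, `flow_center_offset` returns a positive offset, so newly emitted particles radiate from a center displaced to the right, mimicking the shifted focus of expansion in Figure 5.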

V. EXPERIMENT

A. EXPERIMENTAL DESIGN
The experiment was designed as a one-factor (three-level) within-subject single-measure design, the factor being the type of optical-flow-like motion visualization mixed into the content. The three main comparison conditions (as defined by the factor level) were: (1) "Car ride while viewing VR" (C1); (2) "Car ride while viewing VR with the transparent wall sickness mitigation technique" (C2); and (3) "Car ride while viewing VR with the particle flow sickness mitigation technique" (C3).
To gain another, more objective perspective for interpreting the results with regard to the sickness levels, these three conditions were also compared to the following three baseline conditions (unrelated to the experimental factor): (1) "Car ride without any VR viewing" (B1); (2) "VR viewing only, using the headset while the user is stationary" (B2); and (3) "Car ride while reading a book" (B3). B1 represents the sickness situation owing to the motion of car riding alone (without any VR viewing). B2, on the other hand, represents the sickness situation due to viewing VR alone (without any vehicle riding, the user being stationary). The techniques suggested in this work are expected to reduce the sickness emanating from the vehicle motion (motion sickness) in addition to that from the virtual content (simulator sickness). B3 is the sickness situation due to a non-VR task while riding in a car and is similar to the case of C1 (viewing VR while riding). Both C1 and B3 are expected to exhibit severe sickness levels and also constitute an interesting comparison. Table 1 summarizes all the test conditions.

We primarily measured the motion/simulator sickness levels using the SSQ (Simulator Sickness Questionnaire) developed by Kennedy et al., which involves 16 symptom-specific questions. All questions were answered on a four-point Likert scale (0 = no sickness, 3 = severe sickness), where a higher score corresponds to a higher sickness level. The subcategory scores were scaled by the weight factors indicated in [30] for comparison (N = 9.54; O = 7.58; D = 13.92).
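The SSQ scoring used here can be expressed compactly. The per-cluster weights below are the ones stated in the text (from [30]); the total-severity constant 3.74 comes from the standard SSQ formulation and is our addition, as the text does not state it, and the function name is illustrative.

```python
# SSQ cluster weights as given in the text (Kennedy et al. [30]);
# the 3.74 total-severity constant is assumed from the standard SSQ.
W_N, W_O, W_D, W_TOTAL = 9.54, 7.58, 13.92, 3.74

def ssq_scores(raw_n, raw_o, raw_d):
    """Scale the raw per-cluster symptom sums (each of the 16 items is
    rated 0-3, and items contribute to one or more clusters) into the
    weighted SSQ subscores and total severity."""
    return {
        "nausea": raw_n * W_N,
        "oculomotor": raw_o * W_O,
        "disorientation": raw_d * W_D,
        "total": (raw_n + raw_o + raw_d) * W_TOTAL,
    }
```

For example, raw cluster sums of 2, 1, and 1 yield a Nausea subscore of 19.08 and an Oculomotor subscore of 7.58.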

B. EXPERIMENTAL PROCEDURE
A total of 15 paid (KRW 10,000) subjects (14 males and 1 female between the ages of 20 and 25 years; mean = 24.67, SD = 0.62) participated in the experiment. After the consent form was filled out and basic background information regarding the subjects was obtained, the subjects were briefed on the purpose of the experiment and provided with instructions for the experimental task. Six subjects indicated that they had used an HMD-based VR system (although only occasionally), but not while riding in a car. The experiment comprised two separate measurement sessions: one for Conditions B1 and C1-C3 (the three "VR in car" conditions) and the other for Conditions B2 and B3 (baselines).
In the first session, Conditions C1-C3 were administered to each subject in a Latin-square balanced order. Before experiencing C1-C3, the subject rode in the car for 5 min (on less crowded city streets in the daytime) and filled out the motion sickness questionnaire to assess the "before VR" level, which corresponds to the baseline situation, B1. Between the trials, the subject was allocated at least 10 min of rest, within which the sickness was measured to be negligible (e.g., a total SSQ score of less than 30, as recommended by [30]).
Subsequently, the subject sat on the passenger side, wore the GearVR headset, and experienced the three VR in-car conditions (C1-C3). The car was driven by a helper around a rectangular course on local streets (City of Asan, Chungcheong Province, Korea) five times (approximately 5.5 km), at a maximum speed of 60 km/h, which required approximately 5 min. There were six signal lights to pass through from start to finish in each round, and the car would typically accelerate to some maximum speed and then decelerate to stop at the lights. However, adherence to exact acceleration or velocity values was not enforced. The course was primarily flat. The detailed course map is shown in Figure 6.
Because the experiment was performed in the city with actual traffic, each treatment could not be performed in the exact same manner. Our purpose was to validate and ascertain the gross effect of significantly reducing sickness using our proposed method. After each ride, the subjects filled out the simulation sickness questionnaire again to record the effects. After experiencing all treatments, a few additional questions were posed regarding their preferences (among C1-C3), and post-briefings were conducted. A similar process was used to obtain the data for B2-B3.

C. RESULTS
A one-way ANOVA with Tukey HSD pairwise comparisons was applied to analyze the effects of the sickness reduction techniques. Figure 7(a) and Table 2 show the experimental results with regard to the changes in sickness levels for B1 and C1-C3 (before and after). As expected, the sickness level was the lowest for B1 (Baseline 1): casual riding in the car would be unlikely to induce motion sickness, except for a very sensitive person. C1, as expected, caused the highest level of sickness, as both the motion and simulation factors were additively present. The differences between B1 and C1, C2, and C3 were statistically significant in all sickness categories (see Table 2 for exact statistical figures). C2 and C3 (the applications of the two sickness reduction techniques) both showed decreases in the sickness level compared to C1. Statistically significant differences were found only between C1 and C3 (Particle flow), in the Oculomotor and Disorientation categories (paired t-test: t = 1.10, p < 0.027 and t = 3.375, p < 0.008, respectively). On the other hand, no statistically significant differences were found between the two proposed approaches, namely, C2 (Transparent wall) and C3 (Particle flow). Still, the overall trend was quite clear: both the Transparent wall and the Particle flow had an effect in reducing the sickness level. Subjects also preferred the Transparent wall (6 out of 15) and Particle flow conditions.

To assess the relative levels of sickness, we performed an indirect and informal comparison with the other baseline cases (a direct comparison was not possible, as the data for B2-B3 were obtained separately). Figure 7(b) shows the measured sickness levels of B1-B3.
The sickness levels purely from VR (B2) and from reading a book while riding in a car (B3) were much higher than that of B1 (casual riding only), as expected, with statistically significant differences in the Oculomotor (pairwise t-test: t = -2.44, p = 0.044 and t = -2.46, p = 0.043) and Disorientation (pairwise t-test: t = -2.96, p = 0.021 and t = -2.62, p = 0.034) categories. The levels of sickness for B2 and B3 were comparable to that of C1 (where VR was used in a moving car without any provision for sickness reduction) and higher than those in C2 and C3 (our two suggested methods). This suggests that the two proposed optical-flow-like pattern visualization methods (C2 and C3) are promising for reducing the sickness level.
The post-briefings were consistent with the quantitative results -most subjects reported that they experienced extremely high levels of motion/VR sickness in C1, unlike in C2 and C3, where feelings of peace were experienced. Additionally, for C3, the participants reported that they felt distracted as well as a sense of speed owing to the visual effect of the flying particles.
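For reference, the one-way ANOVA F statistic used in the analysis above can be computed directly from group samples, as the following self-contained sketch shows (the data in the usage note are illustrative, not the measured SSQ scores).

```python
def one_way_anova_F(groups):
    """Compute the one-way ANOVA F statistic by hand:
    F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)                                  # number of conditions
    n = sum(len(g) for g in groups)                  # total observations
    grand = sum(sum(g) for g in groups) / n          # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)
```

For example, three groups [1, 2, 3], [2, 3, 4], and [3, 4, 5] give F = 3.0; the resulting F is then compared against the F distribution (and followed by Tukey HSD pairwise tests) to assess significance.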

VI. DISCUSSION AND CONCLUSION
Herein, we presented two methods to synchronize the visual motion of a car (in which the user is riding) to that of the VR content that the user is viewing, to reduce the level of motion/simulator sickness. The methods are based on augmenting the VR content with the estimated motion information of the car (1) indirectly, as a distant moving scene observed over the window, or (2) as a particle-based motion flow. Unlike in our previous study, the methods are not associated with a particular form of application and can be used in any VR application. Although the experimental results did not indicate sickness reduction effects of statistical significance compared with the baseline case, a clear promising trend was observed.

FIGURE 7. Comparisons of sickness levels among different test conditions (B1 and C1-C3, and B1-B3). B1 represents sickness during casual driving only; C1, with VR; C2, with VR and transparent wall; C3, with VR and particle flow. In general, C2 and C3 indicated reduced sickness levels. In particular, B1 indicated statistically significant differences in the Oculomotor and Disorientation categories. Sickness levels purely from VR (B2) and from reading a book while riding in a car (B3) were comparable to that of C1, where VR was used in the moving car without any provision for sickness reduction. The horizontal lines and * marks indicate statistically significant differences from the pairwise t-test.
There are several limitations to our study. For one, the number of subjects was limited because of the COVID-19 pandemic. To increase the validity of our experimental results, we hope to continue our research with a sufficiently large number of subjects. Nevertheless, we believe that our initial results are promising. The effect of our approach on simulator sickness by gender may be another interesting future research direction. This could not be assessed in our current study because we recruited only one female subject (vs. 14 males). While we used the SSQ [30], as it is the most popular and often-used method to assess simulator sickness, a more accurate and objective assessment would require the use of, e.g., physiological signals or online measurements to further validate our results in the future. The vehicle motion was not controlled exactly (other than keeping the speed under 60 km/h), as the test was conducted in real traffic. A more careful study would reveal the correlation between the vehicle motion characteristics and the sickness reduction effects.
Several other future studies are planned. First, the optical flow, which is critical in inducing visual motion, was not replicated precisely based on the actual motion of the car. Depicting the actual optical flow, for example, extracted from the live camera feed of the car, might be problematic owing to latency and to disruptions of the optical flow caused by minor objects in the scene (e.g., pedestrians in the streets).
Additionally, despite the promising trend, similar to the study presented in [7], the sickness was not reduced entirely to a negligible level. Other known techniques to reduce motion and VR sickness should be combined, such as the use of a reference object [28], peripheral blurring [31], adjustment of the FOV (field of view) size [32], and the use of change blindness [11].
The IMU sensor was affected by drift and shifted the view orientation gradually (we installed a button such that the user could manually reset the IMU sensor). Furthermore, the car exhibited rolling movements, which were not considered in the approach. The particle flow visualization was invasive to the main content and was reported as distracting by many subjects. More sophisticated and accurate sensing and software processing that reflects the vehicle movement to the virtual space, replicating visual feedback that completely matches the corresponding vestibular sense, should be developed.

GERARD J. KIM received his BS in Electrical and Computer Engineering from Carnegie Mellon University (1987) and his MS and PhD in Computer Science from the University of Southern California (1994). He is currently a Professor of Computer Science and Engineering at Korea University. Gerard's main research interests are human computer interaction, virtual/mixed reality, and computer music.