<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article article-type="research-article" dtd-version="2.3" xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Virtual Real.</journal-id>
<journal-title>Frontiers in Virtual Reality</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Virtual Real.</abbrev-journal-title>
<issn pub-type="epub">2673-4192</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">738613</article-id>
<article-id pub-id-type="doi">10.3389/frvir.2021.738613</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Virtual Reality</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Haplets: Finger-Worn Wireless and Low-Encumbrance Vibrotactile Haptic Feedback for Virtual and Augmented Reality</article-title>
<alt-title alt-title-type="left-running-head">Preechayasomboon and Rombokas</alt-title>
<alt-title alt-title-type="right-running-head">Haplets</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Preechayasomboon</surname>
<given-names>Pornthep</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="corresp" rid="c001">&#x2a;</xref>
<uri xlink:href="https://loop.frontiersin.org/people/1371485/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Rombokas</surname>
<given-names>Eric</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/923851/overview"/>
</contrib>
</contrib-group>
<aff id="aff1">
<label>
<sup>1</sup>
</label>Rombolabs, Mechanical Engineering, University of Washington, <addr-line>Seattle</addr-line>, <addr-line>WA</addr-line>, <country>United&#x20;States</country>
</aff>
<aff id="aff2">
<label>
<sup>2</sup>
</label>Electrical Engineering, University of Washington, <addr-line>Seattle</addr-line>, <addr-line>WA</addr-line>, <country>United&#x20;States</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>
<bold>Edited by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/1038613/overview">Daniel Leithinger</ext-link>, University of Colorado Boulder, United&#x20;States</p>
</fn>
<fn fn-type="edited-by">
<p>
<bold>Reviewed by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/917877/overview">Pedro Lopes</ext-link>, University of Chicago, United&#x20;States</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/1370225/overview">Ken Nakagaki</ext-link>, Massachusetts Institute of Technology, United&#x20;States</p>
</fn>
<corresp id="c001">&#x2a;Correspondence: Pornthep Preechayasomboon, <email>prnthp@uw.edu</email>
</corresp>
<fn fn-type="other">
<p>This article was submitted to Haptics, a section of the journal Frontiers in Virtual Reality</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>20</day>
<month>09</month>
<year>2021</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>2</volume>
<elocation-id>738613</elocation-id>
<history>
<date date-type="received">
<day>09</day>
<month>07</month>
<year>2021</year>
</date>
<date date-type="accepted">
<day>06</day>
<month>09</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2021 Preechayasomboon and Rombokas.</copyright-statement>
<copyright-year>2021</copyright-year>
<copyright-holder>Preechayasomboon and Rombokas</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these&#x20;terms.</p>
</license>
</permissions>
<abstract>
<p>We introduce Haplets, a wearable, low-encumbrance, finger-worn, wireless haptic device that provides vibrotactile feedback for hand tracking applications in virtual and augmented reality. Haplets are small enough to fit on the back of the fingers and fingernails while leaving the fingertips free for interacting with real-world objects. Through robust physically-simulated hands and low-latency wireless communication, Haplets can render haptic feedback in the form of impacts and textures, and supplements the experience with pseudo-haptic illusions. When used in conjunction with handheld tools, such as a pen, Haplets provide haptic feedback for otherwise passive tools in virtual reality, such as for emulating friction and pressure-sensitivity. We present the design and engineering for the hardware for Haplets, as well as the software framework for haptic rendering. As an example use case, we present a user study in which Haplets are used to improve the line width accuracy of a pressure-sensitive pen in a virtual reality drawing task. We also demonstrate Haplets used during manipulation of objects and during a painting and sculpting scenario in virtual reality. Haplets, at the very least, can be used as a prototyping platform for haptic feedback in virtual reality.</p>
</abstract>
<kwd-group>
<kwd>haptics</kwd>
<kwd>virtual reality</kwd>
<kwd>augmented reality</kwd>
<kwd>spatial computing</kwd>
<kwd>sensory feedback</kwd>
<kwd>human computer interface</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<title>1 Introduction</title>
<p>Hands can be considered the most dexterous tool that a human naturally possesses, making them the most obvious input modality for virtual reality (VR) and augmented reality (AR). In productivity tasks in AR and VR, natural hand tracking enables seamless context switching between the virtual and physical worlds, i.e.,&#x20;not having to first put down a controller to interact with a physical keyboard. However, a major limitation is the lack of haptic feedback, which leads to a poor experience in scenarios that require manual dexterity, such as object manipulation, drawing, writing, and typing on a virtual keyboard (<xref ref-type="bibr" rid="B10">Gupta et&#x20;al., 2020</xref>). Using natural hand input invokes the visuo-haptic neural representations of held objects when they are seen and felt in AR and VR (<xref ref-type="bibr" rid="B20">Lengyel et&#x20;al., 2019</xref>). Although haptic gloves seem promising for rendering a realistic sense of touch and textures, or for providing kinesthetic impedance in the virtual or augmented space, wearing a glove that covers the fingers greatly reduces the tactile information received from physical objects outside the augmented space. Thus, a solution that provides believable haptic feedback with the lowest possible encumbrance is desirable.</p>
<p>Numerous research devices have shown that there is value in providing rich haptic feedback to the fingertips during manipulation (<xref ref-type="bibr" rid="B15">Johansson and Flanagan, 2009</xref>; <xref ref-type="bibr" rid="B29">Schorr and Okamura, 2017</xref>; <xref ref-type="bibr" rid="B12">Hinchet et&#x20;al., 2018</xref>; <xref ref-type="bibr" rid="B19">Lee et&#x20;al., 2019</xref>), texture perception (<xref ref-type="bibr" rid="B4">Chan et&#x20;al., 2021</xref>), stiffness perception (<xref ref-type="bibr" rid="B27">Salazar et&#x20;al., 2020</xref>), and normal and shear force perception (<xref ref-type="bibr" rid="B16">Kim et&#x20;al., 2018</xref>; <xref ref-type="bibr" rid="B24">Preechayasomboon et&#x20;al., 2020</xref>). Although these devices may render high fidelity haptic feedback, they often come at the cost of being tethered to another device or having bulky electronics that impede the wearability of the device and ultimately hinder the immersion of the VR experience. Additionally, once devices are placed on the fingertips, any interaction with objects outside the virtual space is rendered impossible unless the device is removed or put down. <xref ref-type="bibr" rid="B32">Teng et&#x20;al. (2021)</xref> have shown that wearable, wireless, low-encumbrance haptic feedback on the fingertips is useful for AR scenarios, with a prototype that leaves the fingertips free when haptic feedback is not required. For haptic devices to follow the growing adoption of virtual reality, they must be as frictionless to the user as possible; wearable haptic devices are no exception.</p>
<p>It has been shown that rendering haptic feedback away from the intended site does provide meaningful sensations that can be interpreted as proxies for the interactions at the hand (<xref ref-type="bibr" rid="B23">Pezent et&#x20;al., 2019</xref>), or for mid-air text entry (<xref ref-type="bibr" rid="B10">Gupta et&#x20;al., 2020</xref>). <xref ref-type="bibr" rid="B2">Ando et&#x20;al. (2007)</xref> have shown that rendering vibrations on the fingernail can augment passive touch-sensitive displays to create a convincing perception of edges and textures; others have extended this technique to projector-based augmented reality (<xref ref-type="bibr" rid="B25">Rekimoto, 2009</xref>), and even used the fingernail as a haptic display itself (<xref ref-type="bibr" rid="B13">Hsieh et&#x20;al., 2016</xref>). We have shown that there is a perceptual tolerance for conflicting locations of visual and tactile touch, in which the two sensory modalities are fused into a single percept despite arising from different locations (<xref ref-type="bibr" rid="B3">Caballero and Rombokas, 2019</xref>). Furthermore, combining multiple modalities, whether by augmenting otherwise passive haptic sensations (<xref ref-type="bibr" rid="B5">Choi et&#x20;al., 2020</xref>), using pseudo-haptic illusions (<xref ref-type="bibr" rid="B1">Achibet et&#x20;al., 2017</xref>; <xref ref-type="bibr" rid="B28">Samad et&#x20;al., 2019</xref>), or employing a believable simulation (<xref ref-type="bibr" rid="B17">Kuchenbecker et&#x20;al., 2006</xref>; <xref ref-type="bibr" rid="B4">Chan et&#x20;al., 2021</xref>), can possibly mitigate the lack of congruence between the visual and tactile sensations. We therefore extend what <xref ref-type="bibr" rid="B2">Ando et&#x20;al. (2007)</xref> proposed to immersive virtual reality by placing the haptic device on the fingernail and finger dorsum and compensating for the distant stimulation with believable visual and haptic rendering, which leaves the fingerpads free to interact with real-world objects.</p>
<p>With the hands now free to hold and interact with physical objects, any passive object can become a tangible prop or tool. These held tools can provide passive haptic feedback while presenting familiar grounding and pose for the fingers. GripMarks (<xref ref-type="bibr" rid="B33">Zhou et&#x20;al., 2020</xref>) has shown that everyday objects can be used for mixed reality input by using the hand&#x2019;s pose as an estimate to derive the object being held. In this paper, we further this concept by introducing Haplets: small, wireless, and wearable haptic actuators. Each Haplet is a self-contained unit that consists of the bare minimum required to render vibrotactile stimuli wirelessly: a linear resonant actuator (LRA), a motor driver, a wireless system-on-a-chip (SoC), and a battery. Haplets are worn on the dorsal side of the finger and fingernail, and have a footprint small enough that the hands can still be tracked using computer vision methods. Combined with a believable simulation for rendering vibrotactile feedback in VR, Haplets can be used to augment the sensation of manipulation, textures, and stiffness for bare hands while still maintaining the ability to pick up and handle everyday objects outside the virtual space. With a tool held in the hand, Haplets can render haptic effects to emulate the sensations when the tool interacts with the virtual environment. We use Haplets as an exploration platform towards building low-encumbrance, wearable haptic feedback devices for virtual and augmented reality.</p>
<p>The rest of this paper is organized as follows: first, in <xref ref-type="sec" rid="s2">Section 2</xref>, we describe the hardware for each Haplet and the engineering choices made for each component, including our low-latency wireless communication scheme. We then briefly cover the characterization efforts for the haptic actuator (the LRA). Then, we cover our software efforts in creating a physically-believable virtual environment that drives our haptic experiences, including physics-driven virtual hands and augmented physical tools. In <xref ref-type="sec" rid="s3">Section 3</xref>, we cover a small user study to highlight one use case of Haplets and explore the practicality of Haplets in a virtual reality scenario. In <xref ref-type="sec" rid="s4">Section 4</xref>, we demonstrate other use cases for Haplets in virtual or augmented reality environments, such as manipulation, texture discrimination, and painting with tools. Finally, in <xref ref-type="sec" rid="s5">Section 5</xref>, we discuss our engineering efforts and the results of our user study, and provide insight into shortcomings and potential future&#x20;work.</p>
</sec>
<sec id="s2">
<title>2 Materials and Methods</title>
<p>Haplets can be thought of as distributed wireless wearable haptic actuators. As mentioned previously, each Haplet consists of the bare minimum required to render haptic effects: an LRA, a motor driver, a wireless SoC, and a battery (<xref ref-type="fig" rid="F1">Figure&#x20;1A</xref>). We minimized the footprint so that Haplets, beyond our target area of the finger, can be worn on other parts of the body such as the wrist, arms, or face, or integrated into other, larger systems. Haplets are also designed to drive other actuator types, such as voice coil motors (VCMs), eccentric rotating mass (ERM) actuators, and small brushed DC motors.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption>
<p>An overview of Haplets. <bold>(A)</bold> The components comprising a Haplet unit are shown, including the three key elements: the wireless SoC, the LRA, and the built-in battery. <bold>(B)</bold> Haplets are attached to the fingernails via fingernail-mounted magnets. The magnets are embedded in a plastic housing that is attached to the fingernail using double-sided adhesive. <bold>(C)</bold> The user wears Haplets on the thumb, index finger, and middle finger. An example of a manipulation scenario as seen in VR compared to the real world is shown. <bold>(D)</bold> When used with a tool (a pen, as shown), Haplets can augment the virtual representation of the tool in VR by providing vibrotactile feedback in addition to the passive haptic feedback provided by the finger&#x2019;s grounding on the&#x20;tool.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g001.tif"/>
</fig>
<sec id="s2-1">
<title>2.1&#x20;Finger-Mounted Hardware</title>
<p>The core electronic components of each Haplet, as shown in <xref ref-type="fig" rid="F1">Figure&#x20;1A</xref>, are contained on one side of a 13.7&#xa0;mm by 16.6&#xa0;mm PCB, while the other side of the PCB holds a coin cell socket. We use a BC832 wireless module (Fanstel) that consists of an nRF52832 SoC (Nordic Semiconductor) and an integrated radio antenna. The motor driver for the LRA is a DRV8838 (Texas Instruments), chosen for its high, non-audible pulse width modulation frequency limit (250&#xa0;kHz) and versatile input voltage range (0 to 11&#xa0;V). The PCB also includes a J-Link programming and debug port, a light emitting diode (LED), and two tactile buttons for resetting the device and entering device firmware upgrade (DFU) mode. The DFU mode is used for programming Haplets over a Bluetooth connection. The coin cell we use is a lithium-ion CP1254 (Varta), measuring 12&#xa0;mm in diameter by 5.4&#xa0;mm in height, chosen for its high current output (120&#xa0;mA) and high power density. In our tests, Haplets can be used for up to 3&#xa0;h of typical usage, and the batteries can be quickly replaced. The total weight of one Haplet unit, including the LRA, is 5.2&#xa0;g.</p>
<p>We envision Haplets as wearable devices; therefore, they must be easy to don and doff with minimal effort. To achieve this, we use 3D printed nail covers with embedded magnets, as shown in <xref ref-type="fig" rid="F1">Figure&#x20;1B</xref>, to attach the LRA to the fingernail. The nail covers are small, lightweight, and can be attached to the fingernail using double-sided adhesive. Each cover has a concave curvature that matches the fingernail. The Haplet&#x2019;s PCB is attached to the dorsal side of the middle phalanx using silicone-based, repositionable double-sided adhesive tape. In our user studies and demonstrations, we place the Haplets on the thumb, index finger, and middle finger of the right hand, as shown in <xref ref-type="fig" rid="F1">Figure&#x20;1C</xref>.</p>
</sec>
<sec id="s2-2">
<title>2.2&#x20;Low-Latency Wireless Communication</title>
<p>Since we target the fingers, we aim to reduce the latency from visual perception to tactile perception as much as possible, especially considering the mechanical time constant of the LRA<xref ref-type="fn" rid="FN1">
<sup>1</sup>
</xref>. Although Bluetooth Low Energy (BLE) is commonplace and readily available in most systems with a wireless interface, the overall latency can vary from device to device. We therefore opted to use Enhanced ShockBurst (ESB)<xref ref-type="fn" rid="FN2">
<sup>2</sup>
</xref>, a proprietary radio protocol developed by Nordic Semiconductor, for our devices instead. ESB enables up to eight primary transmitters<xref ref-type="fn" rid="FN3">
<sup>3</sup>
</xref> to communicate with a primary receiver. In our implementation, each Haplet is a primary transmitter that sends a small packet at a fixed interval to a host microcontroller, a primary receiver, which is another SoC connected to a VR headset, smartphone, or PC (<xref ref-type="fig" rid="F2">Figure&#x20;2A</xref>). Commands for each Haplet are returned from the host device to the Haplets along with the acknowledgment (ACK) packet. If each Haplet transmits at a 4-millisecond interval, then ideally the maximum latency will be slightly over 4&#xa0;milliseconds once radio transmission time for the ACK packet is accounted for.</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption>
<p>Haplets&#x2019; low-latency wireless communication architecture. <bold>(A)</bold> Each Haplet communicates using the Enhanced ShockBurst (ESB) protocol with a host microcontroller that receives commands from a host device using USB HID. Commands are updated at a rate of 250&#xa0;Hz. <bold>(B)</bold> Latency, defined as the time from when the host microcontroller receives a command from the host device to when a Haplet receives the command, is shown over a 10&#xa0;s interval. The maximum and median latencies are 3.60 and 1.50&#xa0;milliseconds, respectively. <bold>(C)</bold> Events recorded by our logic analyzer showing our timeslot algorithm momentarily adjusting the period for sending packets over ESB to prevent radio collisions between Haplets.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g002.tif"/>
</fig>
<p>Since Haplets transmit at a high frequency (every 4&#xa0;ms, or 250&#xa0;Hz), there is a high chance of collisions between multiple units. We mitigate this by employing a simple time-slot synchronization scheme between the Haplets and the host microcontroller, in which each Haplet must transmit within its own predefined 500&#xa0;microsecond timeslot. The host microcontroller keeps a 250&#xa0;Hz clock and a microsecond counter that resets on every tick of the 250&#xa0;Hz clock. The counter value from the host is transmitted along with the command packets, and each Haplet uses the counter value to adjust its next transmission interval to correct itself. For instance, if a Haplet receives a counter value of 750&#xa0;microseconds and its timeslot is at 1,000&#xa0;microseconds, it will delay its next round of transmission by 250&#xa0;microseconds, or 4,250&#xa0;microseconds in total. After the correction, it transmits at the usual 4,000&#xa0;microsecond interval until another correction is needed.</p>
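<p>The timeslot correction described above can be sketched as follows. This is a minimal illustration of the scheme, not the actual firmware; the function name and the modulo formulation are our own.</p>

```python
SLOT_PERIOD_US = 4_000  # nominal transmit interval: 4 ms (250 Hz)

def next_interval_us(my_slot_us: int, host_counter_us: int) -> int:
    """Delay before this Haplet's next transmission.

    The host's microsecond counter (reset on every 250 Hz tick) arrives
    with each ACK payload. The Haplet lengthens exactly one period so that
    its next packet lands in its assigned 500 us timeslot, then resumes
    the usual 4 ms cadence.
    """
    offset = (my_slot_us - host_counter_us) % SLOT_PERIOD_US
    return SLOT_PERIOD_US + offset

# Example from the text: counter = 750 us, timeslot = 1,000 us
# -> delay the next transmission by 250 us, i.e., 4,250 us in total.
```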
<p>One drawback of our implementation, since we use a proprietary radio protocol, is that we cannot use the built-in Bluetooth capabilities of host devices (i.e.,&#x20;VR headsets or PCs) to communicate with Haplets. Therefore, we use an nRF52840 SoC (Nordic Semiconductor) as a host microcontroller that communicates with the host device through a wired USB connection. To minimize the end-to-end latency, we use the Human Interface Device (HID) class for our USB connection. The benefits are twofold: first, HID has a typical latency of 1&#xa0;millisecond; second, HID is compatible out of the box with most modern hardware, including Windows and Unix-based PCs, standalone VR headsets such as the Oculus Quest, and most Android-based devices (<xref ref-type="bibr" rid="B24">Preechayasomboon et&#x20;al., 2020</xref>).</p>
<p>We briefly tested the communication latency of our system by running a test program that sends command packets over HID through our host microcontroller to three Haplets at 90&#xa0;Hz; this frequency was chosen to simulate the typical framerate of VR applications. Two digital output pins, one from a Haplet and one from the host microcontroller, were connected to a logic analyzer. The Haplet&#x2019;s output pin toggles when a packet is received, and the host microcontroller&#x2019;s output pin toggles when a HID packet is received. Latency here is therefore defined as the interval from the time a command is received from the host device (PC) to the time the Haplet receives the command. We found that with three Haplets receiving commands simultaneously, the median latency was 1.50&#xa0;ms over 10&#xa0;s, with a maximum latency of 3.60&#xa0;ms during our testing window. A plot of the latency over the time period is shown in <xref ref-type="fig" rid="F2">Figure&#x20;2B</xref>, along with an excerpt of captured packet times with the timeslot correction in use in <xref ref-type="fig" rid="F2">Figure&#x20;2C</xref>. It should be noted that the test was done in ideal conditions, where no packets were lost and the Haplets were in close proximity to the host microcontroller.</p>
</sec>
<sec id="s2-3">
<title>2.3 Vibration Amplitude Compensation</title>
<p>Each Haplet&#x2019;s LRA is an off-the-shelf G1040003D 10&#xa0;mm LRA module (Jinlong Machinery and Electronics, Inc.). The module has a resonant frequency of 170&#xa0;Hz and is designed to be driven at that frequency; however, since the LRA is placed in such close proximity to the skin, we observed that frequencies as low as 50&#xa0;Hz at high amplitudes were just as salient as those closer to the resonant frequency at lower amplitudes. Lower frequencies are important for rendering rough textures, pressure, and softness (<xref ref-type="bibr" rid="B17">Kuchenbecker et&#x20;al., 2006</xref>; <xref ref-type="bibr" rid="B5">Choi et&#x20;al., 2020</xref>), and a wide range of frequencies is required for rendering realistic textures (<xref ref-type="bibr" rid="B9">Fishel and Loeb, 2012</xref>). Thus, we performed a simple characterization in order to compensate for the output of the LRA at frequencies outside the resonant frequency range, from 50 to 250&#xa0;Hz. <xref ref-type="fig" rid="F3">Figure&#x20;3C</xref> shows the output response of the LRA as supplied by the manufacturer compared to our own characterization using the characterization jig in <xref ref-type="fig" rid="F3">Figure&#x20;3A</xref> (as suggested by the Haptics Industry Forum<xref ref-type="fn" rid="FN4">
<sup>4</sup>
</xref>), and when characterized on the fingertips (<xref ref-type="fig" rid="F3">Figure&#x20;3B</xref>). The acceleration output was recorded using a micro-electromechanical-systems-based inertial measurement unit (MEMS-based IMU) (ICM42688, TDK) on a 6.4&#xa0;mm by 10.2&#xa0;mm, 0.8&#xa0;mm thick PCB connected to a specialized Haplet via an I<sup>2</sup>C connection through a flat flex ribbon cable (FFC). The specialized Haplet streams readings from the IMU at 1,000&#xa0;Hz to the host device for recording on a PC. We found that when using the compensation profile derived from our characterization jig (<xref ref-type="fig" rid="F3">Figure&#x20;3A</xref>) on the fingernail, the output at higher frequencies was severely overcompensated and produced uncomfortable levels of vibration. However, when characterization was performed at the target site (the fingertips), with the IMU attached on the fingerpad, the resulting output amplitudes after compensation were subjectively pleasant and more consistent with our expectations.</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption>
<p>Haplets under characterization are shown along with the resulting plots. <bold>(A)</bold> The characterization jig used for characterizing a Haplet&#x2019;s LRA. The jig is hung by two threads from a solid foundation. A special Haplet with an IMU is used to record the accelerations resulting from the LRA&#x2019;s inputs. <bold>(B)</bold> The characterization results from the jig closely resemble the characterization derived from the LRA&#x2019;s datasheet; however, the results when the LRA is placed on a fingernail are substantially different. <bold>(C)</bold> The same devices used in the characterization jig are placed on the fingernail and fingerpads to perform LRA characterization at the fingertips. <bold>(D)</bold> Results from characterization at the fingertip at various frequencies. <bold>(E)</bold> After compensating for reduced output amplitudes using the model derived from characterization, commanded amplitudes, now in m/s<sup>2</sup>, closely match the output amplitude.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g003.tif"/>
</fig>
<p>Characterization was performed by rendering sine wave vibrations at frequencies ranging from 50 to 250&#xa0;Hz in 10&#xa0;Hz increments and at amplitudes ranging from 0.04 to 1.9&#xa0;V (peak-to-peak) in 0.2&#xa0;V increments, each with a duration of 0.5&#xa0;s. Acceleration data were sampled and collected at 1,000&#xa0;Hz during the vibration interval. A total of five repetitions of the frequency and amplitude sweeps were performed. The resulting amplitude is the mean of the maximum measured amplitude of each repetition for each combination of frequency and amplitude, as presented in <xref ref-type="fig" rid="F3">Figure&#x20;3D</xref>. The output compensation is then calculated by first fitting a linear model for each frequency&#x2019;s response, <italic>amplitude</italic> &#x3d; <italic>x</italic><sub><italic>f</italic></sub><italic>V</italic>. The inverse of the model is then used, with the input being the desired acceleration amplitude, in m/s<sup>2</sup>, and the output being the voltage required to achieve that acceleration. Frequencies outside the characterized models are linearly interpolated. The results of our compensation on a subset of the characterized frequencies, along with frequencies outside the characterization intervals, are shown in <xref ref-type="fig" rid="F3">Figure&#x20;3E</xref>. We use the same compensation profile for every instance of haptic rendering throughout this&#x20;paper.</p>
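<p>The per-frequency fit and its inversion can be sketched as follows. This is a minimal illustration of the compensation scheme, not the published implementation; the function names are ours, the example gains are not measured values, and clamping outside the characterized range is our assumption.</p>

```python
from bisect import bisect_right

def fit_gain(voltages, accels):
    """Least-squares fit of the through-origin model a = x_f * V
    for one frequency's sweep data (volts in, m/s^2 out)."""
    return sum(v * a for v, a in zip(voltages, accels)) / sum(v * v for v in voltages)

def compensate(gains, freq_hz, desired_accel):
    """Voltage expected to produce `desired_accel` (m/s^2) at `freq_hz`.

    `gains` maps each characterized frequency to its fitted x_f; gains at
    intermediate frequencies are linearly interpolated, and frequencies
    outside the characterized range are clamped to the nearest model.
    """
    freqs = sorted(gains)
    if freq_hz <= freqs[0]:
        g = gains[freqs[0]]
    elif freq_hz >= freqs[-1]:
        g = gains[freqs[-1]]
    else:
        i = bisect_right(freqs, freq_hz)
        f0, f1 = freqs[i - 1], freqs[i]
        t = (freq_hz - f0) / (f1 - f0)
        g = gains[f0] + t * (gains[f1] - gains[f0])
    return desired_accel / g  # invert a = x_f * V
```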
</sec>
<sec id="s2-4">
<title>2.4 Virtual Environment</title>
<p>Our haptic hardware is only one part of our system. Robust software that can create compelling visuals and audio, as well as believable and responsive haptic effects, is equally important. In this paper, we build a software framework using the Unity game engine to create our virtual environments, as described in the following sections. Our entire system runs locally on the Oculus Quest 2 and is completely standalone, requiring only the headset, our USB-connected host microcontroller, and the Haplets themselves.</p>
<sec id="s2-4-1">
<title>2.4.1 Haptic Rendering</title>
<p>Haplets are commanded to render sine wave vibrations using packets that describe the sine wave&#x2019;s frequency, amplitude, and duration. Each vibration is a building block for haptic effects and is designed to be used either as a single event or chained with others into complex effects. For example, a small &#x201c;click&#x201d; resembling a click on a trackpad can be commanded as a 10&#xa0;millisecond, 170&#xa0;Hz vibration with an amplitude of 0.2&#xa0;m/s<sup>2</sup>. To render textures, short pulses of varying frequencies and amplitudes are chained together in rapid succession. Due to the low latency, haptic effects can be dynamic and responsive to the environment. Examples of interactions that highlight the responsiveness of such a low-latency system are presented in the following sections.</p>
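<p>As an illustration, the building-block scheme above might look like the following on the host side. This is a hedged sketch: the <monospace>SinePulse</monospace> type, its field names, and the <monospace>texture</monospace> helper are our own illustrative constructs, not the actual packet format.</p>

```python
from dataclasses import dataclass

@dataclass
class SinePulse:
    """One vibration building block (illustrative field layout)."""
    frequency_hz: float
    amplitude_ms2: float  # desired acceleration amplitude, m/s^2
    duration_ms: int

# The trackpad-like "click" described in the text:
CLICK = SinePulse(frequency_hz=170, amplitude_ms2=0.2, duration_ms=10)

def texture(pulses_per_s: int, freqs, amp_ms2: float, total_ms: int):
    """Chain short pulses of varying frequency in rapid succession,
    approximating a texture effect."""
    dur = 1000 // pulses_per_s
    n = total_ms // dur
    return [SinePulse(freqs[i % len(freqs)], amp_ms2, dur) for i in range(n)]
```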
</sec>
<sec id="s2-4-2">
<title>2.4.2&#x20;Physics-Driven Hands</title>
<p>The user&#x2019;s hands in our environment are physically simulated using NVIDIA PhysX Articulations system for robotics<xref ref-type="fn" rid="FN5">
<sup>5</sup>
</xref>. Articulations are abstracted as ArticulationBodies in the Unity game engine. This system enables robust hand-object manipulation and believable responses to other rigid bodies in the scene, such as pushing, prodding, and throwing. The fingers are a series of linkages connected using 1, 2, or 3 degree-of-freedom revolute joints, with joint limits similar to those of a human hand (<xref ref-type="bibr" rid="B7">Cobos et&#x20;al., 2008</xref>). The wrist is connected to the tracked position of the actual wrist using a 3 degree-of-freedom prismatic joint. As a result, pseudo-haptics (<xref ref-type="bibr" rid="B18">L&#xe9;cuyer, 2009</xref>) is readily available as part of the system: users must extend their limbs further than what is seen when a larger force is applied to the virtual hands. This is also known as the pseudo-haptic weight illusion (<xref ref-type="bibr" rid="B28">Samad et&#x20;al., 2019</xref>) or the god object model (<xref ref-type="bibr" rid="B34">Zilles and Salisbury, 1995</xref>). For higher fidelity, we set our simulation time step to 5&#xa0;ms and use the high frequency hand tracking (60&#xa0;Hz) mode on the Oculus Quest 2. A demonstration of the system is available as a video in the <xref ref-type="sec" rid="s11">Supplementary Materials</xref> and <xref ref-type="fig" rid="F4">Figure&#x20;4</xref>.</p>
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption>
<p>Physics-driven hands. <bold>(A)</bold> Physics-driven hands shown in the following scenarios, from left to right: (1) When pressing against a stationary object, the physics-driven fingers conform along the object&#x2019;s curvature while respecting joint limits. (2) When pressing on a button with a spring-like stiffness, the fingers do not buckle under the constraints. The whole hand is also offset according to the force resisting the hand, resulting in a pseudo-haptic illusion. (3) When grasping and lifting objects, the fingers respect the geometry of the object and conform along its shape. Gravity and inertia acting on the object also dictate the pseudo-haptic illusion. <bold>(B)</bold> A diagram showing the articulated bodies and their respective joints (the hand&#x2019;s base skeleton is identical to the OVRHand skeleton provided with the Oculus Integration SDK).</p>
</caption>
<graphic xlink:href="frvir-02-738613-g004.tif"/>
</fig>
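<p>As a rough illustration of the articulation scheme described above, the following Python sketch (hypothetical; the actual system uses Unity ArticulationBodies, and the limits below are placeholders rather than the per-joint values from Cobos et al., 2008) clamps raw hand-tracking joint angles to anatomical limits before they are used as physics drive targets:</p>

```python
# Hypothetical sketch, not the authors' Unity code: tracked joint
# angles are limited to human-like ranges before driving the
# articulated physics hand.

def clamp(value, low, high):
    """Limit a tracked joint angle to its anatomical range."""
    return max(low, min(high, value))

# Illustrative flexion limits in degrees; real limits would be set
# per joint and per axis following Cobos et al. (2008).
JOINT_LIMITS = {"mcp": (0.0, 90.0), "pip": (0.0, 110.0), "dip": (0.0, 70.0)}

def drive_targets(tracked_angles):
    """Map raw hand-tracking angles to physics joint drive targets."""
    return {j: clamp(a, *JOINT_LIMITS[j]) for j, a in tracked_angles.items()}

print(drive_targets({"mcp": 95.0, "pip": -5.0, "dip": 30.0}))
# e.g. {'mcp': 90.0, 'pip': 0.0, 'dip': 30.0}
```

<p>In the real system, the physics engine then pulls each joint toward its clamped target, which is what lets the simulated fingers lag behind the tracked hand when an object resists them.</p>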
<p>We take advantage of the robust physics simulation and low-latency communication to render haptic effects. A collision event between a finger and an object is rendered as a short 10&#xa0;ms burst of vibration with an amplitude scaled to the impulse force. Each object also has unique haptic properties: the frequency of vibration upon impact with fingers and the frequency of vibration while fingers slide across the object. For instance, a wooden surface with high friction would have a sliding frequency of 170&#xa0;Hz, while a rubber-like surface with lower friction would have a sliding frequency of 200&#xa0;Hz. Additionally, since both our objects and fingers have friction, when a finger glides across a surface the stick-slip phenomenon can be observed both visually and through haptic feedback (<xref ref-type="fig" rid="F5">Figures&#x20;5A,B</xref>).</p>
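<p>The collision-to-vibration mapping above can be sketched as a small pure function. This is a hypothetical Python re-creation, not the authors' implementation; the material table and the impulse normalization constant are assumptions:</p>

```python
# Hypothetical sketch: a contact event becomes a 10 ms burst whose
# amplitude scales with the collision impulse, at a per-object
# frequency. The max_impulse full-scale value is an assumed constant.

def impact_pulse(impulse, material, max_impulse=2.0):
    """Return (frequency_hz, amplitude_0_to_1, duration_ms) for a contact."""
    # Per-object haptic properties: (impact Hz, sliding Hz), as
    # described for wood-like and rubber-like surfaces.
    materials = {"wood": (170, 170), "rubber": (200, 200)}
    impact_hz, _sliding_hz = materials[material]
    amplitude = min(impulse / max_impulse, 1.0)  # clamp to full scale
    return (impact_hz, amplitude, 10)

print(impact_pulse(1.0, "wood"))    # (170, 0.5, 10)
print(impact_pulse(5.0, "rubber"))  # (200, 1.0, 10)
```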
<fig id="F5" position="float">
<label>FIGURE 5</label>
<caption>
<p>Haptic rendering output, shown as waveforms, during a time window of interaction in the following scenarios: <bold>(A)</bold> dragging fingers across wood, <bold>(B)</bold> dragging fingers across smooth plastic, <bold>(C)</bold> picking up and letting go of a plastic toy, <bold>(D)</bold> pressing a button, and <bold>(E)</bold> prodding two similar spheres with different masses.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g005.tif"/>
</fig>
</sec>
<sec id="s2-4-3">
<title>2.4.3 Tools</title>
<p>With the fingerpads free to grasp and hold actual objects, we augment the presence of handheld tools using visual and haptic feedback.</p>
<p>First, we detect the tool being held in the hand using a technique similar to template matching, as presented in GripMarks (<xref ref-type="bibr" rid="B33">Zhou et&#x20;al., 2020</xref>). Each tool has a unique hand pose, such as the pose for holding a pen or the pose for holding a spray bottle (<xref ref-type="fig" rid="F6">Figure&#x20;6</xref>), which is stored as a set of joint angles for every joint of the hand. Our algorithm then compares each tool&#x2019;s predefined pose to the user&#x2019;s current pose using the Pearson correlation coefficient in a sliding 120-frame window. If 80% of the frames in the window contain a pose with a coefficient over 0.9, the tool is deemed to be held in the user&#x2019;s hand. To &#x201c;release&#x201d; a tool, the user simply opens their hand fully for 1&#xa0;s (<xref ref-type="fig" rid="F6">Figure&#x20;6</xref>). Any tool can thus be altered in shape and experience, both visually and through haptics, through the headset. For instance, the user can physically hold a pen in a pose akin to holding a hammer, and in the VR environment they would see and feel as if they were holding a hammer. Additionally, since our algorithm relies only on the hand&#x2019;s pose, an actual tool does not have to be physically held by the user&#x2019;s hand; we explore this in our user study in the following sections.</p>
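<p>The detection rule above can be expressed compactly. The sketch below is a hypothetical pure-Python version (class and variable names are ours, not the authors'): each frame, the Pearson correlation between the user's joint angles and a tool's template pose is pushed into a sliding window, and the tool is detected once 80% of a full 120-frame window exceeds a 0.9 correlation:</p>

```python
import math
from collections import deque

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

class ToolDetector:
    def __init__(self, template, window=120, threshold=0.9, ratio=0.8):
        self.template = template          # stored joint angles for the tool pose
        self.history = deque(maxlen=window)
        self.threshold, self.ratio = threshold, ratio

    def update(self, joint_angles):
        """Feed one frame of joint angles; return True once the pose is held."""
        self.history.append(pearson(joint_angles, self.template))
        if len(self.history) < self.history.maxlen:
            return False  # window not yet full
        hits = sum(1 for r in self.history if r > self.threshold)
        return hits >= self.ratio * self.history.maxlen
```

<p>Feeding a matching pose for a full window triggers detection; releasing the pose lets low-correlation frames push the hit count back below the 80% threshold.</p>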
<fig id="F6" position="float">
<label>FIGURE 6</label>
<caption>
<p>An illustration of the tool activation system. The user starts with an open hand and holds the desired pose for each tool for 1&#xa0;s. The user can also hold a physical proxy of the tool in the hand. After 1&#xa0;s, a tool is visually rendered in the virtual hands and the haptic rendering system augments any interaction of the tool with the environment. To release a tool, the user fully opens their hand for 1&#xa0;s. Switching between tools requires the user to release the current tool&#x20;first.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g006.tif"/>
</fig>
<p>When a tool is detected, the tool is visually rendered attached to the hand. In our physics simulation, the tool&#x2019;s rigid body is attached to the wrist&#x2019;s ArticulationBody and thus the tool can respond dynamically to the environment as if the tool and the hands are a single object, maintaining the same pseudo-haptic capabilities as presented in the previous sections. The physically held tools provide passive haptic feedback in the form of pressure and familiar grounding while Haplets can be used to render vibrotactile feedback to augment the presence of the held tool in response to the virtual environment. Three examples of haptic rendering schemes for tools are presented in <xref ref-type="fig" rid="F7">Figure&#x20;7</xref>.</p>
<fig id="F7" position="float">
<label>FIGURE 7</label>
<caption>
<p>Haptic rendering output, shown as waveforms, during a time window of tool interaction in the following scenarios: <bold>(A)</bold> when drawing with the pen tool, <bold>(B)</bold> when striking objects of different weights with the hammer tool, <bold>(C)</bold> when spraying paint using the spray painting tool.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g007.tif"/>
</fig>
</sec>
</sec>
</sec>
<sec id="s3">
<title>3 User Study</title>
<p>In this section, we introduce a sketching user study to evaluate the feasibility of using Haplets in a productivity scenario. We chose a sketching task because it encompasses the main concepts introduced in this paper: 1) a physical tool (a pen) is held by the user, allowing Haplets to augment the tool with vibrotactile haptics, 2) upon the pen contacting a surface and while drawing, Haplets render impacts and textures, and 3) physics-driven hands and tools respond to the sketching environment, introducing pseudo-haptic force and friction. The main task is loosely based on VRSketchPen (<xref ref-type="bibr" rid="B8">Elsayed et&#x20;al., 2020</xref>), where the user traces a shape shown on a flat canvas. With a simulated pressure-sensitive pen, users need to maintain a precise distance from the canvas in order to draw a line that matches the line thickness of the provided guide. We hypothesize that with Haplets providing vibrotactile feedback, users will be able to draw lines closer to the target thickness. In addition to the sketching task, at the end of the session the user is presented with a manipulation sandbox to explore the remaining modalities that Haplets has to offer, such as texture discrimination, object manipulation, and pseudo-haptic weight. The details of this sandbox are described in <xref ref-type="sec" rid="s4-1">Section&#x20;4.1</xref>.</p>
<sec id="s3-1">
<title>3.1 Experimental Setup</title>
<p>We recruited eight right-handed participants (2 females, aged 22&#x2013;46, mean &#x3d; 32.75, SD &#x3d; 7.44) for the study. Proper social distancing and proactive disinfection according to local guidance were maintained at all times, and the study was mostly self-guided through prompts in the VR environment. The study was approved by our institution&#x2019;s IRB and participants gave informed consent. Participants were first seated and started by donning three Haplets on the thumb, index, and middle fingers of their right hand. They then donned an Oculus Quest 2 headset. Participants then picked up a physical pen and held it in their left hand using the headset&#x2019;s AR Passthrough mode, after which a standalone VR Unity application was launched. All experiments were run and logged locally on the headset. All interactions in VR were done using on-device hand tracking (<xref ref-type="bibr" rid="B11">Han et&#x20;al., 2020</xref>).</p>
<p>The VR environment consists of a single desktop with a large canvas, as shown in <xref ref-type="fig" rid="F8">Figure&#x20;8</xref>. The canvas is used to present instructions, questions, and the actual tracing task. Participants were first instructed to hold the physical pen in their right hand, which also creates a virtual pen in their hand using the tool detection system presented in the previous section. The participant uses the pen to interact with most elements in the environment, including pressing buttons and drawing.</p>
<fig id="F8" position="float">
<label>FIGURE 8</label>
<caption>
<p>An overview of the user study environment: <bold>(A)</bold> An adjustable floating desk is presented to the user along with a canvas. The canvas contains a template for the user to trace, along with an indicator of the remaining time in each repetition. The canvas can also present buttons for the user to indicate that they are done with the drawing or to answer questions. <bold>(B)</bold> Three shapes are used in the user study: a circle, a square, and an outline of a hand. Participants start tracing at the green line section and end at the red line section. <bold>(C)</bold> A plot of the line width that results from how deeply the user&#x2019;s actual hand penetrates the surface of the canvas. Participants must aim for the 5&#xa0;mm line width, which corresponds to a 10&#xa0;mm depth. Vibrotactile feedback is provided as shown in <xref ref-type="fig" rid="F7">Figure&#x20;7A</xref>. <bold>(D)</bold> An example of the participant&#x2019;s view captured from the headset during a&#x20;trial.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g008.tif"/>
</fig>
<p>The main task consists of participants tracing three shapes with the virtual pen: a square, a circle, and a hand, as shown in <xref ref-type="fig" rid="F8">Figure&#x20;8</xref>. The shapes are presented in a randomized order and each shape is given 15&#xa0;s to complete. The virtual pen is pressure-sensitive, and the lines the users draw vary in thickness depending on how hard the user presses against the canvas&#x2014;we simulate this using our physically simulated hands, which means that the further the user&#x2019;s real hand interpenetrates the canvas, the thicker the line will be. The target thickness for all shapes is 5&#xa0;mm. Lines are rendered in VR using the Shapes real-time vector library<xref ref-type="fn" rid="FN6">
<sup>6</sup>
</xref>.</p>
<p>When drawing, for every 2.5&#xa0;mm the pen travels on the canvas, Haplets render a 10&#xa0;ms vibration at 170&#xa0;Hz with an amplitude mapped to how much virtual force is exerted on the canvas. Since the amount of force is also proportional to the line width, the amplitude of vibration is likewise mapped to the line width, as shown in <xref ref-type="fig" rid="F8">Figure&#x20;8C</xref>. In other words, the pen and Haplets emulate the sensation of drawing on a rough surface, with pressure represented as the strength of vibration. An example of the haptic rendering scheme&#x2019;s output is shown in <xref ref-type="fig" rid="F7">Figure&#x20;7A</xref>.</p>
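<p>The distance-triggered scheme above can be sketched as a small accumulator. This is a hypothetical Python illustration (the linear width-to-amplitude map and the 10&#xa0;mm full-scale width are assumptions, not the authors' constants):</p>

```python
# Hypothetical sketch: emit a 10 ms, 170 Hz pulse for every 2.5 mm of
# pen travel, with amplitude mapped from the current line width (which
# is itself proportional to the pressing force).

class PenHaptics:
    STEP_MM, FREQ_HZ, DUR_MS = 2.5, 170, 10

    def __init__(self):
        self.since_last_pulse = 0.0  # accumulated travel since last pulse

    def move(self, distance_mm, line_width_mm, max_width_mm=10.0):
        """Accumulate pen travel; return the list of pulses to render."""
        pulses = []
        self.since_last_pulse += distance_mm
        amplitude = min(line_width_mm / max_width_mm, 1.0)
        while self.since_last_pulse >= self.STEP_MM:
            self.since_last_pulse -= self.STEP_MM
            pulses.append((self.FREQ_HZ, amplitude, self.DUR_MS))
        return pulses

pen = PenHaptics()
print(pen.move(6.0, 5.0))  # two pulses at half amplitude
```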
<p>Participants first performed 12 training trials to get familiar with the task, during which they held a physical pen but received no haptic feedback. Participants then performed two sets of 36 trials (12 of each shape per set), with or without holding the physical pen in their hand, totaling 72 trials. We balanced the order of these conditions across participants to account for any order effects. Half of the 36 trials in each set had the Haplets turned off, presented in a randomized order. In summary, participants experienced two conditions for holding or not holding the pen and two conditions for having or not having haptic feedback. After each set of trials, they were presented with a short questionnaire. After all trials had concluded, the participant was given a short demo of other capabilities of Haplets, as described in the following sections.</p>
<p>For each trial, we collected the line thickness for every line segment, the coordinates along each line segment, and the time it took to complete the drawing. Through post-processing, we calculated, for each trial, the mean line thickness, the mean drawing speed, and the mean 2D error from the given&#x20;guide.</p>
</sec>
<sec id="s3-2">
<title>3.2 Results</title>
<p>We performed a two-way repeated measures analysis of variance (ANOVA) for each dependent variable (line thickness, drawing speed, and drawing error) with two within-subject factors: with or without haptic feedback, and with or without a physical pen held. Our analysis, as presented in <xref ref-type="fig" rid="F9">Figure&#x20;9</xref>, revealed main effects of haptic feedback on line thickness (F (1,7) &#x3d; 15.82, <italic>p</italic>&#x20;&#x3c; 0.01), drawing speed (F (1,7) &#x3d; 10.75, <italic>p</italic>&#x20;&#x3c; 0.02), and drawing error (F (1,7) &#x3d; 14.24, <italic>p</italic>&#x20;&#x3c; 0.01), but no significant effect of the physical pen condition or of the interactions between factors. Pairwise <italic>t</italic>-tests between the haptic and non-haptic conditions for each dependent variable confirmed significance in line thickness (<italic>p</italic>&#x20;&#x3c; 0.001), drawing speed (<italic>p</italic>&#x20;&#x3c; 0.001), and drawing error (<italic>p</italic>&#x20;&#x3d; 0.001).</p>
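<p>For readers reproducing the pairwise comparison, a paired-samples t-test on per-participant means has the following form. This is a minimal pure-Python sketch with made-up numbers, not the study's data:</p>

```python
import math

def paired_t(a, b):
    """Paired-samples t statistic for two equal-length lists (df = n - 1)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical per-participant mean thickness errors (mm), one pair per
# subject; larger errors without haptics give a positive t statistic.
without_haptics = [2.5, 2.9, 2.6, 2.7, 2.8, 2.6, 2.7, 2.6]
with_haptics = [1.4, 1.7, 1.5, 1.6, 1.8, 1.5, 1.6, 1.5]
print(paired_t(without_haptics, with_haptics))
```

<p>In practice one would use a statistics package (e.g. SciPy's paired t-test) to also obtain the p-value from the t distribution with n&#x2212;1 degrees of freedom.</p>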
<fig id="F9" position="float">
<label>FIGURE 9</label>
<caption>
<p>Results from the user study: <bold>(A)</bold> Participants draw lines that are closer to the target thickness (5&#xa0;mm) with haptic feedback. <bold>(B)</bold> Users slow down significantly when haptic feedback is provided. <bold>(C)</bold> Users produce less 2D error when drawing with haptic feedback.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g009.tif"/>
</fig>
<p>For line thickness, we observe that participants can rely on the haptic feedback for guidance and draw lines that are closer to the guide&#x2019;s thickness, with a mean error across subjects of 1.58&#xa0;mm with haptic feedback and 2.68&#xa0;mm without. Having a physical pen in the hand seems to negatively impact line thickness; we attribute this to deteriorated tracking accuracy when the physical pen occludes parts of the tracked fingers&#x2014;a limitation of our particular setup. We confirmed this by observing video recordings of the sessions, in which we could match moments of large line width variation to temporary losses of hand tracking (indicated by malformed hand rendering or hand disappearance). We can also observe that participants slowed down significantly when haptic feedback was provided. We hypothesize that participants were actively using the haptic feedback to guide their strokes. The reduced drawing error is most likely influenced directly by the reduced drawing speed.</p>
<p>When interviewed during a debriefing session after the experiment, two participants (P3 and P5) noted that they &#x201c;forgot (the Haplets) were there&#x201d;. Some participants (P1, P3, and P8) preferred having the physical pen in their hands, while others (P2 and P7) did not. One participant (P7) suggested that they were more used to smaller styli and thus preferred not having the larger pen when using a virtual canvas. Another participant (P6) noted that they felt the presence of the physical pen even after it had been removed from their&#x20;hands.</p>
<p>From our questionnaire (<xref ref-type="fig" rid="F10">Figure&#x20;10</xref>), we can observe that participants had a reasonable amount of body ownership (Q1) and agency (Q2, Q3). Having a physical pen in their hands did not seem to alter the experience of drawing on a virtual canvas (Q4, Q7). However, the sense of presence of the virtual pen in VR seems to be positively impacted by active haptic rendering (from the Haplets) combined with the passive haptic feedback of holding a physical pen (Q5,&#x20;Q6).</p>
<fig id="F10" position="float">
<label>FIGURE 10</label>
<caption>
<p>Results from the questionnaire provided during the end of each set of trials.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g010.tif"/>
</fig>
</sec>
</sec>
<sec id="s4">
<title>4 Demonstration</title>
<p>We built demonstrations highlighting two potential use cases for Haplets: 1) manipulation in AR/VR and 2) a painting application that uses our tool system.</p>
<sec id="s4-1">
<title>4.1 Manipulation Sandbox</title>
<p>The manipulation sandbox is a demonstration presented to participants at the end of our user study. The user is presented with a desk with several objects and widgets, as shown in <xref ref-type="fig" rid="F11">Figure&#x20;11</xref>. A video showing the demonstration is provided with the <xref ref-type="sec" rid="s11">Supplementary Materials</xref>.</p>
<fig id="F11" position="float">
<label>FIGURE 11</label>
<caption>
<p>After the user study concludes, participants are presented with a desk with various objects to interact with. From left to right: a smooth plastic square, a rough wooden square, a lightweight orange block, a soft green block, a heavy blue block, a rubber ducky (solid), a heavy tethered ball, a lightweight tethered ball. Participants can use the lower right buttons to toggle the use of tools and toggle gravity on and off. Each object responds with haptic feedback as shown in <xref ref-type="fig" rid="F5">Figures 5</xref>, <xref ref-type="fig" rid="F7">7</xref>.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g011.tif"/>
</fig>
<p>The top-left corner of the desk contains two squares with two different textures. The left square represents a smooth, black, plastic surface and the right square represents a grainy, wooden surface. The smooth surface has a low coefficient of friction of 0.1 while the rough surface has a high coefficient of friction of 1.0. When the user runs their fingers across each surface, the haptics system renders a different frequency for each texture, at 200 and 100&#xa0;Hz, respectively. Each vibration is generated after the fingertips have traveled at least 2.5&#xa0;mm, similar to the pen&#x2019;s haptic rendering scheme presented in the previous section. Since the wooden texture has higher friction, the user&#x2019;s finger will stick and slip, rendering the sensation of a rough surface both visually and through haptics. An example of the surfaces&#x2019; haptic rendering system in use is shown in <xref ref-type="fig" rid="F5">Figures&#x20;5A,B</xref>.</p>
<p>Towards the center of the desk are three cubes with varying densities. The user can either pick up or prod the cubes to figure out which cube is lighter or heavier than the others. When prodded, the lighter cube will visually slide while the heavier cube will topple. When picked up, the lighter cube will render a lower control-display ratio (pseudo-haptic weight illusion) than the heavier cubes. Each cube also responds to touch differently. The lighter cube will render a vibration of 100&#xa0;Hz upon touch, to simulate a softer texture, while the heavier cube will render a vibration of 200&#xa0;Hz to simulate contact with a dense object. Held cubes can also be tapped against the desk, and similar vibrations will be rendered upon impact. Upon release, Haplets will render a smaller-amplitude vibration of the same frequency. A rubber duck with similar properties to the cubes is also presented nearby. An example of the haptic rendering output is shown in <xref ref-type="fig" rid="F5">Figure&#x20;5C</xref>.</p>
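<p>The pseudo-haptic weight effect can be illustrated with a mass-dependent display gain. The inverse-mass mapping below is our own assumption for illustration, not the authors' formula: heavier objects move the displayed hand less per unit of real motion, so the user must reach further:</p>

```python
# Hypothetical sketch of pseudo-haptic weight: scale the displayed
# hand displacement by a mass-dependent gain so that heavier objects
# feel "heavier" by requiring more real-world motion.

def virtual_displacement(real_displacement_mm, mass_kg, reference_mass_kg=1.0):
    """Scale real hand motion by a mass-dependent display gain."""
    gain = min(reference_mass_kg / mass_kg, 1.0)  # lighter -> gain near 1
    return real_displacement_mm * gain

print(virtual_displacement(100.0, 2.0))  # heavy cube: 50.0 mm displayed
print(virtual_displacement(100.0, 0.5))  # light cube: 100.0 mm displayed
```

<p>In the actual system this discrepancy emerges from the physics simulation itself: gravity and inertia resist the articulated hand, so no explicit gain function is needed.</p>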
<p>Towards the right of the desk are two buttons: one for toggling gravity on and off and another for toggling the tool system on and off. Each button responds to initial touch using the same system as other objects but emits a sharp click (170&#xa0;Hz, 20&#xa0;ms) when depressed a certain amount, signaling that the button has activated. If the user turns the tool system off, tools will not be created when a pose is recognized. The buttons&#x2019; haptic rendering scheme is shown in <xref ref-type="fig" rid="F5">Figure&#x20;5D</xref>.</p>
<p>When the tool system is active, users can create a hammer in their hand by holding the &#x201c;thumbs-up&#x201d; pose (see <xref ref-type="fig" rid="F6">Figure&#x20;6</xref>). The hammer can be used to tap and knock around the items on the desk. The hammer&#x2019;s haptic system is similar to that of the fingertips: each object responds with a different frequency depending on pre-set properties and a different amplitude depending on how much reaction force is generated when the hammer strikes the object. All three Haplets vibrate upon hammer strikes, under the assumption that the user is holding the hammer with all three fingers. Additionally, a lower-frequency reverberation is rendered immediately after the initial impact to emulate stiffness. An example of the haptic rendering scheme for the hammer is shown in <xref ref-type="fig" rid="F7">Figure&#x20;7B</xref>.</p>
</sec>
<sec id="s4-2">
<title>4.2 Painting</title>
<p>Our painting demonstration, as shown in <xref ref-type="fig" rid="F12">Figure 12</xref>, is designed to highlight the use of our tool system. A desk with a large, gray duck sculpture is presented to the user. The sculpture can be rotated using the user&#x2019;s bare hands. The user can use three tools during the demonstration: a spray bottle, a pen and a hammer. Each tool is placed in the user&#x2019;s hand when they produce the correct pose, as shown in <xref ref-type="fig" rid="F6">Figure&#x20;6</xref>. The lower right corner of the desk contains a palette of five colors: the user can tap the tool on the color to switch the tool to operate with that color. A video showing the demonstration is provided with the <xref ref-type="sec" rid="s11">Supplementary Materials</xref>.</p>
<fig id="F12" position="float">
<label>FIGURE 12</label>
<caption>
<p>The painting application highlights the use of haptic feedback to enhance the experience of using otherwise passive tools in the hand. Shown are four interactions overlaid from left to right: using the hammer tool to adjust the sculpture&#x2019;s geometry, using the pen tool to draw on the sculpture, using the spray bottle to spray paint on the sculpture, and selecting colors by tapping the tools on the swatches provided. Haptic feedback provided by the tools is visualized in <xref ref-type="fig" rid="F7">Figure&#x20;7</xref>.</p>
</caption>
<graphic xlink:href="frvir-02-738613-g012.tif"/>
</fig>
<p>The spray bottle is used to quickly paint the duck sculpture. As the user presses down on the bottle&#x2019;s nozzle, a small click is rendered on the index finger Haplet. When the nozzle is engaged and the tool is producing paint, all three Haplets pulse periodically with an amplitude that corresponds to how much the nozzle is depressed. An example of the haptic rendering output is shown in <xref ref-type="fig" rid="F7">Figure&#x20;7C</xref>. Paint is deposited onto the sculpture in a manner similar to real-world spray painting.</p>
<p>The pen is used to mark fine lines on the sculpture. The haptic rendering scheme is identical to that of the pen described in the user study, where all three Haplets render vibrations with amplitudes that correspond to the depth of penetration and line&#x20;width.</p>
<p>The hammer is used to modify the sculpture by creating indentations. The sculpture&#x2019;s mesh is modified in response to the reaction force caused by strikes of the hammer. Haptic feedback for the hammer is similar to that presented in the previous section (<xref ref-type="fig" rid="F7">Figure&#x20;7B</xref>).</p>
</sec>
</sec>
<sec id="s5">
<title>5 Discussion</title>
<p>Haplets introduces a wireless, finger-mounted haptic display for AR and VR that leaves the user&#x2019;s fingertip free to interact with real-world objects while providing responsive vibrotactile haptic rendering. Each Haplet is a self-contained unit with a footprint small enough to fit on the back of the finger and fingernail. We also present an engineering solution for low-latency wireless communication that adds haptic rendering to various use cases in VR, such as manipulation, texture rendering, and tool usage. Our simulation system for physics-driven virtual hands complements our haptic rendering system by providing pseudo-haptics, robust manipulation, and realistic friction. With a real-world tool held in the hand, Haplets render vibrotactile feedback along with visuals from our simulation system to augment the presence of the tool. Our user study and demonstrations show that Haplets are a feasible solution for a low-encumbrance haptic device.</p>
<p>Although Haplets exclusively provide vibrotactile feedback, we have introduced several engineering efforts to maximize the rendering capabilities of our haptic actuator, the LRA. Our brief characterization shows that LRAs can be used at frequencies outside the resonant frequency when properly compensated. Furthermore, our characterization also shows that the material (or body part) on which the actuator is mounted can cause the output of the actuator to vary significantly; the actuator therefore needs to be characterized for its intended mounting location. Our low-latency wireless communication also helps minimize the total latency from visual stimuli to tactile stimuli, which is especially useful when considering the inherent mechanical time delay of&#x20;LRAs.</p>
<p>Our user study and subjective feedback from the demonstrations have shown that Haplets and the current framework provide adequate haptic feedback for the given tasks and experiences. However, the human hand can sense much more than simple vibrations, such as shear force, normal force, and temperature. We address this shortcoming by introducing believable visuals in the form of physics-driven hands, and make up for the lack of force rendering by introducing passive haptic feedback in the form of tools. Our low-latency solution enables impacts, touch, and textures to be rendered responsively according to the simulation and visuals. Furthermore, we have yet to fully explore the voice-coil-like rendering capabilities of the LRA. Therefore, our immediate future improvement to the system is the ability to directly stream waveform data to the device. This will enable rendering arbitrary waveforms on the LRAs or VCMs, which can be used to render highly realistic textures (<xref ref-type="bibr" rid="B4">Chan et&#x20;al., 2021</xref>) or to use audio-based tools for authoring haptic effects (<xref ref-type="bibr" rid="B14">Israr et&#x20;al., 2019</xref>; <xref ref-type="bibr" rid="B22">Pezent et&#x20;al., 2021</xref>).</p>
<p>Our implementation of passive haptic feedback for tools uses a pen for physical grounding of the fingers, which provides adequate grounding for a number of tasks. Inertial cues that give the pen a sense of weight are presented using pseudo-haptic weight. <xref ref-type="bibr" rid="B30">Shigeyama et&#x20;al. (2019)</xref> have shown that VR controllers with reconfigurable shapes can provide realistic haptic cues for inertia and grounding. Therefore, a potential avenue for future work is the use of Haplets in conjunction with actual tools (e.g., holding an actual hammer or an actual spray can) or reconfigurable controllers, which would provide not only realistic grips and inertia but also any additional haptic feedback the tool may offer, such as depressing the nozzle of a spray bottle.</p>
<p>For other potential future work, our framework provides a foundation for building wearable haptic devices that are not necessarily limited to the fingers. In its current form, Haplets can be placed with minimal adjustments on other parts of the body, such as the forearm, temple, and thighs, for rapidly prototyping haptic devices (<xref ref-type="bibr" rid="B6">Cipriani et&#x20;al., 2012</xref>; <xref ref-type="bibr" rid="B31">Sie et&#x20;al., 2018</xref>; <xref ref-type="bibr" rid="B26">Rokhmanova and Rombokas, 2019</xref>; <xref ref-type="bibr" rid="B21">Peng et&#x20;al., 2020</xref>). With some modifications, namely to the number of motor drivers and the firmware, Haplets can also drive a larger number of vibrotactile actuators at once, which could be used to create haptic displays around the wrist or on the forearm. Furthermore, our motor drivers are not limited to driving vibrotactile actuators; skin stretch and normal force can be rendered with additional hardware and DC motors (<xref ref-type="bibr" rid="B24">Preechayasomboon et&#x20;al., 2020</xref>).</p>
</sec>
<sec id="s6">
<title>6 Conclusion</title>
<p>We have introduced Haplets, a low-encumbrance wearable haptic device for the fingers in VR. Haplets can augment the presence of virtual hands in VR, and we strengthen this further with physics-driven hands that respond to the virtual environment. We have also introduced an engineering solution for achieving low-latency wireless haptic rendering. Our user study and demonstrations show that Haplets have potential for improving hand- and tool-based VR experiences. Our system as a whole provides a framework for prototyping haptic experiences in AR and VR, and our immediate future work is exploring more use cases for Haplets.</p>
</sec>
</body>
<back>
<sec id="s7">
<title>Data Availability Statement</title>
<p>The original contributions presented in the study are included in the article/<xref ref-type="sec" rid="s11">Supplementary Material</xref>; further inquiries can be directed to the corresponding authors.</p>
</sec>
<sec id="s8">
<title>Ethics Statement</title>
<p>The studies involving human participants were reviewed and approved by University of Washington IRB. The participants provided their written informed consent to participate in this study.</p>
</sec>
<sec id="s9">
<title>Author Contributions</title>
<p>Both authors conceptualized the device. PP designed and built the hardware, firmware and software for the devices, user study and demonstrations. Both authors conceptualized and designed the user study. PP ran the user study. PP wrote the first draft of the manuscript. Both authors contributed to manuscript revision, read, and approved the submitted version.</p>
</sec>
<sec sec-type="COI-statement" id="s10">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s12">
<title>Publisher&#x2019;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<ack>
<p>The authors would like to thank current and former members of Rombolabs for their valuable input and insightful discussions: David Boe, Maxim Karrenbach, Abhishek Sharma, and Astrini&#x20;Sie.</p>
</ack>
<sec id="s11">
<title>Supplementary Material</title>
<p>The Supplementary Material for this article can be found online at: <ext-link ext-link-type="uri" xlink:href="https://www.frontiersin.org/articles/10.3389/frvir.2021.738613/full#supplementary-material">https://www.frontiersin.org/articles/10.3389/frvir.2021.738613/full&#x23;supplementary-material</ext-link>
</p>
<supplementary-material xlink:href="Video1.MP4" id="SM1" mimetype="application/MP4" xmlns:xlink="http://www.w3.org/1999/xlink"/>
</sec>
<fn-group>
<fn id="FN1">
<label>1</label>
<p>
<ext-link ext-link-type="uri" xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="https://www.vibration-motor.com/wp-content/uploads/2019/05/G1040003D.pdf">https://www.vibration-motor.com/wp-content/uploads/2019/05/G1040003D.pdf</ext-link>
</p>
</fn>
<fn id="FN2">
<label>2</label>
<p>
<ext-link ext-link-type="uri" xlink:href="https://developer.nordicsemi.com/nRF_Connect_SDK/doc/latest/nrf/ug_esb.html">https://developer.nordicsemi.com/nRF_Connect_SDK/doc/latest/nrf/ug_esb.html</ext-link>
</p>
</fn>
<fn id="FN3">
<label>3</label>
<p>It is worth noting that although this suggests that a maximum of eight Haplets can communicate with one host microcontroller at once, there exist techniques, such as radio time-slot synchronization similar to those used in Bluetooth, that can increase the number of concurrent transmitters to&#x20;20.</p>
</fn>
<fn id="FN4">
<label>4</label>
<p>High Definition Inertial Vibration Actuator Performance Specification <ext-link ext-link-type="uri" xlink:href="https://github.com/HapticsIF/HDActuatorSpec">https://github.com/HapticsIF/HDActuatorSpec</ext-link>
</p>
</fn>
<fn id="FN5">
<label>5</label>
<p>
<ext-link ext-link-type="uri" xlink:href="https://gameworksdocs.nvidia.com/PhysX/4.0/documentation/PhysXGuide/Manual/Articulations.html">https://gameworksdocs.nvidia.com/PhysX/4.0/documentation/PhysXGuide/Manual/Articulations.html</ext-link>
</p>
</fn>
<fn id="FN6">
<label>6</label>
<p>Shapes by Freya Holm&#xe9;r <ext-link ext-link-type="uri" xlink:href="https://acegikmo.com/shapes">https://acegikmo.com/shapes</ext-link>
</p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Achibet</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Le Gouis</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Marchal</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>L&#xe9;ziart</surname>
<given-names>P.-A.</given-names>
</name>
<name>
<surname>Argelaguet</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Girard</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>L&#xe9;cuyer</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Kajimoto</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>FlexiFingers: Multi-finger Interaction in VR Combining Passive Haptics and Pseudo-haptics</article-title>,&#x201d; in <conf-name>2017 IEEE Symposium on 3D User Interfaces (3DUI), 18-19 March 2017, Los Angeles, CA, USA</conf-name>, <fpage>103</fpage>&#x2013;<lpage>106</lpage>. <pub-id pub-id-type="doi">10.1109/3DUI.2017.7893325</pub-id> </citation>
</ref>
<ref id="B2">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Ando</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Kusachi</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Watanabe</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2007</year>). &#x201c;<article-title>Nail-mounted Tactile Display for Boundary/texture Augmentation</article-title>,&#x201d; in <conf-name>Proceedings of the international conference on Advances in computer entertainment technology, 13-15 June 2007, Salzburg, Austria</conf-name> (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>292</fpage>&#x2013;<lpage>293</lpage>. <comment>ACE &#x2019;07</comment>. <pub-id pub-id-type="doi">10.1145/1255047.1255131</pub-id> </citation>
</ref>
<ref id="B3">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Caballero</surname>
<given-names>D. E.</given-names>
</name>
<name>
<surname>Rombokas</surname>
<given-names>E.</given-names>
</name>
</person-group> (<year>2019</year>). <article-title>Sensitivity to Conflict between Visual Touch and Tactile Touch</article-title>. <source>IEEE Transactions on Haptics</source> <volume>12</volume>, <fpage>78</fpage>&#x2013;<lpage>86</lpage>. <pub-id pub-id-type="doi">10.1109/TOH.2018.2859940</pub-id> </citation>
</ref>
<ref id="B4">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Chan</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Tymms</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Colonnese</surname>
<given-names>N.</given-names>
</name>
</person-group> (<year>2021</year>). <source>Hasti: Haptic and Audio Synthesis for Texture Interactions</source>, <volume>6</volume>.</citation>
</ref>
<ref id="B5">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Choi</surname>
<given-names>I.</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Gonzalez</surname>
<given-names>E. J.</given-names>
</name>
<name>
<surname>Follmer</surname>
<given-names>S.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Augmenting Perceived Softness of Haptic Proxy Objects through Transient Vibration and Visuo-Haptic Illusion in Virtual Reality</article-title>. <source>IEEE Transactions on Visualization and Computer Graphics</source>, <fpage>1</fpage>. <pub-id pub-id-type="doi">10.1109/TVCG.2020.3002245</pub-id> </citation>
</ref>
<ref id="B6">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cipriani</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>D&#x27;Alonzo</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Carrozza</surname>
<given-names>M. C.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>A Miniature Vibrotactile Sensory Substitution Device for Multifingered Hand Prosthetics</article-title>. <source>IEEE Transactions on Biomedical Engineering</source> <volume>59</volume>, <fpage>400</fpage>&#x2013;<lpage>408</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2011.2173342</pub-id> </citation>
</ref>
<ref id="B7">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Cobos</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Ferre</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Sanchez Uran</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Ortego</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Pena</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2008</year>). &#x201c;<article-title>Efficient Human Hand Kinematics for Manipulation Tasks</article-title>,&#x201d; in <conf-name>2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, 22-26 September 2008, Nice, France</conf-name>, <fpage>2246</fpage>&#x2013;<lpage>2251</lpage>. <pub-id pub-id-type="doi">10.1109/IROS.2008.4651053</pub-id> </citation>
</ref>
<ref id="B8">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Elsayed</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Barrera Machuca</surname>
<given-names>M. D.</given-names>
</name>
<name>
<surname>Schaarschmidt</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Marky</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>M&#xfc;ller</surname>
<given-names>F.</given-names>
</name>
<name>
<surname>Riemann</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Matviienko</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Schmitz</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Weigel</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>M&#xfc;hlh&#xe4;user</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2020</year>). &#x201c;<article-title>VRSketchPen: Unconstrained Haptic Assistance for Sketching in Virtual 3D Environments</article-title>,&#x201d; in <conf-name>26th ACM Symposium on Virtual Reality Software and Technology, 1-4 November 2020, Virtual Event, Canada</conf-name> (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>1</fpage>&#x2013;<lpage>11</lpage>. <comment>VRST &#x2019;20</comment>. <pub-id pub-id-type="doi">10.1145/3385956.3418953</pub-id> </citation>
</ref>
<ref id="B9">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fishel</surname>
<given-names>J.&#x20;A.</given-names>
</name>
<name>
<surname>Loeb</surname>
<given-names>G. E.</given-names>
</name>
</person-group> (<year>2012</year>). <article-title>Bayesian Exploration for Intelligent Identification of Textures</article-title>. <source>Front. Neurorobot.</source> <volume>6</volume>. <pub-id pub-id-type="doi">10.3389/fnbot.2012.00004</pub-id> </citation>
</ref>
<ref id="B10">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Gupta</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Samad</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Kin</surname>
<given-names>K.</given-names>
</name>
<name>
<surname>Kristensson</surname>
<given-names>P. O.</given-names>
</name>
<name>
<surname>Benko</surname>
<given-names>H.</given-names>
</name>
</person-group> (<year>2020</year>). &#x201c;<article-title>Investigating Remote Tactile Feedback for Mid-air Text-Entry in Virtual Reality</article-title>,&#x201d; in <conf-name>2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 9-13 November 2020, Porto de Galinhas, Brazil</conf-name>, <fpage>350</fpage>&#x2013;<lpage>360</lpage>. <pub-id pub-id-type="doi">10.1109/ISMAR50242.2020.00062</pub-id> </citation>
</ref>
<ref id="B11">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Han</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>Cabezas</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Twigg</surname>
<given-names>C. D.</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Petkau</surname>
<given-names>J.</given-names>
</name>
<etal/>
</person-group> (<year>2020</year>). <article-title>MEgATrack</article-title>. <source>ACM Trans. Graph.</source> <volume>39</volume>. <pub-id pub-id-type="doi">10.1145/3386569.3392452</pub-id> </citation>
</ref>
<ref id="B12">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Hinchet</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Vechev</surname>
<given-names>V.</given-names>
</name>
<name>
<surname>Shea</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Hilliges</surname>
<given-names>O.</given-names>
</name>
</person-group> (<year>2018</year>). &#x201c;<article-title>DextrES</article-title>,&#x201d; in <conf-name>Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, 14-17 October 2018, Berlin, Germany</conf-name> (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>901</fpage>&#x2013;<lpage>912</lpage>. <comment>UIST &#x2019;18</comment>. <pub-id pub-id-type="doi">10.1145/3242587.3242657</pub-id> </citation>
</ref>
<ref id="B13">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Hsieh</surname>
<given-names>M.-J.</given-names>
</name>
<name>
<surname>Liang</surname>
<given-names>R.-H.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>B.-Y.</given-names>
</name>
</person-group> (<year>2016</year>). &#x201c;<article-title>NailTactors</article-title>,&#x201d; in <conf-name>Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services, 6-9 September 2016, Florence, Italy</conf-name>. (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>29</fpage>&#x2013;<lpage>34</lpage>. <comment>MobileHCI &#x2019;16</comment>. <pub-id pub-id-type="doi">10.1145/2935334.2935358</pub-id> </citation>
</ref>
<ref id="B14">
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Israr</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Schwemler</surname>
<given-names>Z.</given-names>
</name>
<name>
<surname>Fritz</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>Stereohaptics Toolkit for Dynamic Tactile Experiences</article-title>,&#x201d; in <source>HCI International 2019&#x20;&#x2013; Late Breaking Papers</source>. Editor <person-group person-group-type="editor">
<name>
<surname>Stephanidis</surname>
<given-names>C.</given-names>
</name>
</person-group> (<publisher-loc>Cham</publisher-loc>: <publisher-name>Springer International Publishing</publisher-name>), <fpage>217</fpage>&#x2013;<lpage>232</lpage>. <comment>Lecture Notes in Computer Science</comment>. <pub-id pub-id-type="doi">10.1007/978-3-030-30033-3_17</pub-id> </citation>
</ref>
<ref id="B15">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Johansson</surname>
<given-names>R. S.</given-names>
</name>
<name>
<surname>Flanagan</surname>
<given-names>J.&#x20;R.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Coding and Use of Tactile Signals from the Fingertips in Object Manipulation Tasks</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>10</volume>, <fpage>345</fpage>&#x2013;<lpage>359</lpage>. <pub-id pub-id-type="doi">10.1038/nrn2621</pub-id> </citation>
</ref>
<ref id="B16">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kim</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Yi</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>W.</given-names>
</name>
</person-group> (<year>2018</year>). <article-title>HapCube</article-title>. <fpage>13</fpage>. <pub-id pub-id-type="doi">10.1145/3173574.3174075</pub-id> </citation>
</ref>
<ref id="B17">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kuchenbecker</surname>
<given-names>K. J.</given-names>
</name>
<name>
<surname>Fiene</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Niemeyer</surname>
<given-names>G.</given-names>
</name>
</person-group> (<year>2006</year>). <article-title>Improving Contact Realism through Event-Based Haptic Feedback</article-title>. <source>IEEE Transactions on Visualization and Computer Graphics</source> <volume>12</volume>, <fpage>219</fpage>&#x2013;<lpage>230</lpage>. <pub-id pub-id-type="doi">10.1109/TVCG.2006.32</pub-id> </citation>
</ref>
<ref id="B18">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>L&#xe9;cuyer</surname>
<given-names>A.</given-names>
</name>
</person-group> (<year>2009</year>). <article-title>Simulating Haptic Feedback Using Vision: A Survey of Research and Applications of Pseudo-haptic Feedback</article-title>. <source>Presence: Teleoperators and Virtual Environments</source> <volume>18</volume>, <fpage>39</fpage>&#x2013;<lpage>53</lpage>. <pub-id pub-id-type="doi">10.1162/pres.18.1.39</pub-id> </citation>
</ref>
<ref id="B19">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Lee</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Sinclair</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Gonzalez-Franco</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Ofek</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Holz</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>TORC: A Virtual Reality Controller for In-Hand High-Dexterity Finger Interaction</article-title>,&#x201d; in <conf-name>Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 4-9 May 2019, Glasgow, Scotland UK</conf-name> (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>1</fpage>&#x2013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1145/3290605.3300301</pub-id> </citation>
</ref>
<ref id="B20">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lengyel</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>&#x17d;alalyt&#x117;</surname>
<given-names>G.</given-names>
</name>
<name>
<surname>Pantelides</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Ingram</surname>
<given-names>J.&#x20;N.</given-names>
</name>
<name>
<surname>Fiser</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Lengyel</surname>
<given-names>M.</given-names>
</name>
<etal/>
</person-group> (<year>2019</year>). <article-title>Unimodal Statistical Learning Produces Multimodal Object-like Representations</article-title>. <source>eLife</source> <volume>8</volume>, <fpage>e43942</fpage>. <pub-id pub-id-type="doi">10.7554/eLife.43942</pub-id> </citation>
</ref>
<ref id="B21">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Peng</surname>
<given-names>Y.-H.</given-names>
</name>
<name>
<surname>Yu</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>S.-H.</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>C.-W.</given-names>
</name>
<name>
<surname>Taele</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Yu</surname>
<given-names>N.-H.</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>M. Y.</given-names>
</name>
</person-group> (<year>2020</year>). &#x201c;<article-title>WalkingVibe: Reducing Virtual Reality Sickness and Improving Realism while Walking in VR Using Unobtrusive Head-Mounted Vibrotactile Feedback</article-title>,&#x201d; in <conf-name>Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 25-30 April 2020, Honolulu, HI, USA</conf-name> (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>1</fpage>&#x2013;<lpage>12</lpage>. <comment>CHI &#x2019;20</comment>. <pub-id pub-id-type="doi">10.1145/3313831.3376847</pub-id> </citation>
</ref>
<ref id="B22">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Pezent</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Cambio</surname>
<given-names>B.</given-names>
</name>
<name>
<surname>O&#x27;Malley</surname>
<given-names>M. K.</given-names>
</name>
</person-group> (<year>2021</year>). <article-title>Syntacts: Open-Source Software and Hardware for Audio-Controlled Haptics</article-title>. <source>IEEE Transactions on Haptics</source> <volume>14</volume>, <fpage>225</fpage>&#x2013;<lpage>233</lpage>. <pub-id pub-id-type="doi">10.1109/TOH.2020.3002696</pub-id> </citation>
</ref>
<ref id="B23">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Pezent</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Israr</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Samad</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Robinson</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Agarwal</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Benko</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Colonnese</surname>
<given-names>N.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>Tasbi: Multisensory Squeeze and Vibrotactile Wrist Haptics for Augmented and Virtual Reality</article-title>,&#x201d; in <conf-name>2019 IEEE World Haptics Conference (WHC), 9-12 July 2019, Tokyo, Japan</conf-name>, <fpage>1</fpage>&#x2013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1109/WHC.2019.8816098</pub-id> </citation>
</ref>
<ref id="B24">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Preechayasomboon</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Israr</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Samad</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2020</year>). &#x201c;<article-title>Chasm: A Screw Based Expressive Compact Haptic Actuator</article-title>,&#x201d; in <conf-name>Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 25-30 April 2020, Honolulu, HI, USA</conf-name>, (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>1</fpage>&#x2013;<lpage>13</lpage>. <comment>CHI &#x2019;20</comment>. <pub-id pub-id-type="doi">10.1145/3313831.3376512</pub-id> </citation>
</ref>
<ref id="B25">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Rekimoto</surname>
<given-names>J.</given-names>
</name>
</person-group> (<year>2009</year>). &#x201c;<article-title>SenseableRays</article-title>,&#x201d; in <conf-name>Proceedings of the 27th international conference extended abstracts on Human factors in computing systems - CHI EA &#x2019;09, 4-9 April 2009, Boston, MA, USA</conf-name> (<publisher-loc>Boston, MA, USA</publisher-loc>: <publisher-name>ACM Press</publisher-name>), <fpage>2519</fpage>. <pub-id pub-id-type="doi">10.1145/1520340.1520356</pub-id> </citation>
</ref>
<ref id="B26">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Rokhmanova</surname>
<given-names>N.</given-names>
</name>
<name>
<surname>Rombokas</surname>
<given-names>E.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>Vibrotactile Feedback Improves Foot Placement Perception on Stairs for Lower-Limb Prosthesis Users</article-title>,&#x201d; in <conf-name>2019 IEEE 16th International Conference on Rehabilitation Robotics (ICORR), 24-28 June 2019, Toronto, ON, Canada</conf-name>, <fpage>1215</fpage>&#x2013;<lpage>1220</lpage>. <pub-id pub-id-type="doi">10.1109/ICORR.2019.8779518</pub-id> </citation>
</ref>
<ref id="B27">
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Salazar</surname>
<given-names>S. V.</given-names>
</name>
<name>
<surname>Pacchierotti</surname>
<given-names>C.</given-names>
</name>
<name>
<surname>de Tinguy</surname>
<given-names>X.</given-names>
</name>
<name>
<surname>Maciel</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Marchal</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2020</year>). <article-title>Altering the Stiffness, Friction, and Shape Perception of Tangible Objects in Virtual Reality Using Wearable Haptics</article-title>. <source>IEEE Transactions on Haptics</source> <volume>13</volume>, <fpage>167</fpage>&#x2013;<lpage>174</lpage>. <pub-id pub-id-type="doi">10.1109/TOH.2020.2967389</pub-id> </citation>
</ref>
<ref id="B28">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Samad</surname>
<given-names>M.</given-names>
</name>
<name>
<surname>Gatti</surname>
<given-names>E.</given-names>
</name>
<name>
<surname>Hermes</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Benko</surname>
<given-names>H.</given-names>
</name>
<name>
<surname>Parise</surname>
<given-names>C.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>Pseudo-Haptic Weight</article-title>,&#x201d; in <conf-name>Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 4-9 May 2019, Glasgow, Scotland UK</conf-name> (<publisher-loc>Glasgow, Scotland, UK</publisher-loc>: <publisher-name>ACM</publisher-name>), <fpage>1</fpage>&#x2013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1145/3290605.3300550</pub-id> </citation>
</ref>
<ref id="B29">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Schorr</surname>
<given-names>S. B.</given-names>
</name>
<name>
<surname>Okamura</surname>
<given-names>A. M.</given-names>
</name>
</person-group> (<year>2017</year>). &#x201c;<article-title>Fingertip Tactile Devices for Virtual Object Manipulation and Exploration</article-title>,&#x201d; in <conf-name>Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 6-11 May 2017, Denver, CO, USA</conf-name>. (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>3115</fpage>&#x2013;<lpage>3119</lpage>. <comment>CHI &#x2019;17</comment>. <pub-id pub-id-type="doi">10.1145/3025453.3025744</pub-id> </citation>
</ref>
<ref id="B30">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Shigeyama</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Hashimoto</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Yoshida</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Narumi</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Tanikawa</surname>
<given-names>T.</given-names>
</name>
<name>
<surname>Hirose</surname>
<given-names>M.</given-names>
</name>
</person-group> (<year>2019</year>). &#x201c;<article-title>Transcalibur: A Weight Shifting Virtual Reality Controller for 2D Shape Rendering Based on Computational Perception Model</article-title>,&#x201d; in <conf-name>Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 4-9 May 2019, Glasgow, Scotland UK</conf-name>. (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>1</fpage>&#x2013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1145/3290605.3300241</pub-id> </citation>
</ref>
<ref id="B31">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Sie</surname>
<given-names>A.</given-names>
</name>
<name>
<surname>Boe</surname>
<given-names>D.</given-names>
</name>
<name>
<surname>Rombokas</surname>
<given-names>E.</given-names>
</name>
</person-group> (<year>2018</year>). &#x201c;<article-title>Design and Evaluation of a Wearable Haptic Feedback System for Lower Limb Prostheses during Stair Descent</article-title>,&#x201d; in <conf-name>2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), 26-29 August 2018, Enschede, Netherlands</conf-name>, <fpage>219</fpage>&#x2013;<lpage>224</lpage>. <pub-id pub-id-type="doi">10.1109/BIOROB.2018.8487652</pub-id> </citation>
</ref>
<ref id="B32">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Teng</surname>
<given-names>S.-Y.</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>P.</given-names>
</name>
<name>
<surname>Nith</surname>
<given-names>R.</given-names>
</name>
<name>
<surname>Fonseca</surname>
<given-names>J.</given-names>
</name>
<name>
<surname>Lopes</surname>
<given-names>P.</given-names>
</name>
</person-group> (<year>2021</year>). &#x201c;<article-title>Touch&#x26;Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality</article-title>,&#x201d; in <conf-name>Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 8-13 May 2021, Yokohama, Japan</conf-name>. (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>). <fpage>1</fpage>&#x2013;<lpage>14</lpage>. <pub-id pub-id-type="doi">10.1145/3411764.3445099</pub-id> </citation>
</ref>
<ref id="B33">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Zhou</surname>
<given-names>Q.</given-names>
</name>
<name>
<surname>Sykes</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Fels</surname>
<given-names>S.</given-names>
</name>
<name>
<surname>Kin</surname>
<given-names>K.</given-names>
</name>
</person-group> (<year>2020</year>). &#x201c;<article-title>Gripmarks: Using Hand Grips to Transform In-Hand Objects into Mixed Reality Input</article-title>,&#x201d; in <conf-name>Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 25-30 April 2020, Honolulu, HI, USA,</conf-name> (<publisher-loc>New York, NY, USA</publisher-loc>: <publisher-name>Association for Computing Machinery</publisher-name>), <fpage>1</fpage>&#x2013;<lpage>11</lpage>. <comment>CHI &#x2019;20</comment>. <pub-id pub-id-type="doi">10.1145/3313831.3376313</pub-id> </citation>
</ref>
<ref id="B34">
<citation citation-type="confproc">
<person-group person-group-type="author">
<name>
<surname>Zilles</surname>
<given-names>C. B.</given-names>
</name>
<name>
<surname>Salisbury</surname>
<given-names>J.&#x20;K.</given-names>
</name>
</person-group> (<year>1995</year>). &#x201c;<article-title>A Constraint-Based God-Object Method for Haptic Display</article-title>,&#x201d; in <conf-name>Proceedings 1995 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human Robot Interaction and Cooperative Robots, 5-9 August 1995, Pittsburgh, PA, USA,</conf-name> <volume>3</volume>, <fpage>146</fpage>&#x2013;<lpage>151</lpage>. <pub-id pub-id-type="doi">10.1109/IROS.1995.525876</pub-id> </citation>
</ref>
</ref-list>
</back>
</article>