Article

Texture Identification of Objects Using a Robot Fingertip Module with Multimodal Tactile Sensing Capability

1 Group for Mechanical Metrology, Division of Physical Metrology, Korea Research Institute of Standards and Science, 267 Gajeong-ro, Yuseong-gu, Daejeon 34113, Korea
2 Department of Science of Measurement, University of Science and Technology, 217 Gajeong-ro, Yuseong-gu, Daejeon 34113, Korea
* Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(11), 5256; https://doi.org/10.3390/app11115256
Submission received: 30 April 2021 / Revised: 27 May 2021 / Accepted: 2 June 2021 / Published: 5 June 2021
(This article belongs to the Special Issue Haptics: Technology and Applications)

Abstract:
Modern robots fall behind humans in their ability to discriminate between the textures of objects, largely because they cannot detect the various tactile modalities required for texture discrimination. Hence, our research team developed a robot fingertip module that can discriminate the textures of objects via direct contact. The module is based on a tactile sensor with multimodal (3-axis force and temperature) sensing capabilities. The multimodal tactile sensor detected forces in the vertical (Z-axis) direction as small as 0.5 gf and showed hysteresis and repeatability errors of less than 3% and 2%, respectively, over the vertical force measurement range of 0–100 gf. Furthermore, the sensor detected forces in the horizontal (X- and Y-axis) directions as small as 20 mN and measured 3-axis forces with an average cross-talk error of less than 3%. In addition, the sensor demonstrated its multimodal sensing capability by exhibiting a near-linear output over a temperature range of 23–35 °C. The module was mounted on a motorized stage and was able to discriminate 16 texture samples based on four tactile modalities (hardness, friction coefficient, roughness, and thermal conductivity).

1. Introduction

The texture of an object that a person feels is determined by similar past experiences with the physical quantities transmitted from the object. Even for identical objects, one person may deem an object soft because it is softer than objects they have previously encountered, whereas another may find the same object rough. As a result, companies that manufacture products that come into direct contact with human skin (e.g., cosmetics and car seats) employ numerous methods to analyze the touch experienced by people when using their products. In one notable method, selected individuals discriminate the textures of objects based on prior training with tactile reference samples (e.g., the TouchFeel Box [1]). However, as this method is based on tactile sensations experienced by humans, the results are likely to be affected by subjective information. To accurately discriminate the textures of objects in an objective manner, it is necessary to build a robot that can detect the physical quantities transferred from objects. However, current robots lack adequate tactile sensing capabilities to detect the physical quantities necessary for discriminating the textures of objects [2]. Therefore, tactile sensors that can mimic the tactile sensation of humans have garnered interest as a means of overcoming the aforementioned limitations. Human tactile sensing is a recognition process based on direct contact that involves sensing the physical quantities generated by static touch and dynamic touch [3,4,5]. The main difference between the two methods of touch is the movement of the finger. For example, the thermal conductivity of an object can be detected by simply placing one’s hand on the object, whereas the roughness, friction coefficient, and hardness of an object require relative movement between the object and one’s finger. Such physical quantities (hardness, friction coefficient, roughness, and thermal conductivity) are referred to as tactile modalities, which are important indicators for discriminating and recognizing objects. In other words, for a robot to discriminate the textures of objects, a tactile sensor with multimodal (3-axis force and temperature) sensing capabilities is essential to detect the various tactile modalities. Therefore, numerous studies are in progress to develop tactile sensors with multimodal sensing capabilities that can be implemented in robots.
Chathuranga et al. [6,7] developed a soft biomimetic fingertip that can measure micro-vibrations and force modalities using five commercial accelerometers and seven commercial force sensors placed under a polyurethane rubber layer. The developed soft biomimetic fingertip was able to discriminate seven texture samples. However, the force sensors that were used lacked the sensitivity to detect various materials, and the hysteresis errors of the sensors could be further increased by the rubber layer. In addition, although multiple force sensors were used to detect static force distributions, the sensing area of a single force sensor was too large compared to the size of the designed fingertip, making it difficult to detect force distributions in practice. The Petriu group [8,9] applied multiple sensors (a 32-taxel tactile array, a 9-DOF MARG sensor, and a deep pressure sensor) to a compliant structure to develop a bio-inspired tactile sensing module. However, to avoid interference from external magnetic fields—a flaw of the magnetic sensors used in the module—the researchers arranged the angular rate sensor and the gravity sensor on a single layer, which increased the complexity of the sensor signal processing. The Adelson group [10,11,12] used an elastomer-embedded camera to develop an optical GelSight sensor with excellent spatial resolution. This sensor detects the hardness, surface geometry, and incipient slip of objects by using an internal camera to detect the deformations caused when a transparent rubber layer coated with a reflective membrane comes into contact with the object. However, due to its sensing mechanism, there is a limit to how small the sensor can be manufactured. Furthermore, the sensor lacked the sampling rate required for robotic applications. Above all, as the aforementioned sensors were developed with a focus on vibration and force modalities, none are able to detect temperature, which is necessary to measure the inherent thermal conductivity of an object. On the other hand, the BioTac (SynTouch LLC, Los Angeles, CA, USA) fingertip sensor [13,14,15] is also able to measure thermal conductivity. However, the BioTac sensor struggles to directly obtain information regarding an object’s friction coefficient, which is measured using horizontal direction forces.
Hence, our research team developed a robot fingertip module that can discriminate the textures of objects based on a tactile sensor with multimodal (3-axis force and temperature) sensing capabilities to detect various tactile modalities. The multimodal tactile sensor applied to the module had a single-cell structure composed of four U-shaped strain gauges based on boron-doped single-crystalline silicon, which enabled the sensor to measure 3-axis forces. In addition, a thin serpentine-shaped gold wire was designed to simultaneously detect the temperature of the module. However, temperature sensing capabilities alone are insufficient to detect thermal conductivity, which is related to the flow of heat. As such, we integrated a heater based on enameled copper wire (Elektrisola Korea Co., Ltd., Siheung-si, Korea) into the module to maintain the module temperature above room temperature. This enabled heat to flow when the module came into contact with an object that had been kept at room temperature. The multimodal tactile sensor mounted on the module exhibited excellent sensitivity, detecting vertical (Z-axis) forces as small as 0.5 gf. Moreover, the sensor showed low hysteresis and repeatability errors of less than 3% and 2%, respectively, over the vertical force measurement range of 0 gf to 100 gf. In the horizontal directions (X- and Y-axes), the sensor was able to detect forces as small as 20 mN, and it could detect 3-axis forces with an average cross-talk error of less than 3%. In addition, the sensor demonstrated its multimodal sensing capability by exhibiting a constant degree of change in response to thermal stimuli applied to the module over a temperature range of 23 °C to 35 °C. Furthermore, the robot fingertip module was mounted on a motorized stage to enable both static contact and dynamic contact between the module and the object. Based on the sensor measurement results from both forms of motion, four tactile modalities (i.e., hardness, friction coefficient, roughness, and thermal conductivity) were calculated to discriminate the texture of the object.

2. Design and Fabrication of the Robot Fingertip Module

A tactile sensor with multimodal sensing capabilities is required to measure the various tactile modalities of objects. Additionally, the sensor must possess sufficient sensitivity to discriminate between various materials. Hence, in accordance with strategies proposed in previous studies [16,17,18], we designed a hybrid-type 3-axis force sensor that combines the excellent sensitivity and repeatability characteristics of single-crystalline silicon with the mechanical flexibility of polymer-based elastomers. Furthermore, we designed a tactile sensor with multimodal sensing capabilities by adding a gold wire-based temperature sensor. By using such inorganic substances as sensing materials, the developed sensor possesses greater mechanical and chemical stability and exhibits superior repeatability characteristics and lower hysteresis errors compared to sensors that use organic sensing materials (such as carbon-polymer composite sensors). These characteristics are crucial as they are directly related to sensor durability. The sensing layer structure with the advantages mentioned above is as follows. The sensitivity of the 3-axis force sensor, which is based on the strain gauge method, is determined by the gauge factor of the gauge material, with higher values resulting in greater sensitivity. In the case of the single-crystalline silicon used in our sensor, the gauge factor depends on the doping concentration (in our case, 9.0 × 10¹⁸ ions/cm³); generally, the value falls within the range of −120~180 [19,20]. This value is rather high considering that the gauge factor is approximately 2~4 for metals [21] and approximately 6 for graphene [22]. Therefore, to measure 3-axis forces using single-crystalline silicon-based strain gauges, four U-shaped strain gauges were arranged at 90-degree intervals (N, E, S, W) to form a cell. Most importantly, the sensor was designed with a single cell, as this is more suitable than array types when measuring roughness, a tactile modality that is necessary to discriminate the texture of objects. This is because roughness measurements require fast Fourier transform (FFT) analysis with high-speed sampling. In the case of the temperature sensor, gold was used as it produces stable sensor outputs in response to thermal stimuli. However, since gold is also a metal, the sensor can react to strain changes in addition to thermal stimuli. As such, the temperature sensor was designed as a thin, serpentine-shaped gold wire so that it reacts only to temperature. In general, it is known that the elastic stretchability of a wire designed in a serpentine shape improves as the arc angle (in our case, 180°) increases [23]. The design of the sensing layer of the multimodal tactile sensor is illustrated in Figure 1a.
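To make the gauge-factor comparison above concrete, the following minimal sketch (our own illustration, not code from the paper) evaluates the relative resistance change ΔR/R = GF × ε for a hypothetical strain value; the silicon gauge factor of 150 is an assumed value inside the quoted −120~180 range.

```python
# Illustrative sketch (not the authors' code): relative resistance change of a
# piezoresistive strain gauge, dR/R = GF * strain. The strain value below is a
# hypothetical example; the gauge-factor ranges are quoted from the text.

def relative_resistance_change(gauge_factor: float, strain: float) -> float:
    """Return dR/R (dimensionless) for a strain gauge."""
    return gauge_factor * strain

# Example: a doped single-crystalline silicon gauge (assumed GF ~ 150) under
# 0.01% strain produces a ~1.5% resistance change, versus ~0.02% for a
# metal-foil gauge (GF ~ 2).
for name, gf in [("Si strain gauge (assumed GF = 150)", 150.0), ("metal foil (GF = 2)", 2.0)]:
    print(name, f"{relative_resistance_change(gf, 1e-4) * 100:.4f} %")
```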
However, a sensing layer designed with inorganic materials poses the concern of being vulnerable to impacts upon contact. Thus, this mechanical weakness was compensated for by adopting a hybrid structure in which a polymer-based elastomer was wrapped around the sensing layer. There are various types of polymer-based elastomer materials, with the most notable examples including Ecoflex, Dragon Skin, and PDMS. In our design, we wrapped the sensing layer with PDMS (polydimethylsiloxane, Sylgard 184, Dow Corning, Midland, MI, USA), as its physical stiffness and adhesion can be easily controlled and it bonds readily with other PDMS-based layers. The designed PDMS layers include a deformation layer, a bump layer, and an encapsulation layer. The deformation layer, which corresponds to an elastic sensing element, was located below the PI (polyimide, 25 μm, DuPont Kapton®) substrate (sensing layer) and served to generate a strain proportional to the contact force. To improve the sensitivity of the sensor, we used PDMS with a 50:1 ratio (base:curing agent) to produce a deformation layer with low stiffness [24]. However, this soft physical property raised two issues. One was that such soft PDMS deforms to a greater degree and cannot immediately return to its original state. As this is directly related to the hysteresis error of the sensor, a thin deformation layer with a thickness of 700 μm was used to limit its deformation. The other concern was that the adhesive strength of PDMS increases as its stiffness decreases. Notably, due to the strong adhesion of 50:1 PDMS, it is difficult to move a deformation layer made of this material to a specific position once it is fabricated. This is similar to the challenge of handling double-sided tape with both adhesive surfaces exposed. Therefore, by forming the deformation layer on top of the PI substrate (adhesion layer), it becomes easier to handle as one side loses its adhesion. The bump layer contains a cuboid-shaped bump (W: 2.5 mm × L: 2.5 mm × H: 1.0 mm) and ensures that the bump is positioned on the cell in a stable manner. In addition to concentrating the forces onto the four strain gauges that constitute the cell to increase sensitivity, the bump serves to determine the direction of the force by generating different strains on each strain gauge according to the direction of the force [25,26]. Therefore, the bump has to transmit external forces to the strain gauges without deforming its original shape, and thus it is advantageous to fabricate the bump from a material with greater mechanical stiffness. However, if the bump is fabricated using metal or plastic materials to improve mechanical stiffness, it is difficult to ensure the mechanical flexibility of the sensor, and an excessive concentration of strain could damage the semiconductor strain gauges. Above all, it is difficult to attach bumps made of such materials to the sensing layer. Taking these conditions into consideration, a bump layer composed of 5:1 (base:curing agent) PDMS was placed on top of the sensing layer [27]. However, even this design could not ensure the stability of the sensor performance, as the bump layer is not bonded to the sensing layer. Thus, an encapsulation layer composed of PDMS with the same ratio was placed between the sensing and bump layers to bond to the bump layer and protect the sensing layer from external physical/chemical contact.
This multimodal tactile sensor has dimensions of 12 mm × 26 mm. The cross section of the sensor and the thickness details of each layer are shown in Figure 1b.
To build a robot fingertip module that can discriminate the textures of objects based on the multimodal tactile sensor designed with this structure, we designed a robot fingertip (W: 16.6 mm × H: 36.16 mm × t: 18 mm) similar in size to a human fingertip. Aspects that were considered when implementing the module included a heater that can maintain the module at a temperature similar to that of the human body, artificial skin with human-like fingerprints, and a sensor signal processing circuit board (described in detail in Section 3.1) that can be embedded in the robot fingertip. To maintain the module temperature above room temperature, a heater comprising an enameled copper wire with a diameter of 0.05 mm was wrapped (472 turns/8 layers) around a bobbin (inner diameter: 1.5 mm, height: 4 mm), which was positioned on the metallic fingertip frame. The heater had an approximate output of 471 mW at a voltage of 3.3 V. As a result, heat would flow as the module came into contact with an object kept at room temperature, which then enabled the developed temperature sensor to detect temperature changes of the module and measure the inherent thermal conductivity of the object. The artificial skin of the module was designed to possess a stiffness corresponding to the Young’s modulus range of the index finger (0.07 MPa–0.2 MPa) [28] by using Ecoflex 0030 (Smooth-on Inc., Macungie, PA, USA) [29]. Furthermore, the skin was given human-like fingerprints to induce vibrations when the robot fingertip rubs an object [30]. The robot fingertip module is illustrated in Figure 1c.
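As a quick back-of-the-envelope check of our own (not stated in the paper), the quoted heater output and drive voltage imply a coil resistance of roughly 23 Ω and a drive current of roughly 143 mA:

$$R = \frac{V^2}{P} = \frac{(3.3\ \text{V})^2}{0.471\ \text{W}} \approx 23.1\ \Omega, \qquad I = \frac{V}{R} \approx 143\ \text{mA}.$$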
The implementation of the designed module begins with the fabrication of the multimodal tactile sensor. The fabrication process of the tactile sensor can be subdivided into the fabrication of the sensing layer and the PDMS layers. The initial stages of the sensing layer fabrication process involve the transfer of a doped 100 nm-thick single-crystalline silicon layer from an SOI (silicon-on-insulator, SOITEC, Isere, France) wafer to a PI substrate (sensing layer) via a previously reported dry-transfer method. The forms of the U-shaped strain gauges were patterned onto the transferred silicon layer via photolithography. Subsequently, etching equipment (ERR-5006, LAT Co., Ltd., Suwon-si, Korea) was used to leave the pattern and etch away the remaining parts. To fabricate the electrodes of the strain gauges, deposition equipment (KVE-T2000, Vacuum Tech, Gimpo-si, Korea) was used to deposit a metal layer (Cr: 5.7 nm/Au: 120 nm), and the electrode shape was patterned via photolithography. Wet etching was employed to leave only the electrode shape to complete the 3-axis force sensor. Subsequently, the sensing layer was spin-coated with SU-8 2005 (MicroChem Corporation, Westborough, MA, USA) to fabricate an insulating layer that separates the temperature sensor from the 3-axis force sensor. Next, the temperature sensor was fabricated by patterning its shape onto the insulating layer via photolithography and placing it in the deposition equipment to deposit a metal layer (Cr: 5.7 nm/Au: 120 nm). Once the deposition process was completed, a lift-off process was used to leave only the temperature sensor shape, concluding the sensing layer fabrication. For the fabrication of the PDMS layers, the encapsulation layer was produced by pouring 5:1 PDMS directly onto the completed sensing layer, spin coating, and curing in an oven (70 °C) for 2 h. The bump layer was produced by pouring 5:1 PDMS into a metal mold and curing it in an oven under the same conditions as the encapsulation layer. The fabricated encapsulation and bump layers were then bonded via O2 plasma treatment [31,32]. Next, the deformation layer was fabricated by attaching a PI substrate (adhesion layer) to a wafer coated with 10:1 PDMS, pouring 50:1 PDMS on top of the PI substrate, spin coating, and curing at room temperature (23 °C) for 24 h. The tactile sensor fabrication concluded by attaching the PI substrate (sensing layer) combined with the bump layer on top of the produced deformation layer. To attach the fabricated tactile sensor to the fingertip frame, a thin layer of 50:1 PDMS was applied to the section where the sensor is to be located and cured in an oven (70 °C, 2 h), after which the PI substrate (adhesion layer), the lowest layer of the sensor, was attached to the section. Subsequently, the fingertip frame with the sensor attached was placed in a mold dedicated to the fabrication of fingerprinted skin that was filled with Ecoflex 0030, and the mold was then cured in an oven (70 °C) for 2 h to produce the fingerprinted skin. Lastly, the heater and circuit board of the sensor were mounted in the space on the opposite end to the sensor, and the remaining parts were assembled to complete the robot fingertip module with multimodal sensing capabilities.

3. Experimental Setup & Procedure for Texture Discrimination

3.1. Experimental Setup

For the robot fingertip module with multimodal sensing capabilities to discriminate the textures of objects, the module should be capable of vertically pressing (static contact) and rubbing (dynamic contact) objects. Hence, we used a motorized stage to develop a texture discrimination platform capable of both types of actions, as shown in Figure 2. The platform included a Z-axis motion stage, which enables the robot fingertip module with the mounted multimodal tactile sensor to touch the surface of samples in the vertical (Z-axis) direction, and an X-axis motion stage, which enables the module to scan the surface of an object it is in contact with at a constant velocity. A laser displacement sensor (ZX-LD100L, OMRON, Kyoto, Japan) was installed on the Z-axis motion stage to measure the distance between the module and the object and thereby control the speed at which the module approaches the object. Notably, the displacement data from this sensor serve as the reference for measuring the hardness of the object. If objects are indented by the same displacement, harder objects produce larger contact forces, whereas soft objects such as a sponge produce smaller contact forces. In addition, a tension block that provides compliance was installed to allow the robot finger mounted on the Z-axis stage to move according to irregularities on the object surface and to maintain a preload force throughout the experiments, so that variations in the contact pressure among samples at the start of the sliding motion could be minimized. Furthermore, the robot fingertip module was mounted on a rotary stage that was installed on the Z-axis stage; this enabled control over the area of the module in contact with the sample by changing the contact angle. A holder for the texture sample corresponding to the object was installed on the X-axis motion stage. The holder, which had dimensions of W: 60 mm × H: 150 mm, was designed as a flat, angular U-shape with a depth of 2 mm so that a 140 mm-long texture sample could be fixed in place. In addition, the holder was made of steel so that a pad-shaped rubber magnet located under each sample could attach to the holder, which minimized movement of the sample during the measurement process. To embed the circuit board that processes the measurement signals from the tactile sensor in the module, the circuit board was developed with a two-layer structure (dimensions: W: 18 mm × H: 14.5 mm × t: 5.2 mm). One layer consists of an analog strain gauge signal amplifier circuit, and the other layer consists of an analog-to-digital converter and a digital circuit that performs computation and communication functions. Additionally, the length of the wiring connecting the sensor and the circuit board was minimized, which also reduced the effects of external noise. Therefore, the noise levels of the gauge signals processed using the developed circuit board were kept below 0.02%, which enabled sufficient discrimination of 0.5 gf Z-axis force signals. Furthermore, by configuring an internal timer interrupt that fires every 2 ms in the MCU (STM8L151G6) and measuring the voltage on every interrupt cycle, the sensor signals could be sampled at a rate of 500 Hz. Figure 3 shows the communication method used to control the platform and the circuit board design.
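The following host-side sketch is purely illustrative of such a 500 Hz streaming scheme; the serial port name, baud rate, and frame layout (four gauge channels plus one temperature channel) are our assumptions and not details given in the paper. It requires the pyserial package.

```python
# Hypothetical host-side reader (our illustration, not the authors' firmware or
# LabVIEW code): the MCU streams one frame per 2 ms timer interrupt (500 Hz),
# here assumed to contain four 16-bit gauge channels plus one temperature channel.
import struct
import serial  # pyserial

FRAME = struct.Struct("<5H")  # N, E, S, W gauges + temperature (assumed layout)

def read_frames(port: str = "/dev/ttyUSB0", n_frames: int = 500):
    """Read n_frames samples (~1 s at 500 Hz) and return them as a list of tuples."""
    frames = []
    with serial.Serial(port, baudrate=115200, timeout=1.0) as ser:
        for _ in range(n_frames):
            raw = ser.read(FRAME.size)
            if len(raw) == FRAME.size:
                frames.append(FRAME.unpack(raw))
    return frames
```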

3.2. Experimental Procedure

Figure 4 shows the flow chart describing the experiment performed using the texture discrimination platform to discriminate between texture samples. The experiment was divided into a training mode and a texture discrimination mode. The training mode was a learning step, and the texture discrimination mode was a step in which arbitrary samples were discriminated based on the learned data. The method of operation was as follows. First, the positions of the motion stages (X-axis stage and Z-axis stage) were initialized before conducting measurements in each mode. In the training mode, the number of measurement repetitions was input. The averages and standard deviations of each of the four tactile modalities could be obtained through repeated measurements. The error of the average value decreases as the number of repetitions increases; thus, we conducted 10 repeated measurements to sufficiently minimize the error of the average values. Next, to ensure that the robot fingertip module approached the texture sample in a stable manner, the Z-axis stage, which has a displacement sensor, was set to approach the sample in two phases. In the first phase, the relative positions of the sample to be measured and the module were determined using the displacement sensor, and this information was sent to the stage as feedback. The stage then rapidly closed the distance between the module and the sample (fast approach). Afterwards, the module approached the texture sample at a slow, constant speed until the moment of contact (slow approach). When the tactile sensor in the module detected a contact force, this signal was fed back to the stage to halt the movement. Once the module and sample came into contact in this manner, the static contact phase commenced. In this phase, the Z-axis stage moved in steps to measure the hardness and thermal conductivity of the sample. First, the module moved downwards in the vertical (Z-axis) direction by a total displacement of 1 mm over eight equal steps, and the contact force was measured at each step. Through this method, the hardness of the sample could be estimated by measuring the difference in contact force between samples for a given displacement. After indenting to a depth of 1 mm, the contact was maintained for 15 s to detect changes in the temperature of the module over time as heat flowed from the module to the sample. This information could be used to obtain a value representative of the thermal conductivity of the sample. The static contact phase was completed by finishing the aforementioned sequence of actions. For the following dynamic contact phase, the X-axis stage was moved at a constant speed (10 mm/s) for 10 s to scan the surface of the sample. This process provided data that could be used to measure the friction coefficient and roughness of the texture sample. Lastly, the tactile sensor signals obtained from each phase were analyzed in the time and frequency domains. Therefore, in the training mode, repeated measurements were conducted for each sample, the obtained sensor signals were analyzed with specific formulas, and the obtained data (average and standard deviation) were saved in a database. In the texture discrimination mode, a specific sample was measured and the analyzed data were compared with the saved database using a specific algorithm (described in detail in Section 4) to determine which sample in the database most closely resembles the measured sample. The overall process was controlled using LabVIEW software.
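The sequence described above can be summarized in the following procedural sketch (our paraphrase of the flow chart, not the authors' LabVIEW code); the `stage` and `sensor` objects and all of their methods are hypothetical placeholders for the motorized stages and the tactile sensor interface.

```python
# Procedural sketch of the measurement sequence: approach, static contact
# (8-step indentation + 15 s thermal hold), then dynamic contact (10 s scan).
import time

def measure_sample(stage, sensor, steps=8, total_indent_mm=1.0,
                   hold_s=15.0, scan_speed_mm_s=10.0, scan_time_s=10.0):
    # Approach: fast until near the sample, then slow until contact is detected.
    stage.fast_approach_z(target_gap_mm=1.0)
    stage.slow_approach_z(until=sensor.contact_detected)

    # Static contact: indent 1 mm in equal steps, recording R_z at each step.
    rz_steps = []
    for _ in range(steps):
        stage.step_z(-total_indent_mm / steps)
        rz_steps.append(sensor.read_rz())

    # Hold contact to record the module temperature drop (thermal conductivity cue).
    t_start = sensor.read_temperature()
    time.sleep(hold_s)
    t_end = sensor.read_temperature()

    # Dynamic contact: scan along X at constant speed, logging R_y and R_z.
    trace = stage.scan_x(speed_mm_s=scan_speed_mm_s, duration_s=scan_time_s,
                         log=lambda: (sensor.read_ry(), sensor.read_rz()))
    return rz_steps, (t_start, t_end), trace
```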

3.3. Texture Samples

A total of 16 texture samples of various materials were selected to demonstrate whether the robot fingertip module on the texture discrimination platform could discriminate the textures of objects. As shown in Figure 5, the 16 selected samples were as follows (in ascending order of sample number): Kimwipes, fine fabric, denim, leather, rubber, metal (Cu), wood, paper box, metal (Al), acrylic, Styrofoam, hard sponge, soft sponge, leggings fabric, flannel fabric, and toweling. The samples had dimensions of W: 60 mm × H: 140 mm; their thicknesses varied slightly but all fell within the range of 1.9~25.2 mm. This thickness includes the rubber magnet (t: 1 mm) used to attach and detach the sample from the holder. In addition, so that the laser displacement sensor attached to the Z-axis stage could measure the distance between the sample and the robot finger without error, a piece of reflector paper was attached to each sample at the location where the laser beam strikes it.

4. Results and Discussion

4.1. Multimodal Sensing Capabilities of the Tactile Sensor

The repeatability and stability characteristics of a sensor are as important as multimodal sensing capabilities when discriminating the textures of objects based on various tactile modalities. Even if a sensor is able to measure numerous physical quantities, the reliability of its measurements is compromised if the sensor has low stability and repeatability characteristics. For example, a sensor with such characteristics would produce fluctuating readings even when measuring a single sample, greatly hindering the effectiveness of learning. As such, it is vital to conduct repeatability and stability evaluations in addition to evaluations of the physical quantities that can be detected by the robot fingertip module with the tactile sensor.
To evaluate the vertical (Z-axis) force measurement capability, the robot fingertip module was placed on a precision balance (BSA6202S-CW, Sartorius, Göttingen, Germany) with a resolution of 0.01 g, and a metal tip with a diameter of 3 mm was attached to the indenter connected to the 3-axis stage [16]. The metal tip was placed on the center of the bump of the tactile sensor built into the robot fingertip module, and the vertical force measurement capability of the sensor was evaluated. The vertical direction (Z-axis) force measurement capability ($R_z$) serves as a measure of the two aforementioned characteristics as well as the range of force the sensor is capable of detecting, and is calculated as the average rate of change in resistance of the four strain gauges [33]. This can be expressed by the following equations:
$$R_z = \left( \frac{1}{4} \times \left( \frac{\Delta R_N}{(R_N)_0} + \frac{\Delta R_E}{(R_E)_0} + \frac{\Delta R_S}{(R_S)_0} + \frac{\Delta R_W}{(R_W)_0} \right) \right) \times 100\ (\%)$$
$$\Delta R_i = R_i - (R_i)_0$$
where $\Delta R_i$ is the change in resistance of strain gauge $i$; $(R_i)_0$ is the initial resistance of strain gauge $i$; and $i$ refers to the strain gauge located in the N, E, S, or W direction. Evaluating the vertical direction (Z-axis) force measurement capability using these equations showed that the hysteresis error was as low as 2.66% within the force range of 0–100 gf and that the sensor possessed sufficient sensitivity to discriminate forces as small as 0.5 gf. In addition, the maximum detection force (100 gf) was applied to the module 1000 times to evaluate the repeatability characteristics. The resulting repeatability error was also low, at 1.96%, highlighting the excellent repeatability and stability characteristics of the sensor. Data on the sensor characteristics are shown in Figure 6a,b.
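For clarity, the calculation above can be sketched as follows (an illustration of ours, with hypothetical resistance values):

```python
# Minimal sketch of the equations above: the vertical force indicator R_z is the
# average relative resistance change of the four gauges. Inputs are assumed to be
# the initial and loaded resistances of the N, E, S, W gauges in ohms.

def vertical_force_indicator(r0: dict, r: dict) -> float:
    """R_z in percent, averaged over the four gauges (keys 'N', 'E', 'S', 'W')."""
    return 100.0 * sum((r[k] - r0[k]) / r0[k] for k in "NESW") / 4.0

# Hypothetical example: each 2 kΩ gauge rises by 1 Ω under load -> R_z = 0.05 %.
r0 = {k: 2000.0 for k in "NESW"}
r = {k: 2001.0 for k in "NESW"}
print(f"R_z = {vertical_force_indicator(r0, r):.3f} %")
```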
The horizontal direction (X- and Y-axis) force measurement capabilities of the sensor are important because these physical quantities are used to measure the friction coefficient of samples, which is not possible with the vertical direction (Z-axis) force measurement capability alone. The difference in the resistance change rates of the two gauges located along the X-axis was used to calculate the X-axis force indicator ($R_x$), and the Y-axis force indicator ($R_y$) was calculated in the same manner using the difference in the resistance change rates of the two gauges located along the Y-axis [33]. This can be expressed by the following equations:
$$R_x = \left( \frac{\Delta R_W}{(R_W)_0} - \frac{\Delta R_E}{(R_E)_0} \right) \times 100\ (\%)$$
$$R_y = \left( \frac{\Delta R_S}{(R_S)_0} - \frac{\Delta R_N}{(R_N)_0} \right) \times 100\ (\%)$$
To evaluate the 3-axis force measurement capabilities, the outputs of the robot fingertip module were compared with those of a commercial multi-axis load cell (model: SRI-M3701A, Sunrise Instruments, Canton, MI, USA) with Fz, Fx, and Fy capacities of 100 N, 50 N, and 50 N, respectively. A metal tip with a diameter of 3 mm connected to the multi-axis load cell was used to apply horizontal forces to the module. The test started with the preloading of a vertical force of 10 gf; the metal tip was then moved in the (+) Y-axis direction by approximately 20 μm, 40 μm, and 60 μm to apply horizontal forces of 20 mN, 40 mN, and 60 mN, respectively, while the sensor signals were observed. As the developed sensor consists of a single cell that measures 3-axis forces, it is necessary to calculate the cross-talk error ($S$), which indicates whether the sensor is able to measure the force in each direction independently, without the reading being affected by the signals of the other force directions [34]. This is calculated by applying the maximum horizontal direction (in our case, Y-axis) force to the module and comparing the changes in the force measurement capabilities in the other directions (X- and Z-axes) with the change in the horizontal (Y-axis) force measurement capability before and after the force is applied. This can be expressed by the following equations:
$$S_x = \left| \frac{\Delta R_x}{\Delta R_y} \right| \times 100\ (\%)$$
$$S_z = \left| \frac{\Delta R_z}{\Delta R_y} \right| \times 100\ (\%)$$
$$\Delta R_a = (R_a)_{\text{after maximum movement}} - (R_a)_{\text{before maximum movement}}$$
where $\Delta R_a$ is the change in the force measurement capability for direction $a$, and $a$ refers to the X-, Y-, or Z-axis. From these equations, $S_x$ and $S_z$ were calculated as 3.57% and 2.14%, respectively. Thus, the developed sensor had sufficient sensitivity to detect horizontal forces as small as 20 mN, and the average cross-talk error was less than 3%. This highlights the 3-axis force sensing capabilities of the sensor. Related sensor characteristics are shown in Figure 6c.
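The cross-talk calculation can be reproduced with a small sketch of ours; the indicator changes below are made-up values chosen only to illustrate the arithmetic (they happen to give the reported 3.57% and 2.14%).

```python
# Sketch of the cross-talk equations above: the changes in the X and Z indicators
# under a maximum Y-direction movement, expressed relative to the change in R_y.

def cross_talk(delta_rx: float, delta_ry: float, delta_rz: float):
    """Return (S_x, S_z) in percent for a maximum Y-axis shear load."""
    s_x = abs(delta_rx / delta_ry) * 100.0
    s_z = abs(delta_rz / delta_ry) * 100.0
    return s_x, s_z

# Hypothetical indicator changes (in %-points of R): a Y-axis change of 1.40 with
# residual X and Z changes of 0.05 and 0.03 gives S_x ≈ 3.57 %, S_z ≈ 2.14 %.
print(cross_talk(0.05, 1.40, 0.03))
```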
To evaluate the temperature sensing capabilities of the sensor, the fingerprinted skin of the robot fingertip module was placed on a hot plate (MSH-30D, Daihan Scientific, Wonju, Korea), and the temperature of the hot plate was increased in 4 °C intervals within the range of 23–35 °C. For accurate temperature measurements, a platinum resistance thermometer with a resolution of 0.01 °C was attached to the hot plate. The resistance of the temperature sensor was measured 30 min after each temperature increase to ensure that the heat from the hot plate had sufficiently transferred to the module to reach equilibrium. As shown in Figure 6d, the resistance of the temperature sensor increased by approximately 79 Ω as the hot plate temperature increased from 22.9 °C (approximately 2900 Ω) to 34.4 °C (approximately 2979 Ω). The sensor resistance changed in a near-linear manner with the temperature increase at each interval. The temperature coefficient of resistance (TCR, α) of the temperature sensor was calculated as 0.23%/°C, which is lower than the reported TCR of gold (0.34%/°C) [35]. This is because the temperature sensor was fabricated by depositing Au pellets, which resulted in a non-uniform crystal structure compared to bulk Au.
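As a rough check of ours (not part of the paper), the TCR implied by the two endpoint readings quoted above can be computed directly; the authors' linear fit over all points gives the slightly lower reported value of 0.23%/°C.

```python
# Sketch of a two-point TCR estimate using the endpoint values quoted in the text:
# alpha = (R2 - R1) / (R1 * (T2 - T1)).

def tcr(r1: float, t1: float, r2: float, t2: float) -> float:
    """Temperature coefficient of resistance in %/°C, referenced to (r1, t1)."""
    return 100.0 * (r2 - r1) / (r1 * (t2 - t1))

print(f"alpha ≈ {tcr(2900.0, 22.9, 2979.0, 34.4):.2f} %/°C")  # ≈ 0.24 %/°C (paper: 0.23 from a full linear fit)
```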

4.2. Demonstration of Texture Discrimination

Having verified the performance of the robot fingertip module with the built-in multimodal tactile sensor, we attempted to demonstrate whether our module could discriminate the textures of objects. We conducted the experiments according to the aforementioned flow chart using the 16 selected texture samples. First, 10 repeated measurements were conducted for each sample in the training mode, and the measurement results were used to calculate the averages ($m$) and standard deviations ($\sigma$) of six specific indicators ($h$, $\mu$, 1st PSD, 2nd PSD, 3rd PSD, $\vartheta$) that correspond to the four tactile modalities (hardness, friction coefficient, roughness, and thermal conductivity). The calculated data were used to build a database, which was then used to perform the texture discrimination experiment with arbitrary samples. The following paragraphs describe the calculation methods and formulas for the specific indicators of each modality.
Hardness ($h$) is an indicator of the rigidity of a sample and can be estimated from the vertical direction force measurement capability of the tactile sensor as the module, after coming into contact with the sample, is moved in the vertical direction in eight steps ($n$). We calculated hardness by averaging the increments of the vertical direction force measurement capability ($R_z$) detected by the sensor over the final four of the eight steps. This is because the contact area between the module and the sample can increase during the initial four steps, which can slightly alter the angle of the finger module and lead to horizontal forces being applied in tandem with vertical forces. However, such a phenomenon does not take place after the third step. The formula can be expressed as follows:
$$h = \frac{1}{4} \times \sum_{n=5}^{8} \left( (R_z)_n - (R_z)_{n-1} \right)$$
The friction coefficient ($\mu$) is a proportionality constant that quantifies the degree of friction between two surfaces in contact and is calculated by dividing the horizontal force by the vertical force. We calculated it from the horizontal (Y-axis) force measurement capability ($R_y$) and the vertical (Z-axis) force measurement capability ($R_z$) of the tactile sensor while the module was rubbed on the sample (dynamic contact); the Y-axis was used because the developed module moves along the Y-axis of the cell of the 3-axis sensor. This can be expressed by the following equation:
$$\mu = \frac{R_y}{R_z}$$
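A compact sketch of ours for the two indicators above, using hypothetical indicator values:

```python
# Sketch of the hardness and friction-coefficient indicators: h averages the
# per-step increase of R_z over the final four of the eight indentation steps,
# and mu divides R_y by R_z during the sliding phase. `rz_steps` is assumed to
# hold R_z after each of the eight steps (index 0..7).

def hardness(rz_steps: list[float]) -> float:
    """h = mean of (R_z)_n - (R_z)_{n-1} for n = 5..8 (1-based step index)."""
    return sum(rz_steps[n] - rz_steps[n - 1] for n in range(4, 8)) / 4.0

def friction_coefficient(ry: float, rz: float) -> float:
    """mu = R_y / R_z during dynamic contact."""
    return ry / rz

# Hypothetical example: R_z grows by 0.2 %-points per step in the last four steps.
print(hardness([0.1, 0.3, 0.5, 0.7, 0.9, 1.1, 1.3, 1.5]))  # -> 0.2
print(friction_coefficient(ry=0.3, rz=1.5))                # -> 0.2
```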
By plotting the calculated values of the two indicators for the 16 samples on a single graph, the graph could be viewed in quadrants, as shown in Figure 7a. It was found that the samples could be categorized into groups. In particular, samples located in the first quadrant, which had smooth surfaces, exhibited significant dispersion in the friction coefficient owing to the stick-slip phenomenon induced by the rubbing action (dynamic contact). Conversely, no samples were located in the fourth quadrant, which indicates that none of the samples measured by the developed platform had both a high friction coefficient and low hardness. According to the hardness data, Sample No. 11 (Styrofoam) was measured as the hardest material even though the actual hardness of the material is low, which reflects how humans struggle to distinguish the hardness of Styrofoam from that of a relatively harder material like Sample No. 10 (acrylic). This is because international hardness standards are based on indenting a pointed probe to a certain depth into a material. In contrast, the texture discrimination system developed in this study measures hardness in a manner similar to how humans press on materials with their fingers, hence the difficulty in discriminating the samples, with the exception of soft materials such as Sample No. 13 (soft sponge).
The power spectral density (PSD) represents the contact energy. As the module moved horizontally while in contact with the texture sample, the magnitude (energy) of the signal detected by the sensor changed according to the regular or irregular bumps on the surface of the sample. As such, this physical quantity is a good representation of the surface roughness of the object [14]. The PSD indicators were obtained by converting the time-domain signals to the frequency domain through the fast Fourier transform (FFT), dividing the spectrum into the low-frequency (0~1 Hz, 1st PSD), medium-frequency (1~10 Hz, 2nd PSD), high-frequency (10~20 Hz, 3rd PSD), and very-high-frequency (20~250 Hz, 4th PSD) bands, and calculating the area of each band. The measured 1st, 2nd, and 3rd PSD values showed differences significant enough to discriminate the samples, as shown in Figure 7b, but the 4th PSD value did not exhibit a meaningful difference in our experiments; therefore, we did not use the 4th PSD value as an indicator.
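A minimal sketch of ours of this band-limited PSD computation, using a plain periodogram on a synthetic signal (the 500 Hz sampling rate is taken from Section 3.1; the signal itself and the use of a periodogram are our assumptions):

```python
# Band-limited PSD areas of a dynamic-contact signal via FFT-based periodogram.
import numpy as np
from scipy.signal import periodogram

def band_psd_areas(signal, fs=500.0, bands=((0, 1), (1, 10), (10, 20))):
    """Return the PSD area in each frequency band (1st, 2nd, and 3rd PSD indicators)."""
    f, pxx = periodogram(signal, fs=fs)
    df = f[1] - f[0]  # periodogram bins are evenly spaced
    return [pxx[(f >= lo) & (f < hi)].sum() * df for lo, hi in bands]

# Synthetic 10 s recording at 500 Hz with a 5 Hz texture-induced vibration component.
t = np.arange(0, 10, 1 / 500.0)
x = 0.1 * np.sin(2 * np.pi * 5 * t) + 0.01 * np.random.randn(t.size)
print(band_psd_areas(x))  # the 1-10 Hz band dominates for this synthetic signal
```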
Thermal conductivity ($\vartheta$) is an inherent property of an object and is important in discriminating between the textures of objects. The built-in heater was used to maintain the temperature of the module above the ambient temperature (at approximately 30 °C). As the module maintained contact (static contact) for a certain period of time (15 s in our experiments) with the texture sample, which had reached equilibrium with the ambient temperature (20~23 °C), heat began to flow between the module and the sample. The resulting change in the temperature of the module could be used to calculate a value representative of the thermal conductivity of the sample. However, since the degree of temperature change may vary depending on the ambient temperature, the measurements were normalized by the difference between the initial temperatures of the module and the sample. The temperature of the module was calculated using a linear equation obtained by linearly fitting the output of the temperature sensor. The equation used to estimate the thermal conductivity of the sample can be expressed as follows:
$$\vartheta = \frac{T_c - T_i}{T_i - T_a}$$
where $T_c$ is the module temperature 15 s after contact with the sample, $T_i$ is the initial module temperature, and $T_a$ is the ambient temperature. According to the thermal conductivity values of the 16 samples shown in Figure 7c, which were obtained using the aforementioned equation, Samples No. 11 (Styrofoam) and No. 13 (soft sponge) had high values compared to the other samples. This is due to the insulating effect of these materials, a consequence of their low thermal conductivities. Conversely, Samples No. 6 (copper) and No. 9 (aluminum) had low values due to their high thermal conductivities, which indicates that the module temperature decreased rapidly as the finger came into contact with these samples.
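A one-line sketch of ours of the indicator above, with hypothetical temperatures:

```python
# Sketch of the thermal-conductivity indicator, normalized by the initial
# module-ambient temperature difference. The temperatures below are made up.

def thermal_indicator(t_contact: float, t_initial: float, t_ambient: float) -> float:
    """vartheta = (T_c - T_i) / (T_i - T_a)."""
    return (t_contact - t_initial) / (t_initial - t_ambient)

# Example: the module starts at 30 °C in a 22 °C room and cools to 28.5 °C after
# 15 s on a sample -> vartheta = -0.1875 (more negative for better heat conductors).
print(thermal_indicator(t_contact=28.5, t_initial=30.0, t_ambient=22.0))
```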
The texture discrimination experiment was performed with the training database constructed using the aforementioned equations. First, an arbitrary sample was measured according to the process flow chart to obtain a value ($x_j$) for each indicator ($j$). Next, we checked whether $x_j$ fell within the range $m_j \pm \sigma_j$ of each of the 16 samples for each indicator in order to calculate a weight value ($w_j$). The method of calculating $w_j$ is as follows. If $x_j$ is within the range of a specific sample for the corresponding indicator ($m_j - \sigma_j \le x_j \le m_j + \sigma_j$), $w_j$ is set to zero. Otherwise ($x_j < m_j - \sigma_j$ or $x_j > m_j + \sigma_j$), $w_j$ is obtained by ranking, in ascending order, the differences between the measured value and the average values of the samples, i.e., $\operatorname{arg\,rank}(|m_j - x_j|)$; the maximum value is therefore 16. In this manner, the sum of the weight values ($\sum w_j$) is obtained by adding the calculated $w_j$ values of each sample over the six indicators. Using this result, the following equation is used to calculate the degree of similarity ($Y_k$) with each sample:
$$Y_k = \frac{(p \times q) - \sum w_j}{p \times q} \times 100\ (\%)$$
where $k$ is the tactile sample number ranging from 1 to 16, $p$ is the number of tactile samples accumulated in the database, and $q$ is the number of indicators. When the texture discrimination experiment was performed with this method on Sample No. 2, for example, the probability of Sample No. 2 was the highest (77.15%), as shown in Figure 8. An accuracy of 98% was obtained after repeating the texture discrimination experiment 50 times.
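The weighting and similarity scheme can be sketched as follows; this is our reading of the text rather than the authors' implementation, and the two-sample example database is made up.

```python
# Sketch of the weight/similarity scheme: `database` maps sample number ->
# per-indicator (mean, std); `x` holds the measured indicator values of the
# unknown sample. Weight 0 if x falls within mean +/- std, otherwise the
# ascending rank of |mean - x| among all samples; Y_k = (p*q - sum w) / (p*q) * 100.

def similarity_scores(database: dict[int, dict[str, tuple[float, float]]],
                      x: dict[str, float]) -> dict[int, float]:
    indicators = list(x.keys())
    p, q = len(database), len(indicators)
    weights = {k: 0 for k in database}
    for j in indicators:
        # Rank samples by how far their mean lies from the measured value.
        ranked = sorted(database, key=lambda k: abs(database[k][j][0] - x[j]))
        for rank, k in enumerate(ranked, start=1):
            m, s = database[k][j]
            weights[k] += 0 if (m - s) <= x[j] <= (m + s) else rank
    return {k: (p * q - w) / (p * q) * 100.0 for k, w in weights.items()}

# Tiny hypothetical example with 2 samples and 2 indicators.
db = {1: {"h": (1.0, 0.1), "mu": (0.3, 0.05)},
      2: {"h": (0.2, 0.05), "mu": (0.8, 0.1)}}
print(similarity_scores(db, {"h": 0.95, "mu": 0.32}))  # sample 1 scores highest
```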

5. Conclusions

Our research team developed a robot fingertip module based on a multimodal tactile sensor capable of detecting various tactile modalities to realize a robot with human tactile sensing capabilities. The robot fingertip module was mounted on a texture discrimination platform based on a 2-axis motorized stage to simulate static and dynamic contact on 16 texture samples. The experiments allowed the module to measure four tactile modalities (hardness, friction coefficient, roughness, and thermal conductivity) and discriminate arbitrary samples following training. This allowed us to take a step closer to implementing robots with the ability to quantify textures of objects. This research can contribute to the advancement of various industries related to objects that come into contact with our skin, such as textiles, automobile seats, and cosmetics. In the future, we would like to develop an array-type sensor capable of detecting a greater variety of tactile modalities (e.g., depth, humidity) to more accurately discriminate textures of objects. Furthermore, we are planning to discriminate the textures of a wider array of objects by integrating artificial intelligence [36] with the sensor.

Author Contributions

Conceptualization, B.-G.B. and J.-S.J.; Fabrication, B.-G.B. and J.-S.J.; validation, B.-G.B. and J.-S.J.; formal analysis, B.-G.B. and J.-S.J.; data curation, B.-G.B. and J.-S.J.; writing—original draft preparation, B.-G.B.; writing—review and editing, B.-G.B. and J.-S.J.; supervision, M.-S.K.; All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Technology Innovation Program (20001856, Development of robotic work control technology capable of grasping and manipulating various objects in everyday life environment based on multimodal recognition and using tools) and (10077620, Development of artificial electronic skin that mimics human skin structure and functions for tactile and kinesthetic feedback in robotic surgery or prosthetic arms) funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Available online: https://www.zins-ziegler-instruments.com/en/product/touchfeel-box/ (accessed on 28 April 2021).
  2. Dahiya, R.S.; Metta, G.; Valle, M.; Sandini, G. Tactile Sensing—From Humans to Humanoids. IEEE Trans. Robot. 2010, 26, 1–20. [Google Scholar] [CrossRef]
  3. Turvey, M.; Carello, C. Dynamic Touch. In Perception of Space and Motion; Elsevier: Amsterdam, The Netherlands, 1995; pp. 401–490. [Google Scholar]
  4. Lederman, S.J.; Klatzky, R.L. Haptic perception: A tutorial. Atten. Percept. Psychophys. 2009, 71, 1439–1459. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  5. Pont, S.C.; Kappers, A.M.L.; Koenderink, J.J. Similar mechanisms underlie curvature comparison by static and dynamic touch. Percept. Psychophys. 1999, 61, 874–894. [Google Scholar] [CrossRef] [Green Version]
  6. Chathuranga, D.S.; Hirai, S. Investigation of a biomimetic fingertip’s ability to discriminate fabrics based on surface textures. In Proceedings of the 2013 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Wollongong, Australia, 9–12 July 2013. [Google Scholar]
  7. Chathuranga, K.; Hirai, S. A bio-mimetic fingertip that detects force and vibration modalities and its application to surface identification. In Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), Guangzhou, China, 11–14 December 2012. [Google Scholar]
  8. De Oliveira, T.E.A.; Cretu, A.-M.; Petriu, E.M. Multimodal Bio-Inspired Tactile Sensing Module. IEEE Sens. J. 2017, 17, 3231–3243. [Google Scholar] [CrossRef]
  9. De Oliveira, T.E.A.; Cretu, A.-M.; Petriu, E.M.; De Oliveira, T.A. Multimodal Bio-Inspired Tactile Sensing Module for Surface Characterization. Sensors 2017, 17, 1187. [Google Scholar] [CrossRef] [Green Version]
  10. Li, R.; Adelson, E.H. Sensing and Recognizing Surface Textures Using a GelSight Sensor. In Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition, Portland, OR, USA, 23–28 June 2013. [Google Scholar]
  11. Yuan, W.; Li, R.; Srinivasan, M.A.; Adelson, E.H. Measurement of shear and slip with a GelSight tactile sensor. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 25–30 May 2015. [Google Scholar]
  12. Yuan, W.; Zhu, C.; Owens, A.; Srinivasan, M.A.; Adelson, E.H. Shape-independent hardness estimation using deep learning and a GelSight tactile sensor. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Marina Bay Sands, Singapore, 29 May–3 June 2017. [Google Scholar]
  13. Wettels, N.; Santos, V.J.; Johansson, R.S.; Loeb, G. Biomimetic Tactile Sensor Array. Adv. Robot. 2008, 22, 829–849. [Google Scholar] [CrossRef] [Green Version]
  14. Fishel, J.A.; Loeb, G.E. Bayesian Exploration for Intelligent Identification of Textures. Front. Neurorobotics 2012, 6, 4. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Xu, D.; Loeb, G.; Fishel, J.A. Tactile identification of objects using Bayesian exploration. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013. [Google Scholar]
  16. Kim, M.-S.; Shin, H.-J.; Park, Y.-K. Design concept of high-performance flexible tactile sensors with a robust structure. Int. J. Precis. Eng. Manuf. 2012, 13, 1941–1947. [Google Scholar] [CrossRef]
  17. Jang, J.-S.; Kang, T.-H.; Song, H.-W.; Park, Y.-K.; Kim, M.-S. High-Performance Multimodal Flexible Tactile Sensor Capable of Measuring Pressure and Temperature Simultaneously. J. Korean Soc. Precis. Eng. 2014, 31, 683–688. [Google Scholar] [CrossRef]
  18. Park, M.; Kim, M.-S.; Park, Y.-K.; Ahn, J.-H. Si membrane based tactile sensor with active matrix circuitry for artificial skin applications. Appl. Phys. Lett. 2015, 106, 043502. [Google Scholar] [CrossRef]
  19. Smith, C.S. Piezoresistance Effect in Germanium and Silicon. Phys. Rev. 1954, 94, 42–49. [Google Scholar] [CrossRef]
  20. Kanda, Y. Piezoresistance effect of silicon. Sens. Actuators A Phys. 1991, 28, 83–91. [Google Scholar] [CrossRef]
  21. Yang, S.; Lu, N. Gauge Factor and Stretchability of Silicon-on-Polymer Strain Gauges. Sensors 2013, 13, 8577–8594. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  22. Lee, Y.; Bae, S.; Jang, H.; Jang, S.; Zhu, S.-E.; Sim, S.H.; Song, Y.I.; Hong, B.H.; Ahn, J.-H. Wafer-Scale Synthesis and Transfer of Graphene Films. Nano Lett. 2010, 10, 490–493. [Google Scholar] [CrossRef] [Green Version]
  23. Fan, J.A.; Yeo, W.-H.; Su, Y.; Hattori, Y.; Lee, W.; Jung, S.-Y.; Zhang, Y.; Liu, Z.; Cheng, H.; Falgout, L.; et al. Fractal design concepts for stretchable electronics. Nat. Commun. 2014, 5, 3266. [Google Scholar] [CrossRef] [Green Version]
  24. De Paoli, F. Measuring Polydimethylsiloxane (PDMS) Mechanical Properties Using Flat Punch Nanoindentation Focusing on Obtaining Full Contact. Master’s Thesis, University of South Florida, Tampa, FL, USA, 2015. [Google Scholar]
  25. Hwang, E.-S.; Seo, J.-H.; Kim, Y.-J. A Polymer-Based Flexible Tactile Sensor for Both Normal and Shear Load Detections and Its Application for Robotics. J. Microelectromech. Syst. 2007, 16, 556–563. [Google Scholar] [CrossRef]
  26. Beccai, L.; Roccella, S.; Arena, A.; Valvo, F.; Valdastri, P.; Menciassi, A.; Carrozza, M.C.; Dario, P. Design and fabrication of a hybrid silicon three-axial force sensor for biomechanical applications. Sens. Actuators A Phys. 2005, 120, 370–382. [Google Scholar] [CrossRef]
  27. Wang, Z.; Volinsky, A.A.; Gallant, N.D. Crosslinking effect on polydimethylsiloxane elastic modulus measured by custom-built compression instrument. J. Appl. Polym. Sci. 2014, 131. [Google Scholar] [CrossRef] [Green Version]
  28. Oprişan, C.; Cârlescu, V.; Barnea, A.; Prisacaru, G.; Olaru, D.N.; Plesu, G. Experimental determination of the Young’s modulus for the fingers with application in prehension systems for small cylindrical objects. IOP Conf. Ser. Mater. Sci. Eng. 2016, 147, 012058. [Google Scholar] [CrossRef] [Green Version]
  29. Park, Y.-L.; Majidi, C.; Kramer, R.K.; Bérard, P.; Wood, R.J. Hyperelastic pressure sensing with a liquid-embedded elastomer. J. Micromech. Microeng. 2010, 20. [Google Scholar] [CrossRef] [Green Version]
  30. Scheibert, J.; Leurent, S.; Prevost, A.; Debregeas, G. The Role of Fingerprints in the Coding of Tactile Information Probed with a Biomimetic Sensor. Science 2009, 323, 1503–1506. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  31. Duffy, D.C.; McDonald, J.C.; Schueller, O.J.; Whitesides, G.M. Rapid Prototyping of Microfluidic Systems in Poly(dimethylsiloxane). Anal. Chem. 1998, 70, 4974–4984. [Google Scholar] [CrossRef] [PubMed]
  32. Bhattacharya, S.; Datta, A.; Berg, J.M.; Gangopadhyay, S. Studies on surface wettability of poly(dimethyl) siloxane (PDMS) and glass under oxygen-plasma treatment and correlation with bond strength. J. Microelectromech. Syst. 2005, 14, 590–597. [Google Scholar] [CrossRef]
  33. Lee, H.-K.; Chung, J.; Chang, S.-I.; Yoon, E. Real-time measurement of the three-axis contact force distribution using a flexible capacitive polymer tactile sensor. J. Micromech. Microeng. 2011, 21. [Google Scholar] [CrossRef] [Green Version]
  34. Akbari, H.; Kazerooni, A. Improving the coupling errors of a Maltese cross-beams type six-axis force/moment sensor using numerical shape-optimization technique. Measurement 2018, 126, 342–355. [Google Scholar] [CrossRef]
  35. Available online: https://en.wikipedia.org/wiki/Electrical_resistivity_and_conductivity#cite_note-serway-27 (accessed on 28 April 2021).
  36. Zou, L.; Ge, C.; Wang, Z.J.; Cretu, E.; Li, X. Novel Tactile Sensor Technology and Smart Tactile Sensing Systems: A Review. Sensors 2017, 17, 2653. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Conceptual diagram of the robot fingertip module with multimodal tactile sensing capability. (a) Optical image of the sensing layer of the multimodal tactile sensor (without the bump). (b) Cross-sectional view of the multimodal tactile sensor and its dimensions. (c) 3D exploded view of the developed robot fingertip module with fingerprinted silicone skin.
Figure 2. Photograph of the experiment in which a hard sponge sample (#12) is measured with the motorized texture discrimination platform equipped with the robot fingertip module.
Figure 3. Schematic diagram of the motorized texture discrimination platform with the sensor circuit board for processing the multimodal tactile sensor signals derived from static and dynamic contact. (Inset) Block diagram of the sensor circuit board.
Figure 4. The flow chart for operating the motorized texture discrimination platform.
Figure 5. Photograph of the 16 texture samples used in the texture identification experiment.
Figure 6. Multimodal sensing characteristics of the tactile sensor: (a) Output data showing the hysteresis characteristic and sensing range. (Inset) Sensor resolution data for 0.5 gf step forces. (b) One thousand repeated measurements at the maximum normal force (100 gf). (c) 3-axis force data for the shear force vector generated by a 20 μm step motion in the Y-axis direction. (d) Temperature sensor characteristics over the range from room temperature (23 °C) to human body temperature (35 °C).
Figure 7. Mean and standard deviation data derived from 10 repeated measurements of 16 tactile samples for each indicator. (a) Hardness & Friction coefficient. (b) Roughness. (c) Thermal conductivity.
Figure 8. An image showing the result of a tactile identification experiment with a fine fabric (# 2) sample based on the database built in the training mode.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
