# Smart Parallel Picking Skill

## Overview <a href="#user-content-overview" id="user-content-overview"></a>

A parallel grip is a common type of mechanical gripping in which the largely linear stroke of the gripper jaws clamps an object so that it can be gripped and lifted.

The possible gripping axes of the gripper jaws are represented in the GUI by lines on the objects. Three color codes are used: blue marks the current gripping candidate, yellow marks other possible gripping points, and red indicates rejected gripping axes. Depending on the settings, more or fewer probable gripping options are shown. Limiting the number of possible grasp points allows an individually adjustable balance between quality and speed.

Like the "Smart Vacuum Picking Skill", this skill is well suited to separation tasks. Once it has identified the best picking candidate among unknown parts in the source box, it can be used in many ways. Classically, it is used in logistics, where the variety of objects is often not known in advance. It solves kitting problems in the automotive industry as well as separation tasks in warehouse processes.

<table data-card-size="large" data-view="cards"><thead><tr><th></th><th data-hidden data-card-cover data-type="files"></th></tr></thead><tbody><tr><td>Boxes</td><td><a href="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2F2tjztMdjdtUjhLkFZhO7%2FSmart%20Parallel_Boxes.png?alt=media&#x26;token=cad0375a-25bf-4c64-8860-c2fc25196f1a">Smart Parallel_Boxes.png</a></td></tr><tr><td>Cylinders</td><td><a href="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2FT7Mz1earUfQPGfZfKWym%2FSmart%20Parallel_Cylindrical.png?alt=media&#x26;token=9ff6188a-638f-46a1-81f8-e96589db7648">Smart Parallel_Cylindrical.png</a></td></tr><tr><td>M10 nuts</td><td><a href="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2FZK7Cg0uMncR26kDBSnpb%2FSmart%20Parallel_M10.png?alt=media&#x26;token=eb0eb822-043f-464e-a0af-2ba2418910af">Smart Parallel_M10.png</a></td></tr><tr><td>Random Items</td><td><a href="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2FIOk6OnzXkrU7WrJm2Qiy%2FSmart%20Parallel_Random.png?alt=media&#x26;token=d0849dcd-ecab-43c9-a92f-1e182c22f4b5">Smart Parallel_Random.png</a></td></tr></tbody></table>

## Skill Result Information

<table data-header-hidden><thead><tr><th width="187"></th><th></th></tr></thead><tbody><tr><td><em><mark style="color:blue;"><strong>Position</strong></mark></em></td><td>Position of the grasp point relative to the robot’s coordinate system <em>(in m)</em></td></tr><tr><td><mark style="color:blue;"><strong>Orientation</strong></mark></td><td>Rotation of the grasp point relative to the robot’s coordinate system <em>(in rad)</em></td></tr><tr><td><em><mark style="color:blue;"><strong>Width</strong></mark></em></td><td>Determined grip width <em>(in m)</em></td></tr></tbody></table>
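The result fields above could be represented in client code with a small container like the following. Note that the class and field names are illustrative, not part of the skill's official API; only the fields themselves (position in meters, orientation in radians, width in meters) come from the table.

```python
from dataclasses import dataclass

@dataclass
class ParallelGraspResult:
    """Illustrative container for the skill result (names are not official API)."""
    position: tuple      # grasp point (x, y, z) in meters, robot coordinate system
    orientation: tuple   # rotation (rx, ry, rz) in radians, robot coordinate system
    width: float         # determined grip width in meters

# Example values in the style of the detection example further below
result = ParallelGraspResult(
    position=(-0.038, -0.346, 0.363),
    orientation=(0.478, 0.287, -1.035),
    width=0.0296,
)
```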

## Specifications&#x20;

<table data-view="cards"><thead><tr><th align="center"></th><th></th><th></th><th></th><th></th><th data-type="files"></th></tr></thead><tbody><tr><td align="center"><mark style="color:blue;"><strong>Conditions</strong></mark></td><td><p><strong>Camera mount:</strong> </p><ul><li>Dynamic</li><li>Static</li></ul></td><td><p><strong>Camera distance:</strong> </p><p>35 – 45 cm</p></td><td><p></p><p><strong>Parts dimensions:</strong> </p><p>2 – 6 cm</p></td><td><p></p><p><strong>Parts material:</strong> </p><p>matt - low reflection</p></td><td></td></tr><tr><td align="center"><mark style="color:blue;"><strong>Specs</strong></mark></td><td><p><strong>Avg. recognition time:</strong></p><p>&#x3C; 0.5 seconds</p></td><td><p></p><p><strong>Supported grippers:</strong></p><ul><li>Two-finger-gripper</li><li>Robotiq 2F-85</li><li>Zimmer GEH6040IL</li><li>OnRobot</li></ul><p></p></td><td></td><td></td><td></td></tr><tr><td align="center"><mark style="color:blue;"><strong>Features</strong></mark></td><td><ul><li>Collision detection</li><li>Generalized to a variety of object types</li><li>Picking from bulk</li></ul></td><td></td><td></td><td></td><td></td></tr></tbody></table>

## Parameter Example&#x20;

To ensure accurate identification of various types of objects, the skill parameters can be easily adjusted to fit your specific needs. Here are some recommendations to help you find the perfect parameters for your application. For your convenience, we've only included descriptions of parameters that differ from the default settings.

{% tabs %}
{% tab title="Box" %}

<figure><img src="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2F2tjztMdjdtUjhLkFZhO7%2FSmart%20Parallel_Boxes.png?alt=media&#x26;token=cad0375a-25bf-4c64-8860-c2fc25196f1a" alt=""><figcaption><p>Smart Parallel Picking Skill for boxes</p></figcaption></figure>

* Used objects: small cardboard boxes (70x50x50mm), matt surface
* Camera distance: 550mm
* Camera mount: 30° angled
* Skill parameters:
  * Min score: 0.5
  * other parameters: default values

{% endtab %}

{% tab title="Cylinders" %}

<figure><img src="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2FT7Mz1earUfQPGfZfKWym%2FSmart%20Parallel_Cylindrical.png?alt=media&#x26;token=9ff6188a-638f-46a1-81f8-e96589db7648" alt=""><figcaption><p>Smart Parallel Picking Skill for cylindrical objects</p></figcaption></figure>

* Used objects: Coffee cups with varying diameters and surface finishes
* Camera distance: 0.65 m
* Camera mount: 30° angled
* Skill parameters:
  * Edge sensitivity: 0.85
  * Min score: 0.65
  * other parameters: default values

{% endtab %}

{% tab title="M10 nuts" %}

<figure><img src="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2FZK7Cg0uMncR26kDBSnpb%2FSmart%20Parallel_M10.png?alt=media&#x26;token=eb0eb822-043f-464e-a0af-2ba2418910af" alt=""><figcaption><p>Smart Parallel Picking Skill for small metal parts</p></figcaption></figure>

* Used objects: M10 metal nuts
* Camera distance: 320mm
* Camera mount: 30° angled
* Skill parameters:
  * Edge sensitivity: 0.85
  * Min score: 0.3
  * Max grasping width: 0.02 (Ensures that, if two objects are close together, the robobrain doesn't mistake them for one piece. Adjust this value to the outside dimension of your specific part.)

{% endtab %}

{% tab title="Random" %}

<figure><img src="https://1459495663-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FzRF3SV87vu3nkNgfjyt7%2Fuploads%2FIOk6OnzXkrU7WrJm2Qiy%2FSmart%20Parallel_Random.png?alt=media&#x26;token=d0849dcd-ecab-43c9-a92f-1e182c22f4b5" alt=""><figcaption><p>Smart Parallel Picking Skill for random objects</p></figcaption></figure>

* Used objects: variety of products of different sizes and shapes, shiny and matt surfaces
* Camera distance: 550 mm
* Camera mount: 30° angled
* Skill parameters:
  * Edge sensitivity: 0.85
  * Min score: 0.65

{% endtab %}
{% endtabs %}
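The non-default settings from the examples above can be collected as plain parameter sets, keyed by the technical parameter names described in the next section. This is a minimal sketch of how such presets might be organized in client code, not an official configuration format:

```python
# Illustrative presets for the example applications above.
# Each entry lists only the parameters that differ from the defaults.
PRESETS = {
    "box":      {"min_score": 0.5},
    "cylinder": {"edge_sensitivity": 0.85, "min_score": 0.65},
    "m10_nut":  {"edge_sensitivity": 0.85, "min_score": 0.3,
                 "max_grasp_width": 0.02},  # outside dimension of the part, in m
    "random":   {"edge_sensitivity": 0.85, "min_score": 0.65},
}

def preset_for(application: str) -> dict:
    """Return the non-default skill parameters for a given example application."""
    return PRESETS[application]
```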

## Technical Parameter Description <a href="#user-content-parameters" id="user-content-parameters"></a>

### Parameter

{% tabs %}
{% tab title="Basic" %}

<table data-view="cards"><thead><tr><th>Name</th><th>Parameter</th><th>Type</th><th>Description </th></tr></thead><tbody><tr><td><mark style="color:blue;"><strong>Edge sensitivity</strong></mark></td><td><code>edge_sensitivity</code></td><td><code>float</code></td><td>Determines the sensitivity of the edge-detection algorithm. A higher sensitivity can help to detect small steps in depth values. However, it also leads to more false edge detections and a longer runtime.</td></tr><tr><td><mark style="color:blue;"><strong>Min score</strong></mark></td><td><code>min_score</code></td><td><code>float</code></td><td>Minimum quality score for grasp candidates. The quality score is estimated by a neural network.</td></tr><tr><td><mark style="color:blue;"><strong>Max candidates</strong></mark></td><td><code>max_candidates</code></td><td><code>int</code></td><td>The maximum number of grasp candidates that are evaluated by the neural network. While larger numbers increase the runtime, they can also improve the robustness of the skill.</td></tr><tr><td><mark style="color:blue;"><strong>Friction coefficient</strong></mark></td><td><code>friction_coefficient</code></td><td><code>float</code></td><td>The friction coefficient between the gripper fingers and items. The default value of 0.2 fits most materials.</td></tr><tr><td><mark style="color:blue;"><strong>Min grasp distance</strong></mark></td><td><code>min_grasp_distance</code></td><td><code>float</code></td><td>The minimum distance between two grasp candidates. This parameter limits the number of very similar candidates and thus increases the robustness and reduces the runtime of the skill.</td></tr><tr><td><mark style="color:blue;"><strong>Grasp offset</strong></mark></td><td><code>grasp_offset</code></td><td><code>float</code></td><td>An offset for the grasp width. This parameter increases the predicted grasp width to avoid collisions between the gripper and the item to be picked.</td></tr></tbody></table>
{% endtab %}

{% tab title="Gripper settings" %}

<table data-view="cards"><thead><tr><th>Name</th><th>Parameter</th><th>Type</th><th>Description </th></tr></thead><tbody><tr><td><mark style="color:blue;"><strong>Min grasping width</strong></mark></td><td><code>min_grasp_width</code></td><td><code>float</code></td><td>The minimum allowed width for a grasp candidate. This value should be adapted according to the item geometry and gripper restrictions.</td></tr><tr><td><mark style="color:blue;"><strong>Max grasping width</strong></mark></td><td><code>max_grasp_width</code></td><td><code>float</code></td><td>The maximum allowed width for a grasp candidate. This value should be adapted according to the item geometry and gripper restrictions.</td></tr><tr><td><mark style="color:blue;"><strong>Finger dimension along grasp axis</strong></mark></td><td><code>finger_on_axis</code></td><td><code>float</code></td><td>The finger dimension along the grasp axis. This value should be adapted to the geometry of the gripper.</td></tr><tr><td><mark style="color:blue;"><strong>Finger dimension perpendicular to grasp axis</strong></mark></td><td><code>finger_off_axis</code></td><td><code>float</code></td><td>The finger dimension perpendicular to the grasp axis. This value should be adapted to the geometry of the gripper.</td></tr></tbody></table>
{% endtab %}

{% tab title="Expert" %}

<table data-view="cards"><thead><tr><th>Name</th><th>Parameter</th><th>Type</th><th>Description </th></tr></thead><tbody><tr><td><mark style="color:blue;"><strong>Enforce GPU usage</strong></mark></td><td><code>force_gpu</code></td><td><code>boolean</code></td><td>Enforces GPU usage for this skill. Enforcing GPU usage for one skill can disable GPU usage for other skills.</td></tr></tbody></table>
{% endtab %}
{% endtabs %}
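One plausible way to read the interaction of `grasp_offset`, `min_grasp_width`, and `max_grasp_width` is sketched below. This is an interpretation for illustration only; the skill's actual internal handling of these parameters may differ.

```python
def effective_grasp_width(predicted_width, grasp_offset,
                          min_grasp_width, max_grasp_width):
    """Widen the predicted grasp by the offset, then check gripper limits.

    Returns the widened grasp width, or None when it falls outside the
    allowed range (i.e. the candidate would be rejected). Sketch only.
    """
    width = predicted_width + grasp_offset  # widen to avoid finger collisions
    if width < min_grasp_width or width > max_grasp_width:
        return None
    return width
```

For example, a predicted width of 0.025 m with a 0.005 m offset yields an effective width of 0.03 m, provided that lies within the gripper's allowed range.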

### Detections <a href="#user-content-detections" id="user-content-detections"></a>

{% tabs %}
{% tab title="Detections Data" %}

<table data-card-size="large" data-view="cards"><thead><tr><th>Type</th><th>Description </th></tr></thead><tbody><tr><td><code>pose (Transformation)</code></td><td>The pick pose for the parallel gripper</td></tr><tr><td><code>width (float)</code></td><td>Grasp width in meters</td></tr><tr><td><code>quality (float)</code></td><td>Grasp quality as estimated by the neural network.</td></tr></tbody></table>
{% endtab %}

{% tab title="Detection Example" %}

```json
{
  "detections":[
    {
      "pose":{
        "x":-0.03820001546400386,
        "y":-0.34587006375071555,
        "z":0.3627129340834892,
        "rx":0.4781488000765904,
        "ry":0.2873872297184221,
        "rz":-1.034786806850317
      },
      "width":0.029587643355909903,
      "quality":0.954622745513916,
      "optimization_angle":3.141592653589793
    }
  ]
}
```

{% endtab %}
{% endtabs %}
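A detection payload like the example above could be consumed, for instance, by selecting the highest-quality candidate above a score threshold. This is a minimal sketch using only the documented fields; how detections actually reach your application depends on your integration.

```python
import json

def best_detection(payload, min_quality=0.5):
    """Return the highest-quality detection above the threshold, or None."""
    detections = json.loads(payload)["detections"]
    candidates = [d for d in detections if d["quality"] >= min_quality]
    return max(candidates, key=lambda d: d["quality"], default=None)

# Abbreviated payload in the style of the detection example above
payload = '''{"detections": [
  {"pose": {"x": -0.038, "y": -0.346, "z": 0.363,
            "rx": 0.478, "ry": 0.287, "rz": -1.035},
   "width": 0.0296, "quality": 0.955}
]}'''

best = best_detection(payload)  # the single candidate, quality 0.955
```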
