URCap: Detections list

The robobrain® returns not only the preferred pick point for an object but, depending on the skill, many additional detection values that can be used in your robot program. While the pick point is returned automatically when you use the "robobrain Pick Pose" node, the other values need to be accessed with dedicated robominds functions. The following picture shows an overview of those functions as depicted on the UR Teach Pendant when using the "Assignment" node.

[Image: robominds functions to access all the additional detections]

The specific values and respective data types returned by each skill can be found on the robobrain.skills support page of the robobrain.vision manual.

Depending on the detection value you are interested in, you have to choose the appropriate robominds function. If you are using the Smart Parallel Picking Skill and want to get the grasping width for the gripper, you need to use the rb_get_attr_float() function:

grasp_width := rb_get_attr_float("detections[0].width")

Alternatively, you can use the dedicated function that returns this value for the first pick candidate directly:

grasp_width := rb_get_pick_width()
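
As a rough sketch of how this value might be used in a program, the width could be checked against the gripper's opening range before attempting the pick. The snippet below only relies on rb_get_pick_width() from above and the standard URScript textmsg() function; the limit max_width is a hypothetical value, and the unit of the returned width should be verified on the robobrain.skills page.

max_width = 0.085  # hypothetical gripper limit, same unit as the returned grasp width
grasp_width = rb_get_pick_width()
if grasp_width > max_width:
  # Object is wider than the gripper can open - skip this pick candidate
  textmsg("Grasp width too large: ", grasp_width)
else:
  # Proceed with the pick, e.g. pre-position the gripper to grasp_width
  textmsg("Grasp width OK: ", grasp_width)
end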

Other examples with the Smart Item Detection Skill:

Get the dimensions of the first / best pick point along the x- and y-axes:

dim_x := rb_get_attr_float("detections[0].dimensions[0]")
dim_y := rb_get_attr_float("detections[0].dimensions[1]")
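
A common use of these two values, shown here only as an illustrative sketch, is to pick along the shorter side of the object with a parallel gripper. This selection logic is not part of the URCap; it only uses the functions introduced above and standard URScript.

dim_x = rb_get_attr_float("detections[0].dimensions[0]")
dim_y = rb_get_attr_float("detections[0].dimensions[1]")
# Choose the shorter side as the grasping dimension (hypothetical selection logic)
if dim_x < dim_y:
  grasp_dim = dim_x
else:
  grasp_dim = dim_y
end
textmsg("Grasping dimension: ", grasp_dim)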

Get the total number of detected pick points:

n_dets := rb_get_attr_int("detections.length()")
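
The count can be combined with rb_get_attr_float() to iterate over every detection, for example to log the width of each candidate. The following sketch assumes the attribute path format "detections[i].width" from the Smart Parallel Picking examples above and uses the URScript string functions str_cat() and to_str(), which require a PolyScope version with string support.

n_dets = rb_get_attr_int("detections.length()")
i = 0
while i < n_dets:
  # Build the attribute path for detection i, e.g. "detections[2].width"
  path = str_cat(str_cat("detections[", to_str(i)), "].width")
  w = rb_get_attr_float(path)
  textmsg("Detection index: ", i)
  textmsg("  width: ", w)
  i = i + 1
end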