__NOTOC__
<div class="title-block">

<span style="font-size:120%;">'''This page serves as a lookup reference for all the hardware and functionality of the SDK. The main interface of the SDK is via ROS Topics and Services, which you will find listed and described below along with other core information needed to interface with the robot.'''</span>

</div>

{{TOClimit|limit=2}}

<div class="content-block">

= ROS Topic API Reference =


==Python Code API==
: For the [[Robot Interface|Robot Interface]] Python classes (built on top of the ROS API), please see the [https://rethinkrobotics.github.io/intera_sdk_docs/index.html Code API Reference] page.

{| class="wikitable"
|-
! scope="col"| Robot
! scope="col"| Movement
! scope="col"| Sensors+
! scope="col"| I/O
|- style="vertical-align:top;"
|
* [[#Enable Robot|Enabling the Robot]]
* [[#Robot Description (URDF)|Robot Description (URDF)]]
||
* [[#Joints|Joints]]
** [[#Arm Joints|Arm Joints]]
** [[#Head Joints|Head Joints]]
* [[#Cartesian Endpoint|Cartesian Endpoint]]
** [[#Endpoint State|Endpoint State]]
** [[#Kinematics Solver Service|Kinematics Solver Service]]
* [[#Gripper (End-Effector)|Gripper (End-Effector)]]
||
* [[#Accelerometer|Accelerometer]]
* [[#Cameras|Cameras]]
* [[#Head Display Screen|Head Display Screen]]
|
* [[#Navigators|Navigators]]
* [[#Cuff Buttons|Cuff Buttons]]
* [[#Lights|LED Lights]]
|}

</div>

<div class="content-block">

= Robot =


== Enable Robot ==
Be sure that you 'Enable' the robot before attempting to control any of its motors. The easiest way to enable the robot is to use the <code>enable_robot.py</code> ROS executable, found here: [https://github.com/RethinkRobotics/intera_sdk/blob/master/intera_interface/scripts/enable_robot.py Enable Robot Script]

=== Robot State ===
<code>/robot/state</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/AssemblyState.html intera_core_msgs/AssemblyState])
Subscribe to the Robot State for the enabled and error state of the robot hardware itself. It also includes information on the E-Stop.
The robot must be enabled (<code>enabled: true</code>) in order to move it. Use the [https://github.com/RethinkRobotics/intera_sdk/blob/master/intera_interface/scripts/enable_robot.py Enable Robot Script], or the "Enable Robot" topic below, to enable the robot.
It is possible for the robot to have non-fatal errors, so <code>error</code> can be <code>true</code> while <code>enabled</code> is also <code>true</code>.
For more complete information on robot state, see [[E-STOP and Enable Robot]].
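
For a quick check from the command line, you can echo the state topic directly with the standard <code>rostopic</code> tool; the <code>enabled</code> and <code>error</code> fields come from the <code>AssemblyState</code> message above:
<source lang="bash">
    # Print the current robot state once; inspect the 'enabled' and 'error' fields
    $ rostopic echo -n 1 /robot/state
</source>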

=== Enable Robot ===
<code>/robot/set_super_enable</code> ([http://www.ros.org/doc/api/std_msgs/html/msg/Bool.html std_msgs/Bool])
<code>data</code>: <code>true</code> to enable the robot motors; <code>false</code> to disable them.
You can check the Robot State topic to see whether the robot enabled properly or whether it has an error.
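
For example, to enable the robot from the command line (a minimal sketch of what the Enable Robot Script does via this topic):
<source lang="bash">
    # Enable the robot motors
    $ rostopic pub -1 /robot/set_super_enable std_msgs/Bool true
    # Verify that 'enabled' is now true
    $ rostopic echo -n 1 /robot/state
</source>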

=== Reset Robot State ===
<code>/robot/set_super_reset</code> ([http://www.ros.org/doc/api/std_msgs/html/msg/Empty.html std_msgs/Empty])
Publish an Empty message to reset the state after an error.
A reset will clear all pre-existing errors and the enabled state (i.e. the robot will be disabled).
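
A command-line sketch of the reset, publishing a single empty message:
<source lang="bash">
    # Clear errors; the robot is left disabled and must be re-enabled afterwards
    $ rostopic pub -1 /robot/set_super_reset std_msgs/Empty "{}"
</source>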


== Robot Description (URDF) ==

Sawyer automatically builds an appropriate URDF (Unified Robot Description Format) on boot and loads it onto the ROS Parameter Server, under the ROS parameter name <code>/robot_description</code>. From here, it is accessible by rviz, tf and other ROS utilities that use the URDF.

The [http://wiki.ros.org/urdf Unified Robot Description Format (URDF)] is the standard ROS XML representation of the robot model (kinematics, dynamics, sensors) describing Sawyer.

Sawyer generates its URDF dynamically on robot startup. This model is updated when any gripper is attached or detached, when an object is 'grasped' or released and its mass is compensated for, and when new URDF segments are provided/commanded to the gripper plugins. As of SDK versions >= 1.0.0, Sawyer's internal robot model is loaded to the [http://wiki.ros.org/Parameter%20Server parameter server] under the parameter name <code>/robot_description</code>.

The default URDF for Sawyer is available in the [https://github.com/RethinkRobotics/intera_common intera_common] repository. The package <code>sawyer_description</code> contains the URDF and accompanying meshes.

=== Getting a Copy of the URDF from the parameter server ===

You can now get the current URDF describing '''your''' Sawyer.

From a [[SDK_Shell | properly initialized]] Sawyer environment, export the URDF from the <code>/robot_description</code> parameter on the ROS parameter server, where it is stored, to a file of your choice (ex: <code>sawyer_urdf.xml</code>):

<source lang="bash">
$ rosparam get -p /robot_description | tail -n +2 > sawyer_urdf.xml
</source>

The <code>-p</code> flag outputs the parameter using pretty print. The output URDF is piped through the <code>tail</code> command first to remove a dummy first line - an artifact of the pretty print.

'''Tip:''' You can check that you now have a proper URDF by running:
<source lang="bash">
$ rosrun urdfdom check_urdf sawyer_urdf.xml
</source>

=== Robot State Publisher ===

The URDF is used by Sawyer's [http://wiki.ros.org/urdf Robot State Publishers] to create a tree of [http://wiki.ros.org/tf transforms (tfs)]. In fact, Sawyer has two such publishers:
'''robot_ref_publisher''': publishes transforms that reflect the commanded robot state.
'''robot_state_publisher''': publishes transforms that reflect the measured state of the robot.

These robot publishers live internally on Sawyer and are accessible to the RSDK over ROS. The "ref" tfs are used by the robot internals, but you may find them useful to see where the robot will move at the next timestep. Otherwise, be sure to use the non-"ref" transforms if you're only interested in Sawyer's current state.
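
You can inspect the resulting transform tree from the command line with the standard <code>tf</code> tools. In the sketch below, <code>right_gripper</code> is the endpoint frame described later on this page, while <code>base</code> is an assumed name for the root frame - check the generated <code>frames.pdf</code> for the actual frame names on your robot:
<source lang="bash">
    # Generate a PDF (frames.pdf) of the full transform tree
    $ rosrun tf view_frames
    # Continuously print the measured transform from the (assumed) base frame to the endpoint frame
    $ rosrun tf tf_echo base right_gripper
</source>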

=== Getting a Copy of the URDF Dynamically ===
Sawyer generates the URDF dynamically on initialization, based on the attached arm.  In some cases, users may want to get the current URDF off the robot.
From a working Sawyer RSDK environment, export the URDF from the <code>/robot_description</code> parameter on the ROS parameter server, where it is stored, to a file of your choice (ex: <code>sawyer_urdf.xml</code>):

<source lang="bash">
$ rosparam get -p /robot_description | tail -n +2 > sawyer_urdf.xml
</source>

The <code>-p</code> flag outputs the parameter using pretty print.  The output URDF is piped through the <code>tail</code> command first to remove a dummy first line - an artifact of the pretty print.

'''Tip:''' You can check that you now have a proper URDF by running:

<source lang="bash">
$ rosrun urdf_parser check_urdf sawyer_urdf.xml
</source>

If this doesn't work, you can simply remove the <code>tail</code> command and use a text editor to manually remove the first few lines before the actual XML (all the lines before <code><?xml version="1.0" ?></code>):
<source lang="bash">
    $ rosparam get -p /robot_description > sawyer_urdf.xml
    $ head sawyer_urdf.xml
    | 
      <?xml version="1.0" ?> 
      <!-- =================================================================================== --> 
      <!-- |    This document was autogenerated by xacro from sawyerp2.urdf.xacro            | --> 
    ... 
    $ gedit sawyer_urdf.xml &
    $ head sawyer_urdf.xml
      <?xml version="1.0" ?> 
      <!-- =================================================================================== --> 
      <!-- |    This document was autogenerated by xacro from sawyerp2.urdf.xacro            | --> 
    ...
</source>
</div>


<div class="content-block">

= Movement =


== Joints ==
Sawyer has 7 joints (DoF) in its arm and one more joint in its head (side-to-side panning).  The head is controlled separately from the arm; however, you can read the current joint states (position, velocity, and effort) for all the joints on the arm and head by subscribing to one topic:
<code>/robot/joint_states</code> ([http://www.ros.org/doc/api/sensor_msgs/html/msg/JointState.html sensor_msgs-JointState])
where joint positions are in radians (rad), velocities are in rad/s, and efforts are in newton-metres (Nm).
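
For example, to print a single joint state message from the command line:
<source lang="bash">
    # Print one sensor_msgs/JointState message covering all arm and head joints
    $ rostopic echo -n 1 /robot/joint_states
</source>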

=== Arm Joints ===
The following page covers arm joint sensing and control in more detail: [[Arm Joints]].

=== Head Joints ===
The head state topic will give you the current <code>pan</code> angle (side-to-side) of the head and report boolean status flags indicating whether the robot is currently moving its head.

'''Note: Flags may not report 'true' values until after the first respective movement command is sent.'''

'''Component ID:''' <code>head_pan</code>

'''Head State:'''  <br />
<code>/robot/head/head_state</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/HeadState.html intera_core_msgs-HeadState]). The <code>pan</code> field gives you the current angle (radians) of the head: 0 is forward, <code>-pi/2</code> is to Sawyer's right, and <code>+pi/2</code> is to Sawyer's left. <code>isPanning</code> is a boolean field that switches to True while the robot is executing a pan command.

'''Note: The <code>isPanning</code> field is initialized to True upon startup and will update thereafter.'''

'''Head (Joint) State:'''  <br />
<code>/robot/joint_states</code> ([http://www.ros.org/doc/api/sensor_msgs/html/msg/JointState.html sensor_msgs-JointState]). The position of the head may also be determined from the <code>joint_states</code> message.

=== Head Movement Control ===
'''Pan Head:'''  <br />
<code>/robot/head/command_head_pan</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/HeadPanCommand.html intera_core_msgs-HeadPanCommand])
<code>target</code> sets the target angle.  0.0 is straight ahead.
<code>speed</code> is an integer in [0-100], where 100 = max speed.
Setting an angle on the command_head_pan topic does not guarantee the head will reach that exact position: there is a small deadband around the reference angle, on the order of +/- 0.12 radians.

=== Example: ===
<source lang="bash">
    # Check head position/state:
    $ rostopic echo /robot/head/head_state
    # Move (pan) head side-to-side:
    $ rostopic pub /robot/head/command_head_pan intera_core_msgs/HeadPanCommand -- 0.0 100
</source>



== Cartesian Endpoint ==
Published at 100 Hz, the endpoint state topic provides the current Cartesian position, velocity and effort at the endpoint of the limb.

=== Endpoint State ===
'''Endpoint State''': <code>/robot/limb/right/endpoint_state</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/EndpointState.html intera_core_msgs-EndpointState])
The endpoint state message provides the current <code>position/orientation pose</code>, <code>linear/angular velocity</code>, and <code>force/torque effort</code> of the robot end-effector at 100 Hz. Pose is in meters, velocity in m/s, effort in Nm.
The robot's "endpoint" is defined as the <code>right_gripper</code> tf frame. This frame is updated dynamically when a gripper is connected to the robot <!--or a [[Gripper Customization]] command is sent-->.
The [[Robot Description | URDF]] on the parameter server will also update when the robot model is updated by gripper changes. Check the ROS parameter for an updated copy of the URDF, especially before using IK or motion planners such as [[MoveIt_Tutorial|MoveIt!]].
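
For example, to inspect the current endpoint pose from the command line:
<source lang="bash">
    # Print one endpoint state message (pose, twist and wrench of the right_gripper frame)
    $ rostopic echo -n 1 /robot/limb/right/endpoint_state
</source>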

=== Kinematics Solver Service ===
The following page covers the Forward Kinematics Solver Service and Inverse Kinematics Solver Service in more detail: [[Kinematics Solvers |Kinematics Solvers]]



== Gripper (End-Effector) ==
Before using an End-Effector, or Gripper, you must first send the calibration command. You can check whether the gripper has been calibrated yet by echoing the gripper state topic for that hand. Once calibrated, the gripper can be controlled using the simplified command_grip and command_release topics, or using the more direct command_set topic.
For more information on using the gripper, see the [[Gripper_Example | Gripper Example Program]].

=== Gripper Configuration ===
([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IOComponentCommand.html intera_core_msgs-IOComponentCommand])

'''Calibrate Gripper'''  <br />
Publish an IO Command message that sets the signal <code>calibrate</code> to <code>True</code> to calibrate a new gripper. The gripper should open and close once.
The <code>calibrated</code> field of the gripper state topic will also update to '1' after successful calibration.
Once calibrated, the gripper will not calibrate again unless the reset command message is sent, or the robot is restarted.

'''Reset Gripper'''  <br />
Publish an IO Command message that sets the signal <code>reboot</code> to <code>True</code> to reset the gripper state.
The <code>calibrated</code> field of the gripper state message will reset to '0'.

=== Gripper State ===
'''Gripper State'''  <br />
<code>/io/end_effector/state</code>
The IO signal <code>calibrated</code> field must be true (1) before you can control the gripper.  Use the IO command above to calibrate the gripper.
The gripper state message will also give you the current <code>position</code>, <code>force</code>, and whether the gripper is currently <code>moving</code>.  Position ranges over [0.0-100.0] [close-open].
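
For example, to check the calibration and position signals from the command line:
<source lang="bash">
    # Print one gripper (end-effector) state message; look for the 'calibrated' and 'position' signals
    $ rostopic echo -n 1 /io/end_effector/state
</source>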

=== Simple Gripper Control ===
'''Simple Gripper Close''' <br />
<code>/io/end_effector/command</code> - set the signal <code>'position_m'</code> value to MIN_POSITION: 0.0.
Publish this IO Command message to grip.

'''Simple Gripper Open'''  <br />
<code>/io/end_effector/command</code> - set the signal <code>'position_m'</code> value to MAX_POSITION: 0.041667.
Publish this IO Command message to release.

</div>


<div class="content-block">

= Sensors+ =

== Accelerometer ==
The robot hand has a 3-axis accelerometer located inside the cuff, in the same plane as the gripper electrical connection header.  The positive z-axis points back 'up' the arm (towards the previous wrist joint, j6).  The positive x-axis points towards the gripper, and the y-axis points towards the cuff buttons, following standard [http://en.wikipedia.org/wiki/Right-hand_rule Right-Hand-Rule] notation.

'''Component IDs:''' <br />
<code>right_accelerometer</code>

'''Accelerometer State:''' <br />
<code>/robot/accelerometer/<component_id>/state</code> ([http://docs.ros.org/api/sensor_msgs/html/msg/Imu.html sensor_msgs-Imu])

Acceleration values (in m/s^2) are published under <code>linear_acceleration</code> for the x, y, and z axes.  The force of gravity is NOT compensated for.
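
You can inspect the published values from the command line (using the <code>right_accelerometer</code> component ID listed above); the output should resemble the sample message below:
<source lang="bash">
    # Print one accelerometer state message
    $ rostopic echo -n 1 /robot/accelerometer/right_accelerometer/state
</source>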

<source lang="bash">
names: ['right_accelerometer']
states:
  -
    header:
      seq: 120921
      stamp:
        secs: 0
        nsecs: 0
      frame_id: ''
    orientation:
      x: 0.0
      y: 0.0
      z: 0.0
      w: 0.0
    orientation_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    angular_velocity:
      x: 0.0
      y: 0.0
      z: 0.0
    angular_velocity_covariance: [-1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
    linear_acceleration:
      x: 0.0
      y: 0.0
      z: 0.0
    linear_acceleration_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
</source>


== Cameras ==
You can access Sawyer's hand camera and the head camera using the standard ROS image types and image_transport mechanism listed below.  You can use the ROS Services to open, close, and configure each of the cameras. See the [[Camera Image Display Example]] for more information on using the cameras. Useful tools for working with cameras in ROS include the [http://wiki.ros.org/rviz/DisplayTypes/Camera rviz Camera Display].

'''IMPORTANT:''' You can only have one camera open at a time at standard resolutions, due to bandwidth limitations.

'''Component IDs:''' <code>right_hand_camera</code>, <code>head_camera</code>

'''Camera Published Topics'''

Raw Image: <code>/internal_camera/<component_id>/image_raw</code> ([http://www.ros.org/doc/api/sensor_msgs/html/msg/Image.html sensor_msgs-Image])

Camera Intrinsics: <code>/internal_camera/<component_id>/camera_info</code> ([http://www.ros.org/doc/api/sensor_msgs/html/msg/CameraInfo.html sensor_msgs-CameraInfo])

Rectified Color Image: <code>/internal_camera/head_camera/image_rect_color</code> ([http://wiki.ros.org/image_proc image_proc])

Rectified Image: <code>/internal_camera/right_hand_camera/image_rect</code> ([http://wiki.ros.org/image_proc image_proc])
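
For a quick look at a camera stream from the command line, the standard <code>image_view</code> tool can be pointed at any of the image topics above, for example:
<source lang="bash">
    # Display the raw head camera stream in a simple viewer window
    $ rosrun image_view image_view image:=/internal_camera/head_camera/image_raw
</source>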


== Head Display Screen ==
Images can be displayed on Sawyer's LCD screen by publishing the image data as a ROS <code>sensor_msgs/Image</code>.

'''Display Image''': <code>/robot/head_display</code> ([http://www.ros.org/doc/api/sensor_msgs/html/msg/Image.html sensor_msgs-Image])

Publish image data as a [http://ros.org/wiki/sensor_msgs ROS Image message] to update the display.
The screen resolution is 1024 x 600.  Images smaller than this will appear in the top-left corner.
There are dedicated ROS packages for working with and sending ROS Image messages, including [http://ros.org/wiki/image_transport image_transport] and [http://ros.org/wiki/image_pipeline image_pipeline].
Useful tools for working with images in ROS include [http://wiki.ros.org/image_view image_view] and [http://ros.org/wiki/image_transport#republish republish]. Also see [http://ros.org/wiki/camera_drivers camera_drivers] for assistance working with your own cameras.

For more information on displaying images on Sawyer's LCD screen, see the [[Head Display Image Example]].
</div>

<div class="content-block">

= Inputs and Outputs =


== Navigators ==
There are two Navigators on Sawyer: one on the side of the body and one on the arm.  Each Navigator is comprised of three push buttons, one of which is also an indexing scroll wheel, and one set of white LED lights.

'''Component IDs''': <code>right, head</code>

'''Read Button States:''' <code>/io/robot/navigator/state</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IODeviceStatus.html intera_core_msgs-IODeviceStatus])

'''Wheel State:''' <code>/io/robot/navigator/state</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IODeviceStatus.html intera_core_msgs-IODeviceStatus])

'''Command Buttons:''' <code>/io/robot/navigator/command</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IOComponentCommand.html intera_core_msgs-IOComponentCommand])

The state of each push button is reported as an integer value in the <code>data</code> field. For the wheel on the navigator, the <code>data</code> field returns an integer between [0-255]; each physical 'click' of the wheel corresponds to a +/-1 increment, and the value wraps around when it goes above or below those bounds.
The button values have the following meaning: <code>0:'OFF', 1:'CLICK', 2:'LONG_PRESS', 3:'DOUBLE_CLICK'</code> (see the example after the button list below).


*  '''<Component ID>_button_ok:'''  The circular button in the middle of the navigator.
*  '''<Component ID>_button_back:'''  The button above the OK button, typically labeled with a 'Back' arrow symbol.
*  '''<Component ID>_button_show:'''  The "Rethink Button", above the OK button and next to the Back button, typically labeled with the Rethink logo.
*  '''<Component ID>_button_triangle''' (the 'X' button):  The button below the circle and square buttons.
*  '''<Component ID>_button_circle:'''  The button labeled with a circle, next to the square button.
*  '''<Component ID>_button_square:'''  The button labeled with a square, next to the circle button.
*  '''<Component ID>_wheel:''' The scroll wheel built into the circular OK button in the middle of the navigator.
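
For example, to watch the navigator button and wheel values change as you press them:
<source lang="bash">
    # Stream the navigator device status; the button and wheel signals are listed by their component IDs
    $ rostopic echo /io/robot/navigator/state
</source>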


== Cuff Buttons ==
There are two buttons and one touch sensor in the cuff of the hand: the cuff button, the OK button, and the cuff grasp button.  The state of each button is published in a DigitalIOState message under its own topic (DigitalIOState constants: PRESSED==1, UNPRESSED==0). The integer <code>data</code> will read PRESSED (1) when the cuff sensor is squeezed, and UNPRESSED (0) otherwise.

'''Component IDs:''' <code>right_cuff, right_button_lower, right_button_upper</code>.

'''Read Button Squeezed:''' <code>/io/robot/cuff/state</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IODeviceStatus.html intera_core_msgs-IODeviceStatus])

'''Command Cuff:''' <code>/io/robot/cuff/command</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IOComponentCommand.html intera_core_msgs-IOComponentCommand])
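
For example, to watch the cuff button states from the command line:
<source lang="bash">
    # Stream the cuff device status; squeeze the cuff or press its buttons to see the values change
    $ rostopic echo /io/robot/cuff/state
</source>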


== Lights ==
The head (halo) LEDs are at the top of Sawyer's head, and the navigator LEDs are inside the navigators.

To get the state of, or send commands to, the halo and navigator LEDs, use <code>IODeviceInterface</code> with its config, status and command topics. The values are booleans carried in the <code>data</code> field.

'''Lights Component IDs:'''<br />
<code>head_red_light</code>, <code>head_blue_light</code>, <code>head_green_light</code>, <code>right_hand_blue_light</code>, <code>right_hand_green_light</code>,  <code>right_hand_red_light</code>

'''LEDs State:''' <code>/io/robot/robot/state</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IODeviceStatus.html intera_core_msgs-IODeviceStatus])

'''Command Light:''' <code>/io/robot/robot/command</code> ([https://rethinkrobotics.github.io/intera_sdk_docs/5.0.4/intera_core_msgs/html/msg/IOComponentCommand.html intera_core_msgs-IOComponentCommand])
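
For example, to inspect the current light states from the command line (the light component IDs listed above should appear among the reported signals):
<source lang="bash">
    # Print one robot IO device status message and look for the head/hand light signals
    $ rostopic echo -n 1 /io/robot/robot/state
</source>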

</div>
