new readme and docs
cpaxton committed Oct 19, 2016
1 parent 0ab5a45 commit ab8bde1
Showing 4 changed files with 41 additions and 38 deletions.
2 changes: 1 addition & 1 deletion INSTALL.md
@@ -6,7 +6,7 @@ Note: CoSTAR installation has only been tested on ROS Indigo on Ubuntu 14.04 mac


## Prerequisites
- To successfully install CoSTAR system, some software are demanded to be installed as prerequisite:
+ To successfully install the CoSTAR system, the following software must be installed first:

* Python (tested version 2.7.12)
* Git (tested version 1.9.1)
43 changes: 6 additions & 37 deletions Readme.md
@@ -78,53 +78,22 @@ roslaunch instructor_core instructor.launch
* Bringup: launch files, RVIZ configurations, et cetera
* [Librarian](costar_librarian/Readme.md): file management
* [Predicator](costar_predicator/Readme.md): robot knowledge management
* [Perception](costar_perception/Readme.md): semantic segmentation and object detection via [SP Segmenter](https://github.com/jhu-lcsr/sp_segmenter)
* Gripper: utilities for integrating different grippers into the UI
* Robot: utilities and services allowing high-level control of the robot and integrating these behaviors into the UI. Contains the `CostarArm` component.
* Tools: packages used for data collection, maintaining MoveIt planning scene, and other purposes

### Other Requirements

* [SP Segmenter](https://github.com/jhu-lcsr/sp_segmenter)

### Proprietary Code

* Instructor: Behavior Tree-based user interface (built on [Beetree](https://github.com/futureneer/beetree/))
* Ready Air: drives the current tool attachment and provides services

Due to licensing issues these have not yet been made open source.

## CoSTAR Tools

This section includes code for managing MoveIt collision detection, plus a few other utilities that provide useful services for building powerful robot programs.

**MoveIt collision detection**: creates a PlanningScene based on detected objects and adds a table.

Run with:
```
roslaunch moveit_collision_environment colision_env.launch mesh_source:=$(find moveit_collision_environment)/data/mesh tableTFname:=ar_marker_2 defineParent:=true parentFrameName:=/world
```

**Recording point clouds**: this is used to collect data and scan objects. Run with:

```
rosrun point_cloud_recorder point_cloud_recorder.py _id:=$OBJECT_NAME _camera:=$CAMERA_ID
```

This will expose the `/record_camera` ROS service, which can be called from a UI node. Files will be created in whatever directory you ran the point cloud recorder from.

The corresponding roslaunch entry would look something like:

```xml
<node name="point_cloud_recorder" pkg="point_cloud_recorder" type="point_cloud_recorder.py">
<param name="id" value="NameOfMyObject"/>
<param name="camera" value="kinect2"/>
</node>
```

## Gripper

* ***Simple S Model Server***: part of the CoSTAR UI, our cross-platform graphical interface for teaching complex behaviors to industrial robots. This wraps a couple of simple 3-finger gripper commands, which we can then expose as UI behaviors.

## Contact

- CoSTAR is maintained by Chris Paxton ([email protected])
+ CoSTAR is maintained by Chris Paxton ([email protected]).

Other core contributors include:
* Felix Jonathan
* Andrew Hundt
7 changes: 7 additions & 0 deletions costar_perception/Readme.md
@@ -0,0 +1,7 @@
# CoSTAR Perception

CoSTAR does not rely on any particular perception solution. Instead, we require a couple of specific outputs, including TF frames and object detection messages.
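As a rough sketch of that contract, a detection can be modeled as a record pairing an object class with the TF frame published for it. The field names and frame names below are illustrative assumptions, not CoSTAR's actual ROS message definition:

```python
from collections import namedtuple

# Illustrative sketch only: field names and frame names are assumptions,
# not the actual CoSTAR message definition.
Detection = namedtuple("Detection", ["object_class", "tf_frame"])

def frames_for(detections):
    """Collect the TF frames a motion planner would look up."""
    return [d.tf_frame for d in detections]

detections = [Detection("drill", "/obj_drill_1"),
              Detection("sander", "/obj_sander_1")]
print(frames_for(detections))  # ['/obj_drill_1', '/obj_sander_1']
```

In the real system these would arrive as ROS messages and TF broadcasts rather than plain Python objects.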

## SP Segmenter

This is our initial perception package. See [here](sp_segmenter/README.md) for more information.
27 changes: 27 additions & 0 deletions costar_tools/Readme.md
@@ -0,0 +1,27 @@
# CoSTAR Tools

This section includes code for managing MoveIt collision detection, plus a few other utilities that provide useful services for building powerful robot programs.

**MoveIt collision detection**: creates a PlanningScene based on detected objects and adds a table.

Run with:
```
roslaunch moveit_collision_environment colision_env.launch mesh_source:=$(find moveit_collision_environment)/data/mesh tableTFname:=ar_marker_2 defineParent:=true parentFrameName:=/world
```

**Recording point clouds**: this is used to collect data and scan objects. Run with:

```
rosrun point_cloud_recorder point_cloud_recorder.py _id:=$OBJECT_NAME _camera:=$CAMERA_ID
```

This will expose the `/record_camera` ROS service, which can be called from a UI node. Files will be created in whatever directory you ran the point cloud recorder from.
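The `_id:=` and `_camera:=` arguments above are ROS private parameters. As a simplified sketch of how such `rosrun`-style arguments map onto parameter names (not the actual rospy implementation):

```python
def parse_private_params(args):
    """Map rosrun-style '_name:=value' arguments to private parameter
    names, e.g. '_id:=drill' -> {'id': 'drill'}. A simplified sketch of
    ROS argument remapping, not the real implementation."""
    params = {}
    for arg in args:
        if arg.startswith("_") and ":=" in arg:
            name, value = arg[1:].split(":=", 1)
            params[name] = value
    return params

print(parse_private_params(["_id:=drill", "_camera:=kinect2"]))
# {'id': 'drill', 'camera': 'kinect2'}
```

Inside the node, these show up as the private parameters `~id` and `~camera`.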

The corresponding roslaunch entry would look something like:

```xml
<node name="point_cloud_recorder" pkg="point_cloud_recorder" type="point_cloud_recorder.py">
<param name="id" value="NameOfMyObject"/>
<param name="camera" value="kinect2"/>
</node>
```
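The snippet above is only the `<node>` element; a standalone launch file wraps it in a `<launch>` root. The parameter values here are example placeholders:

```xml
<launch>
  <!-- Standalone launch file for the recorder node;
       "NameOfMyObject" and "kinect2" are example values. -->
  <node name="point_cloud_recorder" pkg="point_cloud_recorder" type="point_cloud_recorder.py">
    <param name="id" value="NameOfMyObject"/>
    <param name="camera" value="kinect2"/>
  </node>
</launch>
```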
