## Roadmap
- Majority Vote
- Initial Poses for Manual Scenes
## Updated installation steps for my PC environment
### Prerequisites
- Ubuntu 20.04
- CUDA 12.1
- ROS Noetic. Recommended installation method: the FishROS one-click installer (小鱼一键安装)
- GS-Net: my GS-Net repo. Start the server with `python flask_server.py`
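Before installing, it can help to confirm the prerequisites are actually present. The sketch below is a minimal check, assuming the standard command-line tools `lsb_release`, `nvcc`, and `rosversion` that a stock Ubuntu/CUDA/ROS install provides:

```python
# Hedged sanity check for the prerequisites above. The command names
# (lsb_release, nvcc, rosversion) assume a standard Ubuntu/CUDA/ROS setup.
import shutil
import subprocess

def tool_version(cmd):
    """Return the first line of `cmd`'s stdout, or None if the tool is missing."""
    if shutil.which(cmd[0]) is None:
        return None
    out = subprocess.run(cmd, capture_output=True, text=True)
    return out.stdout.splitlines()[0] if out.stdout else None

for cmd in (["lsb_release", "-ds"], ["nvcc", "--version"], ["rosversion", "-d"]):
    print(cmd[0], "->", tool_version(cmd) or "not found")
```

If any line prints "not found", install that prerequisite before continuing.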
### Installation
```sh
# Install Active Grasp
sudo apt install liborocos-kdl-dev
mkdir -p ws/src && cd ws/src
git clone https://github.com/0nhc/active_grasp.git
conda create -n active_grasp python=3.8
cd active_grasp && conda activate active_grasp
pip install -r requirements.txt
conda install libffi==3.3
conda install conda-forge::python-orocos-kdl
cd ..
git clone https://github.com/0nhc/vgn.git -b devel
cd vgn
pip install -r requirements.txt
cd ..
git clone https://github.com/0nhc/robot_helpers.git
cd ..
rosdep install --from-paths src --ignore-src -r -y
catkin build
```
```sh
# Install Active Perception
cd ws/src/active_grasp/src/active_grasp/active_perception/modules/module_lib/pointnet2_utils/pointnet2
pip install -e .
```
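A quick import probe can confirm the editable install of the CUDA extension succeeded. The module name `pointnet2` below is a guess taken from the directory name above; check `pip list` for the actual package name if the probe fails:

```python
# Hedged check that the extension installed above is importable.
# "pointnet2" is an assumed module name; verify with `pip list` if it
# reports False.
import importlib.util

def importable(module_name):
    """True if `module_name` can be found on the current Python path."""
    return importlib.util.find_spec(module_name) is not None

print("pointnet2 importable:", importable("pointnet2"))
```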
### Quick Start
```sh
# Terminal 1
conda activate active_grasp
cd ws
source devel/setup.bash
roslaunch active_grasp env.launch sim:=true
```

```sh
# Terminal 2
conda activate active_grasp
cd ws
source devel/setup.bash
cd src/active_grasp
python3 scripts/run.py ap-single-view
```
# Closed-Loop Next-Best-View Planning for Target-Driven Grasping
This repository contains the implementation of our IROS 2022 submission, "Closed-Loop Next-Best-View Planning for Target-Driven Grasping". [Paper][Video]
## Setup
The experiments were conducted with a Franka Emika Panda arm and a Realsense D435 attached to the wrist of the robot. The code was developed and tested on Ubuntu 20.04 with ROS Noetic. It depends on the following external packages:
- MoveIt
- robot_helpers
- TRAC-IK
- VGN
- franka_ros and realsense2_camera (only required for hardware experiments)
Additional Python dependencies can be installed with

```sh
pip install -r requirements.txt
```

Run

```sh
catkin build active_grasp
```

to build the package.
Finally, download the assets folder and extract it inside the repository.
## Experiments
Start a roscore.

```sh
roscore
```

To run simulation experiments:

```sh
roslaunch active_grasp env.launch sim:=true
python3 scripts/run.py nbv
```

To run real-world experiments:

```sh
roslaunch active_grasp hw.launch
roslaunch active_grasp env.launch sim:=false
python3 scripts/run.py nbv --wait-for-input
```
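The `run.py` invocations above take a policy name plus optional flags. A minimal sketch of that command-line shape is below; the real script may accept more options, since only the `nbv` / `ap-single-view` policies and the `--wait-for-input` flag appear in this README:

```python
# Hedged sketch of the run.py command-line interface used above.
# Only the policy argument and --wait-for-input flag are shown in this
# README; the actual script may differ.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Run a grasp policy")
    parser.add_argument("policy",
                        help="policy name, e.g. nbv or ap-single-view")
    parser.add_argument("--wait-for-input", action="store_true",
                        help="pause for confirmation (hardware experiments)")
    return parser

args = build_parser().parse_args(["nbv", "--wait-for-input"])
print(args.policy, args.wait_for_input)  # prints: nbv True
```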