Commit cc5b57d
modify installation verification
liqikai9 committed Sep 16, 2022
1 parent 0506d07 commit cc5b57d
Showing 3 changed files with 77 additions and 14 deletions.
10 changes: 5 additions & 5 deletions demo/docs/2d_human_pose_demo.md
@@ -1,6 +1,6 @@
## 2D Human Pose Demo

<img src="https://raw.githubusercontent.com/open-mmlab/mmpose/master/demo/resources/demo_coco.gif" width="600px" alt><br>
We provide demo scripts to perform human pose estimation on images or videos.

### 2D Human Pose Top-Down Image Demo

@@ -18,7 +18,7 @@ python demo/image_demo.py

If you use a heatmap-based model and set argument `--draw-heatmap`, the predicted heatmap will be visualized together with the keypoints.

The pre-trained hand pose estimation model can be downloaded from [model zoo](https://mmpose.readthedocs.io/en/1.x/model_zoo/body_2d_keypoint.html).
The pre-trained human pose estimation model can be downloaded from [model zoo](https://mmpose.readthedocs.io/en/1.x/model_zoo/body_2d_keypoint.html).
Take [coco model](https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth) as an example:

```shell
@@ -62,7 +62,7 @@ python demo/topdown_demo_with_mmdet.py
[--bbox-thr ${BBOX_SCORE_THR} --kpt-thr ${KPT_SCORE_THR}]
```

Examples:
Example:

```shell
python demo/topdown_demo_with_mmdet.py \
@@ -84,7 +84,7 @@ The above demo script can also take video as input, and run mmdet for human dete

Assume that you have already installed [mmdet](https://github.com/open-mmlab/mmdetection) with version >= 3.0.

Examples:
Example:

```shell
python demo/topdown_demo_with_mmdet.py \
@@ -93,7 +93,7 @@ python demo/topdown_demo_with_mmdet.py
configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_hrnet-w32_8xb64-210e_coco-256x192.py \
https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth \
--input tests/data/posetrack18/videos/000001_mpiinew_test/000001_mpiinew_test.mp4 \
--output-root=vis_results/demo --show --draw-heatmap
```

### Speed Up Inference
42 changes: 37 additions & 5 deletions docs/en/installation.md
@@ -65,7 +65,7 @@ mim install mmengine
mim install "mmcv>=2.0.0rc1"
```

Note that some of the demo scripts in MMPose require [MMDetection](https://github.com/open-mmlab/mmdetection) (mmdet) for human detection. If you want to run these demo scripts with mmdet, you can install mmdet as a dependency by running:
Note that some of the demo scripts in MMPose require [MMDetection](https://github.com/open-mmlab/mmdetection) (mmdet) for human detection. If you want to run these demo scripts with mmdet, you can easily install mmdet as a dependency by running:

```shell
mim install "mmdet>=3.0.0rc0"
@@ -94,13 +94,25 @@ mim install "mmpose>=1.0.0b0"

### Verify the installation

To verify that MMPose is installed correctly, you can run the following inference demo script:
To verify that MMPose is installed correctly, you can run an inference demo with the following steps.

**Step 1.** We need to download config and checkpoint files.

```shell
mim download mmpose --config td-hm_hrnet-w48_8xb32-210e_coco-256x192 --dest .
```

The download may take several seconds or longer, depending on your network environment. When it finishes, you will find two files, `td-hm_hrnet-w48_8xb32-210e_coco-256x192.py` and `hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth`, in your current folder.
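
If you want to confirm the download before moving on, a minimal sketch like the one below (standard library only, using the file names listed above) checks that both files are present in the current folder.

```python
# Quick sanity check that `mim download` left both files in the current folder.
from pathlib import Path

for name in ('td-hm_hrnet-w48_8xb32-210e_coco-256x192.py',
             'hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth'):
    status = 'found' if Path(name).is_file() else 'missing'
    print(f'{name}: {status}')
```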

**Step 2.** Run the inference demo.

Option (A). If you installed mmpose from source, run the following command from the `$MMPOSE` folder:

```shell
python demo/image_demo.py \
tests/data/coco/000000000785.jpg \
configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_hrnet-w48_8xb32-210e_coco-256x192.py \
https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth \
td-hm_hrnet-w48_8xb32-210e_coco-256x192.py \
hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth \
--out-file vis_results.jpg \
--draw-heatmap
```
@@ -109,7 +121,27 @@ If everything goes fine, you will get this visualization result:

![image](https://user-images.githubusercontent.com/87690686/187824033-2cce0f55-034a-4127-82e2-52744178bc32.jpg)

And the output will be saved as `$MMPOSE/vis_results.jpg`.
And the visualization result will be saved as `vis_results.jpg` in your current folder, with the predicted keypoints and heatmaps plotted on the person in the image.

Option (B). If you installed mmpose with pip, open your Python interpreter and copy & paste the following code.

```python
from mmpose.apis import inference_topdown, init_model
from mmpose.utils import register_all_modules

register_all_modules()

config_file = 'td-hm_hrnet-w48_8xb32-210e_coco-256x192.py'
checkpoint_file = 'hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth'
model = init_model(config_file, checkpoint_file, device='cpu') # or device='cuda:0'

# please prepare an image that contains a person
results = inference_topdown(model, 'demo.jpg')
```

The image `demo.jpg` can be downloaded from [GitHub](https://raw.githubusercontent.com/open-mmlab/mmpose/1.x/tests/data/coco/000000000785.jpg).

The inference result is a list of `PoseDataSample`, and the predictions are stored in each sample's `pred_instances`, which holds the detected keypoint locations and scores.
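
As a quick check on the predictions, you can inspect the first sample directly. The sketch below continues from the snippet above; the `keypoints` and `keypoint_scores` field names are assumptions based on the MMPose 1.x data structures and may differ in other versions.

```python
# Continues from the snippet above: `results` is the list returned by
# `inference_topdown`. Field names are assumptions based on MMPose 1.x.
pred = results[0].pred_instances

print(pred.keypoints.shape)        # e.g. (1, 17, 2): one person, 17 COCO keypoints as (x, y)
print(pred.keypoint_scores.shape)  # e.g. (1, 17): one confidence score per keypoint
```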

### Customize Installation

39 changes: 35 additions & 4 deletions docs/zh_cn/installation.md
@@ -98,13 +98,25 @@ mim install "mmpose>=1.0.0b0"

### Verify the installation

To verify that MMPose is installed correctly, you can run the following inference demo script:
To verify that MMPose is installed correctly, you can run model inference with the following steps.

**Step 1.** Download the config file and the model checkpoint file.

```shell
mim download mmpose --config td-hm_hrnet-w48_8xb32-210e_coco-256x192 --dest .
```

The download may take several seconds or longer, depending on your network environment. Once it finishes, you will find two files in your current directory: `td-hm_hrnet-w48_8xb32-210e_coco-256x192.py` and `hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth`, which are the config file and the corresponding model checkpoint file.

**Step 2.** Run the inference demo.

If you installed mmpose **from source**, you can verify it by running the following command directly:

```shell
python demo/image_demo.py \
tests/data/coco/000000000785.jpg \
configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_hrnet-w48_8xb32-210e_coco-256x192.py \
https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth \
td-hm_hrnet-w48_8xb32-210e_coco-256x192.py \
hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth \
--out-file vis_results.jpg \
--draw-heatmap
```
@@ -113,7 +125,26 @@ python demo/image_demo.py

![image](https://user-images.githubusercontent.com/87690686/187824033-2cce0f55-034a-4127-82e2-52744178bc32.jpg)

The output image will be saved to `$MMPOSE/vis_results.jpg`.
The script plots the predicted keypoints and heatmaps on the person in the image, and saves the visualization to `vis_results.jpg` in your current folder.

If you installed mmpose **as a Python package**, open your Python interpreter and copy & paste the following code:

```python
from mmpose.apis import inference_topdown, init_model
from mmpose.utils import register_all_modules

register_all_modules()

config_file = 'td-hm_hrnet-w48_8xb32-210e_coco-256x192.py'
checkpoint_file = 'hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth'
model = init_model(config_file, checkpoint_file, device='cpu') # or device='cuda:0'

# please prepare an image that contains a person
results = inference_topdown(model, 'demo.jpg')
```

The sample image `demo.jpg` can be downloaded from [GitHub](https://raw.githubusercontent.com/open-mmlab/mmpose/1.x/tests/data/coco/000000000785.jpg).
The inference result is a list of `PoseDataSample`; the predictions are stored in `pred_instances`, including the detected keypoint locations and confidence scores.
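
If you would like a rough visual check without the demo script, the sketch below continues from the snippet above and plots the predicted keypoints on `demo.jpg` with matplotlib. The `keypoints` field name is an assumption based on the MMPose 1.x data structures, and the plotting code is only an illustration, not part of the MMPose API.

```python
# Continues from the snippet above; a rough visual check with matplotlib.
# The `keypoints` field name is an assumption based on MMPose 1.x.
import matplotlib.pyplot as plt
import matplotlib.image as mpimg

img = mpimg.imread('demo.jpg')
keypoints = results[0].pred_instances.keypoints  # shape: (num_persons, num_keypoints, 2)

plt.imshow(img)
for person in keypoints:
    plt.scatter(person[:, 0], person[:, 1], s=10, c='red')
plt.axis('off')
plt.savefig('keypoints_check.jpg')
```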

### Customize Installation

