Merge pull request #122 from FederatedAI/feature-2.2.0-update_doc
Feature 2.2.0 update doc
mgqa34 authored Aug 1, 2024
2 parents c5a9d6a + cc9c82e commit 5fad8d8
Showing 2 changed files with 5 additions and 9 deletions.
12 changes: 3 additions & 9 deletions doc/standalone_deploy.md
@@ -41,14 +41,7 @@ After installing successfully, please refer to [tutorials](../README.md#quick-st
 In this way, user can run tasks with Pipeline or Launcher.
 
 ### 3.1 Installing Python Environment
-- Prepare and install [conda](https://docs.conda.io/projects/miniconda/en/latest/) environment.
-- Create a virtual environment:
-
-```shell
-# FATE-LLM requires Python >= 3.10
-conda create -n fate_env python=3.10
-conda activate fate_env
-```
+Please refer to section-2.1

### 3.2 Installing FATE-LLM with FATE, FATE-Flow, FATE-Client

@@ -59,7 +52,8 @@ pip install fate_client[fate,fate_flow,fate_client]==2.2.0
 ### 3.3 Service Initialization
 
 ```shell
-fate_flow init --ip 127.0.0.1 --port 9380 --home $HOME_DIR
+mkdir fate_workspace
+fate_flow init --ip 127.0.0.1 --port 9380 --home $(pwd)/fate_workspace
 pipeline init --ip 127.0.0.1 --port 9380
 ```
 - `ip`: The IP address where the service runs.
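Taken together, the updated standalone deployment reduces to a handful of commands. This is a sketch assembled from the document's own snippets (the conda lines are assumed from the section 2.1 the diff now points to), not an official install script:

```shell
# Sketch of the standalone FATE-LLM setup after this change.
# The conda environment step is assumed from section 2.1 of the doc.
conda create -n fate_env python=3.10
conda activate fate_env

# Install FATE-LLM together with FATE, FATE-Flow and FATE-Client
# (version pinned as in the diff above).
pip install fate_client[fate,fate_flow,fate_client]==2.2.0

# Initialize services against a dedicated workspace directory,
# replacing the old $HOME_DIR placeholder.
mkdir fate_workspace
fate_flow init --ip 127.0.0.1 --port 9380 --home $(pwd)/fate_workspace
pipeline init --ip 127.0.0.1 --port 9380
```

Creating `fate_workspace` first matters because `--home` now points at a concrete directory under the current path rather than an undefined `$HOME_DIR` variable.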
2 changes: 2 additions & 0 deletions doc/tutorial/fdkt/fdkt.ipynb
@@ -426,6 +426,7 @@
 "        seq_num_for_single_category=2000,\n",
 "        slm_generation_config=dict(\n",
 "            max_new_tokens=256,\n",
+"            do_sample=True,\n",
 "            temperature=1.0,\n",
 "            top_k=50,\n",
 "            top_p=0.9,\n",
@@ -636,6 +637,7 @@
 "        seq_num_for_single_category=2000,\n",
 "        slm_generation_config=dict(\n",
 "            max_new_tokens=256,\n",
+"            do_sample=True,\n",
 "            temperature=1.0,\n",
 "            top_k=50,\n",
 "            top_p=0.9,\n",
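The added `do_sample=True` flag is the substantive fix in the notebook: in Hugging Face `transformers`-style generation configs, `temperature`, `top_k`, and `top_p` only take effect when sampling is enabled; without it, generation is greedy and those knobs are silently ignored. A minimal pure-Python sketch of what those three parameters do to a logit distribution (an illustration, not FATE-LLM's actual implementation):

```python
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=50, top_p=0.9):
    """Sample one token id from raw logits using the temperature, top_k and
    top_p knobs set in slm_generation_config above (pure-Python sketch)."""
    # Temperature: flatten (>1) or sharpen (<1) the distribution.
    scaled = [l / temperature for l in logits]

    # Top-k: keep only the k highest-scoring tokens.
    kth = sorted(scaled, reverse=True)[min(top_k, len(scaled)) - 1]
    scaled = [s if s >= kth else float("-inf") for s in scaled]

    # Softmax over the survivors.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Top-p (nucleus): smallest set of tokens whose cumulative mass >= top_p.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep, cum = [], 0.0
    for i in order:
        keep.append(i)
        cum += probs[i]
        if cum >= top_p:
            break

    # Renormalize over the kept tokens and sample.
    mass = sum(probs[i] for i in keep)
    return random.choices(keep, weights=[probs[i] / mass for i in keep])[0]
```

With `do_sample=False` the equivalent behavior would be `max(range(len(logits)), key=logits.__getitem__)` regardless of these settings, which is why the flag had to be added for the configured sampling parameters to matter.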
