Commit 1ad6e8c

Refactor documentation for Ray: standardize section titles and improve clarity
1 parent a5bd330 commit 1ad6e8c

4 files changed

Lines changed: 11 additions & 15 deletions

content/learning-paths/servers-and-cloud-computing/ray-on-axion/_index.md

Lines changed: 0 additions & 4 deletions
````diff
@@ -1,10 +1,6 @@
 ---
 title: Scale AI workloads with Ray on Google Cloud C4A Axion VM
 description: Deploy and run distributed AI workloads using Ray on Google Cloud Axion C4A Arm-based VMs, covering parallel tasks, hyperparameter tuning, and model serving with Ray Core, Train, Tune, and Serve.
-
-draft: true
-cascade:
-  draft: true
 
 minutes_to_complete: 30
 
````
content/learning-paths/servers-and-cloud-computing/ray-on-axion/distributed_workloads.md

Lines changed: 4 additions & 4 deletions
````diff
@@ -32,7 +32,7 @@ results = ray.get([square.remote(i) for i in range(10)])
 print("Results:", results)
 ```
 
-### Explanation
+### Code explanation
 
 * `ray.init()` → connects to the running Ray cluster
 * `@ray.remote` → converts a function into a distributed task
````
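
For context, the parallel-task script this hunk documents can be largely reconstructed from the hunk's own context lines. A minimal sketch — the body of `square` is an assumption implied by the `square.remote(i)` call; the `ray.get` and `print` lines appear verbatim in the diff:

```python
import ray

ray.init()  # connect to the running Ray cluster

@ray.remote
def square(x):
    # assumed task body: runs on any available worker in the cluster
    return x * x

# launch 10 tasks in parallel, then block until all results arrive
results = ray.get([square.remote(i) for i in range(10)])
print("Results:", results)
```
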
````diff
@@ -92,7 +92,7 @@ trainer = TorchTrainer(
 trainer.fit()
 ```
 
-### Execute training
+### Run the training script
 
 ```bash
 python3 ray_train.py
@@ -115,14 +115,14 @@ The output is similar to:
 
 This confirms distributed training across multiple workers.
 
-## Explanation
+## Training code explanation
 
 * `TorchTrainer` → handles distributed training execution
 * `ScalingConfig(num_workers=2)` → runs training on 2 workers
 * Each worker executes training in parallel
 * Logs can appear from multiple processes
 
-## Ray Jobs View (Tasks & Training)
+## Ray Jobs view (tasks and training)
 
 ![Ray Dashboard Jobs tab showing successful execution of ray_test.py and ray_train.py#center](images/ray-jobs.png "Ray Jobs tab showing distributed tasks and training execution status")
 
````
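
For readers reviewing this change, a minimal sketch of what the `ray_train.py` script referenced above might contain. The `TorchTrainer`, `ScalingConfig(num_workers=2)`, and `trainer.fit()` calls come from the diff context; the toy model and training loop are assumptions for illustration only:

```python
import torch
import torch.nn as nn
from ray.train import ScalingConfig
from ray.train.torch import TorchTrainer

def train_loop_per_worker():
    # each of the 2 workers runs this loop in parallel (assumed toy model)
    model = nn.Linear(10, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()
    for epoch in range(3):
        x, y = torch.randn(32, 10), torch.randn(32, 1)
        loss = loss_fn(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        print(f"epoch {epoch}: loss {loss.item():.4f}")  # logs appear per worker

trainer = TorchTrainer(
    train_loop_per_worker,
    scaling_config=ScalingConfig(num_workers=2),  # 2 parallel workers
)
trainer.fit()
```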

content/learning-paths/servers-and-cloud-computing/ray-on-axion/setup_and_cluster.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -175,9 +175,9 @@ http://<VM-IP>:8265
 
 The Ray Dashboard provides visibility into jobs, tasks, and resource utilization.
 
-## Ray Dashboard Overview
+## Ray Dashboard overview
 
-![Ray Dashboard showing cluster overview, utilization, and navigation tabs#center](images/ray-dashboard.png "Ray Dashboard Overview showing cluster status and metrics")
+![Ray Dashboard showing cluster overview, utilization, and navigation tabs#center](images/ray-dashboard.png "Ray Dashboard overview showing cluster status and metrics")
 
 The Ray Dashboard helps monitor distributed execution and debug workloads in real time.
 
````
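
As an aside for reviewers: the hunk's `http://<VM-IP>:8265` context line implies the dashboard is reachable from outside the VM. One way to achieve that (an assumption, not shown in this diff) is to bind the dashboard to all interfaces when initializing Ray; the `ray start --head --dashboard-host=0.0.0.0` CLI flag is the shell equivalent:

```python
import ray

# assumed setup: bind the dashboard to all interfaces so that
# http://<VM-IP>:8265 works from outside the VM (8265 is Ray's default port)
ray.init(dashboard_host="0.0.0.0")
print(ray.cluster_resources())  # quick sanity check that the cluster is up
```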

content/learning-paths/servers-and-cloud-computing/ray-on-axion/tuning_serving_benchmark.md

Lines changed: 5 additions & 5 deletions
````diff
@@ -42,14 +42,14 @@ results = tuner.fit()
 print("Best result:", results.get_best_result(metric="score", mode="max"))
 ```
 
-### Explanation
+### Code explanation
 
 * `tune.grid_search()` → tries multiple hyperparameter values
 * Each value runs as a separate parallel trial
 * `session.report()` → sends metrics back to Ray
 * `Tuner.fit()` → executes all trials
 
-### Execute tuning
+### Run hyperparameter tuning
 
 ```bash
 python3 ray_tune.py
````
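
For context, a minimal sketch of the `ray_tune.py` script this hunk documents. The `tuner.fit()` and `get_best_result(metric="score", mode="max")` lines appear in the diff context; the objective function and the grid values are assumptions:

```python
from ray import tune
from ray.air import session

def objective(config):
    # assumed toy objective: score each grid value and report it to Ray
    session.report({"score": config["x"] ** 2})

tuner = tune.Tuner(
    objective,
    param_space={"x": tune.grid_search([1, 2, 3, 4])},  # one trial per value
)
results = tuner.fit()
print("Best result:", results.get_best_result(metric="score", mode="max"))
```
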
````diff
@@ -126,7 +126,7 @@ app = Model.bind()
 serve.run(app)
 ```
 
-### Explanation
+### Code explanation
 
 * `serve.start()` → initializes serving system
 * `@serve.deployment` → defines deployable service
````
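
Similarly, a sketch of the Ray Serve deployment the hunk references. `Model.bind()`, `serve.run(app)`, and the response text come from the diff; the handler body is an assumption:

```python
from ray import serve

serve.start()  # initialize the serving system on the running cluster

@serve.deployment
class Model:
    async def __call__(self, request):
        # assumed handler returning the JSON shown in the expected output
        return {"message": "Hello from Ray Serve on Arm VM!"}

app = Model.bind()
serve.run(app)  # deploy the app; it then answers HTTP requests
```
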
````diff
@@ -150,7 +150,7 @@ The output is similar to:
 {"message":"Hello from Ray Serve on Arm VM!"}
 ```
 
-## Ray Tune Execution in Dashboard
+## Ray Tune execution in Ray Dashboard
 
 ![Ray Dashboard Jobs tab showing ray_tune.py trials with SUCCEEDED status#center](images/ray-jobs-status.png "Ray Tune trials executed successfully with different configurations")
 
````

````diff
@@ -183,7 +183,7 @@ print("Execution Time:", end - start)
 ```
 
 
-### Execute benchmark
+### Run the benchmark
 
 ```bash
 ray stop
````
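
Finally, the benchmark hunk's `print("Execution Time:", end - start)` context suggests a timing harness along these lines; the workload body and task count are assumptions:

```python
import time
import ray

ray.init()

@ray.remote
def work(n):
    # assumed CPU-bound workload for the benchmark
    return sum(i * i for i in range(n))

start = time.time()
ray.get([work.remote(1_000_000) for _ in range(8)])  # 8 parallel tasks
end = time.time()
print("Execution Time:", end - start)
```

The `ray stop` that follows in the hunk shuts the cluster down once the benchmark completes.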
