# fast-api-react/README.md

This is an example project demonstrating how to use Hatchet with FastAPI.
Before running this project, make sure you have the following:

1. Python 3.10 or higher installed on your machine.
2. Poetry package manager installed. You can install it by running `pip install poetry`, or by following the instructions in the [Poetry Docs](https://python-poetry.org/docs/#installation).
3. (Optional) Node, if you would like to run the example frontend. It can be installed from the [Node website](https://nodejs.org/en/download).
## Setup
1. Create a `.env` file in the `./backend` directory and set the required environment variables. This requires the `HATCHET_CLIENT_TOKEN` variable created in the [Getting Started README](../README.md). You will also need an `OPENAI_API_KEY`, which can be created on the [OpenAI website](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key).
**If you're running Hatchet locally without TLS:**
```
HATCHET_CLIENT_TLS_STRATEGY=none
HATCHET_CLIENT_TOKEN="<token>"
OPENAI_API_KEY="<openai-key>"
```
**If you're using Hatchet Cloud:**
```
HATCHET_CLIENT_TOKEN="<token>"
OPENAI_API_KEY="<openai-key>"
```
2. Open a terminal and navigate to the project backend directory (`/fast-api-react/backend`).
3. Run the following command to install the project dependencies:
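Before moving on, it can help to sanity-check the `.env` file from step 1. The sketch below is a minimal, stdlib-only illustration, assuming the key names shown in the README above; the `parse_env` and `missing_keys` helpers are hypothetical, not part of the project:

```python
# Hypothetical .env sanity-checker; the required key names come from the README above.
REQUIRED = ["HATCHET_CLIENT_TOKEN", "OPENAI_API_KEY"]

def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, ignoring blank lines and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

def missing_keys(env: dict) -> list:
    """Return required keys that are absent or empty."""
    return [k for k in REQUIRED if not env.get(k)]

sample = 'HATCHET_CLIENT_TLS_STRATEGY=none\nHATCHET_CLIENT_TOKEN="abc123"\n'
print(missing_keys(parse_env(sample)))  # ['OPENAI_API_KEY']
```

Running a check like this before starting the worker surfaces a missing token immediately, rather than as a connection error later.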
# simple-examples/README.md

This is an example project demonstrating how to use Hatchet with Python.
Before running this project, make sure you have the following:

1. Python 3.10 or higher installed on your machine.
2. Poetry package manager installed. You can install it by running `pip install poetry`, or by following the instructions in the [Poetry Docs](https://python-poetry.org/docs/#installation).
## Setup
1. Create a `.env` file in the `./backend` directory and set the required environment variables. This requires the `HATCHET_CLIENT_TOKEN` variable created in the [Getting Started README](../README.md). If you would like to try the Generative AI examples in [./src/genai](./src/genai), you will also need an `OPENAI_API_KEY`, which can be created on the [OpenAI website](https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key).
**If you're running Hatchet locally without TLS:**
```
HATCHET_CLIENT_TLS_STRATEGY=none
HATCHET_CLIENT_TOKEN="<token>"
OPENAI_API_KEY="<openai-key>" # (OPTIONAL) only required to run the examples in ./src/genai
```
**If you're using Hatchet Cloud:**
```
HATCHET_CLIENT_TOKEN="<token>"
OPENAI_API_KEY="<openai-key>" # (OPTIONAL) only required to run the examples in ./src/genai
```
2. Open a terminal and navigate to the project root directory (`/simple-examples`).
3. Run the following command to install the project dependencies:

```shell
poetry install
```
### Running the Hatchet Worker
Next, start the Hatchet worker by running the following command:
```shell
poetry run hatchet
```
The project contains example workflows in the [`./src`](./src) directory.
The project includes a variety of basic workflows to demonstrate Hatchet's core capabilities, each showcasing different features:
1. **[Simple Workflow](./src/simple/worker.py)**: Demonstrates a straightforward process flow, showcasing the basics of setting up a workflow in Hatchet.
2. **[Async Workflow](./src/async_workflow/worker.py)**: An example of using `async def` together with `asyncio.sleep`.
3. **[Concurrency Limit Workflow](./src/concurrency_limit/worker.py)**: Shows how to manage concurrency limits within workflows to ensure that only a certain number of instances run simultaneously.
4. **[Directed Acyclic Graph (DAG) Workflow](./src/dag/worker.py)**: Illustrates setting up workflows with dependencies that form a Directed Acyclic Graph, demonstrating the advanced orchestration capabilities of Hatchet.
5. **[Manual Trigger Workflow](./src/manual_trigger/worker.py)**: Explains how to initiate workflows manually, offering control over workflow execution triggers.
6. **[Timeout Workflow](./src/timeout/worker.py)**: Demonstrates handling timeout scenarios within workflows, ensuring that long-running or stalled processes are appropriately managed.
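The async workflow pattern in the list above can be illustrated with plain `asyncio` alone. This is only a sketch of the `async def` + `asyncio.sleep` idea, not the Hatchet SDK; the step names and delays are illustrative:

```python
import asyncio

async def step(name: str, delay: float) -> str:
    """A stand-in for a workflow step that does non-blocking work."""
    await asyncio.sleep(delay)  # yields to the event loop instead of blocking
    return f"{name} done"

async def main() -> list:
    # Run two steps concurrently; total wall time is roughly the longer delay.
    return await asyncio.gather(step("fetch", 0.01), step("parse", 0.02))

print(asyncio.run(main()))  # ['fetch done', 'parse done']
```

Because each `await asyncio.sleep` yields control, many such steps can share one worker process, which is the point of the async example.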
#### Generative AI Workflows
For more complex use cases, the project includes examples that integrate with OpenAI:
1. **[Simple Response Generation](./src/genai/simple.py)**: A single-step workflow that makes a request to OpenAI, showcasing how to incorporate AI services into Hatchet workflows.
2. **[Basic Retrieval Augmented Generation (BasicRag)](./src/genai/basicrag.py)**: A multi-step workflow that involves loading website content with Beautiful Soup, reasoning about the information, and generating a response with OpenAI, demonstrating the potential for complex, AI-driven processes.
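As a rough stand-in for BasicRag's content-loading step, here is a stdlib-only text extractor. The real example uses Beautiful Soup; the `TextExtractor` class and the sample page string below are purely illustrative:

```python
# Stdlib sketch of "load website content as plain text" (BasicRag uses Beautiful Soup).
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text nodes of an HTML document, skipping markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        text = data.strip()
        if text:
            self.chunks.append(text)

page = "<html><body><h1>Hatchet</h1><p>Run workflows.</p></body></html>"
parser = TextExtractor()
parser.feed(page)
print(" ".join(parser.chunks))  # Hatchet Run workflows.
```

The extracted text is what a retrieval-augmented step would then pass to the model as context.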
### Exposing the workflows via FastAPI
For a more complete example of how you might use Hatchet as part of a deployed production service, check out the [FastAPI Example](../fast-api-react/README.md).