<h1 align="center">
    TransformersPHP
</h1>

<h3 align="center">
    <p>State-of-the-art Machine Learning for PHP</p>
</h3>

TransformersPHP is designed to be functionally equivalent to the Python library, while still maintaining the same level
of performance and ease of use. This library is built on top of Hugging Face's Transformers library, which provides
thousands of pre-trained models in 100+ languages. It is designed to be a simple and easy-to-use library for PHP
developers, using a similar API to the Python library. These models can be used for a variety of tasks, including text
generation, summarization, translation, and more.

TransformersPHP uses [ONNX Runtime](https://onnxruntime.ai/) to run the models, which is a high-performance scoring
engine for Open Neural Network Exchange (ONNX) models. You can easily convert any PyTorch or TensorFlow model to ONNX
and use it with TransformersPHP using [🤗 Optimum](https://github.com/huggingface/optimum#onnx--onnx-runtime).

To learn more about the library and how it works, head over to
our [extensive documentation](https://codewithkyrian.github.io/transformers-php/introduction).

## Quick tour

Because TransformersPHP is designed to be functionally equivalent to the Python library, it's super easy to learn from
existing Python or JavaScript code. We provide the `pipeline` API, a high-level, easy-to-use API that groups
together a model with its necessary preprocessing and postprocessing steps.

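For instance, a sentiment-analysis pipeline can be created and called in a couple of lines. The snippet below is a minimal sketch, assuming the package is installed via Composer and that the `pipeline` helper lives under the `Codewithkyrian\Transformers\Pipelines` namespace:

```php
<?php

require 'vendor/autoload.php';

use function Codewithkyrian\Transformers\Pipelines\pipeline;

// Build a sentiment-analysis pipeline; the default model for the
// task is fetched from the Hugging Face hub on first use.
$classifier = pipeline('sentiment-analysis');

// Preprocessing, inference, and postprocessing happen in one call.
$result = $classifier('TransformersPHP makes ML in PHP a breeze!');
```

Depending on the task, the return value is typically an associative array (for example, a `label` and a `score` for classification tasks).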

## PHP FFI Extension

TransformersPHP uses the PHP FFI extension to interact with the ONNX runtime. The FFI extension is included by default
in PHP 7.4 and later, but it may not be enabled by default. If the FFI extension is not enabled, you can enable it by
uncommenting (removing the `;` from the beginning of the line) the
following line in your `php.ini` file:

## Usage

By default, TransformersPHP uses hosted pretrained ONNX models. For supported tasks, models that have been converted to
work with [Xenova's Transformers.js](https://huggingface.co/models?library=transformers.js) on HuggingFace should work
out of the box with TransformersPHP.

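As an illustration (a sketch only; `Xenova/distilbart-cnn-6-6` is one such Transformers.js-converted checkpoint, used here purely as an example), a pipeline can be pointed at a specific hosted model by passing its model id:

```php
<?php

require 'vendor/autoload.php';

use function Codewithkyrian\Transformers\Pipelines\pipeline;

// The second argument selects a specific model from the Hugging Face hub.
$summarizer = pipeline('summarization', 'Xenova/distilbart-cnn-6-6');

$summary = $summarizer(
    'TransformersPHP brings state-of-the-art machine learning to PHP. ' .
    'It runs ONNX models locally, with no Python runtime required.'
);
```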
## Configuration

You can configure the behaviour of the TransformersPHP library as follows:

```php
use Codewithkyrian\Transformers\Transformers;
```

## Convert your models to ONNX

TransformersPHP only works with ONNX models; therefore, you must convert your PyTorch, TensorFlow or JAX models to
ONNX. It is recommended to use [🤗 Optimum](https://huggingface.co/docs/optimum) to perform the conversion and
quantization of your model.

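As a sketch of that workflow (the model name below is only an example, and the Optimum tooling runs in Python, outside your PHP application), the 🤗 Optimum CLI can export a hub model to ONNX:

```shell
# Install Optimum with its ONNX exporter extras (Python tooling)
pip install "optimum[exporters]"

# Export a model from the Hugging Face hub to ONNX.
# The output directory will contain model.onnx plus the tokenizer files.
optimum-cli export onnx --model distilbert-base-uncased-finetuned-sst-2-english onnx_output/
```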
## Pre-Download Models

By default, TransformersPHP automatically retrieves model weights (ONNX format) from the Hugging Face model hub when
you first use a pipeline or pretrained model. This can lead to a slight delay during the initial use. To improve the
user experience, it's recommended to pre-download the models you intend to use before running them in your PHP
application, especially for larger models. One way to do that is to run the request once manually, but TransformersPHP
also comes with a command-line tool to help you do just that:

```bash
./vendor/bin/transformers download <model_identifier>
```

## Supported tasks/models

This package is a work in progress, but here's a list of tasks and architectures currently tested and supported by TransformersPHP.

### Tasks
