---
base_model: openai/whisper-small
library_name: transformers.js
---

https://huggingface.co/openai/whisper-small with ONNX weights to be compatible with Transformers.js.

## Usage (Transformers.js)

If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:
```bash
npm i @huggingface/transformers
```

**Example:** Transcribe audio from a URL.
```js
import { pipeline } from '@huggingface/transformers';

// Create the automatic speech recognition pipeline
const transcriber = await pipeline('automatic-speech-recognition', 'Xenova/whisper-small');

// Transcribe audio from a URL
const url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav';
const output = await transcriber(url);
console.log(output);
```
Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).