Making Predictions
Making predictions with predictors.
The first step in making predictions is finding a predictor to use:
Explore Predictors on Function
Explore public predictors on Function. These predictors can be used by any user on the Function platform.
Join our waitlist to bring your custom Python functions and run them on-device.
Making Predictions
Making predictions with Function can be done in just a few lines of code.
import { Function } from "fxnjs"
// 💥 Create your Function client
const fxn = new Function({ accessKey: "..." });
// 🔥 Run the prediction locally
const prediction = await fxn.predictions.create({
  tag: "@fxn/greeting",
  inputs: { name: "Yusuf" }
});
// 🚀 Print the result
console.log(prediction.results[0]);
Using Prediction Values
Function supports a fixed set of value types for prediction input and output values:
Function supports the following floating point types:
Function value type | C/C++ type | Description |
---|---|---|
float16 | float16_t | IEEE 754 16-bit floating point number. |
float32 | float | IEEE 754 32-bit floating point number. |
float64 | double | IEEE 754 64-bit floating point number. |
const prediction = await fxn.predictions.create({
tag: "@fxn/identity",
  inputs: {
    radius: 4.5
  }
});
const radius = prediction.results[0] as number;
In languages that don't support fixed-size floating point scalars, the data type for floating point values defaults to float32. Use a tensor constructor to explicitly specify the data type.
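For example, assuming that a Tensor with an empty shape is treated as a zero-dimensional scalar, the @fxn/identity input above could be passed as an explicit float64 like this:
import type { Tensor } from "fxnjs"
// Pass the radius as an explicit float64 scalar.
// Assumption: an empty shape denotes a zero-dimensional (scalar) tensor.
const prediction = await fxn.predictions.create({
  tag: "@fxn/identity",
  inputs: {
    radius: {
      data: new Float64Array([ 4.5 ]),
      shape: []
    } satisfies Tensor
  }
});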
Support for half-precision floating point scalars (float16) is planned for the future, depending on language support.
Function supports floating point vectors (i.e. one-dimensional floating point tensors):
import type { Tensor } from "fxnjs"
const prediction = await fxn.predictions.create({
tag: "@fxn/identity",
  inputs: {
    vector: new Float32Array([ 1.2, 2.2, 3.2, 4.5 ])
  }
});
const vector = prediction.results[0] as Tensor;
Although Function supports input vectors, predictors will always output either scalars or Tensor instances, never plain vectors.
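Concretely, the vector result above comes back as a Tensor whose data and shape fields can be read directly:
// Unpack the returned tensor's typed-array data and shape.
const { data, shape } = prediction.results[0] as Tensor;
console.log(data);   // e.g. Float32Array [ 1.2, 2.2, 3.2, 4.5 ]
console.log(shape);  // e.g. [ 4 ]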
Function supports floating point tensors:
import type { Tensor } from "fxnjs"
const prediction = await fxn.predictions.create({
tag: "@fxn/identity",
  inputs: {
    matrix: {
      data: new Float64Array([ 1.2, 2.2, 3.2, 4.5 ]),
      shape: [2, 2]
    } satisfies Tensor
  }
});
const matrix = prediction.results[0] as Tensor;
Function supports several signed and unsigned integer scalars:
Function value type | C/C++ type | Description |
---|---|---|
int8 | int8_t | Signed 8-bit integer. |
int16 | int16_t | Signed 16-bit integer. |
int32 | int32_t | Signed 32-bit integer. |
int64 | int64_t | Signed 64-bit integer. |
uint8 | uint8_t | Unsigned 8-bit integer. |
uint16 | uint16_t | Unsigned 16-bit integer. |
uint32 | uint32_t | Unsigned 32-bit integer. |
uint64 | uint64_t | Unsigned 64-bit integer. |
const prediction = await fxn.predictions.create({
tag: "@fxn/squeeze",
  inputs: {
    oranges: 12
  }
});
const cups = prediction.results[0] as number;
When integer scalars are passed to predictors, the data type defaults to int32. Use a tensor constructor to explicitly specify the data type.
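For instance, assuming the @fxn/squeeze predictor above accepts an explicit scalar tensor and that an empty shape denotes a scalar, a uint8 input could look like this:
import type { Tensor } from "fxnjs"
// Pass the orange count as an explicit uint8 scalar.
// Assumption: an empty shape denotes a zero-dimensional (scalar) tensor.
const prediction = await fxn.predictions.create({
  tag: "@fxn/squeeze",
  inputs: {
    oranges: {
      data: new Uint8Array([ 12 ]),
      shape: []
    } satisfies Tensor
  }
});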
Function supports integer vectors (i.e. one-dimensional integer tensors) of the aforementioned integer types:
import type { Tensor } from "fxnjs"
const prediction = await fxn.predictions.create({
tag: "@fxn/identity",
  inputs: {
    vector: new Int16Array([ 1, 2, 3, 4 ])
  }
});
const vector = prediction.results[0] as Tensor;
Although Function supports input vectors, predictors will always output either scalars or Tensor instances, never plain vectors.
Function supports integer tensors:
import type { Tensor } from "fxnjs"
const prediction = await fxn.predictions.create({
tag: "@fxn/transpose",
  inputs: {
    matrix: {
      data: new Int16Array([ 1, 2, 3, 4 ]),
      shape: [2, 2]
    } satisfies Tensor
  }
});
const matrix = prediction.results[0] as Tensor;
Unsigned integer tensors are not supported in our Android client because of missing language support in Java.
Function supports boolean scalars:
const prediction = await fxn.predictions.create({
tag: "@fxn/negate",
  inputs: {
    value: true
  }
});
const truthy = prediction.results[0] as boolean;
Function supports boolean vectors (i.e. one-dimensional boolean tensors):
import { BoolArray, type Tensor } from "fxnjs"
const prediction = await fxn.predictions.create({
tag: "@fxn/identity",
  inputs: {
    vector: new BoolArray([ true, true, false, true ])
  }
});
const vector = prediction.results[0] as Tensor;
Although Function supports input vectors, predictors will always output either scalars or Tensor instances, never plain vectors.
Function supports boolean tensors:
import { BoolArray, type Tensor } from "fxnjs"
const prediction = await fxn.predictions.create({
tag: "@fxn/transpose",
  inputs: {
    matrix: {
      data: new BoolArray([ true, true, false, true ]),
      shape: [2, 2]
    } satisfies Tensor
  }
});
const matrix = prediction.results[0] as Tensor;
Function supports string values:
const prediction = await fxn.predictions.create({
tag: "@fxn/upper",
  inputs: {
    text: "hello from function"
  }
});
const uppercase = prediction.results[0] as string;
Function supports lists of values, each with potentially different types:
const prediction = await fxn.predictions.create({
tag: "@fxn/identity",
  inputs: {
    elements: ["hello", 10, false]
  }
});
const elements = prediction.results[0] as any[];
Input list values must be JSON-serializable.
Function supports dictionary values:
const prediction = await fxn.predictions.create({
tag: "@fxn/identity",
  inputs: {
    person: {
      name: "Sara",
      age: 27
    }
  }
});
const person = prediction.results[0] as Record<string, any>;
Input dictionary values must be JSON-serializable.
Function supports images, represented as raw pixel buffers with 8 bits per channel, interleaved by channel. Function supports three pixel buffer formats:
Pixel format | Channels | Description |
---|---|---|
A8 | 1 | Single channel luminance or alpha image. |
RGB888 | 3 | Color image without alpha channel. |
RGBA8888 | 4 | Color image with alpha channel. |
Some client SDKs provide Image utility types for working with images:
import type { Image } from "fxnjs"
const prediction = await fxn.predictions.create({
tag: "@vision-co/remove-background",
  inputs: {
    image: {
      data: new Uint8ClampedArray(1280 * 720 * 3),
      width: 1280,
      height: 720,
      channels: 3
    } satisfies Image
  }
});
const image = prediction.results[0] as Image;
Function supports binary blobs:
const prediction = await fxn.predictions.create({
tag: "@vision-co/decode-jpeg",
  inputs: {
    buffer: new ArrayBuffer(1024)
  }
});
const buffer = prediction.results[0] as ArrayBuffer;
Because Function’s security model prohibits file system access, binary input values are always fully read into memory before being passed to the predictor.
To make predictions on large files, consider mapping the file into memory using mmap or your environment's equivalent.
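As a rough Node.js sketch, a file can at least be read into an ArrayBuffer and passed as a binary input; note that this is plain reading rather than true memory mapping (which Node.js does not expose natively), and the file path is hypothetical:
import { readFileSync } from "node:fs"
// Read the file into memory (hypothetical path).
const file = readFileSync("photo.jpg");
// Slice out the exact ArrayBuffer region backing the Node.js Buffer.
const buffer = file.buffer.slice(file.byteOffset, file.byteOffset + file.byteLength);
// Pass the bytes as a binary blob input using the fxn client created earlier.
const prediction = await fxn.predictions.create({
  tag: "@vision-co/decode-jpeg",
  inputs: { buffer }
});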