Pictofit / Web SDK / 7.3.1 / Custom Mannequins

Custom Mannequins

Using our Shape from Measurements feature, you can also generate customized mannequins for mix & match and size recommendation. This feature uses the Pictofit Compute Server, so please check out the documentation on how to set it up first. The following code requires compute server v1.7.0 or later. Check out this demo to see the feature in action. You can also find its source code in our sample repository.

Custom Mannequin Demo

Fetching Options from the Server

When generating a custom mannequin, several aspects can be configured:

  • bodymodelId There are currently two body models (one for each gender) which define the general body shape, pose and gender of the mannequin.
  • shapeTypes The shape types refer to the basic body type (Ectomorph, Mesomorph, Endomorph). Custom shape types may be supported in the future.
  • measurements This is the list of input measurements which defines the shape of the generated mannequin.

Depending on the configuration files used, the available options might change. Therefore, make sure to query the possible options from the server, as shown in the following snippet. Please check the API Reference for more information on each of these parameters.

let computeServer = new Pictofit.ComputeServer(SERVER_URL, SERVER_TOKEN);

// request all available body model ids
let bodymodelIds = await computeServer.requestBodyModelIds();
// request all available shape types
let shapeTypes = await computeServer.requestShapeTypes();

// request all available measurements (identifier and default values)
let measurements = await computeServer.requestMeasurements(myBodyModelId);
// optionally, you may also pass a custom config file
measurements = await computeServer.requestMeasurements(myBodyModelId, configUrl);

Generating a Mannequin

To generate your custom mannequin, create a request of type BasicMannequinRequestInfo and provide your values for the different options. Most important are the measurements that you provide using the setMeasurement method. These define how your mannequin will look in the end. If you don’t provide a target value for a certain measurement, the default defined by the selected body type (shapeType) will be used.
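Since unspecified measurements fall back to the shape-type defaults, it can be useful to send only the values the user actually changed. The helper below is a sketch of this idea; it is not part of the SDK, and the `{ id, defaultValue }` shape of the fetched measurement entries is an assumption — check the actual structure returned by requestMeasurements for your server version.

```javascript
// Hypothetical helper (not part of the SDK): given the measurement list
// fetched from the server (assumed entry shape: { id, defaultValue }) and a
// map of user-entered values, return only the measurements that differ from
// their defaults. Only these need to be passed to setMeasurement(); the
// server applies the shape-type defaults for everything else.
function changedMeasurements(serverMeasurements, userValues) {
    return serverMeasurements
        .filter((m) => m.id in userValues && userValues[m.id] !== m.defaultValue)
        .map((m) => ({ id: m.id, value: userValues[m.id] }));
}
```

Each resulting entry could then be turned into a measurement for the request, e.g. `info.setMeasurement(new Pictofit.Measurement(m.id, m.value))`.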

The generated mannequin is displayed using an instance of the Pictofit.WebViewer. The appearance of the mannequin's surface is defined by a material using our JSON configuration format. Make sure to load the configuration file containing the material first and to specify the name of the material when generating the mannequin. An example of a simple material could look like this:

  "version" : 2,
  "scene" : {
    "materials" : [
        "name" : "Mannequin-Material",
        "type" : "StandardMaterial",
        "diffuseColor" : [0.5, 0.5, 0.5],
        "emissiveColor" : [0.5, 0.5, 0.5]

The following snippet shows how to create and trigger a request for generating a custom mannequin:

// load the mannequin material before triggering the request
await viewer.loadConfig("assets/mannequin-material.json");

let info = new Pictofit.BasicMannequinRequestInfo();
info.measurementsConfigUrl = "https://myServer/mannequinMeasurementsConfig.json";
info.pbmId = myBodyModelId;
info.shapeType = myShapeType;
info.pose = await fetch("https://pose/myPose.bm3dpose").then((r) => r.blob()); 
info.setMeasurement(new Pictofit.Measurement(myMeasurementId, myMeasurementValue));

let customMannequin = new Pictofit.CustomMannequin(computeServer, viewer);
customMannequin.requestInfo = info;
// request the mannequin and display it in the web viewer. This will add one cascade layer to the scene.
await customMannequin.compute("Mannequin-Material", "Mannequin");

If you want to extract generated blobs from the response, this is how you would do it:

// load the mannequin material before triggering the request
await viewer.loadConfig("assets/mannequin-material.json");

let info = new Pictofit.BasicMannequinRequestInfo();
/** configure request ... **/
// request the body model state file in addition to the model,
// e.g. info.returnFiles.push(Pictofit.MannequinRequestReturnFile.BODY_MODEL_STATE);

let customMannequin = new Pictofit.CustomMannequin(computeServer, viewer);
customMannequin.requestInfo = info;
// calling compute() will always request a model file, since it's needed to display the mannequin in the WebViewer
const blobs = await customMannequin.compute("Mannequin-Material", "Mannequin");
// now the blobs can be extracted out of the response
// note that these might not exist when they were not requested
const bodyModelStateBlob = blobs[Pictofit.MannequinRequestReturnFile.BODY_MODEL_STATE];
const modelBlob = blobs[Pictofit.MannequinRequestReturnFile.MODEL];

To create a new scene for the WebViewer with a set of existing / cached blobs, call the createScene method directly.

// load the mannequin material before loading the mannequin
await viewer.loadConfig("assets/mannequin-material.json");

let customMannequin = new Pictofit.CustomMannequin(computeServer, viewer);
const modelBlob = ...; // restore model blob
// this will add a new cascade layer and does not make a new request to the compute server
await customMannequin.createScene(modelBlob, "Mannequin-Material", "Mannequin");

The result should then look something like this, depending on your scene and request configuration:

Example of a Custom Mannequin

The scene can be customized using the Scene Cascading feature. This allows you to add a skybox, background elements, lights and various other things. You can also easily exchange the material of the mannequin without triggering the request again. Simply load multiple materials and then cycle through them using the nextConfiguration(sceneCascadeIndex : number) and previousConfiguration(sceneCascadeIndex : number) methods. The following code shows how to define a list of configurations for materials:

// define multiple configuration files containing materials for the mannequin
await viewer.loadConfig("assets/mannequin-material-1.json", "assets/mannequin-material-2.json");
// ...
// switch to the next material. This assumes that the materials have been defined first and therefore have cascade index 0
viewer.nextConfiguration(0);

Make sure to use the same material name in your different configuration files since the mannequin refers to exactly one (the material name provided when setting up the request).
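In an application you might want to hook this switching up to UI controls. The small wrapper below is a sketch of one way to do that; the wrapper itself is not part of the SDK, and it assumes the material configurations were loaded first and therefore live at cascade index 0, as in the snippet above.

```javascript
// Hypothetical convenience wrapper (not part of the SDK) around the viewer's
// nextConfiguration / previousConfiguration methods, e.g. for wiring the
// material switch up to UI buttons. `cascadeIndex` is the cascade layer the
// material configurations were loaded into (0 when they were loaded first).
function makeMaterialCycler(viewer, cascadeIndex = 0) {
    return {
        next: () => viewer.nextConfiguration(cascadeIndex),
        previous: () => viewer.previousConfiguration(cascadeIndex),
    };
}
```

Usage could then look like `const cycler = makeMaterialCycler(viewer); nextButton.onclick = cycler.next;`.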

How-To: Cache the Generated Body Model State

It is advisable to cache the bodyModelStateBlob and reuse it, both for performance and UX reasons. One way to do this is to store it in the current browser session. The following sample shows how this can be done:

// setup common entrypoints
const storage = window.sessionStorage;
const STORAGE_KEY = "bodyModelState";

// save the bodyModelStateBlob
const blobUrl = await blobToUrl(bodyModelStateBlob);
storage.setItem(STORAGE_KEY, blobUrl);

// helper function to convert our blob object into a less complex url string
// that we can later fetch to get our blob back
async function blobToUrl(blob) {
    const reader = new FileReader();
    await new Promise((resolve, reject) => {
        reader.onload = resolve;
        reader.onerror = reject;
        reader.readAsDataURL(blob);
    });
    return reader.result.toString();
}

Now you can reuse this state at a later point:

// restore the body model state blob
let bodyModelBlobUrl = storage.getItem(STORAGE_KEY);
const restoredBlob = await fetch(bodyModelBlobUrl).then((r) => r.blob());
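Putting the two halves together, a cache-aware lookup can skip the round trip to the compute server when a state has already been stored. The helper below is a generic sketch (not part of the SDK): it accepts any storage object with getItem/setItem and an async function that produces the value, which in this scenario would trigger the mannequin request and convert the resulting body model state blob with blobToUrl.

```javascript
// Generic cache helper (hypothetical, not part of the SDK): return the value
// stored under `key` if present, otherwise compute it via `computeFn`,
// store it for subsequent calls and return it.
async function getOrCompute(storage, key, computeFn) {
    const cached = storage.getItem(key);
    if (cached !== null) {
        return cached;
    }
    const value = await computeFn();
    storage.setItem(key, value);
    return value;
}
```

With `window.sessionStorage` as the storage, the compute function only runs on the first call of a browser session.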
© 2014-2020 Reactive Reality AG