Feature Request Description
I propose adding a save method to the SemanticSegmentation class. Currently, I use a workaround where I save the model to IndexedDB after loading it. Here's my implementation:
```ts
let graphModel = null;
try {
  console.log("loading from memory");
  graphModel = await tfjs.loadGraphModel('indexeddb://deep');
  console.log("loaded from memory");
} catch (e) {
  console.log("failed to load from memory, loading from network");
  graphModel = await tfconv.loadGraphModel(
      modelConfig.modelUrl ||
      getURL(modelConfig.base!, modelConfig.quantizationBytes!));
  console.log("saving to memory");
  // Persist the freshly downloaded model so the next load hits IndexedDB.
  await graphModel.save("indexeddb://deep");
}
```
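For illustration, here is a minimal sketch of what the proposed method could look like. This is not the actual tfjs-models source; in particular, the `model` field holding the underlying GraphModel is an assumed name.

```ts
import * as tf from '@tensorflow/tfjs';

// Sketch only: assumes the wrapped GraphModel is kept in a `model` field
// (the real field name inside tfjs-models may differ).
class SemanticSegmentation {
  constructor(private readonly model: tf.GraphModel) {}

  // Proposed addition: delegate to GraphModel.save so callers can persist
  // the loaded weights to any TF.js IO destination, e.g. 'indexeddb://deep'.
  async save(url: string) {
    return this.model.save(url);
  }
}
```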
Impact on Current API
This addition would simplify workflows for browser-based applications.
Target Audience
Users running TensorFlow.js in the browser.
Thanks for the feature request. tfjs-models provides pre-trained models and direct APIs for making predictions. Not every TFJS API is exposed in this repository because it focuses on simplifying prediction for users. However, you can still leverage TFJS APIs (like the .save method) in conjunction with tfjs-models.
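For example, one rough pattern for doing this today with the existing TF.js IO APIs (MODEL_URL and the 'indexeddb://deep' key below are placeholders, not values from tfjs-models):

```ts
import * as tf from '@tensorflow/tfjs';

const MODEL_URL = 'https://example.com/model.json';  // placeholder
const CACHE_URL = 'indexeddb://deep';                // placeholder cache key

async function loadCachedGraphModel(): Promise<tf.GraphModel> {
  // tf.io.listModels() reports every model persisted through the TF.js IO
  // layer, so we can check the IndexedDB cache before hitting the network.
  const cached = await tf.io.listModels();
  if (CACHE_URL in cached) {
    return tf.loadGraphModel(CACHE_URL);
  }
  const model = await tf.loadGraphModel(MODEL_URL);
  await model.save(CACHE_URL);  // persist for the next page load
  return model;
}
```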
Could you please elaborate further on how this would benefit the community?
Let me know if I have missed anything. Thank you!