package triton
object InferenceClient
Interface for accessing a Triton inference server through a gRPC client; example usage below.
InferenceClient.config = Util.readConfig("config.json")
val model = "a-sensor-id" // models have the same name as their sensorId

// check channel ready
InferenceClient.channelReady(false)

// server API
InferenceClient.serverLive()
InferenceClient.serverReady()
InferenceClient.serverMetadata()

// model API
InferenceClient.modelReady(model)
InferenceClient.modelMetadata(model)
InferenceClient.modelConfig(model)

// linear interpolate
val tensor = (1 to 40).map(x => Array(x.toDouble, 100 + x.toDouble)).toArray
val inter = InferenceClient.linearInterpolate(tensor, 100)

// inference
val jsonArray: Array[String] = ??? // provide Behavior JSON array
InferenceClient.inferWithJson(model, jsonArray, config)
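For context, the `linearInterpolate(tensor, 100)` call above resamples the input rows to a fixed length before inference. The sketch below is a hedged, standalone illustration of per-column linear resampling; it is an assumption about the behavior, not the actual `InferenceClient` implementation, and `linearResample` is a hypothetical helper name:

```scala
// Hypothetical sketch: resample each column of `points` to `n` evenly
// spaced samples using linear interpolation between neighboring rows.
// Assumes points.length >= 2 and n >= 2. NOT the InferenceClient code.
def linearResample(points: Array[Array[Double]], n: Int): Array[Array[Double]] = {
  val m = points.length
  (0 until n).map { i =>
    // fractional position in the original row-index space
    val pos  = i.toDouble * (m - 1) / (n - 1)
    val lo   = math.min(pos.toInt, m - 2) // lower neighbor, clamped at the end
    val frac = pos - lo                   // weight of the upper neighbor
    points(lo).zip(points(lo + 1)).map { case (a, b) => a + (b - a) * frac }
  }.toArray
}
```

With the 40-row example tensor from above, `linearResample(tensor, 100)` returns 100 rows whose first and last rows match the original endpoints.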