SVM model yields different outputs on RedisGears

Dear all,

I am just starting to learn about the amazing RedisGears and RedisAI, so I apologize in advance if I have missed something.

The problem comes up when I try to run a toy SVM model trained with scikit-learn and exported to ONNX: using the RedisAI C API yields different class labels than using the Redis CLI.

Example 1

Using the RedisAI C API, for instance, produces the following output (I would take [6.93673868037757e-310] to be the class label):

Loading data
[5.7, 2.8, 4.5, 1.3]
Executing model
[6.93673868037757e-310]
[-0.22895553708076477, 2.2316672801971436, 0.9738349914550781]

While running the same model using the same input data on Redis CLI provides:

127.0.0.1:6379> AI.TENSORSET x DOUBLE 1 4 VALUES 5.7 2.8 4.5 1.3
OK
127.0.0.1:6379> AI.MODELRUN svm INPUTS x OUTPUTS classes prob
OK
127.0.0.1:6379> AI.TENSORGET classes VALUES
1) (integer) 1
127.0.0.1:6379> AI.TENSORGET prob VALUES
1) "-0.22895553708076477"
2) "2.2316672801971436"
3) "0.97383499145507812"

The label should be 1, as the Redis CLI output shows, while the RedisAI C API returns (approximately) 0.

Example 2

Using RedisAI C API:

Loading data
[5.1, 3.5, 1.4, 0.2]
Executing model
[6.9367386804922e-310]
[2.2354373931884766, 1.1609625816345215, -0.25650709867477417]

Using Redis CLI:

127.0.0.1:6379> AI.TENSORSET x DOUBLE 1 4 VALUES  5.1 3.5 1.4 0.2
OK
127.0.0.1:6379> AI.MODELRUN svm INPUTS x OUTPUTS classes prob
OK
127.0.0.1:6379> AI.TENSORGET classes VALUES
1) (integer) 0
127.0.0.1:6379> AI.TENSORGET prob VALUES
1) "2.2354373931884766"
2) "1.1609625816345215"
3) "-0.25650709867477417"

I am using the latest version of the Docker image redislabs/redismod:edge.

My code looks as follows:

import redisAI

def analytics(datapoint):

	# loading data
	print('Loading data')
	x = datapoint['value']['data']
	input_data = list(map(float, x.replace('[', '').replace(']', '').split(' ')))
	
	input_tensor = redisAI.createTensorFromValues('DOUBLE', [1, 4], input_data)
	
	# creating RedisAI model runner and execute
	print("Executing model")
	modelRunner = redisAI.createModelRunner('svm')

	redisAI.modelRunnerAddInput(modelRunner, 'input', input_tensor)

	redisAI.modelRunnerAddOutput(modelRunner, 'output_1')

	redisAI.modelRunnerAddOutput(modelRunner, 'output_2')

	model_replies = redisAI.tensorToFlatList(
		redisAI.modelRunnerRun(modelRunner)[0])

	model_replies_1 = redisAI.tensorToFlatList(
		redisAI.modelRunnerRun(modelRunner)[1])

	model_output = model_replies

	# write output to stream
	if model_output:
		print('Writing to output stream')
		execute('XADD', 'result_stream', 'MAXLEN', '~', '100', '*'
				, 'class', model_output
				, 'prob', model_replies_1)

gb = GearsBuilder('StreamReader', desc='').map(analytics).register('input_stream')
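For completeness, the input parsing can be exercised outside of Gears. This standalone sketch (`parse_datapoint` is my own helper name, and it uses a slightly more forgiving `strip`/`split` than the `replace` chain above) does the same conversion:

```python
# Standalone check of the input parsing used in the Gear above.
# parse_datapoint is my own helper name, not part of the Gears API.
def parse_datapoint(raw):
    """Turn a stream field like "[5.7 2.8 4.5 1.3]" into a list of floats."""
    return [float(v) for v in raw.strip('[]').split()]

print(parse_datapoint('[5.7 2.8 4.5 1.3]'))  # [5.7, 2.8, 4.5, 1.3]
```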

Any help is appreciated!

Thanks a lot!

Hey @manl,

Your code looks fine and I cannot spot any error. Any chance you can share the model so I can try running it myself?

Hi @meirsh,

You guys are amazing. Thanks a lot for the fast response. I have uploaded the toy example to GitHub; it should be available at https://github.com/dsmanl/redisgears_redisai_example/blob/master/exported_sklearn_model_svm.onnx.

Again, thanks a lot!

Sure, will check and get back to you.

@manl does your model have internal state? In your code you are running it twice (once for each call to redisAI.modelRunnerRun). I changed the script to run the model only once, and it looks like it's working:

import redisAI

def analytics(datapoint):

    # loading data
    print('Loading data')
    x = datapoint['value']['data']
    input_data = list(map(float, x.replace('[', '').replace(']', '').split(' ')))
    print(input_data)
    input_tensor = redisAI.createTensorFromValues('DOUBLE', [1, 4], input_data)
    
    # creating RedisAI model runner and execute
    print("Executing model")
    modelRunner = redisAI.createModelRunner('svm')

    redisAI.modelRunnerAddInput(modelRunner, 'input', input_tensor)

    redisAI.modelRunnerAddOutput(modelRunner, 'output_1')

    redisAI.modelRunnerAddOutput(modelRunner, 'output_2')

    model_res = redisAI.modelRunnerRun(modelRunner)

    model_replies = redisAI.tensorToFlatList(model_res[0])

    model_replies_1 = redisAI.tensorToFlatList(model_res[1])

    model_output = model_replies

    # write output to stream
    if model_output:
        print('Writing to output stream')
        execute('XADD', 'result_stream', 'MAXLEN', '~', '100', '*'
                , 'class', model_output
                , 'prob', model_replies_1)

gb = GearsBuilder('StreamReader', desc='').map(analytics).register('input_stream')

And then:

127.0.0.1:6379> xadd input_stream * data "[5.1 3.5 1.4 0.2]"
"1593092231545-0"
127.0.0.1:6379> XREAD STREAMS result_stream 0-0
1) 1) "result_stream"
   2) 1) 1) "1593092231546-0"
         2) 1) "class"
            2) "[0.0]"
            3) "prob"
            4) "[2.2354373931884766, 1.1609625816345215, -0.25650709867477417]"
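To illustrate the point about state: if a model mutates internal state on every run, calling modelRunnerRun once per output pairs results from two different runs. A toy sketch of the effect (`StatefulModel` is purely illustrative, not a RedisAI API):

```python
class StatefulModel:
    """Toy stand-in for a runner whose model keeps internal state."""
    def __init__(self):
        self.runs = 0

    def run(self):
        # every invocation advances the state
        self.runs += 1
        return [f'classes@run{self.runs}', f'probs@run{self.runs}']

# Calling run() once per output mixes results from two different runs:
m = StatefulModel()
classes = m.run()[0]  # taken from run 1
probs = m.run()[1]    # taken from run 2 -- a mismatched pair

# Running once and indexing the same result keeps the outputs consistent:
m2 = StatefulModel()
res = m2.run()
classes2, probs2 = res[0], res[1]  # both from run 1
```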

Thanks a lot @meirsh! I have changed the code accordingly. However, I am still getting the same result:

127.0.0.1:6379> xadd input_stream * data "[5.1 3.5 1.4 0.2]"
"1593152921372-0"
127.0.0.1:6379> xrevrange result_stream + - COUNT 10
1) 1) "1593152921375-0"
   2) 1) "class"
      2) "[6.89959811957904e-310]"
      3) "prob"
      4) "[2.2354373931884766, 1.1609625816345215, -0.25650709867477417]"

Here is how I transfer the model to Redis and how I initialise the Gear:

from ml2rt import load_model
import redis

if __name__ == "__main__":

	# establish redis connection
	r = redis.Redis(host='localhost', port=6379)

	if not r.ping():
		raise Exception('Could not connect to Redis')

	# load model saved in ONNX format
	print('Loading model')
	model = load_model('exported_sklearn_model_svm.onnx')

	res = r.execute_command('AI.MODELSET', 'svm', 'ONNX', 'CPU', 'TAG', 'v0.1', 'BLOB', model)
	print('--- Status - Model loading: ' + str(res))

	# load and initialise the gear
	print('Loading and initialising gears')
	gears_file = 'gear.py'

	with open(gears_file, 'rb') as f:
		gear = f.read().decode()
		res = r.execute_command('RG.PYEXECUTE', gear)
		print('--- Status - Gear loading: ' + str(res))

The Gear is now defined as you described:

import redisAI

def analytics(datapoint):
	# loading data
	print('Loading data')
	x = datapoint['value']['data']
	input_data = list(map(float, x.replace('[', '').replace(']', '').split(' ')))
	print(input_data)
	input_tensor = redisAI.createTensorFromValues('DOUBLE', [1, 4], input_data)

	# creating RedisAI model runner and execute
	print("Executing model")
	modelRunner = redisAI.createModelRunner('svm')

	redisAI.modelRunnerAddInput(modelRunner, 'input', input_tensor)

	redisAI.modelRunnerAddOutput(modelRunner, 'output_1')

	redisAI.modelRunnerAddOutput(modelRunner, 'output_2')

	model_res = redisAI.modelRunnerRun(modelRunner)

	model_replies = redisAI.tensorToFlatList(model_res[0])

	model_replies_1 = redisAI.tensorToFlatList(model_res[1])

	model_output = model_replies

	# write output to stream
	if model_output:
		print('Writing to output stream')
		execute('XADD', 'result_stream', 'MAXLEN', '~', '100', '*'
				, 'class', model_output
				, 'prob', model_replies_1)


gb = GearsBuilder('StreamReader', desc='').map(analytics).register('input_stream')

As mentioned before, I am using the following Docker image:

(base) ➜  ~ docker image list redislabs/redismod:edge
REPOSITORY           TAG                 IMAGE ID            CREATED             SIZE
redislabs/redismod   edge                8d656b7a393c        3 days ago          1.28GB

I have also updated to the most recent version of the image (updated 11 hours ago). My OS is macOS Catalina, version 10.15.4.

Thanks!

@manl you are right; there is an issue in the `tensorToFlatList` function: it does not handle the tensor correctly if the encoding is Long. The issue was hidden when I compiled RedisGears with debug flags, which is why I thought it was working.

This PR should fix it https://github.com/RedisGears/RedisGears/pull/374.
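For the curious: tiny values like 6.9e-310 are consistent with int64 tensor data being read through a float64 view, since small int64 bit patterns decode to subnormal doubles near zero. A quick sketch of that effect (my own illustration, not the actual RedisGears code):

```python
import struct

def int64_bits_as_double(n):
    """Reinterpret the raw bytes of an int64 as a float64 (little-endian)."""
    return struct.unpack('<d', struct.pack('<q', n))[0]

print(int64_bits_as_double(0))  # 0.0
print(int64_bits_as_double(1))  # 5e-324 (smallest positive subnormal)
print(int64_bits_as_double(140_000_000_000_000))  # roughly 6.9e-310
```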

I will notify you here once it's merged so you can check again.

@manl the PR was merged and the Docker image has already been rebuilt, so it should work now. Thanks for reporting this.

Dear @meirsh, thanks a lot for all of your efforts! It now works like a charm! Best regards, manl