Kafka to Redis using a Kafka Connect sink: writing JSON records as hashes (HSET) in Redis

I want to use Kafka Connect to read data from Kafka topics and write it to Redis. My goal is to take the records, which are produced to the topic in JSON format, and write them to Redis using the HSET command. I have tried several libraries for this, but there seems to be very little helpful documentation on connecting Kafka to Redis.

I have tested some existing connectors, but none of them met my needs. For example, I followed the instructions in this repository and first created a connector configuration as follows:

curl -X POST -H "Content-Type: application/json" --data '
  {"name": "redis-sink",
   "config": {
     "redis.type": "HASH",
     "key.converter": "org.apache.kafka.connect.storage.StringConverter",
     "value.converter": "org.apache.kafka.connect.json.JsonConverter",
     "value.converter.schemas.enable": "false"
}}' http://localhost:8083/connectors -w "\n"

So when I produce a sample message like the one below:

{"foo": "bar"}

the only thing I see written to Redis is:

"MSET" "com.redis.kafka.connect.sink.offset.testTopic.0" "{\"topic\":\"testTopic\",\"partition\":0,\"offset\":0}"

I have spent a week trying to set up this demo, but I haven't found a solution yet. I have also searched the web and YouTube and looked at these connectors on GitHub, but there seems to be no clear, helpful documentation for this use case. I would appreciate any guidance on this.
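For what it's worth, the behavior I want (consume a JSON record, write its fields with HSET) can be reproduced without Kafka Connect using a plain consumer. The sketch below uses `kafka-python` and `redis-py`; the `record_to_hash` helper is my own illustrative function, not part of any connector:

```python
import json

def record_to_hash(key, value_bytes):
    """Turn one Kafka record into (redis_key, field_map) for HSET.

    Every top-level JSON field becomes a hash field; values are
    stringified, since Redis hash values are strings.
    """
    fields = json.loads(value_bytes)
    return key, {k: str(v) for k, v in fields.items()}

# Consumer loop (requires the kafka-python and redis packages,
# plus running Kafka and Redis servers -- shown here as a sketch):
#
# from kafka import KafkaConsumer
# import redis
#
# r = redis.Redis(host="localhost", port=6379)
# consumer = KafkaConsumer("testTopic", bootstrap_servers="localhost:9092")
# for msg in consumer:
#     # fall back to topic:offset when the record has no key
#     key = msg.key.decode() if msg.key else f"{msg.topic}:{msg.offset}"
#     redis_key, mapping = record_to_hash(key, msg.value)
#     r.hset(redis_key, mapping=mapping)
```

So for the sample message `{"foo": "bar"}` produced with key `user:1`, this would run `HSET user:1 foo bar`.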
