Write multiple Spark DataFrames within one connection

I am doing structured streaming, and I sometimes have multiple DataFrames to write to Redis using spark-redis.
I wanted to check whether there is a way to write multiple DataFrames with just one write command, within a single Redis connection.

Please let me know if the question needs more details.

No, it’s not possible.

In spark-redis, each DataFrame is persisted as its own set of Redis hashes (one hash per row, stored under the DataFrame's key prefix). There is no single command that writes the hashes of multiple DataFrames at once, so there is no way to write multiple DataFrames with just one write command: each DataFrame needs its own write call.
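What you can do instead is issue the writes sequentially inside a `foreachBatch` sink. Below is a minimal sketch of that pattern; the table names (`people`, `orders`), the key column `id`, and the way the two DataFrames are derived from the batch are assumptions for illustration. spark-redis pools Redis connections per executor, so consecutive `save()` calls reuse pooled connections even though each one is a separate write command:

```scala
import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}

object MultiDfRedisWrite {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("multi-df-redis-write")
      .config("spark.redis.host", "localhost")   // assumed Redis endpoint
      .config("spark.redis.port", "6379")
      .getOrCreate()

    val stream = spark.readStream
      .format("rate")                            // placeholder source for the sketch
      .load()

    val query = stream.writeStream
      .foreachBatch { (batch: DataFrame, batchId: Long) =>
        // Hypothetical split of the micro-batch into two DataFrames.
        val people = batch.filter("value % 2 = 0")
        val orders = batch.filter("value % 2 = 1")

        // Two separate write commands, but connections are pooled,
        // so no new connection is opened per DataFrame.
        people.write
          .format("org.apache.spark.sql.redis")
          .option("table", "people")
          .option("key.column", "value")
          .mode(SaveMode.Append)
          .save()

        orders.write
          .format("org.apache.spark.sql.redis")
          .option("table", "orders")
          .option("key.column", "value")
          .mode(SaveMode.Append)
          .save()
      }
      .start()

    query.awaitTermination()
  }
}
```

This keeps both writes inside the same micro-batch transaction boundary on the Spark side, which is usually what matters for structured streaming, even though Redis still receives them as separate commands.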