I've been struggling to pass small amounts of aggregated data cleanly and safely between Redis and my OpenResty backend without splitting the data across a bunch of separate keys/lists in Redis. I'd be interested in feedback on the very lightweight JSON-in-Redis approach I've taken - others may find it useful too.
I was mixing different facts in a single Redis key (perhaps a bad idea), and I wanted to write to it atomically, without holding a key lock between the app's read and its write.
My approach has been to transform Lua code that modifies the JSON structure into a series of simple diff commands, which are sent through to Redis and applied atomically with EVAL. This means that two processes can execute the following code without having to worry about locking (Redis still serialises commands internally to make this work, but that's faster than locking at the application level):
Process 1:
user = atomic_redis(client, "KEYNAME")
user:set("password","hunter2")
Process 2:
user = atomic_redis(client, "KEYNAME")
user:set("todo_lists", {{name = "Work", items = {}}, {name = "Home", items = {}}})
These will never overwrite each other's changes to the JSON object stored at KEYNAME - each change is only passed through to Redis as a diff and applied to the JSON object there.
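To give a feel for the server-side half, here is a minimal sketch of how a single "set" diff could be applied inside EVAL. This is not the actual script from the repo - the argument layout (a field name plus a JSON-encoded value) is made up for illustration:

    -- Hypothetical sketch: apply one "set" diff to the JSON blob at
    -- KEYS[1]. The whole read-modify-write runs inside a single EVAL.
    local blob = redis.call("GET", KEYS[1])
    local doc = blob and cjson.decode(blob) or {}
    doc[ARGV[1]] = cjson.decode(ARGV[2])  -- e.g. "password", "\"hunter2\""
    redis.call("SET", KEYS[1], cjson.encode(doc))
    return redis.status_reply("OK")

Because the GET, the modification, and the SET all happen inside one EVAL, no other client's command can interleave between them - that is where the atomicity comes from. From OpenResty you could ship it with lua-resty-redis, something like red:eval(script, 1, "KEYNAME", "password", cjson.encode("hunter2")).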
Deletions, insertions, etc. are all supported as diffs, e.g.:
user("todo_lists"):match("name", "Home"):del() will go into todo_lists, and delete the object where name = Home.
The code is up on GitHub at https://github.com/forkfork/atomic-redis, and suggestions are of course welcome.
NB: I don't recommend using this for anything terribly important (I guess use a proper SQL database for that).
Tim