> Thanks for responding by the way.
Same to you. Mobile apps and cryptography are admittedly not my day job, so talking over these details with folks is really helpful.
> enforce maximum number of accounts/keypairs on one device
Comrad makes heavy use of the client's hardware: it compiles Themis, a high-level cryptography library, so that all encryption keys can be generated locally and only public keys need be transferred over the network; and it uses torpy, a pure-Python implementation of a Tor SOCKS proxy. (This is why it can't work in a browser and needs to be an app.) So, similarly, the app could detect whether any encryption keys are already stored on the device it's running on before allowing the creation of a new user/set of keys. You could probably get around this restriction by re-engineering the app's software, but at least that's made more difficult to do.
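For instance, a check like this could run before the signup flow. This is just a sketch: the key directory path and the `*.key` file pattern are my assumptions, not Comrad's actual layout.

```python
# Hypothetical sketch: allow creating a new account only if no keypair
# files already exist in the app's local data directory.
# The directory layout and "*.key" pattern are illustrative.
from pathlib import Path

def can_create_account(key_dir: Path) -> bool:
    """True only if no key files are stored under key_dir."""
    return not (key_dir.exists() and any(key_dir.glob("*.key")))

# e.g. before showing the signup screen:
# if not can_create_account(Path.home() / ".comrad" / "keys"):
#     show_error("An account already exists on this device.")
```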
> > make user prove not a bot
> This is necessary to prevent spam (even if you remove upvotes), but automatic verification would be difficult since it has to work in terminal and GUI.
Hm. Are there no client-based/pythonic ways to test humanity? Some math problems? A funny socialist quote requiring some basic reading comprehension? Again, maybe not foolproof, but it adds another barrier.
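Something as simple as a randomly generated arithmetic question would work in both a terminal and a GUI. A rough sketch (not from Comrad's codebase):

```python
# Toy humanity check: generate a small arithmetic question and verify
# the answer. Easy for a person, annoying for a naive bot.
import random

def make_challenge() -> tuple:
    """Return a (question, expected_answer) pair."""
    a, b = random.randint(2, 12), random.randint(2, 12)
    return f"What is {a} times {b}?", a * b

def check_answer(answer: str, expected: int) -> bool:
    """Accept the answer if it parses to the expected integer."""
    try:
        return int(answer.strip()) == expected
    except ValueError:
        return False
```

The same challenge/check pair could be rendered as a terminal prompt or a GUI dialog, since it's just strings in and a boolean out.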
> Additionally, Comrad does not seem to have any moderation, whether by users or by admins, meaning any verified user can easily spam.
Not yet anyway. For group accounts, the idea is that moderation could be achieved through a 'web of trust'. If A vouches for B, then B 'joins' the group. If B then vouches for C and D, potentially A would have to approve it for C and D to join. Then if B writes to the group, the message is actually only sent to A, C, and D (i.e. B's immediate neighbors in the web of trust). Each of those recipients would then have to accept B's message as legitimate in order for B's message to keep propagating through the web of trust to the rest of the group. The additional advantage of that model is that every message remains encrypted E2E between specific users, so no one actually needs the group's key; and the server doesn't need to store who vouches for whom, since every user can simply store whom they've vouched for on their hardware. At least, that's the idea so far.
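The propagation rule above could be sketched like this. Member names and the `accept` callback are illustrative; this shows only the graph walk, not the encryption.

```python
# Sketch of the vouching model: a message from a member reaches only
# their immediate neighbors in the web of trust, and spreads further
# only as each recipient accepts it as legitimate.
from collections import defaultdict

class WebOfTrust:
    def __init__(self):
        self.vouches = defaultdict(set)  # member -> members they're linked to

    def vouch(self, voucher, new_member):
        self.vouches[voucher].add(new_member)
        self.vouches[new_member].add(voucher)  # trust edge is mutual

    def neighbors(self, member):
        return self.vouches[member]

def propagate(web, sender, accept):
    """Return the set of members a message reaches, where
    accept(recipient, forwarder) is each recipient's decision."""
    reached, frontier = {sender}, [sender]
    while frontier:
        node = frontier.pop()
        for peer in web.neighbors(node):
            if peer not in reached and accept(peer, node):
                reached.add(peer)
                frontier.append(peer)
    return reached - {sender}
```

So if A vouched for B, and B vouched for C and D, a message from B goes first to A, C, and D, and stops at anyone who rejects it.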
> If you really want to have upvotes, make them non-anonymous. That way, an upvote is just a shorthand for replying "Good post".
Hm, that's interesting. So in that case the server could store which users have liked any given post. (Everything the server stores btw is in a simple key-value store: the value is encrypted with a symmetric key hidden on hardware, and the key is 'hidden' by running it through a hash algorithm using an extra 'salt' string, which is also hidden on hardware.)
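In outline, that store works something like this. The XOR "cipher" below is a stdlib stand-in purely so the sketch runs; the real app uses a proper symmetric cipher (via Themis), and the salt/key values here are placeholders, not anything from Comrad.

```python
# Sketch of the key-value scheme: the lookup key is hidden by hashing
# it with a secret salt, and the value is encrypted with a symmetric
# key. The XOR keystream here is a toy stand-in for a real cipher.
import hashlib

SALT = b"secret-salt-on-hardware"    # placeholder: kept only on the server
SYM_KEY = b"secret-key-on-hardware"  # placeholder: never leaves the server

def hidden_key(name: str) -> str:
    """Salted hash of the logical key name."""
    return hashlib.sha256(SALT + name.encode()).hexdigest()

def _keystream(length: int) -> bytes:
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(SYM_KEY + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(value: bytes) -> bytes:
    """XOR with the keystream; applying it twice decrypts."""
    return bytes(a ^ b for a, b in zip(value, _keystream(len(value))))

store = {}
store[hidden_key("user123/posts")] = encrypt(b"hello world")
```

Someone reading the raw store sees only hash digests mapped to ciphertext; without the salt and key from the server's hardware, neither side of a record is meaningful.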
My worry (maybe paranoid) with allowing UserX-upvotes-PostY to be public is that someone could potentially reconstruct your real-life social network from whose posts you like, and from there guesstimate who you are depending on other creepy data the FBI or whoever has. But that level of paranoia may run counter to the logic of a social network, which is semi-public by design, anyway.
I can imagine a few other potential solutions to this problem, each only half satisfactory:
* Store only on your hardware whether you've upvoted PostY. Keeps anonymity, but leaves it susceptible to someone rewriting the client software.
* UserX sends a message encrypted to the author of PostY that they liked the post. The server stores how many people liked the post, but only UserX and the author of PostY can read the E2E encrypted fact of the upvote.
* Store on the server whether UserX-upvoted-PostY, but store this fact as an encrypted value under a salt-hashed key, so it's at least hard to find or read. Keeps anonymity but leaves it vulnerable to an attack on the server (which would have to be hacking-based, since you'd need hardware access to find the server's encryption key and hash salt phrase).
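The third option could look roughly like this (salt and IDs are illustrative; in the real scheme the stored value would also be encrypted, as described above):

```python
# Sketch of option 3: the upvote fact is stored under a salted-hashed
# key, so the server's store never contains "UserX upvoted PostY" in
# the clear. The salt here is a placeholder.
import hashlib

SALT = b"server-side-secret-salt"

def upvote_key(user_id: str, post_id: str) -> str:
    """Opaque key for the (user, post) upvote fact."""
    return hashlib.sha256(SALT + f"{user_id}:{post_id}".encode()).hexdigest()

votes = set()
votes.add(upvote_key("UserX", "PostY"))

def has_upvoted(user_id: str, post_id: str) -> bool:
    return upvote_key(user_id, post_id) in votes
```

The server can answer "has this user upvoted this post?" for the client, but without the salt it can't enumerate who upvoted what from the stored digests.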
> Same as every other anonymous P2P network (e.g. I2P, Freenet)
I looked into that: I2P, Freenet, IPFS, etc. None of them (afaik) work on a mobile client, and some are hard to set up even on a desktop. I really want this to run on mobile and be easy to set up, since your average comrade/leftist has no idea how to set up all this stuff.
> > They're meant to be readable to anyone using the app
> Which is why it's pointless to encrypt them. There is no improvement to privacy compared to storing them in plain text.
Maybe so, but it ensures that nothing unencrypted is ever stored or transmitted, which at least makes it more difficult for an outside actor (like an ISP) to listen in and decipher it. It also means that every message a user receives is encrypted E2E personally to them: group/direct messages already are, and messages to the world are automatically encrypted from the Operator to each user. It also lets the Operator rotate its private/public keypair and notify users of the change, so that only someone storing the history of every public key over time could decrypt older data.
> > The Op verifies the sender's signature by checking it against the sender's public key on file
> Is there any point to this being done by the server, rather than the client?
It's done by both. I guess the Op doing it too just makes sure nothing illegitimate is stored or sent to the client, even if the client could also determine the same thing.
> Except it's also impossible to verify that the server is running unmodified code.
True. Not sure what to do. Perhaps a public and legally binding commitment of some kind. But again, this seems standard. We can't ever fully know what bunkerchan or mastodon or any other server is doing with the data we send them.
> > A p2p network would be even less likely to ever fully delete a piece of data, once uploaded.
> For publicly shared data, sure. But privately shared data would only reach its intended recipients unencrypted, meaning you only have to trust them.
> I suggest you look into anonymous P2P networks.
See above: they don't currently work on a mobile client, as far as I know. Also, I thought that networks like IPFS never actually delete a piece of data. Which anonymous private networks are you thinking of? I initially set up a custom Kademlia P2P network for Comrad users, with the server just acting as the seed/prime node. All the data was encrypted E2E, but the users/nodes needed to know and remember each other's IP addresses in order to communicate.
> Good point, but the auto-deletion might make people more okay with sharing confidential information.
Fair enough. I guess you'd have to educate people what the risks are for any given action on the app.
> Since Comrad servers are not federated, if you decided to use an alternative instance you would only be able to communicate with the other 3 people who use it.
I meant something closer to piratebay: if the worry is that it could be shut down because it's only a single server, then it would be easy to spring it back up somewhere, and just change the .onion API address the app uses.
> There are already far too many social networks. What advantages does Comrad have compared to: ...
I try to lay that out [here](https://github.com/ComradOrg/Comrad/wiki/Comparison-of-alternative-social-networks),
but I guess in short it's basically just a more accessible mixture of those elements. It makes public-key crypto and Tor accessible to normal people: it all works behind the scenes in a single app, which to all appearances works just like Twitter.
Sorry for the long post, but thanks for your and everyone's comments, I appreciate it.