I hope they don’t arrest them too.
Not that the action against Telegram is right, but there’s a big difference between what Signal and Telegram are doing.
Would you have more info on the differences? I was wondering the same thing, but I don’t know enough about Telegram to compare
Signal always responds to authorities when they ask for data, and they give them all they have: the date the account was registered, the phone number, and the timestamp the account last used the app.
Telegram has unencrypted channels used for drug dealing and, from what I’ve heard, a lot of illegal porn too. The authorities want information on certain users there and Telegram doesn’t comply. That’s breaking a law Signal isn’t breaking, because Signal always hands over all the data it has to law enforcement.
Telegram is a propaganda weapon in some sense, caught between two worldviews - one is “a good service doesn’t require trust, because they physically can’t sell you out”, the other is “a good service is one you can trust because they won’t sell you out”. And Telegram helps the latter.
So frankly - kill it with fire. Sadly I’m in Russia and everybody uses it here.
While that’s not wrong, context matters: US social media companies also enable human, weapons, and drug trafficking. They play a role in a few genocides too.
but the western regime does not care.
But they give their data when the officials ask. That is all that matters. And I seriously hope none of us uses Telegram or WhatsApp for any discussions. Use Signal, because that is so far pretty unbreakable.
Telegram is already in the hands of that tiny Russian old man and WhatsApp is owned by a lizard.
Yeah, try telling your family, friends, colleagues, therapist to use Signal.
Did so years ago. Everybody in my family and friend group uses it. I’ve had a very active group chat there with friends for eight years. My mom uses it actively, even calls me using Signal. My partner knows it is the best chat app and actively uses it.
I just asked ages ago for everybody to switch to Signal. They evaluated the features, and for a group chat, automatically deleted messages and strong encryption were really interesting for everybody. Now we can shoot the shit in a group chat without needing to worry that the logs are stored somewhere forever.
All of the illegal stuff like that I’ve seen around on social media always links to Telegram channels. Most of the time what you see on regular social media are bots advertising the Telegram channels, where the real people are at.
Hilarious that it’s impossible. They don’t even hoard your data.
Is it the timestamp of last usage, or timestamps of all messages?
I’m no authority on it but from what I’ve read it seems to have more to do with the social features of telegram where lots of content is being shared, both legal and illegal. Signal doesn’t have channels that support hundreds of thousands of people at once, nor media hosting to match.
Right, the French authorities are going to present evidence that this dude was aware of specific illegal activity and refused to comply with a legal warrant involving said activity, making him guilty of obstruction at best, and possibly conspiracy. Signal complies with warrants, they just don’t have anyone’s keys. Telegram has everyone’s keys and theoretically could turn them over, but they refuse. That’s a huge difference from a legal perspective.
Thank you. I’m going to restate your explanation to be sure I’ve got it:
- authorities want platforms to comply with legal requests
- when Signal gets a subpoena, they open the key locker and show that it’s empty. They provide the metadata they can (sign up date and last seen date, full stop) and tell authorities they can’t do better.
- when Telegram gets a subpoena, they open the key locker and show all the keys, then slam it shut in the face of the investigator, telling them to get bent.
- conclusion: it’s easier to never have the keys in the first place than to tease the government with them
It’s easier, but Telegram’s authors are from Russia. They psychologically can’t accept that “never have the keys” thing. They want to have control, and they want to be able to say “yes” to the investigator, possibly for something in return.
And it’s sad that it doesn’t. Because that’s why people use Telegram.
Media hosting - well, I suppose something similar to BitTorrent (or just sharing encrypted files over BitTorrent) could back such a system?
Telegram’s channels are like blogs; they have reactions and comment links leading to a group chat associated with the channel.
It’s basically a social network in an instant messenger format.
Socially - in terms of finding a market niche - Telegram is the smartest thing to happen on the Internet recently. Durov really is a good businessman.
She responds to this point in the interview.
Indeed there is, one is an op funded by US intelligence agencies and the other is a platform that the US has no control over.
Telegram is available on F-Droid. Signal is not. Whatever Signal is doing, it’s pretty bad.
Are you developing your opinions based on vibes, or have you actually audited their software yourself (you are free to do so; both the client and server code are available)?
If you audited it, have you produced an actual report with metrics and points of reference for your data points?
This person has been running around spreading FUD in every post about this
It’s what I’ve come to expect from the lemmy.ml instance, and I finally blocked the entire instance.
But you still post in lemmy.ml/privacy?
It’s actually sad; even though I’m a libertarian, tankies and Marxists in general could have made a good contribution to our future. But if they can believe Telegram is secure based on vibes, without doing even basic research, they’ve already lost.
Heeey, I am also a libertarian, I just tend towards left libertarian. Back to the point of discussion, I find it difficult to have a meaningful conversation with the tankies or in general anyone from lemmy.ml. The discussions tend to lack any real data and feel entirely vibe based, OR it’s apologist bullshit for Russia.
Like it’s cool if you like communism and have a philosophy based around why you think it’ll help humanity. I can politely disagree but still listen and discuss. It’s quite another to just be a complete dipshit and say “Ukraine had the invasion coming” (actual quote I’ve seen).
Doesn’t take away the fact that not being on F-droid is a huge issue and says a lot about how much they care about privacy and security.
The folks at F-Droid have said that Signal would certainly qualify, but Signal doesn’t want multiple channels out there. F-Droid is just honoring their wishes.
Assuming you’ve audited Signal, can you tell us what your findings were and why you think Signal must be up to something pretty bad? I’m very curious and would love to be enlightened by someone as knowledgeable as you.
I’ll leave it up to you to decide if that is bad or not, but one of the reasons the Signal app can’t be put unaltered on F-droid is because it loads in external dependencies from Google at run-time, which can also be altered by Google at will with any Android update.
one of the reasons the Signal app can’t be put unaltered on F-droid is because it loads in external dependencies from Google at run-time
IIRC, the APK you get directly from their website doesn’t have the GCM bits in it (edit: I did not recall correctly; the GCM bits are there, but there is a websocket fallback if GCM isn’t available), and will work without them. At least, I didn’t have any issues with notifications back when I was running the website APK with GrapheneOS and no Google bits.
Lots of apps have slight modifications in F-Droid. Like Telegram for instance.
How significant is it whether the server code is open source or not? It’s possible for Signal to publish their server code while running completely different software on their servers. The point is that the client is open source and audited on a regular basis by the community, which is why it doesn’t make sense to rely on trusting the server-side software.
The entire point is that we don’t have to trust the server at all. The client is open source and regularly audited by the community. As long as the client stays fully open source, everything’s fine. Also, the closed source dependencies are part of a spam reduction effort, which IMO is well worth it; prior to this, Signal had a spam problem, and the client itself remains fully open source.
Signal could very well have not even told people that they added a closed source dependency on Google to their servers and just lied by publishing fake server code that omits the closed source dependency, but instead they were very transparent about the spam problem. In terms of the “why?” regarding the closed source dependencies, their argument is that making it open source would almost immediately result in all anti-spam measures being thwarted. Frankly I’m inclined to agree, and again, as long as the client is fully open source and regularly audited, the server code is irrelevant to user privacy/security.
https://community.signalusers.org/t/spam-scam-on-signal/26665
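To make the “don’t trust the server” point concrete, here’s a minimal sketch of the end-to-end idea using libsodium’s Box construction via the PyNaCl library. This is illustrative only - Signal actually uses X3DH plus the Double Ratchet, which is far more involved - but it shows why a relay that only ever handles ciphertext can’t read content no matter what code it runs.

```python
# A minimal sketch of end-to-end encryption through an untrusted relay.
# Illustrative only: real Signal uses X3DH + Double Ratchet, not plain NaCl boxes.
from nacl.public import PrivateKey, Box

# Each party generates a keypair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The "server" only ever sees (and can log) the opaque ciphertext plus
# delivery metadata such as who it is addressed to -- which is exactly
# the metadata argument made elsewhere in this thread.
relayed = ciphertext

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(relayed))  # b'meet at noon'
```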
The external Google dependencies I am talking about are loaded into the client, not the server, so that’s an entirely different issue.
Every app from the Play store requires GCM though, and Signal functions even if a user disables GCM. It pertains to a phone’s ability to notify a user of a new message. But again, users can disable GCM and the app itself will continue to work just fine.
For what it’s worth, the APK on Signal’s website (obviously) doesn’t have the external Google dependencies. Personally, I really don’t see this as an issue at all.
It would still be nice to have the server code. I want to run my own server on my own hardware
Someone should audit your downvote
Jokes aside, I’m a firm believer that upvotes/downvotes should be private and I think it’s very unfortunate that they aren’t. I’m fine with people downvoting me and me not knowing who they are.
Wonder how you get negative one down vote…
Yeuup
She has her hand in too many strategic places, unlike Telegram.
- employed at Google for 13 years
- speaker at the 2018 World Summit
- written for the American Civil Liberties Union
- advised the White House, the FCC, the FTC, the City of New York, the European Parliament, and many other governments and civil society organizations
It’s a pleasing thought, of course, that an influential person may have morals and good goals (and nice looks).
But since there’s no way to know for sure, I think I’ll just stop trying to classify those names into good and evil.
The very fact that there have never been any attempts in the west to stop Signal from operating says volumes in my opinion.
She’s in the US
Say what you will about the US, but they are pouring money into the cyber security industry.
Dude, it’s a non-profit, and their biggest donation is money that was made by selling WhatsApp to Facebook. Cuz the guy just couldn’t live with what happened to his creation.
They won’t; there’s no need. Their clients are garbage and they’re most likely backdoored anyways. This action against Telegram is only happening because they can’t get inside it; they can’t backdoor it nor corrupt anyone. If they were able to do that, they wouldn’t be doing this.
No matter how good the protocol or client encryption, your privacy is only as good as your own physical security for the device in question.
Given that if you lose your private key, there is no recovery, I would be surprised if there were real back doors in the clients. Maybe unintentional ways to leak data, but you can go look for yourself: https://github.com/signalapp/Signal-Android
They have one for each client.
As an example of this, I believe SexyCyborg got in trouble for reporting on leaks via people’s 3rd party Chinese language keyboards. So her theory is that the keyboard apps people had installed leaked data when Hong Kong protesters were communicating with the press, rather than the actual Signal app. But… as stated above, people have to take responsibility for their device and in this case, they had chosen to install apps with leak issues into the communication process.
This is precisely why opsec is more than just an app.
Leaky keyboards are a possibility, but what is actually far more likely is just that someone in the Signal group chat was a mole who was archiving the traffic for the party. Signal has since made efforts to bring anonymous accounts to the platform, which will help thwart such attacks. Though against a state actor it is still not enough unless you take additional measures to obfuscate traffic. And even then, that still doesn’t protect you against some CCP brownshirt tailing you and snatching your phone out of your hand when you unlock it.
Leaky keyboards are more than a possibility. Sogou, the biggest one for Chinese typing, got found out a year or so ago for having terrible client-server encryption. They fixed it in an update, but many people didn’t get the update - not to mention it’s still sending every keystroke to Tencent (the owners, I think), so they could also be saving and analysing private typing anyway.
deleted by creator
Maybe unintentional ways to leak data,
Yeah, that’s what I think it may be. Just like Apple reporting all the apps you open via unencrypted HTTP calls, and a few other things.
Are you talking about the phone notification bullshit, where Google got caught reporting to the government with no warrants?
Not only that, https://sneak.berlin/20201112/your-computer-isnt-yours/
Signal’s defaults are pretty good about that. Push notifications are both opt-in, and the information they send can be selected by the user. You can have it say “new message” and that’s it. Or the sender’s name. Or the whole message.
I agree that it’s not intuitive to most people that that’s a leak, but push notifications are kind of wonky in how they work.
Signal is all around very strong… my main criticism is the “trust Signal bro” cult pretending like Signal would not log chats if ordered to by the spooks. Which is naive AF and feels like they are trying to make normies comfortable so they don’t demand better.
Telegram isn’t even E2EE
If you don’t turn on the secret chat feature it won’t be, yes. However, if E2EE were the only deciding factor for a government to go against an app, then they wouldn’t be going after Telegram. The fact that govts are going so hard at Telegram simply proves that even when the company has access to all our chats, they don’t actually provide them to said govts.
I’m not saying Telegram is good from a security perspective, I’m just saying that even without E2EE and all the modern wonders, govts still can’t get in because the company doesn’t indulge their requests.
This is a very rude question, but on this subject of being lean, I looked up your 990, and you pay yourself less than … well, you pay yourself half or a third as much as some of your engineers.
Yes, and our goal is to pay people as close to Silicon Valley’s salaries as possible, so we can recruit very senior people, knowing that we don’t have equity to offer them. We pay engineers very well. [Leans in performatively toward the phone recording the interview.] If anyone’s looking for a job, we pay very, very well.
But you pay yourself pretty modestly in the scheme of things.
I make a very good salary that I’m very happy with.
That’s pretty cool. But knowing the number would matter.
IIRC she earns around $400k+ per year, which is a nice salary but rather low compared to other execs.
LOL it’s actually even lower if you look at Schedule J. Her base compensation is only 115,057. It’s bonus and incentive comp (76,172) that brings it up.
As a happy user of Signal (no bugs or incidents from my viewpoint), I regardless chime in to say a word for decentralization. :)
Signal is centralized:
- there is a single Signal implementation, with a single developing entity
- you have to install its mobile version before you may run the desktop version
There exist protocols like Tox which go a step beyond Signal and offer more freedom: they have multiple clients from diverse makers (some of them unstable), don’t have centralized registration, and don’t rely on servers to distribute messages - only to distribute contact information.
In the grand comparison table of protocols (not clients), Tox is among the few lines that’s all green (Signal has one red square).
Tox isn’t the most secure or private. I would go Simplex Chat
Removed by mod
Not anymore. They have made hostile changes and are screwing over their early adopters. It also lacks forward secrecy.
And effectively cannot be selfhosted.
Signal’s hostility to third party clients is a huge red flag.
They also refuse to distance themselves from Google’s app store.
That’s outdated information:
- Molly ~5 years in development
- gurk-rs ~4 years in development
- signal-cli ~9 years in development
- Flare
- Beeper
Go forth and contribute, fork, or create your own.
They also refuse to distance themselves from Google’s app store.
This link has existed forever at this point if we count in internet years: https://signal.org/android/apk/ - getting an app directly from the developer with no middleman is about as distant as you can get from Google’s app store.
Those clients exist despite Signal Foundation, not because they encourage community development. They are doing everything they can to discourage third party app development.
They are doing everything they can to discourage third party app development.
I’d say you’re moving the goalpost. Other than the hostility the founder showed towards LibreSignal nearly 10 years ago now, can you source any evidence to support your claim?
Lots of red flags here in Github: https://github.com/signalapp/Signal-Android/issues/9044
That link, and I could be missing it, has nothing to do with what I claimed. Mind editing your post and quoting a red flag linked at the source you provided?
Some of my favourite red flags:
Signal’s dependence on Google libraries: https://github.com/signalapp/Signal-Android/issues/9044#issuecomment-535194837
Signal dev bullshitting a non-answer and then hilariously refuting his non-answer: https://github.com/signalapp/Signal-Android/issues/9044#issuecomment-534340623
Signal hiding its serverside source code for many months: https://github.com/signalapp/Signal-Android/issues/11101
You can find many more examples.
The last one about server side code, together with Signal’s funding sources and their obsession with phone numbers, leads me to suspect that Signal is just a honeypot by US intelligence.
Those clients exist despite Signal Foundation, not because they encourage community development. They are doing everything they can to discourage third party app development.
That was your original claim. None of the sources you provided back up your original claim. We can talk about Google libraries or the delay in server side code if you want to go down that path, but that’s a completely different discussion. Why are you pivoting to other topics? Will you concede your original point or do you have evidence to back it up?
I wish they had Signal on F-Droid, but at the end of the day at least it is possible to use Molly FOSS.
Signal actually has a rule about not using third party clients on its servers. These clients existing does not prove the point you intend.
can you post a link to this rule?
Yeah, I would like to use it from f-droid instead of google store or apk
https://molly.im/ Especially the FOSS version. Need to manually add the repository though.
This is the way.
Or use Accrescent
What? How is this a red flag? Having third party clients is not good for security.
Having third party clients is not good for security.
If the first party provider told you this, you should always second guess them.
Moreover, providing an option that informed users can choose doesn’t hurt security. This idea that the user can’t be trusted to use the appropriate type of messaging if given options needs to die.
Is there any merit to this comment?
When you use a client, you are relying on the client’s crypto implementation to be correct. This is only one part of it and there’s a lot more to it when it comes to hardening the program. Signal focuses on their desktop and mobile clients and they hire actual security professionals and cryptographers (unlike the charlatans in this thread) to implement it correctly.
Having third party clients would not definitively mean the client is bad, but it most likely would break the security model. Just take a look at Matrix’s clients.
When you use a client, you are relying on the client’s crypto implementation to be correct.
Nothing prevents another client from using the same implementation as the original app. When the alt client is just a fork, it’s even easier to check whether they kept it intact or not.
This is only one part of it and there’s a lot more to it when it comes to hardening the program.
Something at which even the original Signal fails. It has received criticism multiple times (1, 2) for it not being verifiable whether it’s been tampered with by the app’s distributor, and also for having included proprietary Google services dependencies which dynamically load further code on the phone, which is also a security issue. Worthy forks solve both of these.
Signal focuses on their desktop and mobile clients and they hire actual security professionals and cryptographers (unlike the charlatans in this thread) to implement it correctly.
Last I heard (a month or so ago) the desktop client had serious unfixed issues.
I think it further erodes your point that Signal is not hostile: it’s not just that they don’t want third party clients - Moxie, for instance, has been very, very vocal about this.
Something at which even the original Signal fails. It has received criticism multiple times (1, 2) for it not being verifiable whether it’s been tampered with by the app’s distributor, and also for having included proprietary Google services dependencies which dynamically load further code on the phone, which is also a security issue. Worthy forks solve both of these.
That’s unfortunate. I do hope that these forks don’t go and start making extensive changes though, because that’s where it becomes a problem.
Appreciate the link. I still believe in Matrix, even if the client ecosystem isn’t there yet. There HAS to be something to replace Discord; the enshittification has already begun.
I wouldn’t call it a Discord alternative. It is closer to fancy IRC/live forums.
Then again I don’t really use Discord
Excellent point! If I’m sending someone information that could get me killed if it were intercepted by the state, I’d sure as hell want some guarantees about how the other side is handling my data. Disallowing third party clients gives me at least one such guarantee.
You have absolutely zero guarantees, with or without their policy on third party apps. You can not send sensitive information to someone else’s phone and tell yourself it couldn’t possibly have been intercepted, or that someone couldn’t get ahold of that phone, or that the person you’re sending it to won’t take a screenshot and save it to their cloud.
A lot of software nowadays is doing a real disservice to their users by continuing to lie to them like this by selling them the notion that they can control their information after it has been sent. It’s really making people forget basic information hygiene. No app can guarantee that message won’t be intercepted or mishandled. They can only give you tools to hopefully prevent that, but there are no guarantees.
Moreover, this policy does not exclude them from including third-party functionality and warning the user when they are communicating with somebody that isn’t using encryption.
Too many of these apps and services are getting away with the “security” excuse for what is effectively just creating a walled garden to lock users in. Ask yourself how you can get your own data out of these services when you decide to quit them, and it becomes more apparent what they’re doing.
A lot of software nowadays is doing a real disservice to their users by continuing to lie to them like this by selling them the notion that they can control their information after it has been sent. It’s really making people forget basic information hygiene. No app can guarantee that message won’t be intercepted or mishandled. They can only give you tools to hopefully prevent that, but there are no guarantees.
Oh, yes. These “deleted messages”, or these “hidden likes”, or whatever else.
I mean, there are fundamental things and algorithms that allow creating such a system, with blinded keys, ghost keys and whatnot; only these disgusting cheats have a centralized service where any employee can see everything, yet pretend that they have “a security feature”.
Of course, I fully agree! My point was just that you can eliminate the risk of poorly implemented cryptography at the endpoints. Obviously there’s a thousand and one other ways things could go wrong. But we do the best we can with security.
Anyway apparently third party clients are allowed after all? So it’s a moot point.
Signal doesn’t disallow third party clients, you should always understand the risk when messaging anyone on any platform. See my post here: https://lemmy.ml/post/19672991/13312234
You have no control on the receiving end. Zero.
You do if third party clients aren’t possible? You have control over what client the receiving end is using.
But apparently third party clients are possible, so it’s moot.
No, if your system can’t support 3rd party clients properly, it is inherently insecure, especially in an e2ee context where you supposedly don’t have to trust the server/vendor. If a system claims to be e2ee, but tightly controls both clients and servers (for example WhatsApp), that means they can rug-pull that e2ee at any point in time and even selectively target people with custom updates to break that e2ee for them only. The only way to realistically protect yourself from that is using a 3rd party client (and yes, I know, in case of Signal also theoretically reviewing every code change and using reproducible builds, but that’s not very realistic).
Now admittedly, Signal has started to be less hostile to 3rd party clients like Molly, so it’s not as bad anymore as it used to be.
Signal third party clients are based off the Signal code base. They just add patches and remove certain dependencies. Also, they are often more secure. Your logic is from the Apple PR department.
Again, having third party clients would not definitively mean the client is bad. Obviously, if it’s a simple fork with hopefully small patches that are just UI changes, it’s probably not going to harm the security model.
I should have phrased this better in my original post. When I was thinking about third party clients, Matrix and XMPP immediately came to my mind. Not very simple forks. So I’ll phrase this better: “Having non-trivial third party clients is not good for security.” What non-trivial means is left to interpretation though, I suppose.
Why do you think so? I see it as a strength in diversity and a great driving force for a proper server api
Do you hate Signal or do you hate the west? There are legitimate reasons to not like Signal, but calling them hostile toward third party clients is untrue. Last time I checked, Signal wasn’t proprietary.
They have a demonstrated history of asking third party clients not to use the Signal name and not to use the Signal network. The clients that currently exist and do this do it against the wishes of the Signal Foundation.
They have a demonstrated history of asking third party clients not to use the Signal name and not to use the Signal network.
The lead developer, nearly 10 years ago now, specifically asked LibreSignal to stop. A single event does not make a demonstrated history.
- Molly ~5 years in development
- gurk-rs ~4 years in development
- signal-cli ~9 years in development
- Flare ~2 years in development
- Beeper
The clients that currently exist and do this do it against the wishes of the Signal Foundation.
If you have evidence to back this claim, I would like to see it so I can stop spreading misinformation.
In the LibreSignal issue that you linked to, they made it clear they don’t want third-party clients talking to Signal servers.
You’re free to use our source code for whatever you would like under the terms of the license, but you’re not entitled to use our name or the service that we run.
If you think running servers is difficult and expensive (you’re right), ask yourself why you feel entitled for us to run them for your product.
He was specifically talking to that developer. The “You” and “You’re” in that quote was specifically targeted at the LibreSignal developer.
I recall the gurk-rs developer specifically mentioned that his client reports to Signal’s servers as a non-official app. The Signal admins can see the client name and version - just like websites can tell what browser you’re using - and could easily block third party clients if they wanted to, but they don’t.
If Signal wanted to block third party clients, they would have blocked them already.
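For what it’s worth, the mechanism described above is nothing exotic: a client simply self-reports a name and version with each request, much like a browser sends a User-Agent header. A toy illustration follows - the endpoint and the version string are made up, not Signal’s real API - just to show that such an identifier is both trivially visible to the server and trivially spoofable by a client.

```python
# Toy illustration of a client self-identifying to a server.
# The endpoint and the version string are hypothetical, not Signal's real API.
import requests

response = requests.get(
    "https://chat.example.org/v1/whoami",
    headers={"User-Agent": "gurk-rs/0.x (unofficial client)"},
)
# The server can log this string, distinguish official from unofficial
# clients, and block the latter if it chooses to -- the point above being
# that Signal's servers evidently choose not to.
print(response.status_code)
```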
Moxie made it incredibly clear: he does not want third parties talking to the Signal servers.
LibreSignal took him at his word and shut themselves down.
The other developers, like Molly, take a stronger road.
Is Signal currently banning third party clients? No. But they’ve made it clear they don’t like them. They didn’t actually ban LibreSignal, they just asked them to stop. Could they ban the clients in the future? Yes.
I’ll reiterate my statement as you didn’t address it.
If Signal wanted to block third party clients, they would have blocked them already.
I haven’t seen evidence to back up your claims
If you have a backdoored client, then you would naturally object to third party clients :)
This is the same Meredith Whittaker doing interviews with US defense-department aligned sites like LawFare.
Why are all these big tech sites like wired so interested in pushing signal anyway?
I find it intriguing that people will scrutinize messaging platforms such as Telegram and explain in detail how one should not entrust their messages’ encryption keys to these services, yet these same people seem unable to comprehend the concerns regarding Signal’s servers having access to the phone numbers of its users. The fact that these people are able to perceive potential vulnerabilities in one platform while remaining oblivious to similar concerns on another highlights that their arguments are more ideological than rational.
For sure. I’m convinced Signal is supported mainly for the same reasons Apple products are: it’s got a shiny user interface and it’s simple to use. That lets them overlook all the privacy dangers behind the curtain.
A gigantic US-based service built on phone-number (meaning real-identity) identifiers.
Exactly, it takes a lot of credulity to believe that the US government would just altruistically develop and fund a messaging platform that genuinely respects privacy. I recall somebody was talking about how collecting metadata is basically equivalent to having a private investigator follow you around, and I think that’s a great analogy. People tend to fixate on the content of the conversations, but the reality is that knowing who talks to whom is just as valuable.
Do you think they’re lying to authorities when they get a search warrant? https://signal.org/bigbrother/santa-clara-county/ That would be quite a big deal, and someone will be going to jail if you’re right.
All they have is your phone number, the date the account was created, and the last time it connected to the service. Yes, that represents a vulnerability, but you’re just casting aspersions that the whole thing is compromised.
Maybe there is some super secret NSA back door that Signal engineers aren’t even aware of. But it’s at least pretty clear that the local fascist authorities aren’t getting that info even with a warrant.
I think that the operations of US government are very opaque, and it’s perfectly possible that Signal has to work with authorities like the NSA, while they don’t have to cooperate with other authorities. However, even in case they currently don’t cooperate that can’t be used as a guarantee that this will continue to be the case going forward.
The key point here is that if data is leaked it has to be assumed that it is used maliciously, privacy assessments cannot be trust based. And the motivations of the government funding and promoting Signal do matter in the calculus.
Maybe the US government (or even “deep state” or something) has realized that making everyone use insecure devices for easier surveillance is as smart as forbidding fire exits so that people would be easier to arrest.
I haven’t heard too many bad things about Signal.
Various dictatorships simply want to read correspondence, because the social graphs that produce actual value and keep stability in our world - and that also protect their embezzled value stored abroad - are all abroad too, and they won’t hurt those. Some politicians in the west want to invade privacy for the same reason - what they embezzle is stored in ways unaffected by insecure communications in their own countries.
But if you are part of some establishment, even if not a well-meaning one, you are interested in protecting the system from outright erosion, which means secure communications.
Other than that, WhatsApp and FB Messenger are owned by Zuck and he’s become too big to tolerate, Telegram is an African brothel with no protection and plenty of diseases, and in general it’s all corporate around.
Let’s please also remember that there are people of various views and interests in every organization and force.
Isn’t Signal at least partially funded by the agency?
No, they found some billionaires to do it 😉
What part of non-profit and open-source do you not understand?
Review the source, build it yourself, be happy. It uses well-known asymmetric encryption algorithms. Not much your agency could really do here, even if they harvest all the traffic from the server.
Was my fucking question about the integrity of the algorithms they use, or was it about who’s been funding the product? Because a quick web search will show you that they did in fact fund it at one point.
And so what? You could be an oil dictatorship prince and donate a billion to Signal. It’s not going to compromise it in any way that is not directly auditable.
So, your fuckin question is misguided. You’re “only asking questions” while implying intent.
Signal is completely compromised through spell check on 99% of OEM smart devices. Spell check can see what you’re typing word by word, and Signal uses it. Feds are 100% using spell check to view your private messages. And by feds I mean every government on earth with a computer.
Spell check? If you mean smartphone keyboards, then yes, the non-foss ones are keyloggers. One of my side-projects is a privacy-oriented keyboard, but there are many out there that don’t require network calls to google or apple.
Nah dude the red squiggly lines are actually CIA backdoors
Is this some “Network Allowed” problem that I’m too “Network Not Allowed” to understand?
Are you using a custom rom? I don’t have this option on my oneplus 9 pro, but I have something else.
GrapheneOS! I’ve been using it for a few years. Never going back.
Removed by mod
The problem actually goes further - it’s that they push people to use Signal on mobile.
In the official desktop client there is no option to register (even though it would likely not be that hard to add a box accepting a verification code); they tell you to register in the mobile app instead. All while far from all phones can have privacy-respecting OSes installed on them at all.
Yes, there are ways around it (signal-cli or an Android VM - and even then you have to use Molly, since the official client requires you to scan a QR code rather than following a link). But arbitrarily directing people to a platform that is harder to make private is nonetheless weird.
The thing I hate about Signal is the UI. Everything looks way too big on my device. WhatsApp, for example, fits 2 more chats on screen, and the messages themselves are tidier.
This may seem like it’s not a big deal, but UI is absolutely crucial in order to get people to actually use the app. I moved a few people to Signal but they just hated the way it looks. “Seems like an app for old people, font too big.” I can see that. They moved back to Insta/WhatsApp.
I think some small and easy UI changes could make the app much better: just give us a “compact” mode.
Both WhatsApp and Signal show the same amount of chats to me (9 for both). WhatsApp does show a small sliver of a tenth chat, but it’s not really properly visible. There is a compact mode for the navigation bar in Signal, which helps a bit here.
From what I can see there’s slightly more whitespace between chats, and Signal uses the full height for the chat (eg same size as the picture), whereas WhatsApp uses whitespace above and below, pushing the name and message preview together.
In chats the sizes seem about the same to me, but Signal colouring messages might make it appear a bit more bloated perhaps? Not sure.
For me, I can see 7 chats on Signal, 9 chats on WhatsApp. There is a ton of wasted space on Signal for me. It just looks bad to my eyes.
Yeah, Signal is more than encrypted messaging - it’s a metadata harvesting platform. It collects phone numbers of its users, which can be used to identify people, making it a data collection tool that resides on a central server in the US. By cross-referencing these identities with data from other companies like Google or Meta, the government can create a comprehensive picture of people’s connections and affiliations.
This allows identifying people of interest and building detailed graphs of their relationships. Signal may seem like an innocuous messaging app on the surface, but it could easily play a crucial role in government data collection efforts.
Also worth noting that it was originally funded by CIA cutout Open Technology Fund, part of Radio Free Asia. Its Chairwoman is Katherine Maher, who worked for NDI/NED (regime-change groups) and is a member of the Atlantic Council, WEF, the US State Department Foreign Affairs Policy Board, etc.
Katherine Maher is a very busy beaver. She’s a major figure in the contemporary manufacturing of consent.
2020: Meet Wikipedia’s Ayn Rand-loving founder and Wikimedia Foundation’s regime-change operative CEO
2024: Web Summit CEO jumps ship to head up NPR after just 3 months
Ahh classic can’t question loyalty to the genocide state lol
Yeah, Signal is more than encrypted messaging - it’s a metadata harvesting platform. It collects phone numbers of its users, which can be used to identify people, making it a data collection tool that resides on a central server in the US. By cross-referencing these identities with data from other companies like Google or Meta, the government can create a comprehensive picture of people’s connections and affiliations.
This allows identifying people of interest and building detailed graphs of their relationships. Signal may seem like an innocuous messaging app on the surface, but it could easily play a crucial role in government data collection efforts.
Strictly speaking, the social graph harvesting portion would be under the Google umbrella, as, IIRC, Signal relies on Google Play Services for delivering messages to recipients. Signal’s sealed sender and “allow sealed sender from anyone” options go part way to addressing this problem, but last I checked, neither of those options are enabled by default.
However, sealed sender on its own isn’t helpful for preventing build-up of social graphs. Under normal circumstances, Google Play Services knows the IP address of the sending and receiving device, regardless of whether or not sealed sender is enabled. And we already know, thanks to Snowden, that the feds have been vacuuming up all of Google’s data for over a decade now. Under normal circumstances, Google/the feds/the NSA can make very educated guesses about who is talking to who.
In order to avoid a build-up of social graphs, you need both the sealed sender feature and an anonymity overlay network, to make the IP addresses gathered not be tied back to the endpoints. You can do this. There is the Orbot app for Android which you can install, and have it route Signal app traffic through the Tor network, meaning that Google Play Services will see a sealed sender envelope emanating from the Tor Network, and have no (easy) way of linking that envelope back to a particular sender device.
Under this regime, the most Google/the feds/the NSA can accumulate is that different users receive messages from unknown people at particular times (and if you’re willing to sacrifice low latency with something like the I2P network, then even the particular times go away). If Signal were to go all in on having client-side spam protection, then that too would add a layer of plausible deniability to recipients; any particular message received could well be spam. Hell, spam practically becomes a feature of the network at that point, muddying the social graph waters further.
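To make the metadata argument concrete, here is a rough sketch of what a relay could log per message under the three configurations described above. The field names are invented for illustration; this is not Signal’s actual wire format.

```python
# Rough sketch of per-message metadata visible to a relay under different
# configurations. Field names are invented for illustration; this is not
# Signal's actual wire format.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ObservedEnvelope:
    recipient_id: str            # always needed to route the message
    sender_id: Optional[str]     # None when sealed sender is in use
    sender_ip: str               # network-level source as seen by the relay
    timestamp: float

# 1. Default: sender identity and a real IP -> easy social-graph building.
default = ObservedEnvelope("user-b", "user-a", "203.0.113.7", 1_700_000_000.0)

# 2. Sealed sender only: no sender id, but the source IP still links the
#    message back to the sending device.
sealed_only = ObservedEnvelope("user-b", None, "203.0.113.7", 1_700_000_001.0)

# 3. Sealed sender + Tor/I2P: no sender id, and the IP is an exit/relay
#    node, so the envelope no longer identifies who sent it.
sealed_and_tor = ObservedEnvelope("user-b", None, "tor-exit.example", 1_700_000_002.0)
```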
That Signal has
- Not made sealed sender and “allow sealed sender from anyone” the default, and
- Not incorporated anonymizing overlay routing via tor (or some other network like I2P) into the app itself, and
- Is still in operation in the heart of the U.S. empire
tells me that the Feds/the NSA are content with the current status quo. They get to know the vast, vast majority of who is talking (privately) to who, in practically real time, along with copious details on the endpoint devices, should they deem tailored access operations/TAO a necessary addition to their surveillance to fully compromise the endpoints and get message info as well as metadata. And the handful of people that jump through the hoops of
- Enabling sealed sender
- Enabling “allow sealed sender from anyone”
- Routing app traffic over an anonymizing overlay network (and ideally having their recipients also do so)
can instead be marked for more intensive human intelligence operations as needed.
Finally, the requirement of a phone number makes the Feds’/the NSA’s job much easier for getting an initial “fix” on recipients that they catch via attempts to surveil the anonymizing overlay network (as we know the NSA tries to). If they get even one envelope, they know which phone company to go knocking on to get info on where that number is, who it belongs to, etc.
This too can be subverted by getting burner SIMs, but that is a difficult task. A task that could be obviated if Signal instead allowed anonymous sign-ups to its network.
That Signal has pushed back hard on every attempt to remove the need for a phone number tells me that they have already been told by the Feds/the NSA that that is a red line, and that, should they drop that requirement, Signal’s days of being a cushy non-profit for petite bourgeois San Francisco cypherpunks would quickly come to an end.
Incidentally, this explains why Signal insists that the app has to be installed through the Play store as opposed to f-droid.
Strictly speaking, you can download it directly from their website, but IIRC, the build will still default to trying to use Google Play Services, and only fall back to a different service if Google Play Services is not on the device. Signal really, really wants to give Google insight into who is messaging who.
Exactly, the vast majority of users will be going through Google’s store when installing it.
Anyone who has any experience with centralized databases would be able to tell you how useless sealed sender is. With message recipients and timestamps, it’d be trivial to discover who the senders are.
Also, Signal has always had a cozy relationship with the US (Radio Free Asia was its initial funder). After Yasha Levine posted an article critical of Signal a few years back, RFA even tried to do damage control at a privacy conference on Signal’s behalf:
Libby Liu, president of Radio Free Asia stated:
Our primary interest is to make sure the extended OTF network and the Internet Freedom community are not spooked by the [Yasha Levine’s] article (no pun intended). Fortunately all the major players in the community are together in Valencia this week - and report out from there indicates they remain comfortable with OTF/RFA.
These are high-up US government employees trying to further spread signal.
You can read more about this here.
A really excellent writeup!
Law enforcement doesn’t request data frequently enough in order to build a social graph. Also they probably don’t need to as Google and Apple likely have your contacts.
Saying that it is somehow a tool for mass surveillance is frankly wrong. It has its issues, but it also balances ease of use. It is the most successful secure messenger out there. (WhatsApp doesn’t count)
Sure it has problems. I personally don’t understand their refusal to be on F-Droid. However, phone numbers are great for ease of use and help prevent spam; you need to give your personal information to get a phone number. Signal also has very nice video calls, which no other messenger can seem to replicate.
Law enforcement doesn’t request data frequently enough in order to build a social graph. Also they probably don’t need to as Google and Apple likely have your contacts.
They don’t need to request data. They have first-class access to the data themselves. Snowden informed us of this over a decade ago.
Saying that it is somehow a tool for mass surveillance is frankly wrong.
Signal per se is not the mass surveillance tool. Its dependence on Google is the mass surveillance tool.
However, phone numbers are great for ease of use and help prevent spam.
And there’s nothing wrong with allowing that ease-of-use flow for users that don’t need anonymity. The problem is disallowing anonymous users.
Signal is not dependent on Google. Also to my knowledge Signal isn’t part of AT&T
If that were the case Molly FOSS wouldn’t exist
If that were the case Molly FOSS wouldn’t exist
I’m not speaking of hard dependence as in “the app can’t work without it.” I’m speaking to the default behavior of the Signal application:
- It connects to Google
- It does not make efforts to anonymize traffic
- It does make efforts to prevent anonymous sign-ups
Molly FOSS choosing different defaults doesn’t change the fact that the “Signal” client app, which accounts for the vast majority of clients within the network, is dependent on Google.
And in either case – using Google’s Firebase system, or using Signal’s websocket system – the metadata under discussion is still not protected; the NSA doesn’t care if they’re wired into Google’s data centers or Signal’s. They’ll be snooping the connections either way. And in either case, the requirement of a phone number is still present.
Perhaps I should restate my claim:
Signal per se is not the mass surveillance tool. Its ~~dependence on Google~~ design choices of (1) not forcing an anonymization overlay, and (2) forcing the use of a phone number, are the mass surveillance tool.
It collects phone numbers of its users, which can be used to identify people, making it a data collection tool that resides on a central server in the US. By cross-referencing these identities with data from other companies like Google or Meta, the government can create a comprehensive picture of people’s connections and affiliations.
That’s fucked up. I always found it bad to have the phone number as a requirement, but that makes a lot of sense.
Indeed, the fact that the phone number is a requirement is a huge red flag for any platform that claims to care about privacy.
Phone numbers are no longer required iirc
Phone numbers are no longer required iirc
Phone numbers are still required to register and maintain an account. Only difference now is you can choose to hide it from other users and give people a ‘username’ to look you up with instead.
Yog is gettin downvoted by dotworld feds but as usual is undefeated in the comments.
😄
deleted by creator
If Tor leaks data about you then yes you should also be concerned about that.
That has nothing to do with the team behind it. Also it is the best tool right now even if it isn’t perfect. You just need to be aware of its limitations. (For the love of god turn off JavaScript)
I hate to break it to you but the internet itself was created by the US.
The team behind it very much does matter because you can infer the motivations from knowing who develops a particular piece of technology. However, my point was that the question with both Signal and Tor is what data they leak based on their technical design. That’s what people should be concerned with first and foremost.
Meanwhile, the internet was created by CERN https://home.cern/science/computing/where-web-was-born
deleted by creator
Wait until you hear about DARPA.
Cross referenced you on the sister thread.
People there are positing that this is not correct. Granted, their info appears to be what Signal “disclosed” to the feds as part of a court proceeding about what it collects, which is apparently only when you connect to the server.
Doesn’t answer the question of whether they could collect your call logs though.
My reply from the other thread. People who claim this isn’t true aren’t being honest. The phone number is the key metadata. Meanwhile, nobody outside the people who are actually operating the server knows what it’s doing and what data it retains. A faith-based approach to privacy is fundamentally wrong. Any data that the protocol leaks has to be assumed to be available to adversaries.
Furthermore, companies can’t disclose if they are sharing data under warrant. This is why the whole concept of a warrant canary exists. Last I checked, Signal does not have one.
When you install Signal, it asks for access to your contacts, and says very proudly, “we don’t upload your contacts, it all stays on your phone.”
And then it spams all of your contacts who have Signal installed, without asking you first.
And it shares your phone number with everyone in your contacts who has Signal installed.
And then when you scream ARE YOU FUCKING KIDDING ME and delete your account and purge the app, guess what? All those people running Signal still have your phone number displayed for them right there in plain text. Deleting your account does not delete the information that the app shared without your permission.
So yeah. Real nice “privacy” app you’ve got there.
Update, 2018: Subsequently.
Wow didn’t even know about that, what a shit show. It’s so weird how Signal has become a sacred cow in the west now, and you can’t have a rational discussion about its many problems without a whole bunch of trolls piling on saying you should just put faith in Signal unconditionally.
It is a decent app, it does what it says. Daddy can’t read your shit until quantum breaks encryption.
The real question is whether it is a honeypot to make edgelords feelz good. Strong allegation, no doubt, but we are also in the grey zone, it seems. Based on that, you have to assume they are farming the info, at least for the security apparatus.
That’s my view as well, the only way to know that data isn’t being used for adversarial purposes is not to share it in the first place. I think it’s fine to use Signal as long as it’s an informed choice. The primary issue I have is that people don’t seem to want to accept that Signal collects phone numbers and that this could be used in a nefarious way. It seems to be an ideological stance as opposed to a rational one.
The app (locally, on your device) checks if someone from your contact list has installed Signal (become available on it), and if they have, you get notified by the app.
And it shares your phone number with everyone in your contacts who has Signal installed.
Someone can get notified only if they already have you in their contact list (so they already have your phone number), and have Signal installed.
I still wish you could choose if you want others to be notified tho…
A phone number isn’t just any metadata; it is the anchoring data around which the rest of the metadata is collected, and it is also connected to a govt/corporate-verified real identity.
Why would anyone even claim to offer privacy around such an anchor?
Exactly, especially when we’re talking about the US government that has access to all the data from other large US based media companies like Google and Meta. We know this for a fact thanks to Snowden leaks. Once you have a phone number, you know the identity of the person, and you can trivially cross reference all the other data to see if that person is of interest. And thanks to their Signal connection graph, the government can easily tell what other people they communicate privately with.
And thanks to their Signal connection graph, the government can easily tell what other people they communicate privately with.
So what? I’m sure your neighbor couple talk privately to each other most of the time and you know that happens. The important part is that the conversation is private.
Signal is not an anonymous messenger app. It never claimed to be. It’s for you to have a private conversation where your device holds the encryption keys.
Not like WhatsApp, where Meta has access to the keys of all conversations. Also, 95% of the world’s population is on WhatsApp, so why don’t you go and complain to them about the lack of privacy and security?
If you want an “anonymous” chat client, they are out there to use. Good luck getting more people on board other than your savvy friend.
If you understand that this information is being leaked, and that’s not part of your threat profile that’s perfectly fine. The problem is that a lot of people don’t seem to understand the implications of Signal harvesting phone numbers, and therefore make bad assumptions regarding the safety of using Signal. It’s pretty clear that a lot of people aren’t conscious about this in this very thread in fact.
Yes, most people seem oblivious to what mass bulk data collection can do.
And nobody has yet answered whether there is anything to stop Signal from collecting metadata logs of its users and their groups.
It does not seem people understand this risk.
Either way, nobody has produced a reasonable position on this, so the presumption is that Signal can farm this data and sell/give it out, since the best we’ve got is Signal’s responses to US courts, which would also be subject to the same conditions if national-security-type people got involved.
Signal’s use case is “authentic communication” - like when a govt person interacts with another govt person and doesn’t want a second govt to snoop on the actual contents of the communication, but accepts that the metadata is public.
It is WhatsApp for such people, without being WhatsApp.
But then why would you use WhatsApp either?
This is really interesting. It brings two questions to mind.
- Don’t all messaging apps use phone number as a primary metadata value?
- Are you suggesting that Signal could either not use this metadata or not collect it and yet they choose to collect it and can therefore lose it to exfiltration or warrant?
- Nope, for example Wire is based on Signal protocol and doesn’t harvest phone numbers https://wire.com/en
- I’m suggesting that if metadata is being leaked then it has to be assumed that it will be used nefariously at some point
The exact same argument that applies to wanting e2e encrypted messages that aren’t seen by the server also applies to any metadata associated with these messages.
- I use the Molly-FOSS fork, do you know if that removes the metadata collection? I know it doesn’t use any Google Play Services and it comes with its own notification bubble though.
It doesn’t because you’re still talking to the same server.
I see. Thanks.
Signal does not collect metadata.
That amounts to “trust me bro”, since nobody actually knows what the server does with the data.
You don’t have to trust the server and shouldn’t have to trust the server if the client is doing proper E2E because you know the maximum amount of metadata it’s got.
Your phone number is the metadata that’s not encrypted, that’s literally the whole problem here. Signal server is able to harvest graphs of phone numbers that interact with one another.
With ‘sealed sender’ your phone number, or any other identifying information, is not included in the metadata on the envelope, only the recipient’s id is visible, and it’s up to the recipient’s client to validate the sender information that is inside the encrypted envelope. It looks like a step in the right direction, though I don’t use signal enough to have looked into auditing it myself.
Again, this is a trust based system because you don’t know what the server is actually doing. The fact is that the server does collect enough information to trivially make the connection between phone numbers and the connections on the network. If trust me bro from Moxie is good enough for you, that’s of course your prerogative.
I’m talking about the information the server has. The encrypted envelope has nothing to do with that. You register with the server using your phone number; that’s a unique identifier for your account. When you send messages to other people via the server, it knows which accounts you’re talking to and what their phone numbers are.
first comment to provide a decent counterpoint.
Looks like Signal and email use both, but it still does not answer the question.
AI said:
The server knows who initiated the communication (they handed over their lockbox first), but not the direction of individual messages within a conversation.
The identifier is unavoidable for push notifications to work. It needs to know which phone to send it to, after all. Even if it doesn’t use Google’s services, it would still need a way to know which device has new messages when it checks in. If it’s not a phone number, it’s going to be some other kind of ID. Messages need a recipient.
Also, Signal’s goal is protecting conversations for the normies, not being bulletproof enough to run the next Silk Road at the cost of usability. Signal wants to upgrade people’s SMS messaging and make encryption the norm, and you have to make some sacrifices for that. Phone numbers were a deliberate decision so that people can just install Signal and start using E2E texting immediately.
If you want something really private you should be using Tor or I2P based solutions, because those are the only systems that can reasonably hide both source and destination completely. Signal has your phone number and IP address after all. They could track your every movement.
Most people don’t need protection against who they talk to, they want privacy of their conversations and their content. Solutions with perfect anonymity between users are hard to understand and use for the average person who’s the target audience of Signal.
The identifier absolutely does not need to be your phone number, and plenty of other apps are able to do push notifications without harvesting personal information from the users.
Meanwhile, normies don’t need Signal in the first place since e2ee primarily protects you from things like government agencies snooping on your data.
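For what it’s worth, here’s a minimal sketch of what routing on an opaque token instead of a phone number could look like. Everything here is hypothetical (made-up names, an in-memory “server”); the point is only that delivery needs some identifier, but not necessarily a personal one:

```python
# Hypothetical sketch: message routing keyed on opaque device tokens, no phone numbers.
import secrets

devices = {}          # token -> mailbox (list of pending ciphertexts)

def register_device():
    token = secrets.token_urlsafe(32)   # random, reveals nothing about the user
    devices[token] = []
    return token

def enqueue(token, ciphertext):
    devices[token].append(ciphertext)   # "you have mail" is all the server learns

alice_token = register_device()
enqueue(alice_token, b"...encrypted blob...")
print(alice_token[:8], len(devices[alice_token]))  # e.g. 'kZP3x9aQ' 1
```

Real push systems (APNs, FCM, UnifiedPush) already work on similarly opaque device tokens.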
Just a side note, but both SimpleX Chat and Briar are free of unique identifying IDs.
SimpleX Chat uses hash tables. It still has a centralized server (which you can self-host), but you can use the built-in Tor functionality to hide your IP.
Briar is totally decentralized. All messages go directly over Tor, but it can also use WiFi and Bluetooth. It supports group content types such as forums and blogs. The downside is that both devices need to be online to exchange messages. You can also run a Briar Mailbox on an old phone to receive messages more reliably.
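Loosely, the “no single account identifier” idea (inspired by SimpleX’s per-contact queues, not its actual protocol) looks like this; all names here are made up:

```python
# Loose sketch of per-contact message queues with random addresses.
# Not SimpleX's real protocol, just the general shape of the idea.
import secrets

queues = {}   # queue_id -> list of encrypted messages

def new_queue():
    """Create a fresh, unlinkable queue for one contact relationship."""
    qid = secrets.token_hex(16)
    queues[qid] = []
    return qid

# Alice creates separate queues for Bob and Carol and shares each id out of band.
alice_bob = new_queue()
alice_carol = new_queue()

queues[alice_bob].append(b"...ciphertext from Bob...")
queues[alice_carol].append(b"...ciphertext from Carol...")

# The server sees two unrelated queues; nothing ties them to one "Alice" account.
print(len(queues), alice_bob != alice_carol)  # 2 True
```

The server in this sketch still sees IP addresses, which is why the comment above mentions routing over Tor.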
Signal has been forced by court to provide all the information they have for specific phone numbers [0][1]. The only data they can provide is the date/time a profile was created and the last date (not time) a client pinged their server. That’s it, because that’s all the data they collect.
Feel free to browse the evidence below; they worked with the ACLU to ensure they could publish the documents, as they were served a gag order not to talk about the request publicly [2].
[0] https://signal.org/bigbrother/
[2] https://www.aclu.org/sites/default/files/field_document/open_whisper_documents_0.pdf#page=8
Once again, even if this is the way things worked back in 2016 there is no guarantee they still work like that today. This is the whole problem with a trust based system: you are trusting the people operating the server. It’s absolutely shocking to me that people have such a hard time accepting this basic fact.
Once again, even if this is the way things worked back in 2016 there is no guarantee they still work like that today.
You have to trust someone. You’re not building all your software and reading every line yourself are you?
While there are no guarantees, Signal continues to produce evidence that they don’t collect data. Latest publication, August 8th, 2024: https://signal.org/bigbrother/santa-clara-county/
The code is open and has had a few audits: https://community.signalusers.org/t/overview-of-third-party-security-audits/13243
This is the whole problem with a trust based system
Can you point me to a working trustless system? I’m not sure one exists. You might say peer-to-peer systems are trustless because there’s no third party, but did you compile the code yourself? Did you read every last line of code before you compiled and understood exactly what it was doing?
It’s absolutely shocking to me that people have such a hard time accepting this basic fact.
What’s shocking to me is the lack of understanding that unless you’re developing the entire platform yourself, you have to trust someone at some point. Signal continues to post subpoenas to prove they collect no data, has an open source client/server, provides reproducible builds, and continues to be the gold standard recommended by cryptographers.
I would recommend that anyone reading this rely on the experts and the people who are being open and honest, rather than those who try to push you toward less secure platforms.
You have to trust someone. You’re not building all your software and reading every line yourself are you?
No, you don’t have to trust anyone. That’s literally the point of having secure protocols that don’t leak your personal data. 🤦
Signal made an intentional choice to harvest people’s phone numbers. The rationale for doing that is very thin, and plenty of other messengers avoid doing it. The fact that Signal insists on doing it is a huge red flag all on its own.
The code is open and has had a few audits
Only the people who are actually operating the server know what’s running on it. The fact that Signal aggressively prevents the use of third party clients and refuses to implement federation that would allow other servers to be run is, again, very suspect.
Can you point me to a working trustless system?
SimpleX, Matrix, Briar, and plenty of other chat systems do not collect personal data.
You might say peer-to-peer systems are trustless because there’s no third party, but did you compile the code yourself? did you read every last line of code before you compiled and understood exactly what it was doing?
The discussion in this thread is specifically about Signal harvesting phone numbers. Something Signal has no technical reason to do.
What’s shocking to me is the lack of understanding that unless you’re developing the entire platform yourself, you have to trust someone at some point. Signal continues to post subpoenas to prove they collect no data, has an open source client/server, provides reproducible builds, and continues to be the gold standard recommended by cryptographers.
Kind of ironic that you’ve exposed yourself as being utterly clueless on the subject while accusing me of lack of understanding.
I would recommend that anyone reading this rely on the experts and the people who are being open and honest, rather than those who try to push you toward less secure platforms.
I would recommend anyone reading this to rely on rational thinking and ignore trolls who tell you to just trust Signal. Privacy and security are not based on trust, and if you ask any actual expert in the field they will tell you that.
True but I find the opposite end of the spectrum hard to believe. Extraordinary claims require extraordinary proof.
What is known is that government agents from countries like Iran, China and Russia are actively spreading misinformation. Not to say that you are a government agent, but you should doubt the arguments on both sides. For instance, using Signal is way better than not using an audited encrypted messenger. Oftentimes I see people jump to worse platforms. I think it is important to understand the problems with Signal.
It’s well known that the US and other western countries actively spread misinformation. It’s also known thanks to Snowden that the US regime harvests personal data aggressively. Anybody who puts blind faith into a US based security company is frankly an imbecile.
True, however your claim lacks evidence. They have your phone number and a few timestamps. That isn’t going to help much.
My claim is that privacy should not be based on trust. This appears to be a very difficult concept for people in this thread to understand.
You always will have to trust something at some level.
Yeah, you trust that the encryption algorithm is designed correctly and that it doesn’t leak data because many people have audited it and nobody found a flaw in it. You absolutely will not have to trust people operating servers however. If you can figure out why e2ee is important then I’m sure you’ll be able to extrapolate from that why metadata shouldn’t be seen by the server either.
I’m not very tech-savvy, and that article looks very nice, but it’s kind of old, and it’s true that they haven’t been as transparent (and frequently audited) as other services, and they still require a phone number to set up an account, even if you can switch to only using a username later. Also, they removed the encrypted database, and Molly brings that back, which is the main reason I use it. Another thing I don’t like about Signal is how ferociously they’ve tried to shut down forks in the past, and how they don’t tell you that you need Google Play Services for it to work properly. Sadly it’s the only “privacy-conscious” service I’ve managed to make most of my family and friends use, after trying for years.
They only shut down forks that violate Signal branding. Mozilla does the same thing with Firefox.
It is libre, so if you fork it there is nothing they can do. Also, if they were really hostile they would have used a non-libre license or made it entirely proprietary.
They have your phone number and timestamps. Nothing more, nothing less. Also, chances are that isn’t being used to create a massive social graph or whatever the Lemmy.ml users are going on about.
For most people it doesn’t matter. Signal has the benefit of being widely adopted and being easy to use. Simplex Chat is another alternative although it isn’t as well funded or as well known.
There is no metadata harvesting on Signal, and the use of a phone number is so convenient that it helped massively with adoption by the general, unaware public.
I loved that it acted as a private and secure drop-in replacement for SMS (particularly before they removed that integration) that does what I need, does it very well, and easily connects me with people who already have my number. This made sharing Signal very easy. The only data Signal can even provide to the authorities is your registration date, phone number, and time of last connection. The absolute minimum. It’s fantastic. If you compare this to WhatsApp, which has everything but the exact content of your messages, it’s not even a contest.
For myself on Signal, and everyone else I’ve known who uses WhatsApp or Insta or whatever, the extra absolute anonymity of also removing phone numbers from the already small equation just isn’t needed or worth it; otherwise you wouldn’t be using Signal, let alone fucking Facebook.
You can believe whatever you want of course, but the reality is that Signal collects phone numbers on registration and these can be used in many ways. The fact that you chose to trust Signal to be a good actor is your prerogative, but it’s based purely on your faith which is not how privacy or security works.
deleted by creator
I don’t think you’re aware of how independent audits, open source, good cryptography, a non-profit, government data subpoenas, and a lack of data collection work.
I think that you may be the one who doesn’t understand how any of this works. Security and privacy are guaranteed by design, and any information that is collected has to be assumed to be available to bad actors. Period. The same logic that says you shouldn’t trust the server to do the encryption applies to letting the server handle metadata. No amount of audits can guarantee that the people operating the server are doing it in good faith.
Meanwhile, the concern isn’t just about somebody having your phone number; it’s about the Signal server having the ability to map out relationships between these numbers. It’s perfectly fine for people to reason that this is not something they’re worried about and make an informed choice to use Signal. However, it’s incredibly disingenuous to pretend this problem doesn’t exist.
Edit: never mind, I typed a lot but that Lemmygrad user made a far better post that I agree with.
This message is definitely giving all the vibes of a disinformation/misinformation attempt. There is no metadata to harvest from Signal.
Here is an example of the full extent of the data that Signal has on any given user: https://signal.org/bigbrother/cd-california-grand-jury/
It involves phone number, account creation time and last connected time. That’s it. Nothing more.
The cross-referencing of data is just nonsense. Google and Meta already have your phone number. Adding Signal info to it adds absolutely zero information for them. They already have it all. They know nothing of who you talk with or which groups you are part of.
The funding of Signal did involve public grants, but that’s not anything bad. Many projects and nonprofits receive public money. It does not imply that there are backdoors or anything like that. And Signal was purposefully designed so that no matter who owns and operates it, the messages stay hidden, independently of the server infrastructure. They did their best to remove themselves from the chain of trust. Expert cryptographers and auditors trust Signal. Don’t listen to this random ramble of an online stranger whose intentions are just to confuse you and make you doubt.
It’s fascinating that these kinds of trolls come out of the woodwork any time obvious problems with Signal are brought up.
Phone numbers very obviously are metadata. If you think that cross-referencing data is nonsense then you have absolutely no clue what you’re talking about. It’s not about Google or Meta having your phone number; it’s about having a graph of people doing encrypted communication with each other over Signal. The graph of contacts is what’s valuable.
Don’t listen to this random ramble of an online stranger whose intentions are just to confuse you and make you doubt.
What you absolutely shouldn’t listen to are trolls who tell you to just trust that Signal is not abusing the data it’s collecting about you. The first rule of security is that it can’t be faith based.
What are you talking about? You get a phone number from Signal, and what will you be able to derive from it? There is no graph. Signal does not hold any “relationships” information.
The phone number is a unique identifier for your account. When you send a message to another user on Signal, that message goes to the server, and then gets routed to the other party. The server therefore has to know which parties talk to each other. Let me know if you have trouble understanding this and need it explained in simpler terms.
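To make that concrete, here is what a naive routing server could trivially log from the metadata it handles. This is a toy sketch with made-up numbers, not Signal’s actual server code; it only illustrates why routing information is enough to build a contact graph:

```python
# Toy sketch only (not Signal's actual server code): what a message router
# *could* log if it sees sender and recipient identifiers on every message.
from collections import defaultdict

contact_graph = defaultdict(set)

def deliver(recipient_number, ciphertext):
    pass  # placeholder: push the encrypted blob to the recipient's device

def route(sender_number, recipient_number, ciphertext):
    # The ciphertext stays opaque, but the routing pair is plainly visible here.
    contact_graph[sender_number].add(recipient_number)
    deliver(recipient_number, ciphertext)

route("+15551230001", "+15551230002", b"...encrypted...")
route("+15551230001", "+15551230003", b"...encrypted...")
print(dict(contact_graph))
# e.g. {'+15551230001': {'+15551230002', '+15551230003'}}
```

Whether the real server does anything like this is exactly the trust question being argued in this thread; the point is only that routing metadata is sufficient to build such a graph.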
You’re right, that’s how it works in almost all messaging apps. But Signal implemented sealed sender specifically to counter this.
You can read more about it here: https://signal.org/blog/sealed-sender/
I encourage you to read the first paragraph, which is important in the context of our conversation.
I’m talking about the information the server has. The encrypted envelope has nothing to do with that. You register with the server using your phone number; that’s a unique identifier for your account. When you send messages to other people via the server, it knows which accounts you’re talking to and what their phone numbers are. The first paragraph amounts to nothing more than “trust me bro”, because the only people who know what the Signal server actually does are the people operating it.
You are routing your traffic over the public internet. Nothing is secure at all. That’s why we implement strong cryptography.
Seriously, what are you talking about? The vast majority of people don’t want anonymity. Obviously Signal isn’t cut out for that! The fact is, most people don’t care about anonymity.
And what metadata can you harvest, exactly, from a UNIX timestamp and a phone number? Signal can tell who is communicating with whom, but they cannot read your messages.
Anyone who has worked with centralized databases can tell you how useless that is. With message recipients and timestamps, it’s trivial to find the real sender.
Give me your phone number. I’ll quickly be able to find out where you live.
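On the recipients-plus-timestamps point above: even if the sender field is hidden, delivery times can be correlated with per-account connection logs. A hypothetical toy sketch with made-up data (not any real system’s data model), just to show why timestamps are metadata:

```python
# Hypothetical illustration: correlating "sender-hidden" deliveries with connection logs.
from collections import Counter

# (timestamp, recipient) pairs the server saw for deliveries with no sender field
deliveries = [(100.0, "bob"), (161.5, "bob"), (300.2, "carol"), (475.9, "bob")]

# per-account connection/send times the server also observes (e.g. from TCP sessions)
connections = {
    "alice": [99.9, 161.4, 475.8],
    "dave":  [210.0, 300.1],
}

def likely_senders(delivery_time, window=0.5):
    """Return accounts that connected within `window` seconds of a delivery."""
    return [acct for acct, times in connections.items()
            if any(abs(t - delivery_time) < window for t in times)]

votes = Counter()
for ts, recipient in deliveries:
    for acct in likely_senders(ts):
        votes[(acct, recipient)] += 1

print(votes.most_common(3))
# [(('alice', 'bob'), 3), (('dave', 'carol'), 1)]
```

With enough messages, the co-occurrence counts single out the likely sender even though no sender field was ever logged.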
Signal’s hostility to 3rd party clients is a huge red flag.
Can you further explain? A red flag to open-source, federation and such, can’t disagree. But to privacy and security? I’m not convinced.
Third party clients are the best way to verify that the protocol works as advertised.
If you backdoored your client, then you would naturally oppose anyone else who develops a client.
It’s the tankies.
Honestly if they can recommend something better I’m all for it but I haven’t heard anything.
Take a look here for some alternatives:
https://dessalines.github.io/essays/why_not_signal.html#good-alternatives
- Matrix
- XMPP
- Briar
- SimpleX
Also, just because there are no alternatives doesn’t mean your default position should be that we just have to trust whatever exists now because it’s good enough, or that we can’t criticize it ruthlessly and distrust it. Call it out, and as a result of that perhaps build the desire for something better, a fix as it were.
The evidence and history clearly point towards Signal being very suspicious and likely in bed with the feds. This is not conspiracy thinking. Conspiracy thinking is believing that the country/empire that, in the late 40s/early 50s, gave away old German Enigma machines whose code it had cracked to developing countries without telling them it was cracked, that went on to establish a crypto company just to subvert its encryption, and that has done everything Snowden revealed, has in fact changed suddenly for the first time in half a century, for no particular reason and not to its own benefit. That’s fanciful thinking. That’s a leap of logic away from the proven trends, the pattern of behavior, and indeed the incentives to continue using their dominant position to maintain dominance and power. They didn’t back down on the Clipper chip because they just gave up and decided to let people have privacy and rights. They gave up on it because they found better ways of achieving the same results with plausible deniability.
Also, why is everything “tankies” with you people? Privacy advocates point out the obvious and suddenly it’s a communist conspiracy. LOL
- Matrix and XMPP are not alternatives and are worse for privacy and security
- SimpleX Chat is actually pretty solid but isn’t the most user-friendly
- Briar is very cool, but its complexity makes it hard to use. It also has problems with real-time communications
Matrix and XMPP are not alternatives and are worse for privacy and security
XMPP is exactly as good or bad for privacy as the servers and clients you choose. It’s a protocol, not a service. Unlike Signal, which is a brand/app/service package.
The protocol is worse for privacy
Is that better?
The protocol is worse for privacy
‘Trust me bro’
The problem is, you’re comparing apples with orchards. An analogous claim would be: ‘email is worse for privacy than Yahoo Mail’. Plus, in this scenario Yahoo Mail only lets you send emails to Yahoo Mail addresses.
0% chance that the feds don’t have Signal backdoors, otherwise Wired wouldn’t be promoting it. FYI everyone, Proton is CIA. It’s the modern Crypto AG.
https://community.signalusers.org/t/overview-of-third-party-security-audits/13243
https://freedom.press/newsletter/crossfire-over-messaging-security/
https://freedom.press/training/locking-down-signal/
You don’t have to take Signal’s word for it, because it’s been audited. The EFF, who are VERY privacy-minded and do extensive research into this type of thing, recommend Signal because it’s known to be secure.
Does the EFF have access to Signal’s servers, where they store all the phone numbers and messages of their users?
Well, I disagree about Signal. Proton however, I agree is extremely shady and should be avoided at all costs.
That’s pretty strong and I’ve never seen or heard anything like it before. If it’s true I’m betting the rest of Lemmy would like some details, too.
No support for Monero despite it being requested on UserVoice 6 years ago. A Bitcoin wallet (seriously?), which is easily traceable. Important email metadata is also not zero-access encrypted (e.g., subject headers, from/to headers), which leaks a substantial amount of information even if the body is encrypted. Not to mention they had clearnet redirects from their onion service a while back, something a lot of honeypots do.
Even if it’s not a honeypot, you’re sure as hell not getting any privacy with Proton. That’s for sure.
You can’t E2E-encrypt the To and From headers in an email; that’s a problem with the protocol, not with Proton. I’d assume the subject line falls into a similar bucket, because mail servers probably want to use it to filter spam.
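To illustrate: even with a PGP-encrypted body, the headers a relaying mail server needs stay readable. A minimal stdlib sketch with made-up addresses and a placeholder ciphertext:

```python
# Minimal illustration: an email with an encrypted body still has cleartext headers.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"        # needed by relaying servers
msg["To"] = "bob@example.net"            # needed for delivery
msg["Subject"] = "quarterly numbers"     # typically stored and transmitted in clear
msg.set_content(
    "-----BEGIN PGP MESSAGE-----\n...ciphertext...\n-----END PGP MESSAGE-----"
)

print(msg.as_string().splitlines()[:3])  # the headers are right there, unencrypted
```

PGP and S/MIME only protect the body (and attachments); the envelope and routing headers have to stay readable for SMTP to deliver the message at all.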
I never said anything about E2EE. Please re-read what I wrote carefully.
Centralized service with servers in the US, requires a phone number to create an account, and tech bros like it. “0% chance” 100% confirmed.
People are just gonna use it to cheat, smh.