privacy


elias_griffin, (edited ) in Signal leaked random contacts to me!

Huge if true! You could conceivably submit your phone to a cybersecurity company and share in any reward.

Help us with:

  • Your OS Version
  • OS settings that are possibly related
  • How you obtained Signal
  • Signal version
  • Video proof
  • Steps to reproduce

Who knows how to compute a hash for an installed mobile phone app? We need to compare it against a known-good copy.
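One way to sketch that comparison (paths and the "known" hash are illustrative stand-ins, not Signal's real files; for a Flatpak install, `flatpak info --show-commit org.signal.Signal` already prints the exact OSTree commit deployed, which identifies the content):

```shell
# Hypothetical sketch: compare a local file's SHA-256 against a published value.
printf 'example app binary\n' > /tmp/app.bin           # stand-in for the real binary
local_hash=$(sha256sum /tmp/app.bin | cut -d' ' -f1)
known_hash=$(sha256sum /tmp/app.bin | cut -d' ' -f1)   # would come from the vendor
if [ "$local_hash" = "$known_hash" ]; then echo MATCH; else echo MISMATCH; fi
```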

ErKaf,

imgur.com/a/a6CQSpA

The video proof. It also shows the OS and steps to reproduce.
How I obtained Signal: Flathub
Signal version: 6.38.0
OS settings: nothing relevant.

filcuk,

Wicked, thanks for sharing

mintycactus,
@mintycactus@lemmy.world avatar

deleted_by_author

  • ErKaf,

    Tell me any other, more official way to obtain Signal on Fedora. Signal only provides .deb files; Flathub is my only option.

    Pantherina,

    Flathub. openSUSE has a repo, but just use Flathub; the dependencies are a mess.

    ErKaf,

    Oh you mean literally the source I said in the comment above.

    Pantherina,

    Yup: either the official .deb through an Ubuntu/Debian container, or mess up your local system with the openSUSE repo, or just use the Flatpak, which just works.

    ErKaf,

    Yea so what I already do…

    elias_griffin, (edited )

    This is super helpful; I may post this to infosec.exchange. Flathub makes it much more difficult to find the cause of what looks like a real breach. I don’t use Flathub for security reasons, so I don’t know if you can even isolate the PID? Anyone know?

    I don’t want you to have to spend a lot of time or troubleshoot over the web but if you see anything that stands out as “wow shouldn’t be there/running” when you run these commands come back to us:

    1. ps to get the PID of Signal or, secondarily, of Flatpak
    2. lsof -p PID
    3. strace:
      • sudo strace -f -t -e trace=file -p PID
    4. sysctl kernel.randomize_va_space
    5. pkill/killall Flatpak/Signal, restart it, and see if it still presents the vulnerability
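A runnable sketch of that triage, using this shell's own PID as a stand-in for the Signal process (on a real system, substitute the output of something like `pgrep -f signal-desktop`; that process name is an assumption):

```shell
# Triage sketch: $$ stands in for the Signal PID.
PID=$$
ps -p "$PID" -o pid=,comm=                  # 1. confirm the process exists
ls /proc/"$PID"/fd | head -n 5              # 2. open fds (lsof -p "$PID" adds file names)
# 3. sudo strace -f -t -e trace=file -p "$PID"   # trace file syscalls (needs root)
cat /proc/sys/kernel/randomize_va_space     # 4. ASLR setting; 2 = full randomization
```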
    pkill, (edited )

    I advise you to stop using Signal Desktop immediately: they keep the database key in plaintext. This was exposed over 5 years ago and is still not fixed. Frankly, I find that pretty pathetic. Making this safer could be as simple as encrypting such files with something like age, and perhaps regenerating the keys frequently (yes, I know full-disk encryption is a viable mitigation against unwanted physical access). But instead, they’d rather pursue security by network effect, adding shiny UX features instead of fixing infrastructural stuff: improving trust through decentralization, not requiring phone numbers to join, or adding support for an app passphrase (which Molly offers, along with regular wiping of RAM data, making things like cold-boot or memory-corruption attacks harder).
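The "encrypt the key file at rest" pattern looks roughly like this. The comment suggests age; this sketch substitutes openssl (which, unlike age, is usually preinstalled), and the file path and contents are made up, not Signal's actual layout:

```shell
# Illustrative only: wrap a plaintext key file so it is encrypted at rest.
printf 'db-key-material' > /tmp/config.json             # stand-in plaintext key
openssl enc -aes-256-cbc -pbkdf2 -pass pass:demo \
    -in /tmp/config.json -out /tmp/config.json.enc      # encrypt at rest
rm /tmp/config.json                                     # drop the plaintext copy
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:demo \
    -in /tmp/config.json.enc                            # app would decrypt at startup
```

In a real app the passphrase would come from the user or the OS keyring rather than the command line.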

    ErKaf,

    There is nothing I hate more than typing on my phone. I can’t live without Signal Desktop.

    wincing_nucleus073,
    pkill,

    Maybe try setting up a Matrix bridge, if you feel confident you can secure it properly. On one hand it might increase the attack surface (use only servers and bridges with end-to-bridge encryption), but what’s an attack surface on software that is already so ridiculously compromised? You can also try an alternative client such as Flare. YMMV: the last time I used it, it was quite rough around the edges, but I’m happy to see it’s actively maintained, so it might be worth checking out.

    Also, no, Flatpak doesn’t fix this issue. Yes, it provides some isolation, which can be further improved with Flatseal and other defense-in-depth methods. But unless you are willing to face the trade-offs of using Qubes, you won’t compartmentalize your entire system. The key file in question is stored in ~/.local/share. I’m not denying vulnerabilities in userland applications, but thanks to its wide reach, often massive codebases, and use of unsafe languages like C, it’s the core system or networked software that is the most common attack vector. And that doesn’t ship, and never will ship, via Flatpak.

    The most obvious way this is exploitable is directory traversal, but not only that. Just look up “Electron $VULNERABILITY”, be it CSRF, XSS, or RCE. Sandbox escape is much easier with this crap than with any major browser, since contextIsolation is often intentionally disabled to access Node.js primitives instead of Electron’s safer replacements. Btw, Signal Desktop is also an Electron app.

    JoeBidet, in Signal leaked random contacts to me!
    @JoeBidet@lemmy.ml avatar

    I’m glad that simplex.chat doesn’t even need to touch sensitive personal data (strong selectors such as phone numbers or email addresses)!

    emptiestplace,

    Why is this being downvoted?

    shadearg,
    @shadearg@lemmy.world avatar

    I think some people get lost and don’t realize that this is a privacy-centric community.

    The mere potential for identifier leaking is 100% anti-privacy.

    pkill,

    Also: Signal’s centralization, the sussy shenanigans with MobileCoin, and not updating their server app repo for over a year (iirc they eventually stopped doing the latter, but it was still very detrimental to trust, especially since git history manipulation is ridiculously easy), plus the dependence on proprietary libraries and network services (for the libraries there are thankfully at least a couple of forks without such dependencies). And most of their servers that aren’t CDN are located in glowieland…

    shadearg,
    @shadearg@lemmy.world avatar

    The huge red flag to me is that Signal is no longer decried as the devil of western intelligence.

    Frank Figliuzzi (former FBI counterintelligence) and Chuck Rosenberg (former DEA administrator) used to rail on about all the dangers posed by Signal, but I haven’t heard an unkind word in a couple of years now.

    pkill,

    French authorities consider it a “terrorist app”. Louis Rossmann made a video about it. It came up in some court case, but at this point I don’t remember whether it was a local court or a higher one, and frankly I don’t care enough to check.

    emptiestplace,

    Privacy aside, but just for a second - if we don’t hold ourselves to a higher standard, our standard will just be lower. That’s all that will happen.

    shadearg,
    @shadearg@lemmy.world avatar

    We each make a choice, according to our level of comfort with privacy (or the lack thereof), about how we conduct ourselves through the solutions we use and the rituals we observe.

    Remember, privacy can never be enforced or guaranteed, only encouraged. Best practices, as available, as it were.

    emptiestplace,

    Agree, but I wasn’t talking about privacy.

    shadearg,
    @shadearg@lemmy.world avatar

    Privacy aside, but just for a second

    I apologize; you were very clear about setting privacy aside. Forgive me, I’m having trouble separating its context in this regard.

    I liken one’s level of standards to personal reputation. At the end of the day, that’s all we have: we accept what we are willing to live with.

    emptiestplace,

    No worries, it seems like you understand perfectly - I was just reflecting on the downvotes above.

    I like it here because the people often seem real, and the voting generally seems (to me, anyway) to follow more of a meritocratic pattern than whatever the fuck has been going on at the other place for the last ten or more years.

    We should probably try to really understand these differences so we might get better at designing communities that are actually sustainable. Maybe I am just getting old - I’m tired of starting over, I’m tired of watching great communities self-destruct.

    KLISHDFSDF,
    @KLISHDFSDF@lemmy.ml avatar

    Likely because, while SimpleX looks great and is very promising, it doesn’t add much to the conversation here. Signal is primarily a replacement for SMS/MMS, which means people generally want their contacts readily available and discoverable, to minimize the friction of securely messaging friends and family. Additionally, it’s dangerous to recommend a service that hasn’t been audited or proven itself secure over time.

    shadearg, (edited )
    @shadearg@lemmy.world avatar

    a service that hasn’t been audited

    Edit: provided link to audit

    KLISHDFSDF,
    @KLISHDFSDF@lemmy.ml avatar

    awesome! I obviously haven’t been keeping up. thanks!

    wincing_nucleus073,

    simplex is the real answer. especially over tor. and anyone can host a relay. it’s extremely secure too.

    LWD, (edited ) in 23andMe hackers accessed ancestry information on millions of customers using a feature that matches relatives

    deleted_by_author

  • Canadian_Cabinet,

    Yep. I did their test like 5 years ago and after the major hack (or breach or whatever) recently I requested that they delete my info

    starchylemming, in Meta sues FTC, hoping to block ban on monetizing kids’ Facebook data

    have they ever done something that’s not morally questionable ?

    ultratiem,
    @ultratiem@lemmy.ca avatar

    At this point I’m pretty convinced Zuckerberg eats about 2-3 babies a month.

    wincing_nucleus073,

    i feel bad for laughing at this lol

    mindbleach, in Google Researchers’ Attack Prompts ChatGPT to Reveal Its Training Data

    Text engine trained on publicly-available text may contain snippets of that text. Which is publicly-available. Which is how the engine was trained on it, in the first place.

    Oh no.

    PoliticalAgitator,

    Now delete your posts from ChatGPTs memory.

    mindbleach,

    Deleting this comment won’t erase it from your memory.

    Deleting this comment won’t mean there’s no copies elsewhere.

    archomrade,

    Deleting a file from your computer doesn’t even mean the file isn’t still stored in memory.

    Deleting isn’t really a thing in computer science; at best there’s “destroy” or “encrypt”.
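A small illustration of the point: a plain `rm` only unlinks the file's name, and the data blocks may persist on disk until reused, while `shred` overwrites the content before unlinking (file name and contents here are made up):

```shell
# rm unlinks the name; shred overwrites the data first, then unlinks.
printf 'secret\n' > /tmp/doomed.txt
shred -u /tmp/doomed.txt                        # overwrite, then remove
ls /tmp/doomed.txt 2>/dev/null || echo "name gone, content overwritten"
```

(On journaling or copy-on-write filesystems even shred has caveats, which is why "encrypt" is often the more practical answer.)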

    mindbleach,

    Yes, that’s the point.

    You can’t delete public training data. Obviously. It is far too late. It’s an absurd thing to ask, and cannot possibly be relevant.

    PoliticalAgitator,

    And to be logically consistent, do you also shame people for trying to remove things like child pornography, pornographic photos posted without consent or leaked personal details from the internet?

    DontMakeMoreBabies,

    Or maybe folks should think before putting something into the world they can't control?

    joshcodes,
    @joshcodes@programming.dev avatar

    User name checks out

    PoliticalAgitator,

    Yeah it’s their fault for daring to communicate online without first considering a technology that didn’t exist.

    DarkDarkHouse,
    @DarkDarkHouse@lemmy.sdf.org avatar

    Sooner or later these models will be trained with breached data, accidentally or otherwise.

    JonEFive,

    This whole internet thing was a mistake because it can’t be controlled.

    JonEFive,

    Delete that comment you just posted from every Lemmy instance it was federated to.

    PoliticalAgitator,

    I consented to my post being federated and displayed on Lemmy.

    Did writers and artists consent to having their work fed into a privately controlled system that didn’t exist when they made their post, so that it could make other people millions of dollars by ripping off their work?

    The reality is that none of these models would be viable if they requested permission, paid for licensing or stuck to work that was clearly licensed.

    Fortunately for women everywhere, nobody outside of AI arguments considers consent, once granted, to be both unrevokable and valid for any act for the rest of time.

    JonEFive, (edited )

    While you make a valid point here, mine was simply that once something is out there, it’s nearly impossible to remove. At a certain point, the nature of the internet is that you no longer control the data that you put out there. Not that you no longer own it and not that you shouldn’t have a say. Even though you initially consented, you can’t guarantee that any site will fulfill a request to delete.

    Should authors and artists be fairly compensated for their work? Yes, absolutely. And yes, these AI generators should be built upon properly licensed works. But there’s something really tricky about these AI systems. The training data isn’t discrete once the model is built. You can’t just remove bits and pieces. The data is abstracted. The company would have to (and probably should have to) build a whole new model with only properly licensed works. And they’d have to rebuild it every time a license agreement changed.

    That technological design makes it all the more difficult, both in terms of proving that unlicensed data was used and in terms of responding to requests to remove said data. You might be able to get a language model to reveal something solid that indicates where it got its information, but it isn’t simple or easy. And it’s even more difficult with visual works.

    There’s an opportunity for the industry to legitimize here by creating a method to manage data within a model but they won’t do it without incentive like millions of dollars in copyright lawsuits.

    Nyanix, in I don't have anything to hide, so I don't care
    @Nyanix@lemmy.ca avatar

    We’re entitled to a reasonable amount of privacy, such as locks on our doors and curtains on our windows, why shouldn’t reasonable privacy also apply to our lives online?

    christ0st, in I don't have anything to hide, so I don't care

    It’s not about secrecy. It’s about privacy.

    AdolfSchmitler, in I don't have anything to hide, so I don't care

    Gotta hit them with the “oh cool so let me see your phone and browsing history then”

    anothermember, in Question about phones: Am I overreacting?

    I find they’re a pain to use, and I only have one out of social pressure; privacy or not, I’m constantly confused as to why they’re so popular.

    I just use a throwaway account and have the rule of not putting in any data that I don’t want read, which is barely anything anyway because I do all my computing on my Linux laptop. I figure if they’re collecting location data and recording me, they’re just associating it with “random guy x”, because I’ve never given it anything else. I should look into one of the de-Googled Android distributions, but I have so little interest and energy in anything to do with it; even if it could be made totally private, I would still rarely use it.

    possiblylinux127, in I don't have anything to hide, so I don't care

    We all have something to hide

    qjkxbmwvz, in I don't have anything to hide, so I don't care

    Lot of folks here making the “nothing to hide? Great show me your browsing history” type arguments.

    I think this isn’t really arguing in good faith. There’s a big difference between a personal friend knowing something about you, and a faceless algorithm knowing something about you. The two cases are different; it’s fair to argue about how one is better or worse, but they are different.

    vsis, in KeepassXC and KeepassDX Guide
    @vsis@feddit.cl avatar

    KeepassXC + Syncthing is my personal solution to keep my credentials and sensitive data across my devices.
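A rough sketch of that setup (all paths and names are illustrative; the database creation and folder sharing happen once per device):

```shell
# KeePassXC database lives inside a folder that Syncthing replicates.
mkdir -p "$HOME/Sync/vault"
# Create the database (keepassxc-cli prompts interactively for a password):
#   keepassxc-cli db-create "$HOME/Sync/vault/passwords.kdbx"
# Then add ~/Sync/vault as a shared folder in each device's Syncthing GUI
# (default http://127.0.0.1:8384) and accept the share on the other devices.
ls -d "$HOME/Sync/vault"
```

Syncthing syncs the whole .kdbx file, so it is best to close the database on one device before editing on another to avoid conflict copies.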

    Imprint9816, in Question about phones: Am I overreacting?
    AI_toothbrush, in I don't have anything to hide, so I don't care

    I usually ask them to hand me their phone while it’s unlocked, and that really makes some people think. It’s funny, because at the same time I have so little to hide that the only reason I have a password on my phone is that it makes stealing it harder. But I’m not gonna hand my data to some random company just to watch braindead 30-second videos.

    ICastFist, in I don't have anything to hide, so I don't care
    @ICastFist@programming.dev avatar

    I guess they think they have nothing to hide, because they don’t know, or don’t care about, how their own information can be used against them.

    Because it doesn’t happen in an obviously invasive manner, they don’t think it’s a big deal. It’s harder to associate an abstract concept to actual value.
