• 2 Posts
  • 17 Comments
Joined 19 days ago
Cake day: January 7th, 2026


  • With “deletion” you’re simply advancing the moment they supposedly “delete” your data; something I refuse to believe they actually do. Instead, I suspect they “anonymize”, or effectively “pseudonymize”, the data (cross-referencing is trivial once a new account shows the same patterns, should the need arise). Letting the account stagnate, by contrast, wouldn’t prompt services to take even those steps, and all personal data remains connected to you, personally.

    For the Gmail account, I would recommend not deleting it. Instead: open an account at a privacy-respecting service (Disroot, as an example), connect the Gmail account to an email client (like Thunderbird), copy all its contents (including ‘Sent’ and any other specific folders) to a local folder (making sure to back these up periodically), delete all contents from the Gmail server, and simply wait for incoming messages at the now-empty Gmail account.

    If a worthy email comes in: copy it over to the local folder and delete it from the Gmail server. For services you still use, you could change the contact address to the Disroot account; others you could delete, or simply mark as spam (periodically emptying the spam folder). For privacy-sensitive services you may not want to wait until they finally make an appearance, and change these over to the Disroot address right away.

    I’ve been doing this for years now, and my big-tech accounts remain empty most of the time. Do make sure to transfer every folder, and make regular backups!
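    The copy-locally-then-delete-from-server workflow above can also be scripted instead of done by hand in Thunderbird. A minimal sketch, assuming IMAP access is enabled on the account and using an app password (the folder names and credentials here are placeholders, not from the comment):

```python
import email
import imaplib
import mailbox

GMAIL_HOST = "imap.gmail.com"  # Gmail's standard IMAP endpoint

def archive_folder(imap: imaplib.IMAP4_SSL, remote_folder: str, local_path: str) -> int:
    """Copy every message from remote_folder into a local mbox file,
    mark the originals as deleted on the server, and return the count."""
    mbox = mailbox.mbox(local_path)
    imap.select(f'"{remote_folder}"')
    status, data = imap.search(None, "ALL")
    if status != "OK":
        return 0
    count = 0
    for num in data[0].split():
        status, msg_data = imap.fetch(num, "(RFC822)")
        if status != "OK":
            continue
        mbox.add(email.message_from_bytes(msg_data[0][1]))  # local copy first
        imap.store(num, "+FLAGS", "\\Deleted")              # then flag server copy
        count += 1
    imap.expunge()  # actually remove the flagged messages from the server
    mbox.close()
    return count

# Usage (placeholder credentials — Gmail needs IMAP enabled and an app password):
# imap = imaplib.IMAP4_SSL(GMAIL_HOST)
# imap.login("you@gmail.com", "app-password")
# for folder in ("INBOX", "[Gmail]/Sent Mail"):
#     archive_folder(imap, folder, folder.replace("/", "_") + ".mbox")
# imap.logout()
```

    The mbox files this produces are exactly what you’d back up periodically; Thunderbird can open them too.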



  • My email providers forced me to, by locking me out of accounts I needed to access.

    Microsoft had me fill in a form to “prove” I was the rightful owner of the account, after some suspicious login attempts from an African country. The form included fields like: name (which I don’t think I supplied at creation, or supplied falsely), other email addresses, previous passwords (which potentially hands them completely unrelated passwords reused elsewhere), etc.; only for the application to be rejected, locking me out of my primary email for a full month. After that outright violation, I immediately switched to Disroot, and haven’t had any of said problems since. I back up all its contents locally using Thunderbird, and delete the originals from the server afterwards.

    Many platforms have this messed-up dark pattern of revoking one’s access to real-world dependencies unless one gives in to the service’s demands. Enforced 2FA is another one of those “excuses” for this type of misbehavior, and so is bot detection.


  • Yeah, I think they employ a pretty sophisticated bot-detection algorithm. I vaguely remember there being this ‘make 5 friends’ objective, or something along those lines, which I had no intention of fulfilling. If a new account doesn’t adhere to common usage patterns and triggers the manual review process, they simply have it supply additional information. Any collateral damage just means additional data to be appended to Facebook’s self-profiling platform… I mean, what else would one expect, when Facebook’s first outside investor was Palantir’s Peter Thiel?



  • I don’t know: it’s not just the outputs posing a risk, but also the tools themselves. Stacking technology can only increase the attack surface, it seems, at least to me. The fact that these models seem to auto-fill API values without user interaction is quite unacceptable to me; it shouldn’t require additional tooling to check for such common flaws.

    Perhaps AI tools in professional contexts are best seen as template search tools. Describe the desired template, and the tool provides the template it believes most closely matches the prompt. The professional can then “simply” refine the template to match the set specifications. Or perhaps rather use it as inspiration and start fresh, and not end up spending additional time resolving flaws.




  • Ah sorry, it seems I read over that part. Unless programmers have the exceptional skills and time required to effectively reverse-engineer these complex algorithms, nobody will bother to do so; especially when it’s required again after each update. On the contrary, if source code were available, the bar of entry would be significantly lower and require far less specialized skill. So it’s safe to say most programmers won’t even bother inspecting a binary, unless there’s absolutely no way around it or they have time to burn. Whereas, if you’d open up the source, there would be a lot more, let’s say, C programmers able to inspect the algorithm. Really, have a look at what it takes to write binary code, let alone reverse-engineer complicated code that somebody else wrote.

    I agree with Linus’ statement though: I rarely inspect source code myself, but I find it comforting to know that package maintainers, for instance, could theoretically check the source before distribution. I stand by my opinion that it’s a bad look for a privacy- and security-oriented piece of software to restrict non-“experts” from inspecting the very thing that should ensure those properties.
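    As a toy illustration of that skill gap (my own sketch, not from the thread): even a trivial routine is easy to audit as source, while its compiled form already needs tooling and practice to read. Python makes the contrast visible with the standard `dis` module:

```python
import dis
import io

def checksum(data: bytes) -> int:
    """Trivial 'algorithm': sum the bytes modulo 256."""
    total = 0
    for b in data:
        total = (total + b) % 256
    return total

# The source is four readable lines; the compiled form is a list of
# stack-machine opcodes with no hint of the author's intent.
buf = io.StringIO()
dis.dis(checksum, file=buf)
print(buf.getvalue())
```

    And that is the *friendly* case: native binaries drop variable names, types, and structure entirely, which is why opening the source widens the pool of potential auditors so dramatically.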







  • It’s almost as if they’re seeking to replace these with technology. They’ve purposefully neglected social services and will continue to do so, to lower the bar for AI and grant themselves an excuse for the poor “substitute”. And this isn’t at all restricted to the UK; in The Netherlands we’re in the midst of it too: the exact same playbook. Modern surveillance cameras (like Axis’, for example) have NPUs built in, or camera footage (even from legacy analog cameras, by use of encoders) is linked to an on-site server, a cloud service, or a combination of the two, which provides the same functionality. I hardly believe AI to be the limiting factor here; storage of footage is another story, however. I think they instead strategically place facial-recognition cameras, while the other cameras store only abstractions of the footage. Of course, if one of those cameras senses an event it recognizes as potentially of elevated relevance, it might store the raw footage. An example: railways doing face-scanning for “depression detection”, instead of implementing platform screen doors, of course…


  • I don’t turn my face towards houses while I’m walking if I notice a doorbell camera

    I do that haha… In all seriousness, I’ve recently quit my job as a mailman, in part because of this. Year after year I saw the number of doorbell cameras increase, and so grew my discomfort with a job requiring me to expose myself to these privacy-hostile situations. The worst-case scenarios were semi-detached houses, since the doors of the paired addresses sit right beside one another. Between the entries there’s often (decorative) separation, requiring some acrobatics to shortcut to the next address. If the second address had a doorbell camera while I sidestepped between the obstructions, I could either: A) face the door and have my face right up to the camera, or B) turn my back to it and spin back into position. I did the latter, and I HATED having to adapt my seemingly simple job to this extent, just to protect my dignity.

    The Netherlands technically requires a sign indicating camera surveillance, besides requiring cameras to be directed such that they cannot capture the public sphere. But take a guess at how much enforcement there is in this regard…