• 0 Posts
  • 19 Comments
Joined 7 months ago
Cake day: March 12th, 2025

  • wuffah@lemmy.world to Fediverse@lemmy.world · "I've recently turned into a blocker." · +2/−5 · edited 4 days ago

    The electronic machine you’re operating, and the electrons flowing through it to illuminate the screen, constitute a highly ordered abstraction that your human brain interprets as meaning. The software implementing that abstraction has been structured with paradigms developed over decades, with functionality created specifically so you can manage the information displayed to you. Such is the power of these technologies that they are widely regarded as having culminated in a revolutionary digital information age. One of the defining moments of that age is the point at which the software, previously designed to implement the will and preferences of the user, began changing to serve the developer instead. You could say the fundamental philosophy of social media software has become to optimize it so that the user keeps using it while freely feeding it information and remaining subject to manipulation.

    The abstraction has become hostile, and the tools to manage the information displayed are quickly disappearing as the implementation is abstracted away. The ability to block memetically harmful information is being designed out of software - exposure to advertising, propaganda, violent or disturbing content, and even the addictive abstractions themselves, has become a requirement for use. The filtering and management of information through the hardware and software that you OWN is not just a feature; it is a RIGHT that must be intrinsic to its design.

    In my view, using blocking technology should not be considered a human social action with emotional weight, but a mechanical one, like switching off a light or moving an object out of the way. Blocking features are information management tools built to serve YOU, the user. If the technology you are using does not serve you, then who are you serving?



  • It’s pretty clear to me that Microsoft wants your PC to be just like your phone: closed-source, vendor-only hardware and software, with all user data cryptographically linked to your identity. Coupled with social media and internet mass surveillance, device-level surveillance will fully enable the fascist takeover of the United States and other countries. There are untold riches in selling your device-level actions to an authoritarian government so it can eliminate electoral opposition, and to advertisers who will advertise and capture insights at the OS level.

    We have been building the surveillance state for decades, and now there is a federal power that is willing to use it not just extra-judicially, but against its own citizens to suppress their constitutional rights. ICE is already using this power to arrest and disappear lawful citizens without trial. Protesting in a city where the national guard is illegally deployed? Better not bring your phone or speak about it online or do anything on your phone relating to it, really. Hell, eventually you won’t be able to safely speak out loud anywhere even near a mobile phone. The Great Eye is ever watchful.

    Imagine no more covert device interception, no more packet-level analysis from expensive secret rooms at your ISP, and no more digging through phone records and social media posts - just organized, searchable, chatbot-queryable information, updated hourly and purchased with your tax dollars from Apple, Microsoft, and Google, about what you think, where you go, what you buy, what you do, and who you talk to every minute of every day, with an integrated secret police ready to arrest you at a moment’s notice for thought-crime or an attempt to exercise your rights. In the end, an AI agent will just tell them where to go and who to arrest. An authoritarian’s dream.

    All the attention you’ve paid and all the work you’ve done preserving your privacy is about to come to fruition. And it still won’t be enough to save us.


  • This reminds me of ELIZA, a natural language processing program from the 1960s that induced what came to be called the ELIZA effect, the tendency to anthropomorphize the computer. Joseph Weizenbaum, the computer scientist who created ELIZA, wrote Computer Power and Human Reason: From Judgment to Calculation, in which he contends that while artificial intelligence may be possible, we should never allow computers to make important decisions, because they will always lack human qualities such as compassion and wisdom.

    The danger is not that AI will become self-aware and turn against humanity; it is that people will not realize it has already been turned against us by its masters.



  • If you know Flock, you know that most of their cheap cameras go down all by themselves. Even when they’re operating as intended, their capture rates are under 70%, which is why you usually find them in pairs monitoring the same direction of traffic. That dinky solar panel can barely power them through the night, so most of their cameras are dead in the early morning hours.

    The only way Flock stays in business is by literally giving their cameras away: installing them illegally in municipalities and waiting for them to be ordered removed. You’d probably be doing most cities a favor by taking them down.
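    The pairing makes arithmetic sense, by the way - here’s a minimal sketch, assuming each camera in a pair misses a plate independently and using the sub-70% per-camera rate cited above (the function name is my own):

```python
# Rough model: probability that at least one of n cameras captures a plate,
# assuming each camera succeeds independently with probability p.
def combined_capture_rate(p: float, cameras: int) -> float:
    """At least one success out of `cameras` independent attempts."""
    return 1 - (1 - p) ** cameras

single = combined_capture_rate(0.7, 1)  # 0.70
paired = combined_capture_rate(0.7, 2)  # ~0.91
print(f"one camera: {single:.0%}, paired: {paired:.0%}")
```

    So a second camera pointed the same way lifts a ~70% capture rate to roughly 91% - which would explain the doubled-up installations.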


  • wuffah@lemmy.world to Privacy@lemmy.ml · *Permanently Deleted* · +63 · edited 2 months ago

    Google, Apple, and Microsoft are all doing the same thing with their operating systems. Soon, everything you do on your device will be recorded, analyzed, compiled, and cross-referenced. All of it ready to be used against you in advertisements, commerce, and a court of law.

    If you think age checks are bad now, wait until it’s enforceably illegal to even look at pornography on your device. Or maybe you receive a knock at your door for a missed period, or some questionable searches while pregnant? Higher ride-app pricing for a low phone battery? Now your digital credit score determines your eligibility and cost for a ride. Think your VPN will hide that pirated movie or your location? Who needs to bypass encryption when your entire screen is analyzed in real time by a hardware-driven AI classification system?

    Advertisers, authoritarians, media execs, and tech bros are vibrating so hard that they’re starting to glow. This is the real AI revolution into which they have sunk so much money. Do you think they’re just going to accept that they won’t get a return on their investment because you think you have a right to privacy?

    How long will we even have access to hardware and software that doesn’t contain baked-in content analysis?


  • You’re right, it appears that NPU hardware was introduced with the iPhone 8 in 2017, and I say this while typing away on an Apple device. Sounds like I’ll be busy this weekend with that rusty spork.

    However, it’s the recent generation of NPUs that provides the processing power needed to run the expanded scanning services implemented by these OSes. That’s why Apple Intelligence requires an iPhone 15 Pro, and why Microsoft is hawking Recall laptops.

    While I admit to the pins-and-string-on-a-corkboard tone, I don’t think we actually disagree on anything here. Eventually, an open-source platform will become the only way to avoid this kind of hardware-enabled surveillance.


  • Microsoft, Google, and Apple are all quietly integrating NPUs into their devices and implementing the software infrastructure in their operating systems to do on-device classification of content: Windows Recall, Google SafetyCore, and Apple Intelligence. These services are obsequiously marketed as being for your benefit, while all are privacy and surveillance nightmares. When the security-breaking features of these systems are mentioned, each company touts convoluted workarounds to justify the tech.

    Why would these companies risk rabidly forcing these unwanted, unpopular, insecure, expensive, and unnecessary features on their collective user bases? The real reason is to capture everything you do and store on your device, use the tensor hardware you may not even know you purchased to analyze the data locally, then export and sell that “anonymized” information to advertisers and the government - all while cryptographically tying the data to your device, and the device to you, for “security”. This enables mass surveillance, digital rights management, and targeted advertising at a scale and depth previously unseen. Who needs a backdoor or a quantum computer to break consumer-grade encryption when you can just locally record everything everyone does and analyze it automatically at the hardware level?

    Each of these providers is already desperate to scan, analyze, and classify your content:

    Microsoft has been caught using your stored passwords to decrypt archives uploaded to OneDrive.

    Apple developed forced client side scanning for CSAM before backlash shut it down. They already locally scan your photos with a machine learning classification algorithm whether you like it or not. You can’t turn it off.

    Google recently implemented local content scanning with SafetyCore to “protect you from unwanted content like spam”. Then why is it scanning your photo library?

    I would rather saw off my own nuts with a rusty spork than willfully purchase a device with an integrated NPU. I fear that in the next 5-10 years, you won’t be able to avoid them. We are paying for the edge hardware used for our own unwilling surveillance. Then our tax dollars are paid to these tech companies to purchase the data!

    Do you trust the rising fascist regimes and their tech lackeys in America and the UK to use this power morally and responsibly?

    Do you really believe that these features that you didn’t ask for, that you cannot disable, and are baked directly into the hardware, are for your benefit?