• MyMindIsLikeAnOcean@piefed.world
    17 hours ago

They should be compelled to…sell fewer ads? Silly. What do you mean by “tools”? There are a gajillion tools that nobody understands or uses…we need more responsibility from the purveyor…not the user. Saying you want tools is the status quo.

    Moderation is the only solution. Social media companies should be required, with no exceptions, to follow the laws of the regions they operate in. They don’t do that…they put out whatever, whenever, and take almost no responsibility for what they expose people to.

    • deathbird@mander.xyz
      8 hours ago

      By “tools” I generally mean software, and the options/functionalities that software offers through its regular user interface, which let one modify the software’s outputs and thus one’s user experience. So in this sense Windows 11 is a “tool” that, as an operating system, enables one to use a computer, but it also supplies tools to modify that experience, such as one that lets a privileged user prevent non-privileged users from uninstalling software or sharing a printer to the LAN, right? Facebook (a software deployment owned and remotely hosted by Meta) has a tool that allows a person with a JavaScript-enabled web browser (also a tool), or with Meta’s proprietary application, to send a message to a stranger on the internet, or to a known person, along with a lot of other things, right?

      Now what Windows 11 doesn’t have is a tool that lets me locate my mouse pointer on screen easily, but that’s okay, because I can install PowerToys to gain that functionality. I can also install software that modifies the Facebook experience to some degree, but there’s not a lot of that, for various reasons, and I certainly can’t find any that sells itself as a child-safety or parental-control solution. But that makes sense, right? Because in order to serve that functionality, it would have to be deployable across every computer the child uses to access that remote service, and it would have to be updated to match every change in that service’s software, like a shadow attached to your feet. Not practical at all.

      Obviously this is of limited use, which is why people who use tools to modify their experience of social media sites like FB are usually doing so merely for their own comfort and enjoyment. That’s valid, but it’s not the same purpose as parental control. And the relationship between the remote service and the local software developer is adversarial. This is why there are plenty of parental-control tools to block a website, but none to modify one.

      I actually agree that moderation is the solution, but not in the way you mean. FB doesn’t create content; it just lets people share their own (bots too, but set that aside). I don’t think any sane person believes that Meta or lemmy [dot] world or any other platform could continue to exist if it were held responsible for what its users said. Platforms make what moderation efforts they do to avoid getting DMCAed, to keep themselves advertiser-friendly, and to make their services sufficiently enjoyable to the users whom those advertisers want their ads to be seen by. That last bit’s important, but look at even the first two, a legal regulation and “regulation” by market forces in the wild, and you can see how these already cause problems. But what platforms like FB don’t give you, because they don’t want you to have it, is control over your user experience.

      FB doesn’t want you to have tools (account options) to moderate your own or your child’s experience on their platform, because it would cost them money, both in development costs and opportunity costs. But that’s what’s actually needed to make FB an enjoyable and even child-safe experience. Not broad legal “moderation” demands that no platform could survive without obscenely invasive company-side tools and exploitative labor outsourcing, but functional tools (which, yes, would have to be mandated by law, because they won’t build them voluntarily) that enable the user to control their own experience.

      It’s a question of: do you want some underpaid and thrice-subcontracted Indian/Nigerian tech workers reading your teen’s sexts with his boyfriend and making judgment calls as to their appropriateness, or do you want the capacity to simply allow communication between those two accounts without monitoring them, while retaining the ability to block DMs from unknown accounts so your kid doesn’t get groomed by a stranger? We’re constantly told we have to choose between total system control and the Wild West, but we’re only encouraged to consider those two options because they’re what’s cheapest for the companies.
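      The user-side control described above amounts to a very small piece of software. Here’s a minimal sketch of the idea in Python (all names hypothetical; this illustrates an allowlist-based DM policy, not any real platform’s API):

      ```python
      # Hypothetical sketch: a parent (or the account holder) approves specific
      # contacts; messages between approved accounts flow through unmonitored,
      # while DMs from unknown accounts are blocked outright.

      APPROVED_CONTACTS = {"boyfriend_account"}  # set via account options

      def dm_allowed(sender: str, approved: set = APPROVED_CONTACTS) -> bool:
          """Allow a DM only if the sender is on the approved-contacts list.
          No content inspection happens: the policy is purely about WHO may
          message, not WHAT they say -- which is the whole point of user-side
          moderation versus company-side content review."""
          return sender in approved

      print(dm_allowed("boyfriend_account"))  # True: known contact, unmonitored
      print(dm_allowed("random_stranger"))    # False: unknown account blocked
      ```

      The point isn’t that this is hard to build; it’s that the decision of who goes on the list sits with the user, not with a subcontracted review queue.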