This is an open question: how do we get the masses to care…

Unfortunately, if other people don’t protect their privacy it affects those who do, because we’re all connected (e.g. family members, friends). So the problem becomes: how do you get people who don’t care to care?

I started the Rebel Tech Alliance nonprofit to try to help with this, but we’re still really struggling to convert people who have never thought about this.

(BTW you might need to refresh our website a few times to get it to load - no idea why… It does have an SSL cert!)

So I hope we can have a useful discussion here - privacy is a team sport, how do we get more people to play?

  • merde alors
    68 points · 18 days ago

    you should stop calling people “normies”, if you want them to care about what you have to say

    • whoareu
      11 points · 18 days ago

      I don’t call them normies because I look down on them or hate them; I do it because whenever I try to educate them about privacy-oriented services they mock me, saying “you are crazy”, “you aren’t the president”, “nobody cares about your data”, yada yada yada…

      It makes me frustrated :(

      • मुक्त
        5 points · 18 days ago

        Framing “them” as fundamentally different reinforces the mental barrier that your requirements and their requirements are different. Avoid it.

        • @cardfire@sh.itjust.works
          5 points · 18 days ago

          You’d better believe marketing execs and specialists in branding will divide and conquer market segments of apathetic typical people.

          Addicts in recovery programs call the general population of non-addicts ‘normies’; people who have been marginalized for neurodivergent thinking often call the mainstream population of neurotypicals ‘normies’, etc.

          Gatekeeping language that is commonly accepted across diverse circles only serves your own purity testing instead of focusing on the core issue: how to sell people on exercising their own basic self-interest.

      • @Auli@lemmy.ca
        3 points · 18 days ago (edited)

        The problem is their arguments are not wrong. Nobody does care about your data, which makes it so hard to convince people about the dangers.

    • @Paddy66@lemmy.mlOP
      4 points · 18 days ago

      noted, and you’re right.

      I actually mis-applied that term in my post. I’ve been trying to learn about tech, and self hosting in particular, along this journey. I found that ‘normies’ is the term that tech-savvy people apply to people who don’t know about tech - i.e. me! - and I started using it. In the sense of “these install instructions will never work with normies”.

      In this context I shouldn’t have used it to refer to people who do not care about data privacy. I’ll edit my post.

      Thank you for pointing that out!

      • merde alors
        1 point · 17 days ago

        Privacy is a team sport - how do we get more more people to play?

        now you’re calling them "more"s 🙂

  • @MoonlightFox@lemmy.world
    15 points · 18 days ago

    I think certain arguments work, and others don’t.

    I live in a very high trust society, Norway. This has a lot of advantages, but also some downsides.

    We trust each other, our neighbours, our government and our media. Which is fantastic, and well deserved. The government deserves that trust.

    This makes it hard for me to make people realize how important privacy is, because they trust organizations with their data.

    During COVID, Norway made its own app for tracking who met whom, to prevent the spread. Of all the apps in the world, Norway wanted to push just about the least privacy-friendly one. This from a country with the highest rankings for press freedom and democracy. Most people thought it was fine, because why not? We trust our government.

    https://www.amnesty.org/en/latest/news/2020/06/norway-covid19-contact-tracing-app-privacy-win/

    Luckily someone protested enough, and it got scrapped for something better.

    When I try to convince someone I have a couple of angles:

    1. You trust the government and organizations with your data today. But do you trust the government in 30 years? Because data is forever. The US has changed a lot in a very short time; this can happen here as well.

    2. You have a responsibility for other people’s privacy as well. When you use an app that gets access to all your SMSes and contacts, you spy on behalf of companies on people who might need protection - asylum seekers from other countries, for instance.

      • @MoonlightFox@lemmy.world
        1 point · 18 days ago

        While I agree in theory, in practice open source has a similar amount of expected trust as closed source can have in many cases. I use all sorts of open source software without reading the code. I ain’t got time for that.

        I can trust that software from a lot of organizations is trustworthy even if it is closed source, but I can’t trust any open source repo without reading the code. I have to evaluate it in other ways: is it probable that someone has audited it? Is it popular? Is it recognized as safe and trustworthy? Is the published, finished build the same as the one I would get if I built it myself?

        But yes, you can never be 100% certain without open source and auditing it yourself.

        I do trust that my travel pass app from a government organization doesn’t install malware / spyware on my phone. I can’t trust a random github repo even if it is open source.

    • Mike
      5 points · 16 days ago (edited)

      Something similar happened in Denmark with the new Sundhedsloven, which had provisions allowing the government to forcibly isolate people in concentration camps, along with forcibly vaccinating them. This was during the COVID-19 pandemic.

      This was of course alarming for those who were in the know, but very few people protested (and the law was subsequently amended); the general attitude from the public was “it’s not a problem because something like THAT would never happen in Denmark.” 🤡

      • @MoonlightFox@lemmy.world
        3 points · 18 days ago

        We had an emergency law that was almost passed recently - as in, it passed the first of two rounds. The second voting round is just a formality; in practice all laws pass after the first. Luckily some law professor raised the alarm and it did not pass the second time. It was stopped with only a couple of hours to spare.

        The law gave the government the ability to force people to do a lot of things - work any job at any place in Norway, for example. If you did not comply you could get up to three years in prison. It would not be a problem with the current government, or any government in the near future, but it is a law. And we can’t have laws that rely on trusting politicians, because we might have politicians with anti-democratic tendencies in the future.

        • @Paddy66@lemmy.mlOP
          2 points · 17 days ago

          This is the same argument against trusting opaque algorithms from proprietary systems (usually billionaire owned). You just don’t know when they’re going to tweak it for their purposes.

    • @Paddy66@lemmy.mlOP
      2 points · 18 days ago

      This is a VERY interesting perspective - thank you for sharing!

      You are lucky in Norway to have that level of trust, but I’d never considered the flip side: that it would create a dangerous apathy about privacy.

      Your two angles are great:

      1. This is so true, but for some it is so nebulous, and in countries like the UK (especially if you are white and not struggling financially) an exceptionalism creeps into the thinking. Probably because we’ve never been invaded and occupied. I was in Norway last year, and Denmark this year, and no one wants that to happen again. It seems to have shaped thinking a lot - correct me if I’m wrong 😊

      2. This is a big one - privacy is a collective problem. It’s a team sport. I have had some success with this argument.

      What’s very hard is to convey to people just how amazingly powerful and efficient big tech’s profiling models really are. Trillions of computations a minute to keep your creepy digital twin up to date. Most people cannot get their head round the scale of it, and I’m struggling to visualise it for them!

      • @swordfish@lemmy.world
        1 point · 17 days ago

        I think it’s a good idea. People are more likely to cooperate and take advice from people who don’t call them names. I understand that “normie” was not meant as an insult, but it might be perceived that way.

  • @FriendBesto@lemmy.ml
    9 points · 18 days ago

    I have learned that the best game is simply not to play. You risk annoying the hell out of people. Let them get curious; maybe mention it, but they have to come to you. Pushing it onto people who do not care is simply not worth it. You are wasting your time - this is real life. Some people will simply not want to care. It is their choice, and sometimes that choice will not match yours.

    The people I have so-called converted were people who actually were interested to know more. If you push it on people who are not interested then you risk being that annoying person who comes off as an activist or ideologue.

  • @hansolo@lemm.ee
    9 points · 18 days ago

    There are several overlapping problems:

    First, the problem is complex. It’s not just “Microsoft bad.” There’s a turducken lasagna of layered problems that makes it hard for the average person to wrap their head around the issue.

    Next, there’s no direct monetary incentive. You can’t say “you lose $500 a year because data brokers know your address.” Most people have also relied their whole lives on free email, so the average person is already in “debt” in terms of trade-offs.

    You’re also starting from a point of blaming the victim in a way. It’s the same problem companies have with cybersecurity, blaming everyone except the executive that didn’t know the risks of skimping on cyber budgets. Hiding the problem to avoid public shame is the natural human response.

    Finally, resolving the problem is fucking hard. I know, we all know: it’s a constantly moving target that requires at the very least moderate technical skill. My partner wants more privacy online, but would rather have convenience in many cases, and has zero patience for keeping up with changes, so I have to be the CISO for the household. So the average person, and the average household, does not have the skillset to care “effectively” even if they wanted to.

    • @Paddy66@lemmy.mlOP
      1 point · 18 days ago

      First of all, it’s May 4th so happy Star Wars day Han Solo!

      Your points land… hard. Yes it is so messed up that privacy has been pushed on the end user as ‘their problem to fix with consent choice’. As you all know here it’s not a real choice.

      Yes this should all be solved at the regulatory / gov level, but whilst the EU has been doing some great things recently, and the US has just kicked Apple and Google and Meta in the balls for antitrust, it’s never enough - there’s just too much lobbying and money washing around.

      So, sadly, it does come down to the individual. My position is “if huge numbers of people starve the system of their behavioural data, then the surveillance economy is less effective, and perhaps other business models will have a chance”. Do you think that holds water?

      • @hansolo@lemm.ee
        1 point · 17 days ago

        And may The Force also be with you.

        And don’t take it personally - it’s a fair question, and the answer is exactly why people get degrees in things like public policy.

        The way to “solve” this for the average person is two steps. First, services like DeleteMe that make them feel like they can “get back” their privacy. Second, dumbed-down education with easy means. A year ago, uBlock did amazing stuff, and only 33% of internet users were using it. Exclude 25% of the remainder as enterprise setups that don’t allow extensions, and you still have 40+% of people online just rawdogging MSN and Yahoo and Drudge Report. Like, have you seen that internet lately? It’s fucking intolerable. But the same people that install searchbars won’t install uBlock. You have to be aggressive about explaining the value for 10 seconds of their time.

        It’s a genuine campaign that takes time and alluring promos.

    • @Auli@lemmy.ca
      1 point · 18 days ago

      The data broker one is kind of weak though - addresses have never been private. I mean, we used to give everyone a book with everyone’s address and phone number. Also, anyone can look up who owns what land; you would have to do some serious stuff to hide owning land, and most people are not going to do that.

  • @themurphy@lemmy.ml
    9 points · 18 days ago

    People want convenience. You’ll never get people to do it unless it personally affects them. Realistically, you can convert a few.

    But most importantly: it shouldn’t be that hard to have privacy. THAT’S the problem. People shouldn’t need to do a lot of things to get it.

    Do something about the problem itself (politically, by changing privacy laws) instead of converting every single person.

    But I know that can be near impossible depending on where you live.

    • @Paddy66@lemmy.mlOP
      1 point · 18 days ago

      oh yes, convenience… a big problem when moving from the alternatives.

      And I have to acknowledge that I’m an unusual case - I would rather use a less-good service than give my data to a better one. I know most people don’t think like that.

      That’s why the alternatives we recommend are usually the zero-knowledge encrypted ones, and they need to have a good experience. But privacy by design is sadly not that widely adopted in products. It has been increasing, though only very slowly.

      And on your point about hitting the problem where mass change can happen, e.g. politically and legally - that is more the domain of our friends at other orgs like EFF, noyb, The Citizens etc. But you’re right, that is where change needs to happen. Not easy when the big tech firms lobby so hard and throw money at the problem.

      • @themurphy@lemmy.ml
        2 points · 17 days ago

        Yeah, they really do throw money around to keep control…

        And I know it doesn’t help to always say “we need political change”, because it’s also an easy escape to just say that.

        I’m also trying my best to move myself and my friends to other platforms, and we shouldn’t stop. Be the change.

  • @tomatolung@sopuli.xyz
    6 points · 17 days ago

    Great cause, and one that reaches to the heart of what I see as driving much of the governmental and societal disruption that’s happening. It’s a complex and nuanced issue that is likely to take multiple prongs and a long time to resolve.

    Let me start by again generally agreeing with the point. Privacy is necessary for reasons beyond the obvious needs - preaching to the choir here in a privacy community. I think it’s worth listing the reasons, as I understand them, why Americans are generally dismissive of the need for privacy protections. I cheated here and used an LLM to help, but I think these points are indicative of things to overcome.

    • Convenience > confidentiality. Nearly half of U.S. adults (47 %) say it’s acceptable for retailers to track every purchase in exchange for loyalty-card discounts, illustrating a widespread “deal first, data later” mindset. Pew Research Center

    • “Nothing to hide.” A popular refrain equates privacy with secrecy; if you’re law-abiding, the thinking goes, surveillance is harmless. The slogan is so common that rights groups still publish rebuttals to it. Amnesty International

    • Resignation and powerlessness. About 73 % feel they have little or no control over what companies do with their data, and 79 % say the same about government use—attitudes that breed fatalism rather than action. Pew Research Center

    • Policy-fatigue & click-through consent. Because privacy policies are dense and technical, 56 % of Americans routinely click “agree” without reading, while 69 % treat the notice as a hurdle to get past, not a safeguard. Pew Research Center

    • The privacy paradox. Behavioral studies keep finding a gap between high stated concern and lax real-world practice, driven by cognitive biases and social desirability effects. SAGE Journals

    • Market ideology & the “free-service” bargain. The U.S. tech economy normalizes “free” platforms funded by targeted ads; many users see data sharing as the implicit cost of innovation and participation. LinkedIn

    • Security framing. Post-9/11 narratives cast surveillance as a safety tool; even today 42 % still approve of bulk data collection for anti-terrorism, muting opposition to broader privacy safeguards. Pew Research Center

    • Harms feel abstract. People worry about privacy in the abstract, yet most haven’t suffered visible damage, so the risk seems remote compared with daily conveniences. IAPP

    • Patchwork laws. With no single federal statute, Americans face a confusing mix of state and sector rules, making privacy protections feel inconsistent and easy to ignore. Practice Guides

    • Generational normalization. Digital natives are more comfortable with surveillance; a 2023 survey found that 29 % of Gen Z would even accept in-home government cameras to curb crime. cato.org

    Having listed elements to overcome, it’s easy to see why this feels like a Sisyphean task in American society. (It is similar, but different, in other Global North societies. The US desperately needs change, as is evident with the current administration.) Getting to your question though, I feel the real rational points to convey are not those above, but the ways a lack of privacy impacts individuals.

    • Political micro-targeting & democratic drift
      Platforms mine psychographic data to serve bespoke campaign messages that exploit confirmation bias, social-proof heuristics, and loss-aversion—leaving voters receptive to turnout-suppression or “vote-against-self-interest” nudges. A 2025 study found personality-tailored ads stayed significantly more persuasive than generic ones even when users were warned they were being targeted. Nature

    • Surveillance pricing & impulsive consumption
      Retailers and service-providers now run “surveillance pricing” engines that fine-tune what you see—and what it costs—based on location, device, credit profile, and browsing history. By pairing granular data with scarcity cues and anchoring, these systems push consumers toward higher-priced or unnecessary purchases while dulling price-comparison instincts. Federal Trade Commission

    • Dark-pattern commerce & hidden fees
      Interface tricks (pre-ticked boxes, countdown timers, labyrinthine unsubscribe flows) leverage present-bias and choice overload, trapping users in subscriptions or coaxing them to reveal more data than intended. Federal Trade Commission

    • Youth mental-health spiral
      Algorithmic feeds intensify social-comparison and negativity biases; among U.S. teen girls, 57 % felt “persistently sad or hopeless” and nearly 1 in 3 considered suicide in 2021—a decade-high that public-health experts link in part to round-the-clock, data-driven social media exposure. CDC

    • Chilling effects on knowledge, speech, and creativity
      After the Snowden leaks, measurable drops in searches and Wikipedia visits for sensitive topics illustrated how surveillance primes availability and fear biases, nudging citizens away from inquiry or dissent. Common Dreams

    • Algorithmic discrimination & structural inequity
      Predictive-policing models recycle historically biased crime data (representativeness bias), steering patrols back to the same neighborhoods; credit-scoring and lending algorithms charge Black and Latinx borrowers higher interest (statistical discrimination), entrenching wealth gaps. American Bar Association; Robert F. Kennedy Human Rights

    • Personal-safety threats from data brokerage
      Brokers sell address histories, phone numbers, and real-time location snapshots; abusers can buy dossiers on domestic-violence survivors within minutes, exploiting the “search costs” gap between seeker and subject. EPIC

    • Identity theft & downstream financial harm
      With 1.35 billion breach notices issued in 2024 alone, stolen data fuels phishing, tax-refund fraud, bogus credit-card openings, and years of credit-score damage—costs that disproportionately hit low-information or low-income households. ITRC

    • Public-health manipulation & misinformation loops
      Health conspiracies spread via engagement-optimized feeds that exploit negativity and emotional-salience biases; a 2023 analysis of Facebook found antivaccine content became more politically polarized and visible after the platform’s cleanup efforts, undercutting risk-perception and vaccination decisions. PMC

    • Erosion of autonomy through behavioral “nudging”
      Recommendation engines continuously A/B-test content against your micro-profile, capitalizing on novelty-seeking and variable-reward loops (think endless scroll or autoplay). Over time, the platform—rather than the user—decides how hours and attention are spent, narrowing genuine choice. Nature

    • National-security & geopolitical leverage
      Bulk personal and geolocation data flowing to data-hungry foreign adversaries opens doors to espionage, blackmail, and influence operations—risks so acute that the DOJ’s 2025 Data Security Program now restricts many cross-border “covered data transactions.” Department of Justice

    • Social trust & civic cohesion
      When 77 % of Americans say they lack faith in social-media CEOs to handle data responsibly, the result is widespread mistrust—not just of tech firms but of institutions and one another—fueling polarization and disengagement. Pew Research Center

    • @tomatolung@sopuli.xyz
      3 points · 17 days ago

      And one last point here is that these all stem from the way we as humans are built. Although we are capable of rational thought, we often do not make rational decisions. Indeed, those decisions are based on cognitive biases which we all have and which are affected by context, environment, input, etc. It’s possible to overcome this lack of rational judgement through processes and synthesis such as the scientific method. So we as citizens and humans can build institutions that help us account for the individual biases we have and overcome these biological challenges, while also enjoying the benefits and remaining human.

  • @Termight@lemmy.ml
    4 points · 18 days ago (edited)

    One method is to put a $ value on privacy. Consider this: if you were offered $5 for every piece of information you shared about yourself, would you still share it? Probably not. But the true cost is far less obvious, spread out over time, and often masked by the convenience of “free” services.

    • @tomatolung@sopuli.xyz
      2 points · 17 days ago

      I like this concept, and I feel it’s a step along the way, as it is essentially what’s happening. The EULAs, TOSes, SLAs, etc. are all contracts, which should be negotiable by both parties and allow individuals or groups to define value, be that monetary value (the $5) or something in trade. Somehow we the masses skipped over the negotiation, and are left with an almost binary choice: either accept and use it, or not. (You could sue, or protest, etc., but without standing or a large following this is not effective for an individual.)

      So whilst I agree, I also think it might be more useful to focus on the reason the information is valuable.

  • @Jason2357@lemmy.ca
    4 points · 17 days ago

    As a thought experiment: what would have happened if, instead of a public health regulation approach, we dealt with restaurant safety by providing a few safe places and advocating that everyone go there if they don’t want salmonella or E. coli poisoning? We’d have ignorant people going to the dangerous places, others misinformed or in denial, and a flood of misinformation that food poisoning is either “fine” or unavoidable anyway, so best not to worry.

  • @corvus@lemmy.ml
    2 points · 18 days ago (edited)

    Tell them how governments, employers and scammers buy the data collected by the apps on their phones from data brokers, to surveil, blackmail or scam them. Do some research and send them a good summary with the links. When I told my brother-in-law about this, he was stunned. He’s still using his phone as always lol, so don’t have too high expectations.

    • @Paddy66@lemmy.mlOP
      2 points · 18 days ago

      I’ve had a bit of success with this - a cousin, for example, was shocked by a report I sent him about the RTB system - but I worry that if I send too much of that kind of info then people will think I’m some kind of conspiracy theorist. 😱

  • Drunk & Root
    1 point · 17 days ago

    for the site, see if you can reissue the cert, or try certbot. if you already used certbot, try manually downloading the cert and pointing to it
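
    one way to narrow it down is to check what certificate the server is actually presenting on port 443 - a minimal sketch using only Python's standard library ("example.org" below is just a placeholder for the real domain):

        # Minimal sketch: inspect the certificate a host actually serves over TLS.
        # "example.org" is a placeholder - substitute the real domain.
        import socket
        import ssl

        host = "example.org"
        context = ssl.create_default_context()

        try:
            with socket.create_connection((host, 443), timeout=10) as sock:
                with context.wrap_socket(sock, server_hostname=host) as tls:
                    cert = tls.getpeercert()
                    print("TLS version:", tls.version())
                    print("Issuer:     ", dict(pair[0] for pair in cert["issuer"]))
                    print("Expires:    ", cert["notAfter"])
        except ssl.SSLError as err:
            # TCP connected, but the handshake or certificate failed validation
            print("TLS/certificate error:", err)
        except OSError as err:
            # port 443 unreachable - would match an "only http works" symptom
            print("Connection error:", err)

    if that prints a sane issuer and expiry but the browser still stalls, the problem is more likely a redirect or server config issue at the host than the cert itself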

    • @Paddy66@lemmy.mlOP
      1 point · 17 days ago

      The site is hosted by a hosting company - and they assure me that the cert is fine.

      If I was self hosting I’d expect these problems, but not with a hosting company.

      The only difference with this company is that they do not use any big tech infrastructure - they have their own servers. I wonder if big tech has something they don’t…?

      • Drunk & Root
        1 point · 15 days ago (edited)

        idk - for me it doesn’t show an error, just “cannot complete request” over https. it’s quite odd: even though the connection’s not secure, i can use http for it and it works

        • @Paddy66@lemmy.mlOP
          1 point · 15 days ago

          really? It works with just http? that is weird.

          It suggests to me that the web hosting company we are using don’t know what they’re doing. We’re going to change.

  • @GrumpyDuckling@sh.itjust.works
    1 point · 18 days ago (edited)

    I can use an SDR to read your water meter and determine how often you go to the bathroom, shower, or wash your clothes, and when you’re home - and it’s not illegal. I’m allowed to follow you around and take your picture as much as I want. I can print off as many pictures of you as I want in public and wallpaper my whole house with your face and body, and there’s nothing you can do about it. I can do an 8-hour video essay about you and share it with everyone. As long as the info is publicly available (or not, in most U.S. states), it’s legal.

    • @Paddy66@lemmy.mlOP
      1 point · 18 days ago

      Damn. that is creepy. Similar to the comment someone else left about stalking…

      Maybe I’ll do a series of case studies via the blog - thank you for sharing this!

      • @GrumpyDuckling@sh.itjust.works
        2 points · 18 days ago

        In my state it’s not stalking if you don’t make any threats. You don’t have an expectation of privacy in public. That’s the argument they use for license plate cameras and other warrantless surveillance, tracking, facial recognition, etc.

  • @Churbleyimyam@lemm.ee
    1 point · 18 days ago

    I sometimes wonder if NordVPN has done more for the privacy cause than anything else, purely for the sheer amount of advertising.

    • @Auli@lemmy.ca
      1 point · 18 days ago (edited)

      But most of their claims are false. And how does it do anything for privacy? And if you’re going to say it obscures your IP address…

      • @Churbleyimyam@lemm.ee
        1 point · 17 days ago

        Just the fact that NordVPN claims to protect your privacy means that the average person hears about privacy a lot

      • @Paddy66@lemmy.mlOP
        1 point · 17 days ago

        It certainly makes me feel safer against big tech snooping. Is obscuring your IP address not useful? I genuinely want to hear the arguments for and against VPNs. And if they’re not effective, what are better ways we can protect ourselves?

        • @Churbleyimyam@lemm.ee
          2 points · 17 days ago (edited)

          VPNs hide which sites you visit from your ISP and anyone they share that information with. Here in the UK ISPs keep a record of every internet connection you make and pass it on to the government and perhaps others. Using a VPN here means that instead of them knowing every single website you visit, they just know you are using a VPN (or Tor, or a proxy etc, if that’s what you’re using). All they can tell from that data is what time you’re active online and how much data you upload/download, not which websites you’re visiting.

          The websites that you connect to at the other end can still determine who you are by means other than your IP address, such as unique information that your machine presents to them (fingerprinting). VPNs don’t protect against this.

          A VPN is like a private courier. What the recipient does with the delivered message (and what you’ve put in it) is out of the courier’s hands.
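
          A concrete way to see the IP part of this in action is to ask an IP-echo service what address it sees, once with the VPN off and once with it on. A rough sketch (https://api.ipify.org is just one example of such an echo service):

              # Rough sketch: print the public IP address the wider internet sees
              # for this machine. Run with the VPN off, then on - the address should
              # change from your ISP-assigned IP to the VPN exit server's IP.
              from urllib.request import urlopen

              def apparent_ip() -> str:
                  # api.ipify.org returns the caller's public IP as plain text
                  with urlopen("https://api.ipify.org", timeout=10) as resp:
                      return resp.read().decode().strip()

              if __name__ == "__main__":
                  print("Sites currently see this machine as:", apparent_ip())

          Note this only demonstrates the IP-masking part; it says nothing about the fingerprinting caveat above.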