She’s almost 70, spends all day watching QAnon-style videos (but in Spanish), and every day she’s anguished about something new. Last week she was asking us to start digging a nuclear shelter because Russia had dropped a nuclear bomb on Ukraine. Before that she was begging us to install reinforced doors because the indigenous population was about to invade the cities and kill everyone with poisonous arrows. I have access to her YouTube account and I’m trying to unsubscribe and report the videos, but the recommended videos keep feeding her more crazy shit.

  • Milk
    link
    fedilink
    2
    2 years ago

    You can’t, it’s like trying to fix your mom’s phone. The only way is resetting it, try resetting your mom.

    • @abbadon420@lemm.ee
      link
      fedilink
      English
      2
      2 years ago

      I don’t understand how these people can endure enough ads to be lured in by QAnon. People of that generation generally don’t know about decent ad blockers.

    • froggers
      link
      fedilink
      English
      7
      2 years ago

      At this point, any channel that I know is either bullshit or annoying af, I just block. Out of sight, out of mind.

      • youthinkyouknowme
        link
        fedilink
        English
        4
        2 years ago

        Same. I have ads blocked and open YouTube directly to my subbed channels only. Rarely open the home tab or check related videos because of the amount of click bait and bs.

        • froggers
          link
          fedilink
          English
          4
          2 years ago

          Ohh I just use BlockTube to block channels/ videos I don’t want to see.

    • @Thorny_Thicket@sopuli.xyz
      link
      fedilink
      English
      5
      2 years ago

      I find it interesting how some people have such a vastly different experience with YouTube than I do. I watch a ton of videos there, literally hours every single day, and basically all my recommendations are about stuff I’m interested in. I even watch occasional political videos, gun videos and police bodycam videos, but it’s still not trying to force any radical stuff down my throat. Not even when I click that button which asks if I want to see content outside my typical feed.

      • @bstix@feddit.dk
        link
        fedilink
        English
        2
        edit-2
        2 years ago

        The experience is different because it’s not one algorithm for everyone.

        Demographics are targeted differently. If you actually get a real feed, it’s only because no one has yet paid YouTube to guide you towards their product.

        It would be an interesting experiment to set up two identical devices and then create different Google profiles for each just to watch the algorithm take them in different directions.

      • Andreas
        link
        fedilink
        English
        5
        2 years ago

        I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I’m interested in.

        The algorithm’s goal is to get you addicted to Youtube. It has already succeeded. For the rest of us who watch one video a day, if at all, it employs more heavy-handed strategies.

      • @scottyjoe9@sh.itjust.works
        link
        fedilink
        English
        5
        2 years ago

        At one point I watched a few videos about Marvel films and the negatives about them. One was about how Captain Marvel wasn’t a good hero because she was basically invincible and all-powerful, etc. I started getting more and more suggestions about how bad the new strong female leads in modern films are. Then I started getting content about politically right-leaning shit. It started really innocuously, and it’s hard to figure out if it’s leading you a certain way until it gets further along. It really made me think when I’m watching content from new channels. Obviously I’ve blocked/purged all channels like that and my experience is fine now.

      • livus
        link
        fedilink
        6
        2 years ago

        My youtube is usually ok but the other day I googled an art exhibition on loan from the Tate Gallery, and now youtube is trying to show me Andrew Tate.

    • niktemadur
      link
      fedilink
      23
      edit-2
      2 years ago

      You watch this one thing out of curiosity, morbid curiosity, or by accident, and at the slightest poke the goddamned mindless algorithm starts throwing this shit at you.

      The algorithm is “weaponized” for whoever screams the loudest, and I truly believe it started due to myopic incompetence/greed, not political malice. Which doesn’t make it any better, as people don’t know how to protect themselves from this bombardment, but the corporations like to pretend that ~~they~~ people can, so they wash their hands for as long as they are able.

      Then on top of this, the algorithm has been further weaponized by even more malicious actors who have figured out how to game the system.
      That’s how toxic meatheads like Infowars and Joe Rogan get a huge bullhorn that reaches millions. “Huh… DMT experiences… sounds interesting”, the format is entertaining… and before you know it, you’re listening to anti-vax and QAnon excrement, and your mind starts to normalize the most outlandish things.

      EDIT: a word, for clarity

      • Jaywarbs
        link
        fedilink
        3
        2 years ago

        Whenever I end up watching something from a bad channel I always delete it from my watch history, in case that affects my front page too.

        • @Sludgehammer@lemmy.world
          link
          fedilink
          2
          edit-2
          2 years ago

          I do that, too.

          However I’m convinced that YouTube still has a “suggest list” bound to IP addresses. Quite often I’ll have videos that other people in my household have watched suggested to me. Some of it can be explained by similar interests, but it happens suspiciously often.

          • Drunemeton
            link
            fedilink
            4
            2 years ago

            I can confirm the IP-based suggestions!

            My hubs and I watch very different things. Him: photography equipment reviews, photography how-tos, and old, OLD movies. Me: Pathfinder 2e, quantum field theory/mechanics and Dip Your Car.

            Yet we both see stuff in the other’s Suggestions of videos the other recently watched. There’s ZERO chance based on my watch history that without IP-based suggestions YT is going to think I’m interested in watching a Hasselblad DX2 unboxing. Same with him getting PBS Space Time’s suggestions.

        • @emptyother@lemmy.world
          link
          fedilink
          English
          2
          2 years ago

          Huh, I tried that. Still got recommended incel videos for months after watching a moron “discuss” the Captain Marvel movie. Eventually I went through and clicked “don’t recommend this” on anything that showed up on my front page; that helped.

    • DaGuys470
      link
      fedilink
      7
      2 years ago

      Just this week I stumbled across a new YT channel that seemed to talk about some really interesting science. Almost subscribed, but something seemed fishy. Went on the channel and saw the other videos, immediately got the hell out. Conspiracies and propaganda lurk everywhere and no one is safe. Mind you, I’m about to get my bachelor’s degree next year, meaning I have received a proper scientific education. Yet I almost fell for it.

      • @Mikina@programming.dev
        link
        fedilink
        33
        edit-2
        2 years ago

        It’s even worse than “a lot easier”. Ever since the advances in ML went public, with things like Midjourney and ChatGPT, I’ve realized that ML models are way, way better at doing their thing than I’d thought.

        The Midjourney model’s purpose is to receive text and give out a picture. And it’s really good at that, even though the dataset wasn’t really that large. Same with ChatGPT.

        Now, Meta has (EDIT: just a speculation, but I’m 95% sure they do) a model which receives all the data they have about the user (which is A LOT) and returns which posts to show him and in what order, to maximize his time on Facebook. And it was trained for years on a live dataset of 3 billion people interacting daily with the site. That’s a wet dream for any ML model. Imagine what it would be capable of even if it were only as good at its task as ChatGPT - while having an incomparably better dataset and learning opportunities.

        I’m really worried for the future in this regard, because it’s only a matter of time before someone with power decides that the model should not only keep people on the platform, but also make them vote for X. And there is nothing you can do to defend against it, other than never interacting with anything with curated content, such as Google search, YT or anything Meta - because even if you know that there’s a model trying to manipulate you, the model knows there are a lot of people like that, and it’s already learning how to manipulate even people like that. After all, it has 3 billion test subjects.

        That’s why I’m extremely focused on privacy and my data - not that I have something to hide, but I take a really, really great issue with someone using such data to train models like that.
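The loop described above can be sketched in a few lines (all names and numbers here are hypothetical; a real feed model uses enormously richer inputs than a topic label, but the objective is the same):

```python
import random

class EngagementRanker:
    """Toy online ranker: learns which topic holds a user longest and
    always serves that topic. (Hypothetical sketch, not any real system.)"""

    def __init__(self, topics, lr=0.1):
        self.predicted_watch = {t: 0.0 for t in topics}
        self.lr = lr

    def pick(self):
        # Serve whatever the model currently expects to maximize watch time.
        return max(self.predicted_watch, key=self.predicted_watch.get)

    def update(self, topic, watch_time):
        # Nudge the per-topic estimate toward the observed watch time.
        self.predicted_watch[topic] += self.lr * (watch_time - self.predicted_watch[topic])

# Simulated user who lingers twice as long on outrage content.
random.seed(0)
ranker = EngagementRanker(["cooking", "science", "outrage"])
for _ in range(300):
    topic = random.choice(list(ranker.predicted_watch))  # exploration phase
    observed = 2.0 if topic == "outrage" else 1.0
    ranker.update(topic, observed)

print(ranker.pick())  # converges on whatever held attention longest
```

Nothing in the objective asks for radicalization; it is just the stable maximum of a watch-time reward whenever outrage happens to retain people best.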

        • @Cheers@sh.itjust.works
          link
          fedilink
          6
          2 years ago

          Just to let you know, Meta has an open-source model, LLaMA, and it’s basically state of the art for the open-source community, but it falls short of GPT-4.

          The nice thing about the LLaMA branches (Vicuna and WizardLM) is that you can run them locally with about 80% of ChatGPT 3.5’s efficiency, so no one is tracking your searches/conversations.

          • @Mikina@programming.dev
            link
            fedilink
            2
            2 years ago

            I was using ChatGPT only as an example - I don’t think that making a chatbot AI is their focus, so it’s understandable that they are not as good at it - plus, I’d guess that making coherent text is a lot harder than deciding what kind of videos or posts to put in someone’s feed.

            And that AI, the one that takes user data as input and outputs what to show in their feed to keep them glued to Facebook for as long as possible, is, I’m almost sure, one of the best ML models we have in the world right now - simply because of the user base and the time it has had to learn, and the sheer amount of data Meta has about users. But that’s also something that will never be made public, naturally.

    • nLuLukna
      link
      fedilink
      English
      15
      2 years ago

      Reason and critical thinking are all the more important in this day and age. They’re just no longer taught in schools. Learn some simple key skills, like noticing fallacies and analogous reasoning, and you will find that your view on life is far more grounded and harder to shift.

      • @cynar@lemmy.world
        link
        fedilink
        English
        9
        2 years ago

        Just be aware that we can ALL be manipulated, the only difference is the method. Right now, most manipulation is on a large scale. This means they focus on what works best for the masses. Unfortunately, modern advances in AI mean that automating custom manipulation is getting a lot easier. That brings us back into the firing line.

        I’m personally an Aspie with a scientific background. This makes me fairly immune to a lot of manipulation tactics in widespread use. My mind doesn’t react how they expect, and so it doesn’t achieve the intended result. I do know however, that my own pressure points are likely particularly vulnerable. I’ve not had the practice resisting having them pressed.

        A solid grounding gives you a good reference, but no more. As individuals, it is down to us to use that reference to resist undue manipulation.

          • @cynar@lemmy.world
            link
            fedilink
            English
            1
            2 years ago

            The only way you can’t be manipulated is if you are dead. All human interaction is manipulation of some sort or another. If you think you’re immune, you’re likely very vulnerable if it’s delivered in the correct way, since you’re not bothering to guard against it.

            An interesting factoid I’ve run across a few times: smart people are far easier to rope into cults than stupid people. The stupid have experienced that sort of manipulation before, and so have some defenses against it. The smart people assume they wouldn’t be caught up in something like that, and so drop their guard.

            In the words of Mad-Eye Moody: “Constant vigilance!”

      • Dark Arc
        link
        fedilink
        English
        15
        2 years ago

        I think it’s worth pointing out “no longer” is not a fair assessment since this is regularly an issue with older Americans.

        I’m inclined to believe it was never taught in schools, and it’s probably a subject teachers are increasingly likely to want to teach (i.e. if politics didn’t enter the classroom it would already be being taught, and might be in some districts).

        The older generations were given catered news their entire lives, only in the last few decades have they had to face a ton of potentially insidious information. The younger generations have had to grow up with it.

        A good example is that old people regularly click malicious advertising, fall for scams, etc.; they’re generally not good at applying critical thinking to a computer, whereas younger people (typically, though I hear this is regressing some with smartphones) know about this stuff and are used to validating their information (or at least have a better “feel” for what’s fishy).

      • @MonkCanatella@sh.itjust.works
        link
        fedilink
        English
        3
        2 years ago

        Imagine if they taught critical media literacy in schools. Of course that would only be critical media literacy with an American propaganda backdoor, but still.

    • @masquenox@lemmy.world
      link
      fedilink
      English
      5
      edit-2
      2 years ago

      I have to clear out my youtube recommendations about once a week… no matter how many times I take out or report all the right-wing garbage, you can bet everything that by the end of the week there will be a Jordan Peterson or PragerU video in there. How are people who aren’t savvy to the right-wing’s little “culture war” supposed to navigate this?

    • static
      link
      fedilink
      20
      edit-2
      2 years ago

      My normal YT algorithm was OK, but Shorts tried to pull me to the alt-right.
      I had to block many channels to get a sane Shorts algorithm.

      “Do not recommend channel” really helps

      • @AstralPath@lemmy.ca
        link
        fedilink
        6
        2 years ago

        It really does help. I’ve been heavily policing my YouTube feed for years, and I can easily see when they make big changes to the algorithm because it tries to force-feed me polarizing or lowest-common-denominator content. Shorts are incredibly quick to smother me in rage bait, and if you so much as linger on one of those videos too long, you’re getting a cascade of alt-right bullshit shortly after.

      • Andreas
        link
        fedilink
        5
        2 years ago

        Using Piped/Invidious/NewPipe/insert your preferred alternative frontend or patched client here (YouTube’s legal threats are empty; these are still operational) helps even more, by showing you only the content you have opted in to.

    • @Mikina@programming.dev
      link
      fedilink
      English
      53
      edit-2
      2 years ago

      My personal opinion is that it’s one of the first large cases of misalignment in ML models. I’m 90% certain that Google and other platforms have for years already been using ML models that take a user’s history and the data they have about him as input, and produce the videos to offer him as output, with the goal of maximizing the time he spends watching videos (or on Facebook, etc).

      And the models eventually found out that if you radicalize someone, isolate them into a conspiracy that will make them an outsider or a nutjob, and then provide a safe space and an echo chamber on the platform, be it Facebook or YouTube, they will eventually start spending most of their time there.

      I think this subject was touched upon in The Social Dilemma movie, but given what is happening in the world and how conspiracies and disinformation seem to be getting more and more common and people more radicalized, I’m almost certain the algorithms are to blame.

      • archomrade [he/him]
        link
        fedilink
        English
        3
        2 years ago

        100% they’re using ML, and 100% it found a strategy they didn’t anticipate

        The scariest part of it, though, is their willingness to continue using it despite the obvious consequences.

        I think misalignment is not only likely to happen (for an eventual AGI), but likely to be embraced by the entities deploying them because the consequences may not impact them. Misalignment is relative

      • @Ludrol@szmer.info
        link
        fedilink
        English
        16
        2 years ago

        If YouTube’s “algorithm” is optimizing for watch time, then the most optimal solution is to make people addicted to YouTube.

        The scariest thing, I think, is that the way to optimize the reward is not to recommend good videos but to reprogram a human to watch as much as possible.
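As a minimal sketch of that point (the strategies and numbers are invented): if the reward is watch time alone, a quality signal sitting right there in the data cannot affect the outcome:

```python
# Two candidate strategies, scored only on total watch time.
# "quality" exists in the data but is invisible to the objective.
candidates = [
    {"strategy": "recommend_the_best_video", "watch_minutes": 12, "quality": 9},
    {"strategy": "keep_them_hooked", "watch_minutes": 45, "quality": 2},
]

def reward(candidate):
    # The optimizer's entire view of the world: minutes watched, nothing else.
    return candidate["watch_minutes"]

best = max(candidates, key=reward)
print(best["strategy"])  # quality never entered the objective
```

Whatever is left out of the reward function simply cannot be optimized for, no matter how much the operators might claim to care about it.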

        • @Mikina@programming.dev
          link
          fedilink
          English
          7
          2 years ago

          I think that making someone addicted to YouTube would be harder than simply slowly radicalizing them into a shunned echo chamber around a conspiracy theory. If you try to make someone addicted to YouTube, they still have an alternative in the real world, friends and family to return to.

          But if you radicalize them into something that makes them seem like a nutjob, you don’t have to compete with their surroundings - the only place where they are understood is on YouTube.

      • @MonkCanatella@sh.itjust.works
        link
        fedilink
        English
        2
        2 years ago

        Fuck, this is dark and almost awesome, but not in a good way. I was thinking the fascist funnel was something of a deliberate thing, but maybe these engagement algorithms have more to do with it than large shadow actors putting the funnels into place. Then there are the folks who will create any sort of content to game the algorithm, and you’ve got a perfect trifecta of radicalization.

        • @floofloof@lemmy.ca
          link
          fedilink
          English
          6
          edit-2
          2 years ago

          Fascist movements and cult leaders long ago figured out the secret to engagement: keep people feeling threatened, play on their insecurities, blame others for all the problems in people’s lives, use fear and hatred to cut them off from people outside the movement, make them feel like they have found a bunch of new friends, etc. Machine learning systems for optimizing engagement are dealing with the same human psychology, so they discover the same tricks to maximize engagement. Naturally, this leads to YouTube recommendations directing users towards fascist and cult content.

          • @MonkCanatella@sh.itjust.works
            link
            fedilink
            English
            2
            2 years ago

            That’s interesting: it’s almost a coincidence that fascists and engagement algorithms have similar methods to suck people in.

    • Entropywins
      link
      fedilink
      12
      2 years ago

      I watch a lot of history, science, philosophy, stand-up, jam bands and happy uplifting content… I am very much feeding my mind lots of goodness and love it…

    • Atemu
      link
      fedilink
      English
      13
      2 years ago

      YouTube’s entire business is propaganda: Ads.

      • @martyc3@lemm.ee
        link
        fedilink
        English
        2
        2 years ago

        Lately the number of ads on YouTube has increased by an order of magnitude. What they managed to accomplish was driving me away.

  • ???
    link
    fedilink
    31
    2 years ago

    This is a suggestion with questionable morality, BUT a new account with reasonable subscriptions might be a good solution. That being said, if my child tried to patronize me about my conspiracy theories, I wouldn’t like it, and would probably flip if they tried to change my accounts.

    • Dark Arc
      link
      fedilink
      6
      edit-2
      2 years ago

      Yeah the morality issue is the hard part for me… I’ve been entrusted by various people in the family to help them with their technology (and by virtue of that not mess with their technology in ways they wouldn’t approve of), violating that trust to stop them from being exposed to manipulative content seems like doing the wrong thing for the right reasons.

      • @Historical_General@lemmy.world
        link
        fedilink
        -1
        edit-2
        2 years ago

        Really? That seems far-fetched. Various people in the family specifically want you not to mess with their technology?

        If the algorithm is causing somebody to become a danger to others and potentially themselves, I’d expect that, in their right state of mind, one would appreciate proactive intervention.

        eg. archive.is/https://www.theguardian.com/media/2019/apr/12/fox-news-brain-meet-the-families-torn-apart-by-toxic-cable-news

        • ???
          link
          fedilink
          10
          2 years ago

          I think this is pretty much what it boils down to. Where do you draw the line between having the right to expose yourself to media that validates your world view and this media becoming a threat to you to a point where you require intervention?

          I’ve seen enough people discussing their family’s QAnon casualties to recognize it’s a legitimate problem, not to mention tragic in many cases, but I would still think twice before ‘tricking’ someone. What if she realizes what’s happening and becomes more radicalized by it? I find that direct conversation, discussion, and confrontation work better; at least that’s what worked with family members who believe in bullshit.

          That being said, the harmful effects of being trapped in a bubble by an algorithm are not largely disputed.

          • Wondering if a QAnon person would be offended at you deradicalising them seems like overthinking - it’s certainly possible, but most likely fine to do anyway. The only case where you’d think twice is if something similar has happened before, or if this person has a pattern of falling into bad habits/cultish behaviour in the past. In which case you have a bigger problem on your hands, or just a temporary one.

            • Consider it from a different angle - if a techy Q-anon “fixed” the algorithm of someone whose device they had access to due to tech help. That person would rightfully be pissed off, even if the Q-anon tech nerd explained that it was for their own good and that they needed to be aware of this stuff etc.

              Obviously that’s materially different to the situation at hand, but my point is that telling someone that what you’ve done is necessary and good will probably only work after it’s helped. Initially, they may still be resistant to the violation of trust.

              If I think of how I would feel in that situation, I feel a strong flare of indignant anger that I could see possibly manifesting in an “I don’t care about your reasons, you still shouldn’t have messed with my stuff” way, and then fermenting into further ignorance. If I don’t let the anger rise and instead sit with the discomfort, I find a bunch of shame - still feeling violated by the intervention, but sadly realising it was not just necessary, but actively good that it happened, so I could see sense. There’s also some fear from not trusting my own perceptions and beliefs following the slide in reactionary thinking. That’s a shitty bunch of feelings, and I only think that’s the path I’d be on because I’m unfortunately well experienced in being an awful person and doing the work to improve. I can definitely see how some people might instead double down on the anger route.

              On a different scale of things, imagine if one of my friends who asked for tech help was hardcore addicted to a game like WoW, to the extent that it was affecting her life and wellbeing. Would it be acceptable for me to uninstall it and somehow block any attempts to reinstall? For me, the answer is no. This feels different to the QAnon case, but I can’t articulate why exactly.

              • @Historical_General@lemmy.world
                link
                fedilink
                -1
                edit-2
                2 years ago

                Better to be embarrassed temporarily than lose a decade of precious time with your family on stuff that makes you angry on the internet.

                You’re seeing a person who freely made choices here, perhaps like the gamer, but I see a victim of opportunists on YouTube, who may have clicked on something thinking it was benign and unknowingly let autoplay and the recommendation algorithm fry their brain.

                You probably think the gamer situation is different because they, unlike the boomer, are aware of what’s happening and are stuck in a trap of their own making. And yes, in such a situation, I’d talk it out with them before I did anything since they’re clearly (in some ways) more responsible for their addiction, even though iirc some people do have a psychological disposition that is more vulnerable that way.

                edit: I want to clarify that I do care about privacy, it’s just that in these cases of older angry relatives (many such cases), I prioritise mental health.

            • ???
              link
              fedilink
              3
              2 years ago

              I guess despite them being QAnon, I still see and believe in the human in them, and their ultimate right to believe stupid shit. I don’t think it’s ever ‘overthinking’ when it comes to another human being’s privacy and freedom. I actually think it’s bizarre to quickly jump into this and decide to alter the subscriptions behind their back like they’re a 2-year-old child without even the perception to understand basic shit.

              • @Historical_General@lemmy.world
                link
                fedilink
                -4
                edit-2
                2 years ago

                Nobody said this had to be an instant/quick reaction to anything. If you can see that somebody has ‘chosen’ to fall into a rabbithole and self-destruct, becoming an angrier, more hateful (bigoted) and miserable person because of algorithms, dark patterns and unnatural infinite content spirals, I’d recognise that it isn’t organic or human-made but is in fact done for the profit motive of a large corporation, and do the obvious thing.

                If you’re on Lemmy because you hate billionaire interference in your life, why allow it to psychologically infect your family far more insidiously on YouTube?

        • Dark Arc
          link
          fedilink
          2
          2 years ago

          I reworded my comment to clarify (my original wording was a bit clumsy).

          I don’t really think they’re a danger to others, any more than their policy positions, in my opinion, are harmful to some percentage of the population. I.e., they’re not worried about indigenous populations invading and killing people with poison arrows, but they do buy into some of the anti-establishment doctors when it comes to issues like COVID vaccination.

          It’s kind of like “I don’t think you’re a great driver, but I don’t think you’re such a bad driver I should be trying to subvert your driving.” Though it’s a bit of a hard line to draw…

          • @Duranie@lemmy.film
            link
            fedilink
            5
            2 years ago

            After watching a hospice patient cry because (according to her) the Dr interviewed on Fox News talked about how he doesn’t do abortions anymore after performing a late term abortion where the mother went into labor and delivered the baby before he could kill it, so he cleaned up the baby and consoled it as he discussed with the parents their options on how to dispatch it after the fact. She was inconsolable. But in drinking Fox’s Kool aid, it was the only channel she would watch.

            For moral reasons I will take any opportunity to nudge the vulnerable away from the harm certain entities create.

            • Dark Arc
              link
              fedilink
              English
              1
              2 years ago

              After watching a hospice patient cry because (according to her) the Dr interviewed on Fox News talked about how he doesn’t do abortions anymore after performing a late term abortion where the mother went into labor and delivered the baby before he could kill it, so he cleaned up the baby and consoled it as he discussed with the parents their options on how to dispatch it after the fact. She was inconsolable. But in drinking Fox’s Kool aid, it was the only channel she would watch.

              I don’t understand what happened in this story.

              I think it’s hard to have a universal morality. I wouldn’t want my family forcing their moral judgements on me if the roles were reversed. E.g. I’m not a car guy, but my family members wouldn’t (even if they could) make it so my car only drives to “approved” locations.

              Like the other commenter said, I think it’s better to talk about these issues, though that too can be hard; I can’t say I’ve made much visible traction.

          • Well, I assume neither you nor I are psychologists who can determine what one person may or may not do. However, these algorithms are confirmed to be dangerous when left unchecked on a mass level - e.g. the genocide in Burma that occurred with the help of FB.

            Ultimately, if I had a relative in those shoes, I’d want to give them the opportunity to be the best person they can be, not a hateful, screen-addicted crazy person. These things literally change their personality in the worst cases. Not being proactive is cowardly/negligent on the part of the person with the power to do it, imo.

        • @Duranie@lemmy.film
          link
          fedilink
          5
          2 years ago

          Maybe not so far fetched. I work in hospice, with the vast majority of the patients I see in their 75-95+yo range. While most have no interest in technology, it’s not uncommon for the elderly to have “that grandchild” that helps everyone set up their cell phone, “get the Netflix to work,” set up Ring doorbells, etc. I’ve even known some to ask their grandchild to help their equally elderly neighbor (who doesn’t have any local family) with their new TV. It’s a thing.

  • @Laticauda@lemmy.world
    link
    fedilink
    65
    2 years ago

    Do the block and “not interested” stuff, but also clear her history and then search for videos you want her to watch. Cover all your bases. You can also try looking up debunking and deprogramming videos that are specifically aimed at fighting the misinformation and brainwashing.

    • @Today@lemmy.world
      link
      fedilink
      15
      2 years ago

      This is a really good idea - so she begins to see the videos of people who were once where she is now.

  • @some_guy@lemmy.sdf.org
    link
    fedilink
    36
    2 years ago

    I listen to podcasts that talk about conspiratorial thinking and they tend to lean on compassion for the person who is lost. I don’t think that you can brow-beat someone into not believing this crap, but maybe you can reach across and persuade them over a lot of time. Here are two that might be useful (the first is on this topic specifically, the second is broader):

    https://youarenotsosmart.com/transcript-qanon-and-conspiratorial-narratives/

    https://www.qanonanonymous.com

    I wish you luck and all the best. This stuff is awful.

    • @driving_crooner@lemmy.eco.brOP
      link
      fedilink
      32 years ago

      It has to be content in Spanish, which makes it a little difficult because I watch content exclusively in English, so I don’t know what to recommend to her.

      • You should consider asking for recommendations somewhere with more Spanish-speaking people, because giving the algorithm something new to recommend is necessary to keep the shit away.

  • @choquel@lemm.ee
    link
    fedilink
    12 years ago

    i dont know but im just glad youre alive bc the fish bumped you out of the water with its nose

  • Ech
    link
    fedilink
    472 years ago

    Go into the viewing history and wipe it. Then maybe view some more normal stuff to set it on a good track. That should give a good reset for her, though it wouldn’t stop her from just looking it up again, of course.

  • alphacyberranger
    link
    fedilink
    62 years ago

    I too faced this dilemma. So I uninstalled every ad blocker and made watching videos very tedious. It kinda helped.

  • Malcriada Lala
    link
    fedilink
    692 years ago

    In addition to everything everyone here said, I want to add this: don’t underestimate the value of adding new benign topics to her feed. Does she like cooking, gardening, DIY, art content? Find a playlist from a creator and let it autoplay. The algorithm will pick it up and start to recommend that creator and others like it. You just need to “confuse” the algorithm so it starts to cater to different interests. I wish there was a way to block or mute entire subjects on there. We need to protect our parents from this mess.

  • @ilco@feddit.nl
    link
    fedilink
    32 years ago

    Block YouTube/Facebook/any social media through DNS with Pi-hole for the time being, and in the meantime try to reset her YouTube account (or delete it and create a new one, if she doesn’t use Gmail).
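
    One way to sketch the DNS side is a hosts-format blocklist that Pi-hole can pull in as a local custom list; the domains below are illustrative, not a complete set. A caveat: this only works while her devices actually use the Pi-hole as their resolver - apps that use DNS-over-HTTPS or hardcoded IPs will slip past it.

    ```
    # Illustrative hosts-format blocklist for Pi-hole (0.0.0.0 = sinkhole)
    0.0.0.0 youtube.com
    0.0.0.0 www.youtube.com
    0.0.0.0 m.youtube.com
    0.0.0.0 facebook.com
    0.0.0.0 www.facebook.com
    ```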

  • @elbowdrop@lemmy.world
    link
    fedilink
    72 years ago

    Idk about here, but if you need help, on Reddit there is a sub called Qanoncasualties, and it’s basically a support group for family members of Q whoevers.

  • @masquenox@lemmy.world
    link
    fedilink
    202 years ago

    You can’t “de-radicalize” your mom because your mom has never been “radicalized” in the first place - if she was, she’d be spray-painting ACAB on the walls and quoting Bakunin or Marx at everyone.

    Your mom has been turned into a reactionary - pretty much the opposite of that which is radical. And since you have access to your mom’s youtube account, it’s radical content that is required to start levering open the cognitive dissonance that right-wing content attempts to paper over.

    However, it can’t just be any radical content - it has to be radical content that specifically addresses things that are relevant to your mom. I’ll use my own mom as an example: she has always been angry about the fact that my (late) father essentially tossed away the family’s nest-egg in his obsessive attempts to become a successful business owner. I showed her some videos explaining how this neolib cult of business-ownership was popularized by the likes of Reagan and Thatcher (which she had no difficulty believing, because my dad was a rabid right-winger who constantly brought these two fuckheads up in conversation during the 80s), and that led to a lot of good conversations.

    Obviously, your mom will not be as receptive to new information as mine is - so you may have to be a bit sneakier about it. But I don’t see too many other options for you at this point.

    • @aidan@lemmy.world
      link
      fedilink
      32 years ago

      Radicalization !== Radical Marxist. Don’t forget radicalism the ideology! But yes, definitions can change: etymologically and in modern language, political radicalism is pushing for a change at the root of something. I think it is pretty fair to say OP’s mom could fit this. Radicalism does not have to be, and often is not, leftist.

      Furthermore, in my opinion, reactionary is not a descriptive word - it is wholly used as an insult. It doesn’t describe an ideology; it is just a pejorative used to insult ideological opponents. You can again tell by the fact that nobody unironically considers themselves a reactionary.

      • @masquenox@lemmy.world
        link
        fedilink
        02 years ago

        radicalism politically is to be pushing for a change at the root of something

        You mean that thing right-wing ideology exists to prevent? There is no such thing as a “right-wing radical” - right-wing ideology exists to protect the status quo and destroy or co-opt that which advocates “change at the root…” it has no reason to exist otherwise.

        I think it is pretty fair to say OP’s mom could fit this.

        The paranoid racism displayed by OP’s mom isn’t “radical” - it’s bog-standard right-wing white supremacist colonialist paranoia.

        You can again tell by the fact that nobody uninronically considers themselves a reactionary.

        They also generally do not self-identify as fascists or white supremacists, and you can easily anger most right-wingers even by just calling them right-wingers - so your point is… what?

        • @aidan@lemmy.world
          link
          fedilink
          12 years ago

          You mean that thing right-wing ideology exists to prevent?

          Right wing !== The status quo

          There is no such thing as a “right-wing radical”

          You did not click the link. You did forget Radicalism.

          right-wing ideology exists to protect the status quo and destroy or co-opt that which advocates “change at the root…”

          So in a socialist country, is advocating for liberalism or monarchism or whatever left-wing, and advocating for socialism right-wing? Were the Nazis left-wing in the Weimar Republic?

          The paranoid racism displayed by OP’s mom isn’t “radical”

          Iirc there was no mention of racism, and I assume based on the mention of “videos in Spanish” that OP’s mom is more than likely Latina. But yeah, QAnon is pretty radical, seeing as they definitely aren’t the status quo.

          They also generally do not self-identify as fascists or white supremacists

          There are plenty who do, but I’ve never heard of someone identifying as reactionary.

          you can easily anger most right-wingers even by just calling them right-wingers

          I’ve never seen a right-winger get upset by that. Maybe, because your definition of right-wing differs a lot from what most people would consider right-wing, they feel they are being mislabeled or mischaracterized.

  • @MrFagtron9000@lemmy.world
    link
    fedilink
    52 years ago

    I had to log into my 84-year-old grandmother’s YouTube account, unsubscribe from a bunch of stuff, hit “Not interested” on a bunch of stuff, and subscribe to more mainstream news sources… But it only works for a couple of months.

    The problem is the algorithm that values viewing time over anything else.

    Watch a news clip from a real news source and then it recommends Fox News. Watch Fox News and then it recommends PragerU. Watch PragerU and then it recommends The Daily Wire. Watch that and then it recommends Steven Crowder. A couple of years ago it would go even stupider than Crowder - she’d start getting those videos where a computer voice talks over stock footage about Hillary Clinton being arrested for being a demonic pedophile. Luckily, most of those channels are banned at this point, or at least the algorithm doesn’t recommend them.

    I’ve thought about putting her into restricted mode, but I think that would make it too obvious that I’m pulling the strings in the background.

    Then I thought: she’s 84, she’s going to be dead in a few years, she doesn’t vote - does it really matter that she’s concerned about trans people trying to cut off little boys’ penises, or thinks that Obama is wearing an ankle monitor because he was arrested by the Trump administration, or that aliens are visiting the Earth because she heard it on Joe Rogan?

  • @Chunk@lemmy.world
    link
    fedilink
    442 years ago

    I curate my feed pretty often so I might be able to help.

    The first, and easiest, thing to do is to tell Youtube you aren’t interested in their recommendations. If you hover over the name of a video then three little dots will appear on the right side. Clicking them opens a menu that contains, among many, two options: Not Interested and Don’t Recommend Channel. Don’t Recommend Channel doesn’t actually remove the channel from recommendations, but it will discourage the algorithm from recommending it as often. Not Interested will also inform the algorithm that you’re not interested; I think it discourages the entire topic, but it’s not clear to me.

    You can also unsubscribe from channels that you don’t want to see as often. Youtube will recommend you things that were watched by other people who are also subscribed to the same channels you’re subscribed to. So if you subscribe to a channel that attracts viewers with unsavory video tastes then videos that are often watched by those viewers will get recommended to you. Unsubscribing will also reduce how often you get recommended videos by that content creator.

    Finally, you should watch videos you want to watch. If you see something that you like then watch it! Give it a like and a comment and otherwise interact with the content. Youtube knows when you see a video and then go to the Channel’s page and browse all their videos. They track that stuff. If you do things that Youtube likes then they will give you more videos like that because that’s how Youtube monetizes you, the user.

    To de-radicalize your mom’s feed I would try to

    1. Watch videos that you like on her feed. This introduces them to the algorithm.
    2. Use Not Interested and Don’t Recommend Channel to slowly phase out the old content.
    3. Unsubscribe from some channels she doesn’t watch a lot, so she won’t notice.
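
    Step 3 can also be done in bulk. A minimal sketch, assuming the YouTube Data API v3 (an authenticated client would page through `subscriptions().list` and call `subscriptions().delete` on each returned id; the channel titles below are made up, and only the filtering logic is shown):

```python
# Sketch: choose which subscriptions to drop, given items shaped like the
# YouTube Data API v3 subscriptions.list response. Channel titles are
# hypothetical; the authenticated API client itself is not shown.

def channels_to_remove(subscriptions, blocklist):
    """Return the subscription ids whose channel title is on the blocklist."""
    return [
        sub["id"]
        for sub in subscriptions
        if sub["snippet"]["title"] in blocklist
    ]

if __name__ == "__main__":
    subs = [
        {"id": "sub-1", "snippet": {"title": "Cute Cats Daily"}},
        {"id": "sub-2", "snippet": {"title": "Conspiracy Hour"}},
    ]
    # With a real client you would then call
    # youtube.subscriptions().delete(id=...) for each returned id.
    print(channels_to_remove(subs, {"Conspiracy Hour"}))
```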
    • @Beanerrr@lemmy.world
      link
      fedilink
      3
      edit-2
      2 years ago

      I can confirm that this works quite well. I use these tactics all the time on my parents’ feed to keep them from watching too much “crap” news, pardon my French. There’s a very notorious news channel in our country that insists on feeding bad (and only the bad) news - I often remove their channel from the suggested feed and play videos of funny fails/wins, cute cats, Daily Dose of Internet, and other happy nonsense.

      Give us an update sometime @driving_crooner@lemmy.eco.br , hope all goes well with your mom.

    • @Ducks@ducks.dev
      link
      fedilink
      72 years ago

      OP, listen to this comment. YouTube’s goal is to feed you as much related content as possible to keep you on the site as long as possible. Radical content or otherwise, any engagement is a positive to them. You can spend some time curating the feed so that the algorithm works in your favor, and the algorithm will adjust very quickly to new interests.