
Published Dec 31, 2025 • S1 E2
SUMMARY
This episode unpacks how social media algorithms work, their impact on our behavior, and how to take control of your digital experience. Learn how to recognize manipulation and train your algorithm for a healthier online life.
Key Topics
- How social media algorithms work
- The impact of algorithms on mental health
- Strategies to control and train your algorithm
Sound Bites
- “The algorithm learns what makes you tick.”
- “It knows what content keeps you engaged.”
- “The algorithm tests and refines on you.”
Episode Timeline
00:00 Understanding Algorithms: The Basics
04:28 The Algorithm’s Influence on Behavior
09:19 Training Your Algorithm for Positive Engagement
13:26 Empowering Kids: Conversations About Algorithms
Schnelle Acevedo (00:02)
Welcome back to Smart with Screens. I’m Schnelle Acevedo. Last episode, I told you why a digital marketer is teaching digital literacy. Today, I’m pulling back the curtain on the thing that makes social media so addictive: algorithms. I spent 14 years using these systems to sell products for major brands. I know how they work. I know what they’re designed to do. And I’m going to explain it to you in plain language, no jargon, no tech speak needed.
Because once you understand how the algorithm works, you can start to work with it instead of against it. And more importantly, you can teach your kids to do the same. Let’s get into it. So basically, what is an algorithm? In the simplest of terms, an algorithm is just a set of instructions. A recipe is an algorithm. Turn by turn directions are an algorithm. Shout out to MapQuest. If this, then that.
On social media, the algorithm is the system that decides what you see when you open up the app. It’s why your Instagram feed looks different from your friend’s feed, even though you follow some of the same people. It’s why YouTube keeps suggesting certain videos. It’s why TikTok feels like it can read your mind, because honestly, my For You page is absolutely 100% for me.
The algorithm is watching what you do, what you click, what you watch, what you scroll past, how long you look at something, and then it’s making predictions on what you will want to see next. And here’s the key thing to understand. The algorithm’s job is not to show you what’s good for you. Its job is to keep you on the platform for as long as possible. Because the longer you’re on the platform, the more ads you see. The more ads you see, the more money the company makes. That’s it.
That’s the whole business model. So when we’re talking about the algorithm, we’re talking about a system designed to figure out what keeps you specifically watching, clicking and scrolling. And then they give you more of that. It’s not magic. It’s not evil. It’s just optimized for engagement, not your wellbeing.
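The “optimized for engagement, not your wellbeing” loop described here can be sketched as a toy model. Every name, weight, and number below is invented for illustration; no real platform’s ranking system is anywhere near this simple:

```python
# Toy model of an engagement-optimized feed. All topics, weights, and the
# "emotional intensity" boost are illustrative, not any platform's real system.

def predicted_engagement(post, user_interests):
    """Score a post by how likely this user is to keep watching it."""
    score = user_interests.get(post["topic"], 0.0)
    # Emotionally charged content tends to hold attention longer, so the
    # toy model boosts it -- regardless of whether it's good for the viewer.
    if post["emotional_intensity"] > 0.7:
        score *= 1.5
    return score

def rank_feed(posts, user_interests):
    """The feed is just posts sorted by predicted engagement, not by quality."""
    return sorted(posts, key=lambda p: predicted_engagement(p, user_interests),
                  reverse=True)

user_interests = {"parenting": 0.9, "sports": 0.8, "gardening": 0.1}
posts = [
    {"id": 1, "topic": "gardening", "emotional_intensity": 0.2},
    {"id": 2, "topic": "parenting", "emotional_intensity": 0.9},  # anxiety-bait
    {"id": 3, "topic": "sports",    "emotional_intensity": 0.3},
]
feed = rank_feed(posts, user_interests)
print([p["id"] for p in feed])  # → [2, 3, 1]: the anxiety-bait post ranks first
```

Notice there is no “is this good for the user?” term anywhere in the score; that is the whole point.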
Now let me tell you something embarrassing. I know how algorithms work. I’ve built campaigns that use them. I teach people about them and I still fall victim to them. Doom scrolling is a real thing and I’ve done it more times than I want to admit, often to just fall asleep. I’ll pick up my phone to check one thing, maybe a message, maybe the weather, and somehow 45 minutes later, I’m deep in my Instagram feed looking at content that’s making me feel not great.
A little comparison, a big drop of anxiety, a big scoop of FOMO, that weird mix of videos that I can’t look away from but that makes me wonder, why am I still looking at this?
The algorithm learned what made me tick. It knows I’m a parent, so it shows me parenting content. It also knows that I’m interested in digital marketing, food blogging, the New York Knicks, the New York Giants, and pretty much anything sports. It knows me. It knows what keeps me engaged. And it knows because it’s tracked my behavior and learned that I’m more likely to keep scrolling when I’m seeing content that makes me slightly anxious or inadequate,
because that emotional response keeps me engaged. So it feeds me more of that. Not because it hates me, but because it’s doing exactly what it was designed to do: keep me on the platform. But here’s what’s changed for me recently. I’ve started to notice in the moment what’s happening.
I’ll be scrolling and I’ll catch myself thinking, wait, how does this video make me feel? And if the answer is anxious, inadequate, angry, or just drained, I’ll scroll away. I’ll close the app. And most likely I’ll probably say, nope, not today, algorithm, before I get really sucked in. It doesn’t always work, and I’m not perfect at this, but I’m getting better at recognizing when I’m being manipulated, not by the content creator, but by the system that decided to show me that content right now.
And that awareness, that is the difference between being controlled by the algorithm and having some agency over it.
So let me break down how this actually works, using Instagram as an example, though this applies to TikTok, YouTube, Facebook, every single one of them. The algorithm is watching every single thing you do. And I mean everything. It’s watching what posts you like or comment on, what posts you save, what accounts you visit directly, how long you watch a video before scrolling past, what you search for, who you DM, what time of day you’re most active,
and even what you don’t interact with. If you’re consistently scrolling past certain types of content, the algorithm will learn from that too.
Every single action is a data point.
If you’ve watched five cooking videos in a row, it’s going to show you more cooking content. If you tend to engage with posts about politics, you’ll see more political content. If you watch videos with a certain style or topic all the way to the end, you’ll get more of those. It’s not showing you what you asked for. You didn’t say, show me more cooking videos. It’s showing you what your behavior suggests you’ll engage with.
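That “what your behavior suggests” idea can be sketched as a simple tally over a made-up watch history. The topics and action names here are hypothetical, not any platform’s real event types:

```python
from collections import Counter

# Illustrative sketch: the platform never asks what you want; it infers
# interests from behavior. Every topic and action below is made up.

watch_history = [
    ("cooking",  "watched_to_end"),
    ("cooking",  "watched_to_end"),
    ("cooking",  "liked"),
    ("politics", "commented"),
    ("cooking",  "watched_to_end"),
    ("cooking",  "saved"),
    ("news",     "scrolled_past"),  # skipping is a signal too
]

interest = Counter()
for topic, action in watch_history:
    if action == "scrolled_past":
        interest[topic] -= 1   # consistently skipped: show less
    else:
        interest[topic] += 1   # any engagement counts in favor

# The "next up" topic is whatever behavior suggests, not what was asked for.
next_topic = interest.most_common(1)[0][0]
print(next_topic)  # → cooking
```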
Next, the algorithm tests and refines a lot. So this is a part that people don’t always realize. The algorithm is constantly experimenting on you. It’ll show you something slightly different from your usual content to see if you engage with it. And I’ll give you my own example. As I mentioned before, I’m a big fan of the New York Knicks and I engage with everything that I see online. Whether it’s a Jalen Brunson buzzer beater or it is the coaches talking, I will engage with everything.
But recently it started to show me a lot of gambling content. I said, no, that’s not for me. So I immediately hit not interested.
The thing is, the algorithm will show you something slightly different from your usual content to see if you engage with it. If you do, it learns, oh, they like this type of content. If you don’t, it learns, okay, stick with what works.
It’s like those choose-your-own-adventure books, except you’re choosing with your clicks and your scroll speed, and your algorithm is writing the next page based on your choices.
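This test-and-refine cycle is essentially what recommender-systems people call an explore/exploit loop. A heavily simplified sketch, with invented topic names and an invented 10% explore rate:

```python
import random

# Toy explore/exploit loop: mostly show what already works (exploit),
# occasionally test something new (explore), and learn from the reaction.
# Topics and the 10% explore rate are illustrative, not real platform values.

def pick_next(known_favorites, all_topics, explore_rate=0.1):
    if random.random() < explore_rate:
        return random.choice(all_topics)     # experiment on the user
    return random.choice(known_favorites)    # stick with what works

def update(known_favorites, topic, engaged):
    if engaged and topic not in known_favorites:
        known_favorites.append(topic)        # "oh, they like this"
    elif not engaged and topic in known_favorites:
        known_favorites.remove(topic)        # "okay, not for them"

favorites = ["knicks_highlights", "parenting"]
update(favorites, "gambling", engaged=False)      # the experiment flopped
update(favorites, "food_blogging", engaged=True)  # the experiment stuck
print(favorites)  # → ['knicks_highlights', 'parenting', 'food_blogging']
```

The gambling-content example above is exactly this loop running in the wild: the platform explored, the user said no, and the profile snapped back.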
And finally, the algorithm optimizes for engagement metrics. Here are the things that the algorithm cares about the most. Time spent, how long did you watch this video? Completion rate, did you watch it all the way through? Interactions, did you like, comment, share, save? Repeat behavior, did you go back and watch it again and again? Did you visit that account for more of that content?
The algorithm does not care if the content made you happy or sad, informed or misinformed, better or worse. It cares if you engaged with it. And here’s the tricky part. Negative emotions often drive more engagement than positive ones. Outrage, fear, anxiety, these make people click, comment, and share. So the algorithm learns to show you content that triggers those emotions because that’s what keeps you on the platform.
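The four metrics just listed can be folded into a single toy score. The weights below are hand-picked purely for illustration (real systems learn them from data), but the punchline survives: the score is blind to whether the emotion driving it was positive or negative:

```python
# Toy engagement score over the four signals mentioned: time spent,
# completion rate, interactions, repeat views. Weights are invented.

def engagement_score(seconds_watched, completion_rate, interactions, repeat_views):
    return (
        0.3 * seconds_watched / 60   # time spent, in minutes
        + 0.3 * completion_rate      # 0.0 to 1.0
        + 0.2 * interactions         # likes + comments + shares + saves
        + 0.2 * repeat_views         # went back for more
    )

# An outrage clip watched twice to the end with one angry comment...
outrage = engagement_score(seconds_watched=90, completion_rate=1.0,
                           interactions=1, repeat_views=2)
# ...beats a pleasant clip half-watched once with no reaction.
pleasant = engagement_score(seconds_watched=30, completion_rate=0.5,
                            interactions=0, repeat_views=0)
print(outrage > pleasant)  # → True: the score can't tell anger from joy
```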
So let me give you a concrete example from my digital marketing days. I shared with you guys that I am a full-time blogger, so here’s what I’m talking about. We ran a campaign for a baby product and I tested two different versions of the video ad, a Reel or a TikTok. The first version had a happy baby, smiling parents, me and my husband,
and basically said that the product makes life so much easier. The second version had me looking frazzled, a fussy baby, and kind of implied, are you struggling with this problem too? And then the product as the solution. Version B performed better. Not because the product was different, but because that slight anxiety makes people think, oh no, am I struggling with this too? And it makes people watch longer and click more.
The algorithm learned that showing people content that made them slightly uncomfortable got better engagement. So it showed version B to more people. And that’s how this works at scale, across every platform. The algorithm finds what gets a reaction, any reaction, and gives you more of it.
The best part is you can train your algorithm. Now here’s the good news, and this is what I want parents and kids to understand. You have more control over your algorithm than you think. The algorithm is learning from your behavior, which means that you can teach it lessons. Here’s how. The first way is to be intentional about what you engage with. So if you want to see more positive content, actively engage with positive content. Like it, save it, watch it all the way through.
Retweet it, repost it. The algorithm will take note. If you’re seeing content that makes you feel bad, scroll past it quickly. Do not hate watch. Do not rage comment. Do not like the comments that reflect how you feel. Because even negative engagement tells the algorithm, this person cares about this, show them some more. And most importantly, use the not interested button.
Instagram, TikTok, and YouTube all have ways to tell the algorithm, I don’t want to see this. Use them. Don’t just scroll past content you don’t like. Actively tell the algorithm to stop showing it to you.
On TikTok, press and hold a video and select not interested; do the same on Instagram and on YouTube. This is powerful because you are directly training the algorithm. Search for and engage with the content that you actually want. Don’t wait for the algorithm to show you the good stuff. Actively search for accounts, topics, and hashtags that align with what you want to see more of. Follow them. Engage with their content.
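Those deliberate signals, engaging on purpose, scrolling past quickly, hitting not interested, can be sketched as weighted updates to a toy interest profile. All the weights and topic names here are invented for illustration:

```python
# Sketch of deliberate retraining. "not_interested" is modeled as the
# strongest negative signal; every number and topic name is made up.

SIGNAL_WEIGHTS = {
    "watched_to_end": +2,
    "liked":          +1,
    "saved":          +3,
    "scrolled_past":  -1,
    "not_interested": -5,  # the loudest message you can send
}

def train(profile, topic, signal):
    """Nudge a topic's weight up or down based on one deliberate action."""
    profile[topic] = profile.get(topic, 0) + SIGNAL_WEIGHTS[signal]

profile = {"doomscroll_bait": 4, "science": 0}
train(profile, "doomscroll_bait", "not_interested")  # actively opt out
train(profile, "science", "saved")                   # actively opt in
train(profile, "science", "watched_to_end")
print(profile)  # → {'doomscroll_bait': -1, 'science': 5}
```

A few intentional actions outweigh a lot of passive scrolling in this toy model, which is the practical point: small deliberate signals move the profile fast.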
The algorithm will start to understand, oh, this person is interested in educational content about science, or this person likes uplifting stories. Whatever it is you’re actively seeking out, you will find it. Take breaks and reset if your algorithm has gotten really negative or anxiety-inducing. Sometimes you just need to reset it. That might mean shutting it down altogether and taking a few days off the platform entirely,
unfollowing or muting accounts that don’t serve you, or actively seeking out and engaging with different types of content for a week.
The algorithm will recalibrate based on your new behavior. And here’s what I tell kids in my workshops in New York City. You can choose what kind of algorithm you have. Do you want an algorithm that shows you inspiring creative content, funny videos that make you laugh, educational stuff that teaches you things, content about your hobbies?
You get to train it. Every like, every save, every video you watch all the way through, you’re teaching the algorithm what to show you next. It’s not perfect control. The algorithm still has its own agenda, but it’s more agency than most people realize they have. And what does this mean for parents and kids?
What do we do with all of this information? Understand that when your kid says, I can’t stop scrolling, they’re not being dramatic. They’re up against a system that’s literally designed by some of the smartest engineers in the world to be hard to resist. But also understand that they’re not helpless. They can learn to recognize what’s happening and make different choices. Have conversations with your kids about what their algorithm is showing them. What kind of videos are you seeing a lot of lately?
How does that content make you feel? Do you want to see more of that or should we figure out how to train your algorithm differently? Pay attention to how content makes you feel, not just in the moment, but after you put your phone down. If you’re consistently feeling worse after being on an app, more anxious, more insecure, more angry, that’s information. You have the power to train your algorithm.
You can teach it to show content that makes you feel good, that teaches you things, that inspires you. It takes intentionality, but it works. And for all of us, please remember, the algorithm is not your friend. It’s not trying to help you. It’s trying to keep you on the platform. And once you understand that, you can start making more conscious choices about when to engage and when to walk away. I wish I could tell you that there’s a simple fix.
A setting that you could turn on that makes social media healthy and positive, but there isn’t. What there is, is awareness. Understanding how these systems work, recognizing when you’re being manipulated, and making intentional choices about what you engage with. I’m still working on this myself. I still doom scroll sometimes. I still fall into the trap. But I’m getting better at noticing it, and that is the first step. In the next episode, I’m going to tell you about a wonderful conversation that I had with a local teen council, where a young man asked me a question that stumped me. And I thought I knew everything about digital literacy. You’re going to want to hear this one. If you’re finding this helpful, please share it with someone who needs to hear it. And if you want to bring these workshops to your school, library, or community program, please reach out to contactus@bamdigitalmedia.info and read more about what we do at bamdigitalmedia.info.
I’m Schnelle Acevedo, let’s get smart with screens.
