Parker Woodroof, Ph.D., associate professor of marketing

In the current digital landscape, children feel pressured to be connected.
But the real question is: connected to what?
Parker Woodroof, Ph.D., a social media expert and associate professor of marketing at the Collat School of Business at the University of Alabama at Birmingham, is exploring those answers through his research.
Studies suggest that up to 95 percent of youth ages 13–17 use a social media platform, with many reporting constant use. Although 13 is the commonly required minimum age for social media users in the United States, nearly 40 percent of children ages 8–12 use social media.
As summer begins and children spend more unsupervised time online, Woodroof helps parents understand how social media algorithms shape their children’s perceptions of reality and offers advice on how to help them stay grounded in the midst of it.
How social media algorithms work
Woodroof, whose research explores the intersection of social media, behavioral science and marketing ethics, said, “Social media algorithms are like an invisible DJ. They choose the next post based on what they think you will like and feed you more of that.”
These algorithms are not designed to make users wiser or happier. Instead, they are built to keep users scrolling. That is why digital literacy is now essential for parents, Woodroof notes.
Most platforms were originally designed for adults but became wildly popular with youth. Children’s developing brains are especially vulnerable to content loops, comparison traps and addictive features.
The data behind the feed
Social media algorithms do more than personalize; they predict. Platforms track every click, scroll, share, message, video view and even how long users hover over a post. They gather contextual data such as time of day, location, language and connected apps.
“Algorithms learn what a person likes by observing what they pause on, click, comment on or share,” Woodroof said. “The more one engages with a type of content, the more of it they get.”
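The feedback loop Woodroof describes can be sketched as a toy model. Everything here is illustrative: the signal names, the weights and the `ToyFeed` class are invented for this example and do not reflect any platform's actual code.

```python
from collections import defaultdict

# Hypothetical engagement weights: stronger interactions nudge a topic's
# score harder. Real platforms use far more signals, but the loop is the same.
SIGNAL_WEIGHTS = {"view": 1.0, "pause": 2.0, "click": 3.0, "comment": 4.0, "share": 5.0}

class ToyFeed:
    def __init__(self):
        self.affinity = defaultdict(float)  # topic -> learned interest score

    def record(self, topic, signal):
        """Observe one interaction and strengthen that topic's score."""
        self.affinity[topic] += SIGNAL_WEIGHTS[signal]

    def next_post(self, candidate_topics):
        """Serve the candidate the user is predicted to engage with most."""
        return max(candidate_topics, key=lambda topic: self.affinity[topic])

feed = ToyFeed()
feed.record("fitness", "view")
feed.record("fitness", "share")  # a share is a strong signal
feed.record("news", "click")
print(feed.next_post(["fitness", "news", "cooking"]))  # -> fitness
```

Note the drift: because "fitness" earned the highest score, the feed serves more fitness content, which invites more fitness engagement, and the loop tightens around what the user already responds to.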
What social media rewards
Algorithms reward what spreads, not necessarily what is accurate or kind. Emotional content such as outrage, beauty, humor, controversy and fear tends to perform better than calm nuance.
“The feed is not the best of the internet; it is the most clickable,” Woodroof said. “These systems do not know one’s values. They only know patterns. And the more predictable a user’s behavior is, the more profitable they become.”
This creates a risk that children may begin to believe social media reflects reality, when in fact it reflects what gets attention.
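The "most clickable, not most accurate" point can be made concrete with a deliberately simplified ranking sketch. The post titles, scores and scoring function are all made up for illustration; the only point is that accuracy never enters the formula.

```python
# Caricature of engagement-first ranking: posts are ordered purely by
# predicted emotional pull, and the accuracy field is never consulted.
posts = [
    {"title": "Calm, nuanced explainer", "emotional_pull": 0.2, "accuracy": 0.95},
    {"title": "Outrage-bait hot take",   "emotional_pull": 0.9, "accuracy": 0.40},
    {"title": "Perfect-life highlight",  "emotional_pull": 0.7, "accuracy": 0.50},
]

def engagement_score(post):
    # Only emotional pull counts toward ranking.
    return post["emotional_pull"]

ranked = sorted(posts, key=engagement_score, reverse=True)
for post in ranked:
    print(post["title"])
# The outrage-bait ranks first despite being the least accurate post.
```

Under this scoring, the calm explainer with the highest accuracy lands last, which is the pattern the quote warns about: the feed reflects what gets attention, not what is true.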
Common myths
A major myth is that following “good” accounts ensures only age-appropriate content. In reality, algorithms often serve related posts that may not be appropriate, even if the child never searches for them.
Another myth is that only the amount of time spent online matters. “Comparison, misinformation and self-esteem spirals do not require hours,” Woodroof said. “They just need moments.”
What parents can do
Woodroof emphasizes that the goal is better screen time, not just less screen time. He advises parents to model critical thinking and hold regular conversations with their children to protect them from the harms of social media.
He stresses making social media use a conversation, not a control issue, by co-creating rules like device-free dinners or sunset scroll limits.
“Involving children in decision-making helps discipline stick better,” Woodroof said. “During the conversations, emphasize the ‘why,’ which is to protect their minds, not to punish their fun. Ask children, ‘What do we want our evenings to feel like?’ instead of ‘Turn it off now.’”
Another way to help children is to narrate digital skepticism constructively when appropriate. For example, when a child shows a photo of an influencer they admire, parents can say, “That looks perfect, and I wonder how many times it took to get that shot.”
Normalize the idea that social media is a highlight reel, not the whole story. Parents can do that by asking questions like: What do you think that person is not showing? How do you feel after scrolling? How do you think they felt after posting that?
“These questions will help a child build curiosity instead of comparison,” Woodroof said. “Checking in with children emotionally after a scroll will build emotional intelligence and digital resilience. Over time, these moments will train children to see through the gloss and question the highlight reel.”
Most importantly, parents should model healthy tech use themselves. If children see parents doomscrolling at bedtime, rules will feel hypocritical.
“It seems fitting that the term ‘user’ describes consumers of illegal drugs and consumers of social media, which are both engineered for dependency,” Woodroof said. “But the antidote is raising kids who know they are more than what they consume and stronger than what they scroll through.”
The best defense against algorithms
Woodroof says a strong parent-child connection is the most powerful safeguard.
“Your relationship with your child is stronger than any algorithm,” Woodroof said. “More than any parental control, a grounded sense of self protects kids online. When children feel loved and valued at home, they are less likely to search for identity in the algorithm.”
Safer alternatives for younger users
Two emerging platforms, CoverStar and Zigazoo, designed to offer safer, developmentally appropriate content, are promising alternatives to TikTok for younger audiences.
- Zigazoo (ages 5–11): It functions like a collaborative learning space and uses a prompt-response model focused on education and creativity. There are no open comments or private messages. Interactions occur through pre-approved emoji reactions and video replies. Its algorithm avoids addictive loops and promotes positive identity formation. It is compliant with the Children’s Online Privacy Protection Act, commonly referred to as COPPA, and it is heavily moderated.
- CoverStar (ages 10–14): It appeals to kids interested in music, dance and performance. It emphasizes creativity over virality through themed challenges judged by influencers or professionals. The platform allows more interaction than Zigazoo but includes strong moderation and safety features.
“While CoverStar uses AI-driven content discovery, its algorithms prioritize creativity, effort and talent over maximizing watch time or engagement,” Woodroof said.
Ongoing digital media research at UAB’s Collat School of Business helps students and communities navigate social interactions and perceptions with wisdom, humanity and hope.