We’ve made so much “progress” that we can now order food, learn, and work without ever seeing another person. But is that true progress, or just convenience that erodes our humanity and quality of life? We’ve traded a handful of close friends for thousands of followers, and we feel lonelier than ever. We’ve gone from exploring new frontiers to clearing our minds of misinformation and AI-generated noise. In the chase for speed and efficiency, we’ve innovated away the messy but vital parts of life: failing, practicing, discovering, and connecting.
People often ask me whether curaJOY is a tech nonprofit or a behavioral health organization. That confusion is understandable—for decades, technology, learning, and wellness were treated as separate worlds. Tech was about devices and data; health and education were about services. But today, technology and wellness are inseparable. Technology is no longer just a tool—it’s the environment we live in. It shapes how we connect, how our children see themselves, and how societies resolve conflict. Ignoring technology—or treating it in black-and-white terms—is like ignoring air or water quality in a city: it shapes everyone’s health.
Platforms pursued our attention, powered by algorithms tuned for profit and time-on-site, normalizing comparison culture and “always-on” habits while youth safeguards lagged. We drifted along without a fight, just as we slid into a screen-bound inactive lifestyle fueled by ultra-processed food. It’s convenient at first, but it rapidly atrophies the muscles we need for real-world thriving—attention, sleep, conflict resolution, empathy. The U.S. Surgeon General now flags sleep disruption and attention concerns as public-health risks from technology use and calls for stronger guardrails and research.
It’s not that technology is bad. Just as we shouldn’t blame our sedentary lives solely on the makers of desks and chairs, the real mistake was adopting tech without clear foresight, guardrails, and community oversight—especially for vulnerable populations like youth. When businesses are rewarded only for attention and communities don’t set norms, outcomes naturally follow the incentives.
AI is already following the same trajectory as social media: racing ahead without community oversight, driven by engagement metrics and unchecked incentives. We cannot repeat that pattern; this moment demands urgent action before harm accumulates. We still have a second chance to decide whether AI will feed our worst impulses or cultivate our best ones. But this time, communities (families, educators, clinicians, and youth) must ask the right questions and set the terms from the beginning.
People Are Waking Up—But Slowly
Parents and attorneys general are testing the courts: families have filed suits alleging AI chatbots contributed to teen suicides; New Jersey’s Attorney General has sued Discord over alleged child-safety failures. Education systems worldwide are restricting phones in class—79 systems by the end of 2024.
In 2025:
- Over 300 pieces of legislation are pending across 45 states to regulate youth social media use, including age verification, parental consent, and impact assessments.
- Google DeepMind updated its Frontier Safety Framework to address risks like harmful manipulation and AI systems resisting shutdown.
- All 50 U.S. states introduced AI-related legislation, with 38 enacting measures on ownership of AI-generated content, worker protections, and misuse prevention.
- Over half of U.S. states have moved from district-led efforts to statewide mandates banning phones during the school day.
- States like Delaware and Colorado are piloting phone-free programs with research-driven evaluations before broader adoption.
Technology Isn’t the Villain—Incentives, Defaults, and Inaction Are
It’s much easier to blame tools or praise them as saviors. Humans prefer simple stories under uncertainty. But it’s our collective inaction that perpetuates harmful systems. Consider how we’ve gradually outsourced key skills to screens: blocking instead of repairing conflict; group chats replacing awkward, growth-building in-person problem-solving; short-form feeds training rapid novelty over sustained attention; late-night scrolling displacing journaling or sleep.
Blanket bans are tempting, but the evidence on them is mixed (and good luck getting phones out of kids’ hands). Bans and age-verification mandates matter, but they’re reactive; the deeper work is proactive, and intentional design and use count for more than simple prohibitions. Frankly, as a parent of teenagers, I can’t afford to wait for legislation to force social media companies to verify age. Parents who lost their kids to online child predators don’t want to be let down by superficial policies that cannot be enforced. Recent computer science graduates don’t want to gamble their careers and student debt on an industry that’s innovating their values away. And communities don’t want to keep cleaning up preventable harms after the fact.
What Intentional Tech Design Looks Like
Over the course of a decade, I tried every parental-control solution available, hardware and software alike. I burned cash and countless hours setting up VPNs, time limits, filters, and more. My daughter learned the workarounds. Restriction alone didn’t work; redesigning environments and routines did. I learned that the hard way. (I also learned that I had to change. I couldn’t tell my kids not to be in front of a screen when I was there 14 hours a day.)
That’s why curaJOY exists: to show what it looks like when technology is built for wellness with communities from the ground up. MyCuraJOY helps schools and families understand behavior without provoking crises, and coaches parents to prevent problems before they start. It’s one example of how tech can support—not replace—human connection and care.
And curaJOY isn’t alone. Technology is helping Operation Zero prevent veteran suicide. mRelief is simplifying access to food assistance, reducing stigma and increasing equity. These organizations reflect a growing movement: designing technology not just for engagement, but for empathy, wellness, and community values.
What We Still Don’t Know—Hidden Dangers
- What happens when a child spends 10–15 years practicing most social skills through screens?
- Which kinds of digital engagement truly protect wellness over decades, and which corrode it?
I’m not arguing that every product should maximize wellness. I’m arguing for risk-weighted governance: when a product touches youth widely or intermediates huge swaths of our attention, it should meet higher standards, with clearer metrics than “time on site,” safer defaults by design, and community oversight. That’s how other industries, such as automotive and pharmaceuticals, manage risk; digital systems that raise our kids should do no less. We’re only beginning to address near-term harms; the long-term effects are even less understood. That uncertainty is part of the problem: we shipped youth-defining platforms without longitudinal research or community governance. Communities should be involved from the start, asking these questions up front instead of letting our families become guinea pigs.
Here’s How We Move Forward
- Measure more than minutes and eyeballs. Especially for products used by youth, we need to evolve beyond advertising-based business models toward outcome-based ones; a toy sketch of what that could look like follows this list. (I’ll explore this further in my next article.)
- Educate for and practice intentional use. Digital literacy should be as essential as reading and math. Families and schools need simple tools and shared norms that model healthy habits.
- Invest in longitudinal research that tracks delayed and cumulative effects, not just snapshots, to learn which designs promote thriving over time.
- Shape culture, not just products. Prioritize offline connection, diverse role models, and community-driven design so tools reflect local values rather than only corporate incentives.
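To make the first point concrete, here is a minimal, purely hypothetical sketch in Python. Nothing in it is a real product, standard, or API: the `WeeklySnapshot` fields, the weights, and the `outcome_score` function are all invented to illustrate the shift from counting minutes to scoring outcomes.

```python
from dataclasses import dataclass

@dataclass
class WeeklySnapshot:
    """One user's week on a youth-facing product (all fields hypothetical)."""
    minutes_on_site: float      # what ad-driven products optimize today
    sleep_hours_change: float   # change vs. the user's pre-adoption baseline
    offline_social_events: int  # in-person activities logged or self-reported
    distress_flags: int         # crisis-language or self-report harm signals

def attention_score(s: WeeklySnapshot) -> float:
    """The status quo: more minutes means 'better'."""
    return s.minutes_on_site

def outcome_score(s: WeeklySnapshot) -> float:
    """A sketch of an outcome-based alternative: time on site counts for
    nothing by itself. The product is rewarded when use coincides with
    healthier sleep and offline connection, and penalized for harm
    signals. The weights are placeholders a community board would set."""
    return (
        2.0 * s.sleep_hours_change
        + 1.5 * s.offline_social_events
        - 3.0 * s.distress_flags
    )

week = WeeklySnapshot(minutes_on_site=900, sleep_hours_change=-1.5,
                      offline_social_events=1, distress_flags=2)
print(attention_score(week))  # 900.0 -- looks like a "win" to advertisers
print(outcome_score(week))    # -7.5 -- the same week, scored for wellbeing
```

The specific numbers don’t matter; the point is that changing the objective function changes what a product team is paid to optimize.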
Social media is our cautionary tale. We let corporations decide the part they played in our lives, and we’re still reckoning with the fallout. AI is our second chance. If we act intentionally, designing, educating, and studying with wellbeing in mind, technology can become a partner in human flourishing. If not, the harms will compound quietly until they’re too large to ignore.
The choice is ours: to be passive consumers or active shapers of technology and culture for true progress.