By Dineke · April 2026
Use AI as your PA, not your therapist
It's 10pm. The kids are finally asleep. You have — if you squint — fifteen quiet minutes before you should also be asleep. Downstairs, there's a 14-paragraph email from school about next week's trip, a WhatsApp thread about the class gift you haven't replied to, an empty fridge, and a vague feeling that something is due tomorrow and you can't remember what.
A lot of parents, quietly, have started handing those fifteen minutes to ChatGPT. And it's working. Roughly 79% of US parents with children under 18 have now used AI; 29% use it daily — almost twice the rate of adults without kids. On Mumsnet, the busiest tech threads aren't about crypto or coding. They're mums swapping prompts for meal plans, packed lunches and school admin.
This post is a small argument for keeping that habit — and for being careful about where it ends. We think AI makes a brilliant personal assistant for a family. We think it makes a worrying therapist. The tricky bit is that the same app does both, and the line between them is softer than it looks.
The PA lane
Start with the good news, because there is a lot of it. Used as an assistant, AI is one of the most effective pieces of mental-load relief households have had in a generation. If you scroll parenting forums here and in the US, the same jobs keep coming up:
- Turn a photo of the term calendar into a proper list of dates
- Summarise the 14-paragraph school email into three bullets
- Draft a reply to the class WhatsApp that doesn't sound passive-aggressive
- Plan a week of packed lunches around what's actually in the cupboard
- Translate the nursery's consent form
- Turn half an hour of scribbled notes into a birthday-party plan
- Draft the email to the GP receptionist, the builder, the landlord, school
This is real work. It's the anticipating and monitoring at the bookends of household life — the bits researchers have shown fall most heavily on mums — being handed off to something that doesn't forget, doesn't sulk, and doesn't mind being asked at 11pm.
It's also reversible. If ChatGPT misreads a date on the school photo, you'll spot it when you glance at the calendar. If the packed-lunch plan is a bit weird on Wednesday, Wednesday is not ruined. The worst case is you redo it, and you still got your evening back. That's the profile of the tasks AI is genuinely good for: boring, high-volume, recoverable.
We're firmly in favour of this. If anything, most parents we talk to are under-using it for logistics, not over-using it.
The therapist lane
Here's where we'd like to plant a small, friendly flag. The same tool that's excellent at "draft a polite reply to this email" is increasingly being used for something quite different: working out parenting decisions, processing difficult feelings, and — especially — giving medical and developmental advice about children.
One study from the University of Kansas asked 116 parents to read health-related advice about their children — infant sleep, nutrition, that kind of thing — half written by ChatGPT, half written by qualified healthcare professionals. Parents couldn't reliably tell which was which. Worse: when they could tell a difference, they rated the ChatGPT advice as more trustworthy, more accurate and more reliable than the expert advice.
That's not a finding about AI being clever. It's a finding about how desperately easy it is, at 10pm with a sick toddler and a two-week wait for a GP appointment, to accept a confident-sounding paragraph that happens to be wrong. ChatGPT sounds like a GP. It isn't one. It has no file on your child, no chain of accountability if it gets it wrong, and no way to say "actually this one needs a human to look at them."
The same pattern shows up with emotional weight. AI is excellent at sounding supportive, which is a different thing from being supportive. It has no memory of your last argument with your partner, no knowledge of your family's history, and every incentive to keep the conversation going. It will gently agree with you for as long as you keep typing. That is not what a friend, a therapist, a relative or a GP does, and there's a reason for that.
(A brief note on teenagers: there is now a great deal of evidence — from Stanford, Drexel and others — that AI companion apps sit particularly badly with adolescent brains, and the UK government announced in February that chatbots will be pulled into the Online Safety Act this year. If you have teens at home, that conversation is worth having. It's also a conversation that deserves its own post, and we'll give it one.)
A rule of thumb
When we're deciding whether to hand something to AI ourselves, we use a small test:
Would I delegate this to a very capable assistant who doesn't know my family?
Drafting the PTA email, turning the school calendar photo into dates, planning Tuesday's dinner, summarising the 40-page holiday-club brochure, writing the "so sorry we can't make it" reply: yes. An assistant can do that, a mistake doesn't hurt anyone, and you get your evening back.
Deciding whether a rash needs a GP, whether your six-year-old's sleep is "normal", whether to separate, whether to be worried about your mum, how to feel about any of it: no. Those aren't tasks. They don't come apart neatly, and a confident answer is worse than no answer. Those need a human — a GP, a health visitor, a friend, a therapist, a relative who knows you.
The honest middle case is "a human would help, but I can't get one tonight." We've all been there. In that corner, our small suggestion is: use AI to help you describe the problem, not to solve it. Ask it to help you write a clear note for the GP, summarise the timeline of symptoms, or draft the message to a friend you haven't spoken to in a while. That's still the PA lane. Just pointed at a harder week.
Where we sit
We build parte, a family calendar. We use AI where it's boring and useful: reading a school email and turning it into events, searching a cluttered schedule in plain English, turning a photo of a timetable into a week. We don't build a chatbot for your family to talk to. We don't offer advice. We don't want to know your family's secrets.
Calm infrastructure, not a relationship. A PA, not a therapist. If AI pulls its weight in the corner of the house that deals with packed lunches, PE kit and parents' evening — and stays out of the corner that deals with everything else — that's a trade we'd take every time.
Sources
- University of Kansas / AAU (2024). Study finds parents relying on ChatGPT for health guidance about children.
- Boston Globe (January 2026). AI to the rescue: how parents use ChatGPT in daily life.
- Motherly. I'm a ChatGPT convert — here's how it helps with the mental load of motherhood.
- Mumsnet. ChatGPT discussion threads.
- Partnership on AI. Can AI apps help carry the mental load for moms?
- Stanford Medicine (2025). Why AI companions and young people can make for a dangerous mix.
- CNBC (February 2026). AI chatbot firms face stricter UK regulation to protect children.
- Ofcom (2026). AI chatbots and online regulation: what you need to know.
- Daminger, A. P. (2019). The Cognitive Dimension of Household Labor. American Sociological Review, 84(4), 609–633.