TMIT 39: Family AI – The Tools We're Using to Clarify, Coach, & Create at Home
We did something that sounds crazy: We gave our 8-year-old an iPhone 15 Pro. But there is a strategy behind the screen.
In this episode, we are exploring a new frontier: Family AI. We believe this is a pivotal moment where parents can either fear the technology or learn to lead with it. Our goal? To shift from being a "consumer family" (passive scrolling) to a "creator family" (active building).
We break down our personal framework for using AI at home—The 3 C’s: Clarify, Coach, and Create.
In this episode, we cover:
- The iPhone Decision: Why we gave Hunter a "device" (not a phone) and how we locked it down using Apple's native settings.
- Clarify: Using tools like the Limitless Pendant to capture the "ground truth" during disagreements and using voice-to-text to save brainpower during brainstorming.
- Coach: How we use AI as a neutral third party to mediate sibling arguments (like Maverick vs. Hunter) and navigate health scares in real-time.
- Create: Moving from consumption to creation—from designing our Thanksgiving gratitude tables to making explainer videos for school using NotebookLM.
Mentioned in this episode:
- 🔗 Limitless AI: limitless.ai (Recently acquired by Meta!)
- 🔗 NotebookLM: notebooklm.google.com
- 🔗 OpenAI Whisper: openai.com/index/whisper
Watch the full video version of this episode on Spotify.
Join us as we figure this out in real-time. It’s messy, it’s new, but it’s the most important addition to our family workflows ever.
1
00:00:00,040 --> 00:00:02,000
Welcome to the most important
thing.
2
00:00:02,240 --> 00:00:05,280
I'm Danielle.
And I'm Greg Neufeld, and together
3
00:00:05,280 --> 00:00:08,119
we're exploring family culture
in the age of AI.
4
00:00:08,160 --> 00:00:11,000
That's right.
I've got my Meta Ray-Bans on
5
00:00:11,200 --> 00:00:15,240
which, ironically, I never use
any AI features on, so let me take
6
00:00:15,240 --> 00:00:19,360
those off.
But they are the best family
7
00:00:19,440 --> 00:00:22,520
photography assistant so that is
why I started with these.
8
00:00:23,040 --> 00:00:25,000
I think that every family should
have one.
9
00:00:25,400 --> 00:00:29,000
Allows me to stay in the moment
and shoot my kids without
10
00:00:29,000 --> 00:00:30,640
needing to go into my pocket
first.
11
00:00:30,920 --> 00:00:33,040
That's right.
I'll leave it to Greg to always
12
00:00:33,040 --> 00:00:36,280
have the latest technological
innovation for both work and
13
00:00:36,280 --> 00:00:38,320
home.
Yeah, well, we were just joking
14
00:00:38,320 --> 00:00:41,520
that the kids got new iPads and
new headphones because it's a
15
00:00:41,520 --> 00:00:43,720
gift to me.
It's not really a gift to them.
16
00:00:43,720 --> 00:00:45,040
It's a gift to us, being able
to.
17
00:00:45,040 --> 00:00:47,760
"Daddy, I need you to connect my
headphones." Exactly.
18
00:00:47,840 --> 00:00:50,440
Yeah, well, you really took this
to the next level because it was
19
00:00:50,440 --> 00:00:54,400
Hunter's 8th birthday this past
week and she wanted a camera.
20
00:00:55,560 --> 00:00:58,400
And
you did a lot of research on the
21
00:00:58,560 --> 00:01:01,880
best camera for kids and what
did you come out with?
22
00:01:01,880 --> 00:01:05,440
I said, Hunter, the best
camera for kids, the best camera
23
00:01:05,519 --> 00:01:10,600
is an iPhone, and I'm going
to figure out how to lock
24
00:01:10,600 --> 00:01:13,080
down this sucker so that it is
just a camera for you.
25
00:01:14,160 --> 00:01:15,720
So that's right, you heard
correctly.
26
00:01:16,160 --> 00:01:19,800
Hunter Neufeld currently owns an
iPhone, and not even a very old
27
00:01:19,800 --> 00:01:22,520
one.
No, she's got a 15 Pro, so
28
00:01:22,520 --> 00:01:25,680
that's a very recent device.
So it was your old phone
29
00:01:26,160 --> 00:01:28,960
and she's very happy.
I'm very happy.
30
00:01:28,960 --> 00:01:32,040
Obviously she's very happy.
I'm very happy that I can lock down
31
00:01:32,040 --> 00:01:35,120
all of the controls and apps.
Apple's done a pretty good job
32
00:01:35,120 --> 00:01:37,440
of figuring this out and I was
like this is a solved problem.
33
00:01:37,720 --> 00:01:40,360
We have an 8-year-old that
doesn't really care about the
34
00:01:40,360 --> 00:01:43,320
apps but does care about video
and photography.
35
00:01:43,320 --> 00:01:45,640
So let's just give her this and
get rid of the apps.
36
00:01:46,000 --> 00:01:48,600
Yeah, so stay tuned.
We gave our 8-year-old an iPhone
37
00:01:48,960 --> 00:01:53,000
and she definitely had a lot of
excitement about the form factor
38
00:01:53,000 --> 00:01:55,920
for the first 24 hours, but it
wore off pretty quickly.
39
00:01:55,960 --> 00:01:58,640
It did.
And it's funny, Jade keeps
40
00:01:58,640 --> 00:02:01,400
correcting Hunter: when
she calls it a phone, she says
41
00:02:01,400 --> 00:02:03,040
it's a device.
That's right.
42
00:02:03,040 --> 00:02:04,840
Which she's right.
She's right.
43
00:02:04,840 --> 00:02:10,680
So I guess the plan for us is to
give her the phone now, though,
44
00:02:10,680 --> 00:02:15,680
as a form factor where we
can only do cameras, so
45
00:02:15,680 --> 00:02:19,480
videos and photos. But
hopefully she will, well, one,
46
00:02:19,480 --> 00:02:21,480
because it was the best
technology available, right?
47
00:02:21,960 --> 00:02:24,920
We didn't really feel like
spending a couple hundred dollars on a joke
48
00:02:24,920 --> 00:02:28,200
of a kid camera.
But the idea being that she can
49
00:02:28,200 --> 00:02:31,280
kind of grow with the phone, that
we'll keep it developmentally
50
00:02:31,280 --> 00:02:33,720
appropriate.
So right now she has no way of
51
00:02:33,720 --> 00:02:36,120
communicating with the outside
world, right?
52
00:02:36,560 --> 00:02:42,640
But over time we will build
onto this device, and
53
00:02:42,640 --> 00:02:45,560
it will grow with her, which
I haven't really heard people
54
00:02:45,560 --> 00:02:47,520
talking about.
So I'm excited.
55
00:02:47,520 --> 00:02:51,640
I think that we're kind of on a
new frontier here and a little
56
00:02:51,640 --> 00:02:54,160
nervous because I thought you
were nuts at first, but we'll
57
00:02:54,160 --> 00:02:55,080
see.
We'll see.
58
00:02:55,240 --> 00:02:56,600
I mean, we can always walk it
back.
59
00:02:57,000 --> 00:03:02,080
We've created a system for
her to score herself on
60
00:03:02,200 --> 00:03:04,000
how well she does on her morning
checklist.
61
00:03:04,280 --> 00:03:06,680
And if she doesn't score very
well, she's not going to be able
62
00:03:06,680 --> 00:03:08,360
to use that phone that day and
she knows that.
63
00:03:08,360 --> 00:03:11,560
So that's another incentive that
we've created here around this
64
00:03:11,560 --> 00:03:14,840
device.
And if it comes up that we need
65
00:03:14,840 --> 00:03:17,440
to pivot, we will.
Hardware aside, what we wanted
66
00:03:17,440 --> 00:03:20,720
to talk about today is family
life in the age of AI.
67
00:03:20,920 --> 00:03:24,480
So we came up with this episode
because we've been asked a lot.
68
00:03:24,600 --> 00:03:29,520
Yeah, recently, especially like
how do you use AI in the home?
69
00:03:30,000 --> 00:03:33,280
And you know, there's a lot of
AI that we use for work.
70
00:03:33,280 --> 00:03:36,440
And I would say that that is
pretty well covered, but I
71
00:03:36,440 --> 00:03:41,160
haven't really seen a good
explainer of how AI can be used
72
00:03:41,160 --> 00:03:44,440
effectively to create family
culture and to lead at home.
73
00:03:44,800 --> 00:03:47,640
Not like, you know, there's
lots of stuff like, oh,
74
00:03:47,680 --> 00:03:50,600
you know, how to make images and
logos and coloring books and
75
00:03:50,600 --> 00:03:52,280
stuff for your party
invitations, things like
76
00:03:52,320 --> 00:03:53,120
that.
Right.
77
00:03:53,560 --> 00:03:58,200
But this is brand new for all of
us, right?
78
00:03:58,200 --> 00:04:02,640
It was 30 November 2022.
So it's been three years since
79
00:04:02,640 --> 00:04:06,160
ChatGPT came out.
And in those three years, a lot
80
00:04:06,160 --> 00:04:10,120
has happened, but what really
has happened is some people have
81
00:04:10,120 --> 00:04:14,280
adopted and some people are kind
of afraid to lean in, especially
82
00:04:14,280 --> 00:04:16,880
when it comes to home life.
It shocks me that I still have
83
00:04:16,880 --> 00:04:21,040
friends, or hear on my housewives
podcast that have never
84
00:04:21,040 --> 00:04:24,120
downloaded ChatGPT or any AI
system.
85
00:04:24,360 --> 00:04:27,160
Some because they're adamantly
opposed to it, but others
86
00:04:27,160 --> 00:04:29,320
because it just hasn't
occurred to them to do so.
87
00:04:29,800 --> 00:04:31,680
Yeah, and, you know, I can't
blame them.
88
00:04:31,680 --> 00:04:36,560
Like, it is wild to me being on
the frontier of technology that
89
00:04:36,560 --> 00:04:38,720
people wouldn't adopt.
But I think back to my
90
00:04:38,720 --> 00:04:41,400
parents and the Internet.
And they were pretty forward
91
00:04:41,400 --> 00:04:44,320
thinking and they got on AOL,
you know, fairly early.
92
00:04:44,320 --> 00:04:46,080
But it was a scary time and
place.
93
00:04:46,080 --> 00:04:48,480
They didn't know how to manage
it for themselves.
94
00:04:48,480 --> 00:04:50,520
They certainly didn't know how
to manage it for me and my
95
00:04:50,520 --> 00:04:52,560
brother.
And we got into all sorts of
96
00:04:52,560 --> 00:04:55,800
trouble online.
Like so you know this is a hairy
97
00:04:55,800 --> 00:04:58,280
topic and this is our
generation's version of the
98
00:04:58,280 --> 00:05:00,440
Internet.
Yes, absolutely.
99
00:05:00,560 --> 00:05:04,280
So this technological shift that
we're standing in the middle of
100
00:05:05,000 --> 00:05:07,680
is a pivotal time for us as
families.
101
00:05:07,800 --> 00:05:11,160
New York Times will come out
with the story about the child
102
00:05:11,160 --> 00:05:15,880
falling in love with a chatbot,
or, I mean, much darker things,
103
00:05:15,920 --> 00:05:20,680
too.
But what's certain, just if you
104
00:05:20,680 --> 00:05:24,120
follow the progress of time, is
that parents are going to be
105
00:05:24,120 --> 00:05:28,200
behind, and parents right
now feel like it's really hard
106
00:05:28,200 --> 00:05:31,600
to keep up.
So I mean, I certainly do like,
107
00:05:31,680 --> 00:05:34,200
you know, and I work in this
stuff. Like, I use this
108
00:05:34,440 --> 00:05:37,120
for home life.
But had I not stumbled into some
109
00:05:37,120 --> 00:05:41,480
things from work and brought
them into the home life, I
110
00:05:41,480 --> 00:05:43,720
probably wouldn't have known
where to get started.
111
00:05:44,240 --> 00:05:45,800
Right.
I mean, I feel like this is the
112
00:05:45,800 --> 00:05:49,600
challenge that our generation of
parents is having generally with
113
00:05:49,600 --> 00:05:52,680
things like cell phones, right?
If you had asked me two weeks
114
00:05:52,680 --> 00:05:55,040
ago, I would have said, oh,
Hunter is not getting a cell
115
00:05:55,040 --> 00:05:57,800
phone until she's 16.
And all of a sudden there's an
116
00:05:57,800 --> 00:06:00,080
iPhone in our home, right?
We're still figuring it out.
117
00:06:00,080 --> 00:06:04,080
It's still new.
And we're not yet able to
118
00:06:04,080 --> 00:06:08,040
navigate this with confidence.
I think even, you know, the
119
00:06:08,040 --> 00:06:10,680
books are being written
now, and that'll be
120
00:06:10,680 --> 00:06:13,680
the first edition, right?
They will be updated and
121
00:06:13,680 --> 00:06:17,680
revised.
And so this is just for us, I
122
00:06:17,680 --> 00:06:22,280
think when we sit back and think
about our family culture, do we
123
00:06:22,280 --> 00:06:26,400
want to be the people that are
fearful of the next great
124
00:06:26,400 --> 00:06:29,520
technology wave?
Do we want to be the people that
125
00:06:29,520 --> 00:06:34,080
are avoidant of the
next great technology wave, or do
126
00:06:34,080 --> 00:06:36,920
we want to experiment and engage
with it?
127
00:06:36,920 --> 00:06:40,840
And certainly calibrate right,
like get over our skis and how
128
00:06:40,840 --> 00:06:45,480
much we use it and then quit for
a little while and really try to
129
00:06:45,680 --> 00:06:49,680
figure out how to optimize what
works for our family for sure.
130
00:06:49,880 --> 00:06:53,560
And door number three, 100% for
us, right? Number
131
00:06:53,560 --> 00:06:58,520
three. The iPhone with Hunter is
a great analogy, though, and
132
00:06:58,800 --> 00:07:02,760
I'll tell you why.
It's because if we don't reason
133
00:07:02,760 --> 00:07:05,120
from first principles about how
to use this stuff, we're going
134
00:07:05,120 --> 00:07:08,000
to be pushed the kid version of
all this crap.
135
00:07:08,440 --> 00:07:11,120
And I've already seen it.
I've seen get your kids away
136
00:07:11,120 --> 00:07:15,120
from screens and use this little
like furry thing with the screen
137
00:07:15,400 --> 00:07:18,200
that will engage with them on
math problems and ask them how
138
00:07:18,200 --> 00:07:20,040
their day was and stuff like
that, and let them create
139
00:07:20,040 --> 00:07:22,960
imaginary friends, which is
essentially the same thing that
140
00:07:22,960 --> 00:07:25,320
we're talking about with
teenagers falling in love with
141
00:07:25,320 --> 00:07:27,880
chatbots just literally dressed
up in fur.
142
00:07:28,000 --> 00:07:30,320
Absolutely.
And if you, if you think that
143
00:07:30,560 --> 00:07:34,320
the big brands like Disney and
Marvel, you know, aren't getting
144
00:07:34,320 --> 00:07:36,760
into AI and trying to put it in
our households, you're
145
00:07:36,760 --> 00:07:40,280
absolutely wrong.
They're going to infiltrate our
146
00:07:40,280 --> 00:07:43,920
lives if we, if we let them by
saying, hey, wouldn't it be fun
147
00:07:43,920 --> 00:07:47,000
to have bedtime stories read to
you by your favorite Disney
148
00:07:47,000 --> 00:07:49,280
character?
Or like you know, come up with a
149
00:07:49,280 --> 00:07:53,240
story or a game you know here
and there, and use our IP to
150
00:07:53,280 --> 00:07:55,280
create.
I think the image IP is going to
151
00:07:55,280 --> 00:07:57,200
be huge.
It's going to be huge.
152
00:07:57,200 --> 00:08:01,160
It already is starting.
So suffice it to say that this
153
00:08:01,240 --> 00:08:05,000
technology wave, the tsunami is
here and it's not going
154
00:08:05,000 --> 00:08:08,920
anywhere.
So we would love to talk about
155
00:08:09,120 --> 00:08:12,160
how we use it in our home today.
And this is one of those things,
156
00:08:12,440 --> 00:08:15,120
it's very new.
And with most things in life, we
157
00:08:15,120 --> 00:08:17,640
have a saying that we've
borrowed from a very famous
158
00:08:17,640 --> 00:08:21,320
venture capital firm A16Z,
which is strong convictions
159
00:08:21,520 --> 00:08:25,000
weakly held.
So we 100% reserve the right to
160
00:08:25,000 --> 00:08:29,320
change our mind.
But as of December 2025, this is
161
00:08:29,320 --> 00:08:32,320
how we are using AI within our
family culture.
162
00:08:32,360 --> 00:08:35,120
That's right.
And before we dig in, I heard
163
00:08:35,120 --> 00:08:37,400
this line from Jerry Seinfeld
that I loved.
164
00:08:37,919 --> 00:08:41,559
He said we're smart enough to
invent AI, dumb enough to need
165
00:08:41,559 --> 00:08:44,760
it, and so stupid we can't
figure out if we did the right
166
00:08:44,760 --> 00:08:46,720
thing.
Yeah, it's true.
167
00:08:47,120 --> 00:08:48,880
That encapsulates.
It only time will tell.
168
00:08:49,080 --> 00:08:52,480
Only time will tell.
I mean, if you had
169
00:08:52,720 --> 00:08:56,040
asked us to record this two
weeks ago, we would have been
170
00:08:56,080 --> 00:09:02,080
pushing our favorite workflows
on ChatGPT, and now.
171
00:09:02,520 --> 00:09:04,240
Real time pivots, people.
Real time pivots.
172
00:09:04,240 --> 00:09:08,920
We're Team Gemini at home, like I
got to say, you know, GPT has
173
00:09:08,920 --> 00:09:13,320
been hallucinating and Gemini
has just caught up and it uses
174
00:09:13,320 --> 00:09:17,440
so many more results from the
live web than GPT in getting
175
00:09:17,440 --> 00:09:20,080
answers.
So it's crazy to say, but like,
176
00:09:20,080 --> 00:09:23,200
you know, there's very little
switching costs in these things
177
00:09:23,200 --> 00:09:25,920
and we're moving basically all
our conversations over to
178
00:09:25,920 --> 00:09:29,320
Gemini, yes.
It actually ties into the things
179
00:09:29,320 --> 00:09:31,960
that we've been talking about
these past couple of weeks with
180
00:09:31,960 --> 00:09:34,840
the shadow side of things.
I think that for some people,
181
00:09:35,520 --> 00:09:40,640
some of you, some of my friends
even that I know, like, this LLM AI
182
00:09:40,640 --> 00:09:44,800
wave is really, it's looming,
it's lurking in the shadows.
183
00:09:44,840 --> 00:09:48,920
And so a lot of what we want to
do today is really make it
184
00:09:48,920 --> 00:09:51,520
explicit, right?
Bring it into the light, show
185
00:09:51,520 --> 00:09:56,000
you that it's not that scary and
that there are ways to
186
00:09:56,000 --> 00:09:59,280
incorporate it into our
workflows while still
187
00:09:59,280 --> 00:10:02,960
maintaining agency and ultimate
decision making authority.
188
00:10:03,040 --> 00:10:05,640
That's right, right.
We're not suggesting that
189
00:10:05,720 --> 00:10:08,520
AI should take over our
thinking, right?
190
00:10:09,040 --> 00:10:12,880
Absolutely not.
This is, it's shortcuts to
191
00:10:13,040 --> 00:10:17,600
getting the information.
It's not shortcuts to creating
192
00:10:17,640 --> 00:10:20,360
the outputs.
Yes, and it's pretty bad at
193
00:10:20,360 --> 00:10:23,160
writing still.
Yeah, don't use it.
194
00:10:23,720 --> 00:10:27,760
I mean I've been seeing like
people congratulating others or
195
00:10:27,760 --> 00:10:30,360
wishing others happy birthday
and it's just like full of em
196
00:10:30,360 --> 00:10:32,560
dashes and it's obviously Chat
GPT.
197
00:10:32,560 --> 00:10:34,720
Which we're guilty of as well.
I think we've really kind
198
00:10:34,720 --> 00:10:38,080
of pushed ourselves away from,
we've distanced ourselves
199
00:10:38,080 --> 00:10:40,080
from those. Do
you remember, more recently, on
200
00:10:40,120 --> 00:10:43,480
AIM or AOL when people used to
write in alternating capitals and
201
00:10:43,480 --> 00:10:46,320
lowercase letters?
Yes, basically this is this
202
00:10:46,320 --> 00:10:48,360
generation's version of that, the
em dash.
203
00:10:48,360 --> 00:10:50,600
I think we're going to look
back.
204
00:10:50,600 --> 00:10:51,520
Oh my gosh.
OK.
205
00:10:51,800 --> 00:10:56,080
So before we get there, let me
just check on our AI editor,
206
00:10:56,080 --> 00:10:59,120
Descript, little plug for them, and
make sure that it's recording.
207
00:11:00,520 --> 00:11:03,560
Good idea.
I would be lying if I said that
208
00:11:03,560 --> 00:11:08,280
that hasn't happened before.
All right, so we're going to
209
00:11:08,280 --> 00:11:11,880
talk about how we use AI in the
home and this episode is not
210
00:11:12,000 --> 00:11:14,720
really about the technology, but
about how we.
211
00:11:16,360 --> 00:11:18,240
Yeah.
So today we're going to talk
212
00:11:18,240 --> 00:11:22,240
about how we use AI to make
values-aligned decisions and
213
00:11:22,240 --> 00:11:26,440
have fun together as a family.
The important thing for me when
214
00:11:26,440 --> 00:11:29,760
we talk about this is that none
of this is about consumption.
215
00:11:30,240 --> 00:11:33,640
It's not passive.
It's all about active leadership
216
00:11:34,080 --> 00:11:40,320
and how we use AI to support our
workflows while still making
217
00:11:40,320 --> 00:11:44,360
sure that we spend a lot of time
brainstorming, thinking
218
00:11:44,360 --> 00:11:49,200
critically, and it involves a
lot of iteration in these ideas
219
00:11:49,200 --> 00:11:52,480
to make them work.
It does, and it's not something
220
00:11:52,480 --> 00:11:56,680
that comes naturally to us
because again, three years is
221
00:11:56,680 --> 00:12:00,240
really all that we've had with
this latest LLM technology.
222
00:12:00,240 --> 00:12:04,440
And so in our house, we have a
playful ritual.
223
00:12:04,520 --> 00:12:08,040
Danielle and I have this like I
would call it a friendly
224
00:12:08,040 --> 00:12:12,080
competition almost about who can
ring the AI bell when we're
225
00:12:12,080 --> 00:12:15,160
solving problems.
Sure, who thinks of it first, right?
226
00:12:15,360 --> 00:12:18,560
How can we integrate AI into our
workflow to solve this problem?
227
00:12:18,560 --> 00:12:21,840
Yeah, there's no real bell.
This is a proverbial AI bell.
228
00:12:22,000 --> 00:12:26,880
But it's a lot of fun and I've
noticed myself finding areas
229
00:12:26,880 --> 00:12:31,560
that I dread, like I'll have to
sell something on Facebook
230
00:12:31,560 --> 00:12:33,960
Marketplace.
And I'm like, Oh my God, I have
231
00:12:33,960 --> 00:12:36,640
to take photos of this.
I've got to look up the info.
232
00:12:36,640 --> 00:12:38,080
I've got to write the
description.
233
00:12:38,360 --> 00:12:40,920
This is going to take.
And so I delay, delay,
234
00:12:41,160 --> 00:12:44,920
procrastinate on it.
But ringing the AI bell for
235
00:12:44,920 --> 00:12:47,400
Facebook Marketplace, yeah, you
still have to take the photos.
236
00:12:47,400 --> 00:12:49,840
I take those photos, but then
the AI does the rest.
237
00:12:49,840 --> 00:12:54,000
I go look this up, like create
the description, create like.
238
00:12:54,000 --> 00:12:55,240
And it's
done.
239
00:12:55,280 --> 00:12:57,520
Yeah, Especially for things that
are good enough, right?
240
00:12:57,520 --> 00:12:59,280
Exactly.
That's the perfect
241
00:12:59,280 --> 00:13:00,720
example.
Now it's good enough.
242
00:13:01,160 --> 00:13:04,480
One place where I've noticed
that the AI bell does come
243
00:13:04,480 --> 00:13:08,240
naturally is to our children
because there's an Alexa in our
244
00:13:08,240 --> 00:13:12,160
kitchen and the number of
questions that they ask when
245
00:13:12,160 --> 00:13:15,040
they are in dialogue with one
another or doing their math
246
00:13:15,040 --> 00:13:17,120
homework.
I've had to stop that, right?
247
00:13:17,120 --> 00:13:20,440
Alexa is currently on mute
because I'm seeing just how
248
00:13:20,440 --> 00:13:24,760
quickly they have integrated
Alexa in and Alexa's like the
249
00:13:24,760 --> 00:13:27,120
worst AI.
So that's part of it.
250
00:13:27,120 --> 00:13:29,920
But that they can just kind of
yell out into the kitchen.
251
00:13:29,920 --> 00:13:32,400
So honestly, having a speaker
that can provide them with
252
00:13:32,400 --> 00:13:36,120
answers, I'm learning is not
working for us right now, right?
253
00:13:36,120 --> 00:13:41,400
But they very, very quickly are
tapping into it, which I think is
254
00:13:41,400 --> 00:13:44,440
true to every technology wave,
right, that young people tend to
255
00:13:44,600 --> 00:13:47,400
adopt it more readily and more
easily.
256
00:13:47,600 --> 00:13:49,160
So it's just something to watch
out for.
257
00:13:49,160 --> 00:13:51,600
What stinks is that they don't
have like a notepad where they
258
00:13:51,600 --> 00:13:54,480
can actually look back on the
questions that they had because
259
00:13:54,480 --> 00:13:57,240
a lot of those questions are
just rapid fire, like stream of
260
00:13:57,240 --> 00:13:59,320
consciousness and they don't
need to know the answers.
261
00:13:59,320 --> 00:14:02,080
But if they're actually curious
about something, it stinks that
262
00:14:02,080 --> 00:14:04,040
they're not able to like reflect
on that later.
263
00:14:04,280 --> 00:14:06,160
And maybe that's.
Well, I think that's part of the
264
00:14:06,160 --> 00:14:07,120
skill.
Like that's part of the
265
00:14:07,120 --> 00:14:09,520
experiments here, right?
It's really about teaching
266
00:14:09,520 --> 00:14:14,040
them to leverage AI, but to
iterate on those ideas and
267
00:14:14,040 --> 00:14:15,840
think critically about the
response.
268
00:14:16,080 --> 00:14:18,680
Right?
Because a ton of the time it
269
00:14:18,680 --> 00:14:22,520
hallucinates.
Well, it sure does, but it
270
00:14:22,520 --> 00:14:25,200
is great for ground truth and
we'll get into that in a moment.
271
00:14:25,400 --> 00:14:29,800
So we rang the AI bell to make
this episode and we
272
00:14:29,800 --> 00:14:32,520
asked.
How meta, but not Meta?
273
00:14:33,440 --> 00:14:39,280
We used a bunch of the different
models that we
274
00:14:39,280 --> 00:14:42,320
talk to, to ask the question,
how do we use AI?
275
00:14:42,400 --> 00:14:43,800
How
are we using AI, actually?
276
00:14:44,080 --> 00:14:45,440
Like, tell us what we've been
doing.
277
00:14:45,880 --> 00:14:49,160
And when we pulled everything
together, there was an obvious
278
00:14:49,160 --> 00:14:54,320
pattern that emerged, and we
kind of came up with three
279
00:14:54,320 --> 00:14:59,760
categories to put everything in,
and those are Clarify, Coach, and
280
00:14:59,760 --> 00:15:01,440
Create.
And Greg actually came up with
281
00:15:01,440 --> 00:15:03,440
these, even though it does sound
like something that GPT would
282
00:15:03,440 --> 00:15:06,080
create, correct?
OK, so the ways we use AI for
283
00:15:06,080 --> 00:15:09,120
our family, we are going to
discuss it in the three C's of
284
00:15:09,120 --> 00:15:11,480
Clarify, Coach, and Create.
Yes.
285
00:15:11,720 --> 00:15:13,160
All right.
So you want to start with
286
00:15:13,160 --> 00:15:14,720
clarify.
Sure, absolutely.
287
00:15:14,720 --> 00:15:18,840
So, Clarify. What we really
mean by this is a dedication to
288
00:15:18,840 --> 00:15:20,400
the truth.
So there are a number of
289
00:15:20,400 --> 00:15:24,400
different ways that we get at
this, but probably the biggest
290
00:15:24,400 --> 00:15:26,640
one is your Limitless Pendant, which
I'm
291
00:15:26,720 --> 00:15:30,760
wearing around my neck right now.
I'm wearing this and showing it
292
00:15:30,760 --> 00:15:37,480
to the cameras because it is
probably the best in-person
293
00:15:38,600 --> 00:15:42,600
scribe that I
could have ever wished for.
294
00:15:43,400 --> 00:15:47,040
And I wear it on my hip most
days just walking about the
295
00:15:47,040 --> 00:15:49,480
house and in conversation with
you and the kids.
296
00:15:50,480 --> 00:15:54,040
But yeah, do you want to talk
about, like, what
297
00:15:54,040 --> 00:15:56,480
going back to the tape really
means here and what we're able
298
00:15:56,480 --> 00:15:58,600
to do with Limitless?
Yeah, absolutely.
299
00:15:58,600 --> 00:16:01,880
So when there is a disagreement,
right, rather than agreeing to
300
00:16:01,880 --> 00:16:06,080
disagree about whether someone
wore green shoes or red shoes
301
00:16:06,080 --> 00:16:09,560
the day prior, right.
If there's a dialogue about it
302
00:16:09,600 --> 00:16:12,880
or say Hunter has and this has
happened, like Greg's having a
303
00:16:12,880 --> 00:16:15,600
conversation with another adult
and Hunter wants to know
304
00:16:15,600 --> 00:16:20,760
specifically how it went, right?
And Greg's able to open it up
305
00:16:20,760 --> 00:16:22,760
and be like, no, this is exactly
what we talked about.
306
00:16:22,760 --> 00:16:27,520
Here's the recording so that
there are no discrepancies
307
00:16:27,520 --> 00:16:31,280
between what actually happened
and what we're sharing, no human
308
00:16:31,280 --> 00:16:33,760
bias.
I will say it does freak people
309
00:16:33,760 --> 00:16:36,520
out a little bit.
Even the most pioneering of
310
00:16:36,520 --> 00:16:41,240
technologists that we have
encountered. The fact that
311
00:16:41,240 --> 00:16:45,960
you are constantly wearing a
recording device is one of
312
00:16:45,960 --> 00:16:48,440
those ethical questions of our
generation, right?
313
00:16:49,360 --> 00:16:51,720
But we can't put the genie back
in the bottle.
314
00:16:52,080 --> 00:16:56,320
And when I think about all of
the meetings that I have at
315
00:16:56,320 --> 00:17:00,320
work and on Zoom and how many of
those are recorded, it's about
316
00:17:00,320 --> 00:17:02,640
95%.
And how many of those say that
317
00:17:02,640 --> 00:17:06,359
they're being recorded like 5%?
It reminds me of the Freedom of
318
00:17:06,359 --> 00:17:07,920
Information Act when it first
came out.
319
00:17:07,920 --> 00:17:10,160
Some people are
like, I have nothing to hide,
320
00:17:10,160 --> 00:17:11,880
who cares?
And other people are like, no,
321
00:17:11,880 --> 00:17:14,560
this is a total violation of my
privacy, right?
322
00:17:15,079 --> 00:17:17,760
And it's here.
It's here to stay. If
323
00:17:17,800 --> 00:17:20,359
we think about where this is
going, play the tape forward
324
00:17:20,359 --> 00:17:23,640
another 25 years, right? So the.
Actual tape, not your.
325
00:17:23,760 --> 00:17:25,640
The proverbial tape, not your
Limitless, correct.
326
00:17:26,000 --> 00:17:28,240
So if we just think about
playing that tape forward, OK,
327
00:17:28,520 --> 00:17:30,720
this is a recording of my
experience, right?
328
00:17:30,720 --> 00:17:33,600
So I think that I should own
my experience.
329
00:17:33,600 --> 00:17:36,320
I have it through my ears.
I have it in my brain and now
330
00:17:36,320 --> 00:17:39,640
I have it in a Limitless Pendant.
Yes, if you fast-forward 25
331
00:17:39,640 --> 00:17:41,720
years, you don't need the pendant.
It's in your brain.
332
00:17:42,120 --> 00:17:44,240
The pendant is in your brain.
Like that is where we're going.
333
00:17:44,640 --> 00:17:48,640
And so I think that we all have
to get used to this idea that
334
00:17:48,720 --> 00:17:52,280
we are able to record our own
experiences and we can pass on
335
00:17:52,280 --> 00:17:55,480
those experiences, but they are
ours because we live them in
336
00:17:55,480 --> 00:17:57,920
that moment.
They are not anyone else's, even
337
00:17:57,920 --> 00:18:00,440
if someone else is there, you
know, communicating with us.
338
00:18:00,440 --> 00:18:01,800
Yes.
And I would say the bulk of the
339
00:18:01,800 --> 00:18:04,760
time you're not necessarily
playing back the recording of
340
00:18:04,760 --> 00:18:07,760
your conversation with someone.
Typically what's happening is
341
00:18:07,920 --> 00:18:10,160
you and I are having a
conversation while we're on our
342
00:18:10,160 --> 00:18:12,840
run that moves into the
bathroom, that moves into the
343
00:18:12,840 --> 00:18:15,160
kitchen.
And we're like, wait, we had so
344
00:18:15,160 --> 00:18:18,280
many good ideas about work,
about family life, about this
345
00:18:18,280 --> 00:18:23,480
podcast during that time.
And Greg can get a summary of
346
00:18:23,480 --> 00:18:27,120
that from the Limitless Pendant.
So no longer are we concerned
347
00:18:27,120 --> 00:18:30,640
that we will lose or have to
rehash conversations that we've
348
00:18:30,640 --> 00:18:32,640
already had.
It's like, what did we decide?
349
00:18:32,640 --> 00:18:34,440
What was that great phrase?
Oh, it's here.
350
00:18:34,640 --> 00:18:39,000
It's right here and I never have
to have that fear of like
351
00:18:39,320 --> 00:18:42,840
missing that piece of
information or forgetting it
352
00:18:42,960 --> 00:18:44,920
where I'm not able to stay
present the rest of the time.
353
00:18:45,880 --> 00:18:47,520
Do you want to talk a little bit
about your workflow?
354
00:18:47,520 --> 00:18:50,640
Like how you actually use it.
Honestly, I let this be a
355
00:18:50,640 --> 00:18:54,720
passive device that is able to
push information back to me.
356
00:18:54,720 --> 00:18:58,680
So it will push information in
the app saying, hey, here was a
357
00:18:58,680 --> 00:19:02,000
summary of your day yesterday.
You had a nice Family Day.
358
00:19:02,000 --> 00:19:05,840
You guys, you know, went to the
park and you picked up trash and
359
00:19:05,840 --> 00:19:09,160
then you went and gave out
cookies and it was a, you know,
360
00:19:09,160 --> 00:19:11,480
a lovely different kind of
Thanksgiving, for example.
361
00:19:13,120 --> 00:19:16,640
Then it'll say here are some
opportunities that were
362
00:19:16,640 --> 00:19:19,600
missed where you could have
connected or here are, you know,
363
00:19:19,600 --> 00:19:22,960
some other things.
So Limitless just gives me like
364
00:19:22,960 --> 00:19:26,600
a summary of how my day went and
it helps me remember and query
365
00:19:26,600 --> 00:19:29,080
it back when I want to go back
into the past.
366
00:19:30,760 --> 00:19:32,680
OK, great.
So that's an understanding of
367
00:19:32,680 --> 00:19:35,280
Limitless.
We also use something called
368
00:19:35,280 --> 00:19:39,440
Whisper Flow, which, really,
talk about shifting our
369
00:19:39,440 --> 00:19:41,840
workflows.
I hate typing at this point
370
00:19:41,840 --> 00:19:46,360
thanks to Whisper Flow because
now I am able to double tap on
371
00:19:46,360 --> 00:19:51,040
my keyboard and dictate what I
want to say in an e-mail or a
372
00:19:51,040 --> 00:19:57,320
text message or even talking to
AI in a way that is so much more
373
00:19:57,320 --> 00:20:01,520
effective than whatever Apple
has on their talk to text.
374
00:20:02,480 --> 00:20:06,480
This is not your, like, iPhone's
speech to text.
375
00:20:06,480 --> 00:20:11,360
This is an OpenAI model called
Whisper that's built into an
376
00:20:11,360 --> 00:20:13,160
app.
Basically that is a wrapper
377
00:20:13,320 --> 00:20:17,200
called Whisper Flow that lives
on all of your devices and it
378
00:20:17,400 --> 00:20:21,880
takes what you say and makes it
actually look like what you
379
00:20:21,880 --> 00:20:25,800
meant to say.
With punctuation, with meaning,
380
00:20:25,800 --> 00:20:29,120
with.
If you stumble over your words,
381
00:20:29,120 --> 00:20:31,720
it's not going to include them.
It lightly changes syntax, I've
382
00:20:31,760 --> 00:20:36,600
noticed more recently, too, to
make things more clear, but I
383
00:20:36,600 --> 00:20:39,800
cannot.
I know some people love to send
384
00:20:39,800 --> 00:20:42,600
audio messages.
I don't know anybody that loves
385
00:20:42,600 --> 00:20:47,520
to receive audio messages, and
some people really truly
386
00:20:47,520 --> 00:20:49,880
hate them like Greg.
I would prefer to listen to them
387
00:20:49,880 --> 00:20:51,880
on like 2X speed.
Please don't send
388
00:20:52,000 --> 00:20:54,840
anyone an audio message.
If you're listening to this, yes
389
00:20:54,840 --> 00:20:55,680
it is.
It is.
390
00:20:55,680 --> 00:20:57,720
It is an act of torture.
Right, but I know some
391
00:20:57,720 --> 00:21:00,440
people that do and they do it
because it's easier and if
392
00:21:00,440 --> 00:21:02,600
they're like say a busy mom,
right.
393
00:21:02,600 --> 00:21:06,280
But anyway, I think that Whisper
Flow is really the best.
394
00:21:06,280 --> 00:21:09,840
I would say that it's
the best solution because it's
395
00:21:09,840 --> 00:21:15,200
so much more understandable than
the voice to text that we have
396
00:21:15,200 --> 00:21:19,680
on our phones, like
natively on our phones. And
397
00:21:20,400 --> 00:21:22,880
I do use it for
communicating with others, but
398
00:21:22,880 --> 00:21:29,320
just actually being able to
brainstorm my ideas in spoken
399
00:21:29,320 --> 00:21:35,800
word as opposed to typing them
out has exponentially supported
400
00:21:35,800 --> 00:21:39,000
my creativity and work in
general.
401
00:21:39,160 --> 00:21:42,400
Again, we play the tape through,
you don't need to double tap on
402
00:21:42,400 --> 00:21:44,640
your keyboard.
It's just connected, right?
403
00:21:44,720 --> 00:21:45,840
So that's what's going to
happen.
404
00:21:45,840 --> 00:21:49,080
And we're really seeing a
glimpse of the future with this
405
00:21:49,080 --> 00:21:51,240
advanced AI model called
Whisper.
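(A quick aside for listeners who like to peek under the hood: Whisper is an open-source speech-to-text model, so you can try the raw engine behind tools like this yourself. Here is a minimal sketch in Python, assuming you have installed the openai-whisper package and ffmpeg, and have an audio file on hand; the file name is just a placeholder:

    import whisper  # pip install openai-whisper (also requires ffmpeg)

    # Load a pretrained speech-to-text model; larger variants ("small",
    # "medium", "large") are more accurate but slower.
    model = whisper.load_model("base")

    # Transcribe a recorded voice memo. Whisper adds punctuation and
    # casing on its own, which is what makes dictation read naturally.
    result = model.transcribe("voice_memo.m4a")
    print(result["text"])

Dictation apps layer live microphone capture and light cleanup of stumbles and filler words on top of a model like this.)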
406
00:21:51,640 --> 00:21:53,800
Right, OK.
And then, so we've talked
407
00:21:53,800 --> 00:21:55,680
about the Limitless Pendant, Whisper
Flow.
408
00:21:55,960 --> 00:21:58,280
The third thing that we think
about when we think about
409
00:21:58,280 --> 00:22:03,200
clarifying is how we can use AI
to surface information that
410
00:22:03,200 --> 00:22:05,680
would usually be hard to find.
Yes.
411
00:22:06,120 --> 00:22:09,280
Absolutely, and not even talking
about deep research, but in
412
00:22:09,280 --> 00:22:12,600
looking for things that are
extremely important and
413
00:22:12,600 --> 00:22:14,440
imperative to get the facts
right.
414
00:22:14,640 --> 00:22:18,960
So I started kindergarten when I
was 4 and I turned 5 in October.
415
00:22:20,360 --> 00:22:23,600
In Florida, where we live,
the state, according to
416
00:22:23,600 --> 00:22:26,680
everything that I had seen and
that Danielle had seen, said
417
00:22:26,680 --> 00:22:30,480
that kids cannot start
kindergarten if they are not 5
418
00:22:30,480 --> 00:22:34,720
before September 1st.
And I said, you know, this has
419
00:22:34,720 --> 00:22:37,560
to be a solved problem because I
think about all the people from
420
00:22:37,560 --> 00:22:41,720
the North, where it was the 12/31
cutoff, that are down here in
421
00:22:41,720 --> 00:22:45,440
Florida. And Maverick
is ready for kindergarten next
422
00:22:45,440 --> 00:22:48,600
year.
And so I worked with GPT to
423
00:22:48,600 --> 00:22:52,360
figure out how this was actually
structured in the state.
424
00:22:52,920 --> 00:22:58,000
And it turns out that private
schools can accept children at
425
00:22:58,000 --> 00:23:00,800
any point, irrespective of when
they were born.
426
00:23:02,280 --> 00:23:06,480
Which was news not just to us,
but also to the director of the
427
00:23:06,480 --> 00:23:09,080
private school that we want
Maverick to go to next year.
428
00:23:09,120 --> 00:23:13,080
So we were able to find the
exact, you know, the exact
429
00:23:13,080 --> 00:23:17,280
statutes in Florida
law that proved that
430
00:23:17,280 --> 00:23:18,440
Maverick could enter
kindergarten.
431
00:23:18,440 --> 00:23:21,600
We brought it to the
administrator, and she was like,
432
00:23:21,600 --> 00:23:24,320
wow, this is so helpful.
I've been turning kids away.
433
00:23:24,680 --> 00:23:27,240
I'm calling the Department of
Education to confirm that this
434
00:23:27,240 --> 00:23:29,880
is correct, even though I've
spoken with them before.
435
00:23:30,200 --> 00:23:30,680
Right?
Right.
436
00:23:31,200 --> 00:23:33,480
And so we made change
here.
437
00:23:33,880 --> 00:23:37,960
Yes, huge, huge change.
Not just for our family, but for
438
00:23:37,960 --> 00:23:39,880
the school itself.
Right, there was a
439
00:23:39,880 --> 00:23:44,280
misunderstanding because, if
you don't have a full-time clerk
440
00:23:44,520 --> 00:23:48,760
on your payroll, who has time to
go through the Florida Statutes,
441
00:23:48,760 --> 00:23:51,760
right?
That said, the most important
442
00:23:51,760 --> 00:23:55,760
thing about when we surface
information using AI is to
443
00:23:56,120 --> 00:24:01,560
verify, verify, verify.
Like I would not even say trust,
444
00:24:01,560 --> 00:24:04,360
but verify.
I would say hallucinations
445
00:24:04,360 --> 00:24:06,440
aren't just real, they are
common.
446
00:24:06,560 --> 00:24:10,520
So make sure that you always go
back to the ground truth.
447
00:24:10,640 --> 00:24:15,680
Yes, these models are three
years old and don't trust a
448
00:24:15,680 --> 00:24:17,760
three-year-old anything with
anything.
449
00:24:17,920 --> 00:24:20,800
Yeah, but it can.
So it can be really helpful in
450
00:24:20,800 --> 00:24:24,360
surfacing information as long as
we also verify through
451
00:24:24,360 --> 00:24:26,160
legitimate sources, right?
Correct.
452
00:24:26,680 --> 00:24:27,920
OK.
So that's Clarify.
453
00:24:28,400 --> 00:24:31,840
And then the second aspect or
the second way that we use AI
454
00:24:31,840 --> 00:24:34,320
for our family is really around
coaching.
455
00:24:34,800 --> 00:24:38,320
And this is a lot of the time
turning the ground truth into
456
00:24:38,360 --> 00:24:40,440
insights for our family.
Exactly.
457
00:24:40,440 --> 00:24:43,920
So back to the Limitless Pendant,
which is playing back the tape,
458
00:24:43,920 --> 00:24:47,760
but also there's coaching
involved where I can say, you
459
00:24:47,760 --> 00:24:51,040
know, hey, was there an
opportunity for me to connect
460
00:24:51,040 --> 00:24:52,440
better with Maverick in some
way?
461
00:24:52,480 --> 00:24:55,240
Or sometimes I just get pushed.
Hey, here's a summary of your
462
00:24:55,240 --> 00:24:57,200
day.
And by the way, it looks like
463
00:24:57,200 --> 00:25:00,280
you really struggled with Hunter
and maybe this was a missed
464
00:25:00,280 --> 00:25:03,800
opportunity that you two might
want to look to connect on next
465
00:25:03,800 --> 00:25:06,480
time.
Or I know Limitless, you've been
466
00:25:06,480 --> 00:25:09,640
around this past five-day
weekend when Hunter and Maverick
467
00:25:09,640 --> 00:25:13,720
have gotten into it a lot.
Can you give us an understanding
468
00:25:14,120 --> 00:25:17,600
from your perspective on what is
happening and some possible
469
00:25:17,600 --> 00:25:20,800
ideas for supporting them?
Yes, and that was really cool.
470
00:25:20,800 --> 00:25:22,840
That was a real one that we used
just yesterday.
471
00:25:23,160 --> 00:25:26,560
And it was important for us to
do it as a family, right, so
472
00:25:26,560 --> 00:25:30,880
that everyone could hear an
unbiased account without
473
00:25:30,880 --> 00:25:34,480
judgement or criticism of what
has been happening.
474
00:25:34,480 --> 00:25:38,800
And then we
were able to brainstorm
475
00:25:38,800 --> 00:25:41,160
additional ideas of how to
support them specifically.
476
00:25:41,160 --> 00:25:44,920
But having Hunter and Maverick
be able to hear an unbiased
477
00:25:44,920 --> 00:25:48,880
account of their behavior, like
a neutral third party, if you
478
00:25:48,880 --> 00:25:52,840
will, really laid the foundation
for them to support each other.
479
00:25:52,880 --> 00:25:56,080
And it's only been, you know,
36 to 48 hours since we had that
480
00:25:56,080 --> 00:25:59,280
conversation.
But I do think that it really
481
00:25:59,280 --> 00:26:01,360
started to change things for
them.
482
00:26:01,400 --> 00:26:05,520
And I know now what to look for.
What they called it, or
483
00:26:06,200 --> 00:26:09,640
Limitless called it, overlapping
dominant personalities, and that
484
00:26:09,640 --> 00:26:12,920
the vast majority of the fights
that they're getting into start
485
00:26:12,920 --> 00:26:16,640
as power struggles.
And so now I've already started
486
00:26:16,640 --> 00:26:20,800
to see even if I'm going, which
is not the best parenting in the
487
00:26:20,800 --> 00:26:23,800
world, right?
But like when I start to see the
488
00:26:23,800 --> 00:26:27,840
struggles, it really has
helped me to know what to look
489
00:26:27,840 --> 00:26:31,360
out for.
Yeah, totally. Two points on that
490
00:26:31,360 --> 00:26:33,800
one.
I should have asked it to
491
00:26:33,800 --> 00:26:36,480
explain it like I'm four or
something like that.
492
00:26:36,480 --> 00:26:39,320
You could have, yeah.
That was a miss. We do
493
00:26:39,320 --> 00:26:42,920
that often and that is another
good use of these
494
00:26:42,920 --> 00:26:44,520
things.
It's just like there's a
495
00:26:44,520 --> 00:26:46,040
question.
Explain it like.
496
00:26:46,360 --> 00:26:49,400
Well, on that note, though, I
think giving any AI
497
00:26:49,400 --> 00:26:52,640
directive like you are a world
class X, you are a world class
498
00:26:52,640 --> 00:26:56,240
podcast coach and producer.
How would you say this?
499
00:26:56,240 --> 00:26:59,160
You are a world class
journalist, right?
500
00:26:59,160 --> 00:27:03,560
Like uncover X for me, making
sure that it knows not just who
501
00:27:03,560 --> 00:27:07,600
the audience is, but who
to really embody.
502
00:27:08,200 --> 00:27:09,760
Is it, talk to me like you're
Chris Sacca.
503
00:27:09,840 --> 00:27:12,040
I do that, you know, like that
kind of thing that there's
504
00:27:12,320 --> 00:27:14,360
enough publicly available
information about these
505
00:27:14,360 --> 00:27:16,520
individuals out there and
they've been on plenty, plenty
506
00:27:16,520 --> 00:27:19,440
of podcasts.
It can do it for you pretty
507
00:27:19,440 --> 00:27:19,880
well.
Yep.
508
00:27:20,880 --> 00:27:26,600
So coach it, and give specifics
around how the communication and
509
00:27:26,600 --> 00:27:28,800
context needs to be inputted and
outputted.
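(For anyone who wants to wire that "you are a world-class X" directive into their own tooling rather than a chat window, here is a minimal sketch using the OpenAI Python client; the model name and the prompt wording are illustrative assumptions, not recommendations:

    from openai import OpenAI  # pip install openai

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4o",  # example model name
        messages=[
            # The system message is the "coach it" step: it tells the
            # model who to embody before any question is asked.
            {"role": "system",
             "content": "You are a world-class podcast coach and producer."},
            # The user message supplies the audience and the task.
            {"role": "user",
             "content": "Our audience is busy parents. How would you tighten this episode's intro?"},
        ],
    )
    print(response.choices[0].message.content)

The role line up front carries the framing; everything after it is ordinary context, which is why naming both the audience and the persona explicitly tends to sharpen the answer.)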
510
00:27:29,760 --> 00:27:32,440
The second thing I was going to
say is for a week we did an
511
00:27:32,440 --> 00:27:36,760
experiment where we hung my
Limitless in the kitchen so that
512
00:27:36,760 --> 00:27:40,360
we could just hear how the
dynamics were between the kids.
513
00:27:40,920 --> 00:27:43,720
And I think that that is going
to be another thing that I'm
514
00:27:43,720 --> 00:27:46,280
going to look forward to is a
device that is sitting in our
515
00:27:46,280 --> 00:27:49,560
kitchen that's able to feed back
to us what is happening when
516
00:27:49,560 --> 00:27:51,840
we're not around.
And some of the words that get
517
00:27:51,840 --> 00:27:54,080
used, both the good and the
bad, right?
518
00:27:54,080 --> 00:27:56,840
Because I want to
know, you know, where are kids
519
00:27:56,840 --> 00:27:59,400
showing up for one another and
where are they kind of missing
520
00:27:59,560 --> 00:28:01,160
one another, or where are they
butting heads?
521
00:28:01,560 --> 00:28:03,800
Yeah, jury's still out on that
one for me.
522
00:28:03,800 --> 00:28:08,200
Because, as you know, I felt
like it was watching me, and
523
00:28:08,200 --> 00:28:11,760
I was uncomfortable.
It's one thing if I can forget
524
00:28:11,760 --> 00:28:15,600
that it exists and go back to my
real self, but I felt like I was
525
00:28:15,640 --> 00:28:19,040
a bit performative or really
stunted in the way I was
526
00:28:19,040 --> 00:28:21,960
responding because I was nervous
about it judging me.
527
00:28:22,320 --> 00:28:26,840
I totally understand and hence
why I want it for, like, not as a
528
00:28:26,840 --> 00:28:30,800
babysitter, but when we go out
for a run and you know, on the
529
00:28:30,800 --> 00:28:34,400
weekend and the kids are home,
you know, putting it in the
530
00:28:34,400 --> 00:28:37,000
kitchen and just hearing how
things are going.
531
00:28:38,200 --> 00:28:44,320
Yeah, maybe, maybe, maybe.
And so another way that we have
532
00:28:44,320 --> 00:28:48,920
used AI as a coach is as a
guidance counselor for Hunter
533
00:28:48,920 --> 00:28:51,840
and for Jade, right?
Always with us alongside of
534
00:28:51,840 --> 00:28:54,280
them.
This is not teenagers falling in
535
00:28:54,280 --> 00:28:57,600
love with chat bots.
This is sometimes going back to
536
00:28:57,600 --> 00:29:01,320
that neutral third party with,
by the way, beautiful verbiage
537
00:29:01,320 --> 00:29:04,920
and perfect intonation to
provide empathy and validation
538
00:29:04,920 --> 00:29:09,000
and support just to ask
questions and to work through
539
00:29:09,000 --> 00:29:11,640
problems with us sitting
alongside.
540
00:29:11,640 --> 00:29:16,920
And it's almost like a family
therapist, if you will, for
541
00:29:16,920 --> 00:29:21,080
really low-stakes situations.
Something
542
00:29:21,080 --> 00:29:27,120
may have happened at school,
one where they will
543
00:29:27,120 --> 00:29:30,600
just listen to a third party
more than they will listen to my
544
00:29:30,600 --> 00:29:34,640
advice or just express their
feelings in a way that maybe
545
00:29:34,640 --> 00:29:37,480
they weren't when they were
staring directly at me.
546
00:29:37,640 --> 00:29:42,400
Yeah, no, I completely agree.
And I think that this supervised
547
00:29:42,400 --> 00:29:47,840
use of a large language
model, parent and child, is so
548
00:29:47,840 --> 00:29:51,800
much better than what they're
going to push out there around
549
00:29:52,360 --> 00:29:55,920
child therapy with AI or family
therapy with AI.
550
00:29:55,920 --> 00:30:00,880
Because honestly, like once you
start giving your power over to
551
00:30:00,880 --> 00:30:05,160
something because you're calling
it a therapist, that's when you
552
00:30:05,200 --> 00:30:06,440
really run
into. Well, we struggle with that
553
00:30:06,440 --> 00:30:09,240
with humans as well, right?
Even when, you know, I think
554
00:30:09,240 --> 00:30:12,920
giving your power over to a
human therapist is also a
555
00:30:12,920 --> 00:30:15,960
challenge.
And I think what we want to do,
556
00:30:16,560 --> 00:30:18,960
same thing as we are with the
iPhone from a hardware
557
00:30:18,960 --> 00:30:23,680
perspective, is that we want to
support and guide our children
558
00:30:23,680 --> 00:30:29,080
in a developmentally appropriate
way to use these tools while
559
00:30:29,240 --> 00:30:33,840
maintaining agency, autonomy,
boundaries and the ability to
560
00:30:33,840 --> 00:30:36,800
think critically about, OK, was
that a good idea?
561
00:30:36,800 --> 00:30:38,920
Let me repeat that back
to myself.
562
00:30:38,920 --> 00:30:41,680
Like, was that a hallucination?
Was that just off?
563
00:30:41,720 --> 00:30:45,280
Does that resonate with me?
But to tell you the truth, it's
564
00:30:45,280 --> 00:30:50,160
not that different than a human
therapist who has no idea what
565
00:30:50,160 --> 00:30:53,480
you're thinking either, right?
I mean, you and I have both
566
00:30:54,760 --> 00:30:59,240
bullshitted and manipulated
therapists very easily, right?
567
00:30:59,600 --> 00:31:03,440
So I don't see how it's going to
be any different with a chat bot
568
00:31:03,440 --> 00:31:06,120
because they don't really know.
Nobody really knows what's going
569
00:31:06,120 --> 00:31:09,400
on in your head.
But learning to leverage this
570
00:31:09,400 --> 00:31:14,400
tool in a developmentally
appropriate way, as a family, as
571
00:31:14,400 --> 00:31:17,080
a family early on, makes sense
for us.
572
00:31:17,240 --> 00:31:18,640
It sure does.
It really does.
573
00:31:18,960 --> 00:31:23,040
And similarly, health coaching.
So as far as health
574
00:31:23,040 --> 00:31:25,640
troubleshooting, I do think it's
a slippery slope.
575
00:31:25,920 --> 00:31:29,880
And I think both of us have come
to, well, we've gone down the
576
00:31:29,880 --> 00:31:33,920
rabbit hole of is this the thing
that's giving me a stomach ache?
577
00:31:33,920 --> 00:31:35,880
Is this the thing that's not
helping me sleep?
578
00:31:35,880 --> 00:31:38,440
And it's like, yes, absolutely.
That is the problem.
579
00:31:38,440 --> 00:31:42,880
And it's really just a
very big health spiral.
580
00:31:42,880 --> 00:31:46,720
So as we talk about like kind of
being in this dance with AI, I
581
00:31:46,720 --> 00:31:49,120
think that, as we calibrate our
use of it,
582
00:31:49,480 --> 00:31:54,160
I personally am staying
away from health coaching, and so
583
00:31:54,160 --> 00:31:55,920
we are like learning in real
time.
584
00:31:56,600 --> 00:32:01,200
However, when Maverick slammed
his head when we were on a boat
585
00:32:01,400 --> 00:32:04,560
in the middle of the
Intracoastal, nowhere close to
586
00:32:04,640 --> 00:32:09,240
where we came from, and he
started to get a giant welt on
587
00:32:09,240 --> 00:32:11,600
his head and we were scared out
of our minds.
588
00:32:12,520 --> 00:32:16,240
Talking to GPT about what
actually just happened and
589
00:32:16,240 --> 00:32:19,240
whether or not we needed to like
get an emergency airlift
590
00:32:19,240 --> 00:32:24,000
situation or we were OK was so
powerful.
591
00:32:24,000 --> 00:32:26,520
So powerful and the more
context, the better.
592
00:32:26,520 --> 00:32:30,520
So multimodal, you know, photos,
videos, whatever we have on
593
00:32:30,520 --> 00:32:33,560
hand, like, the more context
is going to be
594
00:32:33,560 --> 00:32:35,160
better.
And taking a photo of the back
595
00:32:35,160 --> 00:32:37,440
of his head and being like, he
just slammed his head.
596
00:32:37,600 --> 00:32:40,280
Look at this.
It's incredibly powerful.
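(For the technically curious, attaching a photo is just another message part in most chat APIs. A minimal sketch with the OpenAI Python client, where the file name and wording are hypothetical examples:

    import base64
    from openai import OpenAI

    client = OpenAI()

    # Encode the photo so it can travel inline with the question.
    with open("welt_photo.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o",  # example multimodal model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "My 4-year-old just hit the back of his head and has a large welt. What should we watch for?"},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    print(response.choices[0].message.content)

The point is the "more context, the better" idea from above: the photo rides along with the text, so the model is reasoning over both.)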
597
00:32:40,280 --> 00:32:41,160
Yeah, it was.
So.
598
00:32:41,160 --> 00:32:43,680
It was so comforting.
And, you know, I've had some
599
00:32:43,680 --> 00:32:49,200
time to reflect more recently on
Jade's birth and the, you know,
600
00:32:49,840 --> 00:32:52,120
potential malpractice that we
were a part of.
601
00:32:52,760 --> 00:32:59,400
And I realized that it is so
incredibly unlikely that we
602
00:32:59,400 --> 00:33:02,960
would have had that same
situation today because if we
603
00:33:02,960 --> 00:33:06,160
had said, I seem to have white
coat high blood pressure and
604
00:33:06,160 --> 00:33:09,480
panic in the office, they want
to induce me, what should I do?
605
00:33:10,040 --> 00:33:13,360
It would have told us the truth,
which is that medical best
606
00:33:13,360 --> 00:33:17,560
practice says you should have a
24-hour urine test, right?
607
00:33:18,920 --> 00:33:22,280
I mean, and this is what
I mean by when you have zero
608
00:33:22,560 --> 00:33:26,200
information available to you
because you're in panic mode and
609
00:33:26,200 --> 00:33:29,600
you know, this is only your
second birth ever or you're in
610
00:33:29,600 --> 00:33:31,920
the middle of the Intracoastal
and your kid has hit
611
00:33:31,960 --> 00:33:36,880
their head.
It can be extremely clarifying
612
00:33:37,400 --> 00:33:40,320
and supportive just to know.
Like just to gather some
613
00:33:40,320 --> 00:33:43,880
baseline level of information
around what is supposed to be
614
00:33:43,880 --> 00:33:47,640
done if you are a patient or
what you should do as the
615
00:33:47,640 --> 00:33:49,080
parent.
Totally.
616
00:33:49,080 --> 00:33:52,960
I can't believe that 2019 is
basically the before times.
617
00:33:53,400 --> 00:33:56,840
Yeah, I'm forgiving myself for
this, but I also, I'm just like,
618
00:33:56,840 --> 00:34:00,040
gosh darn it, three years later
and we never, ever would have
619
00:34:00,040 --> 00:34:03,960
had that traumatic experience.
They were like sanitizing the
620
00:34:03,960 --> 00:34:07,920
scalpel with like some alcohol
and giving you a piece of wood
621
00:34:07,920 --> 00:34:09,040
to bite down on.
Yeah, right.
622
00:34:09,120 --> 00:34:10,560
You know that's.
Laughing.
623
00:34:11,440 --> 00:34:13,880
Laughing gas.
That's basically what
624
00:34:14,440 --> 00:34:17,080
What it feels like, yeah.
We have this
625
00:34:17,280 --> 00:34:20,600
power. It must also be awful for
doctors right now because we are
626
00:34:20,600 --> 00:34:23,840
in this like storming forming
phase of AI where I'm sure tons
627
00:34:23,840 --> 00:34:27,199
of patients are coming in and
like, GPT tells me to do X, and
628
00:34:27,199 --> 00:34:29,679
they're like, I don't know,
maybe my 30 years of experience
629
00:34:29,679 --> 00:34:32,600
tell me to do Y.
So I feel for you in the
630
00:34:32,600 --> 00:34:36,320
medical system out there and I'm
really glad that we as patients
631
00:34:36,320 --> 00:34:41,080
have more opportunity to get up
to speed on our own health more
632
00:34:41,080 --> 00:34:43,800
quickly.
OK, so number three is Create.
633
00:34:43,800 --> 00:34:48,400
This is obviously my favorite one
and I feel like something that a
634
00:34:48,400 --> 00:34:51,120
lot of people know about.
I've heard some friends say, you
635
00:34:51,120 --> 00:34:53,960
know, I tell it what's in
my refrigerator and it comes up
636
00:34:53,960 --> 00:34:57,360
with a recipe or for us, a
couple years ago, we made this
637
00:34:57,360 --> 00:34:59,640
awesome scavenger hunt for
Christmas morning.
638
00:34:59,920 --> 00:35:03,640
These are some like really fun
ways to extend our own
639
00:35:03,640 --> 00:35:08,240
creativity.
I also think that this gets to a
640
00:35:08,240 --> 00:35:12,920
bigger concept that
we've talked about as a family,
641
00:35:13,160 --> 00:35:16,880
which is that we want to embrace
our role as a creator family,
642
00:35:17,400 --> 00:35:22,480
not a consumer family, because
with social
643
00:35:22,480 --> 00:35:26,640
media, with AI, there is more content
644
00:35:26,640 --> 00:35:30,560
out there than ever that is ripe
for consumption.
645
00:35:32,360 --> 00:35:34,200
But in the same way
646
00:35:34,200 --> 00:35:37,280
that we didn't give our kids
toys with batteries where you
647
00:35:37,280 --> 00:35:42,200
can push lots of buttons because
it's very passive, we chose
648
00:35:42,200 --> 00:35:45,760
to give them open-ended toys
that allow for active play.
649
00:35:46,200 --> 00:35:51,440
We want to get them creating
simple things right now, right?
650
00:35:51,440 --> 00:35:56,480
But creating videos, creating
tour guides, creating Lego
651
00:35:56,480 --> 00:36:04,760
designs that are really an
effort to work with the AI so
652
00:36:04,760 --> 00:36:07,960
that they, one, develop an
understanding of what's involved
653
00:36:07,960 --> 00:36:11,960
when they do consume, but two,
think more about how they don't
654
00:36:11,960 --> 00:36:13,960
really have to consume because
they can create.
655
00:36:15,160 --> 00:36:17,040
OK, so some
ways that we
656
00:36:17,040 --> 00:36:19,960
have been creators as opposed to
consumers:
657
00:36:20,240 --> 00:36:23,280
Like when we went to
the Kennedy Space Center, Hunter
658
00:36:23,280 --> 00:36:27,200
very quickly created a
tour guide of it, I guess you
659
00:36:27,200 --> 00:36:29,280
would say, through a series of
different videos.
660
00:36:29,520 --> 00:36:32,520
And then we helped her put that
all together with edits.
661
00:36:32,520 --> 00:36:35,600
We did the same thing for
Thanksgiving this year when we
662
00:36:35,600 --> 00:36:39,040
had a gratitude table and gave
away free cookies.
663
00:36:39,040 --> 00:36:42,240
She asked everyone that came up
what they were grateful for, and
664
00:36:42,240 --> 00:36:44,520
then we were able to put
together a little video for her
665
00:36:44,520 --> 00:36:47,320
very easily.
Yes, I just love to see
666
00:36:47,320 --> 00:36:50,800
them create and think
about creating as
667
00:36:50,800 --> 00:36:53,560
their instinct now.
Yes, I agree. And
668
00:36:53,560 --> 00:36:55,760
just to double-click on the table
for a moment.
669
00:36:55,760 --> 00:36:59,520
So as you mentioned, Greg, this
past weekend for Thanksgiving,
670
00:36:59,520 --> 00:37:03,400
we did a park cleanup and a little
gratitude trail with sidewalk
671
00:37:03,400 --> 00:37:05,160
chalk.
And then in the afternoon, we
672
00:37:05,160 --> 00:37:07,440
gave away free cookies at a
gratitude table.
673
00:37:08,280 --> 00:37:10,200
All of those ideas were
AI-generated.
674
00:37:10,680 --> 00:37:13,720
There were also about 100 other
bad ideas or ones that we did
675
00:37:13,720 --> 00:37:16,160
not choose.
And I did feed it.
676
00:37:16,280 --> 00:37:18,880
You know, this is how we want to
spend our day, giving back in
677
00:37:18,880 --> 00:37:22,280
service in the community, being
outside, focusing on gratitude.
678
00:37:22,480 --> 00:37:25,600
Here are some of the things that
we care about, right?
679
00:37:25,840 --> 00:37:27,960
And so guiding it and
brainstorming with it.
680
00:37:27,960 --> 00:37:31,120
But I have to tell you, I would
not have thought of a park
681
00:37:31,120 --> 00:37:35,960
cleanup, a gratitude walk, or a
gratitude table on my own.
682
00:37:37,040 --> 00:37:39,840
It's so cool.
So thanks be to AI, yeah.
683
00:37:39,960 --> 00:37:42,720
And also, like, you know,
there are some
684
00:37:42,720 --> 00:37:45,200
switching costs here.
I said there are no switching
685
00:37:45,200 --> 00:37:48,120
costs, but there are some
because the models do get to
686
00:37:48,120 --> 00:37:51,600
know us as people and kind
of the types of things that
687
00:37:51,600 --> 00:37:55,440
we gravitate towards.
So I think that all of the
688
00:37:55,440 --> 00:37:59,600
inputs over the years led them
to say, hey, Danielle's the type
689
00:37:59,600 --> 00:38:01,840
of person that would be inspired
by this.
690
00:38:02,680 --> 00:38:05,800
Yes, possibly.
And this goes back to being
691
00:38:05,800 --> 00:38:08,720
creators, not consumers.
Execution matters.
692
00:38:09,320 --> 00:38:11,920
How many people do you think
dialogued with AI about like fun
693
00:38:11,920 --> 00:38:14,080
things to do on Thanksgiving
versus actually did it?
694
00:38:14,760 --> 00:38:20,440
You know, and so these tools are
at our disposal, but it's still
695
00:38:20,440 --> 00:38:23,800
going to be the people that
actually are the action takers
696
00:38:24,320 --> 00:38:29,080
that win in the end, right?
The other thing I'll say around
697
00:38:29,080 --> 00:38:32,720
creation is we've been using
NotebookLM's explainer videos
698
00:38:32,720 --> 00:38:36,880
more recently to take, you know,
perhaps there'll be a Wikipedia
699
00:38:36,880 --> 00:38:41,680
page or a 15-minute YouTube
video on the domestication of
700
00:38:41,800 --> 00:38:44,160
cats, which is like a real-life
example.
701
00:38:44,160 --> 00:38:47,920
Hunter wanted to understand how
cats became house cats, and so I
702
00:38:47,920 --> 00:38:50,960
pulled in a couple of different
sources and then asked
703
00:38:50,960 --> 00:38:54,000
NotebookLM to create an explainer video.
It created a five-minute video,
704
00:38:54,240 --> 00:38:58,840
made it extremely easy to
understand, with pictures to go with
705
00:38:58,880 --> 00:39:00,960
it.
It really breaks it out nicely,
706
00:39:01,560 --> 00:39:04,360
always does.
And then it really helped us
707
00:39:04,360 --> 00:39:07,040
understand how cats were
domesticated, which was really
708
00:39:07,040 --> 00:39:10,040
around when we became an
agrarian society.
709
00:39:10,040 --> 00:39:13,600
And with grain comes rats, and
with rats come cats.
710
00:39:15,120 --> 00:39:17,200
They're very effective pest
control, right?
711
00:39:17,240 --> 00:39:24,040
And so not only did Hunter
get up to speed on this very
712
00:39:24,040 --> 00:39:26,800
quickly, but
then she said, I want to share
713
00:39:26,800 --> 00:39:29,840
it with my class.
And so we were able to share it
714
00:39:29,840 --> 00:39:32,040
with her teacher, who shared it
with her cluster.
715
00:39:32,280 --> 00:39:36,840
And now it really has legs far
beyond what we do with it if we
716
00:39:36,840 --> 00:39:39,480
choose to share it.
Totally. And I think that
717
00:39:39,480 --> 00:39:41,840
NotebookLM is just the tip of
the iceberg.
718
00:39:41,840 --> 00:39:47,680
I really see all of these AI
video apps like Sora being able
719
00:39:47,680 --> 00:39:50,320
to generate explainer videos and
I think that's going to be the
720
00:39:50,320 --> 00:39:52,840
next big thing.
Absolutely.
721
00:39:52,840 --> 00:39:56,120
So we want to be a creator family,
not a consumer family.
722
00:39:56,680 --> 00:39:58,880
We like that
it generates a lot of ideas,
723
00:39:58,880 --> 00:40:01,760
recognizing that most of them
will be bad. But
724
00:40:02,440 --> 00:40:05,240
the great thing about AI is that
it really has what they call
725
00:40:05,240 --> 00:40:08,320
elastic thinking, right?
It can take criticism.
726
00:40:08,560 --> 00:40:12,160
It always has a beginner's mind.
You can tell it to relax when it
727
00:40:12,160 --> 00:40:15,400
becomes overly analytical.
These are all really important
728
00:40:15,400 --> 00:40:18,800
skills to have in a teammate
when you are working.
729
00:40:19,640 --> 00:40:23,880
Absolutely, yeah.
So hopefully this was a helpful
730
00:40:23,880 --> 00:40:29,560
understanding of how we use AI
at home to Clarify, Coach, and
731
00:40:29,560 --> 00:40:33,320
Create.
We would love to hear from
732
00:40:33,320 --> 00:40:39,240
others, any novel innovations or
uses of AI within your
733
00:40:39,240 --> 00:40:43,200
homes, especially as it relates
to things that build family
734
00:40:43,200 --> 00:40:45,440
culture.
I can see lots of parenting
735
00:40:45,440 --> 00:40:48,640
advice in AI, and that's a
slippery slope, but I think
736
00:40:48,640 --> 00:40:51,320
building family culture is
obviously what we're all about,
737
00:40:51,320 --> 00:40:53,360
and it's what we're looking for
use cases around.
738
00:40:53,520 --> 00:40:55,240
Yeah, I'd love to hear about
things that are
739
00:40:55,240 --> 00:40:59,560
repeatable and scalable, not
just kind of creative ideas of
740
00:40:59,720 --> 00:41:02,800
how do I turn these flowers into
a beautiful flower arrangement,
741
00:41:02,800 --> 00:41:06,120
but things that are actually
incorporated into your daily
742
00:41:06,120 --> 00:41:07,800
workflow.
Like maybe there are some shared
743
00:41:07,800 --> 00:41:12,360
systems that you use AI for.
Would love to hear more about
744
00:41:12,360 --> 00:41:16,040
that and just use it to.
I just want to say to the folks
745
00:41:16,040 --> 00:41:19,400
that are scared of it or haven't
taken a look at it yet:
746
00:41:19,840 --> 00:41:23,280
Currently, we like Gemini; Grok is
pretty fun as well.
747
00:41:23,480 --> 00:41:26,600
But you know, I truly believe
that if you're listening to this
748
00:41:26,600 --> 00:41:30,760
podcast, you are at the
forefront of developing family
749
00:41:30,760 --> 00:41:35,520
culture and that the use of
artificial intelligence could be
750
00:41:35,520 --> 00:41:39,680
really supportive to your
family for sure.
751
00:41:40,040 --> 00:41:41,480
It's been really
supportive here.
752
00:41:41,480 --> 00:41:45,120
And I think that with the
developmentally appropriate
753
00:41:45,120 --> 00:41:49,120
stuff that we're exploring with
the kids, the key to all of
754
00:41:49,120 --> 00:41:52,840
this is bringing them in only
selectively, when there are
755
00:41:52,840 --> 00:41:54,320
problems that they need to
solve,
756
00:41:54,320 --> 00:41:58,560
and otherwise bringing AI to them
as a finished product after we've
757
00:41:58,560 --> 00:42:02,960
had the input reviewed and
clarified and corrected,
758
00:42:03,320 --> 00:42:05,320
right?
So those explainer videos out of
759
00:42:05,320 --> 00:42:07,520
NotebookLM are the perfect
thing for kids.
760
00:42:07,520 --> 00:42:10,640
They don't realize that it's all
AI-generated, but they are going
761
00:42:10,640 --> 00:42:12,920
to sit there and watch and say,
wow, that was really helpful.
762
00:42:13,080 --> 00:42:15,640
And it sure beats the hell out
of, you know, a 30-minute
763
00:42:15,640 --> 00:42:19,160
YouTube video to watch 5 minutes of, or
me trying to read something and
764
00:42:19,160 --> 00:42:21,720
then explain it.
Exactly, exactly.
765
00:42:21,880 --> 00:42:23,640
So, more to come.
I feel like this is going to be
766
00:42:23,640 --> 00:42:27,480
a recurring theme here.
And you know, fortunately, it's
767
00:42:27,480 --> 00:42:31,000
within scope of everything that
we do, absolutely.
768
00:42:31,760 --> 00:42:32,920
All right.
Thanks so much, everybody.
769
00:42:33,000 --> 00:42:34,120
Thanks everyone.
Love you, Goosey.
770
00:42:34,120 --> 00:42:39,160
Love you, Goosey.
Hey guys, if you're still here,
771
00:42:39,160 --> 00:42:40,880
you're
definitely our kind of person.
772
00:42:41,360 --> 00:42:43,880
Thanks for spending this time
with us on The Most Important
773
00:42:43,880 --> 00:42:46,400
Thing.
If this episode resonated with
774
00:42:46,400 --> 00:42:48,840
you, we'd love for you to follow
us wherever you get your
775
00:42:48,840 --> 00:42:51,040
podcasts and share it with
someone else
776
00:42:51,040 --> 00:42:52,760
building family culture on
purpose.