00:00:00,000 --> 00:00:03,875
If you don't engage with the product,
you become the product.
00:00:04,500 --> 00:00:09,416
And it's not about whether or not you're
ready for AI. AI is already happening to you.
00:00:09,916 --> 00:00:12,791
It's a matter of you
taking the opportunity
00:00:12,791 --> 00:00:15,791
to educate yourself on it.
00:00:16,666 --> 00:00:18,583
If you believe
we can change the narrative,
00:00:18,583 --> 00:00:20,791
if you believe
we can change our communities,
00:00:20,791 --> 00:00:24,166
if you believe we can change the outcomes,
then we can change the world.
00:00:24,875 --> 00:00:26,458
I'm Rob Richardson.
00:00:26,458 --> 00:00:29,000
Welcome to Disruption Now.
00:00:29,000 --> 00:00:29,875
All right.
00:00:29,875 --> 00:00:31,166
Welcome to Disruption Now.
00:00:31,166 --> 00:00:33,000
I'm your host and moderator
Rob Richardson.
00:00:33,000 --> 00:00:36,625
As always, we bring you stories of people
that are, changing the world
00:00:36,625 --> 00:00:39,625
through, social action
or social innovation.
00:00:39,708 --> 00:00:43,541
And Erin Reddick is one of those people
with ChatBlackGPT.
00:00:43,875 --> 00:00:47,166
She's
really taken the AI movement by storm
00:00:47,458 --> 00:00:51,458
and making sure that it was inclusive,
intentional, and avoided bias.
00:00:51,708 --> 00:00:53,791
And, you know, she's still a warrior
00:00:53,791 --> 00:00:55,375
and one of the first
to innovate
00:00:55,375 --> 00:00:58,375
in that space in terms of making sure
that we are intentional about it
00:00:58,541 --> 00:01:00,125
and that we're inclusive about it.
00:01:00,125 --> 00:01:01,291
And one of the most important
00:01:01,291 --> 00:01:05,750
technological revolutions
that we've ever seen, in AI.
00:01:05,750 --> 00:01:08,750
And I'm honored to have her as a friend.
00:01:09,000 --> 00:01:10,583
She'll also be at the next Midwest Con.
00:01:10,583 --> 00:01:13,166
So we
we have a lot to discuss and to go over.
00:01:13,166 --> 00:01:15,375
And I'll see her in DC soon.
00:01:15,375 --> 00:01:17,458
Anyway, Erin, how are you doing today?
00:01:17,458 --> 00:01:18,333
I'm doing well.
00:01:18,333 --> 00:01:21,500
Thank you for having me. Awesome
to be here.
00:01:21,500 --> 00:01:24,250
Yes. Very excited for Midwest Con.
00:01:24,250 --> 00:01:27,208
Yeah, very excited to have you for sure.
00:01:27,208 --> 00:01:28,708
And you know, I'm curious.
00:01:28,708 --> 00:01:31,208
So, you know,
just jumping right into it.
00:01:31,208 --> 00:01:32,708
What —
00:01:33,708 --> 00:01:34,708
what was the like
00:01:34,708 --> 00:01:38,625
spark that got you into AI
in the first place,
00:01:38,625 --> 00:01:41,916
and specifically sparked the idea of
ChatBlackGPT?
00:01:43,250 --> 00:01:43,541
Yeah.
00:01:43,541 --> 00:01:47,458
I mean, my story goes that I was laid off
00:01:47,458 --> 00:01:50,958
from Meta, and I had worked there
00:01:50,958 --> 00:01:55,083
for a couple of years and really kind of
made it my whole identity.
00:01:55,083 --> 00:01:59,458
So when I lost that job,
I really felt like I didn't have ownership
00:01:59,458 --> 00:02:03,000
over my relationship with technology
as a black woman in tech.
00:02:03,458 --> 00:02:05,125
And it didn't really sit right with me.
00:02:05,125 --> 00:02:08,000
So I decided to take that into my own
hands.
00:02:08,000 --> 00:02:11,083
And I was like,
what's the most tech forward
00:02:11,083 --> 00:02:14,291
thing that I can just dive into
and learn everything about?
00:02:14,291 --> 00:02:16,166
And I chose AI.
00:02:16,166 --> 00:02:18,291
This was late 2023.
00:02:18,291 --> 00:02:22,041
And so once I started studying
and doing research, I noticed that
00:02:22,041 --> 00:02:26,708
there were two narratives like, it's
so great for business and it's amazing.
00:02:26,708 --> 00:02:29,458
And then otherwise it's ruthless,
it's biased.
00:02:29,458 --> 00:02:32,000
It's, you know, erasing black history.
00:02:32,000 --> 00:02:34,500
And I'm like,
why are there two different stories?
00:02:34,500 --> 00:02:37,708
So once I was told, you know,
there's nothing you could do about it.
00:02:37,708 --> 00:02:39,541
It's a black box of data.
00:02:39,541 --> 00:02:42,541
That's my favorite time to ask.
00:02:42,583 --> 00:02:44,333
But are you sure?
00:02:45,458 --> 00:02:46,750
That's where I pretty much shine.
00:02:46,750 --> 00:02:50,333
And just continuing to ask questions
and push back and come up with solutions.
00:02:50,333 --> 00:02:54,500
And ChatBlackGPT was born
not soon after that.
00:02:54,750 --> 00:02:55,916
Yeah, you're a natural disruptor.
00:02:55,916 --> 00:02:58,000
Just, you know —
that's what I appreciate:
00:02:58,000 --> 00:03:00,750
You don't accept the first answer, right?
00:03:00,750 --> 00:03:01,250
Interesting.
00:03:01,250 --> 00:03:03,750
You say your identity, and that's
00:03:03,750 --> 00:03:07,041
I really resonated when you talked
about your identity and kind of.
00:03:07,125 --> 00:03:09,416
You didn't say this, but, projecting.
00:03:09,416 --> 00:03:11,083
And if I'm wrong, you can push back,
00:03:11,083 --> 00:03:14,625
you know, but after your layoff,
kind of had an initial identity,
00:03:14,625 --> 00:03:17,750
kind of crisis on what you wanted to be
and where you wanted to go.
00:03:18,083 --> 00:03:21,875
And my guess is it wasn't just tech,
it was with you as a person
00:03:21,875 --> 00:03:24,333
because you saw yourself.
And we often see ourselves,
00:03:25,375 --> 00:03:27,958
as attached to what we are doing.
00:03:27,958 --> 00:03:28,333
Right.
00:03:28,333 --> 00:03:32,250
And so I've shared this story
that it also relates to me
00:03:32,250 --> 00:03:35,875
like my identity was I before this,
I was in, you know, politics.
00:03:35,875 --> 00:03:37,250
I said I wanted to be,
00:03:37,250 --> 00:03:40,083
you know, the president of the United States,
the first black president of the United States —
00:03:40,083 --> 00:03:42,083
obviously, I won't be the first black
president of the United States.
00:03:42,083 --> 00:03:44,875
And now I have no aim
to be president. Right.
00:03:44,875 --> 00:03:48,916
But I used to, right? And that was —
and my identity was tied up in politics.
00:03:48,916 --> 00:03:51,208
And, to make a very long story short:
00:03:51,208 --> 00:03:53,416
Got a lot of votes and still lost.
00:03:53,416 --> 00:03:56,291
And, you know,
it was hard for me to reconcile that.
00:03:56,291 --> 00:04:00,208
But often I tell people,
and I think you've taken this advice
00:04:00,666 --> 00:04:04,791
unknowingly, or maybe knowingly,
you did, that your identity isn't tied to
00:04:05,041 --> 00:04:09,208
what you do, but to what your values are
and how you have an impact in the world.
00:04:09,416 --> 00:04:11,166
Not not, not a position.
00:04:11,166 --> 00:04:16,208
And you getting laid off from Meta
didn't make you less worthy
00:04:16,208 --> 00:04:20,250
or less smart,
but it often feels that way, right?
00:04:20,541 --> 00:04:22,750
It felt that way to me.
It feels like a rejection.
00:04:22,750 --> 00:04:24,666
And how did you deal with that?
00:04:24,666 --> 00:04:28,500
Obviously you've recovered well,
and it worked out for the best.
00:04:29,250 --> 00:04:32,625
Take us through that moment when you went
through that initial identity
00:04:32,625 --> 00:04:35,875
crisis or rejection,
or how have you felt about it?
00:04:36,625 --> 00:04:39,916
What did you do to take yourself
to move beyond that, to
00:04:39,916 --> 00:04:44,166
to be able to take the next leap into
what's now ChatBlackGPT?
00:04:45,291 --> 00:04:46,875
I mean, to be honest, like,
00:04:46,875 --> 00:04:50,041
I, I'm very good at recognizing,
00:04:50,833 --> 00:04:53,833
cause and effect, like,
00:04:53,833 --> 00:04:54,875
analysis.
00:04:54,875 --> 00:05:00,291
Like I understood months before I was laid
off that I was going to be laid off.
00:05:00,833 --> 00:05:04,333
I got put on a very senior team
because I could hang,
00:05:04,791 --> 00:05:09,208
but like, I'm working alongside my manager
and other managers,
00:05:09,291 --> 00:05:12,416
just a bunch of managers and managing like
00:05:13,208 --> 00:05:17,500
tens of millions of dollars and vendors
of tools that I know are going to be cut.
00:05:18,125 --> 00:05:22,791
And it's like if the budgets for
these tools are being cut
00:05:22,791 --> 00:05:26,791
and I'm managing the vendor relationship,
where do I fit in?
00:05:27,125 --> 00:05:27,583
Right.
00:05:27,583 --> 00:05:32,333
So I was, you know,
having one-on-ones with my director.
00:05:32,333 --> 00:05:35,750
And I was like, I'm not going to work
any harder than I already am,
00:05:35,750 --> 00:05:39,208
because whatever decisions that have been
made are already made.
00:05:39,666 --> 00:05:41,541
And I was just very resolute about that.
00:05:41,541 --> 00:05:45,083
And she was like, it's
very sobering for you to say that,
00:05:45,083 --> 00:05:48,291
and I appreciate your attitude towards
all of this.
00:05:48,291 --> 00:05:50,916
It's extremely difficult,
blah, blah, blah.
00:05:50,916 --> 00:05:54,375
And so I started my grieving process
while I was still working there.
00:05:54,750 --> 00:05:56,541
I went to all the cafeterias.
00:05:56,541 --> 00:06:00,041
I touched the felt on the wall
when in the music room,
00:06:00,375 --> 00:06:03,833
I stayed late,
I came early, you know, I clicked my badge
00:06:03,833 --> 00:06:06,916
a few extra times,
but I knew it was coming.
00:06:07,375 --> 00:06:12,416
And, when it all happened, I also knew
I was going to get a severance.
00:06:12,416 --> 00:06:18,250
And that was the first time that I had
a lump sum of cash money in my hands.
00:06:18,250 --> 00:06:21,458
And I wanted to not waste
00:06:21,458 --> 00:06:24,583
the opportunity to just do something.
00:06:24,916 --> 00:06:28,833
And I kept hearing this voice in my head
that was like, it's now or never.
00:06:29,000 --> 00:06:30,750
Are you actually an entrepreneur?
00:06:30,750 --> 00:06:33,708
Because I've had my business license
for years and years and years,
00:06:33,708 --> 00:06:36,958
like ten years,
and I never reported any revenue.
00:06:37,250 --> 00:06:41,458
2024 will be the first year
that I report taxable income
00:06:41,458 --> 00:06:43,541
and I'm so proud of that. Let's go.
00:06:43,541 --> 00:06:44,708
Yeah. So proud of that.
00:06:44,708 --> 00:06:47,125
I mean, sure,
it might be half what I was making,
00:06:47,125 --> 00:06:50,458
but I still made half of what
I was making from 12 in that there's
00:06:50,458 --> 00:06:53,458
nothing to substitute
when you have to go out
00:06:53,625 --> 00:06:56,416
and you have to hunt and kill it.
Like, do not —
00:06:56,416 --> 00:06:59,666
I do not want you to downplay
00:07:00,416 --> 00:07:03,625
the magnitude of what you've done
because it is incredible.
00:07:04,041 --> 00:07:06,708
It is not easy to do that.
00:07:06,708 --> 00:07:10,083
And most businesses
don't make any type of taxable profit.
00:07:10,083 --> 00:07:10,791
Okay.
00:07:10,791 --> 00:07:13,375
Like, people
get lost in this —
00:07:13,375 --> 00:07:16,916
what this actually is, because social media —
you worked for Meta —
00:07:16,916 --> 00:07:17,958
They're good at selling.
00:07:17,958 --> 00:07:19,916
The algorithms are good at selling
fantasies.
00:07:19,916 --> 00:07:20,875
Yeah. This is hard.
00:07:20,875 --> 00:07:24,541
Like and it's like
I mean like there is me.
00:07:24,750 --> 00:07:25,250
This is
00:07:26,750 --> 00:07:27,250
and this is hard.
00:07:27,250 --> 00:07:30,000
So I want you to know that I am
proud of you.
00:07:30,000 --> 00:07:31,708
Many are very proud of you.
00:07:31,708 --> 00:07:36,083
And I've, seen your work,
and this is just the beginning.
00:07:36,083 --> 00:07:37,083
The fact that you're.
00:07:37,083 --> 00:07:39,000
I can tell you this without question.
00:07:39,000 --> 00:07:42,000
Whatever you did this year, right?
00:07:42,208 --> 00:07:45,916
I know that if you did that, you can do
00:07:45,916 --> 00:07:49,416
triple that with the right approach
and learnings that you get from that.
00:07:49,416 --> 00:07:50,666
I promise you, you can.
00:07:50,666 --> 00:07:53,125
And then once you triple it,
you can sense that,
00:07:53,125 --> 00:07:56,666
but you are already on your way
and, you know, don't,
00:07:56,916 --> 00:07:58,500
don't, don't downplay
00:07:58,500 --> 00:08:01,125
what you've already accomplished
because it has been quite amazing.
00:08:01,125 --> 00:08:02,125
I really mean that.
00:08:02,125 --> 00:08:03,583
I appreciate that.
00:08:03,583 --> 00:08:03,916
Yeah.
00:08:03,916 --> 00:08:07,250
It's definitely, like,
00:08:08,541 --> 00:08:11,250
eye opening,
00:08:11,250 --> 00:08:13,250
getting out of the hourly mindset.
00:08:13,250 --> 00:08:16,208
Yes, is a huge task.
00:08:16,208 --> 00:08:20,041
So, like, getting out of it
and like how I kind of reclaimed
00:08:20,041 --> 00:08:22,000
my personality and all of that.
00:08:22,000 --> 00:08:23,416
I did go to therapy.
00:08:23,416 --> 00:08:26,541
I'm ashamed to say that — I shouldn't
be, right?
00:08:26,833 --> 00:08:30,833
I was in there
twice a week for like months.
00:08:30,833 --> 00:08:31,833
Okay.
00:08:31,833 --> 00:08:34,208
I had to really, like, reset,
00:08:34,208 --> 00:08:38,375
like, who I am and, like, what
skills I have, I mean.
00:08:38,416 --> 00:08:41,458
And I didn't start to feel better
until I started working
00:08:41,458 --> 00:08:46,916
and getting my hands dirty
and learning how to actually build,
00:08:46,916 --> 00:08:51,208
and the code and like, the certifications
and like that's when I started to gain
00:08:51,208 --> 00:08:54,250
confidence and like, I'm
naturally drawn towards solutions.
00:08:54,250 --> 00:08:58,416
So I kind of feel like that's
what brought me back to myself.
00:08:58,416 --> 00:09:01,375
Like,
I started the grieving process early.
00:09:01,375 --> 00:09:04,583
I went to therapy,
and I was very hands on and immersed
00:09:04,583 --> 00:09:07,708
in my new, you know, interest.
00:09:07,708 --> 00:09:10,333
So I've kind of coined that.
00:09:10,333 --> 00:09:13,500
So, Erin, I read a little bit
about your background and,
00:09:13,500 --> 00:09:14,541
you know, correct me if I'm wrong.
00:09:14,541 --> 00:09:17,291
So you
you grew up in a very diverse setting.
00:09:17,291 --> 00:09:21,375
So if I'm correct, your father — you grew up with a white father, right?
00:09:22,375 --> 00:09:22,750
Okay.
00:09:22,750 --> 00:09:26,125
And I think you've talked about
how that shaped your perspectives
00:09:26,125 --> 00:09:27,958
and looking at the world
in an inclusive manner.
00:09:27,958 --> 00:09:31,500
Speak about how your experience
growing up in a very diverse
00:09:32,000 --> 00:09:36,458
kind of perspective speaks to who you are
and how you approach AI and technology.
00:09:37,166 --> 00:09:37,583
Yeah.
00:09:37,583 --> 00:09:40,583
So I have a,
00:09:43,041 --> 00:09:44,416
blended family.
00:09:44,416 --> 00:09:47,958
So I have my stepdad who is a white man,
00:09:49,083 --> 00:09:51,875
blond hair, blue eyes, the brown hair.
00:09:51,875 --> 00:09:57,000
But he was blond,
and then I had my mom, my biological mom.
00:09:57,500 --> 00:10:02,208
And so, like, for Christmas
time, we're going to his parents house.
00:10:03,000 --> 00:10:05,000
There's boats. We're on the lake.
00:10:05,000 --> 00:10:07,833
It's a condo with tall ceilings.
00:10:07,833 --> 00:10:10,791
Granite countertops
are making taupe or not.
00:10:10,791 --> 00:10:13,583
And then you've got my family
in Saint Louis, Missouri,
00:10:13,583 --> 00:10:16,750
where we're having, like, pan-fried
chicken patties.
00:10:16,750 --> 00:10:22,166
And I'm out in Baldwin, Michigan
with my grandmother, who I love dearly.
00:10:22,250 --> 00:10:25,166
She's no longer here,
but she's still with us in spirit.
00:10:25,166 --> 00:10:28,791
Yes, I'm eating hot dogs and Jell-O cups
and it's like,
00:10:29,125 --> 00:10:32,125
no matter what environment I was in.
00:10:32,791 --> 00:10:35,708
And not to put a juxtaposition
of rich and poor,
00:10:35,708 --> 00:10:40,791
because I also had a four-story,
you know, grandparents
00:10:40,791 --> 00:10:44,666
house in Saint Louis, and they had trucks
and a construction company.
00:10:44,958 --> 00:10:47,916
But just with the family members
that I spent most time
00:10:47,916 --> 00:10:50,916
with, the juxtaposition was very stark.
00:10:51,000 --> 00:10:55,250
But I never felt that I loved
one more than the other,
00:10:55,250 --> 00:10:58,250
or I never felt more loved by one
or the other.
00:10:58,250 --> 00:11:00,958
It was just, this is my life.
00:11:00,958 --> 00:11:05,458
And sometimes my dad dropped me off
at school and people are like, who's that?
00:11:05,458 --> 00:11:06,500
And I'm like, it's my dad.
00:11:06,500 --> 00:11:11,250
And then they're like, oh, you know, oh,
you're this, you're that, blah blah blah.
00:11:11,458 --> 00:11:14,916
And like, then my mom would come
and I'd be like, oh, you know,
00:11:15,416 --> 00:11:16,708
she talks like you.
00:11:16,708 --> 00:11:19,708
And it's just like so many things.
00:11:20,416 --> 00:11:24,166
Dismantled and built up who I am today.
00:11:24,166 --> 00:11:28,083
Like definitions of,
like, being black in a black household.
00:11:28,375 --> 00:11:30,125
I still got a black mom.
00:11:30,125 --> 00:11:34,166
But, you know,
I still also had a different perspective.
00:11:34,250 --> 00:11:34,791
Absolutely.
00:11:34,791 --> 00:11:36,333
That got me into hunting.
00:11:36,333 --> 00:11:41,375
And he's eating venison chili and deer
jerky, like,
00:11:42,208 --> 00:11:46,166
you know, it's just two different cultures
and fish and it's like, right.
00:11:46,666 --> 00:11:47,000
Yeah.
00:11:47,000 --> 00:11:49,291
They just collided for me
my whole life. So.
00:11:49,291 --> 00:11:52,291
And — oh, excuse me — I'm curious, because,
00:11:52,666 --> 00:11:55,583
because I kind of, in your description,
00:11:56,833 --> 00:11:58,750
think about,
like, you probably have some challenges
00:11:58,750 --> 00:12:02,916
with some of your peers in terms of them
trying to put you in a box.
00:12:02,916 --> 00:12:06,625
That's what I felt like you were getting
to, because I've been through this, too.
00:12:06,625 --> 00:12:07,500
Right.
00:12:07,500 --> 00:12:10,583
And I wonder how that's both informed you
and how you navigated that.
00:12:10,583 --> 00:12:10,791
Right.
00:12:10,791 --> 00:12:13,833
Because I had to navigate
the ridiculous definition of what
00:12:13,833 --> 00:12:16,500
it's supposed to mean to be whatever
the definition of being black
00:12:16,500 --> 00:12:19,583
supposed to mean, whatever
it means to talk black, to be black,
00:12:20,250 --> 00:12:24,250
like given your background
and your experience
00:12:24,250 --> 00:12:27,250
and how you dealt with it —
how did that inform you?
00:12:27,541 --> 00:12:30,500
How did you deal with the,
00:12:30,500 --> 00:12:30,708
sure.
00:12:30,708 --> 00:12:34,583
Direct criticisms of people
thinking that you're either
00:12:34,583 --> 00:12:38,166
not black enough and then if you're
talking to your white counterparts
00:12:38,166 --> 00:12:41,250
that you're also not one of them,
because I'm sure that's a, that's a thing.
00:12:41,250 --> 00:12:41,875
That's still a thing.
00:12:41,875 --> 00:12:44,208
It was a thing with me — maybe it
wasn't a thing with you, because you're like —
00:12:45,750 --> 00:12:47,291
so that
00:12:47,291 --> 00:12:50,416
yeah, I definitely, have had some
00:12:50,833 --> 00:12:53,875
really traumatic experiences,
you know, like your,
00:12:54,875 --> 00:12:56,541
What? Why do you talk so white?
00:12:56,541 --> 00:12:58,166
Why do you act like a white girl?
00:12:58,166 --> 00:13:00,208
Like — you're like a white girl, Erin.
00:13:00,208 --> 00:13:04,166
You know, and then my name is already
not very black.
00:13:04,166 --> 00:13:05,875
It's Erin Reddick.
00:13:05,875 --> 00:13:08,875
It is literally not of black origin,
that Reddick —
00:13:09,041 --> 00:13:11,666
and then,
00:13:11,666 --> 00:13:14,041
and then you have, white friends.
00:13:14,041 --> 00:13:15,416
I was like, into, like,
00:13:15,416 --> 00:13:19,166
I don't know, you know — I got to,
you know, they're all white
00:13:19,583 --> 00:13:23,000
and they're making fun of your hair
and like,
00:13:24,333 --> 00:13:27,750
run their fingers through it and say,
you know, "I think it's cool."
00:13:27,750 --> 00:13:29,083
Yeah, yeah.
00:13:29,083 --> 00:13:31,125
And, you know, certain things
they shouldn't say.
00:13:31,125 --> 00:13:35,375
And testing boundaries
and it was just it was very difficult
00:13:35,375 --> 00:13:40,125
because I wasn't going to be,
you know, talking like T-Pain.
00:13:40,125 --> 00:13:42,250
But I also wasn't being much him.
00:13:42,250 --> 00:13:46,000
So it's like I
never danced, you know, like,
00:13:47,333 --> 00:13:50,416
but it helps with how it informs
my design for the product,
00:13:50,416 --> 00:13:54,583
because when people say, oh, well,
you can't represent — how can you say you
00:13:54,583 --> 00:13:58,125
represent all, you know, black people,
black people aren't a monolith.
00:13:58,166 --> 00:14:01,375
If anybody understands
that, it's definitely also me,
00:14:01,833 --> 00:14:03,125
that we're not a monolith.
00:14:03,125 --> 00:14:09,791
And so I'm very much open-minded
to all walks of black life when,
00:14:10,625 --> 00:14:14,250
considering data
and information and stories
00:14:14,250 --> 00:14:17,875
and things that are donated for the tool
and how it actually
00:14:18,416 --> 00:14:21,416
acts and provides output.
00:14:21,666 --> 00:14:25,250
So, speaking more about that —
because we're talking about how, with AI,
00:14:26,541 --> 00:14:28,666
it's —
what you put in is what you get.
00:14:28,666 --> 00:14:31,208
Right
in terms of making sure it's trained,
00:14:31,208 --> 00:14:33,000
that you give it the correct information.
00:14:33,000 --> 00:14:36,750
And that's why ChatBlackGPT
is so important because our perspective
00:14:36,750 --> 00:14:38,375
isn't there.
00:14:38,375 --> 00:14:42,583
A lot of times it's not there. Those who are
building put their perspective first.
00:14:42,583 --> 00:14:44,875
Like,
you know, as they say, with history, the,
00:14:44,875 --> 00:14:47,875
the history is written
by the winners, right?
00:14:48,208 --> 00:14:49,541
So on and so forth.
00:14:49,541 --> 00:14:50,875
It's kind of true with data too.
00:14:50,875 --> 00:14:54,500
But I'm curious, given
some of the challenges — and obviously,
00:14:54,958 --> 00:14:57,833
with what you're doing, you're taking a
step at addressing some of the challenges.
00:14:57,833 --> 00:15:03,000
But what do you see as the greatest
challenge in the future with AI?
00:15:03,208 --> 00:15:05,791
And actually,
let me state the question another way.
00:15:07,625 --> 00:15:08,916
What do you see as
00:15:08,916 --> 00:15:12,416
the worst case scenario
when it comes to AI?
00:15:12,875 --> 00:15:14,916
All right, that's the first question.
00:15:14,916 --> 00:15:18,500
What do you see as the best case scenario
and how do we make that happen.
00:15:19,375 --> 00:15:20,750
Sure.
00:15:20,750 --> 00:15:23,666
I think like worst case scenario
00:15:23,666 --> 00:15:26,666
with AI is that,
00:15:27,166 --> 00:15:28,791
the general public
00:15:28,791 --> 00:15:33,500
doesn't have access to it anymore
and we're only subjected to it.
00:15:34,083 --> 00:15:37,375
I think that's worst case scenario,
because then we don't have
00:15:37,375 --> 00:15:38,875
an opportunity to build.
00:15:38,875 --> 00:15:43,041
We don't have an opportunity
to create the technology
00:15:43,041 --> 00:15:46,166
in a way
that it can serve our communities.
00:15:47,333 --> 00:15:50,083
And we kind of lose our autonomy.
00:15:50,083 --> 00:15:56,458
We lose a lot if if we don't
have access to it in order to build it.
00:15:57,916 --> 00:16:01,250
And I know that some people might expect
the worst case scenario
00:16:01,250 --> 00:16:04,416
is that it takes over the world
and starts killing everyone.
00:16:06,000 --> 00:16:06,708
But I think
00:16:06,708 --> 00:16:09,958
that scenario exists with or without AI.
00:16:10,500 --> 00:16:15,583
Yeah, yeah, I actually think that's
the least likely scenario.
00:16:15,583 --> 00:16:18,625
I tell people, if you want to use sci-fi
00:16:18,625 --> 00:16:21,958
movies, it's less Terminator,
more matrix, right?
00:16:22,041 --> 00:16:25,041
In terms of, like, you know,
00:16:25,041 --> 00:16:26,250
with Terminator,
00:16:26,250 --> 00:16:28,583
it was: they took over, destroyed —
we destroyed the Earth.
We destroyed the Earth.
00:16:29,541 --> 00:16:32,166
Whatever. With matrix.
00:16:32,166 --> 00:16:35,625
It was more like the algorithms
were giving people what they wanted.
00:16:35,625 --> 00:16:35,958
Right.
00:16:35,958 --> 00:16:39,791
And they were for and they and people
were mindlessly following
00:16:40,208 --> 00:16:42,625
without without thinking about it. Right.
00:16:42,625 --> 00:16:47,666
I think that could be a scenario where
it's where only a few have access to it.
00:16:47,666 --> 00:16:50,875
We have digital dictatorships,
and algorithms
00:16:50,875 --> 00:16:54,333
are influencing because they already
are to some extent, right.
00:16:54,333 --> 00:16:57,333
In terms of how things
are with social media, things like that.
00:16:57,333 --> 00:17:00,375
Now with algorithms,
pick it up to a new level.
00:17:01,500 --> 00:17:02,666
And knowing
00:17:02,666 --> 00:17:05,666
us at a better level
and being able to learn,
00:17:06,291 --> 00:17:09,166
it could exacerbate that,
that that issue in that problem,
00:17:09,166 --> 00:17:11,833
if we don't have some intentionality
and thought process about it.
00:17:11,833 --> 00:17:13,041
So, yeah.
00:17:13,041 --> 00:17:14,625
So I guess that that's the worst case
scenario.
00:17:14,625 --> 00:17:16,125
I agree with you.
00:17:16,125 --> 00:17:17,708
Let's
how do we make the best case scenario?
00:17:17,708 --> 00:17:19,291
What is the best case
scenario in your mind.
00:17:19,291 --> 00:17:21,250
And how do we make that happen?
00:17:21,250 --> 00:17:25,125
I think the best case scenario is that
AI is used to solve,
00:17:25,958 --> 00:17:29,541
like basic human rights concerns,
00:17:29,958 --> 00:17:32,958
like hunger, housing,
00:17:33,666 --> 00:17:36,666
clean water, medicine.
00:17:36,833 --> 00:17:40,458
I like using AI to, advance
00:17:40,541 --> 00:17:45,958
research faster and help
everybody just live a normal, decent life.
00:17:46,250 --> 00:17:50,250
I think that's best case
scenario and best case scenario
00:17:50,250 --> 00:17:53,708
as far as let's say that that was already
a standard,
00:17:54,041 --> 00:17:57,041
is that people understand how to use AI
00:17:57,041 --> 00:17:59,791
to bring out the best in themselves
00:17:59,791 --> 00:18:02,916
and really explore and expand
00:18:02,916 --> 00:18:08,000
and explode like just who they are
and scale themselves
00:18:08,000 --> 00:18:12,500
and what they were gifted with,
and how their beauty and talents
00:18:12,500 --> 00:18:15,500
can grace
the earth and other people around.
00:18:15,666 --> 00:18:19,000
I really think that it should be about
00:18:19,375 --> 00:18:22,375
helping us spread our true selves,
00:18:22,375 --> 00:18:25,833
rather
than recreating versions of ourselves.
00:18:26,250 --> 00:18:27,291
I love that so.
00:18:27,291 --> 00:18:30,291
So, I think that was a that was a
it was a mic drop moment.
00:18:30,291 --> 00:18:33,541
So it's about spreading our true selves
versus
00:18:33,541 --> 00:18:36,666
just recreating or copying things.
00:18:36,666 --> 00:18:39,541
copies of ourselves, I feel like. That's —
that's really, that's a gem.
00:18:39,541 --> 00:18:39,958
It's a gem.
00:18:39,958 --> 00:18:42,583
We have to we're gonna have to have
a mic drop moment for that.
00:18:42,583 --> 00:18:43,875
It really is.
00:18:43,875 --> 00:18:47,250
I, I talk about that in terms of, like,
I think of it like,
00:18:47,541 --> 00:18:52,708
if we had Einsteins, Benjamin Banneker,
you know, other geniuses and others
00:18:52,708 --> 00:18:57,041
that were around them —
AI wouldn't replace them.
00:18:57,041 --> 00:19:00,041
And their great inventions
and thought processes.
00:19:00,250 --> 00:19:05,458
But what it could do is expand or 100 x
what they were able to output.
00:19:05,458 --> 00:19:06,958
Right? Right.
00:19:06,958 --> 00:19:07,250
Yeah.
00:19:07,250 --> 00:19:08,541
That's what I that's how I see it,
00:19:08,541 --> 00:19:10,916
because I don't think there's
a lot of concern about like,
00:19:12,625 --> 00:19:13,791
it like replacing
00:19:13,791 --> 00:19:18,208
all of human creativity,
which I don't think is at least right now,
00:19:18,333 --> 00:19:22,000
even in the realm of possibility,
like, I think artistic intelligence
00:19:22,000 --> 00:19:23,041
is what we need to look for.
00:19:23,041 --> 00:19:26,041
Not artificial intelligence.
00:19:26,083 --> 00:19:27,458
It doesn't replace us.
00:19:27,458 --> 00:19:29,666
It, done right, enhances us.
00:19:29,666 --> 00:19:33,291
So let's work
through the average person, because,
00:19:33,750 --> 00:19:37,833
you know, although it seems like it,
most people still probably don't use,
00:19:39,458 --> 00:19:40,500
generative AI.
00:19:40,500 --> 00:19:42,833
And AI,
00:19:42,833 --> 00:19:45,833
at a level that they should
or mostly probably still not at all.
00:19:46,000 --> 00:19:48,125
How do you introduce people
through the process?
00:19:48,125 --> 00:19:51,958
Like how like what do you do to make sure
you kind of get over that initial
00:19:51,958 --> 00:19:55,250
kind of psychological block that I'm sure,
especially if you're dealing
00:19:55,250 --> 00:19:59,000
with the black community in particular
that is there, the mistrust, the,
00:20:00,000 --> 00:20:02,791
the feeling of like,
oh, I'm not technical, so I can't do this.
00:20:02,791 --> 00:20:04,916
So how do you approach those opportunities
00:20:04,916 --> 00:20:07,916
and conversations when you, when you,
when you talk about AI?
00:20:08,250 --> 00:20:12,708
Yeah, I think that's probably why
my, keynotes and talks are popular
00:20:13,041 --> 00:20:16,708
because — what did the last
person who came up to me say?
00:20:16,708 --> 00:20:19,708
But thank you
for making AI less scary for me.
00:20:19,875 --> 00:20:22,041
I'm like, I like that.
00:20:22,041 --> 00:20:25,916
It's like when
people think about coding.
00:20:25,916 --> 00:20:27,625
I remember the first time my dad,
00:20:27,625 --> 00:20:29,416
like, moved to Seattle to be a software
engineer.
00:20:29,416 --> 00:20:31,166
I'm like, what is software?
00:20:31,166 --> 00:20:33,750
I'm like, what do you mean code?
00:20:33,750 --> 00:20:39,000
Like, I didn't understand how numbers
and screens like, just make things happen.
00:20:39,250 --> 00:20:42,416
Obviously I'm interacting with technology,
but how to create it?
00:20:43,041 --> 00:20:46,291
I could not connect those dots
for the life of me
00:20:46,666 --> 00:20:49,708
until I started working at it
and actually working at these companies,
00:20:50,000 --> 00:20:51,000
and then I can understand.
00:20:51,000 --> 00:20:53,500
And I think it's kind of the same for AI.
00:20:53,500 --> 00:20:57,750
It's happening, I see it,
but I don't understand how it's doing,
00:20:57,750 --> 00:20:58,791
what it's doing.
00:20:58,791 --> 00:21:01,916
And so I just break it down
like people say,
00:21:02,541 --> 00:21:05,791
that the number one
coding language in the next,
00:21:05,791 --> 00:21:08,791
I don't know, five,
ten years is going to be English.
00:21:09,083 --> 00:21:09,791
Right.
00:21:09,791 --> 00:21:13,541
And it's because algorithms — like, it's
00:21:13,666 --> 00:21:16,708
words, you're telling it with words.
00:21:16,708 --> 00:21:19,250
I break it down like it's a parent and a child.
00:21:19,250 --> 00:21:22,250
Like, AI is like a child.
00:21:22,500 --> 00:21:27,416
The algorithm is like a parent telling
the child, these are your core values.
00:21:27,416 --> 00:21:30,000
This is wrong. This is right.
00:21:30,000 --> 00:21:33,500
And then the kid is just looking at things
and like taking into the information,
00:21:33,500 --> 00:21:37,250
but it's really not organized because it's
a child, but it's just taking it all in.
00:21:37,583 --> 00:21:38,666
And it's up to you as a parent
00:21:38,666 --> 00:21:42,416
to guide the child
to grow up and do good in the world.
00:21:42,708 --> 00:21:45,958
And so that's kind of how I help
break it down for people.
00:21:45,958 --> 00:21:47,750
So it's like less scary.
00:21:47,750 --> 00:21:53,416
And with the whole thing of so AI is
the child and the algorithm is the parent.
00:21:54,291 --> 00:21:55,083
That's how I look at it.
00:21:55,083 --> 00:21:58,791
And how do you see the human interacting
with the algorithm with the AI?
00:21:59,791 --> 00:22:01,291
So do you mean
00:22:01,291 --> 00:22:05,375
like the child,
like interacting with the world around it?
00:22:05,375 --> 00:22:07,916
Like, or if it's just so like,
because if I'm hearing you right,
00:22:07,916 --> 00:22:09,875
you say the algorithm is guiding the AI.
00:22:09,875 --> 00:22:12,750
So I guess to rephrase the question,
because it feels like
00:22:12,750 --> 00:22:14,583
it feels like we're getting to that.
00:22:14,583 --> 00:22:17,708
We are guiding
the principles around the algorithm.
00:22:17,708 --> 00:22:18,583
Is, is that how you see it?
00:22:18,583 --> 00:22:22,833
And then that then guides the
AI as a child, right? Yes.
00:22:22,833 --> 00:22:25,291
And we have to remember that
we're in charge.
00:22:25,291 --> 00:22:29,333
So I always tell them when you start,
don't don't use AI blindly.
00:22:29,541 --> 00:22:32,541
Like don't go into it
and expect it to lead you somewhere.
00:22:32,833 --> 00:22:37,833
Allow it to help you get where you're
going or help you challenge yourself.
00:22:38,875 --> 00:22:39,666
And so what I,
00:22:39,666 --> 00:22:43,583
what I say is let AI inform you,
not perform you.
00:22:43,583 --> 00:22:45,625
Say it again.
00:22:45,625 --> 00:22:46,458
Say that again.
Listen, letting it inform you.
00:22:49,375 --> 00:22:52,375
Let AI inform you. Yes.
00:22:52,583 --> 00:22:54,708
Not perform. Yeah.
00:22:54,708 --> 00:22:57,041
There you go, I like that. Yeah.
00:22:57,041 --> 00:23:00,500
So that's how I try to, like,
make it more approachable for people.
00:23:00,500 --> 00:23:05,833
And then I show them the back end of,
like, these custom AI engines.
00:23:05,833 --> 00:23:09,083
And it's like, it's really not all that.
00:23:09,083 --> 00:23:11,541
Now when you build one from scratch,
you know, that's different.
00:23:11,541 --> 00:23:12,750
Yeah. Very different.
00:23:12,750 --> 00:23:16,916
But the average person can create their
own custom GPT and they don't realize it.
00:23:17,208 --> 00:23:17,458
Yeah.
00:23:17,458 --> 00:23:20,250
And people have to get that. You made,
00:23:20,250 --> 00:23:24,458
Like a lot of really great points
about the necessity of bringing the
00:23:24,625 --> 00:23:27,583
the authenticity of who you are
00:23:27,583 --> 00:23:30,416
and understanding who you are first.
00:23:30,416 --> 00:23:33,583
In a world where everything is
artificial, authenticity reigns supreme.
00:23:33,583 --> 00:23:38,333
So think about it — this is why, like, critical
00:23:38,333 --> 00:23:42,416
thinking, understanding principles
and developing your own set of creativity
00:23:43,458 --> 00:23:44,458
will not go away.
00:23:44,458 --> 00:23:45,583
They'll only become enhanced.
00:23:45,583 --> 00:23:49,458
But I think people will think that
they can turn off their brain and use AI.
00:23:49,500 --> 00:23:51,375
That's that's not going to work.
00:23:51,375 --> 00:23:54,666
And what AI could do is that, you know,
it can give you something
00:23:54,666 --> 00:23:56,333
where, if you don't guide it appropriately,
00:23:56,333 --> 00:23:58,291
You're not clear on what you're saying.
00:23:58,291 --> 00:24:00,458
And then you get, you know,
00:24:00,458 --> 00:24:03,958
on the most mild
example, you get a bad answer.
00:24:04,291 --> 00:24:04,833
Yeah.
00:24:04,833 --> 00:24:08,666
On a horrible example, you know,
you get something where it instructs
00:24:08,666 --> 00:24:12,583
a customer how
to get all their money back, or —
00:24:13,041 --> 00:24:16,416
But the car crashes, because you weren't
clear with how you guided the child.
00:24:17,750 --> 00:24:18,000
Yeah.
00:24:18,000 --> 00:24:21,000
So this is why your work is so important
I love to talk about.
00:24:21,083 --> 00:24:23,541
So we talked about the worst case
scenario. The best case scenario.
00:24:25,416 --> 00:24:25,791
When you're
00:24:25,791 --> 00:24:28,791
starting off you give people the back end
and you give them the support.
00:24:29,041 --> 00:24:33,125
How can we from a systemic level
make sure more people are getting
00:24:33,750 --> 00:24:36,208
the knowledge and are getting the access.
00:24:36,208 --> 00:24:37,041
That's the first question.
00:24:37,041 --> 00:24:38,791
The second question is
what's the one thing
00:24:38,791 --> 00:24:41,791
if you had to advise someone right now
that doesn't know anything about AI,
00:24:42,208 --> 00:24:45,208
what's the what's the one thing
they can do to really get started?
00:24:46,916 --> 00:24:49,791
So for your first question,
00:24:49,791 --> 00:24:53,291
I think that recognizing it
00:24:53,833 --> 00:24:57,416
from like a standard,
00:24:57,416 --> 00:25:01,583
like a, almost like a basic, bare
minimum standard
00:25:01,750 --> 00:25:05,458
to have an understanding of AI
00:25:05,791 --> 00:25:09,500
in some way should be like a requirement
to graduate high school.
00:25:10,458 --> 00:25:11,125
Yeah.
00:25:11,125 --> 00:25:16,041
Just because of how prevalent it is
in our day to day lives
00:25:16,041 --> 00:25:20,541
and in the future of work, in the future
of their work and their jobs potentially.
00:25:21,916 --> 00:25:24,416
I think there should be some
00:25:24,416 --> 00:25:30,208
requirement that they know what it is
and what it's doing, and how to use it,
00:25:30,625 --> 00:25:35,708
not promoting a specific product,
but just the literacy or understanding.
00:25:35,958 --> 00:25:38,708
And I know some school districts
in California already
00:25:38,708 --> 00:25:42,375
have those requirements to graduate
and some,
00:25:43,791 --> 00:25:46,583
private schools
already have prompt engineering classes.
00:25:46,583 --> 00:25:50,750
And it's like you're saying
systematically, I think if we're going
00:25:51,125 --> 00:25:55,791
that route, it needs to be a part of
like a typical curriculum,
00:25:57,875 --> 00:26:03,083
or at least a very robust lesson
in absolutely math class.
00:26:03,083 --> 00:26:04,166
You know, it's math.
00:26:04,166 --> 00:26:06,458
It's pretty much what it is. Math.
00:26:06,458 --> 00:26:09,458
So, like,
I would even put it somewhere in there,
00:26:10,000 --> 00:26:12,375
or like a social science class, like,
00:26:12,375 --> 00:26:15,833
if you're looking at it more towards,
like, a public school, those things.
00:26:15,833 --> 00:26:18,333
I mean, if you
I mean, to go to your point here — it's
00:26:18,333 --> 00:26:22,166
I think it should be infused
in, and, and arts
00:26:22,583 --> 00:26:26,750
infused in lessons. And think about it:
we have to figure out how we are
00:26:26,750 --> 00:26:28,125
now going to approach the world.
00:26:28,125 --> 00:26:29,875
Just like when the internet came about.
00:26:29,875 --> 00:26:33,333
We had to reorient
how everything was done.
00:26:33,750 --> 00:26:37,333
Communication, business payments.
00:26:38,250 --> 00:26:40,958
I mean, there's not anything
that this is not going to affect
00:26:40,958 --> 00:26:42,000
at a really large level.
00:26:42,000 --> 00:26:43,708
I mean, and I believe
00:26:43,708 --> 00:26:46,875
we're probably undervaluing it in terms
of how we're talking about it.
00:26:47,291 --> 00:26:49,375
So when I think about it
in terms of sense, you,
00:26:49,375 --> 00:26:52,375
I know you're focused a lot on the black
community too, just as I am to
00:26:52,750 --> 00:26:56,083
like our institutions need to move faster.
00:26:56,458 --> 00:26:59,375
If I was every HBCU right now,
00:26:59,375 --> 00:27:01,791
like you don't have
all the advantages in the world.
00:27:01,791 --> 00:27:02,875
Some people have more resources,
00:27:02,875 --> 00:27:06,291
obviously because of many reasons
structural, systematic.
00:27:06,291 --> 00:27:10,708
But like if I was right now,
I would be doing everything to adopt and
00:27:10,708 --> 00:27:14,750
run with how I could differentiate myself
within the AI movement.
00:27:14,750 --> 00:27:20,291
I mean, every HBCU should be hiring you —
yourself, Noel, and others — to help them scale,
00:27:20,291 --> 00:27:23,833
because this is something they could do
right now to differentiate themselves.
00:27:24,083 --> 00:27:25,791
But if you wait five years, it's too late.
00:27:25,791 --> 00:27:30,208
Right now would be the time to,
to really adopt and disrupt.
00:27:30,625 --> 00:27:32,333
Like their approach to education.
00:27:32,333 --> 00:27:35,416
So I just want to say — the point
you're making is a valid one. Like, this
00:27:35,416 --> 00:27:38,750
needs to be implemented
and thought about right now.
00:27:39,083 --> 00:27:41,208
But I think people are overwhelmed.
00:27:41,208 --> 00:27:44,125
So and then I want to get to a couple
of rapid fire questions and then we'll,
00:27:44,125 --> 00:27:44,750
we'll close out.
00:27:46,041 --> 00:27:48,041
I'm sure people are overwhelmed
when you breathe that brain,
00:27:48,041 --> 00:27:50,500
when you show
all the opportunities with AI,
00:27:50,500 --> 00:27:52,750
like when people get to you
and they just feel like they can't start.
00:27:52,750 --> 00:27:53,166
Like what?
00:27:53,166 --> 00:27:55,833
What do you what do you tell them?
00:27:55,833 --> 00:27:56,291
Yeah.
00:27:56,291 --> 00:28:00,541
So I mean, I just look at it like a
I try to tell people
00:28:00,541 --> 00:28:04,500
if you don't engage with the product,
you become the product.
00:28:05,125 --> 00:28:08,041
And it's not about
whether or not you're ready for AI.
00:28:08,041 --> 00:28:10,541
AI is already happening to you.
00:28:10,541 --> 00:28:16,041
It's a matter of you taking
the opportunity to educate yourself on it
00:28:16,333 --> 00:28:21,458
and make yourself a part of the narrative
and AI in your life.
00:28:21,958 --> 00:28:23,541
It's like, yeah, you got it.
00:28:23,541 --> 00:28:28,500
You got to actually, like, use it and take
advantage of it in the way that you can,
00:28:28,708 --> 00:28:33,958
because other people are
and they're getting a lot further ahead.
00:28:33,958 --> 00:28:35,625
And that tech divide
00:28:35,625 --> 00:28:39,750
is it's not just a matter of,
oh, I don't have a laptop or a computer.
00:28:39,750 --> 00:28:45,000
It's I'm not using AI technology at all
in my business or in my school
00:28:45,000 --> 00:28:47,458
or teaching my kids. And, that is like
00:28:48,458 --> 00:28:51,458
the tech divide is so much
00:28:51,625 --> 00:28:54,333
bigger, and widening so much faster —
00:28:54,333 --> 00:28:59,541
except that, at least, I've created
what I call a culture intervention,
00:29:00,000 --> 00:29:04,500
where I'm just creating a safe space
for black and brown communities
00:29:04,500 --> 00:29:08,666
to engage with the technology
in a way that is relatable.
00:29:09,333 --> 00:29:11,875
Significantly less harmful.
00:29:11,875 --> 00:29:15,291
And really,
one of the main things that I do is:
00:29:15,750 --> 00:29:18,750
it sources from black-owned
00:29:19,166 --> 00:29:24,000
content, black-authored content first,
which makes it a lot less biased
00:29:24,000 --> 00:29:27,000
because you're learning from the community
you're asking about.
00:29:27,291 --> 00:29:32,500
And, obviously
I teach it principles of like equity.
00:29:32,500 --> 00:29:37,375
And, it's also rooted in historical fact.
00:29:37,375 --> 00:29:39,291
So all the things that they're banning,
00:29:39,291 --> 00:29:42,583
all the classes that they're getting rid
of, critical race theory,
00:29:42,583 --> 00:29:45,750
like all of those things
that they want to pretend didn't happen,
00:29:46,791 --> 00:29:47,541
is like the heart
00:29:47,541 --> 00:29:50,916
and soul of the GPT that I've built.
00:29:51,416 --> 00:29:54,541
So that's why it performs so well.
00:29:55,791 --> 00:29:56,416
That makes sense.
00:29:56,416 --> 00:29:58,916
And I, I tell people often,
00:29:58,916 --> 00:30:02,541
you may not like the system,
but you can't completely opt out, right?
00:30:02,541 --> 00:30:05,250
Unless you're just not going
to have any power, any knowledge.
00:30:05,250 --> 00:30:06,708
You can't opt out of technology.
00:30:06,708 --> 00:30:09,958
You can't opt out of AI.
It's it's too late.
00:30:10,250 --> 00:30:11,541
It is too late.
00:30:11,541 --> 00:30:15,625
And the future is going to be determined
by those who are building.
00:30:15,625 --> 00:30:18,625
So you are like the new civil
rights movement leader.
00:30:18,625 --> 00:30:21,625
That's who it is — people
that are building in this space.
00:30:21,625 --> 00:30:25,208
Because, like you said,
the future is being written.
00:30:26,125 --> 00:30:30,458
Narratives are being written and, and,
and a lot of them are not accurate.
00:30:30,791 --> 00:30:32,750
And some of them are erasing history.
00:30:32,750 --> 00:30:35,500
So we have to be a part of building this.
00:30:35,500 --> 00:30:37,875
And it's also influencing
people like you said.
00:30:37,875 --> 00:30:41,208
So the only solution is for us to
to get up,
00:30:41,750 --> 00:30:45,541
get involved, become builders
and do the work that you're doing.
00:30:45,583 --> 00:30:48,000
Okay, a couple of rapid fire questions
I got for you.
00:30:48,000 --> 00:30:49,500
So these always trip people up.
00:30:49,500 --> 00:30:51,958
So sorry.
Not sorry. Let me give you a couple.
00:30:53,791 --> 00:30:55,625
What's an important truth you have that
00:30:55,625 --> 00:30:58,625
very few people agree with you on?
00:31:01,125 --> 00:31:05,041
Black people are uniquely equipped
to thrive
00:31:05,625 --> 00:31:10,583
in this new administration
because things are about to be unfair.
00:31:10,583 --> 00:31:13,208
But life's been unfair for us.
00:31:13,208 --> 00:31:15,833
This is transparent as a people.
00:31:15,833 --> 00:31:20,708
And it may not be fun,
it may not feel good, but you can thrive
00:31:20,708 --> 00:31:23,708
and you will thrive
if you allow yourself to.
00:31:24,750 --> 00:31:27,041
Especially as an entrepreneur.
00:31:27,041 --> 00:31:30,166
And I love it,
I love it, something I you know,
00:31:30,208 --> 00:31:33,208
I won't go too down into a rabbit
hole in this, but I'll say this,
00:31:33,416 --> 00:31:36,833
I think, you know, when we think
about a lot of these, diversity, inclusion
00:31:36,833 --> 00:31:39,833
efforts, I'm obviously
pro diversity, diversity, inclusion.
00:31:39,875 --> 00:31:40,166
Right.
00:31:40,166 --> 00:31:44,875
What they have been a lot, though,
is — a few people here at this level
00:31:44,875 --> 00:31:48,416
have been helped,
and we call it diversity and inclusion.
00:31:48,416 --> 00:31:49,875
But you look at the systemic,
00:31:49,875 --> 00:31:52,500
effort, it's not a lot of people
that end up getting contracts.
00:31:52,500 --> 00:31:54,291
It was a check the box approach.
00:31:54,291 --> 00:31:57,166
And so to that extent,
it really didn't work.
00:31:57,166 --> 00:32:00,041
Like, we can just be honest:
diversity and inclusion works.
00:32:00,041 --> 00:32:03,625
The programming of how it was done
was mostly for show.
00:32:03,625 --> 00:32:04,750
We can be real.
00:32:04,750 --> 00:32:07,750
And most of it didn't —
it didn't help a lot of black communities.
00:32:07,750 --> 00:32:10,750
And that's the reason
why it didn't have that pushback.
00:32:10,750 --> 00:32:13,500
Now the reasons they're doing it,
I don't agree with. Right.
00:32:13,500 --> 00:32:16,500
Because I think they're doing it
for reactionary reasons —
00:32:16,625 --> 00:32:18,083
and some of them are bigots.
00:32:18,083 --> 00:32:21,458
But, that's one of my truths
that people don't agree with.
00:32:21,500 --> 00:32:25,083
Like the diversity, inclusion
and how it was done was not effective,
00:32:25,375 --> 00:32:28,166
only helped a few people,
and mostly was gatekeeping.
00:32:28,166 --> 00:32:28,500
Right.
00:32:28,500 --> 00:32:32,583
Okay, yeah, I didn't
I don't think it was that helpful.
00:32:32,583 --> 00:32:35,583
And that's not from me
being on the outside — most of my work,
00:32:35,666 --> 00:32:40,416
And I don't put it on my profile,
but it was working in a DEI department.
00:32:40,583 --> 00:32:43,583
Yeah, it's a tolerance machine.
00:32:43,583 --> 00:32:44,666
That's right.
00:32:44,666 --> 00:32:48,375
And it's like, hey,
just a reminder,
00:32:48,791 --> 00:32:51,250
please be nice and a decent human being.
00:32:51,250 --> 00:32:55,291
And by the way, all you black people,
I really want you guys to support
00:32:55,291 --> 00:32:57,000
and lean on each other.
00:32:57,000 --> 00:33:00,000
So it's so much onus on us to take care of us,
00:33:00,250 --> 00:33:03,416
without actually giving any, like,
00:33:03,833 --> 00:33:08,583
actionable resources
to just be equalized?
00:33:08,958 --> 00:33:12,083
Yes. It's not about virtual DJs
and pizza parties.
00:33:12,291 --> 00:33:12,750
Okay.
00:33:12,750 --> 00:33:14,041
It's about
00:33:14,041 --> 00:33:18,375
how do I get a raise when she got a raise,
how do you not tone police me?
00:33:18,416 --> 00:33:23,250
How do you not talk about my hair
or ask me about,
00:33:23,666 --> 00:33:26,875
you know, problems
that have nothing to do with me?
00:33:26,875 --> 00:33:30,750
And overburden me with subject matter
that's not related to what
00:33:30,750 --> 00:33:32,541
I'm getting paid to do. Yeah,
00:33:33,750 --> 00:33:36,375
that's where the real equity is.
00:33:36,375 --> 00:33:39,083
Like we're still working on equal
pay for women. Yeah.
00:33:39,083 --> 00:33:40,791
So it's like the Dei.
00:33:40,791 --> 00:33:43,916
It was very much like a tolerance machine
00:33:44,208 --> 00:33:46,916
that people looked at as an inconvenience.
00:33:46,916 --> 00:33:49,791
And I think it perpetuated,
00:33:49,791 --> 00:33:52,291
rage because you got engineers
00:33:52,291 --> 00:33:55,625
over here working
and then you've got the black whatever.
00:33:55,708 --> 00:33:56,583
Right.
00:33:56,583 --> 00:34:00,333
They're not doing this work, or now
things are behind here
00:34:00,625 --> 00:34:03,583
and it's like, oh, well,
they get their day, blah, blah, blah.
00:34:03,583 --> 00:34:06,583
And it's kind of like — that's not it.
00:34:07,333 --> 00:34:09,791
Why do we always get
00:34:09,791 --> 00:34:15,166
put in a position where it looks like
we have to be entertained
00:34:15,666 --> 00:34:18,250
to stay employed? Oh, oh, please, please.
00:34:18,250 --> 00:34:21,166
Preaching to the choir — preaching to the choir,
oh my God. So, like, this is —
00:34:21,166 --> 00:34:23,791
And then as we get ready to wrap
up, I'll just say this is really
00:34:23,791 --> 00:34:27,208
one of my impetuses to start Midwest Con,
right?
00:34:27,208 --> 00:34:29,458
It's not that — and I have nothing against
00:34:30,625 --> 00:34:33,500
any of the other —
any other conferences' programming.
00:34:33,500 --> 00:34:35,291
I'm all for it. I got one coming.
00:34:35,291 --> 00:34:36,333
I got two coming on. They're good.
00:34:36,333 --> 00:34:40,291
But what I found is that
I wanted us to be in a space where, like,
00:34:40,291 --> 00:34:41,125
you know what —
00:34:41,125 --> 00:34:42,125
there was no "diversity and
00:34:42,125 --> 00:34:44,750
inclusion" within what I talk about.
00:34:44,750 --> 00:34:45,250
You know why?
00:34:45,250 --> 00:34:47,250
Because it's part of who we are
or we don't like.
00:34:47,250 --> 00:34:51,416
So then I make sure that when people come
to the stage that they are,
00:34:51,416 --> 00:34:52,666
they bring diverse perspectives.
00:34:52,666 --> 00:34:56,000
Because what has to happen is it has to be
part of the leadership and culture.
00:34:56,000 --> 00:34:58,041
It can't be seen as it has to be.
00:34:58,041 --> 00:35:02,625
It has to be integral to the organization,
not seen as a side piece,
00:35:02,625 --> 00:35:06,750
because then, it, it never really happens
and it becomes a distraction.
00:35:06,750 --> 00:35:11,458
And then, you know, we have what we have
now like, and I just think there was a,
00:35:11,500 --> 00:35:13,583
there's a better way
for us to move forward. And so,
00:35:15,083 --> 00:35:16,791
we definitely need diversity, inclusion.
00:35:16,791 --> 00:35:19,166
So I'll be clear on what I'm saying. Yeah.
00:35:19,166 --> 00:35:21,833
But how it's been done in
00:35:21,833 --> 00:35:25,500
the past has been more tokenizing and
gatekeeping, helping a few people.
00:35:25,750 --> 00:35:27,458
It has not been effective.
00:35:27,458 --> 00:35:30,583
And that's why we need to
now become builders in the space.
00:35:30,583 --> 00:35:33,833
It cannot be enough for us
just to say we have an inclusion program.
00:35:34,166 --> 00:35:37,416
We have to be builders
and a part of actually
00:35:37,416 --> 00:35:40,458
changing the narrative
and changing the outcome.
00:35:40,458 --> 00:35:43,458
And that's why, Erin,
I am so proud of you. And,
00:35:43,666 --> 00:35:46,333
to see the work that you're doing
and looking forward to,
00:35:46,333 --> 00:35:48,041
you know, figuring out ways
that we collaborate.
00:35:48,041 --> 00:35:50,750
What's the final piece
you'd like to leave folks with in terms of
00:35:50,750 --> 00:35:53,916
how do we build a better future together,
and what would you like people to take
00:35:53,916 --> 00:35:55,208
away with ChatBlackGPT?
00:35:56,750 --> 00:35:57,416
I just want
00:35:57,416 --> 00:36:00,416
it to be looked at as an opportunity
00:36:00,416 --> 00:36:05,958
to learn about black people
in black culture, with no consequences,
00:36:06,375 --> 00:36:11,875
and use generative AI with the same ease
as the people who created the ones
00:36:11,875 --> 00:36:14,875
that don't really work
for us all the time.
00:36:15,166 --> 00:36:20,458
So it's like, if you have a question about
why can't I touch a black person's hair?
00:36:20,458 --> 00:36:22,666
Why do people make everything about race?
00:36:22,666 --> 00:36:25,583
But you don't have to disrupt
00:36:25,583 --> 00:36:29,416
and burden a black person
with these emotionally charged
00:36:29,416 --> 00:36:33,291
questions — you can ask ChatBlackGPT
and give us a break.
00:36:33,541 --> 00:36:35,916
I think there's nothing
00:36:35,916 --> 00:36:39,625
like the let us mind our own business
and do our thing.
00:36:40,416 --> 00:36:42,750
And that is exactly what I'm focused on.
00:36:42,750 --> 00:36:47,000
And so like you said,
how about you and then black people,
00:36:47,333 --> 00:36:51,041
you know, it's actually going to help you
create business plans
00:36:51,041 --> 00:36:55,458
or strategies
with systemic oppression in mind.
00:36:55,458 --> 00:36:58,583
It's going to actually acknowledge like,
oh, you're
00:36:58,583 --> 00:37:02,333
you might run into these hardships
like remember to ask these questions
00:37:02,333 --> 00:37:04,416
when you're in this setting
because these are
00:37:04,416 --> 00:37:06,208
the things
that are going to be working against you.
00:37:06,208 --> 00:37:08,333
It's like a it's like a strategy.
00:37:08,333 --> 00:37:12,125
It's like somebody in your corner
who's like, hey, like, this is the world.
00:37:12,125 --> 00:37:14,458
Here's how you can navigate it with,
00:37:15,416 --> 00:37:17,916
street smarts, I don't know, street
smarts, essentially.
00:37:17,916 --> 00:37:21,541
So but but yeah, I want people to use it.
00:37:22,625 --> 00:37:24,166
I use it. I'm a fan.
00:37:24,166 --> 00:37:27,166
Oh, and I encourage everybody to use it.
00:37:27,375 --> 00:37:28,375
Of course I do that. Yeah.
00:37:28,375 --> 00:37:29,625
You sound like you're surprised.
00:37:29,625 --> 00:37:32,625
Like about it. Yeah.
00:37:32,625 --> 00:37:34,666
But it's, you know,
00:37:34,666 --> 00:37:39,083
like I said, we're going to continue
to build and work together. Erin Reddick —
00:37:39,500 --> 00:37:42,500
ChatBlackGPT —
make sure you check her out.
00:37:42,958 --> 00:37:47,000
If you need to book her
as a speaker —
00:37:47,000 --> 00:37:48,000
Highly recommended.
00:37:48,000 --> 00:37:51,000
She is very knowledgeable,
very entertaining,
00:37:51,750 --> 00:37:54,291
and a wonderful person overall
that is working to really make
00:37:54,291 --> 00:37:55,041
a difference in the world.
00:37:55,041 --> 00:37:57,750
I'm proud to call her a friend
and a fellow disruptor
00:37:57,750 --> 00:37:59,208
and looking forward
to working more with you.
00:37:59,208 --> 00:38:02,000
Aaron, it's been a pleasure.
Thank you so much for coming on.
00:38:02,000 --> 00:38:03,625
Thanks so much. All right.
00:38:03,625 --> 00:38:05,208
Hold on one second. There you go. Okay.
HOSTED BY
ROB RICHARDSON
LEARN about Erin Reddick's inspiring AI journey. She is the creator of ChatBlackGPTBeta, a platform advancing equity in AI. Erin shares how her unique upbringing as a Black woman adopted by a white father shaped her views on race, identity, and belonging. She discusses her transition from being laid off at Meta to becoming a trailblazer in ethical AI. The conversation dives deep into the challenges of promoting inclusivity in technology, the development of ChatBlackGPTBeta, and actionable advice for overcoming biases in AI and the workplace.
Top Three Things You Will Learn:
1. How Identity Shapes Innovation: Erin's personal experiences, from her upbringing to her career in tech, reveal how deeply identity and perspective influence the drive to create equitable solutions in AI.
2. Practical Steps for Ethical AI: This conversation provides insights into the challenges of addressing biases in AI systems and how tools like ChatBlackGPTBeta aim to promote anti-racism and inclusivity in technology.
3. Overcoming Bias in Professional Spaces: Strategies Erin used to navigate a male-dominated industry, ignore biased feedback, and foster positive, inclusive communication in the workplace.
ChatBlackGPT Official Page: https://chatblackgpt.com/
LinkedIn: https://www.linkedin.com/company/chatblackgpt/
Erin Reddick LinkedIn: https://www.linkedin.com/in/erinreddick/
DISRUPTION NOW LINKS:
Watch the episode: https://www.youtube.com/@Disruptionnow
Listen to the podcast episode: https://share.transistor.fm/s/6303505f/
CONNECT WITH THE HOST
ROB RICHARDSON
Entrepreneur & Keynote Speaker
Rob Richardson is the host of the Disruption Now Podcast and the owner of DN Media Agency, a full-service digital marketing and research company. He has appeared on MSNBC and America This Week, and is a weekly contributor to Roland Martin Unfiltered.
MORE WAYS TO WATCH
DISRUPTION NOW
Serious about change? Subscribe to our podcasts.