
00:00:00,133 --> 00:00:03,500
If you
can produce something that people want

00:00:03,900 --> 00:00:07,733
and sell it for more
than it cost for you to produce it.

00:00:07,966 --> 00:00:11,533
That is a magical thing
called capitalism.

00:00:11,566 --> 00:00:14,566
Yeah, no, I agree. It is very simple.

00:00:14,566 --> 00:00:16,466
If you believe
we can change the narrative.

00:00:16,466 --> 00:00:18,666
If you believe
we can change our communities.

00:00:18,666 --> 00:00:22,033
If you believe we can change the outcome
and we can change the world.

00:00:22,766 --> 00:00:24,300
I'm Rob Richardson.

00:00:24,300 --> 00:00:27,000
Welcome to Disruption Now.

00:00:27,000 --> 00:00:28,200
Welcome to Disruption Now.

00:00:28,200 --> 00:00:30,766
I'm your host and moderator,
Rob Richardson.

00:00:30,766 --> 00:00:35,066
As always, we like to bring stories
of those who are creating change

00:00:35,066 --> 00:00:38,633
and who are either building a business
or focused on social innovation.

00:00:38,733 --> 00:00:41,100
And my guest today is no different.

00:00:41,100 --> 00:00:42,066
Tremain Davis.

00:00:42,066 --> 00:00:45,066
We got a chance to meet during DC Startup
and Tech Week.

00:00:45,600 --> 00:00:48,200
We quickly became, really just,

00:00:48,200 --> 00:00:51,733
kindred spirits. We're both here
focused on making sure that

00:00:52,866 --> 00:00:55,866
everybody has access to technology
and all are empowered.

00:00:56,233 --> 00:00:58,466
Because it is the new economic
wave of freedom.

00:00:58,466 --> 00:00:59,800
We talk about it often.

00:00:59,800 --> 00:01:02,300
Digital literacy.

00:01:02,300 --> 00:01:03,366
You know, digital ownership.

00:01:03,366 --> 00:01:06,433
Those are the keys to the future
when it comes to creating the new civil

00:01:06,433 --> 00:01:07,333
rights movement.

00:01:07,333 --> 00:01:10,900
And Tremain has been at the forefront
of that really all of his life.

00:01:10,933 --> 00:01:13,800
So Tremain, good to have you on the show.
How are you doing, brother?

00:01:13,800 --> 00:01:15,733
Man, thank you so much for having me, Rob.

00:01:15,733 --> 00:01:18,266
This is incredible. I love your platform.

00:01:18,266 --> 00:01:20,500
Just excited to participate
in the conversation.

00:01:20,500 --> 00:01:23,500
And hopefully I can add value
in some kind of small way.

00:01:23,733 --> 00:01:25,600
All right, y'all. Well, yeah. For sure.

00:01:25,600 --> 00:01:28,600
So I know you're a multi-time
entrepreneur.

00:01:28,600 --> 00:01:30,500
You've been an entrepreneur
since the beginning.

00:01:32,266 --> 00:01:33,033
How did you know

00:01:33,033 --> 00:01:36,033
you wanted to be an entrepreneur
or what sparked this,

00:01:36,100 --> 00:01:39,333
this journey into entrepreneurship,
which is glorified

00:01:39,333 --> 00:01:42,633
but often not really understood
for what it takes to get here.

00:01:43,333 --> 00:01:45,300
Yeah. I mean, I love that part of it.

00:01:45,300 --> 00:01:46,700
I mean, glorified definitely.

00:01:46,700 --> 00:01:49,966
And I think the main thing
that did that was social media, you know,

00:01:49,966 --> 00:01:52,966
because a lot of people
can post the Lamborghinis and,

00:01:53,066 --> 00:01:58,100
you know, you see in a snapshot
what took somebody 15, 20 years to build.

00:01:58,800 --> 00:02:02,000
And you can't break down
20 years of development into,

00:02:02,366 --> 00:02:06,333
you know, a 30-second
meme or a 30-second post or Reel.

00:02:06,666 --> 00:02:08,366
And that's what happens on social media.

00:02:08,366 --> 00:02:11,633
So it gives people the illusion
that this is an easy journey, man.

00:02:11,666 --> 00:02:14,433
Entrepreneurship is hard. It is. Yeah.

00:02:14,433 --> 00:02:16,500
It is not an easy undertaking.

00:02:16,500 --> 00:02:19,066
I mean, you really have to sacrifice it
all. You have to.

00:02:19,066 --> 00:02:20,333
You have to go all in.

00:02:20,333 --> 00:02:23,400
There's no, you know, there's
no such thing as halfway crooks.

00:02:23,400 --> 00:02:24,333
You know who said that?

00:02:26,066 --> 00:02:26,933
Mobb Deep.

00:02:26,933 --> 00:02:29,433
That's what I'm saying, man. 1990s. Man.

00:02:29,433 --> 00:02:30,366
And that's a real thing.

00:02:30,366 --> 00:02:31,800
Y'all young...

00:02:31,800 --> 00:02:33,733
y'all young kids won't understand
that, and that's wrong for some of y'all.

00:02:33,733 --> 00:02:35,333
But they're real true

00:02:35,333 --> 00:02:38,600
legends, real true
hip hop legends for sure.

00:02:38,633 --> 00:02:41,633
Yeah. The story of hip
hop. Yep yep yep yep.

00:02:41,633 --> 00:02:44,433
I mean, so for me, you know,
I grew up in Southeast Washington

00:02:44,433 --> 00:02:47,900
DC, man, in the late
80s, mid 90s.

00:02:47,900 --> 00:02:49,633
And, it was not a great place.

00:02:49,633 --> 00:02:51,366
I mean, murder capital, I mean, it was

00:02:51,366 --> 00:02:55,566
it was one of those type of situations
where I was a dreamer, always have been.

00:02:55,566 --> 00:02:57,966
I'm not really sure what that came from,
but I always would

00:02:57,966 --> 00:03:00,200
look at my surroundings and say,
you know, what is this?

00:03:00,200 --> 00:03:00,900
How can I get out?

00:03:00,900 --> 00:03:04,133
Like, there's got to be more to this than
the streets that I'm walking around on?

00:03:04,200 --> 00:03:04,566
Absolutely.

00:03:04,566 --> 00:03:07,666
Like, that was always...
I was always curious.

00:03:07,666 --> 00:03:10,500
I was always wondering about
is there more?

00:03:10,500 --> 00:03:13,233
So that kind of was innate.

00:03:13,233 --> 00:03:16,233
You know, and
always wondering,

00:03:16,800 --> 00:03:19,100
what are you today, if you can tell me.

00:03:19,100 --> 00:03:21,800
Yeah. Is there any point
that you remember the spark?

00:03:21,800 --> 00:03:25,033
Was there any moment
or something that you knew that, okay,

00:03:25,400 --> 00:03:28,500
I'm going
to take control of my own life.

00:03:28,500 --> 00:03:30,566
I'm going to be in entrepreneurship.
If you can,

00:03:30,566 --> 00:03:32,400
Is there any moment
that, like, sticks out in your mind

00:03:32,400 --> 00:03:36,666
or you just kind of...
100%, I can name that moment, the exact moment.

00:03:36,666 --> 00:03:38,100
I'm in eighth grade.

00:03:38,100 --> 00:03:40,666
I was in eighth grade and I started,

00:03:41,833 --> 00:03:43,066
you know, in my school

00:03:43,066 --> 00:03:47,366
an illegal underground
candy business.

00:03:47,933 --> 00:03:51,633
And during that time
period, in my school,

00:03:51,633 --> 00:03:55,400
I also got my first job
scooping ice cream at Baskin-Robbins.

00:03:55,400 --> 00:03:57,533
I don't know if anybody remembers
Baskin-Robbins 31.

00:03:57,533 --> 00:03:58,466
Yeah, that's not a story.

00:03:58,466 --> 00:04:03,200
I think so, yeah, yeah,
yeah, that was my first actual job.

00:04:03,200 --> 00:04:06,000
And so I had this business.

00:04:06,000 --> 00:04:06,600
It was underground.

00:04:06,600 --> 00:04:07,733
The reason I say it was underground is

00:04:07,733 --> 00:04:09,933
because I went to like this
really strict private school

00:04:09,933 --> 00:04:13,500
and they didn't allow like outside candies
and these types of things.

00:04:13,500 --> 00:04:17,066
I had this business
where I'm, like, hand-to-hand taking orders

00:04:17,066 --> 00:04:20,933
in first period, delivering them by lunch
and taking more orders.

00:04:20,933 --> 00:04:23,766
I had runners, I'm doing
hand-to-hand at the water fountain.

00:04:23,766 --> 00:04:25,600
It was like it was insanity.

00:04:25,600 --> 00:04:27,366
Man, over a freaking candy business,

00:04:27,366 --> 00:04:31,400
because I'm the only provider on campus,
you know, like, I'm like getting popular.

00:04:31,433 --> 00:04:32,366
Get famous.

00:04:32,366 --> 00:04:35,866
You had a monopoly,
and the whole logistics was really...

00:04:35,966 --> 00:04:37,966
It was seriously bad. It was serious.

00:04:37,966 --> 00:04:40,966
The headmaster lady,
she didn't know who it was, so

00:04:40,966 --> 00:04:44,966
she threatened to expel whoever it was
when she found out who it was.

00:04:45,200 --> 00:04:48,000
And so that made me even more popular.

00:04:48,000 --> 00:04:50,366
I had
a couple of, like, the younger teachers.

00:04:50,366 --> 00:04:51,900
They found out that it was me,

00:04:51,900 --> 00:04:54,900
and they pulled me into their classrooms
after class and told me

00:04:54,900 --> 00:04:57,933
they found out it was me,
and I ended up putting them on payroll.

00:04:57,933 --> 00:04:59,500
I gave them $30 a week.

00:05:02,333 --> 00:05:05,366
I had two teachers on payroll.

00:05:05,366 --> 00:05:07,200
Man, they did not say anything.

00:05:07,200 --> 00:05:08,933
I had friends.

00:05:08,933 --> 00:05:10,900
It was growing so much.

00:05:10,900 --> 00:05:14,066
There was so much, across the whole campus, that I
couldn't keep all the candy in my locker.

00:05:14,233 --> 00:05:18,466
So I had to get friends to store candy
in their locker, and I paid them in candy.

00:05:18,600 --> 00:05:20,300
So what?

00:05:20,300 --> 00:05:22,033
Whatever... you are an entrepreneur.

00:05:22,033 --> 00:05:23,433
Oh my gosh. Oh.

00:05:23,433 --> 00:05:26,800
So I was doing this at school
and then I would leave school

00:05:26,800 --> 00:05:29,900
and go to this part
time job at Baskin-Robbins.

00:05:29,900 --> 00:05:31,933
And you know,
I don't know if you guys remember,

00:05:31,933 --> 00:05:34,966
if you're old enough, you remember they used
to put you two weeks in the hole.

00:05:35,266 --> 00:05:36,600
And what that meant was

00:05:36,600 --> 00:05:38,933
that you would work a job,
and then you had to work two weeks,

00:05:38,933 --> 00:05:40,700
and then you had to work a second,
two weeks,

00:05:40,700 --> 00:05:42,666
and they would give you
your first paycheck.

00:05:42,666 --> 00:05:46,333
And so you work for like over a month
before you got your first paycheck.

00:05:46,333 --> 00:05:48,566
Now, I know this is old school,
but that's how it was.

00:05:48,566 --> 00:05:49,566
So long story short,

00:05:49,566 --> 00:05:52,800
I'm doing my candy business
and I'm working this job out there.

00:05:52,800 --> 00:05:56,666
And I got my first check
after working like a slave

00:05:56,666 --> 00:05:59,700
after school at Baskin-Robbins
scooping ice cream.

00:05:59,866 --> 00:06:03,766
Now, I mean, I'm only 13 years old
or something like that.

00:06:03,766 --> 00:06:06,766
So, Rob, after all the hard work
I've been doing at Baskin-Robbins,

00:06:06,766 --> 00:06:09,633
I just knew my first check
was going to be like $1 million. Right?

00:06:09,633 --> 00:06:13,100
So, you know, like, oh, like,
this has gotta be

00:06:13,100 --> 00:06:15,333
like the come up of the century, man.

00:06:15,333 --> 00:06:19,200
When they handed me that first check...
I will never forget this number.

00:06:19,200 --> 00:06:22,833
It was $273.86.

00:06:23,500 --> 00:06:28,966
I had to have been working
for over a month for my $273.86.

00:06:29,466 --> 00:06:33,900
And Rob, I'm making that in like
three days in school with two teachers

00:06:33,900 --> 00:06:35,200
on payroll.

00:06:35,200 --> 00:06:38,533
I got two employees, I got runners,
you know what I'm saying?

00:06:38,833 --> 00:06:41,066
You know, my name is this man.

00:06:41,066 --> 00:06:45,933
And so, like, I got the check
from the dude that ran the Baskin-Robbins.

00:06:46,166 --> 00:06:47,033
He handed it to me,

00:06:47,033 --> 00:06:50,966
I opened it, I looked at it, I asked him,
I said, is this, like, serious?

00:06:50,966 --> 00:06:52,800
Like, this is... this is it?

00:06:52,800 --> 00:06:55,433
He was like, no, you know,
they take taxes out. I know you're young.

00:06:55,433 --> 00:06:57,666
You won't understand. He's
saying stuff like that.

00:06:57,666 --> 00:06:59,866
I was like, man,
I know one thing I do understand.

00:06:59,866 --> 00:07:02,400
I'm telling you, what I'm about
to tell you changed my life.

00:07:02,400 --> 00:07:06,600
I decided right there
that jobs are for suckers. Like, I did.

00:07:06,600 --> 00:07:09,200
Like I'm talking about eighth grade.
I said that as I left, out the...

00:07:09,200 --> 00:07:10,500
I left out the store.

00:07:10,500 --> 00:07:13,466
I quit, I told him
that's it for me. I'm not coming back.

00:07:13,466 --> 00:07:14,766
I took that check. I didn't come back.

00:07:14,766 --> 00:07:17,766
And when I walked out the store,
I said to myself, jobs are for suckers.

00:07:17,866 --> 00:07:18,666
And,

00:07:18,666 --> 00:07:22,366
because I'm like, I'm making this in 2
or 3 days at school doing my own thing.

00:07:22,366 --> 00:07:27,033
And I got dudes
that got degrees as teachers, you know,

00:07:27,033 --> 00:07:31,700
taking $30 a week from this snot-nosed kid
because I'm adding value in a way.

00:07:31,700 --> 00:07:34,233
And I couldn't articulate it like that
back then. Right, right.

00:07:34,233 --> 00:07:36,066
But just, you know, looking backwards

00:07:36,066 --> 00:07:39,066
because I'm adding value in a way
that nobody else on campus could.

00:07:39,466 --> 00:07:42,700
And I decided then,
I didn't know what business was.

00:07:42,700 --> 00:07:44,600
I didn't know, but I just knew right there

00:07:44,600 --> 00:07:48,400
at that moment that I was not going to be
a traditional job dude.

00:07:48,400 --> 00:07:50,166
Like I was not going to do it. Yeah.

00:07:50,166 --> 00:07:52,900
No, that's,
that's incredible. So,

00:07:53,933 --> 00:07:55,266
so like going to that.

00:07:55,266 --> 00:07:58,100
So from eighth grade
you're a born hustler.

00:07:58,100 --> 00:08:00,166
It was just there.

00:08:00,166 --> 00:08:02,900
Now take me to the point
of your full, kind of more mature

00:08:02,900 --> 00:08:05,900
entrepreneurial journey
and your first kind of step into that.

00:08:06,000 --> 00:08:06,466
And, you know,

00:08:06,466 --> 00:08:09,800
specifically think about whenever that
first kind of mature business is.

00:08:09,800 --> 00:08:11,300
I mean, with the candy business,

00:08:11,300 --> 00:08:13,966
you're more mature
than a lot of people ever get.

00:08:13,966 --> 00:08:14,900
You already have a

00:08:14,900 --> 00:08:16,533
you already have employees,
you've already figured out

00:08:16,533 --> 00:08:18,566
how to reduce your cost of goods.

00:08:18,566 --> 00:08:19,633
I mean, you did a lot.

00:08:19,633 --> 00:08:21,033
that I didn't even see. Right.

00:08:21,033 --> 00:08:22,500
But like

00:08:22,500 --> 00:08:25,500
but now let's take your kind of
first business venture, because I,

00:08:25,800 --> 00:08:27,700
I really want to talk about that.

00:08:27,700 --> 00:08:30,600
And then take yourself
back to that moment in time so you can.

00:08:30,600 --> 00:08:32,266
Yeah. So, what advice

00:08:32,266 --> 00:08:34,133
would you give your younger self, really quick?

00:08:34,133 --> 00:08:36,766
So, like, the question,
just like in a concise way.

00:08:37,866 --> 00:08:39,300
And then you can dive into the story.

00:08:39,300 --> 00:08:42,300
What advice
would you give your younger self

00:08:42,400 --> 00:08:44,700
and what advice would you ignore?

00:08:44,700 --> 00:08:45,200
Yeah.

00:08:45,200 --> 00:08:49,233
So the ignoring advice is easy,
which is you're stupid, don't do it.

00:08:49,233 --> 00:08:51,300
You know, to me, like a lot of that.

00:08:51,300 --> 00:08:53,233
And I was so young
when I started my first business.

00:08:53,233 --> 00:08:54,633
We'll get into that in a second.

00:08:54,633 --> 00:08:57,400
So I had a lot of naysayers,
and I think that just comes

00:08:57,400 --> 00:09:00,500
with the territory of doing things
that are counterintuitive

00:09:00,500 --> 00:09:02,800
or just not really something
that everybody else is doing.

00:09:02,800 --> 00:09:04,233
So you're going to get a lot of that.

00:09:04,233 --> 00:09:06,866
So that was the advice
that I would have ignored.

00:09:06,866 --> 00:09:09,900
But, you know, at the time
it was meaningful because there's people

00:09:09,900 --> 00:09:12,466
that you think care about you,
you know, these types of things.

00:09:12,466 --> 00:09:14,733
And they may too.
I want to say absolutely.

00:09:14,733 --> 00:09:16,400
And they probably do.

00:09:16,400 --> 00:09:17,533
But but oops.

00:09:17,533 --> 00:09:21,233
Yep. People giving you advice that haven't been
in business aren't the right mentors.

00:09:21,233 --> 00:09:23,133
But that doesn't mean that they're bad
people.

00:09:23,133 --> 00:09:24,466
It's just that like, you know,

00:09:24,466 --> 00:09:27,233
if I want to learn to play basketball,
I'm not going to talk to a hockey player.

00:09:27,233 --> 00:09:30,766
And, I guess,
if I want to learn to be an entrepreneur,

00:09:31,100 --> 00:09:34,200
I can't listen to somebody
that doesn't understand entrepreneurship.

00:09:34,200 --> 00:09:37,866
And unfortunately, it's been seen as this,
this thing we shouldn't

00:09:37,866 --> 00:09:41,133
do as a community, we should get a safe
job, get our education.

00:09:41,133 --> 00:09:42,333
And there's nothing wrong
with those things.

00:09:42,333 --> 00:09:45,000
Some of those things are passed down
over entrepreneurial thinking, by the way.

00:09:45,000 --> 00:09:45,900
Right? Right, right, right.

00:09:45,900 --> 00:09:48,666
Like we've gotten to this place where

00:09:48,666 --> 00:09:52,133
we've often been told, you know, that
that's not something you should do.

00:09:52,333 --> 00:09:55,366
You should do the safe thing in order to
really protect yourself and your family.

00:09:55,600 --> 00:09:57,733
Well, you know,
and then you got to go back.

00:09:57,733 --> 00:10:00,100
I mean, I started my first company
25 years ago.

00:10:00,100 --> 00:10:03,600
You talking about '99, 2000,
and that's where the gray's coming in.

00:10:03,666 --> 00:10:06,266
Yes, exactly. Exactly.

00:10:06,266 --> 00:10:07,700
Got that gray coming in now.

00:10:07,700 --> 00:10:10,700
But it wasn't even called entrepreneurship.
Like, that wasn't even a word.

00:10:11,233 --> 00:10:14,733
And you know, this was just this crazy
business stuff like, what are you doing?

00:10:15,333 --> 00:10:18,533
You know, the guys that wrote the book
on entrepreneurship, you know, the Steve

00:10:18,533 --> 00:10:22,700
Blanks, the Eric Rieses of the world,
you know, Clay Christensen,

00:10:22,800 --> 00:10:26,800
like they were writing the book
on entrepreneurship as I was doing it,

00:10:26,833 --> 00:10:29,800
you know. So there was no...
YouTube didn't exist.

00:10:29,800 --> 00:10:31,766
Like, you know,
you couldn't find this information.

00:10:31,766 --> 00:10:34,266
You were really just kind of
taking a chance.

00:10:34,266 --> 00:10:36,800
And people thought
that you were literally crazy.

00:10:36,800 --> 00:10:38,700
And in my situation, it was even crazier.

00:10:38,700 --> 00:10:42,900
So you fast forward like two years
after the crazy candy story,

00:10:43,233 --> 00:10:44,400
you know, I, I was,

00:10:44,400 --> 00:10:47,266
you know, just having been
blessed to be able to play basketball.

00:10:47,266 --> 00:10:51,100
So I was all state in Maryland,
all county, all of this stuff

00:10:51,100 --> 00:10:52,633
in my high school career.

00:10:52,633 --> 00:10:55,500
And if you can, you know, remember that era...

00:10:55,500 --> 00:10:58,500
I can remember before smartphones,

00:10:58,633 --> 00:11:03,066
in order to get your game footage
or to get recruited to play basketball,

00:11:03,066 --> 00:11:06,733
back in those days, you had to have an old
fashioned video recorder

00:11:06,933 --> 00:11:09,766
with a camcorder and a VHS tape,

00:11:09,766 --> 00:11:13,700
and you had to have somebody at the game
to tape you on these big old clunky tapes.

00:11:13,700 --> 00:11:17,866
And then you had to go get, you know,
copies of it made, make highlight reels, and

00:11:17,866 --> 00:11:21,900
snail mail them all across the country
and hope that a recruiter got back to you.

00:11:22,066 --> 00:11:24,233
So I went through that whole process
in high school,

00:11:24,233 --> 00:11:28,700
you know, again, before technology
was really a thing like today. And,

00:11:29,000 --> 00:11:32,633
So in 12th grade, I'm like, I'm
not getting the recruiting that

00:11:32,633 --> 00:11:33,866
I think I should get.

00:11:33,866 --> 00:11:35,533
I mean, I'm all state, all county,

00:11:35,533 --> 00:11:38,466
but I didn't have as big of a name
as some of the guys that I played against.

00:11:38,466 --> 00:11:41,233
They went to the NBA, like Keith Bogans and,

00:11:41,233 --> 00:11:44,900
Joe Forte and some of these guys
that had really good NBA careers.

00:11:44,900 --> 00:11:47,700
They played against me,
but I didn't have the name.

00:11:47,700 --> 00:11:50,600
And so,
I was just like, well, what do we do?

00:11:50,600 --> 00:11:54,066
And so I started thinking about this
in 12th grade, how to solve this problem.

00:11:54,366 --> 00:11:57,433
And so I actually got a small scholarship
to go

00:11:57,433 --> 00:12:00,533
for a full ride
at Southern Maryland and play basketball.

00:12:00,866 --> 00:12:04,133
And I went through there
my first year thinking about this idea.

00:12:04,400 --> 00:12:09,033
And then I went back my second year
and I came back a preseason All-American.

00:12:09,033 --> 00:12:11,033
I had all these offers
from around the country,

00:12:11,033 --> 00:12:15,366
and I broke my ankle like 3 or 4 games
into the season, in my second year,

00:12:15,600 --> 00:12:17,100
and I was out for the season.

00:12:17,100 --> 00:12:20,133
And so now you're still expected
to participate in practice

00:12:20,133 --> 00:12:21,400
and do all the things.

00:12:21,400 --> 00:12:23,033
I'm there to play basketball.

00:12:23,033 --> 00:12:24,666
And I got this idea in my brain.

00:12:24,666 --> 00:12:27,333
I couldn't take it anymore,
especially if I couldn't play ball.

00:12:27,333 --> 00:12:30,566
And so I dropped out of college
my second year on crutches

00:12:30,566 --> 00:12:32,333
and started my first tech company.

00:12:32,333 --> 00:12:36,233
And what the company
was, to give you a quick background,

00:12:36,233 --> 00:12:38,100
is... I don't know if you remember
Black Planet.

00:12:38,100 --> 00:12:39,933
That was like the great
I remember it, man.

00:12:39,933 --> 00:12:42,300
Yeah, I guess it was the first social media.

00:12:42,300 --> 00:12:43,900
Yeah, yeah, yeah. The first, yeah.

00:12:43,900 --> 00:12:44,100
Yeah.

00:12:44,100 --> 00:12:45,266
That's before Facebook

00:12:45,266 --> 00:12:49,433
and before... that's the great-great-great-
grandfather of Facebook.

00:12:49,466 --> 00:12:51,600
Like,
you know, Black Planet. Moving on.

00:12:52,900 --> 00:12:54,300
more than ever I look ahead. Yep.

00:12:54,300 --> 00:12:54,900
Yeah.

00:12:54,900 --> 00:12:57,566
When I first got to campus,
my whole thing was like,

00:12:57,566 --> 00:12:58,733
how do you solve this?

00:12:58,733 --> 00:13:02,066
this problem of recruitment
around the country for players,

00:13:02,233 --> 00:13:03,366
because I'm sure that

00:13:03,366 --> 00:13:07,000
if you're not a blue chip or like,
you're not the top player in the country,

00:13:07,633 --> 00:13:10,766
there's tons of second tier players,
I guess, such as myself,

00:13:10,966 --> 00:13:13,700
who are just as good
but just don't have the notoriety.

00:13:13,700 --> 00:13:14,966
How do you get them?

00:13:14,966 --> 00:13:18,033
And so I got on campus,
and the very first thing I did,

00:13:18,033 --> 00:13:20,033
my first year, first day on campus:

00:13:20,033 --> 00:13:22,400
One of my teammates
said, you got to get on Black Planet.

00:13:22,400 --> 00:13:23,433
Dude, I'm from the hood.

00:13:23,433 --> 00:13:26,100
We barely had a TV in the house,
let alone a computer.

00:13:26,100 --> 00:13:27,166
So, Black Planet...

00:13:27,166 --> 00:13:29,733
I don't know anything about computers,
email, nothing.

00:13:29,733 --> 00:13:31,766
And so he takes me into the library.

00:13:31,766 --> 00:13:34,800
He sets me up an email account,
and he gets me on Black Planet.

00:13:35,000 --> 00:13:36,933
Now, of course,
you know, it's about girls at the time.

00:13:36,933 --> 00:13:40,000
So he's like, you can talk to girls
all across the country.

00:13:40,000 --> 00:13:40,366
Yeah.

00:13:40,366 --> 00:13:43,200
Oh, Black Planet, you could use its
instant messenger and all this stuff.

00:13:43,200 --> 00:13:45,700
And so like, I was like, oh, really? Yes.

00:13:45,700 --> 00:13:46,833
I get on Black Planet

00:13:46,833 --> 00:13:51,266
and sure enough, you can talk to girls
from all the way around the country.

00:13:51,266 --> 00:13:54,266
Now I go straight
to being an entrepreneur.

00:13:54,400 --> 00:13:57,600
And I say, wait a minute, I should create

00:13:57,600 --> 00:14:00,600
Black Planet for sports recruitment.

00:14:00,800 --> 00:14:01,966
And that was it for me.

00:14:01,966 --> 00:14:05,866
That was the epiphany I got the first day
I was on campus to play basketball.

00:14:05,866 --> 00:14:08,866
What if there was a black planet
for sports recruitment, where

00:14:08,966 --> 00:14:14,100
college coaches can see players
and game film anywhere in the country?

00:14:14,300 --> 00:14:16,833
And that,
that then solved the recruitment problem.

00:14:16,833 --> 00:14:18,833
And so I had that idea.

00:14:18,833 --> 00:14:21,833
I had the idea in 12th grade,
but it didn't crystallize

00:14:21,866 --> 00:14:22,833
in high school. Right.

00:14:22,833 --> 00:14:25,600
Once I saw Black Planet,
I immediately said

00:14:25,600 --> 00:14:28,500
this is the way to do this idea
I've had for a year.

00:14:28,500 --> 00:14:30,833
Then I played ball a whole year,
and then I got,

00:14:30,833 --> 00:14:34,800
like I said, I got hurt my second year
and I dropped out

00:14:34,800 --> 00:14:38,000
and I started what I called c
athlete.com.

00:14:38,000 --> 00:14:39,966
And right, what we were doing,

00:14:39,966 --> 00:14:42,966
we were creating a Black Planet
for sports recruitment.

00:14:43,133 --> 00:14:46,366
That was my thing. So, going
down that road there.

00:14:46,366 --> 00:14:48,933
So you started this black tech...

00:14:48,933 --> 00:14:50,400
Sorry, not black tech, but this black,

00:14:51,466 --> 00:14:54,466
Black Planet for recruitment.

00:14:54,700 --> 00:14:56,033
Now, did you know how to code?

00:14:56,033 --> 00:14:58,500
Did you know...? I've never been a techie,
like, ever.

00:14:58,500 --> 00:14:59,333
Never been a techie.

00:14:59,333 --> 00:15:02,066
You know, so walk people
through that process, right?

00:15:02,066 --> 00:15:02,600
So yeah.

00:15:02,600 --> 00:15:04,466
Yeah, yeah. It's, like, right now...

00:15:04,466 --> 00:15:08,066
it's never easy,
but it's much easier now than it was.

00:15:08,400 --> 00:15:10,500
back then. Like, you can
pop up a website.

00:15:10,500 --> 00:15:13,500
If you really put in 3 or 4 hours,
you can figure it out.

00:15:13,500 --> 00:15:17,766
But I...
I never even had a computer growing up.

00:15:18,266 --> 00:15:20,366
I mean, how do you go about this? Like,
you're out here...

00:15:20,366 --> 00:15:23,366
Like, where do you start your company?

00:15:23,400 --> 00:15:26,333
Yeah. So here's where it started.

00:15:26,333 --> 00:15:30,000
So, you know, listen,
the internet is just coming out.

00:15:30,000 --> 00:15:34,066
I mean, AOL America
Online is just getting big, you know, so

00:15:34,066 --> 00:15:35,233
all this old tech,

00:15:35,233 --> 00:15:37,600
you know, where it was back
then 25 years ago

00:15:37,600 --> 00:15:39,600
is not what it is today
first and foremost.

00:15:39,600 --> 00:15:42,133
So the first thing I had to think about
was building a team.

00:15:42,133 --> 00:15:44,333
So if there's any entrepreneurs out there,

00:15:44,333 --> 00:15:46,333
what I would say
is lean into your skill sets.

00:15:46,333 --> 00:15:49,666
And what I'm really good at is
the stuff we're doing right now, right?

00:15:49,700 --> 00:15:51,500
Which is relating to people,

00:15:51,500 --> 00:15:54,900
creating relationships,
you know, finding ways and strategies on

00:15:54,900 --> 00:15:57,933
how to add value,
how to build teams and relate to people,

00:15:57,966 --> 00:16:00,233
interpersonal relationships?
I'm good at that stuff.

00:16:00,233 --> 00:16:02,700
And so the first thing I say
when I have this epiphany

00:16:02,700 --> 00:16:05,533
that I want to do this
idea, is that I need to know somebody.

00:16:05,533 --> 00:16:07,433
Now, back then,
it wasn't like how to code.

00:16:07,433 --> 00:16:09,366
It was like, I need to know
somebody who knows computers.

00:16:10,433 --> 00:16:13,200
Because there's no there's no smartphones,
there's no apps.

00:16:13,200 --> 00:16:14,800
Everything was web based back then.

00:16:14,800 --> 00:16:17,466
So we had to create a website,
you know, type of thing.

00:16:17,466 --> 00:16:21,033
And so, I started asking around, and,

00:16:21,033 --> 00:16:24,300
and this guy I know, he's like, oh,
my nephew is good with computers.

00:16:24,466 --> 00:16:26,800
And he was like a couple years
older than me. I'm 19.

00:16:26,800 --> 00:16:28,366
He's like 22, 23. Right.

00:16:28,366 --> 00:16:30,000
And so I told him about my idea.

00:16:30,000 --> 00:16:31,333
He's like, oh yeah, I'll help you with that.

00:16:31,333 --> 00:16:34,866
And he was just graduating from,
I don't know, Virginia Tech or one of them.

00:16:35,200 --> 00:16:35,400
Right.

00:16:35,400 --> 00:16:39,733
And we together
went out to try to build this company.

00:16:39,733 --> 00:16:42,833
We raised a couple of dollars
on friends and family, a couple of angels,

00:16:43,366 --> 00:16:47,300
put some money in, and,
I mean, the solutions were dirty, Rob.

00:16:47,300 --> 00:16:48,933
I mean, these were dirty solutions.

00:16:48,933 --> 00:16:50,833
Trying to get to this. Yeah.

00:16:50,833 --> 00:16:52,800
Thing
that we were going to take to market.

00:16:52,800 --> 00:16:55,633
And, you know, just to give people

00:16:55,633 --> 00:16:59,566
a little bit of, context about what
we discovered throughout this process,

00:16:59,566 --> 00:17:04,266
is what we really were doing,
and we didn't know this.

00:17:04,266 --> 00:17:06,700
And this is another thing
I always advise entrepreneurs

00:17:06,700 --> 00:17:10,233
at this stage of the game, you know,
looking back on it is to understand

00:17:10,233 --> 00:17:13,433
the macro environment, the bigger market
that you're trying to go into.

00:17:13,433 --> 00:17:16,900
Because our idea was that we wanted to solve
sports recruitment;

00:17:17,066 --> 00:17:20,366
what we were really doing was solving
streaming video.

00:17:20,400 --> 00:17:24,566
Now we were out six years
before YouTube came out,

00:17:25,000 --> 00:17:30,633
and the whole idea was that we wanted
to stream 30 seconds of game film,

00:17:31,433 --> 00:17:34,000
you know, onto our platform
so that any college

00:17:34,000 --> 00:17:37,000
coach or recruiter can see,
you know, from around the country.

00:17:37,166 --> 00:17:41,666
And so we kept bumping up,
bumping our heads up against two things.

00:17:41,666 --> 00:17:43,000
One is, how do you,

00:17:44,133 --> 00:17:47,333
yeah, because, I mean, we were still just
coming off a dial up internet.

00:17:47,333 --> 00:17:49,633
So how do you stream in that environment?

00:17:49,633 --> 00:17:53,666
And the second thing, this is
a little-known fact, was storage.

00:17:53,666 --> 00:17:56,666
How do you store,
you know. Oh, yeah. Yeah.

00:17:57,066 --> 00:17:58,333
There was no.

00:17:58,333 --> 00:17:59,233
Oh, my gosh.

00:17:59,233 --> 00:18:01,633
I mean, you could buy a computer,
a whole computer back then.

00:18:01,633 --> 00:18:03,900
It would still come, from a storage
standpoint, with

00:18:03,900 --> 00:18:05,300
You basically had to buy servers. Right.

00:18:05,300 --> 00:18:07,200
This was the only way
I can see you doing it, man.

00:18:07,200 --> 00:18:09,500
I mean,
we were buying these little computer boxes

00:18:09,500 --> 00:18:11,100
that were still coming in megs

00:18:11,100 --> 00:18:14,700
and hooking a network of them together
like a real data cloud storage.

00:18:14,700 --> 00:18:16,966
But, you know... a dirty solution, man.

00:18:16,966 --> 00:18:19,000
And like, gigs didn't even exist.

00:18:19,000 --> 00:18:21,233
You know,
you talk about two gigs, three gigs,

00:18:21,233 --> 00:18:24,233
you know, your iPhone,
you buy it, it's got 16 gigs on it

00:18:24,333 --> 00:18:28,266
that did not exist back then in one place,
especially for like a,

00:18:28,633 --> 00:18:30,000
you know, a startup, you know, like,

00:18:30,000 --> 00:18:32,133
but back then you didn't have access
to that type of stuff.

00:18:32,133 --> 00:18:35,766
And so like the way
we were trying to solve it was very,

00:18:35,766 --> 00:18:38,733
very, very, very entrepreneurial.
Let me put it that way.

00:18:38,733 --> 00:18:39,466
Yeah. Yeah.

00:18:39,466 --> 00:18:42,033
But those were the two things
that we were really solving.

00:18:42,033 --> 00:18:44,100
But we didn't know that.
You know what I mean?

00:18:44,100 --> 00:18:46,066
We didn't know we were trying to solve
streaming.

00:18:46,066 --> 00:18:50,500
I mean, would you, like,
going back... I sometimes think back,

00:18:50,933 --> 00:18:54,533
and maybe you would, but, like, sometimes
I think even with Mark Zuckerberg it's this:

00:18:55,200 --> 00:18:57,633
an idea is not fully formed
until you start doing it.

00:18:57,633 --> 00:18:59,266
So, like, it's very.

00:18:59,266 --> 00:19:00,500
I think you can try to see.

00:19:00,500 --> 00:19:05,900
And you probably have more of the ability
to see the forest

00:19:05,900 --> 00:19:10,100
for the trees now than you were able to, like,
because... it's building that muscle.

00:19:10,100 --> 00:19:10,833
Yeah.

00:19:10,833 --> 00:19:13,500
Even when you have some of that muscle
built, it's

00:19:13,500 --> 00:19:16,866
just... you can start one concept
and not see it till...

00:19:17,066 --> 00:19:20,066
Yeah, you start to really kind of dive
into it, but you kind of always like.

00:19:20,266 --> 00:19:23,300
So now you've learned that lesson in terms
of seeing things.

00:19:23,300 --> 00:19:25,133
And I want to talk a little bit
about AI in a second

00:19:25,133 --> 00:19:28,133
and related conversations,
because I think there's

00:19:28,400 --> 00:19:31,400
there's a good point to lean on here,
in the fact that you were...

00:19:31,933 --> 00:19:35,566
you had this transitional moment
at the beginning of the internet,

00:19:36,200 --> 00:19:38,333
and now I feel like
we're in another kind of stage.

00:19:38,333 --> 00:19:39,633
Oh, right. Right.

00:19:41,000 --> 00:19:43,000
And,
and I'm sure there are lessons

00:19:43,000 --> 00:19:45,000
that you can draw
from being at that earlier stage.

00:19:45,000 --> 00:19:47,233
I like to tell folks that there's no

00:19:47,233 --> 00:19:50,933
there's no wasted activity
when you're doing something productive.

00:19:50,933 --> 00:19:52,033
Oh, yes. Yeah.

00:19:52,033 --> 00:19:55,033
Even if you quote unquote fail,

00:19:55,300 --> 00:19:58,966
if you actually learn
and don't give up like these things

00:19:58,966 --> 00:20:02,066
become applicable in the future
so all of that,

00:20:03,066 --> 00:20:06,733
all that muscle tissue
that you built up in your brain,

00:20:07,000 --> 00:20:10,766
those synapses that were created
from learning how to do the candy store

00:20:10,766 --> 00:20:17,066
to learning how to then build an internet
company from the ground up,

00:20:17,700 --> 00:20:20,900
the old-fashioned way, learning
how to think

00:20:20,900 --> 00:20:24,633
is something that will never,
that will never go away.

00:20:24,633 --> 00:20:28,133
It is something that is definitely
needed. Learning how to create value,

00:20:28,133 --> 00:20:31,133
understanding vision and articulating
that are very important.

00:20:31,266 --> 00:20:33,500
Now we're moving to the AI age.

00:20:33,500 --> 00:20:34,800
Not moving. We are in it.

00:20:34,800 --> 00:20:35,366
Yeah.

00:20:35,366 --> 00:20:38,633
I'm just curious to see, do
you see any comparisons to the beginning

00:20:38,633 --> 00:20:41,633
of the internet in the AI movement,
and what lessons do you think?

00:20:41,966 --> 00:20:43,633
You can take from,

00:20:43,633 --> 00:20:47,100
us being seasoned enough to be at all
the stages of the internet

00:20:47,100 --> 00:20:48,633
to be at this current stage? Like what?

00:20:48,633 --> 00:20:51,833
What's the most important lesson
you think you can grab from the past

00:20:51,833 --> 00:20:53,466
that's applicable to the present?

00:20:53,466 --> 00:20:57,666
Man. Listen, you know,
one thing that I like to tell people is

00:20:57,666 --> 00:21:03,166
that we're not going to see another, like,
period of time like this for another

00:21:03,166 --> 00:21:05,433
150 years,
like I'm going to be dead and gone

00:21:05,433 --> 00:21:07,266
and my grandkids
are going to be experiencing

00:21:07,266 --> 00:21:10,233
the next boom that's happening,
like it's happening right now.

00:21:10,233 --> 00:21:13,333
And the fact that we saw two of them
so close

00:21:13,333 --> 00:21:16,766
together, we'll never see that again
in our lifetime.

00:21:16,766 --> 00:21:18,433
So the internet happened.

00:21:18,433 --> 00:21:22,433
And like I participated in that,
like we were building a company right as

00:21:22,633 --> 00:21:25,266
the stuff that we use today was being built.
Like, we were out

00:21:25,266 --> 00:21:27,766
trying to sell streaming video,
although we didn't know it.

00:21:27,766 --> 00:21:31,800
That's what we were working on six years
before YouTube, five years

00:21:31,800 --> 00:21:34,800
before YouTube,
which is ubiquitous now on the planet.

00:21:34,800 --> 00:21:35,933
You know what I mean?

00:21:35,933 --> 00:21:41,566
Mark Cuban got his first billions
from AudioNet. Which is to say,

00:21:41,866 --> 00:21:45,333
you know, some of the richest
people in the world

00:21:45,700 --> 00:21:48,566
were working on the same stuff
that we were working on at the time,

00:21:48,566 --> 00:21:51,200
you know what I mean? And so, like,
they just had more capital.

00:21:51,200 --> 00:21:52,766
Access to capital.

00:21:52,766 --> 00:21:55,666
Well, I mean, Mark, Mark Cuban
was probably way smarter than me too.

00:21:55,666 --> 00:21:56,400
I will say that.

00:21:57,400 --> 00:21:57,533
Yeah.

00:21:57,533 --> 00:21:58,000
Maybe not.

00:21:58,000 --> 00:22:02,300
I mean, I don't know,
I don't know. Like, I think, smart...

00:22:02,400 --> 00:22:04,200
there's a lot of smart people. Right.
Yeah.

00:22:04,200 --> 00:22:06,766
Right. Right. It's like access. Access.

00:22:06,766 --> 00:22:08,033
Networks and opportunities.

00:22:08,033 --> 00:22:09,466
It mattered. Now...

00:22:09,466 --> 00:22:12,566
That's one thing I will say
is that I didn't have access to networks.

00:22:12,566 --> 00:22:13,233
I didn't I mean,

00:22:13,233 --> 00:22:17,633
I started literally from the gutter
when it comes to where I am today,

00:22:18,066 --> 00:22:21,366
because I didn't have access to networks,
resources, capital, like, nothing,

00:22:21,366 --> 00:22:23,600
you know,
and I had to figure it all out on my own.

00:22:23,600 --> 00:22:27,433
But to answer your question, like,
there was a time when the internet was

00:22:27,433 --> 00:22:32,100
being born, like 95 to 2005 with YouTube
and all of that stuff.

00:22:32,400 --> 00:22:35,133
You know, 2007 is where Facebook came out.

00:22:35,133 --> 00:22:39,833
Right after that, the iPhone came out,
you know, so there's like a 15 year gap

00:22:39,833 --> 00:22:43,766
from like 95, 96 to like 2010 where

00:22:44,100 --> 00:22:47,666
like everything that we use today
was invented, you know what I'm saying?

00:22:47,900 --> 00:22:49,000
And like that

00:22:49,000 --> 00:22:53,233
or even if it was already invented, I'm
talking about it being commercialized.

00:22:53,233 --> 00:22:56,366
That happened in like this 15 year gap.

00:22:56,366 --> 00:23:00,700
Now we're seeing that same opportunity
happen right now.

00:23:00,700 --> 00:23:03,133
Where the next ten years of

00:23:03,133 --> 00:23:05,866
AI development is going
to shape the next hundred.

00:23:06,900 --> 00:23:07,866
And during that

00:23:07,866 --> 00:23:11,466
15 years, there were so many billionaires,

00:23:11,733 --> 00:23:16,400
hundred-millionaires, multimillionaires
born because they either

00:23:16,433 --> 00:23:19,500
invested in the technology that supported

00:23:19,500 --> 00:23:22,500
or came out of the internet.

00:23:22,700 --> 00:23:26,400
They founded
something in or around that space. Yes.

00:23:26,400 --> 00:23:27,466
A lot of dead bodies

00:23:27,466 --> 00:23:31,133
and a lot of, you know, skeletons
from that time period as well.

00:23:31,533 --> 00:23:36,966
But it was such an era of innovation
because once the internet,

00:23:37,300 --> 00:23:38,833
you know, problem was solved

00:23:38,833 --> 00:23:42,733
and became ubiquitous, now
you can start doing things like streaming.

00:23:42,900 --> 00:23:46,800
Now you can start, I mean, that's
what made the iPhone hit the way.

00:23:46,900 --> 00:23:50,600
it did. It converged all of these technologies
at the perfect time.

00:23:50,933 --> 00:23:53,166
And like,
that's what's happening right now.

00:23:53,166 --> 00:23:56,766
And I'm just telling people,
just on a bullhorn

00:23:56,766 --> 00:24:01,233
from the top of the mountain, screaming
as loud as I can: get involved,

00:24:01,233 --> 00:24:04,266
like the next wave of billionaires
is going to happen

00:24:04,500 --> 00:24:08,433
because of where we are with AI.
It looks... where we are with AI

00:24:08,700 --> 00:24:12,600
looks just like it did back in '97
when the internet was first starting.

00:24:12,666 --> 00:24:15,666
Yeah. You know, it's just like it.

00:24:15,900 --> 00:24:17,000
No, I, I completely agree.

00:24:17,000 --> 00:24:18,500
It looks, it looks it looks a lot like it.

00:24:18,500 --> 00:24:22,333
I, I tell people there are
and now I think about it, there might actually

00:24:22,333 --> 00:24:26,466
be five stages to the digital,
you know, digital revolution.

00:24:26,466 --> 00:24:27,466
There was the,

00:24:27,466 --> 00:24:28,400
there was the, there was a,

00:24:28,400 --> 00:24:31,166
there was just a read version,
which was the first stage

00:24:31,166 --> 00:24:33,433
of the internet,
when people just put up sites. Right.

00:24:33,433 --> 00:24:36,933
Then there was the centralization version,
when it was read and write. Right, that's

00:24:36,933 --> 00:24:41,633
when social media, Airbnb,
brought centralization. Yep.

00:24:42,300 --> 00:24:45,433
We're still in this phase of,
we still might say

00:24:45,433 --> 00:24:48,633
centralization, but there's a third stage
in terms of ownership of data.

00:24:48,633 --> 00:24:49,133
There's this.

00:24:49,133 --> 00:24:50,400
So that's the Web3 component.

00:24:50,400 --> 00:24:53,400
People think about blockchain
owning your data,

00:24:53,700 --> 00:24:56,000
authenticating your data
like no one thought about.

00:24:56,000 --> 00:24:57,166
A lot of people didn't, but I knew it was important.

00:24:57,166 --> 00:25:00,500
And that's where I was stuck on, or that's
where I was, before I got into

00:25:01,033 --> 00:25:02,800
AI and data. Right, right.

00:25:02,800 --> 00:25:06,000
But the two are related
because now the fourth stage

00:25:06,000 --> 00:25:09,000
is we're on AI, which is the manipulation,

00:25:09,866 --> 00:25:14,300
the mass distribution,
the creation of data. Yep.

00:25:14,300 --> 00:25:16,166
Just from having some data. Right.

00:25:16,166 --> 00:25:19,000
And so like and then there's a fifth point
that, you know, we'll,

00:25:19,000 --> 00:25:21,700
we'll get into on a lot of the podcast,
which is quantum computing. Right.

00:25:21,700 --> 00:25:25,900
So, like, all of those added together
is going to really recreate

00:25:25,900 --> 00:25:29,866
how we interact in the world,
how we tell stories, how we,

00:25:30,566 --> 00:25:32,000
how we experience the world.

00:25:33,000 --> 00:25:34,900
But there
are some fundamentals

00:25:34,900 --> 00:25:37,500
one still has to have, like... at least
I believe that. Right.

00:25:37,500 --> 00:25:39,566
So I think your experience does you

00:25:39,566 --> 00:25:43,366
Well, because I had a conversation earlier
with,

00:25:43,666 --> 00:25:47,466
with a leadership coach and she said
there's a lot of applications,

00:25:47,600 --> 00:25:51,600
a lot of people putting applications out
for these jobs using AI.

00:25:51,866 --> 00:25:54,466
The problem is they're not qualified for
these jobs they're applying for.

00:25:54,466 --> 00:25:58,866
So they're, like, flooding everything.
And understand, like, we need to use AI.

00:25:59,233 --> 00:26:02,966
But I told this to
my son. I was telling him

00:26:02,966 --> 00:26:04,033
about financial literacy.

00:26:04,033 --> 00:26:06,333
I think we had this conversation
before. Right?

00:26:06,333 --> 00:26:08,933
And he's like, well, you know,
I don't need to know all this.

00:26:08,933 --> 00:26:11,600
I can just look it up. I said, well,
you can't look up how to think.

00:26:13,100 --> 00:26:15,300
Or how to solve a problem.

00:26:15,300 --> 00:26:16,133
Yeah, exactly.

00:26:16,133 --> 00:26:19,100
Those are things
you got to develop the muscle for. Yes.

00:26:19,100 --> 00:26:20,166
The grind for it.

00:26:20,166 --> 00:26:20,666
Yeah.

00:26:20,666 --> 00:26:22,333
So let's talk about AI, everybody.

00:26:22,333 --> 00:26:24,633
Because everybody says get into AI.

00:26:24,633 --> 00:26:26,000
How would you advise someone to do that?

00:26:26,000 --> 00:26:27,466
What does that look like?

00:26:27,466 --> 00:26:29,100
Yeah.
So it's a couple of different things.

00:26:29,100 --> 00:26:31,800
Like, everybody's not going to build,
you know what I mean,

00:26:31,800 --> 00:26:34,100
their own, you know, large language model.

00:26:34,100 --> 00:26:36,066
Like everybody's not going to do that
you know.

00:26:36,066 --> 00:26:38,200
But I think the first thing
that I would say

00:26:38,200 --> 00:26:42,000
is to just calibrate yourself
to what's happening right now, you know,

00:26:42,000 --> 00:26:46,300
I think just understand
where we are in the market,

00:26:46,300 --> 00:26:50,700
like where we are as a human race.
There are people that have no clue.

00:26:50,700 --> 00:26:53,700
And so I just tell people
to just first take a second

00:26:53,766 --> 00:26:58,133
and just think through like, hey,
wait a minute, this thing is happening.

00:26:58,133 --> 00:27:01,133
You know,
AI is not something to be scared about.

00:27:01,266 --> 00:27:04,566
AI is not something
that you need to be fearful about.

00:27:04,600 --> 00:27:06,733
You know, it's going to take my job.
Oh, no.

00:27:06,733 --> 00:27:10,500
You know, you can't approach it that way
because, you know,

00:27:10,500 --> 00:27:12,100
you have no control over stuff
you don't control.

00:27:12,100 --> 00:27:14,166
Absolutely, absolutely.

00:27:14,166 --> 00:27:16,200
You know, you got to get in the game.

00:27:16,200 --> 00:27:20,100
And I think so many people are just
sitting on the sideline just for fear

00:27:20,333 --> 00:27:24,166
or just not understanding
exactly what's going on.

00:27:24,166 --> 00:27:26,800
So that's the first thing
just you can get on a podcast.

00:27:26,800 --> 00:27:28,733
I know you got tons of

00:27:28,733 --> 00:27:31,733
content around
what this thing is. Absolutely.

00:27:31,766 --> 00:27:35,400
I mean, it's it's free on YouTube,
free on podcasts like this,

00:27:35,600 --> 00:27:40,366
like just listen to a 5 or 10 minute,
you know, piece of content.

00:27:40,400 --> 00:27:44,566
Short form doesn't have to be a long,
you know, class at Harvard,

00:27:44,833 --> 00:27:48,500
although that's available
for free as well, just to understand

00:27:48,500 --> 00:27:50,200
what's happening and where we are.

00:27:50,200 --> 00:27:54,433
The second thing I would say
is that AI is not, like, new.

00:27:54,466 --> 00:27:57,233
Like, this is... you already use it.

00:27:57,233 --> 00:28:00,100
Like, people don't realize it. They already use

00:28:01,633 --> 00:28:04,766
Alexa
and all of these things that we use every day.

00:28:04,766 --> 00:28:10,133
Oh, Siri, turn my water on. Alexa, turn
on my favorite music or my favorite song.

00:28:10,266 --> 00:28:15,000
Right, so you just... Absolutely.

00:28:15,000 --> 00:28:20,600
So we've been engaging with it
already in the mainstream, you know?

00:28:20,833 --> 00:28:25,033
So just think about it like this:
AI is used

00:28:25,366 --> 00:28:29,066
to bring your favorite movies
to the forefront on Netflix.

00:28:29,200 --> 00:28:32,533
That's why Netflix is always suggesting
your favorite shows

00:28:32,700 --> 00:28:36,033
and shows it thinks you'd like,
because it's using this technology,

00:28:36,233 --> 00:28:40,400
And Netflix is not even...
I mean, it's barely a 20 year old company.

00:28:40,400 --> 00:28:41,300
You see what I'm saying?

00:28:41,300 --> 00:28:45,166
So that's
how fast this technology is moving, too.

00:28:45,300 --> 00:28:48,766
And that would be my third point
is that it's moving way

00:28:48,766 --> 00:28:52,233
faster than it was 25 years ago
when I first got started.

00:28:52,500 --> 00:28:56,333
And I think right now...
I'm not going to say the scary part,

00:28:56,333 --> 00:29:00,400
but the
part that's way more interesting to me

00:29:00,400 --> 00:29:02,300
about the whole development piece of it

00:29:02,300 --> 00:29:06,233
is that you can develop
stuff and go to scale so quickly.

00:29:06,233 --> 00:29:09,000
Now, you couldn't do that 25 years ago.

00:29:09,000 --> 00:29:10,300
You had to build small,

00:29:10,300 --> 00:29:13,933
then you had to get a few clients,
a few customers, and work your way up.

00:29:14,233 --> 00:29:16,833
But because the technology
is developing so fast

00:29:16,833 --> 00:29:20,900
and you have access to so many people,
you can build a technology.

00:29:20,900 --> 00:29:25,666
I mean, ChatGPT had 300 million users
or something like in the first year.

00:29:25,800 --> 00:29:29,333
It was something insane, you know,
but they were given $1 billion or whatever

00:29:29,366 --> 00:29:30,433
they were. They were

00:29:31,933 --> 00:29:32,266
I mean,

00:29:32,266 --> 00:29:35,966
even with $1 million, 25 years ago,
you wouldn't be able to.

00:29:36,266 --> 00:29:38,500
No, no,
you wouldn't be able to scale that size.

00:29:38,500 --> 00:29:40,433
You wouldn't be even with $1 billion.

00:29:40,433 --> 00:29:43,066
You wouldn't...
the infrastructure just wasn't there.

00:29:43,066 --> 00:29:46,200
But now that the infrastructure is there
and you talked about that,

00:29:46,466 --> 00:29:49,200
you can scale so quickly
with this technology,

00:29:49,200 --> 00:29:53,700
which means that it's going to leave
certain populations.

00:29:53,700 --> 00:29:58,800
And this is where I have a real soapbox:
populations of people behind,

00:29:58,800 --> 00:30:03,133
especially black and brown communities,
you know, other more diverse communities

00:30:03,133 --> 00:30:08,066
that aren't thinking about the future
of AI... the future, not even just AI,

00:30:08,100 --> 00:30:12,566
the future of just technology in general
are not thinking meaningfully about it.

00:30:12,566 --> 00:30:15,566
You know, it'll really leave us
and our communities

00:30:15,600 --> 00:30:19,400
in the dust, and, like, that's
something that I'm really passionate about

00:30:19,600 --> 00:30:22,533
because I, you know,
I think equity is a big, big,

00:30:22,533 --> 00:30:25,966
big risk when it comes to
how fast this technology is moving.

00:30:26,033 --> 00:30:26,233
Yeah.

00:30:26,233 --> 00:30:30,100
Because it's not a consideration for
the technology.

00:30:30,100 --> 00:30:31,433
Like as we talk about building

00:30:31,433 --> 00:30:34,833
responsible AI,
we often talk about that at Disruption Now

00:30:34,833 --> 00:30:39,166
and at our signature conference, Midwest
Con, when we talk about responsible AI.

00:30:39,466 --> 00:30:42,966
People talk about responsible AI
all the time.

00:30:42,966 --> 00:30:45,200
And, I kind of equate it like this.

00:30:45,200 --> 00:30:48,066
I say, responsible AI is the new DEI.

00:30:48,066 --> 00:30:49,633
What do I mean by that?

00:30:49,633 --> 00:30:51,433
This is what I mean. Okay,

00:30:52,600 --> 00:30:56,033
a lot of people say it and don't have any
real meaning when they actually say this.

00:30:56,033 --> 00:30:59,033
It is just something that they say to be,

00:30:59,466 --> 00:31:02,466
to be thought of, to say,
oh, we're thinking about it.

00:31:03,000 --> 00:31:05,566
Lots of people are not thinking about it,
and they're not, because

00:31:05,566 --> 00:31:08,733
the whole infrastructure
has been built to be inequitable.

00:31:09,500 --> 00:31:13,000
But, like,
what I tell Fortune 500 companies,

00:31:13,000 --> 00:31:15,066
the big players in this space

00:31:15,066 --> 00:31:18,366
is that you want to do this because you're
going to need to build trust.

00:31:18,833 --> 00:31:21,866
And if you have a whole population
that becomes

00:31:21,866 --> 00:31:25,700
so mistrustful,
we can go backwards on innovation.

00:31:25,700 --> 00:31:29,166
So it's in your interest
to actually do this in a way that,

00:31:29,666 --> 00:31:32,833
that actually is more accessible,
that is transparent.

00:31:33,033 --> 00:31:35,100
Yeah. Now,
I don't know that they're going to do it.

00:31:35,100 --> 00:31:36,100
I think they're going to do it.

00:31:36,100 --> 00:31:38,800
But like I think we gotta push forward
and do everything we can.

00:31:38,800 --> 00:31:39,666
But I do think that's

00:31:39,666 --> 00:31:41,800
that's the conversation
I have with them when we talk about this.

00:31:41,800 --> 00:31:42,666
Well, you know what?

00:31:42,666 --> 00:31:46,933
And I think we had a brief
part of our discussion,

00:31:47,333 --> 00:31:49,966
you know, leading up to this podcast
about this.

00:31:49,966 --> 00:31:53,700
I don't know if that is the right

00:31:55,266 --> 00:31:56,033
you know, I don't know if

00:31:56,033 --> 00:32:00,033
that's the right approach,
because the way you change behaviors,

00:32:00,033 --> 00:32:03,866
especially in such a capitalist society
like America and just, you know, basically

00:32:04,133 --> 00:32:07,133
all of our allies,
which is most of the world,

00:32:08,000 --> 00:32:12,200
is there has to be a substantive
business model that drives the change.

00:32:12,200 --> 00:32:15,233
And so, I don't know,
and this is just, you know, my

00:32:15,500 --> 00:32:18,633
opinion. And this is me just saying,
you ain't got to agree, I don't know.

00:32:18,633 --> 00:32:19,233
No, no, no, no.

00:32:19,233 --> 00:32:20,733
But I'm just saying
I don't know if there's

00:32:20,733 --> 00:32:24,766
a business model to support equitable
AI, you see what I'm saying?

00:32:24,766 --> 00:32:25,766
Like I don't know. Yes.

00:32:25,766 --> 00:32:27,000
Because I mean, even if you just

00:32:27,000 --> 00:32:30,066
think about the technology
as it, as it is today, you know,

00:32:30,066 --> 00:32:33,366
especially the tech that's kind of
becoming ubiquitous and everybody's using

00:32:33,633 --> 00:32:37,500
it was built by a very small,
you know, set of people,

00:32:37,800 --> 00:32:41,833
you know, and those that are now
trying to get into AI,

00:32:42,066 --> 00:32:47,000
meaning building out their own form of it,
their own LLMs and things like that.

00:32:47,866 --> 00:32:51,466
One thing that they're having a problem
with is how do you commercialize?

00:32:51,633 --> 00:32:55,466
What's the business
model behind it? You can't just get into AI,

00:32:55,500 --> 00:32:56,300
you know what I'm saying?

00:32:56,300 --> 00:32:59,733
Like, it's great that we're doing it.
It's great to be in it.

00:32:59,866 --> 00:33:03,466
But what these larger
firms, Google and all of these,

00:33:03,700 --> 00:33:08,133
what they have mastered
is the business model behind it.

00:33:08,133 --> 00:33:11,133
And that's why
they're able to drive the narrative.

00:33:11,233 --> 00:33:13,866
There are a lot of people
that are getting into it. Yeah.

00:33:13,866 --> 00:33:17,066
But I think one conversation
that we need to have as

00:33:17,066 --> 00:33:20,300
builders is
how do you commercialize the AI?

00:33:20,500 --> 00:33:24,566
Because there isn't a substantive business
model that's driving innovation.

00:33:24,700 --> 00:33:29,533
Listen, everybody talks about innovation,
but there is no money in innovation.

00:33:29,533 --> 00:33:31,400
The innovation is in replication.

00:33:31,400 --> 00:33:34,100
I mean, the money is in replication.
There's no money in innovation.

00:33:34,100 --> 00:33:35,733
The money is in replication.

00:33:35,733 --> 00:33:39,500
How do you take something
that you innovated on and then make it

00:33:39,500 --> 00:33:41,200
so that a billion people can use it?

00:33:41,200 --> 00:33:44,100
That's where the money is,
and that's what Google does.

00:33:44,100 --> 00:33:46,133
Well, that's what Tesla does.

00:33:46,133 --> 00:33:49,033
Well, you know, Netflix,
that's what they do.

00:33:49,033 --> 00:33:50,766
It's not just the innovation.

00:33:50,766 --> 00:33:54,733
You have to wrap that around
a business model that can now add value

00:33:54,733 --> 00:33:56,100
to billions of people.

00:33:56,100 --> 00:33:57,700
And I guess because I'm an entrepreneur,

00:33:57,700 --> 00:34:01,633
and, to your point, have been in this
space, you know, for over 25 years,

00:34:01,800 --> 00:34:06,933
you know, I can see the real issue,
the real issue is not necessarily equity.

00:34:06,933 --> 00:34:09,566
The real issue is not necessarily
innovation.

00:34:09,566 --> 00:34:12,266
All those things are there.
It is commercialization.

00:34:12,266 --> 00:34:14,066
Like that's what moves the needle.

00:34:14,066 --> 00:34:17,066
And these larger companies are the ones
who do that really, really well.

00:34:17,666 --> 00:34:18,000
Yeah.

00:34:18,000 --> 00:34:21,600
And I think, you know, I would have
a different perspective on that.

00:34:21,600 --> 00:34:23,166
I think yes, they obviously do it.

00:34:23,166 --> 00:34:25,833
Well, I mean, you can't argue with that,

00:34:25,833 --> 00:34:27,666
you can't argue with a
trillion-dollar company,

00:34:28,733 --> 00:34:31,233
kind of speaks for itself in terms of,

00:34:31,233 --> 00:34:35,666
I will say, though, what makes this
different from every iteration of

00:34:36,666 --> 00:34:40,000
entrepreneurship, of innovation, whatever

00:34:40,266 --> 00:34:43,166
adjective we want to use to describe
what we're talking about here,

00:34:43,166 --> 00:34:49,000
is that with AI, data is derivative.

00:34:49,000 --> 00:34:49,966
It's building on itself.

00:34:49,966 --> 00:34:53,700
So it's much easier
to create digital dictatorships,

00:34:54,566 --> 00:34:58,400
and for folks to be able to control data.

00:34:58,400 --> 00:35:02,900
So the question is, I believe,

00:35:04,233 --> 00:35:07,500
it's going to be extremely important
to figure out what this looks like.

00:35:07,500 --> 00:35:12,166
And I think there will be a model,
because what's also a model is,

00:35:12,500 --> 00:35:15,500
if there's a business
that, at its core,

00:35:16,033 --> 00:35:18,666
also has a mission that is about solving

00:35:18,666 --> 00:35:22,233
some type of pain point or problem
or rallying the community.

00:35:22,233 --> 00:35:23,266
Right.

00:35:23,266 --> 00:35:26,400
So this is my thesis and this is my theory.

00:35:27,133 --> 00:35:30,133
And again,
we'll see how it plays out.

00:35:30,600 --> 00:35:33,533
No one can predict the future,
but I hope to try to envision it.

00:35:33,533 --> 00:35:37,200
So I believe that, you know,
a lot of the inequalities

00:35:37,200 --> 00:35:41,633
that have happened over time,
particularly in the digital realm,

00:35:42,000 --> 00:35:45,000
those things have never been
adequately addressed.

00:35:45,600 --> 00:35:49,166
From the input of the data,
to the infrastructure,

00:35:49,200 --> 00:35:53,433
to where people actually have access
to the internet, all those things.

00:35:53,433 --> 00:35:54,600
Right.

00:35:54,600 --> 00:35:58,166
Information is more democratized today
than it has ever been.

00:35:58,166 --> 00:36:01,166
But at the same time,
inequalities are starting to stretch out

00:36:01,300 --> 00:36:03,433
larger than they have ever been.

00:36:03,433 --> 00:36:06,133
And I think the risk there comes
down to a few things.

00:36:06,133 --> 00:36:10,200
So when I talk about equity, there's
also your point, which I completely agree with.

00:36:10,200 --> 00:36:12,066
It's about making sure
people have the knowledge.

00:36:12,066 --> 00:36:13,266
They go out there and learn.

00:36:13,266 --> 00:36:15,766
We have complete agreement
there. Go learn.

00:36:15,766 --> 00:36:18,666
I tell people to learn
by doing. The same thing

00:36:18,666 --> 00:36:21,666
I say when I discuss blockchain
applies to AI.

00:36:22,400 --> 00:36:27,600
Go to ChatGPT, learn some, just
go on there and just see how it applies.

00:36:27,600 --> 00:36:29,166
Play around with it.

00:36:29,166 --> 00:36:29,666
It's free.

00:36:29,666 --> 00:36:30,700
Go to perplexity.

00:36:30,700 --> 00:36:33,600
Go to NotebookLM.

00:36:33,600 --> 00:36:37,633
That's another thing that I use
for this podcast preparation today

00:36:37,866 --> 00:36:42,566
that can help you actually prepare,
get notes, and learn.

00:36:43,133 --> 00:36:47,166
I like to say, I don't think AI should be
a total replacement for people.

00:36:47,166 --> 00:36:51,300
And I think it will be just like with food.
The business model in creating

00:36:51,300 --> 00:36:55,700
food used to be
to just mass produce it.

00:36:55,800 --> 00:36:57,766
Don't worry about the quality of it.

00:36:57,766 --> 00:37:00,500
Shoot it out as fast as you can
and mass produce it.

00:37:00,500 --> 00:37:05,066
And that was the model
for the first 30, 40, 50 years of food.

00:37:05,066 --> 00:37:07,333
But then people started saying, well,
what's in this?

00:37:07,333 --> 00:37:09,600
And then people wanted to have organic.

00:37:09,600 --> 00:37:11,600
They wanted transparency.

00:37:11,600 --> 00:37:14,300
So I believe that's going to happen
with the type of art

00:37:14,300 --> 00:37:15,500
and things that we create.

00:37:15,500 --> 00:37:17,200
I believe
people are going to want to have a balance,

00:37:17,200 --> 00:37:18,666
with humans involved.

00:37:18,666 --> 00:37:23,733
I still think we need to work with AI.
I am pro-AI, I'm just pro-authenticity.

00:37:23,733 --> 00:37:29,200
I'm pro-transparency and I'm pro-ethics.
And I do think there will be a

00:37:29,566 --> 00:37:32,800
market for that, though not
in the short term, I agree.

00:37:32,800 --> 00:37:33,766
Is that what you'd say?

00:37:33,766 --> 00:37:36,300
So probably what's going to happen is
we're going to rush to that.

00:37:36,300 --> 00:37:40,033
And I believe that there's
going to be errors because of that.

00:37:40,333 --> 00:37:43,166
And that's going to create openings
in the market.

00:37:43,166 --> 00:37:47,000
And there will always be people that want
to, do this in a responsible way.

00:37:47,000 --> 00:37:50,266
So I do think there will be opportunities
for that space,

00:37:50,266 --> 00:37:51,933
because so many others are just saying,

00:37:51,933 --> 00:37:55,700
how can we rush without any type
of guidance to what we're doing with AI?

00:37:56,100 --> 00:37:59,100
No. Listen,
you might have misunderstood.

00:37:59,100 --> 00:38:01,633
I don't think you're capturing it.

00:38:01,633 --> 00:38:03,066
I'm not disagreeing with you.

00:38:03,066 --> 00:38:04,766
I agree with you in this.

00:38:04,766 --> 00:38:09,033
Oh, no, no, I hear you,
but I'm saying, no, I do agree with you.

00:38:09,233 --> 00:38:13,733
My thing is just all of the elements
that you just said, which are important.

00:38:13,733 --> 00:38:16,133
I'm just adding something to it.

00:38:16,133 --> 00:38:20,466
I think in order for us to really move
the needle, we have to figure out

00:38:20,466 --> 00:38:21,166
how we can take

00:38:21,166 --> 00:38:25,200
all the stuff that you just said
and create a business model with it.

00:38:25,200 --> 00:38:27,000
You see what I'm saying?
That's the only thing I'm adding.

00:38:28,033 --> 00:38:29,900
And I completely agree. If you can do that,

00:38:29,900 --> 00:38:33,833
then it becomes something that can grow,
you know.

00:38:33,833 --> 00:38:39,800
Like, business is the most magical,
you know, mechanism on the planet.

00:38:39,800 --> 00:38:41,700
It's the only way to grow anything.

00:38:41,700 --> 00:38:47,300
I mean, if you can produce
something that people want and sell it

00:38:47,300 --> 00:38:50,333
for more than it
cost for you to produce it,

00:38:50,566 --> 00:38:54,133
that is a magical thing
called capitalism.

00:38:54,166 --> 00:38:56,000
Yeah, no, I agree, it is very simple.

00:38:56,000 --> 00:39:00,133
And so, like, if you can put equity,
if you can put these things

00:39:00,400 --> 00:39:05,233
in that wheelhouse, now it becomes a fuel,
an engine for growth,

00:39:05,233 --> 00:39:09,700
because now, somebody
like Google has to say, wait a minute.

00:39:09,900 --> 00:39:12,000
These guys are focused on ethics.

00:39:12,000 --> 00:39:14,866
These guys are focused on things
like compliance.

00:39:14,866 --> 00:39:17,866
These guys are focused
on empowering the global South.

00:39:17,933 --> 00:39:20,800
These guys are focused
on these types of things.

00:39:20,800 --> 00:39:23,500
And wait a minute, they got $1 billion.

00:39:23,500 --> 00:39:26,800
Their valuation is what?
You know, how did they do that?

00:39:26,800 --> 00:39:30,100
That's when it becomes something
00:39:30,100 --> 00:39:33,466
that initiates change.
And I agree.

00:39:33,500 --> 00:39:33,900
Yeah.

00:39:33,900 --> 00:39:35,366
You know what I'm saying

00:39:35,366 --> 00:39:38,300
I would say one point, then
I want to get to infrastructure.

00:39:38,300 --> 00:39:41,266
Yeah I would say yes.

00:39:41,266 --> 00:39:42,500
But okay.

00:39:44,200 --> 00:39:46,866
There is a point where, and
this is where I believe in policy.

00:39:46,866 --> 00:39:47,133
Right.

00:39:47,133 --> 00:39:50,133
So I go back to, yeah,
what you talk about a lot.

00:39:50,366 --> 00:39:52,700
Yeah. So let's talk about
the Industrial Revolution. Right.

00:39:52,700 --> 00:39:54,000
And there's no

00:39:54,000 --> 00:39:57,300
doubt that, over the course of time,
that's created more value.

00:39:57,300 --> 00:39:59,100
It's how we got here.

00:39:59,100 --> 00:40:02,700
But a lot of shit happened between

00:40:03,133 --> 00:40:06,433
then and where we got to
in the 60s, 70s, and 80s.

00:40:06,433 --> 00:40:09,433
Right. We had two world wars. Okay.

00:40:09,766 --> 00:40:10,733
People remember this, right?

00:40:10,733 --> 00:40:13,733
We had a massive

00:40:14,633 --> 00:40:17,200
redefinition of what it was like to work.

00:40:17,200 --> 00:40:19,200
People
actually got rights in the workplace.

00:40:19,200 --> 00:40:23,100
That's a foreign concept
to most of the world 99% of the time.

00:40:23,100 --> 00:40:23,400
Right.

00:40:23,400 --> 00:40:26,000
That allowed us the freedom to have more
entrepreneurs and things like that.

00:40:26,000 --> 00:40:28,833
Having a middle class. Yeah,
those things we take for granted.

00:40:28,833 --> 00:40:32,400
But those are new freaking concepts
from the last couple hundred years for us.

00:40:32,766 --> 00:40:36,000
That is not the traditional way
people think about things, right?

00:40:36,733 --> 00:40:39,666
And so, like,
what I think we're going to have to do

00:40:39,666 --> 00:40:42,966
is redefine how we interact

00:40:42,966 --> 00:40:45,233
and work with each other.
What does policy look like?

00:40:45,233 --> 00:40:47,866
And I'm not saying I have the answer
completely.

00:40:47,866 --> 00:40:50,766
I just know, and I'm very confident,
that I understand

00:40:50,766 --> 00:40:54,533
enough about algorithms in
AI to know that they're so complex,

00:40:54,533 --> 00:40:58,466
no one has any full
understanding of how they work.

00:40:59,033 --> 00:41:00,566
Hard stop. Nobody.

00:41:00,566 --> 00:41:03,100
Yeah, right. Right right, right. Okay.

00:41:03,100 --> 00:41:06,233
So yeah,
but although we can't fully understand

00:41:06,233 --> 00:41:09,233
or control systems,
we can dance with them.

00:41:09,600 --> 00:41:11,800
As I said,
although we can't predict the future,

00:41:11,800 --> 00:41:14,300
we can envision
what type of future we want.

00:41:14,300 --> 00:41:18,133
So what I say is, we can
use this technology in ways

00:41:18,133 --> 00:41:23,000
that solve almost all of our issues,
or amplify a lot of our issues.

00:41:23,166 --> 00:41:25,933
We can use it to inform people more,
or we can use it

00:41:25,933 --> 00:41:28,100
how social media is using it
now, which is,

00:41:28,100 --> 00:41:30,500
we can figure out how to fight over
nonsense

00:41:30,500 --> 00:41:33,366
and let our brains be hacked,
because we're creating alternate

00:41:33,366 --> 00:41:36,366
realities that aren't factual
with what's actually happening.

00:41:36,733 --> 00:41:40,966
So like, we have to think about what type
of future we want for humanity, right?

00:41:40,966 --> 00:41:42,900
And that has to be centered in what we do.

00:41:42,900 --> 00:41:46,866
So I think how you create that business
model is through awareness and eventually

00:41:47,200 --> 00:41:48,133
through policy.

00:41:49,200 --> 00:41:52,066
And then you create some level of standard.

00:41:52,066 --> 00:41:53,700
There will be no perfection. Right.

00:41:53,700 --> 00:41:57,333
But what we will do is set, like,
what do we want to have happen?

00:41:58,400 --> 00:42:00,433
What's it look like?

00:42:00,433 --> 00:42:02,000
What's the purpose in building this?

00:42:02,000 --> 00:42:05,133
And also we might want
to ask the question, should we build this?

00:42:05,133 --> 00:42:06,000
Because that's a real question.

00:42:06,000 --> 00:42:08,766
We should ask for some of these things
sometimes. Right.

00:42:08,766 --> 00:42:12,666
And there needs to be government
intervention in some ways, in

00:42:12,666 --> 00:42:15,666
a reasonable way that promotes innovation,
but it needs to be involved.

00:42:15,700 --> 00:42:19,566
We know that, you know, capitalism
without any type of restraint,

00:42:19,566 --> 00:42:21,600
that's how we got slavery, right?

00:42:21,600 --> 00:42:23,100
And that's an extreme example.

00:42:23,100 --> 00:42:25,866
But, like, all of this comes with
balance, is all I'm saying.

00:42:25,866 --> 00:42:26,400
Right. And

00:42:27,466 --> 00:42:29,366
we got to go full speed ahead.

00:42:29,366 --> 00:42:32,966
But we also need to just understand
that this is a new

00:42:33,966 --> 00:42:35,000
this is new.

00:42:35,000 --> 00:42:37,833
This is much different
than any technology we've ever had.

00:42:37,833 --> 00:42:42,266
And I'm hoping it's
going to create these new types of jobs.

00:42:42,266 --> 00:42:43,000
I believe it will.

00:42:43,000 --> 00:42:44,333
But then there's also challenges there,

00:42:44,333 --> 00:42:47,266
because then we got to figure out
how to train people for these new jobs.

00:42:47,266 --> 00:42:47,533
Yeah.

00:42:47,533 --> 00:42:50,533
And we got to reset people's brains
to think differently,

00:42:51,133 --> 00:42:54,300
and to kind of let go
of the things that they are used to doing,

00:42:54,300 --> 00:42:57,366
which is also very difficult
for the human psyche to do.

00:42:57,600 --> 00:43:00,600
So, like,
we have to really wrestle and think about

00:43:00,666 --> 00:43:03,466
what does our future look like
and not just take it for granted.

00:43:03,466 --> 00:43:06,000
Right? We have to build and be intentional
about the future we build.

00:43:06,000 --> 00:43:08,533
And then I want to,
I want you to reply to that.

00:43:08,533 --> 00:43:11,700
And then I want to talk about
your infrastructure

00:43:12,000 --> 00:43:15,966
play, in terms of Africa, how you see
building that infrastructure.

00:43:16,233 --> 00:43:18,333
And then we'll kind of close out
with a rapid fire question.

00:43:18,333 --> 00:43:19,466
Yeah, sure, sure, sure.

00:43:19,466 --> 00:43:20,700
So you know, what I would say,

00:43:20,700 --> 00:43:24,466
especially from a policy standpoint,
is that policy is going to be necessary.

00:43:24,666 --> 00:43:26,500
I mean, absolutely necessary.

00:43:26,500 --> 00:43:31,700
You know, I think that, as always,
policy is way behind innovation,

00:43:32,466 --> 00:43:34,800
especially substantive policy
that makes sense.

00:43:34,800 --> 00:43:35,566
You know,

00:43:35,566 --> 00:43:40,066
and you're trying to walk that line between
not stifling innovation versus testing.

00:43:40,133 --> 00:43:42,333
Yeah. But
you still gotta create those guardrails.

00:43:42,333 --> 00:43:44,300
I mean, I think that you're right.

00:43:44,300 --> 00:43:47,500
You know, I think that, so,
just a quick example.

00:43:47,500 --> 00:43:49,000
So we experienced a little bit of this.

00:43:49,000 --> 00:43:51,900
So I run
a couple of venture-backed tech companies.

00:43:51,900 --> 00:43:53,733
One of them is in edtech.

00:43:53,733 --> 00:43:56,333
And we built
AI-enabled educational programming.

00:43:56,333 --> 00:44:01,300
And probably 85% of our clients
are in the K-12 space.

00:44:01,300 --> 00:44:04,100
So we work with cities and school
districts, that type of thing.

00:44:04,100 --> 00:44:08,266
Now, if you know anything about that
space, AI is not welcomed

00:44:09,366 --> 00:44:11,000
in that room right now.

00:44:11,000 --> 00:44:11,366
You know.

00:44:11,366 --> 00:44:16,166
So there are already,
you know, regulations, heavy regulations.

00:44:16,166 --> 00:44:17,833
And that's
just because we're dealing with minors.

00:44:17,833 --> 00:44:20,633
We're dealing with elementary,
middle and high school kids.

00:44:20,633 --> 00:44:25,600
So even though this technology exists
and there isn't any broad,

00:44:25,666 --> 00:44:30,800
you know, policy around the innovation
part of it, there are actually

00:44:30,800 --> 00:44:33,800
some guardrails in certain,

00:44:33,900 --> 00:44:36,300
verticals, and education is one of them.

00:44:36,300 --> 00:44:39,100
And so especially the high school,
the K-12 world, that space.

00:44:39,100 --> 00:44:40,266
And that's where we are.

So, you know,
this technology is allowing us

00:44:44,466 --> 00:44:48,900
to reach communities
in ways that just weren't possible

00:44:49,100 --> 00:44:54,133
before and get resources to, you know,
rural Arkansas and rural Texas.

00:44:54,166 --> 00:44:55,266
Absolutely. There's no question.

00:44:55,266 --> 00:44:59,066
And New York City,
you know, and these boroughs

00:44:59,066 --> 00:45:02,833
where there are very, very poor
young people that are trying to get

00:45:02,833 --> 00:45:05,400
resources on how to go to college
and how to get a better life.

00:45:05,400 --> 00:45:08,566
We're using this technology,
but we've had to get creative.

00:45:08,566 --> 00:45:10,766
So like, I'll give you an example.

00:45:10,766 --> 00:45:16,333
We wanted to build an AI tool that was a
virtual college and career counselor.

00:45:17,633 --> 00:45:20,800
And this tool, we had to do it with a RAG.

00:45:20,833 --> 00:45:25,066
And so if anybody, you know,
doesn't know what a RAG is:

00:45:25,066 --> 00:45:28,066
retrieval-augmented generation,
that's what RAG stands for.

00:45:28,066 --> 00:45:31,266
But basically,
we created our own data set

00:45:31,433 --> 00:45:36,800
so that if one of the students
asks our bot a question, it doesn't

00:45:36,800 --> 00:45:40,200
search all over the internet,
you know, to give them an answer.

00:45:40,300 --> 00:45:43,966
It is searching for the answer
within our data

00:45:43,966 --> 00:45:45,900
set that we've been able to create.

00:45:45,900 --> 00:45:48,100
And that's it, right.

00:45:48,100 --> 00:45:48,400
All right.

00:45:48,400 --> 00:45:49,800
So I think maybe people know what it is.

00:45:49,800 --> 00:45:50,800
Basically, if you have it,

00:45:50,800 --> 00:45:54,400
it goes out and finds the information,
but you've directed it to a specific

00:45:54,400 --> 00:45:56,333
data set, specific data. Right.

00:45:56,333 --> 00:45:57,600
So I'll give you an example.

00:45:57,600 --> 00:45:58,633
You know, if

00:45:58,633 --> 00:46:02,933
some kid or parent, a first-generation
student, wants to know about FAFSA

00:46:03,300 --> 00:46:07,633
or potentially going to community college,
our bot would be able to have an answer.

00:46:07,800 --> 00:46:12,833
But if you ask this same bot about,
you know, what North Korea is doing

00:46:12,833 --> 00:46:16,300
today, it'll say it doesn't know
because that's not in our data set.
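
For readers who want to see the shape of what's being described here, below is a minimal sketch of a RAG-style bot restricted to a curated data set. It is illustrative only: the documents, function names, and the keyword-overlap retriever with its threshold are hypothetical stand-ins, not the actual system from the episode, and a production build would use embedding-based retrieval plus an LLM to phrase the final answer.

```python
# A minimal sketch of the "RAG restricted to a curated data set" idea
# described above. Illustrative only: the documents, the keyword-overlap
# retriever, and the threshold are hypothetical stand-ins, not the actual
# system from the episode.

CURATED_DOCS = [
    "FAFSA is the Free Application for Federal Student Aid; "
    "first-generation students can file it online for free.",
    "Community college is a lower-cost path to a degree, and credits "
    "often transfer to four-year universities.",
    "Most college applications require transcripts, essays, and "
    "letters of recommendation.",
]


def overlap(question: str, doc: str) -> int:
    """Toy retrieval score: how many question words appear in the document."""
    q = {w.strip("?.,!;").lower() for w in question.split()}
    d = {w.strip("?.,!;").lower() for w in doc.split()}
    return len(q & d)


def answer(question: str, min_overlap: int = 2) -> str:
    """Answer only from the curated set; refuse anything out of scope."""
    best = max(CURATED_DOCS, key=lambda doc: overlap(question, doc))
    if overlap(question, best) < min_overlap:
        # Mirrors the bot saying it "doesn't know" for off-topic questions.
        return "I don't know; that's outside my data set."
    return best  # A real RAG system would hand this context to an LLM.


if __name__ == "__main__":
    print(answer("How do I fill out the FAFSA application?"))  # in scope
    print(answer("What is North Korea doing today?"))          # refused
```

The refusal branch is the point: any question that does not match the curated documents gets an "I don't know," which is what makes a design like this workable in a setting with minors and their data.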

00:46:16,633 --> 00:46:20,900
And so, us
positioning our technology

00:46:21,033 --> 00:46:26,566
that way, it was able to be used
in the K-12 setting where

00:46:26,566 --> 00:46:30,000
there are heavy regulations,
because you're dealing with minors

00:46:30,000 --> 00:46:32,400
and kids and their data
and things like that.

00:46:32,400 --> 00:46:35,766
So like it's happening,
regulation is happening.

00:46:35,766 --> 00:46:39,200
And like in the educational vertical,
they're very cautious around it.

00:46:39,233 --> 00:46:39,366
Yeah.

00:46:39,366 --> 00:46:42,900
But they know that this technology
can help to reach communities.

00:46:42,900 --> 00:46:44,966
I mean, I think there are two,
I think there are two things.

00:46:44,966 --> 00:46:47,233
'Cause see, this is why I go to that,

00:46:47,233 --> 00:46:49,966
and that's not even what I mean
by data in terms of that.

00:46:49,966 --> 00:46:51,800
That's sort of what I mean by policy.

00:46:51,800 --> 00:46:55,266
They have some good stuff in place,
but the mindset should be,

00:46:55,733 --> 00:46:58,533
this is like trying to not do
the internet.

00:46:58,533 --> 00:47:02,700
Like, we gotta figure out how to use this.

00:47:02,700 --> 00:47:04,433
So, you know, I talked to.

00:47:04,433 --> 00:47:06,433
Yeah, yeah.

00:47:06,433 --> 00:47:09,433
I was saying
education is a good example of

00:47:10,200 --> 00:47:14,200
you can probably straddle that line
a little bit because they have to be very,

00:47:14,200 --> 00:47:17,766
very, very strict
on what technology is coming in

00:47:18,000 --> 00:47:19,966
because you're dealing with these minors.

00:47:19,966 --> 00:47:24,266
But they also know that you need
to have the innovation

00:47:24,433 --> 00:47:27,433
so that you can reach communities
that haven't been reached before.

00:47:27,600 --> 00:47:31,500
And a lot of those older administrators
and teachers are retiring now.

00:47:31,500 --> 00:47:35,466
So there's a lot of 30 something, 40
something superintendents

00:47:35,666 --> 00:47:37,033
that are thinking more forward.

00:47:37,033 --> 00:47:40,233
So that space is going through a transition
right now.

00:47:40,233 --> 00:47:40,600
Right.

00:47:40,600 --> 00:47:43,533
And I think, and this is just,
you know, again, we're early in this,

00:47:43,533 --> 00:47:45,166
I think the other verticals,

00:47:45,166 --> 00:47:49,200
that's my point,
can look at education as a potential model

00:47:49,200 --> 00:47:52,700
for how to create regulations
in other spaces.

00:47:52,700 --> 00:47:53,866
That's absolutely a good point.

00:47:53,866 --> 00:47:56,166
Absolutely. I mean, there's
some other good models too.

00:47:56,166 --> 00:47:59,166
I mean, there are some models to go from.
There's a model

00:47:59,333 --> 00:48:02,633
in Europe
where everyone essentially has

00:48:04,100 --> 00:48:04,833
a constitutional

00:48:04,833 --> 00:48:07,866
right to their data and information.

00:48:07,900 --> 00:48:09,966
There are frameworks out there.

00:48:09,966 --> 00:48:12,933
None of them are perfect, but
there are frameworks to go from that

00:48:12,933 --> 00:48:14,733
we certainly should be looking at.

00:48:14,733 --> 00:48:15,033
Okay.

00:48:15,033 --> 00:48:18,633
So let's talk more about how
we actually get more equity, like,

00:48:18,633 --> 00:48:21,866
because in a real way,
when it comes to AI,

00:48:22,633 --> 00:48:24,866
AI is also not just a software play.

00:48:24,866 --> 00:48:27,400
It's an infrastructure play, as you know.

00:48:27,400 --> 00:48:31,333
And I know you have a big vision
for infrastructure that's worldwide.

00:48:31,400 --> 00:48:34,400
Tell us about that vision
and why it's so important.

00:48:34,466 --> 00:48:37,466
So it's a big idea,
what they call a BHAG.

00:48:37,500 --> 00:48:40,133
A big, hairy, audacious goal.

00:48:40,133 --> 00:48:41,466
Yeah. Go.

00:48:41,466 --> 00:48:42,100
Yeah.

00:48:42,100 --> 00:48:43,800
Listen, you know, this is college.

00:48:43,800 --> 00:48:46,333
Yeah, I know, I know entrepreneurship.
Yeah.

00:48:46,333 --> 00:48:47,700
Yeah, yeah, yeah.

00:48:47,700 --> 00:48:49,933
So, you know,
this is something I want to do,

00:48:49,933 --> 00:48:51,666
hopefully in the next couple of years,

00:48:51,666 --> 00:48:53,233
at least start on the seed of it.

00:48:53,233 --> 00:48:54,566
The seedlings of it.

00:48:54,566 --> 00:48:57,400
And you guys are hearing it first.
Let's speak it

00:48:58,533 --> 00:48:59,533
into the universe.

00:48:59,533 --> 00:49:01,533
It's going to be documented
for the history of time.

00:49:01,533 --> 00:49:04,533
Yeah. I think so.

00:49:04,866 --> 00:49:08,100
So, like I said, I run two venture-backed
tech companies. We're hoping to exit

00:49:08,400 --> 00:49:09,133
the edtech one

00:49:09,133 --> 00:49:12,966
at a billion in about four years,
and we want to exit the deep tech one,

00:49:13,500 --> 00:49:17,033
probably in the next 2 to 3 years,
probably at around 100 million.

00:49:17,233 --> 00:49:19,166
That's our exit strategy on both of those.

00:49:19,166 --> 00:49:23,333
So I'm already thinking like, okay, well,
what do I want to do next after 25 years

00:49:23,600 --> 00:49:24,566
of doing this stuff?

00:49:24,566 --> 00:49:27,566
And now I have such a,

00:49:28,066 --> 00:49:31,733
I don't know, a heart for social impact,
you know, especially when you think about

00:49:31,733 --> 00:49:33,800
innovation,
entrepreneurship and technology.

00:49:33,800 --> 00:49:34,333
So,

00:49:34,333 --> 00:49:37,200
I've been coming up with this idea
that, you know,

00:49:37,200 --> 00:49:39,600
how can we create
the infrastructure globally.

00:49:39,600 --> 00:49:45,133
So that to your point, Rob,
you know, we can get more builders from,

00:49:45,600 --> 00:49:48,600
every community on the planet building,

00:49:48,800 --> 00:49:52,266
this technology. And I'm talking about
regionally specific models,

00:49:52,900 --> 00:49:56,666
because one of the problems
with commercializing

00:49:56,666 --> 00:50:01,233
or coming up with substantive business
models, let's just use LLMs,

00:50:01,233 --> 00:50:02,500
large language models,

00:50:02,500 --> 00:50:05,700
for instance. Like, AI and all that stuff
is really good at language.

00:50:05,700 --> 00:50:07,500
I mean, that's one thing that it is
good at.

00:50:07,500 --> 00:50:11,366
I could speak to somebody in Mandarin
right now using some app on an iPhone,

00:50:11,366 --> 00:50:14,366
and we'll be able to talk
like we've been friends for 40 years,

00:50:14,466 --> 00:50:15,666
you know. So it's good at that.

00:50:15,666 --> 00:50:19,700
But if you think about the cultural
implications of what's

00:50:19,700 --> 00:50:23,833
happening with this AI, that's where
you can start seeing some of the gaps.

00:50:23,833 --> 00:50:29,100
So we're here in America
and you can just use us as an example.

00:50:29,300 --> 00:50:31,100
Certain communities aren't represented.

00:50:31,100 --> 00:50:35,166
That's what we were talking about in the
technology as it's being developed today.

00:50:35,400 --> 00:50:36,600
And that's here in America.

00:50:36,600 --> 00:50:42,000
So just imagine, you know, Botswana
or just imagine, you know,

00:50:42,633 --> 00:50:45,000
you know, some other,

00:50:45,000 --> 00:50:47,966
not as well known places on the globe,

00:50:47,966 --> 00:50:51,366
you know,
they would never have their own LLM

00:50:51,366 --> 00:50:55,366
that speaks directly to the cultural
things

00:50:55,366 --> 00:50:59,300
that are meaningful for that region,
you know, these types of things.

00:50:59,300 --> 00:51:02,600
So which means that the technology,
in order for it to make sense

00:51:02,800 --> 00:51:05,366
for the region, has to be built
regionally.

00:51:05,366 --> 00:51:06,800
It has to be built there.

00:51:06,800 --> 00:51:09,466
You know, you think about,
what is that?

00:51:09,466 --> 00:51:13,466
The west coast of Spain, where they speak,

00:51:13,566 --> 00:51:15,000
what is that, Galician?

00:51:15,000 --> 00:51:19,400
You know, that's a very small region,
but that's very different.

00:51:19,400 --> 00:51:23,700
If you go to the east coast of Spain,
where they're speaking another language.

00:51:23,700 --> 00:51:27,133
And so how do you have this technology
that can be very specific.

00:51:27,266 --> 00:51:29,466
So the problem is this: it makes sense

00:51:29,466 --> 00:51:31,200
in China, they got a billion people.

00:51:31,200 --> 00:51:35,200
It makes sense to build this in India,
they got a billion people. Here

00:51:35,200 --> 00:51:36,433
in the United States, you know.

00:51:36,433 --> 00:51:37,733
And the society that we live

00:51:37,733 --> 00:51:41,333
in, it makes sense because capitalism
pretty much runs everything.

00:51:41,500 --> 00:51:45,166
But in those smaller areas of the world,

00:51:45,366 --> 00:51:49,533
there is no business case
for building this technology.

00:51:49,533 --> 00:51:53,766
When you only have a country
that has 30 million people in it,

00:51:53,766 --> 00:51:54,833
you know what I mean?

00:51:54,833 --> 00:51:56,966
There's no global business case with that.

00:51:56,966 --> 00:52:01,933
So my idea is,
how do we create an infrastructure where

00:52:01,933 --> 00:52:07,500
multiple developers of this technology
from multiple places around

00:52:08,066 --> 00:52:10,600
the world can basically

00:52:10,600 --> 00:52:13,766
build regionally specific,

00:52:16,666 --> 00:52:18,433
LLMs in this technology.

00:52:18,433 --> 00:52:19,900
Right.

00:52:19,900 --> 00:52:23,066
That will help them to compete
at a global level.

00:52:23,666 --> 00:52:27,633
I can tell you my thoughts,
right, which is that, like,

00:52:27,633 --> 00:52:31,300
that's going to take intervention
and, public private partnerships.

00:52:31,300 --> 00:52:34,933
So, just the same way
infrastructure has been built out.

00:52:35,533 --> 00:52:38,833
There actually wasn't a business case

00:52:38,833 --> 00:52:40,666
for the internet,
as weird as it seems. Right?

00:52:40,666 --> 00:52:42,000
Oh, 100%.

00:52:42,000 --> 00:52:42,300
Yeah.

00:52:42,300 --> 00:52:45,900
No, it was Netscape.
It was Netscape who proved

00:52:46,133 --> 00:52:47,800
that there was a business model
to the internet

00:52:47,800 --> 00:52:50,866
When Netscape went public,
that was when everybody said, oh,

00:52:50,866 --> 00:52:51,533
wait a minute.

00:52:51,533 --> 00:52:54,566
It wasn't actually
the federal government that was doing it.

00:52:55,200 --> 00:52:56,966
Right. And then no, no, no, no.

00:52:56,966 --> 00:52:58,566
So then Netscape pulled from that.

00:52:58,566 --> 00:53:01,700
So I'm saying this to say, yeah,
there were billions and billions

00:53:01,700 --> 00:53:05,700
and billions poured into this technology
over many years.

00:53:05,733 --> 00:53:07,566
Yeah. The same thing's
going to have to be required.

00:53:07,566 --> 00:53:10,066
So this is where you have public private
partnerships.

00:53:10,066 --> 00:53:13,533
This is where you definitely need government

00:53:13,533 --> 00:53:17,066
that understands that infrastructure
is more than building out

00:53:17,066 --> 00:53:19,000
roads and bridges;
it's now about building out

00:53:19,000 --> 00:53:21,300
digital infrastructure.

00:53:21,300 --> 00:53:23,500
Absolutely right.

00:53:23,500 --> 00:53:24,333
To be competitive.

00:53:24,333 --> 00:53:28,000
That's a part of what we're going to have
to do, to continue to be competitive.

00:53:28,000 --> 00:53:30,066
And we're still behind
in the United States on this too.

00:53:30,066 --> 00:53:34,100
So, like, we have to really build
that out, to really be able

00:53:34,100 --> 00:53:35,966
to maximize the technology
in a way that's meaningful.

00:53:35,966 --> 00:53:37,100
So,

00:53:37,100 --> 00:53:39,166
I want to get to these last couple
of questions,

00:53:39,166 --> 00:53:41,433
because we've been on for a while
and it's been a great conversation.

00:53:41,433 --> 00:53:44,333
So, some of our rapid fire questions
I like to ask.

00:53:44,333 --> 00:53:47,900
So, all right: what is an important truth
00:53:47,900 --> 00:53:51,333
you have that very
few people agree with you on?

00:53:54,100 --> 00:53:59,066
Man... you know,

00:54:00,500 --> 00:54:02,000
that's a good one, man.

00:54:02,000 --> 00:54:06,166
So to me, you know, listen, please
don't put it in the chat.

00:54:06,166 --> 00:54:07,100
Don't come after Rob.

00:54:07,100 --> 00:54:09,966
Guys,

00:54:09,966 --> 00:54:11,766
my, I guess my important truth

00:54:11,766 --> 00:54:14,633
that not many people agree on,
especially in,

00:54:14,633 --> 00:54:17,700
you know,
my immediate community,

00:54:18,000 --> 00:54:22,200
is that I don't know
if equity is the big issue.

00:54:22,200 --> 00:54:23,100
You know what I mean?

00:54:23,100 --> 00:54:24,800
It is an issue.

00:54:24,800 --> 00:54:27,800
But I don't think equity is the issue.

00:54:28,166 --> 00:54:28,833
What is it?

00:54:28,833 --> 00:54:33,566
I think that
the issue to me is commercialization

00:54:33,866 --> 00:54:37,700
because I think the resources,

00:54:37,733 --> 00:54:41,600
attention, all of that stuff goes
to where the money flows.

00:54:41,800 --> 00:54:46,100
So if you can find a way
to make money off the topic of equity,

00:54:46,500 --> 00:54:48,366
like I said, a business model behind it,

00:54:48,366 --> 00:54:51,366
then everybody is interested in
and then it starts to move the needle.

00:54:51,600 --> 00:54:54,333
And I think you do make a good point
on this in terms of

00:54:54,333 --> 00:54:58,000
because I think a lot of people in our
community really stick to nonprofits.

00:54:58,300 --> 00:55:02,233
They focus on that instead of, well,
how do you create a sustainable business?

00:55:02,233 --> 00:55:03,766
Yes. Right. Yes.

00:55:03,766 --> 00:55:07,500
So to that, I agree in terms of like,
you can't move a community,

00:55:08,366 --> 00:55:11,600
unless you find ways to produce
value. Yes.

00:55:11,866 --> 00:55:14,266
You know, to do that.
So like there's no disagreement to that.

00:55:14,266 --> 00:55:17,800
I think that goes towards mindset
and working together

00:55:18,200 --> 00:55:22,133
and figuring out how to build,
because some of what that means is like,

00:55:22,900 --> 00:55:26,466
everything won't commercialize at first,
which means you got to have support

00:55:26,933 --> 00:55:28,066
from multiple people.

00:55:28,066 --> 00:55:31,633
Like, a lot of these people have lots
of runway and time to figure it out.

00:55:32,000 --> 00:55:35,666
We need to be in the same mindset
as a community to invest

00:55:35,666 --> 00:55:39,833
in entrepreneurship, not as something
that is something wild out there.

00:55:39,833 --> 00:55:42,300
That's only for other people. Like,
you know what?

00:55:42,300 --> 00:55:46,233
To be honest with you, Rob, I don't think
that it's the community's responsibility.

00:55:46,233 --> 00:55:50,566
I think it's people like my responsibility
who's been in this space for 25 years,

00:55:50,566 --> 00:55:53,700
who have access. You know what I mean,
bro? That's what I mean.

00:55:53,866 --> 00:55:56,133
I mean, like, we gotta get into the mindset.

00:55:56,133 --> 00:55:58,500
That's what I mean. Absolutely.
That's our responsibility.

00:55:58,500 --> 00:56:00,666
Like we're doing today,
to teach the community,

00:56:00,666 --> 00:56:03,366
because we just don't have that built
into our culture

00:56:03,366 --> 00:56:06,500
to go out and learn these things, invest
in these things, build these things.

00:56:06,700 --> 00:56:11,366
It takes us to build the infrastructure
so that others can come in

00:56:11,366 --> 00:56:14,566
and learn and build and grow
and develop and innovate.

00:56:14,733 --> 00:56:17,400
And that's
part of what I'm doing now.

00:56:17,400 --> 00:56:20,533
Absolutely.
All right. Very quickly, concise.

00:56:20,566 --> 00:56:23,666
What's a slogan or theme in your
life in like one sentence?

00:56:29,000 --> 00:56:29,800
The path.

00:56:29,800 --> 00:56:31,900
And this is something I say to myself
all the time.

00:56:31,900 --> 00:56:36,033
The path less
taken is where I became good.

00:56:36,866 --> 00:56:39,866
But the path never taken
is where I became great.

00:56:40,833 --> 00:56:42,366
That's good. All right.

00:56:42,366 --> 00:56:43,533
All right. Final one.

00:56:43,533 --> 00:56:46,533
What do you see
as the next big global disruption.

00:56:47,366 --> 00:56:50,366
The next big global disruption.

00:56:56,600 --> 00:56:58,200
It's got to be climate.

00:56:58,200 --> 00:57:01,200
It's got to be food.

00:57:02,000 --> 00:57:04,600
And I think the technology,
the things that we're talking about

00:57:04,600 --> 00:57:08,666
today, you know,
I think they're interesting.

00:57:08,666 --> 00:57:12,066
But I think how you apply those things
to things like food,

00:57:12,333 --> 00:57:15,466
climate, you know, smart cities,
I think is the application.

00:57:15,466 --> 00:57:18,466
So everything we talked about
for the last hour, I think is interesting,

00:57:18,633 --> 00:57:22,033
but I think the application of it is
what's going to be the disruption

00:57:22,066 --> 00:57:24,733
for the next, the next 20 years.

00:57:24,733 --> 00:57:25,500
What's so interesting is

00:57:25,500 --> 00:57:28,433
you come full circle, like it comes back to,

00:57:28,433 --> 00:57:31,300
how do we make sure that humanity
is taken care of and doesn't destroy itself?

00:57:31,300 --> 00:57:32,700
That's what I hear you saying, right?

00:57:32,700 --> 00:57:34,966
I mean, like it goes back to climate.

00:57:34,966 --> 00:57:37,533
So, yeah, honestly,

00:57:37,533 --> 00:57:41,066
geopolitical, and not killing
each other either.

00:57:41,733 --> 00:57:45,033
We've actually had a good run,
relatively.

00:57:45,033 --> 00:57:46,200
I'm not saying everything's perfect.

00:57:46,200 --> 00:57:49,833
Yeah, but yeah, when you compare
where we are for the last 30 years

00:57:49,833 --> 00:57:51,133
to the rest of human history.

00:57:51,133 --> 00:57:54,166
Yeah, most of humanity
was constantly in conflict and wars.

00:57:54,166 --> 00:57:57,233
Like, when people think about wars,
they

00:57:57,533 --> 00:58:01,000
just don't have any grasp of
how many people die through wars.

00:58:01,000 --> 00:58:04,800
Like, I think about World
War Two as an example.

00:58:04,800 --> 00:58:06,633
Do you know how many people died
during World War Two?

00:58:06,633 --> 00:58:08,766
What the estimates are? No.

00:58:08,766 --> 00:58:09,633
What's your guess?

00:58:12,066 --> 00:58:13,633
10 million.

00:58:13,633 --> 00:58:14,133
Okay.

00:58:14,133 --> 00:58:17,066
It's over 50 million people. Wow.

00:58:17,066 --> 00:58:17,566
Wow. Right.

00:58:17,566 --> 00:58:20,166
I mean, when people just think about that
for a minute. Wow.

00:58:20,166 --> 00:58:22,500
Like, that's a lot of people, right?

00:58:22,500 --> 00:58:25,200
And, you know,

00:58:25,200 --> 00:58:26,966
I don't think there's any more world wars,

00:58:26,966 --> 00:58:29,966
like, we got the equipment
to wipe each other out quickly.

00:58:30,100 --> 00:58:30,900
Yeah, yeah.

00:58:30,900 --> 00:58:33,633
And AI, this
technology, will amplify that even more.

00:58:33,633 --> 00:58:34,166
Right.

00:58:34,166 --> 00:58:35,266
So, like,

00:58:35,266 --> 00:58:37,766
we're going to have to make sure
we figure out a way together

00:58:37,766 --> 00:58:39,200
because it's not going to be about
destroying each other.

00:58:39,200 --> 00:58:42,666
We're not going to be able
to outgun each other at this point.

00:58:42,666 --> 00:58:45,600
We can... oh, never mind.
Everybody has enough power to destroy everybody.

00:58:45,600 --> 00:58:47,133
Yeah. Yeah. Absolutely.

00:58:47,133 --> 00:58:49,433
Yeah. So, like,
we've got to figure it out.

00:58:49,433 --> 00:58:52,900
One of the things I would say
is, you know, there are bigger issues

00:58:52,900 --> 00:58:54,766
because, you know, we can talk about war

00:58:54,766 --> 00:58:56,366
and killing each other
and everything like that.

00:58:56,366 --> 00:58:58,866
But you know,
if everybody's starving to death,

00:58:58,866 --> 00:59:01,866
you know what I'm saying, because there's
not enough food, we have to figure food out.

00:59:01,866 --> 00:59:04,800
You know, they're saying, I mean,
we got 7 billion people on the planet.

00:59:04,800 --> 00:59:06,466
They're saying in the next 20 years

00:59:06,466 --> 00:59:10,833
we'll have, like,
10 billion people on this planet.

00:59:10,833 --> 00:59:14,166
There's already food
shortages, water shortages.

00:59:14,166 --> 00:59:17,700
Just imagine another 3 billion people on
this planet, you know, especially water.

00:59:17,700 --> 00:59:19,600
I think water is a bigger problem
for the planet.

00:59:19,600 --> 00:59:21,700
But water is a huge problem.

00:59:21,700 --> 00:59:25,200
So yeah, we can use this technology
to solve those problems.

00:59:25,200 --> 00:59:27,966
That's why I think the big disruption
is going to happen.

00:59:27,966 --> 00:59:31,766
Those kinds of breakthroughs, in energy,
you know, breakthroughs like that

00:59:31,766 --> 00:59:35,200
type of stuff, you know.
That's where the big disruption

00:59:35,400 --> 00:59:37,700
is going to happen.
That's why I'm excited.

00:59:37,700 --> 00:59:40,200
You know I'm excited
looking forward to disrupting.

00:59:40,200 --> 00:59:44,066
And when you exit from those companies,
you know, that'll be a good time.

00:59:44,066 --> 00:59:46,433
Then let's go and
shake up the world together at some point.

00:59:46,433 --> 00:59:47,366
Absolutely.

00:59:48,300 --> 00:59:48,500
Yeah.

00:59:48,500 --> 00:59:50,733
I appreciate
you, Rob. This has been incredible.

00:59:50,733 --> 00:59:52,733
Yeah. Thanks for coming on Disruption
Now, as always.

00:59:52,733 --> 00:59:54,800
And I would love
to have you up at West Con.

00:59:54,800 --> 00:59:57,166
Let's definitely keep
the conversation going.

00:59:57,166 --> 01:00:00,600
I will talk to you,
we'll follow up on HBCUs and partnerships

01:00:00,600 --> 01:00:03,600
there, and I just
admire everything that you've done.

01:00:03,600 --> 01:00:06,600
I admire where you came from,

01:00:06,866 --> 01:00:09,866
and I'm bullish about your future, brother.

01:00:10,433 --> 01:00:11,500
Thanks, Rob.

01:00:11,500 --> 01:00:12,066
Thank you. All right.


In this episode of Disruption Now, Tremain Davis shares forward-thinking insights on how innovation is upending traditional business models and reshaping entire industries.

Here are three things you can learn from this episode:

1. Adaptive Leadership: Davis disrupts conventional norms by staying agile, empowering his team, and constantly reevaluating strategies to drive transformation.

2. Leveraging Disruptive Technologies: Learn how embracing new technologies can drive sustainable growth and create a competitive edge. Tremain's approach as a leader integrates cutting-edge digital solutions that challenge outdated business practices.

3. Strategic Risk-Taking: Understand the value of taking calculated risks and maintaining a proactive mindset to turn challenges into opportunities.

As a leader, Davis exemplifies disruption by challenging the status quo and fostering a culture of innovation that inspires others to break away from traditional molds.

Davis’s journey to becoming a disruptive leader in his community is rooted in his commitment to challenging outdated paradigms and championing local change. He transforms his business and drives community-wide innovation and resilience by empowering emerging entrepreneurs, mentoring future leaders, and building collaborative networks.

Tremain Davis's social media pages:

LinkedIn: https://www.linkedin.com/in/tremain-davis-348504a4/

Website: https://www.thinkpgc.org/

Instagram: https://www.instagram.com/iamtremain/

DISRUPTION NOW LINKS:

Watch the episode: https://www.youtube.com/@Disruptionnow

Listen to the podcast episode: https://share.transistor.fm/s/587bc0b3t/


CONNECT WITH THE HOST

ROB RICHARDSON

Entrepreneur & Keynote Speaker

Rob Richardson is the host of the Disruption Now Podcast and the owner of DN Media Agency, a full-service digital marketing and research company. He has appeared on MSNBC and America This Week, and is a weekly contributor to Roland Martin Unfiltered.

MORE WAYS TO WATCH

Serious about change? Subscribe to our podcasts.