
How Dangerous Can AI Get, Dax is Down on DeepSeek, and AI First App Development

Dax:

You're sick of it. This is our last episode ever. We're not gonna do this podcast anymore. Adam doesn't wanna talk to me.

Adam:

I was just reading the news. I just read about a plane crash

Dax:

and that's

Adam:

that's not good. That's sad. Yeah.

Dax:

I saw it last night and I'm getting on a plane tomorrow, so really bad.

Adam:

Oh, man. Yeah. I guess two things collided in the air. I always worry about that. You always think, like, I don't know, could something just run into the side of us because they didn't know we were here and they didn't look at the radar or whatever, sonar. I don't know.

Dax:

Yeah. In this situation, it was kind of crazy. It was right as a plane was landing, and it was a Blackhawk helicopter that was, like, in the air right over the ground at the airport.

Adam:

Anyway, that's a damper to start out with. How are you?

Dax:

I'm good. It's finally kinda warming up again. Like I went outside with no pants on today, which is good. But I'm still wearing, you know, a long sleeve.

Adam:

Long sleeve shirt. Yeah. I went for a walk this morning outside at 05:30 in the morning because it was 50 degrees here, which is amazing. It's been so cold, and it should not be 50 before 6AM.

Dax:

Going outside when it's 50 degrees is really dangerous.

Adam:

You should have seen how I was dressed. Probably lighter than you're dressed right now.

Dax:

Well, I'm going to Boston, so I have to like go and pack

Adam:

Oh, no.

Dax:

All my, like, just heavy clothes from New York and my, like, mountaineering jacket.

Adam:

That's funny. Wait, what are you doing in Boston?

Dax:

Liz's friend is having an engagement party.

Adam:

Oh.

Dax:

So we're going for that, and then I'm gonna visit AJ while I'm there. Nice. We're gonna hang out on Friday.

Adam:

That's awesome. Yeah. Tell him I said, or if he's a listener: hi, AJ. I'll just bypass Dax. He's a terrible middleman.

Dax:

He is a listener, so I'm sure he'll hear this. Yeah. Not till after we hang out.

Adam:

That's true. I've been listening to a lot of stuff. We don't want to talk about AI more, do we? I was just listening to

Dax:

talk about whatever you want. Go ahead.

Adam:

Okay. I'll just talk about whatever I want.

Dax:

You know

Adam:

what I'm saying? Just, just whatever. I was just listening to, oh my god, sorry, this reminded me of this stupid show Casey and I have been watching on Netflix called Later Daters.

Adam:

And it's these older people, fifties and sixties, dating. Like, divorcees, widowers, etcetera. And they have this dating coach who seems to know stuff about relationships. And she encourages this lady to open up conversations, like break the ice, by talking about a podcast you just listened to.

Adam:

But the woman didn't quite understand. She doesn't seem to grasp the idea that you have to actually talk about the podcast, and she would just open all her dates with, "So I was listening to this podcast with Matthew McConaughey," and that's all she would say, just that line. And you have to keep going. It just made me think of it when I said, "So I was listening to this podcast." I just wanted to break the ice with you. Yeah. Pause. That's all. I was listening to a podcast.

Adam:

Man. I've been listening to, like, all of Lex's stuff because Prime's gonna be on there, and I just forgot I liked his podcast. So I was listening to some of his back catalog, and he had Anthropic's CEO on, which was super interesting.

Dax:

Anthropic's CEO seems solid. Like, I don't get a sketchy vibe from him. Yeah. I feel like he's trying to be really practical

Adam:

Mhmm.

Dax:

With how he talks about all this stuff, which is pretty different from most people in this space. So yeah. Yeah.

Adam:

It was very illuminating. I'm not gonna try and regurgitate it because it won't be as illuminating coming out of my mouth. But you should go listen to it. He's clearly very focused on safety, and it's just fun to listen to the people that are building these things, running companies, building these models, talk about the risks and the future and how it could play out. Because I always hear people talk about AI safety and it's like, yeah, well, I don't know.

Adam:

Like, it's all kind of vague and fuzzy. But he talks about the specific categories of threat that these models pose and how to mitigate those things. It was super interesting.

Dax:

What is one example that you remember? Because I don't know anything about this.

Adam:

Yeah. So, I can't remember the name of this, but I think Anthropic came up with a system for categorizing the different levels of threat that these models pose to society. Level two is, like, state actors could use it to further their goals. Level three is, like, normal people could use it to cause harm to humanity.

Adam:

And level four is that the AI itself, along with humans, is actually a threat. So the AI can take its own actions, even circumvent things. Like, he talked about, they have these benchmarks, these tests that they do for safety, to make sure that the model can't do certain things, like can't tell people how to make smallpox or whatever. So they have these tests, but they have to worry at level four that the AI will just sandbag and pretend that it's not smart enough, even though it is, because it wants to pass the test. Yeah.

Adam:

Which is super interesting to think about. Just, what do you do if these models can scale to, like, superintelligent, smarter than us? How do you control something that's smarter than us? It's just super fascinating.

Dax:

I don't really understand the lower levels, because did he talk about what, practically, is the difference between that and someone publishing a book that has instructions on how to make smallpox?

Adam:

He didn't, no. Yeah, I guess what you're saying is, how is level three and below anything new to the world? Is it just more efficient? Like, a dumber person could figure out how to make an atomic bomb because AI is so smart.

Dax:

Given the stakes, it's like, if you're someone that's like, oh, I wanna unleash smallpox on the world, but I'm too dumb and I can't figure it out. I mean, that's such an ambitious goal. It's weird to be that ambitious but not just figure it out without AI, you know?

Adam:

He speaks to that. The world, in the state that it is, has mostly been safe because the overlap between people who are extremely intelligent and people who wanna do a lot of harm is a small overlap. Generally, there's not a lot of people that fit both those things, but the fear is that AI increases that overlap, because now you take people who want to do a lot of harm and you give them intelligence they didn't have. I guess that's the vague general idea.

Dax:

I think this is where I would disagree with the way all these people think about it, because I feel like they look at it from this really academic point of view, which is: I have raw-horsepower intelligence and I have trained knowledge in something, and that's what gives me capability. But in the real world, especially when it comes to violent stuff like that, none of that matters. It's all about motivation. If someone is really motivated, they will figure this stuff out. It's not like the thing blocking them was just, oh, I'm not smart enough, you know. That's not really what the issue is.

Dax:

I will agree that a lot of crimes happen because they're more convenient, and this would make certain things more convenient. I kind of see that point, but yeah.

Adam:

So, I remember there was a lot of tension with North Korea, and they were shooting a lot of rockets just to flex their muscles. And there was a lot of talk about how soon North Korea could develop nuclear weapons. Is that not because they're not smart enough? Or not smart enough, but they don't have the knowledge of how to do it, or it takes years to develop that technology? Is that not something AI could be faster at?

Dax:

Yeah, it could be faster. I mean, take the current situation, right? If you're someone that is trying to go from not knowing how to do this to knowing how to do this, what does North Korea have?

Dax:

They have motivation, for sure. This is probably, like, their top priority. They have enough funding to figure it out. Mhmm. So given enough time, they will.

Dax:

There's like no stopping that. Yeah. Do certain tools help them do that faster? Definitely. The same way that Microsoft Excel probably helps them figure out stuff faster.

Adam:

Yeah, sure. Okay.

Dax:

So I get why this feels really specific, but if you're talking about that level of impact in the harm space, we should see the equivalent level of impact where people are trying to do anything good, right? If I'm like, I wanna cure cancer, I'm not, as a random person, suddenly any closer to doing that. Yeah. So yeah, I think that side of it is a little overstated.

Dax:

I think they're just kind of in this bubble that's feeding this narrative into itself. Mhmm. So, yeah, that's why the whole safety thing, I don't fully get it. Every technology makes certain things more convenient. It's a lot more convenient to produce firearms today than it was a hundred years ago, like much crazier firearms.

Dax:

Yeah, you have to think about it, but I just don't see the acquisition of knowledge being the place that people get stuck. It's usually that the US and all the other countries try to have crazy strict control over the raw materials you need to make a nuclear weapon. That's probably where the bottleneck is. And even that, you know, countries work around, because there's always someone that's against the US that has access. Yeah. I feel this detail is kind of irrelevant in the grand scheme of things.

Adam:

Yeah. If I'm being honest, I don't really buy all the AI safety talk. It's so hard to know what's just noise, what's just posturing, and even competitive. Like, with some of these CEOs, there's a bit of pulling the ladder up, right? That's been theorized, at least.

Adam:

I don't know if it's been proven, but when the people that have the biggest AI companies, training the big expensive models, are the ones leading the charge on "we need to make this harder," I don't know. Is there some other motive involved? And then with the rest of the dialogue, you don't know what is grounded in reality. There are so many people that talk about AI safety that don't seem to have any idea what that looks like.

Adam:

It's like at the government level, they have no idea. Like, nobody Right. Has any clue what that practically looks like. So yeah, it just feels like that whole conversation is either not grounded in reality or might have other kind of hidden agendas behind it. I'm not scared.

Adam:

I say bring on the AI. It's like, if it could solve problems and make things easier, and if it gives the good guys more tools too, then yeah, what's the problem? I

Dax:

don't know. Yeah. It's just funny because it's such a virtual thing. You can imagine someone going to a store and buying a physical hammer and smashing your head in. That's so real.

Dax:

Whereas this is just entirely in the virtual space, and it's hard to imagine that, you know. It's not that they don't have a point, like it's not that knowledge isn't harmful or dangerous, but just compared to something physical, like buying a vehicle and ramming it through a crowd, which is just so much more effective than anything that's bottlenecked by

Adam:

Sure.

Dax:

Your knowledge, you know.

Adam:

I guess on the digital front, though, there is a lot of havoc that could be done to systems, like banking systems. If autonomous AI stuff had its own agenda, it could cause a lot of problems in the world, even if it's only digital and doesn't have physical form. Right?

Dax:

Even if it's not its own agenda. If there's some system that's now controlled by AI, now there's a whole set of new vectors of, well, how can someone manipulate this Mhmm. System. It's hard enough for us to create security around deterministic systems. This is a non-deterministic system, so you never know if a certain set of words in the right order will make it ignore all the safeguards you put in place. So that, to me, is a very practical application of AI safety. And that's not even about the AI being capable; it's actually a flaw with it being not very capable, that it can be reprogrammed by accident in these little ways.

Dax:

So I get that side of things for sure.

Adam:

I listened to another podcast of Lex's with, I think his name was Adam Frank. He's some kind of astro-something, astrophysicist maybe?

Dax:

Looks at space.

Adam:

He looks at space. But I guess they just got the first grant for looking for, what did he call it? Technosignatures? Technosignatures. It's like, biosignatures would be looking at a planet and saying, are there any gases that would prove that there's life on this planet?

Adam:

But technosignatures are like, does this prove that there's advanced technology? So they're actually looking at exoplanets in the habitable zone or whatever and trying to find signs that they have created technology. I can't remember what some of them were. But super fascinating guy. Just go listen to Lex's podcast.

Adam:

What are you doing listening to us? Just go just listen to, like, the last five episodes are all good.

Dax:

I just listened to them all. We're now just a podcast that summarizes that other podcast.

Adam:

That's probably a thing. That's funny.

Dax:

That's pretty cool. I think there's something else, some clips of some other thing I was watching that was somewhat similar. So did he talk about what the primary thing they're looking for is? Are they looking for, like, Dyson spheres?

Dax:

Like what are they looking for?

Adam:

No. So he did talk about Dyson spheres, which I didn't remember knowing what those were. That's wild. Which, I think they proved you couldn't actually make a Dyson sphere.

Dax:

Did you just say you didn't remember knowing what that was? Do you mean, like, you forgot you knew about it, and then, never mind, you did actually know what it is?

Adam:

Yes. Listen, I don't have a great memory. And I know I've heard of Dyson spheres, but until I heard him talk about them on this episode, I didn't recall. It's basically this big sphere around your star, around the sun, to harvest

Dax:

Capture all the energy.

Adam:

from that sun. Which, that's another crazy thing he talks about: the levels of civilization, whatever they are, the energy output. Yeah. But the technosignature thing, I think the main one they're looking at, what did he say?

Adam:

What did he say? It was not Dyson spheres. It was

Dax:

Satellites, radio waves.

Adam:

No. It wasn't waves. I don't remember. Man, I'm sorry. It would be interesting content and conversation.

Adam:

I just don't remember.

Dax:

Oh, what would you personally look for, Adam?

Adam:

Well, let's see. What would I look for? If there was a similar technology

Dax:

I would look for screens. I would look for

Adam:

Wait. Screens? Oh, yeah. No. It isn't like they have images.

Dax:

I know.

Adam:

He did talk about imaging. This is just, we're going off the rails. I could just talk about different podcast episodes forever. But he did talk about, in the next however many hundred years, that we'd be able to have, like, Manhattan-sized imaging, interstellar, like, view cities the size of Manhattan. What did he say? 26-kilometer resolution or something on exoplanets. They have this idea for it. It sounds like science fiction for sure. And that's the cool thing about science fiction.

Dax:

That would be crazy.

Adam:

The way it works is, you send all these sensors, cameras I guess, way away from Earth, in the opposite direction from the sun. I can't remember how far he said, in the solar system, but a long ways. And they're looking at planets that are just past the sun, because of the way large bodies warp space. Yeah. So the sun basically focuses the image of the star just beyond the edge of the sun, and these cameras are looking at

Dax:

Yeah. Wild.

Adam:

It's super wild. So you're using the sun as, like, this amplification of our ability to view exoplanets. Anyway, let's talk about something that's not on another podcast. My memory is not good enough for this exercise.
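What Adam is describing is usually called the solar gravitational lens. As a rough sanity check (these numbers are standard physical constants, not figures from the episode), general relativity's deflection angle for light grazing the Sun, theta = 4GM/(c²b), puts the focal region around 550 AU out:

```python
# Back-of-envelope for the solar gravitational lens: light grazing the
# Sun at impact parameter b = R_sun is bent by theta = 4GM/(c^2 b),
# so the rays converge at roughly d = R_sun / theta from the Sun.
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
C = 2.998e8       # speed of light, m/s
R_SUN = 6.957e8   # solar radius, m
AU = 1.496e11     # astronomical unit, m

theta = 4 * G * M_SUN / (C**2 * R_SUN)  # deflection angle in radians
focal_m = R_SUN / theta                 # distance to the focal region
print(round(focal_m / AU))              # → 548, i.e. roughly 550 AU
```

So a camera parked out past roughly 550 AU, looking back past the Sun, would see a hugely magnified image of whatever sits directly behind the Sun, which is the trick behind the exoplanet-imaging proposals Adam is recalling.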

Dax:

You know what's definitely on other podcasts? The whole DeepSeek thing from this past week.

Adam:

Oh, we didn't Yeah. We didn't really talk about DeepSeek much, did

Dax:

we? Mm-mm.

Adam:

Can you just run that on your local machine? Can I just start getting coding benefits from DeepSeek R1 without an API call?

Dax:

Not really. I mean, it's a reduced version of the model, and it's very slow, and the hardware requirements are pretty crazy. So no, you cannot.

Adam:

Okay. So how do people use DeepSeek R1 right now? How does it exist? Is it commercialized in any way?

Dax:

There's a hosted one from the company, but it's in China, so people feel sketched out by that. But then, because it's open source, it's been rehosted by a bunch of providers that you're familiar with. Cloudflare, I think, has a version of it.

Adam:

Oh, okay.

Dax:

There's been a few others. I don't think it's good.

Adam:

Oh, really?

Dax:

Well, it's like not better than anything else. It's just a recreation of what's already there, so

Adam:

They did it for less. It's like the, not Indiana Jones, what's the guy? MacGyver? They just MacGyvered it, like they made it out of duct tape?

Dax:

I don't believe any of it. I mean, none of the information about it is true. It's just

Adam:

Oh, woah, woah. Stop. Hold on. Catch me up. I didn't know that it wasn't true.

Adam:

What are the facts that aren't true? Because I don't even know the facts around it, really. I just heard it's

Dax:

claiming, they're claiming they trained the model for $5,500,000, which is, like, crazy, several orders of magnitude less than what OpenAI's models cost. Everyone's dunking on OpenAI. Is

Adam:

it a currency thing? Or were they maybe talking yen or something?

Dax:

That would make it even cheaper.

Adam:

Oh, okay. Yeah. Yeah.

Dax:

No. No. The the the number is just too small no matter what currency on the planet you're using.

Adam:

Okay. And you think, why would they... Oh, because it's a competitive thing, they're trying to lie.

Dax:

So, the reason it's very noisy: there are true, interesting things that they did, so you can't take that away from them. It's impressive. But that doesn't mean what they're saying about how it was done is true either. The numbers are just way too much of a lie.

Dax:

There's no way that, one, they're that low. Two, there's a lot of reasons for them to make it up, right? And no one's gonna reproduce it for once. So that's not a thing.

Adam:

Could you make some of them explicit? Say some of the reasons, because I don't always connect dots.

Dax:

Well, China's not allowed to have certain GPUs.

Adam:

The what?

Dax:

Because of the export

Adam:

I didn't know this.

Dax:

Of the export controls.

Adam:

Okay. You've got a lot of context here. You need to lay it all out. Spell the case out for why DeepSeek is a fraud.

Dax:

On paper, NVIDIA is not allowed to export certain levels of GPUs to China.

Adam:

NVIDIA is an American company, right?

Dax:

Yes. Okay.

Adam:

See, these are things I just don't know for sure. So you gotta spell it out.

Dax:

Yeah. So they can't be like, hey, here's exactly what we used, if they're using a bunch of stuff they're not supposed to have. So that throws out a bunch of questions.

Adam:

Sorry, going back just real quick. The reason they can't export them to China is, like, American law?

Dax:

Yeah. Yeah. We banned exports of GPUs above a certain capability. Okay?

Adam:

Okay. Got it.

Dax:

There's another interesting fact that someone pointed out recently: Singapore is 20% of Nvidia's revenue.

Adam:

Okay. Is Singapore in China? I'm so dumb.

Dax:

No. No. Singapore is a very small island nation in that area.

Adam:

Okay. So it's China adjacent.

Dax:

Why would they be 20% of Nvidia's revenue? That's a little

Adam:

Oh, so they're buying all the GPUs and then just taking them into China? Are they smuggling them? Yeah. Is this called

Dax:

These export controls practically are just not effective. It's exactly what we were talking about earlier: there's always gonna be a way, if you're sufficiently motivated, to get these things. And of course,

Adam:

there's some motivation around it. I could just buy a bunch of them and take them to China, and there's no one in China that's gonna stop me bringing them in, right? It's just that

Dax:

The US is telling NVIDIA you can't. And the other thing I was thinking about was, man, what a deal of the century. You could just be the dude in Singapore smuggling this stuff, adding a 20% fee or whatever. Yeah. And that's a 20% fee on, like, $20,000,000,000 of GPUs. That is crazy.

Dax:

That's nuts. That is really wild. So the point is, there's so many reasons why: one, they couldn't say what they actually did, and two, there's a lot of reasons to just, and this is what they always do, they always lie about the price of things to create a market. Yeah. It's a good strategy.

Dax:

It works.

Adam:

Yeah. Okay. But DeepSeek put out a paper. I know this because all the software engineering nerds who act like they're smart enough to understand papers are like, oh, check out this paper. This is amazing.

Adam:

Like, you don't know what the paper says, so stop.

Dax:

If you literally put the paper into DeepSeek and talk to it about it, you would learn more than just listening to other people talking about it. Yeah, probably.

Adam:

Well, okay. But question, did they not have to outline in the paper like what hardware they use and all that stuff? I guess they don't have to, but would they not generally do that?

Dax:

They talked about their techniques and their techniques are interesting and novel, so you can't take that away from them.

Adam:

Mhmm.

Dax:

But they then they separately claim that we use these techniques on this hardware to achieve this outcome. But there's so many ways to lie about that.

Adam:

If it's in the single digits of millions of dollars, I feel like there's somebody out there sufficiently motivated to reproduce it. Can they not reproduce it based off the paper, or is there still some secret stuff that's

Dax:

not a thing. Okay, let's say someone told you that, hey, I can run a SQL query that filters a trillion rows in half a second,

Adam:

right? Mhmm.

Dax:

You, as someone that understands this stuff, you're like, I'm not even gonna waste my time reproducing that, because, what the fuck, that makes no sense.

Adam:

Okay.

Dax:

Yeah. So I imagine that something similar is going on here.

Adam:

So you're saying, I'm sorry, I keep interrupting you. I just feel like you're moving a hundred miles an hour and I'm still at the stop sign. So you're saying that big companies just all believe this is a bunch of BS. Like, the broader people in the know in the industry just dismissed this thing right out, and we're all excited about it, but they're like, yeah, whatever.

Dax:

Yeah. I mean, just because it's such a hyped space, it's so hard to tell what's real and what's not. And the noise comes from both sides. Remember, we're saying it's novel because it's been published publicly. We don't know that OpenAI hasn't already run across this and isn't using it to develop their stuff, like the techniques in there.

Adam:

Oh, So

Dax:

this might not even be a surprise to them, as much as, oh, they independently came across the same techniques, and they know that, yeah, it's not causing a thousand-x decrease in training costs. But then the other noise, and this is the part where I'm like, okay, this could be noise from the other side, but I did think about this when it came out: OpenAI is claiming that they have proof that DeepSeek was trained on outputs of their models, or maybe on potentially unauthorized access to stuff from OpenAI.

Adam:

Okay. And

Dax:

there's, again, this doesn't mean anything, but the pseudoscience part of this is that people were able to get DeepSeek to reply and make the exact same mistakes that o1 makes. Maybe it's a coincidence, maybe it means something. But yeah, the point here is, it's just such a crazy hype space with a ton of money that there's zero ability to draw any kind of "this is what's happening right now" in the moment. It's just impossible in situations like this.

Adam:

Yeah. I guess, so you said OpenAI said this thing about them using their outputs. Have people like Sam Altman or any of the figures in this space come out and said anything about DeepSeek publicly?

Dax:

You know how Sam Altman is. He just did the whole generic, wow, it's really impressive and I'm invigorated by the competition, you know. To be honest, he's more human than Suckerman already.

Adam:

Did you see did

Dax:

you see what Claude did to me yesterday?

Adam:

No. What? Did you tweet about it or something? This is

Dax:

I can't believe it did this. So, again, bringing it all back down to earth, I was trying to insert something into a Postgres database. And of course, on conflict, you wanna do an update operation.

Adam:

Of course.

Dax:

I'm used to MySQL, where you can just say, on any conflict, do this operation. But in Postgres, you gotta specify, like, oh, when this conflicts, do that; when that conflicts, do this. But I was like, okay, can I just say ON CONFLICT on anything, is that possible? And Claude, in a single reply, writes out, hey, yeah, you can do this, and then it writes out the query. And then right after it does that, it continues writing, being like, just kidding, that syntax doesn't exist.

Adam:

What? It said just kidding?

Dax:

Oh my God, this this tweet tweet has 5,000 likes. I didn't even notice.

Adam:

What? You tweeted this? I gotta see this. I'm trying to do so many things. I was also trying to look up technosignatures because I feel so bad about not remembering. Thdxr. Okay. So you just tweeted this recently?

Dax:

I tweeted it last night.

Adam:

Last night? Oh, wow, bro. Claude is straight up pranking me. "Can I make it do ON CONFLICT on anything?" "Yeah, in some Postgres you can use ON CONFLICT ON... Just kidding." What in the world? That's hilarious.

Dax:

Yeah. What? It's funny because we're so used to these models being quirky. But think about this in a traditional product. Imagine you have a product, and you have a button, and the button is like, click here to do something useful, and you click it and it pops up being like, just kidding, we don't have that. That would be so ridiculous, to actually ship something that did that.

Dax:

That's something Terminal would do. But this is in Claude. And to be honest, I've just been annoyed with Claude more and more for the past couple of weeks, and this to me was like

Adam:

Same.

Dax:

This is like the final straw, where I'm like, you're straight up just joking right now. I'm actually gonna consider, I think I'm gonna stop paying for it. I need to reassess what I'm paying for.

Adam:

Yeah. Yeah. Because I just keep signing up for these things, and it's easy to forget. The Claude thing: somebody tweeted the other day that Claude was getting dumber, and he talks about it on the podcast. Apparently, Lex asked him a question from Reddit, which was like, why does Claude just keep getting dumber?

Adam:

And he kind of goes on to say like that people report this on all the major models. This isn't just unique to Claude.

Dax:

No. It's not. But he

Adam:

kind of explains, I don't know, it was kind of hand-wavy. I didn't really take from it that I believe they don't get dumber. He said they never intentionally change the weights.

Adam:

They do sometimes change the system prompts, and they change some other things, I don't know. But he basically was saying, for most people it's just a psychology thing. You're really impressed at first and then just get less impressed over time.

Dax:

That's what I was wondering. Is that the case? And the more you use it, the more you understand the boundaries, like

Adam:

But I do genuinely feel like it's gotten dumber in the last couple weeks, and I don't know what to do with that feeling. Because if I felt it, and then I read someone else felt it, and then I learned that Reddit feels it, there's something there, right? Because things that I felt like it was doing a pretty good job on a few weeks ago, it feels like it's not doing as good now. Is it just a feeling?

Dax:

Yeah. I'm on the side that it's just a feeling. I mean, I would doubt that it's that clear-cut. They must constantly be optimizing, or playing with the amount of compute they're allocating to inference. And there's ways to kind of make it more efficient to run.

Adam:

Is that kind of the tinfoil-hat theory? That it's a cost thing, that they're just using fewer resources over time for inference, and that's what

Dax:

it. Like there's no way that on day one of releasing something, they nailed it and they never have to like tweak that. Mhmm.

Dax:

So I would be surprised if there's not anything where they explicitly know, oh yeah, we did this because we made this trade-off. But I do agree that it must be a psychology thing as well, because if I really think about it, the thing that's not static is that I'm trying to use this stuff more and more, and it's really hard to keep track. You know, it's that thing where everyone's like, oh yeah, I know what I eat every day, I know I eat this many calories or whatever. But then you make them write it down, and they realize how different people's perception of how much they eat, or what they eat, is.

Dax:

So I think it's kind of similar where I know that I'm using it, trying to use it more and more aggressively and I know over time as I get more comfortable with it or like becomes more and more of my workflow, I'm definitely gonna be pushing the boundaries of it more. That just happens with any tool. Mhmm. So it's it's hard to say that that's not a factor. Oh, is that the end

Adam:

of your thought?

Dax:

Yeah. Wasn't good enough for you?

Adam:

No. It's good. It's good. I just thought you were like on a roll, and I'm, like, looking up techno signatures. And then you just

Dax:

Saw? No. I was listening. Techno signatures? We've moved on.

Adam:

I found it. I found it. So I'm gonna tell you at some point. But I do wanna respond to the last thing you said, which I totally knew what you were saying. I'm sorry.

Adam:

This has been a weird one.

Dax:

Yeah. By the way, it just straight up smells like fire in my house right now, so I hope I'm not burning something. That's not good.

Adam:

Yeah. That's not great.

Dax:

I think it's because Liz turned on the heater, and like, you know, houses in Florida? You're not really supposed to

Adam:

Use the heater? Yeah. You never use it, and then when you turn it on for the first time, it smells like there's an actual wood burning fire in your house. I know that smell. I did have a follow-up to what you said.

Adam:

Sorry. I had a question. Do you know, like, when we were just talking about inference and the GPU resources allocated to inference, like, they have to use, I guess, now thousands of GPUs to do the training. Do you know like orders of magnitude wise, like what inference looks like compared to training resources like infrastructure?

Dax:

They still allocate most of their stuff to training, not to inference.

Adam:

Okay. So if they have 10,000 GPUs, like most of them, 9,000 of them are used for

Dax:

Yeah. I don't know the exact ratio, but I know it's more on the training side than on the inference side.

Adam:

Okay.

Dax:

Yeah. I mean, it just makes sense, because if you don't win the model battle, the fact that people are using your product is kind of irrelevant. So it doesn't make sense to

Adam:

Yeah.

Dax:

Over allocate there.

Adam:

Like intuitively that made sense to me and I figured that was the case. It's just interesting when you think about it as a business: the lifeblood of Anthropic or OpenAI is this huge farm of GPUs, and that huge investment in GPUs is really only useful for training new models. So they'd always have to be training new models to get value out of that huge investment. Right? Which I guess they always will be training new models, so maybe it doesn't matter.

Adam:

Yeah.

Dax:

I mean, in the end, to me so far and I felt this from the beginning, this feels like the worst part of the stack to be in. It is the most difficult and the most expensive and it is the most like commodified. So yeah, I mean, I think the thing that people point out with DeepSeek is it's impressive to create something as good as OpenAI stuff. Mhmm. It's totally realistic to assume making a model that's 1% better than OpenAI stuff costs like $50,000,000,000.

Dax:

Like that's totally realistic. Yeah. And that's an argument in favor of OpenAI, like this is why it's not really a threat to them. Simultaneously, it's also condemning this entire business, because if it's gonna take that much capital to make these marginal improvements, and it's a crazy competitive space where the costs are being driven to zero and all these companies are competing. Yeah, it just, I don't know, to me it never made sense.

Dax:

If I was an investor and I wanna bet on this AI thing, this is not the part I'd pick. This just feels like the worst place to put your money.

Adam:

It's so intense, so capital intensive, right?

Dax:

Yeah. Like when I see that, I'm like, I need to invest in someone that benefits from having access to cheap AI models, not the people building the cheap AI models. Mhmm. And yeah, VC Twitter, it's funny, they just go on these little things. They've swung back and forth, and currently they're all saying, oh yeah, the application layer is where you're gonna make a lot of money. But like, you know, a couple weeks ago they were saying the opposite.

Dax:

But that does make more sense to me. Again, I'm not taking the moonshot bet, because the moonshot bet is you invest in OpenAI and they eliminate the whole economy, which I get, and I like bets like that. It's just that for me this one is not the one that I would go for. Yeah. Because something less crazy is probably gonna be the outcome.

Adam:

Yeah. And Sam Altman sucks. That's an easy way to not wanna take that bet.

Dax:

Well, I mean, OpenAI or its competitors.

Adam:

It could be Anthropic or yeah. Okay. Sure. I guess. One

Dax:

last thing on this. Yeah. Someone I did come across something today. Do you remember Mistral?

Adam:

Yeah. Woah. Yeah.

Dax:

Okay. So so

Adam:

Where'd they go?

Dax:

Like, this is maybe the worst company fundraise of all time because they raised like a 150,000,000 on like a $300,000,000 valuation or something.

Adam:

What? Like gave up half the company?

Dax:

They like gave up half their company, and that's nowhere near enough money to play in this game.

Adam:

Like they're trying to do the frontier model thing, like, they're on that.

Dax:

Yeah, exactly. And just like

Adam:

Oh, jeez.

Dax:

What the fuck are they gonna do? I mean, they're a French company, so maybe they're just gonna serve the French market, because I guess the company's there.

Adam:

Maybe they're gonna train models for a thousand times cheaper than OpenAI. Maybe they're

Dax:

gonna go the DeepSeek route. That's possible. But again, like, you just gave away 50% of your company. If you need any more

Adam:

money Yeah.

Dax:

That's crazy. If you need 1,000,000 more dollars, like, what deal are you gonna make?

Adam:

So I didn't remember if Mistral was there's been so many of these companies doing like image stuff. I think the image space is even more messed up in my brain. And I thought maybe they were one of the image generating things, but no. I wanna talk about the app layer, the AI app space, because that's also kind of top of mind for me. Maybe it's because VCs are very excited about it.

Adam:

And Marc Andreessen was just on Lex Fridman, and like I said, I've listened to all of his podcasts. I wanna talk about my experiences, and I wanna hear from you how you think about those companies. But first, techno signatures. The main one that we're looking for is chlorofluorocarbons because, like, nature can't create those. Okay.

Adam:

That requires some sophisticated technology. And he talked about Earth, how we pumped so many into the atmosphere that we blew a hole in the ozone layer, and that would be detectable using the right instruments from far away.

Dax:

That seems pretty solid. I'm increasingly convinced that there's nothing out there, but

Adam:

Really? Oh, because I'm increasingly convinced of the opposite. I listen to a lot of sci fi now, and I'm increasingly convinced that it's everywhere. It's a dark forest.

Adam:

They're all out there. They're just being quiet. That's how I feel. Tell me why you feel that way. Why do you think there's nothing out there?

Dax:

I desperately want that not to be the case, and I think in a lot of ways it's unlikely that there's nothing out there. But man, given just the size of the universe When I say nothing out there, I mean, even if there is, it's not in our perceivable universe or whatever, and like, you know, the galaxies are separating faster and faster over time.

Adam:

Right. So like, there's no way we'd ever reach

Dax:

Yeah. So it just feels like I don't know. I've just got a negative feeling towards that whole thing. It feels so impossible and unlikely. But again, not based on science, just based off of how I feel.

Adam:

Yeah. Just a feeling. I guess, okay. I have a lot of thoughts. First, you just said that, and it reminded me that I just heard how things can't travel across space and time faster than the speed of light, according to our understanding of physics.

Adam:

Mhmm. But the actual universe moves faster than the speed of light. So yeah, the galaxies moving apart are moving faster than the speed of light. Right?

Dax:

Because there's like new space being created in between them, which if you map that to velocity I mean, I'm just making it up based on what you're saying.

Adam:

Well, now you're losing me. But

Dax:

It's like if I magically created more space between you and me, it's like we've moved further apart at a certain rate.

Adam:

Right.

Dax:

But we didn't yeah. So maybe that's what he's talking about. I don't I

Adam:

I just got out ahead of my skis here, even just trying to think about what you're saying. But I think what Adam Frank just said on this podcast was that space time moves faster than the speed of light, like, the expansion of it, but an object can't move across space and time

Dax:

Right. Right. Faster Gotcha. Than

Adam:

the speed

Dax:

of light.

Adam:

But if it is true that the galaxies are moving apart faster than the speed of light, then yeah, you could never get to another galaxy, because we can only dream of ever moving at the speed of light, which would be a crazy accomplishment. But if it's moving

Dax:

faster Unless you do something crazy, like you violate physics or have some completely new Marvel something that just totally breaks that. But yeah, outside of that, speaking quote unquote practically, whatever that means in this space, like, yeah, maybe our galaxy is explorable. And man, even that just feels like I can see there being nothing there.

Adam:

So, okay. So my stance, I guess, is that it's about time. It's not about distance. It's like

Dax:

How long stuff has been around for?

Adam:

Maybe yeah. Maybe civilizations, and I'm stealing this from all the various science fiction writers and actual scientists I've listened to in the last year. But yeah, maybe it's that intelligent societies just don't last very long. So the chance of overlap, you know, our hundred years, two hundred years of technological advancement is just such a tiny little blip in the broader expanse of the universe, that the chance of that blip happening at the same time as a bunch of other blips is maybe super low. But maybe life is super common, just not intelligent societies that last long enough.

Adam:

Like, if we can get past Adam Frank talks about this too. If we could get past all the terrible things that could end our civilization, whether that's nuclear war, climate change, AI, whatever. If we get past all those hurdles and we can figure out how to live for hundreds of thousands of years, millions of years as a civilization, then the chances of finding life maybe are more realistic Yeah. Because you're around long enough to See, I don't know. I'm just saying stuff that I don't have any credibility to say.

Dax:

This is all just different answers to the Fermi Paradox thing. But to me, the problem with the Fermi Paradox, which, just to reiterate, is: given the size and the age of the universe, we'd expect it to be full of life. Mhmm. Given how long stuff has been around and given how much there is. And there isn't, so then you ask, okay, what are some explanations for that?

Dax:

And there's a lot of good explanations, and that's the problem. There's so many good explanations, and they could all be true, but the result of all of them is that life is exceedingly rare and you're unlikely to intersect with it. So that's what kind of bums me out about this concept.

Adam:

It bums you out because it would be nice to like

Dax:

So I don't wanna die, but if I'm gonna die because of an alien invasion, I'm kinda down for that. Because at least I'd learn something deeply important for a few seconds before I get wiped out.

Adam:

Interesting. Okay.

Dax:

Yeah. Like I don't wanna die in like a car accident, like it's done.

Adam:

Oh, yeah. No, that's terrible.

Dax:

Like, yeah. Like, I want like, if I'm gonna die, like at least give me some crazy existential moment.

Adam:

Okay. Yeah. What what's your like top three ways to die? What would I be

Dax:

if if don't can I'm I I have regretted it. I'm sorry. Yeah. Yeah. Yeah.

Dax:

No. I gotcha.

Adam:

Existential dread, etcetera, etcetera. Okay. Anyway, let's talk about the AI app stuff. So this idea was seeded in my head just a few days ago when Andreessen was on Lex, and he talked about I think the example they used was email, AI first email. And how so many apps just have AI bolt-ons now.

Adam:

Like, we've got a little button in the corner that's like, ask AI. But companies that are started with the whole premise of rethinking the product, the entire category of product, AI first. So he used the example of an AI company building an email client or something, which I think I've now downloaded. I don't know if it's the one that they're invested in, but he kind of threw that out there and just listed all the different categories. And then I heard you?

Adam:

Did you tweet about this?

Dax:

No. I told you something that you can't repeat. Oh, yeah. Not public information yet.

Adam:

Which I will not repeat. Thank you. Okay. That's what it was. Yeah.

Adam:

It was a DM. I knew some other data point hit my brain that was like, oh, the app layer of AI. That's a thing. And it's like when you learn a new word and then you start seeing it everywhere. So could you tell me, with your big brain that's been thinking about this probably for ten years, could you tell me what is going on in the AI app space?

Dax:

Yeah. So the way I look at it is there's a new capability. Again, I would categorize AI into two categories. There's the boring part, which is what we're talking about now, and this is the bet that society will continue to be roughly the same and this isn't a truly disruptive, like a totally disruptive thing.

Adam:

You're speaking to like the bolt on thing or you're speaking to like the commodity of like

Dax:

You're just talking about building a traditional product but rethinking it through AI. That's a not very bold way of looking at

Adam:

all Yeah.

Dax:

Mhmm. But part of me doesn't wanna engage with that because, like I said, I don't believe so far in the much bigger bet. But I believe generally that's where you should put your attention, and things that

Adam:

Mhmm.

Dax:

Kind of fall in that category. Mhmm. That said, let's say this ends up not being that crazy of a thing and this is the direction things go. So right now we're in the era of: there's a new thing and nobody knows how to build good UX around it, right? Imagine when the iPhone came out, pull to swipe or sorry, pull to refresh.

Adam:

Oh, yeah

Dax:

yeah. Someone had to come up with that, and the moment they did, it was so obvious that everyone did it. So I think we're in that phase where almost every single product that added AI is just a stupid ass little button that's on top of other shit, and it's just kind of getting in your way and you're always accidentally clicking on it. So that's the era we're in. But at some point we'll see stuff that's like, oh, obviously.

Dax:

And I think we're actually already starting to see some of that stuff. Have you seen this granola AI product?

Adam:

No.

Dax:

Okay, so I think it's a brilliant example of what you're talking about, rethinking products from an AI lens. And they did it in a way that is very well executed.

Dax:

It's not the first thing you would think of, right? But they were like, okay, this problem has existed forever: people taking notes for meetings. How do we make that easier? Boring problem, been around forever. Years and years of products that do that.

Dax:

Bunch of AI products that do that, right? There's a bunch of AI products that are like, I'm Bob the AI, I'm a bot and I've joined your Zoom call and I'm here to take notes. It's just Yes. Like Uh-huh. Weird, totally unnatural, not relating to your current habits at all.

Dax:

Weird social norms around it, like it's just not a good way to introduce this idea to people. So what this product does is it runs on your Mac, it records all the audio from your

Adam:

Your meeting?

Dax:

Yeah, from anything that's happening. So we're also in this era where no one's doing, like, direct integrations anymore, because AI can just handle raw input. Mhmm. So if you can record audio from your Mac, you now support every single

Adam:

Every app. I mean, that makes so much sense.

Dax:

Out of the box, right?

Adam:

Yeah.

Dax:

This shows up in a bunch of different places where people are building AI products. It's totally invisible and it's totally out of your way. They give you a typical notepad you take notes in, okay?

Adam:

Mhmm.

Dax:

You take your shitty little notes, you know, a few comments here and there, whatever. When the meeting is done, AI will go through your notes and augment them with what it knows about the meeting. So it knows what you were talking about, it knows you said this is a priority, and it'll make your notes much nicer.

Adam:

Mhmm.

Dax:

And it's just a one step process. So it doesn't feel like an AI product, it just feels like I magically take good notes with the same habits that I've had forever.

Adam:

Mhmm.

Dax:

And then at the end, I just get much better notes than I would with any other app. And I think this is kind of what you're talking about, where they're reimagining it, and they've done it in a way where it's not like you need to chat with my AI bot, right? It's totally invisible.
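The flow Dax describes here, background audio capture plus rough notes plus one augmentation pass after the meeting, could be sketched roughly like this. This is a hypothetical sketch, not Granola's actual implementation; `transcribe` and `complete` stand in for a speech-to-text call and an LLM call.

```python
# Hypothetical sketch of the "invisible" note-augmentation step: capture
# audio in the background, let the user type rough notes as usual, then do
# one post-meeting pass that rewrites the notes using the transcript.
from dataclasses import dataclass


@dataclass
class Meeting:
    audio_path: str   # system audio recorded during the call
    rough_notes: str  # whatever the user typed in the notepad


def augment_notes(meeting, transcribe, complete):
    """One-step pass: expand rough notes with what the transcript knows."""
    transcript = transcribe(meeting.audio_path)
    prompt = (
        "Meeting transcript:\n" + transcript +
        "\n\nAttendee's rough notes:\n" + meeting.rough_notes +
        "\n\nRewrite the notes so they are complete and well organized, "
        "keeping the attendee's own structure and priorities."
    )
    return complete(prompt)
```

The point of the sketch is that the AI never appears in the UI; the user only ever sees a notepad and, afterwards, better notes.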

Adam:

Super smart.

Dax:

So I think we'll start to see products that are technically powered by AI but it's invisible. The only way you can tell is the outcome or the quality of the product is much higher Mhmm. Just because all of these structuring-unstructured-data problems are effectively solved now. Does that make sense?

Adam:

Man, yeah. It makes a ton of sense. I've already downloaded Granola now.

Adam:

I feel like this is very exciting as a person who has an entrepreneurial side. Just kind of makes you want to build like a million companies. Not a million.

Dax:

Just build one.

Adam:

Yeah. Just like one company. It just makes you want to build something, doesn't it? It feels like the Wild West, like starting over. All the digital products we use could just be reimagined, and there's so many categories of those, and it kinda makes you just wanna build some of them.

Dax:

I do think though that people should be aware that this isn't a reset to like 2010. Because in 2010

Adam:

What was what was 2010?

Dax:

The like, you know, it was a similar situation like nothing was built and there was like all these opportunities to build these pretty like basic straightforward applications.

Adam:

Wait. 2010, what was the new thing that enabled like mobile? What are you talking about?

Dax:

Just like more internet, more web, more capability. Like, SaaS was kind of created in that era, all that stuff.

Adam:

Gotcha.

Dax:

In that time, you were shifting people from not using computers to using computers to solve this problem. So as much as it feels like we're in a reset and there's a new opportunity, it's not the same, because you can't just deliver an MVP anymore. Oh, sure.

Adam:

You can

Dax:

deliver an MVP in 2010. But if you wanna build a new email AI product, you need to build something as good as Superhuman as a floor. Mhmm. And then you can do the side

Adam:

The extra stuff that's

Dax:

Innovative, right? Yeah. Okay. So it's still gonna be quite hard, just because the bar is very high to get someone to switch from something where all the normal app features are pretty exhaustive and work pretty well. That said, that side of things has also just gotten easier to do as well.

Dax:

But Mhmm. Yeah. I am feeling this with Radiant because, yeah, categorizing financial transactions was very very difficult

Adam:

Mhmm.

Dax:

Like, prior to AI. And now it can do a really good job, even the shitty thing I implemented. I was able to go through my stuff with it. And I've done this for years, right? Like all my business transactions, I've gone through every single one of them for years and years. And just having AI do a first pass and then me doing a second pass, it's much better.

Dax:

And this is just the beginning of Yeah. Of all this stuff. But we still have to build the entirety of a straightforward app, and you have to do that while the incumbent fails to do the new thing, which I think will happen. It's just, you know, not as easy as it seems.
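The first-pass, second-pass flow Dax describes for transaction categorization could look something like this minimal sketch. All names here are hypothetical, not Radiant's code, and `keyword_classify` is a trivial keyword-rule stand-in for a real model call.

```python
# AI does a first pass over every transaction; the human only reviews and
# optionally overrides, instead of categorizing everything from scratch.

def categorize(transactions, classify, overrides=None):
    """First pass via `classify`, then apply human corrections by id."""
    overrides = overrides or {}
    result = {}
    for tx_id, description in transactions.items():
        result[tx_id] = overrides.get(tx_id, classify(description))
    return result


def keyword_classify(description):
    """Trivial stand-in 'model': keyword rules instead of a real LLM call."""
    rules = {"coffee": "Meals", "aws": "Infrastructure", "uber": "Travel"}
    for keyword, category in rules.items():
        if keyword in description.lower():
            return category
    return "Uncategorized"
```

For example, `categorize({1: "AWS monthly bill"}, keyword_classify)` proposes `Infrastructure`, and a human second pass would only need to touch whatever lands in `Uncategorized` or looks wrong.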

Adam:

Yeah. There's like the table stakes part that's kind of boring where you just have to have all the features that people expect from an app like that in order to unlock the new way of thinking about it. So for the granola case, it was like they had to build an actual note taking app and all that comes with it.

Dax:

That's a good example of something that works, because the table stakes scope is really small. Yeah. And they benefit from this new dynamic of not having to do a 100 integrations with every single like, we support Zoom, we support Google Meet, we support

Adam:

And how did you explain how that dynamic came to be? Because I get it for recording audio, it just works for everything. But what you're saying is there's this whole era of not integrating directly with stuff? What's that about?

Dax:

Yeah. So let's say you're I mean, let's take we're not actually doing this, but for Radiant, there's 5,000 financial accounts that we'd need to integrate for all the various places people have their data. You could just send AI to go visit the site for you and figure out how to pull out your information instead of manually doing an integration Mhmm. With each thing. Because AI can operate at like one level down, like it doesn't need an API. A developer needs an API, but an AI agent in theory doesn't need one.

Dax:

So you can kind of give it a general set of instructions that'll work on any raw input. Anywhere you needed all these nice clean integrations, you can probably make do with a much messier, unsanctioned integration.

Adam:

Interesting. Okay. That didn't really answer my question. I mean, I don't feel satisfied. Maybe it did, but I think there's another example and I can't remember.

Adam:

I feel like there is another company where it was like, oh, that's a clever way of integrating with everything. Oh, no. It's the conversation we had about an AI tool that just looks at the file system. Use that as the source of truth, and then you don't have to integrate with every editor. You just interact with the file

Dax:

system. Yeah.

Adam:

Yeah.

Dax:

There you go.

Adam:

So that's like a very clever way to get around this, like, this thing on your landing page where you have all the things you support. Yeah. It's like, what's a common denominator?
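The common-denominator idea they land on, watching the file system that every editor already writes to instead of integrating with each editor, could be sketched like this. The names are hypothetical, and a real tool would use OS-level file events (inotify, FSEvents) with debouncing rather than full snapshots.

```python
# Treat the filesystem as the integration surface: any editor that saves
# files is "supported" for free, because we only ever diff what's on disk.
import os


def snapshot(root):
    """Map every file under root to its last-modified time."""
    files = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            files[path] = os.path.getmtime(path)
    return files


def changed_files(before, after):
    """Files that are new or modified between two snapshots."""
    return [path for path, mtime in after.items()
            if before.get(path) != mtime]
```

Whatever wrote the file, VS Code, Neovim, or anything else, shows up the same way, which is exactly the "what's the common denominator" trick.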

Dax:

The other side of this though is, if you look at a lot of these products like Granola like, there was the other one, I forgot the name of it, that records everything. It takes a screenshot every three seconds and then has AI index it, and you can ask it like, hey, what was that thing I read the other day about whatever? So you see how all of these things are native apps at the OS level? It just brings up the question: aren't Microsoft and Apple gonna bake these in? Oh, yeah.

Adam:

If you're if you're building that kind of stuff, it's scary.

Dax:

Yeah. If you think about this stuff, we're getting these one-off solutions that people come up with. But at the end of the day, if you just integrate it at the OS level, it would just work everywhere and be a lot more awesome. Yeah. So it feels like that should be the ultimate

Adam:

The Apple intelligence kind of thing, like Apple intelligence should do that stuff if it ever actually does anything.

Dax:

It sucks. Apple intelligence sucks, but in theory

Adam:

Does it even work yet? Like, I don't even think they've Did they turn it off because it was doing bad things? It's

Dax:

I've had it for a while and I have not used it once. I think somehow it's made things even worse. I feel like I used it even less now than I used to.

Adam:

I don't know.

Dax:

I don't know what they're doing. It's pretty bad.

Adam:

Hopefully, they do that thing where they catch up really fast because I would like Apple software to be good because I love their hardware.

Dax:

Yeah, we'll see. But I will say, this type of thinking is new for me. See how I described a very clearly good opportunity and then the ideal, which would be, you know, Apple or Microsoft integrating it? That ideal might be ten years away, so there's still plenty of time to make money in the meantime. Yeah. But I've shifted to, like, if I can see the ideal and it's not aligned with what I'm doing, I just don't wanna work on it.

Dax:

Just feels bad to me now. Like

Adam:

even if it's ten years, you just don't wanna invest in that idea.

Dax:

Like, I wanna have a real shot at building the ultimate thing, even if the opportunity is great otherwise.

Adam:

Are you quitting terminal? Is that what you're saying? Is it not AI enough for you?

Dax:

It's not AI enough.

Adam:

You missed the meeting yesterday. I'm just saying.

Dax:

I was the only one that remembered the meeting.

Adam:

Yeah. That's That's

Dax:

the funny part.

Adam:

There was a meeting.

Dax:

We have weekly Wednesday meetings, and I was like, oh, I can't make it. So I posted at 02:30, when we have the meeting: hey guys, I can't make the meeting. And nobody else said anything, and the meeting didn't happen, so everyone missed it. I was the only one that actually remembered that it was supposed to happen.

Adam:

It only would have happened if you started it, but the fact that you didn't start it because you weren't gonna make it, it's funny.

Dax:

There's something else I wanted to talk about. It was totally unrelated to all of this.

Adam:

Totally unrelated to AI and apps and aliens?

Dax:

Mhmm. Yes. I posted a video last week or was it earlier this week? No. It was on Sunday.

Dax:

Posted on Sunday. Best video I've ever made in terms of

Adam:

Oh, really? Views? Yeah. I gotta check out the SST YouTube.

Dax:

Of a video. Again, I think it's really not the execution of the video. I think we're just picking some pretty good topics.

Adam:

What's your handle?

Dax:

Ugh. I did it I did it it's just SST, so just the

Adam:

Nope. That's something Korean. That's definitely not it.

Dax:

What? Really?

Adam:

At s I s

Dax:

mean, I guess

Adam:

What what s s t You

Dax:

don't you don't have to look it up. I'll just tell you.

Adam:

I got it. No, I got it. I don't use my computer. Is it that one?

Dax:

Yeah. So I made a video on my remote dev setup. This is

Adam:

Oh, I've been wanting this video. I can't believe I didn't see it. How did I not see it? This is how big the world is. Anytime you think everyone just sees all your stuff if anyone sees your videos, I should see your videos. Right?

Dax:

That's true. Yeah.

Adam:

And I didn't know you made this video.

Dax:

But like, are you ever on YouTube?

Adam:

No. No. I go to YouTube from Twitter links.

Dax:

Are you

Adam:

Yeah.

Dax:

Then why would you see it? Oh, because are you on Twitter?

Adam:

I mean, sometimes. You would think I would see your tweets. I don't know.

Dax:

That's true.

Adam:

I feel like we're friends and I should know when you make a good video that I really wanna see. And this is one I've wanted you to outline, because I didn't wanna bug you too much and be like, hey, could you tell me how you do the remote tmux thing? But now you've just made the video and I can watch it like every other normie. This is awesome.

Dax:

Yeah. It was a I think a lot of people were waiting for it which is why I think it it did pretty well. So this is our best performing video ever which we're really happy about.

Adam:

I love the title, I don't use my computer. Yeah. I mean, the thumbnail. Yeah.

Dax:

So YouTube comments, let's talk about YouTube comments real quick.

Adam:

Oh, yeah.

Dax:

For me personally, this is where I experience just the dumbest of all humanity, I think. It's really wild. Like, I've been on Twitter a long time, and of course I get dumb annoying comments there, but YouTube somehow just consistently tops it. It surfaces a persona that I run into a lot on the internet, and to me it's a very miserable persona. It's the persona of someone that thinks every single thing they interact with is a scam somehow. They're so eager to be I think what's driving them is they wanna feel like they're smart and like they picked up on something that everyone else is falling for.

Dax:

Mhmm. But they're so desperate for that moment that onto every single thing they perceive, they project that, oh, this is a scam somehow. Yeah. So a bunch of people were just like, this is an ad, or they were talking about how I only do this because it's free.

Dax:

Because I mentioned that my server that I use now is sponsored. But like, I've been doing this for years

Adam:

Shout out to ReliableSite. Yeah. Yeah. It's very reliable.

Dax:

Well, like, I paid for it for Yeah. Before I got that deal.

Adam:

Mhmm. And

Dax:

also in the video, I outlined how you can start really small. And the entry price for this again, people love saying $5 VPS, there's just the $5 VPS. Realistically, it's maybe more like 15 for something that's decent, but a reasonable price. But everyone was just like, as soon as their brains locked onto, oh, this is the angle, a bunch of comments were around talking about how I was trying to trick them into doing this because it's expensive.

Dax:

And I'm just like, how do you go through life like this? Everything must be so miserable if you're perceiving every person you interact with as trying to rip you off somehow, you know.

Adam:

Yeah. The internet kind of sucks. It's kind of amazing, but it also kind of sucks. I'm just reading YouTube comments now. I wish I hadn't.

Adam:

Sorry. Just don't remind me that YouTube exists and I'll be a happier person.

Dax:

That's funny. What's even at the top right now? I think one of those is probably at the top.

Adam:

So it's funny. I just saw Kevin Naughton commented Yeah. An excuse to not do any work for the next three or four weeks. I really do need to spend like two days and just copy all your Neovim setup. My Neovim is so bad right now.

Dax:

I know.

Adam:

I just need to do all that work, and it's so hard to take a time out. It's that stupid meme that I hate because I resonate with it, the cavemen with the square wheels who are like, too busy, leave me alone, and the guy's like, but here's a wheel. It's that, but it's just so hard.

Dax:

Maybe you should just go use cursor.

Adam:

You know what? I've actually thought about downloading it. I think that I'm doing it right now. I do want to

Dax:

I have it downloaded.

Adam:

Yeah. I want to download it. Like, why have I it's like, all this stuff is free. Paid for by VCs. Why am I not using all of it?

Dax:

It's not free, but like you have to pay for it, it's not crazy. Yeah, it's not that expensive.

Adam:

I just assumed it was free.

Dax:

It's just so miserable for me. Yeah. This is like another point of stress for me, and stress is me being very dramatic. Stress around my editor. I really like Neovim and it is truly incredibly productive.

Adam:

Mhmm. Same.

Dax:

This cursor style of thing, if it continues to get better, is just gonna be the most productive thing. Yeah. But it doesn't address the parts that I particularly find annoying. I hate the clunkiness and the slowness of VS Code and navigating and stuff. And yes, you're doing all that less with this type of thing, but it's not taking it to zero.

Dax:

I don't see why Neovim would get something that's equivalent. I've seen the current effort for it

Adam:

Yeah.

Dax:

And I go visit the GitHub and I read it like once a week, and I'm just like, this just doesn't feel like it's gonna be good. There's so much setup involved and

Adam:

Yeah. It's the we-have-cursor-at-home thing, and cursor at home is like four libraries duct taped together in, like, socks. Why am I installing something like that on my machine? What is going on? It's like, there's too much.

Adam:

Too many steps.

Dax:

I don't mind switching editors. I just wish the foundation that this new stuff was built on was not VS Code, because VS Code sucks. That said, I think Zed will probably cause they're in this hyper competitive mode.

Adam:

Wait. You you think they will what?

Dax:

I think their AI stuff will get as good as cursor's, if not better.

Adam:

So so they are working on AI stuff then?

Dax:

They have to be. I mean, they have to. Because

Adam:

I I just I just had the thought in my sleep last night, which is just an indictment on my sleep. I had the thought like, oh, poor Zed. Like, Zed how does Zed have a chance when there's like all these AI things now, but they're doing the AI thing? It's like there's so many editors already. If you're not an AI editor, good luck.

Adam:

Right?

Dax:

Yeah. No. It's true. They have a tough battle because okay, it kinda goes in two directions.

Dax:

On one hand, yeah, it was way faster to ship Cursor by building on VS Code. On the other hand, I've just found as I get older that doing the more extreme thing always ends up having a good benefit that you can't predict. So them going ground up, building a new editor, is way harder. The whole ship-fast mindset would say that's a waste of time, just focus on the part that differentiates, the AI part. But I can see how, actually, no, this is gonna end up being the thing that wins. So to me it's plausible. I don't think they're screwed. And they are gonna do AI stuff.

Adam:

Yeah. I just didn't even know they were working on it. If they're working on the AI stuff, then yeah. Good for them.

Dax:

And they're not built on

Adam:

I have no idea. I don't keep up on this stuff. I've just I'm I use NeoVim. Someone said use NeoVim, so I do.

Dax:

I mean, they say AI in their, like, integrate upcoming LLMs into your workflow, generate, transform, analyze code. And cursor's not a lot of features. It's like a really small set of features, to be honest.

Adam:

I've never played with it. I'm literally setting it up right now.

Dax:

But yeah, so I'm like, okay, that gives me some hope, because maybe the editor experience won't suck. But then it's not in the terminal anymore, so my whole setup is now a lot more confusing. Like, I like having everything in a single terminal, switching between it.

Adam:

Yeah. All my muscle memory is around switching between tmux panes and doing all this stuff. And if I'm just in some editor now, I guess I can get the Vim experience in the files, the actual files I'm modifying. But okay. Can I go back to something, just on behalf of the normies that listen to us?

Adam:

Why is VS Code bad again? I know we all hate VS Code, but someone remind me, why is it

Dax:

Whenever I try to use it, it's like a slow piece of shit and the Vim emulation is bad. So

Adam:

it's slow.

Dax:

Yeah. It just, to me, feels bad to use.

Adam:

Okay. I just take everyone's word for it. When everyone's making fun of VS Code, I'm like, yeah, VS Code. But I didn't actually know why.

Dax:

It just doesn't feel good to use. That's what it all comes down to for me.

Adam:

Okay. Okay. Well, I'm gonna try cursor. I'm gonna give it a go. Hope it doesn't botch the whole terminal code repo.

Adam:

YOLO. Here we go.

Dax:

Yeah. Zed does have their own, like, remote protocol thing, so I could continue to effectively host Zed on my server even though the front end of it is running on my machine. That's cool. But again, then I have to have, like, a separate terminal window, unless my terminals run inside of Zed.

Adam:

Ah, just use the integrated terminal. I hear it's good.

Dax:

Skeptical, but Skeptical.

Adam:

I'm gonna give it a shot. I'll let the listeners know if cursor's good. They probably already know, but I'll let you know.

Dax:

No. Use cursor and use Zed. And then go fix your Neovim.

Adam:

Yeah. I need to fix my Neovim. Okay. I'll try Zed. If Zed has AI stuff, I'll start there actually, because I'd rather use the thing that you think is good, generally in life.

Dax:

Do they let's see, Introducing Zed AI, this was like in August, but they're definitely doing stuff.

Adam:

Definitely doing stuff. The Zed what is it, zed.dev? The editor for what's next with humans and AI. Let's go.

Dax:

I had this thought the other day, I was like, if you're a VC funded company, you've probably shifted towards AI. Like, if you look up anyone's website, no matter how random it is,

Adam:

like Mhmm.

Dax:

They seem to like really focus on AI. Like most of them just took their existing slogan and added, like, "and AI" to it. Wait, is that literally what Zed did?

Adam:

Maybe.

Dax:

Yeah. With humans and AI.

Adam:

And AI.

Dax:

So I saw something the other day. I was looking at Turso's website, and at the bottom now they have: unlimited databases, personalized scale, "supercharge," which, you know, was probably there before, "your LLM applications." So they just, like, added that.

Adam:

Uh-huh.

Dax:

They're like, okay, we've all observed this, you know, whatever. But then I think about, okay, there's VC funded companies at this stage that have not done this at all, ignoring us. And I'm like, what is that like? Like, yeah, Bun didn't go and add, like,

Adam:

Yeah.

Dax:

The best way to run JavaScript for humans and AI, you know?

Adam:

That's a good point.

Dax:

I'm not making fun of Zed, because with Zed it, like, actually makes sense. A lot of just general purpose things have now added "and AI" to it. Mhmm.

Dax:

So I'm like, how are they thinking about this stuff? They're just, in a way, heads down ignoring it. I'm sure they're not actually, but like, you know, their strategy is heads down, ignoring it. Yeah. Oh, what?

Dax:

Alright. This is probably a coincidence, but I went to Bun's site and they have a used-by section, and one of them is Midjourney. So they also kind of they're like, you know, it's probably just a coincidence.

Adam:

Tip of the hat to AI.

Dax:

Used by X, Typeform, Midjourney, and Tailwind.

Adam:

That's an interesting collection of companies.

Dax:

You know who else uses it?

Adam:

Terminal. Terminal. We gotta get Terminal on the Bun site. Let's go.

Dax:

I think I might be the number one bun user. I'm not gonna explain this. I think I'm the number one bun user because I use it

Adam:

I've been bun pilled. I'm enjoying bun quite a lot because I just copy everything you do, and

Dax:

I cannot stop talking about how good their product execution is. Like, it is so Yeah. They're incredible. Because every single time they put out a feature, I've been like, I don't get it. And then fast forward three weeks later, I'm using it.

Dax:

It just, like, invisibly snuck into every little piece. So we're launching a new update in the SST console. We have this workflow section in the config where you can set up your CI steps. Before, we didn't let that be configured, and most people don't have to, the defaults make sense. But if you wanna configure it, we were like, okay, how do we let you run shell scripts, like, in JavaScript, and have your own JavaScript conditionals?

Adam:

Mhmm.

Dax:

And we're like, okay, fuck it. We're just gonna drop Bun Shell in there. Got it. So now, in SST config, your workflow is just Bun Shell, and they figured out all that stuff.

Adam:

Love it.

Dax:

So Really great product execution. Amazing.

Adam:

There's nothing better than that, like, await dollar sign and then put your shell command in there. That feels so good.

Dax:

Yep. Yep. Yep. Yep.

Adam:

Yep. I gotta go. For non biological reasons. Okay? Non biological.

Dax:

No one believes you.

Adam:

I gotta go, Dax. When I say I gotta go, you're like, one more thing, and then you have like four more things. We could pause if you wanna do a two hour episode. Okay.

Dax:

No. That's fine. You can go. You don't wanna talk to me. It's fine.

Adam:

I wanna talk to you. I just

Dax:

You don't wanna talk to me. It's fine. I'm gonna see myself out. This is our last episode ever. We're not doing this anymore.

Adam:

Stop it.

Dax:

Adam doesn't wanna talk to me.

Adam:

Stop it. Okay. I'm going. Alright. Too.

Creators and Guests

Adam Elmore
Host
AWS DevTools Hero and co-founder @statmuse. Husband. Father. Brother. Sister?? Pet?!?

Dax Raad
Host
building @SST_dev and @withbumi