EXCLUSIVE: an OpenAI x Nate Conversation on Atlas, AI Agents, and the Future of Work

3.5K views November 11, 2025

What’s really happening inside OpenAI’s new AI-powered browser, Atlas?
The common story is that it’s just “ChatGPT with tabs” — but the reality is that Atlas could redefine how humans and AI share the web itself.

In this exclusive interview with OpenAI’s Ben Goodger — the engineer behind Chrome, Firefox, and now Atlas — I go deep on:
• Why AI agents mark a new chapter in browser evolution
• How LLMs are transforming engineering speed and creativity
• What security design trade-offs come with agentic browsing
• Where the future of web interaction is headed next

Atlas isn’t just another browser — it’s a glimpse into how the next generation of AI will live on the web. Read Ben’s blog for the technical deep dive: https://openai.com/index/building-chatgpt-atlas/

Subscribe for daily AI strategy and news.
For deeper playbooks and analysis: https://natesnewsletter.substack.com/

0:00 This is a good one, guys. I got to sit
0:01 down with OpenAI's Ben Goodger, the
0:04 lead engineer building the Atlas browser
0:06 for OpenAI. We had a really wide-ranging
0:08 conversation. We talked about the
0:09 history of browsers. Ben, of course, has
0:11 been involved in building Chrome, been
0:13 involved in building Netscape Navigator
0:14 for those of you with gray hairs, and
0:17 now he's taking the lead on building the
0:19 AI powered browsers of the future at
0:21 OpenAI. So, I had a lot of fun. We
0:24 talked about what the future looks like.
0:25 We talked about security implications.
0:27 We talked about the way we get work done
0:29 and how that's changing. It was a super
0:31 wide-ranging conversation. I put a cut
0:33 up here on YouTube and if you want
0:35 the full cut, of course, you can go over
0:36 to Substack. Enjoy, and yeah, let's
0:40 talk about AI powered browsers. Hey all,
0:42 I'm Nate and I have a special guest with
0:45 me today. Uh Ben, why don't you
0:47 introduce yourself?
0:48 >> Hi, I'm Ben and I am the head of
0:50 engineering for ChatGPT Atlas here at
0:53 OpenAI.
0:55 And how did you get to be head of
0:56 engineering at ChatGPT Atlas?
0:58 >> I've always been interested in the web,
1:00 web browsers, uh going back to I guess
1:02 like the mid to late 1990s when the web
1:04 was first developing. Uh I was like a
1:07 hobbyist web developer building sites
1:09 just for fun. Uh and early in my career
1:12 I got involved with Mozilla, the open
1:13 source project, made contributions to
1:15 that, and ended up getting hired by Netscape.
1:18 I grew up in
1:21 New Zealand, and I moved up to
1:24 Silicon Valley for a period of time. Got
1:26 to see Netscape in its final days which
1:28 was an interesting experience. But I
1:30 ended up moving on to Mozilla where I
1:33 helped work on Firefox, the first version
1:34 of that. Uh and then moved to Google. I
1:37 was at Google for nearly 20 years.
1:39 Helped build the Chrome browser there.
1:42 And then about um almost 18 months ago I
1:45 came over to OpenAI, very
1:47 interested in exploring what the web
1:50 would look like if you see it through
1:52 the eyes of maybe having an assistant
1:54 like ChatGPT really at the core of the
1:56 browsing experience. Uh and so since
1:59 then have been trying to build that,
2:02 and we shipped our Atlas
2:04 product for Mac a couple weeks ago,
2:06 which is really exciting. I am really
2:09 curious to hear, you know, you've shared
2:11 a bit about what made it compelling for
2:12 you to come to OpenAI, the sort of the
2:14 piece of having intelligence on the web.
2:16 What is it that you felt like you
2:19 learned or that crystallized for you
2:21 along that journey over the last 18
2:23 months that gave you a sense of clarity
2:25 about what you wanted Atlas to be and
2:27 where you wanted to take Atlas? I think
2:29 that, you know, the Atlas product, you
2:31 know, at face value, it kind of
2:32 resembles a traditional browser. And I
2:34 think that that's important because
2:36 everyone knows what a browser is.
2:38 Everyone uses a browser on a, you
2:40 know, pretty frequent basis. So, I think
2:41 there was an aspect of we need to build
2:43 something that people can understand.
2:45 Um, even as we try and bring really
2:48 advanced capabilities into it. Uh and
2:50 then the other thing that I've learned
2:52 is that the pace of development of the
2:55 the tech that is happening here at
2:57 OpenAI is just so incredibly fast uh
3:00 that even some of the
3:02 limitations that you might see
3:04 one month won't be there the
3:06 next month. As we've
3:07 built features like agent, we've seen that
3:09 come together, seen it get much
3:10 faster and much more accurate at
3:13 clicking on things and doing stuff for
3:15 you on the web. I'm an
3:21 optimistic person by nature, but I'm
3:22 even more optimistic about where
3:24 this tech is going and what types of
3:26 product experiences it will
3:27 enable. What is it about your working
3:31 style or the team's working style that
3:33 has shifted over the last 18 months as
3:35 you've been working on building this
3:36 browser? Now you know for me when I
3:38 joined, you know, I'm the team manager,
3:41 but when I was the
3:42 first person here working on this I was
3:44 also writing code, doing
3:45 prototypes, that type of thing. I used
3:48 ChatGPT early on extensively to help me
3:51 learn new programming languages and
3:53 really get up and running again. But
3:55 then as we had more and more engineers
3:56 join the team, and as we've
3:58 launched products like Codex, we've gone
4:00 on the evolution of Codex, especially
4:02 Codex CLI. I would say in the past couple
4:05 of months Codex has really
4:07 changed the way in which we work, and
4:09 what we see is that people that are
4:10 using Codex are just so much more
4:13 productive and I think there's an aspect
4:14 to Codex which is, you know, it allows
4:16 people who don't code that much to do a
4:19 little bit of coding but it also helps
4:20 very experienced engineers get way more
4:23 done and I think that's what I'm really
4:25 excited about uh because a really
4:26 experienced engineer can kind of steer
4:28 Codex and be just
4:30 monstrously productive.
4:33 Is that just a situation where
4:35 you see engineering productivity
4:37 patterns starting to shift toward
4:38 multi-threadedness?
4:40 One is to sort of understand how
4:43 software that exists works and another
4:46 is to prototype a new idea to see if
4:49 the juice is
4:50 worth the squeeze, if you like. And
4:52 then lastly it's to actually get the
4:54 work done for implementing the
4:56 new feature. We work in these large
4:59 code bases, and sometimes, actually usually,
5:01 the documentation isn't where it needs
5:02 to be or the documentation is out of date,
5:04 that type of thing. Codex
5:06 or other similar tools can read much
5:09 more quickly than you can and give you
5:11 like a pretty good answer very fast so I
5:13 have an idea for the product and I think
5:14 well maybe this is like super
5:15 interesting and I could spend a bunch of
5:17 time on it. I can throw together a
5:18 prototype in Codex and decide, hey, this
5:20 doesn't quite work the way I want it to
5:22 and so maybe I'll choose
5:24 something else uh to focus on and then
5:26 lastly you know some of our most
5:28 experienced, most productive engineers
5:30 are just using it to build features.
5:32 You know, either
5:33 refactoring code or just building like
5:35 the front-end code itself or you know,
5:37 any aspect of the feature really. I
5:40 think that gets at a use case I actually
5:42 had for Atlas today. Um, which was
5:44 really interesting to me. Uh, I was
5:47 looking at a GitHub repo and I was
5:50 running through and I was like, I want
5:51 to get a sense of what's in this GitHub
5:53 repo really quickly. What if I just have
5:56 Atlas look at it and I put the assistant
5:59 up on the side and I just have a
6:00 conversation with Atlas about the repo.
6:05 And what I found was there was this sort
6:07 of magic that happened when I was in
6:11 Atlas. There's a magic to code
6:18 comprehension I'm finding in
6:19 the way ChatGPT touches
6:21 and plays with code. It was able to
6:23 click through, take control of the
6:24 screen, look at all of the different
6:26 files in the repo, and it came up with
6:28 some really, really thoughtful questions
6:31 that enabled me to get much more
6:32 fingertippy with the code much more
6:34 quickly. It was one of those magic
6:36 moments for me. So, I think the browser,
6:38 it's so simple, it almost sounds
6:40 silly, but like reducing the friction
6:42 for a user to access some of this magic,
6:44 it can feel like making the magic
6:46 available like in the first place. You
6:48 know, I've always been able to take a
6:49 web page, maybe you could print it as
6:50 PDF or you could copy it and
6:52 paste it into ChatGPT, but
6:55 that's just like a few extra steps. And
6:57 when you can just bring this up, you
6:59 know, in situ and ask the question
7:01 directly, it's like, you know,
7:03 suddenly it's there. One of my
7:05 favorite use cases has been shopping.
7:07 You know, I could do a lot of online
7:08 shopping when I'm not working.
7:10 Sometimes I'd be looking at a product
7:11 and I can ask the sidebar, is this sort
7:14 of the best price that I can find on
7:15 this thing? And there, you know, when
7:17 paired with our search agent, like it
7:19 will go off and like browse the web and
7:21 find out if that really is the
7:23 best price or if there's a better deal
7:24 on it somewhere. I had one case where it
7:26 actually it hit really well. I was
7:28 looking at a pair of shoes uh and it
7:30 found the shoes available from a
7:32 different site for about $60 less. Wow.
7:35 >> And that was one of those like really
7:37 kind of wow moments.
7:38 >> Yeah. I there's something around habit
7:40 formation where when you get that
7:42 dopamine hit of wow, this is really
7:43 easy. this is something that I didn't
7:45 realize I could do. What were some of
7:47 the things that your team really had to
7:48 wrestle with around trade-offs and
7:50 decisioning to make that browser come to
7:53 life? One is the product design itself
7:56 and then the other is the technical
7:58 infrastructure and all the magic that
8:00 went into it. So with the product um we
8:03 wanted to design something that was
8:04 really useful but also felt familiar.
8:07 Like I said everyone kind of knows what
8:08 a browser is. Um, we had, you know, lots
8:11 of debates about how we should design
8:13 features like basic aspects of the
8:15 browser. I think there was a lot of room
8:17 for innovation in those areas, but it's
8:19 also a bit of a double-edged sword. So,
8:21 we've tried to find a balance between
8:23 something that feels familiar uh and yet
8:25 still has some improvements for
8:28 folks that are, you know, looking to
8:29 find more efficient ways to get stuff
8:31 done. And then on the technology side,
8:34 uh this is where you know we spent a lot
8:36 of time both on the more traditional
8:38 browser infrastructure building on
8:39 Chromium as well as how we built some of
8:42 the more cutting edge features like
8:43 agent. We want to build a product that
8:45 feels very fluid and fast. Uh and we
8:48 also wanted to build a very cutting edge
8:50 kind of product user experience and just
8:52 sort of the standard way in which a lot
8:54 of Chromium browsers are built just
8:57 doesn't make that super easy. So
8:59 we built a unique way of hosting
9:02 Chromium. It's almost sort of agentic
9:04 in form. We run Chromium as an
9:07 out-of-process service so that when you
9:09 start Atlas, you're not actually blocked
9:11 on Chromium starting up. So the browser
9:14 can start very fast, and Chromium
9:17 takes however long it takes to start up.
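An aside on the startup pattern described here: the UI process comes up immediately while the heavy engine warms up in a separate process, and only the first engine-backed request has to wait. Atlas's actual architecture is only sketched in OpenAI's blog post; the sketch below is a hypothetical Python illustration (the `EngineService` name, the sleeping subprocess standing in for Chromium, and the readiness handshake are all illustrative).

```python
import subprocess
import sys
import threading
import time

class EngineService:
    """Hypothetical sketch of the out-of-process pattern: the UI starts
    immediately while a heavy engine (a placeholder subprocess standing
    in for Chromium) warms up in the background."""

    def __init__(self, cmd):
        self.cmd = cmd
        self.proc = None
        self.ready = threading.Event()

    def start(self):
        # Launch the engine in its own process; the caller is not blocked.
        threading.Thread(target=self._spawn, daemon=True).start()

    def _spawn(self):
        self.proc = subprocess.Popen(self.cmd)
        # A real browser would wait for an IPC handshake; we just mark ready.
        self.ready.set()

    def request(self, timeout=10.0):
        # Only the first engine-backed request waits for the engine.
        if not self.ready.wait(timeout):
            raise TimeoutError("engine failed to start")
        return "page loaded"

# Stand-in "engine": a child process that just sleeps.
svc = EngineService([sys.executable, "-c", "import time; time.sleep(30)"])
t0 = time.time()
svc.start()  # returns immediately; the UI could paint right now
startup_cost = time.time() - t0
result = svc.request()  # first real request waits for readiness
svc.proc.terminate()
print(f"UI blocked for {startup_cost:.3f}s; first request: {result}")
```

The point of the design is visible in `startup_cost`: the launching thread returns right away, so nothing in the UI path blocks on the engine process spawning.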
9:19 And then when you run a feature like
9:20 agent, it is doing things like
9:22 synthesizing input events to click on
9:24 things. um we're able to do that in a
9:28 very robust and secure way. So there's a
9:30 bunch of stuff uh like that that we've
9:32 done. We did a technical
9:33 blog post recently that covered
9:36 this in some more detail. But yeah,
9:37 we're pretty excited about that. Yeah, I
9:40 recommend if folks listening have not
9:42 read the technical blog post, if you're
9:44 at all technically minded, it's a super
9:46 interesting read. I'm sort of
9:48 curious given your experience and sort
9:51 of all the different browsers you've
9:52 worked on, to what extent does Atlas
9:56 feel like a
10:00 fully solved problem, feel like a
10:02 partially solved problem? Are there
10:04 pieces of it that you're really excited
10:05 to dig your teeth into next? Uh, yeah, I
10:08 think this is really the first step
10:10 in a in a long journey. Like when I talk
10:12 to people about this, you know, if I'm
10:14 talking about browser history, I would
10:16 say this is like the Netscape 1.0 moment
10:19 for this new era of like Agentic
10:21 browsers. We're excited to get
10:25 the agent feature out there. Uh but it's
10:28 also very much sort of a research
10:29 preview. We are discovering use cases
10:32 for it. Uh we're also like really
10:34 enthusiastic to hear how other people
10:37 are using it, and we expect to
10:39 make a lot of improvements both to
10:42 the experience and the
10:44 accuracy and speed. Most folks
10:46 haven't seen this type of functionality
10:48 in a browser before uh and so we think
10:51 it's really important to bring people
10:54 along with this
10:56 thing. You're very used to clicking on
10:57 things yourself. So having a tool that
10:59 can do that on your behalf um you know
11:01 is both exciting sometimes maybe a
11:03 little intimidating. So we want to like
11:05 be very clear about how the product
11:06 works. So the first time you use this we
11:09 give you sort of this nice disclosure
11:10 that tells you what its
11:12 capabilities are, and we give you some
11:13 options too. You can choose, for example,
11:16 if you want the agent to run with
11:18 your logged-in sites, just like it really
11:20 was browsing as you. You can make that
11:22 choice, or you could choose
11:24 to have it run logged out, and then you
11:26 have to be very explicit about what
11:27 sites you want it to log into, and you go
11:29 and log into those yourself. So
11:31 there are choices like that
11:32 that help people apply whatever
11:35 their comfort level is
11:37 to their use of the product.
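The choice described above, running the agent with your logged-in sessions versus logged out with only explicitly authorized sites, amounts to a small permission policy. This is a hypothetical sketch; `AgentBrowsingPolicy` and its fields are illustrative names, not Atlas's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class AgentBrowsingPolicy:
    """Hypothetical model of the two modes described in the interview:
    browse with the user's existing sessions, or run logged out with an
    explicit allowlist of sites the user has signed into for the agent."""
    use_logged_in_sessions: bool = False
    allowed_sites: set = field(default_factory=set)

    def may_use_session(self, site: str) -> bool:
        if self.use_logged_in_sessions:
            return True  # agent browses as you, everywhere
        return site in self.allowed_sites  # only sites you opted in

# Logged-out mode: the user explicitly signed the agent into one site.
policy = AgentBrowsingPolicy(allowed_sites={"github.com"})
allowed = policy.may_use_session("github.com")        # True
blocked = policy.may_use_session("mail.example.com")  # False
print(allowed, blocked)
```

The design choice worth noting is the default: logged-out with an empty allowlist, so session access is opt-in per site rather than opt-out.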
11:39 >> Yeah that makes a lot of sense. Is there
11:41 a use case that you heard from the wild
11:44 post launch that was most surprising or
11:47 interesting to you? Well, there's stuff
11:48 that I think is super cool. Like I
11:50 said, I'm an online shopping
11:52 fiend and I'm like, "Wow." I used
11:55 to have to go and have a different tab
11:56 where I'd go and search for those things
11:58 and then go and try them back one by
12:00 one. And so, the fact that you could
12:01 just kind of set this and forget it and
12:03 then it would try them all and then pick
12:04 the best one, that was pretty
12:06 cool. Um, yeah. And then, not a
12:09 personal use case but more a work use
12:11 case. You know, I obviously had been
12:14 diving through a lot of feedback
12:17 from our launch the other week. So
12:19 wanted to come up with like a quick
12:20 survey. And so I had this
12:23 discussion with ChatGPT about some
12:24 good questions. And then I asked the
12:26 agent to go off and make a Google form.
12:29 Uh so I could do the survey. And you
12:31 know I've used Google forms a bunch. It
12:32 tends to be a bit finicky, the different
12:34 types of question formats and other
12:36 stuff, but Agent just figured it all out
12:38 for me. And you know, a few minutes
12:39 later, I came back, I had this form, I
12:41 could just publish it right away. I can
12:42 think of a lot of government websites
12:46 that are very frustrating to use that
12:47 feel like they were built in the 1990s
12:49 that for me fall into that category. Um,
12:52 yeah. So then if we switch modes a
12:54 little bit about the security side of
12:56 things and how you think about security
12:58 with LLMs in the browser specifically
13:00 and there are divergent opinions ranging
13:03 from this is an impossible solve to this
13:06 is tractable to we can make progress in
13:08 this area. How do you think about it as
13:10 a problem space?
13:12 >> Yeah. Yeah. I think this is interesting
13:14 for agent, which is a net new
13:16 capability, and the technology is
13:18 evolving. We expect to do a lot of
13:20 work on it uh you know over the next
13:23 while. I talked before about
13:26 the onboarding and some of the
13:27 choices you have about what
13:30 sort of site access it has, if it's
13:33 authenticated or not. There's a few
13:35 other mitigations that we have in place
13:37 like if it's going to do something
13:38 sensitive on your behalf, like be in
13:40 your email, it's going to want
13:43 you to watch it. Now, the analogy I
13:45 give for this is I have a car that has
13:47 this sort of auto drive functionality.
13:49 I'll be there on the highway and I can
13:50 turn on the cruise control and
13:52 it will you know even take the wheel and
13:54 steer it a little bit for me. But
13:56 in return what it wants is that I
13:58 keep my eyes on the road, and it has
14:00 this little camera in the dashboard
14:02 somewhere that will shut it
14:04 off if I'm not paying attention. And so
14:07 similarly in Atlas, if you were
14:09 having the agent do work in one of these
14:11 sensitive contexts like your email, uh
14:13 it wants you to be on that tab paying
14:15 attention. If you switch away, it's
14:16 going to stop. And then another thing
14:18 that we have is, if you've ever been in
14:19 a machine shop and worked with a big
14:21 lathe or some other machine,
14:22 there's usually like a big red button
14:23 somewhere that if it starts doing
14:25 something you don't want to do, it's
14:26 very clear. You hit the red button and
14:27 it stops. And so the agent has that,
14:29 too. Uh and so if you see it doing
14:31 something that you don't want it to do,
14:32 then you just hit stop. Um these are
14:35 good tools to have and I think they will
14:36 help people get confidence that
14:39 you're the one that's still in
14:41 control of how it works. Yeah, I
14:43 think that makes a lot of sense. I think
14:45 that this is a moment where
14:46 it's new again, and we have the
14:49 opportunity to revisit a lot of these
14:50 sort of foundational primitives. I think
14:53 that sort of brings me to an interesting
14:55 question. It feels to me like there's an
14:57 opportunity for shifting the browser
15:00 experience further. But I'm curious: if
15:02 you turn on the high beams, to use the
15:04 car metaphor, what does that
15:05 look like for you?
15:07 >> One of the things I thought was special
15:08 about the early web was the fact that if you
15:10 go back to that time in the 1990s, you
15:12 think about how people got software, it
15:15 was to go to a store, you drive to a
15:17 store, you buy a box of a product with
15:19 some shrink wrap, and you take the discs
15:21 out and you install it. Uh, and then you
15:23 think about the web where you just click
15:24 on stuff and it comes up on your screen.
15:26 Like that was pretty magic. And then
15:28 that aspect of using the web where you
15:30 just go from site to site. It actually
15:32 what resonated with me was that it felt
15:35 kind of like how my mind worked. Now if
15:38 you fast forward to LLMs today, it is an
15:41 even more accessible: I can just talk to
15:43 this thing, and this thing will figure
15:45 out kind of what to do. That's maybe
15:47 like an idea of what the the future
15:49 looks like where instead of having to
15:50 dig through a bunch of settings menus,
15:52 you can just tell the system what you
15:54 want from it and it will just figure out
15:56 how to do it. Uh, and then if I sort of
15:58 extrapolate from there, I actually think
15:59 a lot of what people struggle with
16:01 in their
16:04 day-to-day lives is ambiguity. Like
16:06 there's this thing that I want, but I
16:08 don't quite know how to do it. When I
16:09 first started using ChatGPT, it was kind
16:11 of like a friend. It would say, "Oh, if
16:13 you want to do this, you should, you
16:15 know, think about taking one of these
16:16 three steps." For something that, you
16:17 know, is about betterment of yourself,
16:19 maybe, you know, you should go do those
16:20 things. But for a lot of things that are
16:22 more tactical, like it would be great if
16:24 your agent could just go off and do that
16:26 thing for you and report back on the
16:27 status of it. And so, I think, you know,
16:30 maybe there's some version of the
16:31 future, but even though I say that, I
16:33 also think that people will continue to
16:35 browse the web themselves because
16:37 there's stuff that as humans we want to
16:38 do. We want to be entertained. We want
16:40 to create things. Yeah, that that's a
16:42 really sort of rich area to dive into.
16:44 It felt like there were sort of two big
16:47 buckets that browser work falls into.
16:50 There's the delight bucket where you're
16:52 you're trying to learn, you're trying to
16:54 be curious, you're you want to be
16:55 surprised and then there's the oh gosh,
16:58 I don't want to do this and I would
16:59 prefer to avoid it and could someone
17:01 please take care of that part for me?
17:02 Mhm.
17:03 >> And as I was reflecting,
17:06 it's pretty easy to make the case from
17:08 an agent perspective that the not-fun
17:11 part is something that, ideally, in a perfect
17:13 world, you'd want the agent to just go
17:14 and take care of for you. But the fun
17:17 part to imagine is how could an agent
17:20 also enrich that delightful side of
17:23 things.
17:23 >> Yeah, totally. I definitely
17:25 think that trying to reduce
17:27 toil is something that we
17:29 want to support. Browsers
17:31 have been evolving for 25 to 30 years. What
17:34 does it look like to take the next step
17:37 in that process, and what is the
17:39 direction, the trajectory, that is
17:41 changing? And I find that super
17:43 fascinating because I think we are at an
17:45 inflection moment. Um, you know, when I
17:48 think about the early web, you know, a
17:50 lot of the focus back then was just, you
17:52 know, helping people understand those
17:54 those links uh that are out there and
17:56 clicking on links and going from place
17:57 to place. We could go and find like
17:59 collections of links that people had an
18:00 opinion about if they were good or not.
18:02 As the web scaled that stopped working
18:04 and then you got search and then search
18:05 was transformative because it helped you
18:07 find like the little piece of
18:08 information or the website that you
18:09 wanted to go to. People started building
18:11 these rich apps and then as you point
18:13 out yeah inflection point now we're in
18:15 this new stage. We've made the way
18:18 which we can interact with this
18:19 technology just radically more human and
18:22 we can scale up the capabilities of the
18:24 platform and the platform will be able
18:26 to do things on your behalf. I think
18:27 this is really the third phase uh of the
18:30 web.
18:31 >> Yeah, it's super exciting.
18:32 >> There's some UX that we're
18:33 exploring around it in a few different
18:35 ways, but there are some compelling use
18:37 cases. I ran into this
18:38 before launch when I was trying to
18:41 synthesize a single
18:43 document from a bunch of different
18:44 sources that were opened in tabs, and
18:46 it wouldn't work. I don't think it's even a fixed
18:49 number of tabs that I wanted. It's like I wanted
18:51 n tabs, basically, subject to
18:53 context window size. But yeah,
18:57 >> we talked about ways of working and how
18:58 the team is using Codex a little bit.
19:01 One of the things I've been hearing
19:02 really consistently from small
19:05 companies, large companies, individuals
19:06 is that it feels like roles are blurring
19:09 as we lean into LLMs more and more and
19:12 more. And I'm curious if you look at the
19:15 roles on your team and how they're
19:17 evolving as you guys work, you know.
19:20 Yeah. So
19:24 every engineer on our team is a product
19:27 engineer basically. This is someone that
19:28 like really thinks about the breadth of
19:31 the experience. Every engineer on
19:33 the team is empowered to own
19:35 the design and development of a feature.
19:37 And so all of our engineers are talking
19:39 to users, reading feedback, and
19:42 figuring out how to integrate it into a
19:43 future update. Um we have a bunch of
19:47 tools. I'm not super familiar with all
19:47 of the tools that we use, but I know
19:49 that we use a bunch of, you know, LLMs
19:52 basically to help us sift through
19:53 feedback, get common themes, you know,
19:55 that sort of stuff. So, that will help
19:57 you know, understand what the top
19:59 pain points are that people are
20:02 talking about at scale, you know,
20:04 through our user support forums and
20:05 stuff like that.
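Ben doesn't name the tooling, but the workflow he describes, an LLM sifting raw user feedback into common themes and top pain points, typically starts with a batching prompt like this sketch. The `build_theme_prompt` helper is hypothetical, and the commented-out `call_llm` is a placeholder for whatever chat-completion client you use.

```python
def build_theme_prompt(feedback_items, top_n=5):
    """Pack raw feedback into one prompt asking an LLM for ranked themes."""
    numbered = "\n".join(f"{i + 1}. {t}" for i, t in enumerate(feedback_items))
    return (
        f"Group the user feedback below into at most {top_n} common themes, "
        "ranked by how often they appear, with a one-line summary each.\n\n"
        f"{numbered}"
    )

# Illustrative feedback items, not real Atlas reports.
feedback = [
    "Agent is slow on long pages",
    "Love the sidebar, but it covers the page on small screens",
    "Agent timed out filling a government form",
]
prompt = build_theme_prompt(feedback, top_n=3)
print(prompt.splitlines()[0])
# themes = call_llm(prompt)  # placeholder: send to your LLM of choice
```

At real support-forum scale you would chunk the feedback and merge per-chunk themes in a second pass, since all of it won't fit one context window.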
20:07 >> Yeah. No, it makes sense. If we pivot a
20:10 little bit back to the
20:11 future: "I would like to see this
20:13 breakthrough happen," or "I would like to
20:14 see this technical challenge solved,
20:17 and then we will unlock this new
20:18 experience." What pops out to you as
20:20 significant milestones in the next, call
20:22 it, 18 months to two years, where you're
20:25 really excited to see something unlock
20:27 as we get to a particular capability?
20:32 I think people will get more used to,
20:34 more accustomed to, this
20:36 functionality, and so they will seek to
20:39 do more with it. That is from the
20:41 customer side and then on the product
20:43 side we will make that sort of magic
20:46 more real and sort of help you
20:48 figure out the opportunities to
20:50 leverage that magic. Um the capabilities
20:54 will continue to increase at a
20:56 breathtaking pace. And so where I want to
21:01 end up is a place where
21:04 you can really give this tool fairly
21:08 ambiguous, complex tasks and it will
21:11 break it down and figure out how to make
21:13 progress on your behalf. I used the word
21:14 toil before. You know, how do we
21:17 get rid of some of that annoyance
21:19 and make it just sort of more
21:22 reliable, simple, trustworthy? You
21:25 know, these are things
21:26 that we want to do.
21:29 >> One of the things we haven't talked
21:31 about that I'd be curious for your take
21:33 on: so much of our browsing happens on
21:35 the phone. Are you guys thinking in
21:37 terms of mobile? Yeah, that's another
21:39 request that's coming in a lot. Uh we're
21:41 trying to figure out the best way uh to
21:43 bring this functionality to mobile. I
21:46 think one of the observations that we
21:48 have is that the way people interact
21:51 with the web is a bit different on
21:53 mobile. Um, so on the desktop platform,
21:55 you know, the browser is like an
21:57 embedded operating system. You know, all
21:59 of your favorite apps for the most part
22:00 are in the browser. Whereas on mobile,
22:02 the mobile operating system itself is
22:04 kind of like the browser. And so people
22:06 tend to have, you know, relationships
22:08 directly with specific apps. Uh, but
22:11 then, you know, for the web, there's a
22:13 couple of different use cases. One is I
22:14 want to go and read a specific website
22:16 uh that I don't have an app for. So then
22:18 the browser form factor makes a lot
22:20 of sense. So I think we're at
22:22 a stage where we're figuring out like
22:23 how we want to make browsing work uh on
22:26 on the mobile device given these
22:27 different use cases. The way mobile
22:31 works
22:33 implies dramatically different usage
22:36 patterns. What does it look like from a
22:39 small-screen perspective to have the
22:41 chat assistant there and the browsing
22:44 experience? These are super interesting
22:45 challenges to get into.
22:46 >> Yeah, there's also a bunch of
22:47 interesting stuff. I think voice is a
22:49 very compelling modality, where if you
22:52 load a page you can ask follow-up
22:54 questions and have all of that work. Uh
22:55 I think we just need to figure out the
22:57 right way to build that.
22:59 >> And you know, I would be remiss if I
23:02 didn't dive into one of the more
23:06 interesting features that you've launched
23:08 with Atlas, and that is that you
23:10 guys bring in the ChatGPT memory from
23:13 previous ChatGPT conversations, and that
23:16 is part of the browse experience. Uh,
23:17 I'd be curious about the product
23:19 decisioning there, and then how you
23:22 think about that as an asset to the
23:24 browsing experience and what that looks
23:26 like. Well, the chat memory feature of
23:29 ChatGPT is an incredibly powerful
23:31 one, and it means that as you
23:33 move from chat to chat you don't
23:34 always have to start from zero. In
23:36 some sense it makes it feel like it
23:37 knows you a bit better in that way.
23:40 And that's a really interesting way in
23:43 which the browser becomes more useful
23:45 the more you use it.
23:47 One thing I always love to do as we sort
23:49 of bring this conversation to a little
23:51 bit of a close: if you could say one
23:53 thing that you feel like, ah, people
23:54 didn't quite get that piece of the
23:56 launch, I'd love to try and say it again
23:57 and emphasize it. What would that be for
24:00 you for Atlas?
24:02 >> So for Atlas, I think that it
24:04 is a familiar tool but with this amazing
24:07 new set of capabilities.
24:09 >> Uh and so I encourage you to go and try
24:11 it out for a whole bunch of different
24:14 things. I would say even challenge
24:15 yourself to ask more questions of it,
24:18 things that, even if you were thinking, oh,
24:21 I don't need to ask that, just try it out
24:22 and see what it does. I would say this is
24:24 the beginning of a journey for us to
24:26 build this type of app. We push a new
24:29 build every week, and so as we hear more
24:31 feedback from you as you try it out and
24:33 do a lot of different interesting
24:35 things, we will make it better and
24:37 better.
24:38 >> Well, thank you, Ben. Thank you for coming
24:40 and chatting.
24:41 >> Yeah, it's been great. Awesome, I
24:42 appreciate it.