Mozilla recently rolled out a new on-device AI feature in Firefox, but it's already drawing complaints from users reporting high CPU usage and faster battery drain.
I don’t think the centralised approach works either. If you bake grouping metadata for individual popular pages into Firefox, you have a problem keeping it current as page content changes. You also face a difficult trade-off between covering enough pages and not blowing up the download size. And the approach can’t work at all for deep-web pages, i.e. anything people can only see when logged in.
Ignoring all that: the groupings you could pre-process would be static, determined by some assumed average user behaviour, not an actual clustering of a specific user’s themes. Take a hardcore Warhammer 40k fan with tabs on minis, painting techniques, rulebooks and fan media: apply the static grouping and it all lands in one “Warhammer” bucket. Run it locally and it might instead come up with “Painting”, “Figures”, “Rules”, “Fanart” or the like. Local clustering would be more fine-grained for someone deep into a specific niche interest, and coarser otherwise.
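To make that concrete, here’s a toy sketch of what local, dynamic grouping buys you. This is purely illustrative, not how Firefox actually does it: a real implementation would use embeddings from an on-device model, whereas this just clusters tab titles greedily by token overlap (Jaccard similarity). The titles and the threshold are made up.

```python
# Toy local tab grouping: greedily cluster tab titles by token overlap.
# Hypothetical example; a real browser would use on-device embeddings,
# not word sets, and the 0.1 threshold is arbitrary.

def tokens(title):
    # Crude tokeniser: lowercase words longer than 3 chars, punctuation stripped.
    return {w.lower().strip(".,:") for w in title.split() if len(w) > 3}

def cluster_tabs(titles, threshold=0.1):
    clusters = []  # each cluster: [accumulated token set, list of titles]
    for title in titles:
        t = tokens(title)
        best, best_sim = None, threshold
        for c in clusters:
            union = c[0] | t
            sim = len(c[0] & t) / len(union) if union else 0.0
            if sim > best_sim:
                best, best_sim = c, sim
        if best:
            best[0] |= t          # merge into the most similar cluster
            best[1].append(title)
        else:
            clusters.append([set(t), [title]])
    return [c[1] for c in clusters]

tabs = [
    "Warhammer 40k mini painting techniques",
    "Best brushes for painting miniatures",
    "Warhammer 40k 10th edition rulebook FAQ",
    "Rulebook errata discussion",
    "Fan art gallery",
]
for group in cluster_tabs(tabs):
    print(group)
```

Even this crude version splits the niche user’s tabs into a painting group, a rules group and a leftover, rather than one giant “Warhammer” bucket, because the clusters emerge from the user’s actual open tabs instead of a precomputed average.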
So I think it’s fundamentally correct to cluster locally and dynamically if you want a usable result. They just need to make it opt-in and efficient enough. Or better yet, abandon the idea, because it’s ultimately not that useful compared to the required inference cost.
The problem with useful suggestions like these is that they can’t be acted on when the MO is to shove AI into anything and everything to seem relevant, and to chase the pot of cost savings at the end of the rainbow, which is totally going to turn up any day now, we think, we’re pretty sure anyway.