

The difference is that we’ll just be running small, specialized, on-demand models instead of huge, resource-heavy, all-purpose models. It’s already being done. Just look at how Google and Apple are approaching AI on mobile devices. You don’t need a lot of power for that, just plenty of storage.
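The "small, specialized, on-demand" pattern described above can be sketched as a lazy registry that loads a task-specific model only when it is first needed and can evict it afterward, trading resident memory for storage. This is a minimal illustration, not any vendor's actual API; every name here (`ModelRegistry`, the toy loaders) is hypothetical:

```python
from typing import Callable, Dict, Any

class ModelRegistry:
    """Lazily loads small task-specific models on demand,
    instead of keeping one huge all-purpose model resident."""

    def __init__(self) -> None:
        self._loaders: Dict[str, Callable[[], Any]] = {}
        self._loaded: Dict[str, Any] = {}

    def register(self, task: str, loader: Callable[[], Any]) -> None:
        # Register a loader; nothing is read from storage yet.
        self._loaders[task] = loader

    def get(self, task: str) -> Any:
        # Load from storage only on first use
        # (storage is cheap on-device; RAM is not).
        if task not in self._loaded:
            self._loaded[task] = self._loaders[task]()
        return self._loaded[task]

    def evict(self, task: str) -> None:
        # Free memory once the task is done.
        self._loaded.pop(task, None)

# Toy stand-ins for models: in practice these would be small
# quantized networks for dictation, translation, summarization, etc.
registry = ModelRegistry()
registry.register("uppercase", lambda: str.upper)
registry.register("reverse", lambda: (lambda s: s[::-1]))

print(registry.get("uppercase")("hello"))  # HELLO
print(registry.get("reverse")("hello"))    # olleh
registry.evict("uppercase")
```

The key property is that memory cost scales with the tasks currently in use, not with the total number of capabilities installed on the device.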
The guy stomps on innocent, unsuspecting little turtles and occasionally burns them alive. Of course it should be flagged.