

It’s rather telling though, isn’t it, that Nintendo abandoned the technology as soon as they moved on to their next console. If it had been popular they would have included it on the Switch.
Every time an AI does anything newsworthy, it’s just because it’s obeying its prompt.
It’s like the people who claim the AI can replicate itself: yeah, if you tell it to. If you don’t give an AI any instructions it’ll sit there and do nothing.
That is the problem with AI: if I have to check that the output is valid, then what’s the damn point?
There is a piece of software that will take a Word document and convert it into an embossed 3D print file. So you could always just skip the middleman and 3D print yourself a plaque version of your document instead.
They would have to become sci-fi level capable before they would be considered household staple items.
I didn’t even know he had a brother.
The Figure robots don’t really have visible cables. I think the bigger takeaway is that the movement is too fluid; every robot I’ve ever seen moves like a robot, mostly because of how heavy they are. That thing moves like a human gymnast, probably because it is a human gymnast being motion-tracked.
Yeah Spain’s doing the same thing though so I don’t think it’s got anything to do with the EU.
They had this at CES. But the online footage doesn’t really capture it very well so I have no idea how good the product is.
If it’s pulled off well I suppose it could be useful, but it just seems gimmicky to me. Remember 3D TVs? Remember how there was absolutely no content for them and no one bought them? Yeah.
Have people memory holed this?
I have no idea what you mean. I’ve never heard of a New 3DS; when did it come out?
Firstly, the 3DS couldn’t pull it off. Secondly, this is a completely different technology: basically it’s just tracking your eyes and parallax-mapping the screen effects accordingly.
It’s still a 2D display in reality.
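For anyone wondering what “track your eyes and parallax-map the effects” actually means, here’s a minimal, hypothetical sketch (not Lenovo’s actual code, and the tracker interface is made up): a tracker reports where your eye sits relative to the centre of the panel, and each scene layer gets redrawn with an offset proportional to its pretend depth behind the screen.

```python
# Hypothetical eye-tracked parallax sketch. Assumes a tracker that reports
# the viewer's eye position in millimetres relative to the screen centre.
# Purely illustrative, not any vendor's real implementation.

def parallax_offset(eye_x_mm, eye_y_mm, layer_depth_mm, viewing_distance_mm=500.0):
    """Screen-space shift for a layer drawn layer_depth_mm 'behind' the panel.

    As the eye moves, a layer behind the screen is redrawn shifted in the
    same direction, scaled by its depth, so it appears to stay put in the
    world. That is the whole window illusion; the panel itself is still 2D.
    """
    scale = layer_depth_mm / (viewing_distance_mm + layer_depth_mm)
    return (eye_x_mm * scale, eye_y_mm * scale)

# Example: viewer leans 40 mm to the right; a layer 200 mm behind the
# screen plane gets shifted right by roughly 11 mm in screen space.
print(parallax_offset(40.0, 0.0, 200.0))
```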
They’re trying to imply that they would get kidnapped and forced to work for OpenAI or something. In reality they’d probably just get job offers, which they’d accept because, as crap as the US is right now, it’s got to be better than China.
It is simply that China doesn’t want their citizens working for American companies unless China is the one that’s placed them there.
I can’t remember where I heard this quote, but somebody once said that the television wasn’t invented because somebody wanted a television; it was invented because thousands of scientists over decades worked on ancillary technologies that culminated in the television.
The most recent major invention in jet engines came from the 3D printer industry. I don’t think anybody who was working on 3D printers was really necessarily thinking that this could be applied to jet engines, it’s just how it turned out.
Most of the mathematics behind modern computer graphics was worked out in the 18th century. The equations were just an interesting mathematical oddity for centuries until the technology caught up with the theory. Obviously the mathematicians weren’t trying to optimise rendering pipelines for a technology that wouldn’t exist for over 200 years.
Technological innovation is like evolution: there isn’t necessarily a target objective, things just evolve over time. You can’t just throw a lot of people at a problem and expect a solution. Academics understand that, which is why they do research. Research isn’t necessarily in aid of anything in particular; it just adds more knowledge to humanity’s toolbox, and who knows how that knowledge might subsequently be applied.
Currently the world is getting a lot of that.
You have to pay a lot of money for a rig capable of hosting an LLM locally. Having said that, the wait time for these rigs is something like four to five months for delivery, so clearly there is a market.
As far as OpenAI is concerned, I think what they’re doing is letting people run the model locally without actually releasing how it was built. So you can still fine-tune the model with your own data, but you can’t see the underlying training data.
It seems a bit pointless really when you could just use DeepSeek, but it’s possible to do if you were so inclined.
They kind of have to now though. They’ve been forced into it by DeepSeek: if they didn’t release their models, no one would use them, not when an open-source equivalent is available.
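To be concrete about what “run it locally and fine-tune it on your own data” tends to look like with an open-weight model, here’s a rough sketch using the usual Hugging Face plus LoRA route. The model name is a placeholder and this isn’t any vendor’s official tooling, just an assumption about the typical workflow.

```python
# Rough sketch: load an open-weight model locally and attach LoRA adapters
# for fine-tuning on your own data. Model name is a placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "deepseek-ai/deepseek-llm-7b-base"  # placeholder open-weight model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# LoRA adds small trainable adapter matrices while the base weights stay
# frozen, which is why this can fit on one beefy local GPU.
lora = LoraConfig(r=8, lora_alpha=16, target_modules=["q_proj", "v_proj"],
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)
model.print_trainable_parameters()

# Training loop omitted. The point is you get the weights to tune,
# not the training data they were produced from.
```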
I have no idea how it works; I’m not sure Lenovo have released that information.
It was at CES, but all the footage of it is mostly people going “it’s really interesting but it doesn’t show up on camera”, so it’s anyone’s guess how good this will be.
I’m inclined to feel like it’s a gimmick though. Does anyone want 3D displays?