I didn’t even realize these were being manufactured by SparkFun now. I’ve bought all my Teensy boards straight from the original designer: https://www.pjrc.com/store/ (It’s been several years since I’ve bought any parts.)
xthexder@l.sw0.com to Technology@lemmy.world • UK Expands Online Safety Act to Mandate Preemptive Scanning of Digital Communications (English) • 2 points · 4 days ago
I’d also argue a human monitoring your conversation would likely make similar mistakes in judgement about what’s happening, and this kind of invasion of privacy just isn’t okay in any form. There could be whole extra conversations happening that they can’t see (like speaking IRL before sending a consensual picture).
xthexder@l.sw0.com to PC Gaming@lemmy.ca • Larian’s head writer has a simple answer for how AI-generated text helps development: ‘It doesn’t,’ thanks to its best output being ‘a 3/10 at best’ worse than his worst drafts (English) • 2 points · 6 days ago
I’ve seen some horrendous systems where you can tell a bunch of totally separate visions were frankenstein’d together.
My experience has been that using AI only accelerates this process, because the AI has no concept of what good architecture is or how to reduce entropy. Unless you can one-shot the entire architecture, it’s going to immediately go off the rails. And if the architecture was that simple to begin with, there really wasn’t much value in the AI in the first place.
I don’t think File Explorer on Windows uses fork() to copy files? If it does, that’s insane. I don’t think git calls fork() per file either, does it?
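For what it’s worth, a plain file copy doesn’t need to spawn any process at all; it’s just a read/write loop (or a kernel-side fast path like sendfile) inside the one process doing the copy. A minimal Python sketch of the idea, with hypothetical filenames:

```python
def copy_file(src: str, dst: str, bufsize: int = 1024 * 1024) -> None:
    # Copy by streaming chunks between two open files; no child
    # process (fork/CreateProcess) is involved at any point.
    # shutil.copyfile does essentially this, using faster kernel
    # paths like os.sendfile where available.
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        while chunk := fin.read(bufsize):
            fout.write(chunk)

copy_file("a.bin", "b.bin")  # hypothetical filenames
```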
xthexder@l.sw0.com to PC Gaming@lemmy.ca • Larian’s head writer has a simple answer for how AI-generated text helps development: ‘It doesn’t,’ thanks to its best output being ‘a 3/10 at best’ worse than his worst drafts (English) • 5 points · 7 days ago
This sounds like it takes away a huge amount of creative freedom from the writers if the AI is specifying the framework. It’d be like letting the AI write the plot, but then having real writers fill in details along the way, which sounds like a good way to have the story go nowhere interesting.
I’m not a writer, but if I were to apply this strategy to programming, which I am familiar with, it’d be like letting the AI decide what all the features are, and then I’d have to go and build them. Considering more than half my job is stuff other than actually writing code, this seems overly reductive, and it underestimates how much human experience matters in deciding a framework and direction.
I fully blame this on NTFS being terrible with metadata and small files. I’m sure everyone’s tried copying/moving/deleting a big folder with thousands of small files before and watched the transfer rate drop to nearly 0…
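This is easy to reproduce with a quick timing sketch in Python (rough illustration, not a rigorous benchmark): the per-file overhead, not the number of bytes, is what dominates.

```python
import os
import tempfile
import time

def timed(label, fn):
    start = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - start:.2f}s")

# Same total data volume both times (~10 MiB), but one run pays the
# per-file metadata cost 10,000 times.
with tempfile.TemporaryDirectory() as d:
    def many_small():
        for i in range(10_000):
            with open(os.path.join(d, f"{i}.bin"), "wb") as f:
                f.write(b"\0" * 1024)
    timed("10,000 x 1 KiB files", many_small)

with tempfile.TemporaryDirectory() as d:
    def one_big():
        with open(os.path.join(d, "big.bin"), "wb") as f:
            f.write(b"\0" * 10_000 * 1024)
    timed("1 x ~10 MiB file", one_big)
```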
On the bright side, you’re getting paid to wait around (/s because I know the feeling, and it’s just slow enough that you can’t step away and do something else).
xthexder@l.sw0.com to PC Gaming@lemmy.ca • Larian’s head writer has a simple answer for how AI-generated text helps development: ‘It doesn’t,’ thanks to its best output being ‘a 3/10 at best’ worse than his worst drafts (English) • 12 points · 7 days ago
What improvements have there been in the last 6 months? From what I’ve seen, the AI is still spewing the same 3/10 slop it has since 2021, with maybe one or two improvements bringing it up from 2/10. I’ve heard several people say some newer/bigger models actually got worse at certain tasks, and the supply of clean training data has pretty much dried up, so there’s not much left to train better models on.
I just don’t see any world where scaling up the compute and power usage suddenly improves the quality by orders of magnitude. By design, LLMs output the most statistically likely response, which is, almost by definition, the most average, bland response possible.
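A toy sketch of that last point (made-up logits, not a real model): greedy decoding literally picks the single most probable next token at every step, and sampling only strays as far as the probability weights allow.

```python
import math
import random

def softmax(logits):
    # Convert raw scores into a probability distribution.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Made-up scores for four candidate next tokens.
vocab = ["the", "a", "purple", "quantum"]
probs = softmax([3.2, 2.9, 0.4, 0.1])

greedy = vocab[probs.index(max(probs))]            # always the top token
sampled = random.choices(vocab, weights=probs)[0]  # rarely strays far
print(greedy, sampled)
```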
xthexder@l.sw0.com to PC Gaming@lemmy.ca • Larian’s head writer has a simple answer for how AI-generated text helps development: ‘It doesn’t,’ thanks to its best output being ‘a 3/10 at best’ worse than his worst drafts (English) • 11 points · 7 days ago
This is based on the assumption that the AI output is any good, but the actual game devs and writers are saying otherwise.
If the game is too big for writers to finish on their own, they’re not going to have time to read and fix everything wrong with the AI output either. This is how you get an empty, soulless game, not Baldur’s Gate 3.
Legitimately it is a winning strategy: https://www.history.com/articles/us-invasion-of-panama-noriega
xthexder@l.sw0.com to Technology@lemmy.world • X pulls Grok images after UK ban threat over undress tool (English) • 8 points · 8 days ago
I don’t think it really matters how old the target is. Generating nude images of real people without their consent is fucked up no matter how old anyone involved is.
xthexder@l.sw0.com to Technology@lemmy.world • The Guardian view on granting legal rights to AI: humans should not give house-room to an ill-advised debate | Editorial (English) • 9 points · 9 days ago
“A computer can never be held accountable, therefore a computer must never make a management decision.”
– IBM Training Manual, 1979
We’re going so backwards…
xthexder@l.sw0.com to Technology@lemmy.world • Dell says the quiet part out loud: Consumers don't actually care about AI PCs — "AI probably confuses them more than it helps them" (English) • 2 points · 10 days ago
The diminishing returns are kind of insane if you compare the performance and hardware requirements of a 7B and a 100B model. In some cases the smaller model can even perform better, because it’s more focused and won’t be as subtle about its hallucinations.
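Some back-of-the-envelope numbers for the hardware gap (weights only, assuming fp16 at 2 bytes per parameter, ignoring KV cache and activations):

```python
BYTES_PER_PARAM = 2  # fp16; roughly half this for 8-bit quantization

for params in (7e9, 100e9):
    gib = params * BYTES_PER_PARAM / 2**30
    print(f"{params / 1e9:.0f}B params -> ~{gib:.0f} GiB of weights")

# 7B   -> ~13 GiB  (fits on a single consumer GPU)
# 100B -> ~186 GiB (multiple datacenter GPUs just to hold the weights)
```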
Something is going to have to fundamentally change before we see any big improvements, because I don’t see scaling it up further ever producing AGI, or even solving the hallucinations and logic errors it makes.

In some ways it’s a bit like the crypto/blockchain speculators saying it’s going to change the world, when in reality the vast majority of proposed applications would have been better implemented with a simple centralized database.
xthexder@l.sw0.com to Technology@lemmy.world • “Microslop” trends in backlash to Microsoft’s AI obsession (English) • 9 points · 11 days ago
It must be hard to admit he spent billions on a slop machine. Sunk cost fallacy is probably one of many things they’re fighting.
xthexder@l.sw0.com to Technology@lemmy.world • Grok AI still being used to digitally undress women and children despite suspension pledge (English) • 9 points · 12 days ago
AI Company: We added guardrails!
The guardrails:

xthexder@l.sw0.com to Technology@lemmy.world • Microsoft Office has been renamed to “Microsoft 365 Copilot app” (English) • 1 point · 12 days ago
Well, on the bright side, maybe in a few years when people search for “office software” they’ll be directed to LibreOffice instead of Microsoft.
xthexder@l.sw0.com to Technology@lemmy.world • How the AI ‘bubble’ compares to history (English) • 1 point · 17 days ago
It’s not actually the transistors that break down in flash memory. Flash memory works by storing charges in what is effectively a grid of capacitors, and for the data to remain stored, the insulating oxide layers in the cells need to be preserved. Every time a cell gets written, a charge is forced through the insulation at high voltage, and this degrades the insulation. A single flash cell might only survive a few thousand writes before the insulation goes bad and it no longer holds data. Modern SSDs use wear levelling techniques to make the drive as a whole last longer.
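The core idea of wear levelling is simple enough to sketch in a few lines of Python (a toy model, not how any real SSD firmware works): route each write to the least-worn block, so no single cell hits its endurance limit long before the rest.

```python
NUM_BLOCKS = 8
erase_counts = [0] * NUM_BLOCKS  # program/erase cycles per block

def pick_block() -> int:
    # Direct the next write to the block with the fewest erase cycles.
    return min(range(NUM_BLOCKS), key=lambda b: erase_counts[b])

for _ in range(80):
    erase_counts[pick_block()] += 1

print(erase_counts)  # wear spreads evenly: [10, 10, 10, 10, 10, 10, 10, 10]
```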
Transistors, on the other hand, don’t have any inherent degradation that I’m aware of, other than external factors like corrosion. The first thing likely to die on a GPU is the electrolytic capacitors in the power filtering circuitry, whose electrolyte dries out over many years.
xthexder@l.sw0.com to Technology@lemmy.world • A San Francisco power outage left Waymo’s self-driving cars stranded at intersections (English) • 1 point · 27 days ago
Honestly, I’m happy they picked this as a default “car doesn’t know what to do” scenario. From what I’ve seen, Tesla’s default is to just ignore the unknown thing; I wouldn’t be surprised if Robotaxis had just treated all the blank lights as green.
It seems like SketchUp uses OpenGL, which should be supported just fine by a Linux GPU driver. I haven’t tried it myself, but you could maybe try running it through Proton (idk if there’s a way outside of Steam?).
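If you do want to try it outside of Steam’s UI, Proton can reportedly be invoked directly by pointing it at a compatibility prefix. A sketch from memory; the Proton install path, prefix location, and installer filename here are all assumptions to adjust for your system:

```python
import os
import subprocess

# Assumed locations; adjust to wherever Steam and Proton actually live.
proton = os.path.expanduser(
    "~/.steam/steam/steamapps/common/Proton - Experimental/proton")

env = dict(
    os.environ,
    # Proton expects these two variables when run outside the Steam client:
    # a writable prefix directory and the Steam install location.
    STEAM_COMPAT_DATA_PATH=os.path.expanduser("~/sketchup-prefix"),
    STEAM_COMPAT_CLIENT_INSTALL_PATH=os.path.expanduser("~/.steam/steam"),
)

# Hypothetical installer name; "proton run <exe>" launches a Windows binary.
subprocess.run([proton, "run", "SketchUpInstaller.exe"], env=env, check=True)
```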