

Have you tried…
you know,
(cough)
maybe just …
(air quotes) “heavily implying”
the, ah, you know …
particular…
(taps side of nose)
ah, torrent
in
(eyebrow waggle)
ahem,
QUESTION?
Eh?
Ehhhhh?
I’m a technical kinda guy, doing technical kinda stuff.
What do you expect from running 10 or more amperes through a cord?
Well, I expect enough engineering behind it that the cord and connections don’t melt. I’m an auto electrician; I routinely deal with 12 V systems that draw far more than that without melting, using connections that aren’t much bigger. It’s not some mystical technology, it’s just that this setup has been done on a budget.
But it doesn’t help that every single logic gate in a graphics card is run at speeds and currents that are just below the point of meltdown.
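For a sense of scale, here’s a back-of-envelope sketch of how much heat a marginal contact dissipates at 10 A. The resistance figures are illustrative assumptions, not specs for any particular connector:

```python
# Joule heating in a single connector contact: P = I^2 * R.
# Contact resistances below are made-up but plausible values.
def contact_heat_w(current_a: float, resistance_ohm: float) -> float:
    """Power (watts) dissipated in the contact itself."""
    return current_a ** 2 * resistance_ohm

for milliohm in (5, 20, 50):
    p = contact_heat_w(10.0, milliohm / 1000)
    print(f"10 A through {milliohm} mΩ -> {p:.1f} W of heat in the contact")
```

A few watts concentrated in a contact the size of a match head is plenty to melt plastic housings, which is why contact resistance matters far more than wire gauge here.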
Inertia, mostly.
Of course Plex then takes advantage of that with the slow erosion of the free edition.
It’s difficult on the back end of the charger as well.
A shopping centre or rest stop can’t just spring for a few high-capacity chargers for the car park. A single megawatt charger is roughly 50 houses’ worth of peak consumption, so they now need a substation upgrade to supply what is basically a whole neighbourhood’s worth of power.
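The arithmetic behind that, using an assumed 20 kW peak draw per house (a made-up but plausible figure; real numbers vary by country and supply standard):

```python
# Sanity check on the "50 houses" claim.
charger_w = 1_000_000     # one megawatt charger
house_peak_w = 20_000     # assumed per-house peak draw (illustrative)
houses = charger_w // house_peak_w
print(houses)             # number of house-equivalents per charger
```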
Well, I did delete a company-mandated image from the bottom of my signature after I realised that it made even a one-line “Thanks” email balloon out to 800 KB.
People don’t just leave leaking apps out there for consumption.
Ha! Welcome to corporate, where vendors sell you software, insist the hardware has to have 128 GB of RAM, and when you poke around a bit you discover a single JVM with constantly growing memory usage, plus a script that restarts it every time it runs out of resources.
AND a log file that describes - in typical Java excruciating detail - the precise lines in each module where the devs allocated resources but didn’t free them. About 40 times a second.
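That “fix” can be modelled as a toy simulation. Everything here is made up for illustration (the cap, the leak rate, the request count), but it captures the anti-pattern: leak, hit the cap, bounce the process, repeat:

```python
# Toy model of the vendor's "solution": let the service leak memory,
# then restart it whenever it exhausts its cap. All numbers are invented.
def run_leaky_service(cap_mb: int, leak_mb_per_req: int, requests: int) -> int:
    """Return how many restarts the watchdog performs over `requests` requests."""
    restarts = 0
    used_mb = 0
    for _ in range(requests):
        used_mb += leak_mb_per_req      # each request leaks and never frees
        if used_mb >= cap_mb:           # watchdog: out of resources?
            restarts += 1               # ...just bounce the JVM
            used_mb = 0
    return restarts

print(run_leaky_service(cap_mb=128_000, leak_mb_per_req=100, requests=10_000))
```

The restarts scale linearly with traffic, which is why these setups look fine in the demo and fall over in production.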
It’s only one wire in the cable that’s affected, and the fault isn’t in the wire itself; it looks like the pin, or possibly the crimp point on the female pin.
So a few possibilities:
Bad pins. Female pins (sockets) have internal wipers that grip the male pin and there is also the crimp connection. Bad QA on those leads to hotspots in the pin under high current draw. I’d probably go for this explanation, looking at the photos.
Bad electrical layout on the card that means the bulk of the current goes through this pin. Milliohms of difference in the board traces are enough to cause imbalances. This might be balanced out by a small-but-still-larger resistance in the (standard) cable, which leads to:
It looks like thicker cabling is soldered and heat-shrunk to smaller cabling that actually goes into the pins in the connector. There’s a reason industrial cable connections aren’t soldered. Possibly a solder joint on another cable has broken, hidden inside the heatshrink, leaving more current to pass through this one.
Following on from this, it’s also quite possible that the thicker cable, with less resistance, now has less voltage drop across it and simply allows more current than designed through a connection already at its limit.
It’s quite possible that there are different pins/connector sets for different current draws. This cable might be using the wrong connector with the same physical size but lower current rating. The fact that the cable has been soldered to skinnier wires in the actual connector suggests this, but it’s quite possible that the connector is the right one.
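A quick current-divider sketch shows how little imbalance it takes. The resistances and total current below are illustrative assumptions, and lumping the connector into two parallel paths (the suspect pin vs. everything else) is a deliberate simplification:

```python
# Current sharing between two parallel resistive paths (current divider):
# the lower-resistance path carries proportionally more of the total.
def split_two_pins(total_a: float, r1_ohm: float, r2_ohm: float):
    """Return (current through path 1, current through path 2)."""
    i1 = total_a * r2_ohm / (r1_ohm + r2_ohm)
    return i1, total_a - i1

# Assumed numbers: 25 A shared between a 5 mΩ pin and a 15 mΩ path.
low_r, high_r = split_two_pins(25.0, 0.005, 0.015)
print(f"low-resistance path carries {low_r:.2f} A, the other {high_r:.2f} A")
```

With only a 10 mΩ difference, one path ends up carrying three times the current of the other, which is exactly how a pin rated for its fair share ends up cooking.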
“Oh, it’s got an embedded TIFF of the actual content. That explains it.”
Yes, I am quite old now.
If you occasionally boot into Windows, it’s known to leave NICs in an unusable state if you just hibernate or quick power off. You need to boot back into Windows and do a “proper” shutdown for it to come good.
Consider yourself corrected then. I’ve skimmed your comment history. Your go-to insult is “bootlicker” or alternatively, a simple clown emoji. In your comments you seem to provide very little context as to why you think that, it’s just, “I deem you to be a BOOTLICKER! Next!”
So maybe a little guidance for you:
The very, very, first thing you do when dealing with perceived propaganda - be it on mainstream media, online, or wherever - is to remove all the emotion and insults and see what’s left. You know what I see when I parse your comments like that? Very little.
Thus I conclude you have nothing of importance to say, and you become background noise that gets tuned out.
Actually your comments do have some small value. I check your bootlicker-comment-score and if it’s greater than 5, I know the community you posted in isn’t worth my time.
It was a Sharp “Memory LCD”.
https://sharpdevices.com/memory-lcd/
Basically “visible memory storage”.
You treat it as addressable memory and write into it, and it will hold that state using about 15 microwatts to do so.
You can still buy the display modules; there are a few boards that let you easily drive them with Arduinos and the like.
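That 15 µW hold figure is remarkable. A rough battery-life estimate, assuming a 3 V coin cell with 220 mAh of usable capacity (both assumed figures, not from any datasheet):

```python
# How long could a coin cell hold a static image on a Memory LCD?
hold_uw = 15        # microwatts to retain the image (figure from the comment)
supply_v = 3.0      # assumed coin-cell voltage
cr2032_mah = 220    # assumed usable capacity

hold_ua = hold_uw / supply_v           # I = P / V, in microamps
hours = cr2032_mah * 1000 / hold_ua    # µAh capacity / µA draw
print(f"{hold_ua:.0f} µA draw -> roughly {hours / 24 / 365:.1f} years on paper")
```

Self-discharge of the cell would dominate long before the display drained it, but it illustrates why these panels show up in badges and shelf labels.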
“Akshually”, so do you. You had a chance to discuss and inform, and instead you went straight to “bootlicker”.
Do you think they’re going to take any notice of whatever you say from here on?
What was Wenger thinking, sending Walcott on that early?
The distinction is “through which users”.
Merely putting something online does not make it social media. The key is the ability for users/passers-by to add their own content and/or comments, which then allows for interaction between users.
Well you see, engagement is down, and the whole “sponsored content” thing is in a death spiral due to AI slop. So Meta has decided to cut out the middleman and generate their own AI slop, because surely their version of personalised AI slop will solve the whole engagement problem and keep the line always going up, because if there’s one thing users love, it’s an endless torrent of AI slop.
Try “lspci -vv” first to see the devices on the bus and to figure out which device is causing this.
Secondly, check all your BIOS “performance” settings, such as memory timings and bus speeds, and set them back to defaults.
See how things go after that.
Off the top of my head I’d say:
And holy shit does their algorithm latch onto any minor interest in their content.
Accidentally tapped on a floor-tiling video the other day; cue three days of tiling and handyman videos jammed into my feed, with me pressing the “not interested” button on every single one.
Facebook, I am there for the rare post from my 150 or so friends and family. That’s it. Nothing else.
The reason we don’t use it anymore is that actual posts from real humans we know are buried under a torrent of shit. Sometimes their posts take days to surface, leading to all sorts of chain-mail posts on how to “get your feed back”. None of which work, because the whole business model is about jamming sponsored shit down your throat.
Trying to, because there is no more money to continue development.
Hopefully they can pull it off and do the same as Pebble did when they released a last firmware update for their watches that allowed third party servers to be used.
Well, anything’s possible, I say we give it a try and see what happens.