

Gotta link for that? Can’t imagine what tuning would make that possible simply from library choices or kernel flags.


Am I missing something? I’m seeing like a 1-3% difference in most things aside from that Pogocache bench.
DHH is right-wing diarrhea


This will be another massive fucking failure in the AI space.
Instead of making these dumbass, inefficient stacks they've built all this crap on MORE efficient, they're like "Space is cold… sounds good."
This planet is doomed.


These MFers are in SO MUCH TROUBLE it is absolutely insane. There is no way out of this by way of revenue, and this shit should be ILLEGAL. Idiots and morons have taken over the government, and are going to collapse the stock market because OpenAI can’t fucking stop making promises.
When we kick Trump out, these assholes should be put in jail.


Have you updated firmware for the card and your BIOS lately?
Edit: also, who makes the card?


Jensen… c’mon, bud. Take those old orange balls out of your fucking mouth.


Probably shouldn’t have fired all those tech specialists that worked there then, huh?


Whoa, looks like your home directory or user got orphaned somehow during the upgrade.
Run `who` and see what the output is. Then `sudo su -u`, enter your password, then `ls -lh /home`. What's the output?
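If you'd rather script the check, here's a minimal sketch of the same idea: look up the passwd entry for your UID and see whether your home directory still exists and is still owned by you. (This is just an illustration of the diagnosis, not an official tool.)

```python
import os
import pwd

# Passwd entry for whoever is running this script.
me = pwd.getpwuid(os.getuid())
home = me.pw_dir
print(f"user={me.pw_name} uid={me.pw_uid} home={home}")

if not os.path.isdir(home):
    print("home directory is missing entirely")
else:
    st = os.stat(home)
    # An "orphaned" home dir usually shows up as a UID mismatch here.
    if st.st_uid != me.pw_uid:
        print(f"home owned by uid {st.st_uid}, not {me.pw_uid} -- looks orphaned")
    else:
        print("home ownership looks fine")
```

Run it as the affected user; a UID mismatch or missing directory points at the upgrade having orphaned the account.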


Resistant, but not guaranteed. Nothing is guaranteed when it comes to encryption.


This is a great talk, but it's ignoring the real issue: it would need to be "in-line," which is nowhere near possible. They sort of address that, but they're mostly talking about the ciphers themselves.
I think we've reached the cusp where we can exchange new derivative keys on the fly, per request, without making too much of a dent in speed, but that comes with all kinds of tradeoffs on session length and convenience, I suppose.
Edit: I guess there is another eventuality where governments just go and farm public keys and use them against targeted traffic. There's no good way to beat that right now.
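For a sense of what "new derivative keys per request" means mechanically, here's a toy sketch: each side generates a fresh ephemeral Diffie-Hellman keypair for one exchange, and the shared secret is hashed into a throwaway session key. The parameters are toy-sized for illustration only (a 64-bit prime is NOT secure; real systems use X25519 or the RFC 3526 groups).

```python
import hashlib
import secrets

# TOY parameters for illustration -- far too small for real use.
P = 0xFFFFFFFFFFFFFFC5  # 2**64 - 59, a prime
G = 5

def new_request_keys():
    """Generate a fresh throwaway keypair for a single request."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

# Each side mints ephemeral keys for this one request...
a_priv, a_pub = new_request_keys()
b_priv, b_pub = new_request_keys()

# ...swaps public values, and both derive the same symmetric key.
k_a = hashlib.sha256(str(pow(b_pub, a_priv, P)).encode()).digest()
k_b = hashlib.sha256(str(pow(a_pub, b_priv, P)).encode()).digest()
assert k_a == k_b
```

The cost per request is a couple of modular exponentiations (or curve operations in practice), which is the "dent in speed" tradeoff the comment is weighing.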


You seem to think the entire world is connected by fiber now. Not the case. 98% of long-distance cable runs across the world are still air-run copper. That won't immediately impact high-dollar facilities like datacenters or undersea cables that are interconnected, but everything else will take a hit.


The only thing that could take the entirety of the Internet for a bit of time is a massive EMF event that damages enough infrastructure to disable point-to-point communication between nodes. This means something like a Coronal Mass Ejection so large it cooks all satellites on its way in (on one side of orbit at least), then toasts a lot of other protected hardware on the ground.
The P2P nature of the internet would be hard to kill in totality with one event, in any sense of the word. At the very least, it would be quick to get local infrastructure back up within hours, assuming the entire DNS system isn't destroyed.


May want to throw this in the mix: https://cryptomator.org/


That’s…not even accurate, what in the world 🤣


You are not reading or understanding comments, child. That’s not what I said whatsoever.
You're of the opinion that DevOps engineers can be automated away. I proved you wrong. Now you're talking about the tooling, which is in my first comment to you. Automated tooling is NOT all DevOps is, and the fact you think it is shows me you're unseasoned in whatever it is you do and have no concept of the role.
It would be people like you who would not pass the first round of interviews after answering a question on this topic exactly as you've stated it, because you don't understand the core function of the team, or your role in it.
You make some code, and obviously have no idea how to run it, let alone at scale. All the working pieces of a platform at large need to be understood and vetted by a DevOps team in order to make it run, and run well. That's understanding everything from start to finish, in ways you wouldn't be able to comprehend being one part of a team that is building one part of a platform. You can't make an agent that understands all the underpinnings of all the services or metrics, and why they fail, that then takes action on them, because that's not something AI does. Case in point: the AWS outage and the others I mentioned.
Now, you could make MANY agents that take actions on many things, but that doesn’t give situational awareness or comprehension to any singular agent, something AI also doesn’t do. That’s what DevOps teams do.
I don't even need to keep arguing with you about this, because the downvotes on your comments speak for themselves. I'm just trying to correct your false understanding of how it all works so you don't stumble through your career making the same comments and mistakes.


Smart. Kids are really hyping physical media again, akin to when '80s kids brought vinyl back in the 2000s.


Bit of a stretch the way this is angled…