

Oh, look, the reason Dodge reliability is garbage.
I’ll take “things that might have happened if Elon didn’t cut the department that regulates self-driving” for $100, Alex.
In my experience, LLMs are good for code snippets and input on best practices.
I use it as a tool to speed up my work, but I don’t see it replacing even entry jobs any time soon.
So true, it’s an amazing tool for learning. I’ve never been able to learn new frameworks so fast.
AI works very well as a consultant, but if you let it write the code, you’ll spend more time debugging because the errors it makes are often subtle and not the types of errors humans make.
True, I should have said a benefit that is a negative for the consumer.
You’re getting down-voted, but, yes, this change only really affects user experience.
I don’t know why anyone would think that what the LLM can access for context during your session is a limiting factor for what OpenAI has access to.
If this change freaks you out, the time for you to be freaked out about history was the moment they started storing it.
I think you might be confused about the difference between giving the LLM access to your stored conversations during your session and OpenAI using AI to search your stored conversations.
What the LLM has access to during your session changes nothing but your session.
It’s not some “I, Robot” central AI that either has access or doesn’t as a whole.
That is the difference, but it’s a pretty minimal difference. OpenAI hardly needs to give the LLM access to your conversations during your session in order to access your conversations itself.
In fact, I don’t see any direct benefit to OpenAI with this change. All it does is (probably) improve its answers to the user during a session.
I’m not going to defend OpenAI in general, but that difference is meaningless outside of how the LLM interacts with you.
If data privacy is your focus, it doesn’t matter that the LLM has access to it during your session to modify how it reacts to you. They don’t need the LLM at all to use that history.
This isn’t an “I’m out” type of change for privacy. If it is, you missed your stop when they started keeping a history.
You mean DE1, right?
Mint for general use.
Nobara or PopOS for gaming.
Edit - you know what’s dumb about silent down votes? If you have an opinion, share it.
*pastes list of 100 numbers*
"Flunky, put this in a WHERE IN SQL statement."
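That kind of grunt work is a one-liner anyway. A minimal sketch of what the "flunky" task looks like, assuming a hypothetical `orders` table and `order_id` column; using parameterized placeholders instead of pasting the literals in also avoids injection issues:

```python
# Hypothetical example: turn a pasted list of numbers into a WHERE ... IN clause.
ids = [101, 202, 303]  # stand-in for the pasted list of 100 numbers

# One "?" placeholder per value, to be bound by the DB driver.
placeholders = ", ".join("?" for _ in ids)
query = f"SELECT * FROM orders WHERE order_id IN ({placeholders})"
print(query)  # SELECT * FROM orders WHERE order_id IN (?, ?, ?)
```

The same pattern works with any DB-API driver by passing `ids` as the parameter tuple.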
It’s taught me a few new tricks. It’s good at short bits of code, but anything longer is likely to have problems that take as much time to fix as writing it yourself to begin with.
I also find learning something new to be much faster when you can just ask a question about best practices or why something works like it does.
Nobara or Pop! OS would be good choices.
Yeah, VR is still catching up, but I feel like (dual) booting to Win 10 just for specific purposes would greatly reduce the risk.
Yeah, though the joke is funny, this is the real answer.
Storage is cheap compared to creating custom libraries.
Yeah, no matter how you disorganize 60,000 rows, the data is still going to be read into memory in one go.
Right? There’s no part of that xeet that makes any real sense coming from a “data engineer.”
Terrifying, really.
You could query 60,000 rows on a low tier smart phone. Makes no sense at all.
Even if it was local, a raspberry pi can handle a query that size.
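Just to put a number on it, here's a rough sketch: build a 60,000-row table in an in-memory SQLite database and query it. The table and column names are made up for the demo; on any commodity hardware, Pi included, the query time is on the order of milliseconds:

```python
import sqlite3
import time

# Throwaway in-memory database with a 60,000-row table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val REAL)")
conn.executemany(
    "INSERT INTO t (id, val) VALUES (?, ?)",
    ((i, i * 0.5) for i in range(60_000)),
)

# Time a full filtering query over the whole table.
start = time.perf_counter()
rows = conn.execute("SELECT * FROM t WHERE val > 100").fetchall()
elapsed = time.perf_counter() - start
print(len(rows), f"{elapsed:.4f}s")
```

Even with no index on `val`, scanning 60,000 rows is nothing for any modern device.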
Edit - honestly, it reeks of a knowledge level that calls the entire PC a “hard drive”.
Reading over that list, I don’t really see anything that isn’t “maybe gets read privileges for non-critical data”. Hardly useful enough to be worth attempting access to a single personal Jellyfin server.
I’d be mildly surprised if anyone has ever bothered.
You do you, but in my view the effort outweighs the benefits.