Seriously, what app is capable of saturating 13GB?
So then why does it have 16GB?
Have you noticed how expensive these things have gotten? Doesn’t it seem like charging you for more RAM than is necessary, and then locking 20% of it away for “features” no one wants, is a bad thing?
What if we use GrapheneOS? I would guess that RAM would no longer be reserved and you could use the full 16GB.
I think it’s likely you would have access to all of it, because the testing in the article clearly shows the kernel can see the memory. Thus, the GrapheneOS kernel should be able to use it.
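If you have a shell on the device (Termux, or `adb shell`), you can check this yourself. A rough sketch, using standard Linux commands, nothing GrapheneOS-specific:

```shell
# MemTotal is the RAM the kernel was handed at boot.
# Memory carved out for firmware or reserved "features" before
# the kernel starts will NOT show up here.
grep MemTotal /proc/meminfo

# Same figure, human-readable, plus current usage:
free -h
```

If MemTotal is close to 16GB, the kernel sees it all; if it's noticeably lower, the reservation happens before the OS ever gets the memory, and swapping ROMs won't recover it.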
I have no idea.
It’s not about a single app, it’s about multitasking without having to reload apps.
At various times I’ve juggled between 4 apps at once on my phone. Say something like Messaging, Firefox, maybe a lemmy app, and Bitwarden for logging into something.
Me too, and I’ve never had an issue juggling those apps on my Pixel 6 with 8GB of RAM.
At that point I go to my PC.
I think it’s more to do with app switching without having to reload or losing your state in minimized apps.
If this is enabled by default it’ll certainly be needed, especially in cases like multitasking.
https://android-developers.googleblog.com/2024/08/adding-16-kb-page-size-to-android.html?m=1
From the linked article. So I doubt the larger page size is the (only) reason for 16GB of RAM; AI is the more likely reason.
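For what it's worth, the page size from that article is easy to check. Assuming a Linux shell (locally, or on-device via Termux or `adb shell`, it's the same command):

```shell
# Kernel page size in bytes:
#   4096  = the classic 4 KB pages most devices use today
#   16384 = the new 16 KB pages the article is about
getconf PAGE_SIZE
```

The RAM cost of 16 KB pages is per-allocation overhead (roughly a few percent, per Google's own numbers), so it wouldn't come close to explaining a multi-gigabyte reservation on its own.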
Oh, I understand that it’s for AI; I just meant that more RAM would certainly help in this case.
Ollama server running in Termux
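For anyone curious, a rough sketch of that setup. Package availability and the model name are assumptions here; check Termux's repo and Ollama's model library for what actually fits in your RAM:

```shell
# Inside Termux (assumes the ollama package is in Termux's repos):
pkg install ollama

# Start the server in the background:
ollama serve &

# Pull and chat with a small model (model name is an example;
# pick one sized for your device's free RAM):
ollama run llama3.2
```

This is exactly the kind of workload where those extra gigabytes matter: the whole model has to sit in RAM while it runs.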