

Good as in “the worst that the Internet has to offer is an old man orgy”.
It was a simpler time.
— GPG Proofs —
This is an OpenPGP proof that connects my OpenPGP key to this Lemmy account. For details check out https://keyoxide.org/guides/openpgp-proofs
[ Verifying my OpenPGP key: openpgp4fpr:27265882624f80fe7deb8b2bca75b6ec61a21f8f ]
Jesus, tell me that site is still up. I need to be reminded of when the Internet had good things.
I bought a car that comes with a “free” 300k/30 year warranty, but only if I do oil changes every 4k miles or 3 months. Maybe this guy has something similar?
For me, I may try and keep it up for a bit, but driving to one particular dealer every 3 months just to get a ridiculous warranty that will probably never actually pay out isn’t worth it.
I’m not sure if the person I replied to was thinking about this movie in particular, but it certainly came to mind when I posted that gif:
Also of note - if you’re using docker (and Linux), make sure the user ID/group ID match across everything to eliminate any permissions issues.
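One way to do that (a sketch, assuming the common UID/GID 1000 — the service name, image, and bind-mount path are placeholders, not anyone’s actual setup) is to pin the container process to the same IDs that own the host directory:

```yaml
services:
  myservice:             # hypothetical service name
    image: example/app   # placeholder image
    # Run the container process as the same UID:GID that owns the bind mount
    user: "1000:1000"
    volumes:
      - /srv/appdata:/data
```

You can check the right numbers with `id -u` / `id -g` on the host and `ls -ln` on the bind-mounted directory.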
I dunno, Trump showed the world you can completely give up decades of hard won soft power in only two months, maybe China will think it’s cool?
/S
But the prices will come down if tariffs are removed, right? Right!?
I’m doing that with docker compose in my homelab, it’s pretty neat!
services:
  ollama:
    volumes:
      - /etc/ollama-docker/ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    image: ollama/ollama
    ports:
      - 11434:11434
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0']
              capabilities:
                - gpu
  open-webui:
    build:
      context: .
      args:
        OLLAMA_BASE_URL: '/ollama'
      dockerfile: Dockerfile
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - /etc/ollama-docker/open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434/'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
Sure! I mostly followed this random youtuber’s video for getting Wyoming protocols offloaded (Whisper/Piper), but he didn’t get Ollama to use his GPU: https://youtu.be/XvbVePuP7NY.
For getting the Nvidia/Docker passthrough, I used this guide: https://www.bittenbypython.com/en/posts/install_ollama_openwebui_ubuntu_nvidia/.
It’s working fairly great at this point!
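For reference, the Wyoming Whisper/Piper services can run as sibling containers in the same compose stack (a sketch based on the rhasspy images — the model/voice choices and volume paths are assumptions, not the exact config from the video):

```yaml
services:
  whisper:
    image: rhasspy/wyoming-whisper
    container_name: whisper
    command: --model small --language en   # assumed model choice
    ports:
      - 10300:10300    # Wyoming STT port
    volumes:
      - /etc/ollama-docker/whisper:/data
    restart: unless-stopped
  piper:
    image: rhasspy/wyoming-piper
    container_name: piper
    command: --voice en_US-lessac-medium   # assumed voice choice
    ports:
      - 10200:10200    # Wyoming TTS port
    volumes:
      - /etc/ollama-docker/piper:/data
    restart: unless-stopped
```

Home Assistant’s Wyoming integration then points at the host on ports 10300 (Whisper) and 10200 (Piper).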
I spun up a new Plex server with a decent GPU - and decided to try offloading Home Assistant’s Preview Voice Assistant TTS/STT to it. That’s all working as of yesterday, including an Ollama LLM for processing.
Last on my list is figuring out how to get Home Assistant to help me find my phone.
Someone make it show
Doge ⬇️ Trump ⬇️ Congress
I want to see trump get asked about it in the next presser.
I (unfortunately) am in the market for a new-to-me car and I’m so pissed that my only options are either 15+ year old vehicles or spyware on wheels.
I may not rip out any antennas, but I’m damn sure going to try and interfere with them somehow.
This is the way. Layer 3 separation for services you wish to access outside of the home network and the rest of your stuff, with a VPN endpoint exposed for remote access.
It may be overkill, but I have several VLANs for specific traffic:
There are two new additions: an ext-vpn VLAN and an egress-vpn VLAN. I spun up a VM that’s dual homed running its own Wireguard/OpenVPN client on the egress side, serving DHCP on the ext-vpn side. The latter has its own wireless SSID so that anyone who connects to it is automatically on a VPN into a non-US country.
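A minimal sketch of the egress side of that VM (a WireGuard client config — the addresses, keys, and provider endpoint are all placeholders, not my actual setup):

```ini
# /etc/wireguard/wg0.conf — tunnel out to the VPN provider (egress-vpn side)
[Interface]
PrivateKey = <client-private-key>
Address = 10.64.0.2/32
# NAT traffic arriving from the ext-vpn VLAN out through the tunnel
PostUp = iptables -t nat -A POSTROUTING -o wg0 -j MASQUERADE
PostDown = iptables -t nat -D POSTROUTING -o wg0 -j MASQUERADE

[Peer]
PublicKey = <provider-public-key>
Endpoint = vpn.example.net:51820
AllowedIPs = 0.0.0.0/0   # route everything from this VLAN into the tunnel
```

With the ext-vpn VLAN’s default route pointing at this VM, clients on that SSID egress through the tunnel without any per-device VPN setup.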
My homelab is three Lenovo M920q systems, complete with 9th-gen i7 processors, 24 GB of RAM, and 10Gbps fibre/Ceph storage. Those mini PCs can be beasts.
I don’t know if I can completely explain the difference, but I would classify myself as a home labber not a self-hoster.
I use Proton for email and don’t have any YouTube/Twitter/etc alt front ends. The majority of my lab (below) is storage and compute for playing around with stuff like Kubernetes and Ansible to help me with my day job skills. Very little is exposed to the Internet (mostly just a VPN endpoint for remote lab work).
I view self-hosting as more of a, “let me put this stuff on the internet instead of using a corporation’s gear” effort. I know folks who host their own Mastodon instance, have their own alt front ends for various social media, their own self-hosted search engines.
This is what I did. Grabbed UL certified faceplates and a bunch of keystones. Total outlay was probably $50 including drywall/Sheetrock mounting brackets.
I went all out with Cat6A. I have some 10Gbps capabilities with my home lab, and although I currently do not have any 10GbE copper capable systems, I thought I’d try to future-proof it.
My only regret is that I only went with riser grade cable - plenum was way too much, even for plain Cat6.
I literally just did this over the Christmas break. The drywall mounting outlets are a game changer.
When I bought my Hyundai they pushed fucking hard to get me to sign up for their BlueLink product. Free for life! Look, map updates! You can personalize your driver profile pic! Want to remote start your car over the Internet?
Luckily, my VIN wasn’t working for registration (I guess it hadn’t quite gone through fast enough that the car was purchased). I’ve gone two months without BlueLink and I’m hoping that’s saving me from some of the info gathering (or at least it’s not directly linked to me).