

I just… don’t connect the TV to the internet. Never had an issue with anything like that.
Shit! Sorry, got my wires crossed, I actually meant locality of behavior. Basically, if you’re passing a monad around a bunch without sugar you can’t easily tell what’s in it after a while. Or at least I assume so, I’ve never written anything big in Haskell, just tons of little things.
I’m not sure if I entirely follow, but in general you actually have much better locality of behavior in Haskell (and FP languages in general) than imperative/OOP languages, because so much more is explicitly passed around and immutable. Monads aren’t an exception to this. Most monadic functions are returning values rather than mutating some distant state somewhere. Statefulness (or perhaps more precisely, mutable aliasing) is the antithesis of locality of behavior, and Haskell gives you many tools to avoid it (even though you can still do it if you truly need it).
I’m not really sure what you mean by “don’t really know what’s in it after a while”. It might be helpful to remember that lists are monads. If I’m passing around a list, there’s not really any confusion as to what it is, no? The same concept applies to any monadic value you pass around.
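To make that concrete, here's a tiny sketch (names and values are mine, not from any real codebase) of building and consuming a list with the list monad's do-notation; there's no mystery about what's "in" the monadic value:

```haskell
import Control.Monad (guard)

-- All pairs (x, y) drawn from two plain lists, built with do-notation.
-- The monad here is just [], so the result is an ordinary list.
pairs :: [(Int, Int)]
pairs = do
  x <- [1, 2, 3]
  y <- [10, 20]
  guard (x /= 2)   -- filtering inside the list monad
  return (x, y)
```

Evaluating `pairs` gives `[(1,10),(1,20),(3,10),(3,20)]`; passing it to another function is no different from passing any other list.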
Yeah, that makes tons of sense. It sounds like Transaction is doing what a string might in another language, but just way more elegantly.
I think you might have misunderstood what I was describing. The code we write doesn’t actually change, but the behavior of the code changes due to the particular monad’s semantics. So for example, let’s say I write a query that updates some rows in a table, returning a count of the rows affected. In this Transaction code block, let’s say I execute this query and then send the returned number of rows to an external service. In code, it looks like the API call immediately follows the database call. To give some Haskell pseudocode:
    example :: Transaction ()
    example = do
      affectedRows <- doUpdateQuery
      doApiCall affectedRows
      return ()
But because of how Transaction is defined, the actual order of operations when example is run becomes this:

1. Send BEGIN; to the DB
2. doUpdateQuery
3. Send COMMIT; to the DB
4. If the commit succeeded, doApiCall affectedRows. Otherwise, do nothing

In essence, the idea is to allow you to write code where you can colocate your side-effectful code with your database code, without worrying about accidentally holding a transaction open unnecessarily (which can be costly) or firing off an API call mistakenly. In fact, you don’t actually have to worry about managing the transaction at all; it’s all done for you.
That fits into the data-generation kind of application. I have no idea how you’d code a game or embedded real-time system in a non-ugly way, though.
I mean, you’re not going to be using an SQL database most likely for either of those applications (I realize I assumed that was obvious when talking about transactions, but perhaps that was a mistake to assume), so it’s not really applicable.
I also generally get the impression that you have a notion that Haskell has some special, amorphous data-processing niche and doesn’t really get used in the way other languages do, and if that’s the case, I’d certainly like to dispel that notion. As I mentioned above, we have a pretty sizeable backend codebase written in Haskell, serving up HTTP JSON APIs for a SaaS product in production. Our APIs drive all (well, most) user interaction with the app. It’s a very good choice for the typical database-driven web and mobile applications of businesses.
Ironically, I actually probably wouldn’t use Haskell for heavy data processing tasks, namely because Python has such an immense ecosystem for it (whether or not it should is another matter, but it is what it is)… What Haskell is great at is stuff like domain modeling, application code (particularly web applications where correctness matters a lot, like fintech, healthcare, cybersecurity, etc.), compilers/parsers/DSLs, CLI tools, and so on.
I’m not sure what you mean by “locality of reference”. I assume you mean something other than the traditional meaning regarding how processors access memory?
Anyway, it’s often been said (half-jokingly) that Haskell is a nicer imperative language than imperative languages. Haskell gives you control over what executing an “imperative” program actually means in a way that imperative languages don’t.
To give a concrete example: we have a custom monad type at work that I’m simply going to call Transaction (it has a different name in reality). What it does is allow you to execute database calls inside of the same transaction (and it can be arbitrarily composed with other code blocks of type Transaction while still being guaranteed to be inside of the same transaction), and any other side effects you write inside the Transaction code block are actually collected and deferred until after the transaction successfully commits, and are otherwise discarded. Very useful, and not something that’s very easy to implement in imperative languages. In Haskell, it’s maybe a dozen lines of code and a few small helper functions.
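For anyone curious what that might look like, here's a toy sketch of my own. To be clear, this is not the actual code described above: it's a simplified version (effects just collected in a list, no rollback or exception handling) to show the general shape:

```haskell
-- A toy Transaction-like monad: "database" actions run inside the
-- transaction, while other side effects are collected and only run
-- after the commit. (Stubbed with IO and stdout for illustration.)
newtype Transaction a = Transaction { runTx :: IO (a, [IO ()]) }

instance Functor Transaction where
  fmap f (Transaction m) = Transaction $ do
    (a, effs) <- m
    pure (f a, effs)

instance Applicative Transaction where
  pure a = Transaction (pure (a, []))
  Transaction mf <*> Transaction ma = Transaction $ do
    (f, e1) <- mf
    (a, e2) <- ma
    pure (f a, e1 ++ e2)

instance Monad Transaction where
  Transaction m >>= k = Transaction $ do
    (a, e1) <- m
    (b, e2) <- runTx (k a)
    pure (b, e1 ++ e2)

-- Run a "database" statement inside the transaction.
dbAction :: IO a -> Transaction a
dbAction io = Transaction $ do
  a <- io
  pure (a, [])

-- Defer a side effect until after the transaction commits.
deferEffect :: IO () -> Transaction ()
deferEffect eff = Transaction (pure ((), [eff]))

-- BEGIN, run the body, COMMIT, then fire the deferred effects.
runTransaction :: Transaction a -> IO a
runTransaction (Transaction m) = do
  putStrLn "BEGIN;"        -- stand-in for sending BEGIN to the DB
  (a, deferred) <- m
  putStrLn "COMMIT;"       -- stand-in for sending COMMIT to the DB
  sequence_ deferred       -- deferred effects run only after commit
  pure a
```

The real thing would discard the deferred effects if the commit fails, but the core trick is the same: `>>=` threads the accumulated effects along, and only `runTransaction` ever fires them.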
It also has a type system that is far, far more powerful than what mainstream imperative programming languages are capable of. For example, our API specifications are described entirely using types (using the servant library), which allows us to do things like statically generate API docs, type-check our API implementation against the specification (so our API handlers are statically guaranteed to return the response types they say they do), automatically generate type-safe API clients, and more.
We have about half a million lines of Haskell in production serving as a web API backend powering our entire platform, including a mobile app, web app, and integrations with many third parties. It’s served us very well.
As a senior engineer writing Haskell professionally, this just isn’t really true. We just push side effects to the boundaries of the system and do as much logic and computation in pure functions.
It’s basically just about minimizing external touch points and making your code easier to test and reason about. Which, incidentally, is also good design in non-FP languages. FP programmers are just generally more principled about it.
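A minimal sketch of that "functional core, imperative shell" shape (function names and data are mine, purely illustrative):

```haskell
-- Functional core: all the decision-making lives in pure functions,
-- which makes them trivially testable.
summarize :: [Int] -> (Int, Int)   -- (count, total)
summarize xs = (length xs, sum xs)

-- Imperative shell: the only place side effects happen.
runReport :: IO ()
runReport = do
  let readings = [3, 1, 4, 1, 5]   -- in real code, read from a file or API
  let (n, total) = summarize readings
  putStrLn (show n ++ " readings, total " ++ show total)
```

Testing `summarize` needs no mocks or fixtures at all; only the thin shell ever touches the outside world.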
The point of the joke is not that the Python interpreter will change types mid-program on its own, but that you don’t have any real way of knowing if you’re going to get the type you expect.
Programs are messy and complicated, and data might flow through many different systems before finally being used for output. It can and often does happen that one of those systems does not behave as expected, and you get bugs where one type is expected but another is used in actuality.
Yes, most likely what would happen in Python is a TypeError, not actual output, but it was pretty clearly minor hyperbole for the sake of the joke.
You don’t see how type mismatch errors can happen in a dynamically-typed language? Then why do they happen all the time? Hell, I literally had a Python CLI tool crash with a TypeError last week.
1GB of files is not configuration.
Have you used Jira? It’s a memory guzzler.
Kind of, though they honestly just do pretend immutability. Object references are still copied everywhere.
Forced to use copilot? Wtf?
I would quit, immediately.
Yep, senior Haskell developer here and I have had their recruiters hounding me many times, even though I have told them to fuck off again and again.
I always find it so funny that they chose Haskell. They are desperate to hire, but no one in the Haskell community actually wants to work for them. I’m in a Discord server with a bunch of veteran Haskellers, and no one there will touch them with a 100-foot pole.
Would be the most sane thing he’s ever done.
Blocking hexbear is a sensible choice, good for them.
It very well could be typical corporate fuckery, but it makes me wonder if it’s actually a bug: computing the per-kg price from the single-unit price but dividing by the total weight of the pack.
Or perhaps it’s a “bug” that’s left intentionally until called out.
Its query planner is also much, much more powerful. Like it’s not even close.
There’s hardly any good reason to use MySQL today. Postgres is easier and nicer to work with, with a strong community backing it.
SQLite is completely different from both and has entirely different use cases.
Postgres, hands down. It’s far better than MySQL in every way.
The system has a lot of problems for sure, but IME as a senior software engineer, people without degrees are often lacking in core CS skills and are much less comfortable with the more conceptual aspects of the field like graph theory, systems design, DSLs, etc. Usually database skills aren’t quite as strong either due to not having studied relational algebra and other database concepts.
None of this is to say that someone without a degree can’t be a valuable part of the team, but at the higher levels of seniority, you do want people with really strong foundations, so you can be confident the systems you build rest on them. A degree doesn’t guarantee these qualities, but it certainly makes a person much more likely to have them. Not saying someone without a degree can’t possibly achieve this on their own, but it’s quite rare and requires much more self-study than most actually do.
First off, videos on tiktok aren’t really worth taking seriously. There’s just too much fake garbage on there.
But anyway, the cost of education is absolutely a huge problem. It should be free or very low cost.
That being said, it’s simply demonstrably false to claim that a degree is useless or doesn’t help you get a job. There are many fields where a degree is absolutely a requirement, like medicine, law, engineering, etc. The specific degree does matter a lot, though, and there are other important job-hunting skills that you need to develop in order to actually get a job.
Speaking from personal experience, every job I’ve had thus far (as a software engineer) has listed a 4 year degree as either a hard requirement or strongly preferred. I do not believe recruiters would have given me the time of day were it not for my degree, because they are looking to match as many requirements as possible and are filtering people out. And when applying for jobs, ATS programs routinely filter out job applications with resumes that don’t list a degree.
Job seeking is an extremely gamified system, and you have to learn the game in order to beat it. It sucks big time and I loathe doing it, but it’s what you have to do if you want to get high-paying jobs. That, or know someone at a company who can get you a job.
There are many other FOSS CMS systems, and WordPress has always been trash from a technical perspective. It just was one of the first options in the early internet and consequently developed an ecosystem early. Would be great if it would go away so it stops sucking all of the air out of the room and alternatives can pick up steam.
I’m no chef but… smoked paprika? Sounds like it could fit the bill, maybe.