"The most important section is that of 2-3 year old vehicles, because maintenance and mileage play lesser roles in reliability. The best performers in this category were the Mazda2 (2.9% defect rate)"
Once again, my intuition is wildly off regarding how bad even the relatively good things are. 3% defect rate is good?
Tesla seems insane. How do you get away with being so much worse for so many years in a highly competitive market?
Hot take: Actual "irreversibly delete x stuff with the next action" is simply too powerful and bad design for most people, and has probably caused considerably more harm than good in the world. It's particularly silly with software, where few reasons exist for this to be an actual thing.
What the average human needs is laws and enforcement, and trust in both.
The tale of the coder, who finds a legacy codebase (sometimes of their own making) and looks at it with bewilderment is not new. It's a curious one, to a degree, but I don't think it has much to do with vibe coding.
I have been working for 20 years and I haven’t really experienced this with any code I’ve written. Sure I don’t remember every line but I always recall the high level outlines.
You can get some really hefty fines for not playing by the rules. It's taken extremely seriously in basically every aspect of life in Europe. Unfortunately it's not enforced hard enough against US corporate empires like Meta, but it absolutely works.
> It's taken extremely seriously in basically every aspect of life in Europe
Yeah, like every single cookie banner out there not actually being compliant. A regulation can't be considered meaningfully enforced when every single storefront has openly breached it with total impunity for years.
Yeah... Ask Schrems about the hefty fines and all the pretty things brought to Europeans by the GDPR. Come on! The GDPR is at best a pretty face on a rotten nothing-burger.
Nah it’s privacy. Gotta get consent from users. Cookies, GDPR, and all. Meta has learned from their fines, and isn’t opting users automatically into features.
The first two paragraphs of that article are a wild incoherent ride, mixing all kinds of things that have very little to do with each other (feel slighted [...] even the slightest mistreatment [...] failed to deliver birthday greetings on time).
Regardless of the analysis, and without having read the study, I agree with the sentiment in the headline, and it's a sad agreement that matches my experience: making employees feel not slighted works really well, for both the company and the employee. It does not require that you respect anyone, and often actually runs counter to it.
Once you figure that out as an employer, I can see why you would choose to just get better at fooling and distracting people.
Perfect summary. I'll add: insane defaults that'll catch you unaware if you're not careful! Like foreign keys being opt-in; sure, it'll create 'em, but it won't enforce them by default!
Always send "pragma foreign_keys=on" first thing after opening the db.
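To see the point concretely, here's a minimal sketch using Python's built-in sqlite3 (table names are made up): under the default setting an orphaned child row is silently accepted, and it's only rejected once the pragma is on.

```python
import sqlite3

# isolation_level=None -> autocommit, so the PRAGMA isn't silently
# swallowed by an implicitly opened transaction.
con = sqlite3.connect(":memory:", isolation_level=None)
con.executescript("""
    CREATE TABLE parent (id INTEGER PRIMARY KEY);
    CREATE TABLE child (parent_id INTEGER REFERENCES parent(id));
""")

# Default: foreign_keys is OFF, so this orphan row is accepted silently.
con.execute("INSERT INTO child VALUES (42)")

con.execute("PRAGMA foreign_keys = ON")
fk_error = None
try:
    con.execute("INSERT INTO child VALUES (99)")  # now it's rejected
except sqlite3.IntegrityError as e:
    fk_error = e
print(fk_error)  # FOREIGN KEY constraint failed
```

Note the pragma is per-connection, which is why it has to be re-sent every time you open the db.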
Some of the type sloppiness can be worked around by declaring tables to be STRICT. You can also add CHECK constraints that a column value is consistent with the underlying representation of the type -- for instance, if you're storing IP addresses in a column of type BLOB, you can add a CHECK that the blob is either 4 or 16 bytes.
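A quick sketch of both halves of that in Python's sqlite3 (schema is illustrative): first the default sloppiness, then a CHECK constraint along the lines described for blob-encoded IPs.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Without STRICT, an INTEGER column happily stores whatever it's given.
con.execute("CREATE TABLE loose (n INTEGER)")
con.execute("INSERT INTO loose VALUES ('not a number')")
stored = con.execute("SELECT n, typeof(n) FROM loose").fetchone()
print(stored)  # ('not a number', 'text')

# A CHECK constraint can at least pin down the representation:
# IPs stored as 4-byte (IPv4) or 16-byte (IPv6) blobs, nothing else.
con.execute("""
    CREATE TABLE addrs (
        ip BLOB CHECK (typeof(ip) = 'blob' AND length(ip) IN (4, 16))
    )
""")
con.execute("INSERT INTO addrs VALUES (?)", (bytes([127, 0, 0, 1]),))  # ok
check_error = None
try:
    con.execute("INSERT INTO addrs VALUES (?)", (b"wrong-len",))  # 9 bytes
except sqlite3.IntegrityError as e:
    check_error = e
```

Unlike STRICT, the CHECK approach works on any SQLite version, at the cost of writing the constraint per column.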
The fact that they didn’t make STRICT default is really a shame.
I understand maintaining backwards compatibility, but the non-strict behavior is just so insane I have a hard time imagining it doesn’t bite most developers who use SQLite at some point.
> The fact that they didn’t make STRICT default is really a shame.
SQLite makes strong backwards-compatibility guarantees. How many apps would be broken if an Android update suddenly defaulted its internal copy of SQLite to STRICT? Or if it decided to turn on foreign keys by default?
Those are rhetorical questions. Any non-zero percentage of affected applications adds up to a big number for software with SQLite's footprint.
Software pulling the proverbial rug out from under downstream developers by making incompatible changes is one of the unfortunate evils of software development, but the SQLite project makes every effort to ensure that SQLite doesn't do any rug-tugging.
Nearly every default setting in sqlite is "wrong" from the outset, for typical use cases. I'm surprised packages that offer a sane configuration out of the box aren't more popular.
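Not an endorsement of any particular package, but the kind of "sane configuration" preamble people usually mean looks something like this; the exact choices are app-dependent assumptions, not universal truths:

```python
import sqlite3

# "app.db" is an example path; isolation_level=None puts the
# connection in autocommit so the pragmas take effect immediately.
con = sqlite3.connect("app.db", isolation_level=None)
con.executescript("""
    PRAGMA journal_mode = WAL;    -- readers no longer block the writer
    PRAGMA synchronous = NORMAL;  -- safe under WAL, much faster than FULL
    PRAGMA foreign_keys = ON;     -- actually enforce declared foreign keys
    PRAGMA busy_timeout = 5000;   -- wait up to 5 s instead of failing fast
""")
```

Since most of these are per-connection, a wrapper that runs them on every open is exactly the kind of package the parent is describing.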
I mean, it has blob types, which basically means you can implement any type you want. You can also trivially implement custom application functions to work on these blob types in your queries. [1]
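For example, with Python's sqlite3 (the function name and schema here are made up for illustration), registering a SQL function that understands a blob-encoded IP is a one-liner:

```python
import ipaddress
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE addrs (ip BLOB)")
con.execute("INSERT INTO addrs VALUES (?)", (bytes([10, 0, 0, 1]),))

# Register a custom SQL function that renders a packed-blob IP as text.
con.create_function("ip_str", 1, lambda b: str(ipaddress.ip_address(b)))

rendered = con.execute("SELECT ip_str(ip) FROM addrs").fetchone()[0]
print(rendered)  # 10.0.0.1
```

The same mechanism works for comparison or aggregation over any blob encoding you invent.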
Isn't SQLite a de facto standard? Seems like it to me. If I want an embedded SQL engine, it is the "nobody got fired for selecting" choice. A competitor needs to offer something very compelling to unseat it.
Yeah, that's the one prominent example but, like you said, also just rather recently. Since "the network is slow, duh" has always been true, I wonder why.
My guess would be that performance improvements (mostly hardware from Moore's law and the proliferation of SSDs, but also SQLite itself) have led to far fewer websites needing to run on more than 1 computer, and most are fine on a $5/month VPS
I haven't investigated this so I might be behind the times, but last I checked, remotely managing an SQLite database, having some sort of dashboarding tool run management reporting queries, or making a Retool app for it was very messy. The benefit of not being networked becomes a downside.
Maybe this has been solved though? Anybody here running a serious backend-heavy app with SQLite in production and can share? How do you remotely edit data, do analytics queries etc on production data?
It is for use cases like local application storage, but it doesn't do well in (or isn't designed for) concurrent use cases like any networked services. SQLite is not like the other databases.
It's becoming so! Rails devs are starting to ship SQLite to production. It's not just for their main database either... it's replacing Redis for them, too.
> when it became clear the work could be financially exploited
That is not the obvious reason for the change. Training models got a lot more expensive than anyone thought it would.
You can of course always cast doubt on people's true motivations and intentions, but there is a plain truth here that is simply silly to ignore.
Training "frontier" open LLMs seems to be possible exactly when a) you are Meta, have substantial revenue from other sources, and are simply okay with burning your cash reserves to try to make something happen, and b) you copy and distill from the existing models.