Can’t exit Vim
Ah yes, the legendary filter
the only reason people use vim is because they are stuck in there
I first tried vi in the early 90s, before I had easy access to online resources. I had to open a new shell and kill the vi process to exit it. Next time I dialed into my usual BBS I asked how to exit that thing. But since then I’ve liked it, because vi has been on every system I ever ssh’ed into.
You quit it just like you quit ed or ex, just that you have to enter the prompt (:) yourself, as vi is not by default in prompt mode. And you should know ed; ed is the standard editor.
I use Helix btw.
I can exit Vim, it just feels like trying to rip out the dashboard and the interiors from a family car because race cars also lack them. Kate is a good speedy alternative to VSCode, not to mention it also does not have Microsoft’s greedy hands on it.
I don’t get your analogy, but (neo)vim is a full featured IDE if you configure it to be one
Out of the box, Vim’s default configuration is very basic, as it’s trying to emulate vi as closely as possible. It’s like if you want things like headlights or a heater or a tachometer in your family car, you’ve got to create a vimrc and turn those features on. That was my experience when I first started using Vim - I spent a lot of time messing around creating a vimrc until I got things the way I wanted.
One of the big changes with Neovim is their default settings are a lot more like what you would expect in a modern text editor.
Yeah that’s a fair way to look at it
:x
it says I don’t have permission
:q!
I wanted to follow up with the other error, where you didn’t open a file, so it doesn’t know where to write, but :q! always works :/
Can I somehow not discard my changes tho? I always open a 2nd terminal in root only for vim when editing system files so I don’t have to re-do the whole config but this time in sudo.
Cumbersome: save to some temporary file I guess.
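Or skip the temp file: the classic trick (assuming a stock vim where % expands to the current file’s path) is to write the buffer through sudo tee:

```vim
" write the current buffer through sudo; tee's stdout is discarded
:w !sudo tee % > /dev/null
```

Vim will then warn that the file changed on disk and offer to reload it; after that, :q works as normal.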
:wq
will save the current buffer and quit.
Oh no, I was never a programmer in the past.
It’s 2025 and I have no idea what the current way to center something is. Then again, my job is that of a backend engineer so it’s rare I’m outputting anything that isn’t a log statement. They can pry tables and center tags from my cold, aging hands.
IMO tables should be more used for… tabular data. Shocking, I know, but the amount of websites that try to emulate a table with divs and uls out there is crazy.
It’s <center>, obviously.
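For anyone genuinely wondering, these days it really is a one-liner in CSS (class names here are made up for the example):

```css
/* grid: center a child both horizontally and vertically */
.grid-parent {
  display: grid;
  place-items: center;
}

/* or the flexbox spelling of the same idea */
.flex-parent {
  display: flex;
  justify-content: center;
  align-items: center;
}
```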
I’ve never asked ChatGPT to fix a syntax error because I use Copilot
If you are going to be this pedantic, I’ll have you know Copilot is a ChatGPT model in a Microsoft skin.
Yeah OK, but back then, an office suite was like 500 LOC.
Love the shoutout to Margaret Hamilton
Depends how far you go back. The top half is pretty representative of the professional dev team I was in in 1992.
So were “computers”. It used to be a job, delegated mostly to women. The job description was doing calculations day in and day out.
The moon landing by hand wouldn’t have been as funny without the over the top body builders first.
No, I don’t think so. It’s true that many of the earliest programmers were female, but there were very few of them, and that was a long time ago.
In a way, Ada Lovelace was the first programmer, but she never even touched a computer. The first programmers who did anything similar to today’s programming were from Grace Hopper’s era in the 1950s.
In the late 1960s there were a lot of women working in computer programming relative to the size of the field, but the field was still tiny, only tens of thousands globally. By the 1970s it was already a majority male profession so the number of women was already down to only about 22.5%.
That means that for 50 years, a time when the number of programmers increased by orders of magnitude, the programmers were mostly male.
Obligatory Grace Hopper
The large initial percentage of female coders was due to “computer” having been a female job, much like secretary was. Their role within companies didn’t change; what changed is that they were using machines to do the computing instead of doing it by hand.
We’re kinda lucky to have the woke trifecta (Ada, Grace, Alan) (first programmer (woman), inventor of compilers (woman), absolute unit (gay)) to keep the chuds at bay. Even if we weren’t all socially inept nerds (or pretending to be so to bosses), there’s only so much you can do, culturally, if the population is growing exponentially. Uncle Bob (yes, I know he’s a chud) did the maths at some point; IIRC it was something like the number of programmers doubling every two years. Which also means that at any one point in time roughly 2/3rds of programmers have no idea what they’re doing, which explains the javascript ecosystem.
You can throw Margaret Hamilton in there, who was in charge of the software team that landed people on the moon. The picture of her standing next to a printout of the Apollo guidance software is iconic.
At first I thought this was the Wicked Witch of the West’s actress and thought she must have been multitalented. Then I looked it up to verify. Nope, same name, different women.
If you want famous actresses who contributed to technology, you want Hedy Lamarr:
At the beginning of World War II, along with George Antheil, Lamarr co-invented a radio guidance system for Allied torpedoes that used spread spectrum and frequency hopping technology to defeat the threat of radio jamming by the Axis powers.
We need to bring back 2010-2012 rage comic memes. All we needed was a badly cut-out blonde wig to trans Derp’s gender.
The glory days of Derp and Derpina
“Creates a whole game in assembly” is probably referring to roller coaster tycoon, which was written by a man. (lots of other games were written in asm, like many NES games, but I’d wager RCT was what they were alluding to)
That was my immediate thought. There were many that came before RCT, but it has the distinction of being (possibly) one of the last in an industry that had already moved on to higher-level languages to do merely half as much.
Wth is “Fixing memory leaks using pointers”?
I once had an intern attempt to install sudo using NPM and when that didn’t work he asked ChatGPT “Why can’t I install sudo from NPM?” while I’m trying to explain it to him.
He was smart, but somehow knew very little about commercial computers despite being on the verge of getting his master’s in computer science.
“Wait why can’t I install windows iso from vscode extension store?”
I still want to get into coding the OG manual way (because I enjoy pain and disappointment apparently) but now it seems like a waste of time since vibe coders and 13 year olds already are lightyears ahead of me. Also I have no reason to learn it, all apps are already built xD
all apps are already built
Couldn’t be further from the truth. You also have to consider competition.
True. But it seems like it’s way too late to start learning any programming now. Plus it’s saturated. I’ll learn it on the side for fun but it’ll probably never be a viable job for me
Can’t think of anything that could serve a major need right now, but I absolutely identified things in my life where I could use a preexisting tool to accomplish my goal, but it’s much less hassle for me to use the one I made for myself. You don’t have to transform the world, sometimes you can help yourself with a minor inconvenience and then put it out there for anyone who might find themselves with the same inconvenience.
I’m in the same boat. I used to be an amateur front and back end web developer. Almost made a text based RPG in middle school. I had to stop when shit got crazy in high school and college, but I don’t feel like any programming is worth my time right now. I’m focusing on gardening and maybe some cooking. You know, human activities that we can still enjoy.
Yeahh exactly. AI has pretty much ruined computer based fun now. Which in some ways is good, we should all learn physical hobbies again and not be reliant on tech. I still enjoy my hobby desktop computers though, I just enjoy learning how it really works under the surface.
Nah, it’s not that bad.
In 10 years with continued AI use? Yep.
I’m thankful for AI. It guarantees my job as a developer will continue to exist to repair all future AI-damage.
Hey now. Searching Stack Overflow circa 2011 to 2018 was an art. You had to know enough to find the correct question that wasn’t deleted because a mod thought it was a duplicate of another question.
Also to find the actual correct answer three comments down because the one that was voted highest worked, but was actually a really shit way to do the thing being asked
I often found the correct answers in the comments of an answer
Still do.
Before that you had to hang out on flipside or other gamedev sites and show your worthiness before begging for information.
I was so proud when they shared the DS hack (basically a homebrew SDK made by trial and error by some people) so that I could make small games on it.
After a while you got to know which Stack Overflow questions were a waste of time, and you used that knowledge for years.
Honestly, CSS is a fucking joke and it’s solely to blame for why centering something isn’t always straightforward.
By the way, this picture is a crock of shit for people who aren’t programmers. Anyone who is a programmer will not take it seriously because programming is so much more about helping others instead of shaming them.
CSS is amazing, if you know how to use it 😉
Everybody’s complaining about CSS like “but it doesn’t do what I want it to do without me investing a minute into why”.
Ironically, it’s oh so often the RTFM crowd.
Stackoverflow: exists solely from the urge of developers to help developers, and since ExpertsExchange was paid dogshit.
This meme: pisses on its whole purpose.
Stackoverflow is for senior devs to clown on junior devs. It’s the inverse of helping juniors.
The missing middle section was documentation and QA getting worse
Well yea, when you train the entire 2nd generation of coders on a book that is “For dummies” what did you expect?
Don’t forget the third gen’s JavaScript: The Good Parts
I started with C++ and went to Java to .NET to Javascript and now to Terraform.
I know this is all a joke but there’s something definitely different with the ones above and the ones below. There’s a bit of satisfaction you can get sometimes when you’re working with memory directly and getting faster feedback (yes, there’s more math back then and it wasn’t easy to look stuff up, for sure). However, there’s new challenges nowadays … there’s so many layers on top of layers. I feel as though Stack Overflow and ChatGPT are so needed because the error messages and things we give are obfuscated or unclear (not always any library author’s fault as there’s compatibility issues, etc)
We’re doing serverless stuff at my current company and none of our devs run code locally. They have to upload it using CDK or Serverless Framework to run on the cloud. We don’t use SST, so we can’t set breakpoints, but that’s a lot of crap in between just running your code already. Not even getting into the libraries and transpilers and stuff we use. I spent like a few weeks over Christmas getting our devs able to run the code locally. Guess what? None of them use it because they’re so used to uploading it. I was like, “you can put breakpoints in it! you can have nodemon and it instant reloads!” Nope, none of them care…
First learning is last learning.
Same reason we still do console.log("FUCK").
First learning is last learning.
I’ll be the dumb one to ask: what do you mean? Is this that making a mistake that costs a lot is the best teacher, because you only have to mess it up once to learn it forever?
Pretty sure they mean people don’t learn something again when they already learned it. Once you learn how to do something, willingness to learn it again but a different way dries up, and so you stick to bad habits as long as they ‘work’
It’s a mantra about teaching people and then expecting them to forget it. Doesn’t work. They’ll default to what they already know.
My freshman English teacher got married in October and I called her by her maiden name the entire year.
Like all programming mantras, it’s not universally true, but it’s annoyingly reliable. It reflects the shape of the human brain.