Honestly I feel even an AI could write better code than what some big tech software uses lol
Big words from someone who can’t even write “than” properly.
?
The Paramount+ app doesn’t even know how to properly hide the pause icon after you hit resume, ffs. It’s been months.
For like, a couple years, sure. Then there will be a huge push to fix all the weird shit generated by AI.
Let me weigh in with something. The hard part about programming is not the code. It’s understanding all the edge cases, designing flexible solutions, and so much more.
I have seen plenty of organizations with dozens of really capable programmers who can implement anything. Meanwhile, most management barely knows what they want or what the actual end goal is. Since managers aren’t capable of delivering perfect products every time even with really skilled programmers, if I subtract programmers from the equation and substitute in a magic box that delivers code to managers whenever they ask for it, the managers won’t do much better. The biggest problem is not knowing what to ask for, and even if you DO know what to ask for, managers typically ignore all the fine details.
By the time there is an AI intelligent enough to coordinate a large technical operation, AIs will be capable of replacing attorneys, congressmen, patent examiners, middle managers, etc. It would really take a GENERAL artificial intelligence to be feasible here, and you’d be wildly optimistic to say we are anywhere close to having one of those available on the open market.
I agree with you completely, but he did say no need for ‘human programmers’, not ‘human software engineers’. The skill set you’re describing is, I’d argue, one of the biggest differences between the two, if not the biggest.
This is really splitting hairs, but if you asked that cloud CEO if he employed programmers or ‘software engineers’ he would almost certainly say the latter. The larger the company, the greater the chance they have what they consider an ‘engineering’ department. I would guess he employs 0 “programmers” or ‘engineeringless programmers’.
Anyone in software engineering will tell you that as you get more senior you spend less time writing lines of code and more time planning, designing, testing, reviewing, and deleting code.
This will continue to be true; it’s just that there will be fewer juniors below whose whole job is to produce code that meets a predefined spec or passes an existing test. Instead, a smaller number of juniors will use AI tools to increase their productivity, while still requiring the same amount of direction and oversight. The small amounts of code the seniors write will also get smaller and faster to write, as they too use AI tools to generate boilerplate while filling in the important details.
How much longer until cloud CEOs are a thing of the past? Wouldn’t an AI sufficiently intelligent to solve technical problems at scale also be able to run a large corporate division? By the time this is actually viable, we are all fucked.
deleted by creator
AI is terrible at solving real problems through programming. As soon as the problem is not technical in nature and needs a decision to be made based on experience, it falls flat on its face.
It will never understand context and business rules and things of that nature to the same extent that actual devs do.
Lol, as a programmer who uses generative AI myself, I would genuinely love to see them try.
I’d believe AI will replace human programmers when I can tell it to produce the code for an entire video game in a single prompt that stands up to the likes of New Vegas, has zero bugs, and offers hundreds of hours of content on a first playthrough thanks to vast exploration.
In other words, I doubt we’ll see human programmers going anywhere any time soon.
Edit:
Reading other replies made me remember how I once, for fun, tried using a jailbroken Copilot program to do Python stuff slightly above my already basic coding skill, and it gave me code that tried importing something that absolutely doesn’t exist. I don’t remember what it was called since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.
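For what it’s worth, there’s a quick sanity check for that exact failure mode using Python’s standard `importlib`. The second module name below is a made-up stand-in for whatever Copilot hallucinated, not a real package:

```python
import importlib.util

def module_available(name: str) -> bool:
    """Check whether a module can actually be resolved,
    before running code that imports it."""
    return importlib.util.find_spec(name) is not None

# "json" ships with Python; "totally_real_ai_lib" is a made-up name
# standing in for the kind of import an LLM can hallucinate.
print(module_available("json"))                 # True
print(module_available("totally_real_ai_lib"))  # False
```

Doesn’t make the generated code good, but at least you find out the dependency is fictional before you go hunting for it on PyPI.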
Could you imagine Microsoft replacing Windows engineers with a ChatGPT prompt? What would that prompt even look like?
To be honest, this could be an example of where AI could do marginally better. I don’t mean that because of code quality or functionality. I mean it in the sense of MS software getting absolutely fucked by internal competition and stack-ranking fostered during the Ballmer years. The APIs are inconsistent and there is a ton of partially implemented stuff that will never be pushed to completion because everyone who worked on it was fired.
An AI might be able to implement things without intentionally sabotaging itself, but since LLMs are at the forefront of what would be used and have no capability for intention or understanding of context, I’m a bit pessimistic.
No matter the human expense, the green line must go up.
The best Copilot can do is autofill lines that everyone’s written a million times. That’s not nothing, but it ain’t replacing a human brain any time soon.
Honestly, GPT has strengthened my coding skills… for the simple reason that the handful of times I’ve asked it to do something the response I get back is so outlandish that I go “That CAN’T be right” and figure out how to do it myself…
Research with extra steps… I get it, but still…
I feel like it’s whispering bad advice at me while I’m typing. It’s good at auto-completing the most rudimentary stuff, but I have a hard time imagining it completing even one file without injecting dangerous bugs, let alone a large refactor.
I taught myself Python in part by using ChatGPT. Which is to say, I coaxed it through the process of building my first app, while studying from various resources, and using the process of correcting its many mistakes as a way of guiding my studies. And I was only able to do this because I already had a decent grasp of many of the basics of coding. It was honestly an interesting learning approach; looking at bad code and figuring out why it’s bad really helps you to get those little “Aha” moments that make programming fun. But at the end of the day it only serves as a learning tool because it’s an engine for generating incompetent results.
ChatGPT, as a tool for creating software, absolutely sucks. It produces garbage code, and when it fails to produce something usable you need a strong understanding of what it’s doing to figure out where it went wrong. An experienced Python dev could have built in a day what took me and ChatGPT a couple of weeks. My excuse is that I was learning Python from scratch, and had never used an object oriented language before. It has no excuse.
ChatGPT only gives good answers if you ask the right questions, and to do that you have to be better than a novice. It’s great as a rubber ducky that answers back, but its usefulness is a reflection of the user.
Spoken like someone who manages programmers instead of working as one.
deleted by creator
Until you ask it to do something never done before and it has a meltdown.
But you have to describe what it is. If only we had universal languages to do that… Oh yeah, it’s code.
It’s really funny how AI “will perform X job in the near future,” but you barely, if ever, see articles saying that AI will replace CEOs in the near future.
Here’s one. And their profits went up when they replaced their CEO: https://www.forbes.com/sites/sherzododilov/2024/01/11/can-ai-become-your-next-ceo/
C-suites are like Russian elites.
The latter are thieves who inherited a state from the Soviet leadership. They have a layman’s idea of what a state and a country are, what history itself is, plus whatever a taxi driver might say. For the last 20 years they’ve been trying to apply that weird idea to reality, as if playing Hearts of Iron, because they want to be great and to be in charge of everything that happens.
The former heard in school that there were industrial revolutions and such, and they too want to be great, so they fall for every stupid hype cycle about someone being replaced by great new technology, and of course they want to be in charge of that process.
While in actuality, with today’s P2P technologies, CEOs are the ones most likely to be replaced, if we use our common sense. And without “AI”, of course: just by decentralized systems allowing much bigger, more powerful, and more competitive cooperatives than before, ones that form and disband very easily.
Somewhere there is a dev team secretly programming an AI to take over bureaucratic and managerial jobs while disguising it as a code-writing AI to their CTO and CEO.
I wish.