Sorry if this is a dumb question, but does anyone else feel like technology - specifically consumer tech - kinda peaked over a decade ago? I’m 37, and I remember being awed between like 2011 and 2014 with phones, voice assistants, smart home devices, and what websites were capable of. Now it seems like much of this stuff either hasn’t improved all that much, or is straight up worse than it used to be. Am I crazy? Have I just been out of the market for this stuff for too long?

  • @leadore@lemmy.world · 16 points · 30 days ago

    Yes, you are correct, it’s worse now. At first it was creative, innovative products that made things more convenient or fun, or at least didn’t harm their users. Now all the new things are made by immature, egotistical billionaire techbros: generative AI, which has ruined the internet by polluting it with so much shit you can’t get real information any more, not to mention using up our power and water resources; the enshittification of Web 2.0; Web 3.0, which was pure shit from the get-go; IoT “smart” appliances like TVs, doorbells, thermostats, and refrigerators that spy on you and your neighbors; shit “self-driving” killer cars that shouldn’t be allowed on the roads; whatever the hell that new VR Metaverse shit is; ads, ads, ads, ads; and on and on. It’s a tech dystopia.

  • @rumba@lemmy.zip · 6 points · 29 days ago

    Facebook’s AR glasses prototype is fire. It’s too expensive to release to the public, but in a few years…

    Tech in general isn’t accelerating as fast. Drives aren’t twice as big every year at half the price. Processors aren’t twice as fast. 2024 stuff is still better than 2021 stuff, but it’s not twice as good. A few things have taken a couple of steps backward as they try to wrangle AI data capture into our lives. Up until recently, we’ve been able to scale things down (the same thing, only smaller and faster), but we’re hitting the limit of that, which is why people are latching on to ML to distract us from the fact that our gaming systems from 6 years ago are still fine.

  • @EndlessNightmare@reddthat.com · 15 points · 29 days ago

    I’ve been saying this for a while, and have estimated a similar 10-year time frame.

    Most new tech (except for medical advancements) doesn’t really benefit the average person. Instead, it just gives corporations and governments more data, more control, and the ability to squeeze more money out of us. They don’t represent actual improvements to society as a whole or to individual users.

  • @givesomefucks@lemmy.world · 5 points · 1 month ago

    You grew up in a time of huge technological innovation, so you see anything else as unusual.

    Boomers grew up in stagnation, and expect tech to keep progressing at the same rate.

    Both are 100% normal ways for our brains to expect shit to go, but neither fit modern society.

  • @NOT_RICK@lemmy.world · 128 points · 1 month ago

    I think new tech is still great; the issue is that the business around that tech has gotten worse in the past decade.

    • @Redredme@lemmy.world · 2 points · 1 month ago

      The question OP is posing is:

      Which new tech?

      In the decade OP’s talking about, everything was new. In the last ten years nothing is new; it’s all just rehash and refinement.

      ML, AI, VR, AR, cloud, saas, self driving cars (hahahaha) everything “new” is over a decade old.

      • @heraplem@leminal.space · 1 point · 29 days ago

        AI is not technically new, but generative AI was not a mature technology in 2014. It has come a long way since then.

    • @neidu3@sh.itjust.works · 65 points · edited · 1 month ago

      Agree. 15+ years ago, tech was developed for the tech itself, and it was simply run as a service, usually for profit.

      Now there’s too much corporate pressure to monetize every single aspect, so the tech ends up bogged down with privacy violations, cookie banners, AI training, and pretty much anything else that gives the owner one extra annual cent per user.

      • AnyOldName3 · 5 points · 1 month ago

        Lots of the privacy violations already existed, but then the EU legislated first that they had to have a banner vaguely alluding to the fact that they were doing that kind of thing, and later, with GDPR, that they had to give you the option to easily opt-out.

        • @neidu3@sh.itjust.works · 28 points · 1 month ago

          Enshittification was always a thing, but it has gotten exponentially worse over the past decade. Tech used to be run by tech enthusiasts, but now venture capital calls the shots a lot more than it used to.

      • AwkwardLookMonkeyPuppet · 10 points · 1 month ago

        What’s crazy is that they were already making unbelievable amounts of money, but apparently that wasn’t enough for them. They’d watch the world burn if it meant they could earn a few extra pennies per flame.

      • @Philosofuel@futurology.today · 6 points · 1 month ago

        You know, this happened with cars also: until there is a new disruption by a new player or technology, companies just coast on their cash cows. Part of the market cycle, I guess.

      • @Dagwood222@lemm.ee · 6 points · 1 month ago

        [off topic?]

        Frank Zappa said something like this: in the 1960s, a bunch of music execs who liked Frank Sinatra and Louis Armstrong had to deal with the new wave coming in. They decided to throw money at every band they could find, and as a result we got music ranging from The Mamas and the Papas to Iron Butterfly and beyond.

        By the 1970s the next wave of record execs had realized that Motown acts all looked and sounded the same, but they made a lot of money. One Motown was fantastic, but dozens of them meant that everything was going to start looking and sounding the same.

        Similar thing with the movies. Lots of wild experimental movies like Easy Rider and The Conversation got made in the 1970s, but when Star Wars came in the studios found their goldmine.

        • HobbitFoot · 2 points · 30 days ago

          But even then, there were several gold mines found in the 1990s, funded in part by the dual revenue streams of theatrical releases and VHS/DVD.

          You’ve got studios today like A24 going with the scattershot way of making movies, but a lot of the larger studios have gotten very risk averse.

          • @Dagwood222@lemm.ee · 3 points · 30 days ago

            Just saw Matt Damon doing the hot wings challenge. He made a point about DVDs. He’s been producing his own stuff for decades. Back in the 1990s the DVD release meant that you’d get a second payday and the possibility of a movie finding an audience after the theatrical run. Today it’s make-or-break the first weekend at the box office.

    • JustEnoughDucks · 1 point · 1 month ago

      Well it is literally not going as fast.

      The rate of “technology” (most people mean electronics) advancement was so high because a ton of really big innovations arrived in a short time: cheap PCBs, video games, the internet, practical fiber optics, wireless tech, bio-sensing, etc.

      Now most of the breakthrough inventions in electronics have been made (except chemical sensing that doesn’t need refillable buffers or reactive materials), Moore’s law is no longer relevant, and what’s left is a ton of very, very small incremental updates.

      Electronics advancements have largely stagnated. MCUs from 10 years ago are still viable today, which was absolutely not true of 10-year-old parts a decade ago, to give one example. Pretty much everything involving silicon is this way. Even quantum computing and supercooled computing advancements have slowed way down.

      The open source software and hardware space has made giant leaps in the past 5 years as people now are trying to get out from the thumb of corporate profits. Smart homes have come a very long way in the last 5 years, but that is very niche. Sodium ion batteries also got released which will have a massive, mostly invisible effect in the next decade.

      • @Valmond@lemmy.world · -1 point · 1 month ago

        Electronics advancement, if you’re talking about CPUs and such, hasn’t stagnated; it’s just that you don’t need to upgrade any more.

        I have a daily driver with 4 cores and 24GB of RAM and that’s more than enough for me. For example.

        • JustEnoughDucks · 2 points · edited · 30 days ago

          It has absolutely stagnated. Transistors used to become literally twice as dense every 2 years. Clock speeds were doubling every few years.

          In 2000 the first 1 GHz, single-core CPU was released (that was AMD, by the way, not Nvidia).

          In 2010 the Intel Core series came out; the i7 had 4 cores clocked up to 3.33 GHz. Now, 14 years later, we sometimes get 5 GHz (not even double) and just shove more cores in there.

          What you said, “it’s just that you don’t need to upgrade any more,” is quite literally stagnation. If growth had stayed on its 1990s path, every 3-5 years your computer would become so obsolete that you couldn’t functionally run newer programs on it. Now computers can be completely functional and useful 8-10+ years later.
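          To put numbers on that: if clock speeds had kept doubling the way they did in the 1990s, extrapolating from 2010’s 3.33 GHz gives absurd figures by 2024. A rough Python sketch (the doubling periods are illustrative assumptions, not measured data):

```python
# Extrapolate 2010's ~3.33 GHz i7 forward under hypothetical
# 1990s-style doubling. The doubling periods are assumptions
# chosen to illustrate the trend, not historical measurements.
base_ghz = 3.33
years = 2024 - 2010

for doubling_period in (2, 3):
    projected = base_ghz * 2 ** (years / doubling_period)
    print(f"doubling every {doubling_period} years -> ~{projected:.0f} GHz by 2024")

# Actual 2024 boost clocks hover around 5-6 GHz: less than one
# doubling in 14 years.
```

          Even the slower assumption lands around 85 GHz, an order of magnitude above today’s 5-6 GHz boost clocks.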

          However, stagnation isn’t bad at all. It allows open-source and community projects with fewer resources to catch up, and it prevents a shit ton of e-waste. The whole capitalistic growth-at-any-cost model is never sustainable. I think computers now, while less exciting, have become much more versatile tools because of stagnation.

          • @Valmond@lemmy.world · 0 points · 30 days ago

            “Moore’s law is dead” is so lame, and wrong too.

            Check out SSDs, 3D memory, GPUs…

            If you don’t need to upgrade, that doesn’t mean things aren’t getting better (they are), just that you don’t need it or don’t feel it making useful progress for your use case. Concluding from that that things aren’t advancing is quite the egocentric worldview, IMO.

            Others need the upgrades, like the crazy need for processing power in AI or weather forecasts or cancer research etc.

            • JustEnoughDucks · 2 points · edited · 29 days ago

              GPU advances have also gone way way down.

              For many years, YoY GPU increases came from node shrinks and brought (if we simplify it to the top-tier card) huge performance gains with only slightly more power usage. The last 4-5 generations have seen the exact opposite: huge power increases closely scaling with the performance increases. That is literally stagnation. They are also reaching the limit of node shrinkage with current silicon technology, which is leading to larger dies and way more heat just to get close to the same generational performance gain.

              Luckily they found other uses for GPU acceleration. Just because there is an increase in demand for a new use case does not, in any way, mean that the development of the device itself is still moving at the same pace.

              That’s like saying that a leg of a chair is reaching new heights of technological advancement because they used the same chair leg to be the leg of a table also.

              It’s a similar story with memory. They are literally just packing more dies onto a PCB or layering PCBs, outside of a couple of niche, ultra-expensive processes made for data centers.

              My original comment was also correct. There is a reason why >10 year old MCUs are still used in embedded devices today. That doesn’t mean that it can’t still be exciting finding new novel uses for the same technology.

              Again, stagnation ≠ bad

              The area where electronics technology really has progressed quite a bit is signal integrity and EMC. The things we know and can measure now are pretty crazy, and they enable the ultra-high frequencies and data rates that come out in the new standards.

              This is not about pro-gamer upgrades. This is about the electronics (silicon-based) industry as a whole (I am an electronics engineer).

  • @iii@mander.xyz · 24 points · 1 month ago

    Your BS radar has simply improved, I’m guessing. Go through a few hype cycles, and you learn the pattern.

    Hardware is better than ever. The default path in software is spammier and more extortionist than ever.

  • JackbyDev · 18 points · 1 month ago

    What? No. lol. Tech is still improving. You’re just thinking of the bad new stuff and the good old stuff. Nostalgia is a hell of a drug. Phones’ batteries and resolutions are much better than they were in 2014. Voice assistants never really took off. Smart home stuff is maaaaybe a little better now, but there are also a shit ton more brands now and most are crap. But that also means it’s cheaper and more widespread.

    • @wolfpack86@lemmy.world · 1 point · 29 days ago

      I think what OP meant is that there’s no new creation of types of devices.

      My new phone is objectively and subjectively better than the previous two I’ve owned over the last 6 years, but it’s doing the same tasks.

    • @Nibodhika@lemmy.world · 2 points · 30 days ago

      The thing is that 10 years ago the phone I had was very similar to the one I have now, the laptop I had was very similar to the one I have now, and until very recently I still had parts from my desktop from back then installed in my current desktop. I also visited lots of the same sites I do now and played some of the same games. But if you go back another 10 years it’s very different. In 2004 I didn’t have a cellphone; by 2014 I had a Google Nexus; now I have a Google Pixel. In 2004 I didn’t have a laptop; in 2014 I had a dual-core laptop with 8GB of RAM and 512GB of storage; now I have a 6-core one with 32GB and 1TB. In 2004 my desktop had 256MB of RAM, a 10GB drive, and a single-core 1.6GHz processor; in 2014 it had 16GB of RAM, 1TB, and 6 cores; now it has 32GB of RAM, 3TB, and 6 cores.

      Obviously my computer now is much better than the one from 10 years ago, but not by the same amount that the one from 2014 was better than the one from 2004. To try to put it in perspective, I would need around 1TB of RAM for it to be the same leap in RAM amount.
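      A quick check of that leap arithmetic, as a Python sketch using the comment’s own figures (256 MB in 2004, 16 GB in 2014, 32 GB now):

```python
# RAM growth by the comment's own figures. "Same leap" means
# repeating the 2004->2014 growth factor from the 2014 baseline.
ram_2004_gb = 0.25   # 256 MB
ram_2014_gb = 16.0
ram_now_gb = 32.0

factor = ram_2014_gb / ram_2004_gb      # growth over the first decade
same_leap_gb = ram_2014_gb * factor     # what today "should" have

print(f"2004->2014: {factor:.0f}x growth")
print(f"matching that leap would need {same_leap_gb:.0f} GB today")
print(f"actual 2014->now: {ram_now_gb / ram_2014_gb:.0f}x")
```

      That works out to 64× growth in the first decade versus 2× in the second, so the matching target is about 1 TB of RAM.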

    • @ApatheticCactus@lemmy.world · 1 point · 30 days ago

      I just got a new phone, and the AI voice assistant is actually good. It’s what people imagined voice assistants were going to be when they first came out. It doesn’t have access to a lot of things yet, so it can’t ‘act’ on things, but it actually gives consistently relevant info.

      One thing I’ve used it for recently: I was in a game and knew there was a secret chest, and it could accurately tell me what to do to get it. Way better than looking up a video.

  • @Shardikprime@lemmy.world · -1 point · edited · 29 days ago

    Hell no. Fuck that shit

    We had like 500 form factors for phones, now it’s standardized

    Resistive touch screens? Ewww

    Like a billion MP3/MP4/iPod clones? Just to listen to music? A thing we can now do easily on our phones?

    Slow ass ssd/nand memory chips?

    Freaking single-core processors in phones, PCs, and laptops?

    Seriously, does anyone miss their devices behaving like slowpokes?

    Wireless audio devices that worked like shit unless they were extremely high-end? Oh yeah, wired worked great, but we were flooded with a ton of clones of those too. So no great quality from those “Skeleton Sweet” or “earpods” knockoffs either.

    Batteries that were in dire need of charge at least thrice a day?

    Wireless routers that, with any luck, had enough gain to let you step out of the room?

    Car wise, no stability control? You seriously fucking with stability control? That shit avoids like 25% of all car accidents globally

    Medical-wise, CRISPR? Gene therapy for muscular dystrophy? Vaccines that can be whipped up in months?

    Innovation slowing down? What planet do you fuckers live on?

    I’d say more but I think you get my point

    • @phlegmy@sh.itjust.works · 1 point · 28 days ago

      10 years ago was 2014, not 2004.

      The Samsung Galaxy S5 was released at the start of 2014 with a capacitive 1080p AMOLED touchscreen, a quad-core Snapdragon 801 processor, 21 hours of ‘talk time’ battery, wireless charging, a fingerprint sensor, NFC, dual-band 802.11ac wifi support, and eMMC 5.0 storage (250 MB/s sequential read).

      New cars were mandated to have ESC in the US and the EU by 2014.

      There have definitely been many innovations since 2014, but most consumer technology upgrades have been iterative rather than innovative.

    • @Rednax@lemmy.world · 4 points · 29 days ago

      I don’t understand the downvotes. The tech may be less revolutionary from the perspective of a user, but we have absolutely made a lot of progress.

  • @antlion@lemmy.dbzer0.com · 9 points · 1 month ago

    TV resolution peaked about 10 years ago with 1080p. The improvement to 4K and high dynamic range is minor.

    3D gaming has plateaued as well. While it may be possible to make better graphics, those graphics don’t make better games.

    Computers haven’t improved substantially in that time. The biggest improvement is maybe usb-c?

    Solar energy and battery storage have drastically changed in the last 10 years. We are at the infancy of off grid building, micro grid communities, and more. Starlink is pretty life changing for rural dwellers. Hopefully combined with the van life movement there will be more interesting ways to live in the future, besides cities, suburbs, or rural. Covid telework normalization was a big and sudden shift, with lasting impacts.

    Maybe the next 10 years will bring cellular data by satellite, and drone deliveries?

    • @HeavyRaptor@lemmy.zip · -3 points · 1 month ago

      Sorry to make you feel old, but 10 years ago 4K was already mainstream, and you would have had difficulty finding a good new 1080p TV. That is roughly when proper HDR started being introduced on the very high-end models.

      Also, maybe you’ve only experienced bad versions of these technologies, because they can be very impressive. HDR especially is plastered on everything but is kinda pointless without hardware that supports proper local dimming, which is still relegated to high-end TVs even today. 4K can feel very noticeable depending on how far you sit from the TV, how large the screen is, and how good one’s eyesight is. But yeah, smaller TVs don’t benefit much. I only ended up noticing the difference after moving and having a different living room setup, sitting much closer to the TV.

      • data1701d (He/Him) · 5 points · 1 month ago

        I wouldn’t call 4K mainstream in 2014 - I feel like it was still high end.

        I didn’t have a 4K TV until early 2019 or so when unfortunately, the 1080p Samsung one got damaged during a move. Quite sad - it had very good color despite not having the newest tech, and we’d gotten it second-hand for free. Best of all, it was still a “dumb” TV.

        Of course, my definition of mainstream is warped, as we were a bit behind the times - the living room had a CRT until 2012, and I’m almost positive all of the bedroom ones were still CRTs in 2014.

      • @antlion@lemmy.dbzer0.com · 1 point · 30 days ago

        My lower back makes me feel old, not TV resolutions. My TV is a 2020 LG OLED, 55”. I do notice the difference, but I just don’t think it’s a big deal, because 1080p is sharp enough. I wear glasses when I watch TV, correcting to 20/15. Another way of saying it is that my old eyes don’t care for big screens. The experience is the same.

    • @chrizzowski@lemmy.ca · 4 points · 1 month ago

      Strong disagree about the 4K thing. I finally upgraded my aging 13-year-old panels for a fancy new Asus 4K 27", and yeah, it’s dramatically better, especially for architectural or photographic work. For smaller screens you’ve got a point, though. 4K on a 5" phone seems excessive.

      • @Valmond@lemmy.world · 1 point · 1 month ago

        Couldn’t that be just overall quality?

        Source: I’ve bought a lot of full-HD screens, and some were just so bad. I only go with iiyama today.

      • @antlion@lemmy.dbzer0.com · 3 points · 1 month ago

        I mean for television or movies. From across the room 4k is only slightly sharper than 1080p. Up close on a monitor is a different story.

          • @antlion@lemmy.dbzer0.com · 1 point · 30 days ago

            Yeah, my wall limits the size of my TV to 55”, but I also have a fairly short viewing distance of 8 ft. That puts me in the 1080p range. The details of 4K show up better if I sit closer, but I still wouldn’t characterize it as a dramatically different viewing experience. I watch nature documentaries in 4K, but for close-ups of faces 1080p is enough for me. I really don’t need to see every pore. And for action/CG I feel higher resolution, like higher frame rate or interpolation, seems to cheapen the effects. I like my movies choppy and blurry, the way they were meant to be.
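            The “1080p range at 8 ft on a 55-inch set” claim can be checked against the common one-arcminute visual-acuity rule of thumb. A Python sketch (the 16:9 geometry and the acuity figure are textbook assumptions, not from this thread):

```python
import math

# Is a 1080p pixel on a 55" 16:9 TV resolvable from 8 ft,
# assuming roughly 1 arcminute of visual acuity?
diag_in = 55.0
distance_in = 8 * 12          # 8 ft viewing distance, in inches

width_in = diag_in * 16 / math.hypot(16, 9)   # panel width, ~47.9"
pixel_in = width_in / 1920                     # 1080p pixel pitch

# Width subtended by one arcminute at the viewing distance:
arcmin_in = distance_in * math.tan(math.radians(1 / 60))

print(f"pixel pitch:   {pixel_in:.4f} in")
print(f"one arcminute: {arcmin_in:.4f} in")
print("1080p pixels already below acuity:", pixel_in < arcmin_in)
```

            Under these assumptions the 1080p pixel pitch (about 0.025") is already just under one arcminute (about 0.028") at that distance, which is consistent with the comment: 4K adds detail the eye can barely resolve from 8 ft.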

    • JackbyDev · 3 points · 1 month ago

      3D gaming has plateaued as well. While it may be possible to make better graphics, those graphics don’t make better games.

      I haven’t played it, only seen clips, but have you seen Astro Bot? It’s true that the graphics aren’t really much better than the PS4’s, but there are like a jillion physics objects on screen at 60fps. It’s amazing. Graphics are still improving, just in different ways.

      • @antlion@lemmy.dbzer0.com · 1 point · 30 days ago

        Astro Bot looks pretty cool, but I think the same gameplay experience was totally possible 10 years ago, albeit with fewer pretty reflections and lower polygon counts.

        I think the next breakthrough in gaming and/or VR will be when somebody figures out how to generate Gaussian splatting environments. It’s fundamentally different from the polygon approach and it feels so much more photo realistic.

  • @utopiah@lemmy.world · 4 points · 29 days ago

    I work in VR and AR. I traveled to a conference this week to showcase demos of my work.

    I have in my backpack a headset that costs a few hundred bucks and can spawn, in front of your eyes, 3D models you can directly manipulate with your hands or a pen.

    It just works.

    I even use it offline while flying.

    This didn’t exist 10 years ago. It’s amazing.

    • @Rednax@lemmy.world · 3 points · 29 days ago

      It always amazes me how many professional uses VR/AR has, and what kind of stuff has been created for it by all sorts of industries. Some see it as a failure because the consumer variants have not seen revolutionary improvements over the past few years, but the industry around it is growing quickly. So many companies use it that the technology doesn’t need games to survive.

      • @utopiah@lemmy.world · 2 points · edited · 23 days ago

        the consumer variants have not seen revolutionairy improvements over the past years

        They probably haven’t tried a Quest 3 (the overall trade-off) or a Vision Pro (resolution and eye tracking; arguably not for consumers based on the price, though compared to a gaming PC plus VR kit a few years ago I’d say it’s comparable). Even though IMHO the biggest recent revolution has been going from 3DoF to 6DoF, just the incremental improvements (resolution, inside-out tracking, hand tracking, BT support for a ton of peripherals, etc.) already provide an experience different enough that people who had doubts a few years ago, say on a Valve Index, are reconsidering “just” based on form factor and thus convenience.

  • @Platypus@lemmings.world · 1 point · 1 month ago

    Not really. One of my favourite games of all time came out recently, and it couldn’t exist without more current tech. Plus, I like modern phones more and more.

  • @futatorius@lemm.ee · 15 points · 29 days ago

    That was when innovation slowed down and rent-seeking increased, once the big players started exploiting their oligopolies in earnest.