It’s kind of silly, but I still really dig the idea behind torrenting and peer to peer sharing of data. It’s cool to think about any old computer helping pass along some odd bits & bytes of data, whether a goofy drawing or strange story.

  • @QuarterSwede@lemmy.world
    10 points · 1 year ago

    Any large file is going to come through much quicker over BitTorrent as long as there are enough seeders: OS distros, patches, P2P releases, 4K anything, etc.

  • @Dave@lemmy.nz
    4 points · 1 year ago

    If you just mean peer to peer, I feel like magnet links (often using BitTorrent) are still found for downloading large files from time to time (not just ISOs): things like open source games and software, though if I’m being honest I can’t think of a single one that still uses them. You used to find magnet links all over the open source scene, but I guess with GitHub offering free hosting it’s not so common anymore.

  • Max-P
    64 points · 1 year ago

    I think a good chunk of the Internet Archive is available as torrents, at least the software collections and public domain media.

    You can also download a torrent of the whole of Wikipedia, with and without images.

      • Lunch
        12 points · edited · 1 year ago

        Not a direct answer to your question, but this is where I download my stuff from, and it also shows size.

        https://library.kiwix.org/#lang=eng

        Edit: Wikipedia is available there; the full thing is 109.89 GB. I wonder how up to date it is.

      • Barry Zuckerkorn
        8 points · 1 year ago

        As of last year, English Wikipedia, articles only, text only, was about 22GB compressed (text compresses pretty efficiently), according to the current version of this page:

        As of 2 July 2023, the size of the current version of all articles compressed is about 22.14 GB without media

        Some other sources describe the uncompressed offline copies as being around 50 GB, with another 100 GB or so for images.

        Wikimedia, which includes all the media types, has about 430 TB of media stored.
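The ratio implied by the figures quoted above is easy to sanity-check. A quick back-of-envelope calculation (both GB values are rough, as the comments themselves say):

```python
# Back-of-envelope check of the Wikipedia size figures quoted above.
# Both numbers are approximate; this just shows the implied ratio.
compressed_gb = 22.14    # articles only, text only, compressed
uncompressed_gb = 50     # rough figure quoted for offline copies

ratio = uncompressed_gb / compressed_gb
print(f"~{ratio:.1f}x compression")  # ~2.3x compression
```

A roughly 2.3x ratio is plausible for natural-language text, which compresses well, as the comment notes.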

  • @HarriPotero@lemmy.world
    21 points · 1 year ago

    PeerTube uses WebTorrent to offload the hosting of huge files.

    Odysee uses something similar to do the same. (At least they claim to, but last time I dug into it, it seemed to be hosted “regularly”.)

    Spotify famously had its own P2P thing going in its desktop apps in the early days. Saved them a pretty penny back when hosting was expensive.

    Coming to a browser near you is IPFS.

  • @zerakith@lemmy.ml
    4 points · 1 year ago

    It’s a really interesting question. I wonder what underlying economics and ideologies are at play in its decline. Economies of scale for large server farms? Desire for control of the content/copyright? The structure and shape of the network?

    I guess it has some implications for stream versus download approaches to content?

    • @ShittyBeatlesFCPres@lemmy.world
      English · 2 points · 1 year ago

      If I recall, Spotify moved away from it just because the client/server model got way cheaper and the P2P model had some limitations for their future business plans. I remember them mentioning that offering a family plan was a challenge with their P2P architecture when people on the same network/account were using it at the same time.

      It was probably also part of the move to smartphones. Spotify was just a desktop program for a long time and, while I’m not an expert, I would guess the P2P model made a lot more sense on desktop with a good connection than early smartphones on flaky 2G/3G connections. They might have had to run a client/server model for iOS and/or Android anyway.

      • @zerakith@lemmy.ml
        1 point · 1 year ago

        Very interesting, thank you. I guess then the centralised server must have some sort of economy of scale.

        In my head, I’m comparing the network to the electricity grid with certain shapes of network making different technologies more or less feasible. I would guess the internet network is probably similar to the electricity grid in most places having fewer hubs and lines of high bandwidth rather than a more evenly distributed network. Maybe the analogy is bad though.

  • @sunbeam60@lemmy.one
    9 points · 1 year ago

    IIRC Steam uses BitTorrent to help users download game assets. There’s still an option to switch it off, so it must still be going.

  • Zagorath
    9 points · 1 year ago

    A podcast I listen to says they used to distribute episodes over BitTorrent, way back in like 2006, as a way to keep bandwidth costs down when they were new. I’m pretty sure they had stopped offering that option by the time I started listening in about 2008/9.

  • @Jimmycrackcrack@lemmy.ml
      5 points · edited · 1 year ago

    I remember when BitTorrent was relatively new and controversial. BBC’s iPlayer hadn’t been around very long, and they said they were going to start using BitTorrent tech for streaming. Guessing that never came to fruition, though.

  • @FiskFisk33@startrek.website
    65 points · 1 year ago

    I don’t think they do it anymore, but Spotify started out with a P2P network on the backend.
    A super smart way of bootstrapping such a thing without huge upfront server costs.

  • Rentlar
    7 points · 1 year ago

    Transferring files to several other computers. I did this in the past, before I started using KDE Connect, rather than use FTP or just memory sticks. It would be useful at a LAN party to distribute several copies of the software. (Kinda piracy, but it doesn’t have to be if the game is free or everyone owns it legitimately.)

  • TXL
    20 points · 1 year ago

    One funny use I discovered while cloning a lot of computers: even on a closed LAN, BitTorrent with local peer discovery was stupidly fast at distributing a big set of files across a pile of machines, compared to rsync. Setting it up was also much easier.
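    There’s a simple reason P2P wins for one-to-many distribution on a LAN. A toy model (purely illustrative numbers and assumptions, not a benchmark):

```python
import math

# Toy comparison: pushing one file from a single source to n machines.
#
# - Plain copy loop (e.g. rsync to each host in turn): the source sends
#   the whole file n times -> roughly n full-file transfer rounds.
# - BitTorrent-style: every machine that finishes re-seeds the file, so
#   the number of machines holding a copy roughly doubles each round
#   -> about log2(n + 1) rounds.
# This ignores piece-level pipelining, which makes P2P even faster.

def sequential_rounds(n: int) -> int:
    """Transfer rounds for a one-at-a-time copy from a single source."""
    return n

def p2p_rounds(n: int) -> int:
    """Transfer rounds when every finished machine re-seeds."""
    return math.ceil(math.log2(n + 1))

for n in (4, 16, 64):
    print(f"{n:3d} machines: sequential={sequential_rounds(n):3d} rounds, "
          f"p2p={p2p_rounds(n)} rounds")
```

    For 64 machines that’s 64 full-file rounds versus about 7, which matches the “stupidly fast” experience above.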

  • @LWD@lemm.ee
    31 points · 1 year ago

    This might be stretching the definition of “common” and “torrenting,” but BitTorrent, Inc. built BitTorrent Sync on similar tech for personal file synchronization. It was later rebranded as Resilio and still exists today.

    https://www.resilio.com/

    An open-source alternative that works in a similar fashion, Syncthing, also exists.

    https://syncthing.net/

    • @AdamEatsAss@lemmy.world
      11 points · 1 year ago

      I would consider this to be one of the intended functions of torrents. Torrents started as a faster way to share files peer to peer: if a few people had a large file on their machines, they could each upload part of it to someone who needs it, essentially multiplying their upload bandwidth. This became less popular as internet speeds increased, except for “illegal” stuff. I would definitely try one of these…if I had more than one computer.
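      The “multiplying their upload bandwidth” effect above can be sketched with a toy calculation (all numbers are made up for illustration):

```python
# Toy model of swarm throughput: a downloader pulls different pieces
# from several seeders in parallel, so effective throughput is roughly
# the sum of the seeders' upload rates, capped by the downloader's own
# link. Illustrative numbers only; real swarms are messier.

def swarm_download_time(file_mb: float,
                        seeder_upload_mbps: list[float],
                        downloader_cap_mbps: float) -> float:
    """Seconds to fetch file_mb megabytes from a swarm of seeders."""
    aggregate = min(sum(seeder_upload_mbps), downloader_cap_mbps)
    return file_mb * 8 / aggregate  # megabytes -> megabits

# A 1 GB file from one seeder at 10 Mbps vs. five such seeders:
print(swarm_download_time(1000, [10], 100))      # 800.0 seconds
print(swarm_download_time(1000, [10] * 5, 100))  # 160.0 seconds
```

      Five seeders with modest home uplinks behave like one server with five times the bandwidth, which is exactly why it worked so well before fast residential connections.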

      • @LWD@lemm.ee
        4 points · 1 year ago

        A common use case for Syncthing is keeping a password file up to date between, say, your PC and your phone. It’ll even work remotely, thanks to the presence of relays.

        (The downsides include pretty heavy battery usage.)

  • CharlesReed
    6 points · 1 year ago

    I torrent old out-of-print books that I can’t find anywhere else. The scans are usually pretty good. There was also a podcast I used to listen to called Caustic Soda; when it ended, they released all of their episodes via torrent so the fans could have them.