They might as well say 2 terabytes per second at this rate. How could someone even use 10Gbps? That's a full single-layer DVD every four seconds.
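A rough sketch of the arithmetic (assuming a 4.7 GB single-layer DVD -- the key point is that link speeds are in bits while file sizes are in bytes):

```python
# Back-of-the-envelope throughput math for a 10 Gbps link.
# Network speeds are quoted in bits per second; files are measured
# in bytes (8 bits each), so divide by 8 to compare the two.

link_bits_per_sec = 10e9                 # 10 Gbps
bytes_per_sec = link_bits_per_sec / 8    # 1.25e9 bytes/s = 1.25 GB/s

dvd_bytes = 4.7e9                        # single-layer DVD, ~4.7 GB
seconds_per_dvd = dvd_bytes / bytes_per_sec

print(f"{bytes_per_sec / 1e9:.2f} GB/s")    # 1.25 GB/s
print(f"{seconds_per_dvd:.2f} s per DVD")   # 3.76 s per DVD
```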
If that speed becomes commonplace it could revolutionize my job - could just edit from home with multi-cam HD video stored on the work server that I'm editing over the Internet.
While fiber might go to junction boxes and whatnot, what goes through most of the neighborhoods and to houses is just good old copper, and that can have serious bandwidth issues if not managed properly by the company.
yeah, my ISP has fiber to their own offices downtown, and they're laying down fiber for all their new stuff... but since my town's infrastructure was relatively recently upgraded with whatever they were using before, we'll be one of the last cities to get fiber. probably just after the next big data transmission upgrade comes out.
anyway, since all the infrastructure between me and the local office is cable, my internet is pretty consistently good, but no more than average. and every now and then it gets absurdly bad.
Then yeah, it would be, but as long as most neighborhoods are served by copper, issues will keep coming up. In larger cities it's much easier to reach large sections of the population; more rural areas and medium-sized cities will continue to suffer. Basically any area still serviced by house-to-house telephone poles is screwed for a good long while.
Honestly, I'm okay with not having gigabit. I usually get 60-70 down and ~40 up (advertised speed is 50/25... what?), and the only time I ever really wish it was faster is if I don't visit a random image thread for a while and it needs to load an entire (50 post) page at once. Everything else loads pretty much instantly, and downloads are pretty much always done before I want to use them, unless they're being limited by the server's upload anyway.
Although my ping to local servers is usually about 12 ms, I wouldn't mind dropping that to 1.
Is there any particular reason that ISPs advertise and refer to internet speeds in ---bits/sec rather than ---bytes/sec other than the fact that it allows them to artificially inflate the numbers, banking on the fact that the vast majority of people don't know the difference between a bit and a byte?
Because bits per second has been used since before a byte was always 8 bits, I guess.
A byte was always 8 bits. And a nibble was always 4 bits, or half a byte. Early digital communication infrastructure however started up (slightly) before the byte became an established or even standardized unit, and thus they used bits to describe their speed.
The byte became standardized across the infrastructure about four decades ago though, so continuing to use bits nowadays serves no purpose other than confusing less knowledgeable people-- just like how hard drive space numbers are deceiving.
I don't think I've ever noticed they advertise in bits. Then again, now that I think about it, speedtest also measures in bits. That, and 50Mbps sounds better than 6MBps.
So yea, they're banking on the fact that most people don't know or don't notice the difference, that answers that.
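The gap between those two numbers is just a factor of 8 (8 bits per byte). A minimal sketch of the conversion -- the function name here is made up for illustration, not any standard API:

```python
def advertised_to_actual(mbps: float) -> float:
    """Convert an advertised speed in megabits/s to megabytes/s."""
    return mbps / 8

print(advertised_to_actual(50))  # 6.25 -- the "6MBps" figure above
```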
I've never heard of any other kind of data described in terms of bits, or even displayed in bits on any computer I've ever used.
Even actual downloads and data transfers, the real-world results of network speed, are always shown in bytes.
No it hasn't. A byte has probably been, at some point in time, almost every number of bits between 1 and 32. Of course, we're talking decades ago.
Change is hard. It's what has always been used, so why change it if it works just fine? People will be just as confused if their internet speed "drops".
They aren't trying to trick people; it's a standard convention that predates the internet.
I think it's more deceiving that Windows mislabels the units than that HDD manufacturers use the right ones. And many Linux variants use the SI units, so you can't please everyone.
Except everything that is actually monitoring active data transfer on your computer. Steam, Windows Resource Monitor, any given download monitor for browsers, etc etc.
Only ISPs use bits for data transfer, and yes it's intentionally deceptive, just like the people who decided to market HDD sizes as "a GB is 1000 MB, TB is 1000 GB" etc. Don't be naive.
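For the record, the drive-size discrepancy is decimal (SI) versus binary prefixes, not a wrong count of bytes. A rough sketch, assuming a nominal "1 TB" drive, of why it shows up as ~931 GB in Windows:

```python
# Drive makers use decimal (SI) units: 1 TB = 10**12 bytes.
# Windows measures in binary units (GiB/TiB) but labels them GB/TB,
# so the same byte count looks "smaller" in Explorer.

drive_bytes = 1 * 10**12        # a "1 TB" drive as sold

tib = drive_bytes / 2**40       # what Windows would label "TB"
gib = drive_bytes / 2**30       # what Windows would label "GB"

print(f"{tib:.3f} TiB")   # 0.909 TiB -> shown as "0.90 TB"
print(f"{gib:.1f} GiB")   # 931.3 GiB -> shown as "931 GB"
```

Neither side is lying about the number of bytes; they just picked the units that make their number look better.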