Why not gigabit?
March 26, 2007 1:45 PM   Subscribe

Why do some networking products have 10/100 ports instead of gigabit ports?

Gigabit Ethernet NICs have been around for a while. Apple has had them standard in their computers for years now. Why does Apple (and other manufacturers) not put gigabit ports in their networking products? In Apple's case, that means the new draft-802.11n AirPort Extreme and Apple TV. I assume at this point the cost would be negligible. Or is it more of a push toward faster Wi-Fi specs?
posted by rathikd to Computers & Internet (9 answers total)
 
The bottleneck for consumer networking is not 10, 100 or 1000 Mbit. It is the relatively low bandwidth coming in through your phone, fiber optic or cable line.
posted by Blazecock Pileon at 1:54 PM on March 26, 2007


They're still charging a pretty substantial premium to add Gigabit. For instance, a Netgear 5-port switch is $32 for 10/100 and $50 for Gigabit. That's probably not supported by the cost, but what does cost have to do with pricing?
posted by smackfu at 1:54 PM on March 26, 2007


The cost differential for Gigabit products isn't so much the chipset, or the physical packaging, but the added high-speed memory needed for buffering the backplane in multi-port devices. Even something as small as a 5-port switch can get into contention territory if 4 ports are populated and cross-communicating. A 10/100 device can resolve this with only a few KB of memory per port, but a Gigabit device needs 20-30x that amount of buffer, at wire speed, to stay truly Gigabit. Most small home-level switches don't actually do even that, and they don't support all the features of the Gigabit spec, like jumbo frames, either, simply because they don't have enough per-port memory, or fast enough processors in their switching backplanes.
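A back-of-envelope sketch of the buffering point above (the 1 ms congestion window and the jumbo-frame sizes are my own illustrative assumptions, not figures from any switch datasheet):

```python
def bytes_buffered(line_rate_mbps: float, congestion_ms: float) -> float:
    """Bytes a port must hold if its output is blocked for
    `congestion_ms` while traffic keeps arriving at full line rate."""
    return line_rate_mbps * 1e6 / 8 * (congestion_ms / 1000)

# Buffer needed to absorb 1 ms of wire-speed traffic on one port:
fast = bytes_buffered(100, 1.0)    # 10/100 port -> 12500 bytes (~12 KB)
gig = bytes_buffered(1000, 1.0)    # gigabit port -> 125000 bytes (~122 KB)

print(f"100 Mbit/s port: {fast / 1024:.1f} KB per ms of congestion")
print(f"1 Gbit/s port:   {gig / 1024:.1f} KB per ms of congestion")

# The raw line-rate factor is 10x; supporting 9000-byte jumbo frames
# instead of 1500-byte standard frames also multiplies the minimum
# per-frame buffer, which is roughly how a 20-30x total arises.
```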
posted by paulsc at 2:29 PM on March 26, 2007 [3 favorites]


BP: The new AirPort Extreme also functions as a LAN hub and NAS "head". It would definitely benefit from GigE.

GigE does require a more powerful chip, which runs hotter. Apple was probably trying to reach a price point and/or form factor that would have been impossible with GigE.
posted by adamrice at 2:34 PM on March 26, 2007


There's almost no need for it in the home environment, and it's very difficult to justify the cost. I read a report recently on how almost all home Ethernet equipment goes unused except for the wireless interface. Even if everyone suddenly switched to wired, it's still only a few seconds to copy a 700 MB video file from one computer to another at 100 Mbps. Cutting that time in half or by a third isn't worth the extra cost of a gigabit interface. It's not a 10x real-world speed increase.

Heck, we still have a lot of 10 Mbps equipment out in the wild. Some people who are getting very high-speed connections (like FiOS) in their homes have found out the hard way that the WAN port on their home 10/100 routers is really just a 10 Mbps port.
posted by damn dirty ape at 2:53 PM on March 26, 2007


The average home router can't really even route gigabit connections, and beyond that, most hard drives struggle to keep up. Remember, 1 Gbps is over 100 megabytes per second. A reasonably fast hard drive will be about 1/2 to 1/3 as fast as that.
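A quick check of the numbers above (the 50 MB/s disk figure is a rough 2007-era assumption for a sustained hard-drive transfer rate, not a measurement):

```python
GBPS = 1_000_000_000  # 1 gigabit per second, in bits

wire_mb_per_s = GBPS / 8 / 1_000_000  # gigabit wire speed in MB/s
disk_mb_per_s = 50.0                  # assumed sustained HDD throughput

print(f"Gigabit wire speed: {wire_mb_per_s:.0f} MB/s")
print(f"A {disk_mb_per_s:.0f} MB/s disk reaches "
      f"{disk_mb_per_s / wire_mb_per_s:.0%} of wire speed")
```

So even a disk on the faster end of that range saturates well under half of a gigabit link, which supports the point that the NIC often isn't the bottleneck.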
posted by Freen at 3:56 PM on March 26, 2007


Response by poster: Just to clarify: I wasn't thinking in terms of ISP-to-computer speed but rather computer-to-computer (or what have you) speed.

Thanks for the replies so far. They have all been pretty good!
posted by rathikd at 4:56 PM on March 26, 2007


Also note that there is a substantial difference in terms of hardware between offering a gigabit NIC (i.e. a single port) and making a multiport switch that operates at line rate at gigabit speeds. I would venture that almost all motherboards these days that offer integrated NICs are gigabit. Such switches, however, used to cost thousands of dollars not so many years ago.
posted by Rhomboid at 6:50 PM on March 26, 2007


Even if everyone suddenly switched to wired, it's still only a few seconds to copy a 700 MB video file from one computer to another at 100 Mbps.

Actually, the best realistic transfer time for 700 megabytes over 100 megabit Ethernet is just over a minute, assuming about 90 Mbps of effective throughput after protocol overhead.

700 megabytes × 8 megabits/megabyte ÷ 90 megabits/second ≈ 62 seconds.

I'm far too impatient for such things!
posted by flaterik at 2:19 AM on March 27, 2007


This thread is closed to new comments.