What can a library do with 1 gigabit broadband?
September 22, 2010 1:03 PM
So, practically speaking, what would 1 gigabit broadband internet get for a place?
I'm trying to figure out what a facility could do (think school, library, etc.) with a 1 gigabit internet connection. I know that these places generally don't have the bandwidth they need, so I'm trying to make a practical argument for bringing a 1 gig connection in.
More specifically, I'm looking for the kind of information, like, if a library has X workstations and they're all doing Y, they would need Z total Mbps of bandwidth. Or, to turn it around, with 1 gigabit broadband you could have 100 workstations all on YouTube or 50 doing two-way video chat (i.e., Skype) or something like that (note: those numbers are just made up).
I've been trying to google this 8,000,000 ways today and I can't seem to find what I need. I did come across this random site and he seems to think that "1.2 Mbps of available bandwidth for every 40 workstations that connect simultaneously to web-delivery platforms. Most internet require an average connection of 40 Kbps per workstation; Normal internet client requires 128 Kbps per workstation. " That seems awfully low, or at least outdated. If he's spot on, by all means please let me know.
Thanks in advance, hive. I can address any follow up questions as they come up; I never know if I'm being clear in what I'm asking.
PS I am an IT tech at a public library.
posted by majortom1981 at 1:08 PM on September 22, 2010
This isn't the kind of thing you Google, it's the kind of thing you calculate and estimate. Measure your own bandwidth as you do the kinds of things you imagine these people doing and multiply it by the estimated number of people who will be using the connection. You can undershoot this to account for the fact that not everybody is going to be watching YouTube while listening to Pandora, but it's really just math. Don't get stuck on the "gigabit" thing, unless the underlying premise of this question is that you have been tasked with selling gigabit internet connections, in which case I don't have much to say. At any rate, I'd start by finding out how much of the "gigabit internet" connection is usable, e.g. typical ethernet has at least 40% overhead (100Mbit ethernet = ~60Mbit max in practice).
posted by rhizome at 1:11 PM on September 22, 2010
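(As a minimal sketch of that "measure and multiply" arithmetic, assuming Python and placeholder numbers; the per-user rate is whatever you measure yourself, and note that real-world Ethernet/TCP overhead on a wired link is usually only a few percent, not 40%:)

# Rough sizing: measure one user's typical usage, multiply by the expected
# number of concurrent users, and compare against the usable share of the link.
# All figures below are placeholder assumptions, not measurements.
def usable_capacity_mbps(link_mbps, overhead_fraction):
    # Capacity left after protocol/framing overhead.
    return link_mbps * (1.0 - overhead_fraction)

def worst_case_demand_mbps(per_user_mbps, concurrent_users):
    # Aggregate demand if every counted user is active at the same time.
    return per_user_mbps * concurrent_users

link = usable_capacity_mbps(1000, 0.05)      # 1 Gbps line, ~5% overhead assumed
demand = worst_case_demand_mbps(2.0, 100)    # 100 users at an assumed 2 Mbps each
print(f"usable: {link:.0f} Mbps, worst-case demand: {demand:.0f} Mbps")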
I know that these places generally don't have the bandwidth they need, so I'm trying to make a practical argument for bringing a 1 gig connection in.
You may "know" this, but if they don't have specific needs that are not being met with their current connection, there's no point in making something up.
Now, if you're a marketing person, and trying to sell them on the idea -- well, that's another kettle of fish. But presuming you're trying to do well by them, focus on the internal bandwidth (Gigabit ethernet the whole way down, assuming you have systems that back up the machines and/or get images remotely installed, so that the bandwidth can be up to the task), and if you make a pitch for a 1 Gb WAN, do it as a backup should the existing or proposed (cheaper) setup turn out to be insufficient.
posted by davejay at 1:12 PM on September 22, 2010
You may "know" this, but if they don't have specific needs that are not being met with their current connection, there's no point in making something up.
Now, if you're a marketing person, and trying to sell them on the idea -- well, that's another kettle of fish. But presuming you're trying to do well by them, focus on the internal bandwidth (Gigabit ethernet the whole way down, assuming you have systems that back up the machines and/or get images remotely installed, so that the bandwidth can be up to the task) and if you make a pitch for 1GB wan, do it as a backup should the existing or proposed (cheaper) setup turns out to be insufficient.
posted by davejay at 1:12 PM on September 22, 2010
Oh, and
Normal internet client requires 128 Kbps per workstation.
Remember that 128Kbps is a portion of available capacity. In a school setting, you wouldn't expect to see each and every client running torrent downloads, say, so you should assume workstations would use bandwidth in random bursts, and the 128Kbps scenario is *worst-case* for simultaneous use of all machines to download. Not to be all internet-is-tubes, but it's the networking equivalent of "how big does the downpipe need to be, in case all the toilets in the building are flushed at once?"
posted by davejay at 1:14 PM on September 22, 2010
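(To put the "not every toilet flushes at once" idea in numbers, a small hedged sketch in Python; the 128 Kbps figure is the one quoted above, while the concurrency factor is a pure assumption you would tune from observation:)

# Worst case assumes every workstation downloads simultaneously; a more
# realistic estimate scales that by the fraction expected to be active at once.
def capacity_estimate_kbps(workstations, per_station_kbps, concurrency=1.0):
    return workstations * per_station_kbps * concurrency

stations = 50
worst_case = capacity_estimate_kbps(stations, 128)                 # all 50 at once
typical = capacity_estimate_kbps(stations, 128, concurrency=0.3)   # ~30% active, assumed
print(f"worst case: {worst_case / 1000:.1f} Mbps, typical burst: {typical / 1000:.1f} Mbps")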
Response by poster: How many workstations/servers/whatnot run off the 100/100 connection? That might help get to the core of my question. I mean, theoretically, I could take whatever your place's max would be and multiply by 10, right?
I'm not really asking about the costs associated with the connection, but I am trying to put together something so I could go to a library (for instance) and say "If you got a fiber in here, THIS is what you could do." Real, concrete capabilities.
posted by indiebass at 1:18 PM on September 22, 2010
Another question to ask: would your existing infrastructure allow you to use all that bandwidth effectively?
posted by zamboni at 1:29 PM on September 22, 2010
Response by poster: davejay: no, I'm not in marketing. Though I am dealing broadly and nationally (in the US), and yes, I am trying to figure this out for the good of the schools and libraries themselves, and if it further helps, I'm not selling anything at all. This is strictly academic at this point.
Some of the facts I'm working with: 53% of libraries with an Internet connection get speeds of 3 Mbps or lower, and that number becomes 66% if you only count rural areas. According to a poll by the American Library Association, over half of librarians said that availability of bandwidth to support extra workstations was either "important" or "the most important" factor limiting decisions to add them.
But yes, I think the all-the-toilets-flushing-at-once analogy is probably pretty close to what I'm asking... I think. Or more accurately, how big does the pipe need to be to get 50 people watching YouTube at once? (Or any other metric that would be identifiable.)
ON PREVIEW: Zamboni, I assume you mean internal connections? For my purposes, we're living in a world where that gets magically taken care of, so the inside of the building would always be optimized for whatever pipe is coming in.
posted by indiebass at 1:33 PM on September 22, 2010
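(As a rough stab at the "50 people watching YouTube at once" version of the question, a sketch under stated assumptions; the per-stream bitrates are ballpark guesses for 2010-era streaming, not measured figures:)

# Pipe size needed for N simultaneous streams, with some headroom so the link
# isn't running flat out. All bitrates below are assumptions.
STREAM_KBPS = {"youtube_360p": 750, "youtube_480p": 1000, "skype_video": 500}

def pipe_needed_mbps(viewers, stream_kind, headroom=1.25):
    return viewers * STREAM_KBPS[stream_kind] * headroom / 1000

print(pipe_needed_mbps(50, "youtube_480p"))    # ~62 Mbps for 50 simultaneous streams
print(pipe_needed_mbps(100, "skype_video"))    # ~62 Mbps for 100 two-way video chats

Even generous assumptions like these leave most of a gigabit link free, which is why several answers below focus on headroom and future uses rather than present-day demand.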
Best answer: I work at a library and we have a gigabit connection to our internet provider. Right now, though, we only pay for about 20mbps as it's close to what we need. But when you have patron wireless devices on your network, infected with all kinds of shit or running crazy DSL-oriented apps that are happy to help themselves to a meg or four, no amount of bandwidth would ever be enough. Sheesh, after school gets out we could probably consume a gigabit systemwide just to youtube and the patrons would love it, snappy as hell!
This project is more about building a gigabit-capable fiber infrastructure than actually providing gigabit-speed connections to libraries at launch. In our case, building that fiber was the best thing we ever did, as we can essentially buy as much bandwidth as we need with no circuit costs involved. It's a huge savings for the organization and for taxpayers.
The flip side is that we regularly have opportunities to participate in Internet2 or academic live HD netcasts, some of which can take 100-300mbps just for a single viewer. Remember the old 640kb barrier? At no point in IT does it ever make sense to say "nobody would ever use all that," especially when you're talking about an infrastructure that will be used in an unforeseeable future.
posted by ulotrichous at 1:34 PM on September 22, 2010 [1 favorite]
You can transfer large data sets in a fraction of the time it would otherwise take. Digital media from other libraries could be streamed to multiple locations in their original format. (Think less YouTube, more Blu-ray or DVD quality.)
Yes, there will be extra bandwidth unused in its initial stage, but the point is to give these places room to grow.
They may not have all the concrete applications for it now, but you want to create a sandbox big enough for the new generation to have space to implement their new ideas.
posted by arjuan at 1:35 PM on September 22, 2010 [1 favorite]
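(To put "a fraction of the time" in numbers, a hedged sketch; the 25 GB file size and the 80% effective-throughput assumption are illustrative, not from the thread:)

# How long a bulk transfer takes at different line speeds, assuming the link
# sustains about 80% of its rated rate (an assumption, not a guarantee).
def transfer_hours(size_gb, link_mbps, efficiency=0.8):
    megabits = size_gb * 8 * 1000   # GB -> megabits, decimal units for simplicity
    return megabits / (link_mbps * efficiency) / 3600

for mbps in (3, 100, 1000):
    print(f"{mbps:>5} Mbps: {transfer_hours(25, mbps):6.2f} hours for a 25 GB master file")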
Best answer: indiebass, the answer is that fiber is an investment to enable essentially unlimited future bandwidth. It's the best investment that a library can make to future-proof the organization and make upgrades incremental and cheap rather than the costly scrapping of both ends and starting over again that usually accompanies infrastructure upgrades.
posted by ulotrichous at 1:37 PM on September 22, 2010 [2 favorites]
Okay, so three steps:
1. Determine a set of criteria (like "simultaneous youtube downloads", as you suggest) and pick the one that's both reasonable for the client (do you really want to be focused on youtube downloads in a library?) and the most bandwidth-intensive of the reasonable ones you had.
2. Get a speed tester on your local connection, measure your available bandwidth (ideally at home, so other people using computers won't corrupt your results), then start doing the task you identified and, while doing it, run the speed test again. If youtube videos, make sure you start a long, high-quality video as a representative sample, and kick off the test immediately after starting the video.
3. Do the math -- you now know how much upload and download bandwidth are needed to do that task, so multiply by workstations and you have your requirements (for internal and external bandwidth.) Make sure you factor in network overhead (if you need 50Kbps, don't say 50Kbps, say 50Kbps + whatever the overhead, let's say 40%.)
posted by davejay at 1:39 PM on September 22, 2010
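(The three steps above boil down to a small calculation once the measurements from step 2 are in hand; here's a sketch where every per-task figure is a stand-in for what you'd actually measure, and the 40% overhead pad is the commenter's own number:)

# Step 3: take the most bandwidth-hungry candidate task, scale by workstation
# count, and pad for network overhead. Per-task rates are placeholders.
measured_tasks_mbps = {"web browsing": 0.3, "streaming video": 1.0, "video chat": 0.5}

def sized_link_mbps(tasks, workstations, overhead=0.4):
    heaviest = max(tasks.values())                    # most demanding reasonable task
    return heaviest * workstations * (1 + overhead)   # pad by the assumed overhead

print(sized_link_mbps(measured_tasks_mbps, 100))      # e.g. 140 Mbps for 100 workstations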
(in other words, this is less a technology problem, and more a problem of identifying the specific tasks a library or school would want to do -- and the only folks who can tell you what those things are, are librarians and principals.)
posted by davejay at 1:39 PM on September 22, 2010
1 gigabit (1,000 megabits) for an internet connection is drinking from a fire hose. You'd need a LOT (like a campus's worth) of simultaneous downloading before anyone noticed the difference between that and, let's say, a 10 megabit cable internet connection. Overcapacity for internet.
On the other hand, do you know anything about desktop virtualization? Let's say you have a library branch or a branch office for a business with 10 computers under 10 desks. With a gigabit connection there, and another one back at headquarters, you could move all 10 computers from the branch to a room in headquarters (or put them all on a single virtual desktop server). Back at the 10 branch desks, there's a little $100 box called a zero client or thin client that provides the network connection, monitor, keyboard, and mouse - nothing else. No need to have an actual computer under the desk anymore; it's running as part of a server back at HQ, and you're using the gigabit bandwidth to transmit everything that gets typed, clicked, and displayed.
Facilities, campuses, and corporate campuses are all moving towards this, because it's very efficient for IT staffing and tech support. Instead of having enough staff to maintain and support 30 individual physical computers for the marketing department on the 27th floor, you just have one server in the basement that provides 30 virtual computers over the network. One person can maintain several of those servers cheaply and easily; and the only thing that users can screw up, damage, or steal is a keyboard, mouse, and monitor.
This is no problem if everything's at the same site; gigabit bandwidth at both ends would let you do this between two distant locations. So you could say "with gigabit internet, you would never buy or repair a computer at a branch location ever again". (Caveat: the money spent buying individual computers basically shifts to buying the servers that run them virtualized back at HQ, and you have to virtualize many desktops before saving any money that way. But you'd remove basically all maintenance and support costs, and the servers would be on HQ's budget instead of the branches.)
posted by bartleby at 2:09 PM on September 22, 2010 [1 favorite]
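(To gauge whether a gigabit WAN could actually carry a branch full of thin clients, a rough sketch; per-session bandwidth for remote-desktop protocols swings widely with what's on screen, so both figures below are assumptions:)

# How many remote-desktop sessions fit on a link for an assumed per-session
# bandwidth. Office work might average well under 1 Mbps; full-screen video
# pushed over the same protocol can need many times that.
def sessions_supported(link_mbps, per_session_mbps, utilization_cap=0.7):
    # Keep headroom (the cap) so interactive sessions stay responsive.
    return int(link_mbps * utilization_cap / per_session_mbps)

print(sessions_supported(1000, 0.5))   # light office use: ~1400 sessions
print(sessions_supported(1000, 5.0))   # video-heavy use: ~140 sessions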
Response by poster: OH, and rhizome: the 1 gigabit thing comes from the National Broadband Plan introduced by the government. This is why I am using that particular number.
posted by indiebass at 2:50 PM on September 22, 2010
One other factor is whether "the Internet" can deliver you a gig of sustained bandwidth. Your connection goes into an ISP, which in turn connects to other ISPs. The upstream connections need to have sufficient bandwidth themselves to support the gig connection to you.
posted by Runes at 6:28 PM on September 22, 2010
This thread is closed to new comments.