IT advice for non-profit
November 19, 2009 10:57 PM

How likely is it that in the next 2-3 years cloud computing (using for example google chrome OS with web apps running off thin clients) will be the norm and how should a non-profit organization with high tech needs (healthcare) but limited tech resources begin to position itself for the changes this might entail?
posted by dougiedd to Technology (18 answers total) 8 users marked this as a favorite
 
"The norm" for what? The vast majority of computer programs don't need that kind of compute power.

Don't underestimate the power of inertia. Most computer applications won't change because there's no need to do so.

"Thin clients" has been proposed over and over again, but every time someone tries to make it work, the cost of "fat clients" comes down even further and it stops making any sense. The notebook computer I have on my desk right now has as much compute power as the full sized tower I bought for $7000 in 2002, and about five hundred times as much power as the first home computer I bought 30 years ago.

There will be a place for "cloud computing" but it won't be the norm.

And it definitely won't be a good solution for your particular application. Data security in that kind of system is nearly nonexistent; you can't put confidential records out there stored on, or processed on, computers you don't control and may not even know the location of.
posted by Chocolate Pickle at 11:04 PM on November 19, 2009


I don't think it's going to happen in 2-3 years but that seems to be the direction we're heading.
posted by Jacqueline at 11:23 PM on November 19, 2009


How likely is it that in the next 2-3 years cloud computing (using for example google chrome OS with web apps running off thin clients) will be the norm?

Virtually impossible.

How should a non-profit organization with high tech needs (healthcare) but limited tech resources begin to position itself for the changes this might entail?

Cloud applications emulate desktop applications and typically share file formats. Google Docs for your organization is free if you want to be way, way ahead of the curve.
posted by Electrius at 11:24 PM on November 19, 2009


I agree with Chocolate Pickle.

You can buy a complete system from Dell for less than $500 that runs office software. In five years, you'll still be able to buy a system from Dell for around that price that runs office software.

I'm not sure what your 'high tech needs' are. If you're talking about PACS or the like, running some type of remote rendering system might be beneficial, but you would want to run it on a computer you manage, for security and reliability reasons.
posted by demiurge at 11:27 PM on November 19, 2009


I'm using Google mail and docs for a collaborative project, and let me tell you, in 2 - 3 years this is going to be the norm for small to mid-size companies (whether it's Google or Microsoft or some as-yet unborn service).

This model is the future. NOT HYPE-IST

It's not about compute cycles or disk space, it's about concentrating on what makes your org interesting/useful/different, rather than spending cycles running, patching, virus scanning, and rebooting your Exchange server (choose whatever inconvenient infrastructure item is currently sucking up 1/5 - 1/3 of your budget).

I come from the mainframe era - back when everything was a thin client (a thin gruel of a client). No one loved that model then because resources (disk, memory) were scarce and doled out a thimbleful at a time.

The second big wave of thin clients was, what, the mid-'90s? When high-speed networking was rare and web browsers weren't yet fast enough to give you the responsiveness of a desktop app. So maybe you could run a thin client on your intranet at work, but forget about using it from home or on the road.

Now?

Gmail.

And coming up, Google Docs.

They feel like desktop apps (docs has its limitations). Gmail's spam filtering is better than anything I could ever set up, and I don't have to *ever* fix/patch/reboot my mail server. That alone is a godsend.

I also don't have to maintain desktops and roll out software to everyone. As long as my team has working web browsers (surprise, we all do!) we're good to go. We don't have flag days where we all have to simultaneously upgrade to Vista / Snow Leopard / whatever, we just have to all run modern versions of our web browsers. So much simpler!

The future? System admins will spend more time administering interesting systems - customer facing webapps - and less time futzing with sendmail / qmail / Exchange / NFS / Samba.

And small (mom and pop) to medium-sized businesses will have infrastructure that is the equal or better (for their purposes) of the in-house stuff at Fortune 500 companies.
posted by zippy at 11:37 PM on November 19, 2009 [3 favorites]


Disclaimer: I am a health informatician with a background in open source. My open source and health informatics activities are basically independent of each other because I study health informatics, but I produce open source tools outside the health field.

Mostly it's management who make decisions about purchases, and those decisions are heavily influenced by the people who come to sell them software. This leads to the same situation encountered elsewhere in the software industry, where a product that's technically not the best (e.g. VHS/Windows) is more commercially successful than a technically superior product (e.g. Betamax/Mac) [yeah alright, open source purists, debatable analogy!]. This happens because one vendor's marketing people are more effective than the competition's, and over time the number of players in a market shrinks until there are one or two vendors with good marketing machines and software that keeps management happy but is otherwise quite likely to be fairly crappy. Microsoft, and companies inspired by its business methods, do well in this market, and I'd expect them to continue to do so for one or two decades longer (with incrementally less market power over time).

So if your organisation is interested in this area, I'd recommend that you try to recruit an IT person with good Unix experience (especially in Perl, Python or Ruby; I wouldn't consider a PHP programmer unless they'd made a major open source contribution elsewhere). You might not be able to afford such a person (although a lot of open source people are not massively motivated by money, so you might), and unless you're in a big city, they're likely to be quite tricky to recruit. What you really need at this stage is to bring in someone who is pragmatic and acknowledges the important place that proprietary software has in business, while having a good awareness of the alternatives (read: computers as platforms for empowerment).

Or you can wait and be a follower rather than a leader. On the other hand, that exposes you to the risk of lock-in, so in the end it's six of one, half a dozen of the other. Most organisations choose the path of least resistance, which means crappy-to-adequate IT people who do what everybody else does. Which leads back to the beginning of my explanation...
posted by singingfish at 11:44 PM on November 19, 2009


How likely is it that in the next 2-3 years cloud computing

Nthing impossible. I work for a company that would love nothing more than to see this happen and actively plays in the space as one of the biggest players. Even the most hyperbolic crystal ball gazers, experts and marketers do not mention this possibility. It would be beyond their wildest dreams. Maybe in 10-15 years. Maybe.

*Not denigrating cloud computing but for the reasons stated above it's really not a fits-all solution and - at the moment, for what you're looking for, for the kind of price you're looking at - it is not a good solution for your org.
posted by smoke at 12:04 AM on November 20, 2009


I think there will always be a role for fat clients and a role for thin clients; I don't see either one fully going away for a very long time. What we're seeing in 'the cloud' is mostly sharing of non-critical stuff (and that's about all I'd use it for). I'd still want local backups on a regular 'puter, especially for anything remotely critical, and local apps too. My laptop is far more reliable in every sense than my internet connection, and I don't want to be dependent on that connection for all my critical work. Sharing docs, though? Sure.

So the '2-3 year' timeframe I think is virtually impossible. Is there a long-term trend to keep shifting things to the cloud? Sure, but it's got a timespan more on the order of decades than years, if you ask me. That doesn't mean we won't see some cool cloud apps come out (we see that every few months anyway); I'm talking about mass adoption and institutional paradigm shifts. Ain't gonna happen in 2-3 years. Maybe 20-30 years.
posted by jak68 at 12:55 AM on November 20, 2009


There's a quote I read in the Economist that I'll paraphrase because I can't remember it exactly: the impact of disruptive change tends to be overestimated in the short run and underestimated in the long run. I don't see major changes from cloud computing in the next two to three years, but in the next five to seven years, yeah, definitely. By 2015 you'll do pretty much everything important inside the browser.

Whether you want to be ahead of the curve or not depends a lot on the kinds of applications your organization needs and the kind of people that work in the organization. If the people are savvy and you don't mind the dangers of being on the cutting edge, then go for it. Since you're a healthcare organization, I'm guessing the answer to both of those is "no" so proceed carefully.

I've been using Zoho for office apps. It isn't perfect, but it's reasonable.
posted by Loudmax at 1:10 AM on November 20, 2009


When I did the maths I saw that unless you need bursty infrastructure, cloud computing is quite a bit more expensive.
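A back-of-the-envelope sketch of the kind of comparison devnull describes (every number here is an illustrative assumption, not a figure from the thread):

```python
# Rough monthly cost: owned server vs. cloud instance.
# Every number here is an illustrative assumption.

def owned_server_monthly(purchase_price, lifetime_months, power_and_hosting):
    """Amortized hardware cost plus fixed monthly running costs."""
    return purchase_price / lifetime_months + power_and_hosting

def cloud_monthly(hourly_rate, hours_used):
    """Pay only for the hours the instance actually runs."""
    return hourly_rate * hours_used

owned = owned_server_monthly(2000, 36, 40)   # always-on capacity, ~$96/month
always_on = cloud_monthly(0.20, 24 * 30)     # cloud left running 24/7: ~$144/month
bursty = cloud_monthly(0.20, 8 * 22)         # business hours only: ~$35/month
```

With these placeholder figures, the cloud only comes out ahead when the workload is genuinely bursty; an instance left running around the clock costs more than the amortized server.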
posted by devnull at 2:26 AM on November 20, 2009


Basically, this question is impossible to answer without thinking carefully about the applications you want to run: how CPU-intensive versus data-intensive they are, how "bursty" the computational demands are, etc.

As others have said, in the health care space, confidentiality is an issue that will make cloud computing a much more difficult (but not necessarily impossible) proposition.

This white paper might give you some insight into what makes an application most suitable for the cloud.

You can buy a complete system from Dell for less than $500 that runs office software

But so can the cloud provider, yes? Things are different now than the past "thin" versus "thick" client cycles because (a) we have much more bandwidth now, and (b) the "thick" client is really just a warehouse full of thin clients. It's a lot cheaper for one company to manage 1000 computers than for 1000 companies to manage 1 computer each.
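The "1000 computers vs. 1 computer each" point comes down to administration cost per machine; a minimal sketch with made-up numbers:

```python
# Admin cost per machine: the economy of scale in one division.
ADMIN_SALARY = 60_000  # assumed annual salary for one sysadmin

def admin_cost_per_machine(machines_per_admin):
    """Annual administration cost attributed to each machine."""
    return ADMIN_SALARY / machines_per_admin

small_org = admin_cost_per_machine(50)     # in-house: $1,200 per machine per year
provider = admin_cost_per_machine(5_000)   # big provider, heavy automation: $12/year
```

The machines-per-admin figures are guesses; the point is only that the ratio, not the salary, drives the difference.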
posted by sesquipedalian at 6:16 AM on November 20, 2009


Cloud Computing is a loaded term, having multiple interpretations depending on who you talk to.

There is the "Cloud Computing" where almost everything is done in the cloud: all application execution, logic, and data storage happen in the cloud, and the client does nothing more than send/receive info and render a nice UI. This model means outsourcing your internal infrastructure to an external provider (like Amazon Web Services).
This model will not be the norm in the near term.

There is the other usage of "Cloud Computing" where certain applications are cloud based. This is Google Docs, Social Networking, etc. This is happening now and will continue to grow, and *may* morph into the first kind of cloud computing (or may take a different direction). This cloud model has centralized data that is simultaneously shared across many users. This is a natural outgrowth of that other loaded term of "Web 2.0".

The way you set yourself up for the second version of cloud computing is simple: have internet connections and devices that can use them, and you're all set.

There may be migration issues, but I don't think there are any good general planning rules, since the issues will be determined by the non-connected application you currently use and the connected application you plan to move to. Here are some examples to give better context. Let's say you currently use Word, with shared files stored on a network file system. If it were me, and my organization had the budget, my transition plan would be to wait until Microsoft got their act together (they will pretty soon, I'm sure) and deploy their version of Google Docs for their Office products. On the other hand, if I had people spread all over, on all kinds of different computing platforms, working on the same projects, and emailing/file server access/VPN were a royal pain, then I might consider migrating, on a project-by-project basis, to some cloud-based app. One thing to keep in mind about cloud computing in this context is how you feel about the actual raw data being in the cloud.
If your data is sensitive and critical to your organization, then trusting a third party with the integrity of that data becomes an issue (backups, availability, etc.). The issues are very similar to those of other internet applications (email, web servers, etc.). Do you trust a third party, or do you maintain your own? The answer to that question indicates whether you go with a complete third-party solution with no guarantees, but for free (i.e., Google Docs), some organizational platform that does the same thing (i.e., the sucky Sharepoint*), or something in the middle (i.e., a fee-based third-party service with real SLAs).

*I think the concept of Sharepoint is good, I just think its implementation sucks.
posted by forforf at 6:31 AM on November 20, 2009


I'm an IT consultant and I've worked in a few Fortune 500 companies, and earlier in my career I did tech support.

Here is my thinking regarding cloud computing, based on my general knowledge of the industry and how people respond to technology.

First, the technology really isn't new -- things like Microsoft Terminal Services and Citrix have been doing things similar to cloud computing for years, and while those aren't exactly comparable, the way it's delivered and how it appears to users really isn't much different. I suspect that if the technology were really that good, it would have been deployed a long time ago. Sharepoint, for example, can be a good technology, but I've only been in one organization that even knew how to install it properly.

Virtually nothing I say here is going to apply to home computer users or a small office -- deploying a few instances of nearly anything (and on the scale I'm talking about, 200 users is small) is always easier than it is at large scale.

Here is why I think cloud computing may not get much traction:

First, I saw user bases who were really reluctant to use remote-delivered apps because they felt performance was bad. Load balancing was always a problem: at 9 AM everyone on the west coast would be checking their email (it's first thing in the morning, so it's fairly natural that everyone checks their email). So an organization either needs the resources to accommodate peak times -- deploying a bunch of servers and bandwidth that will be under-utilized most of the time -- or it takes a performance hit. Deployed on a big enough scale this may not be a problem (say, if everyone uses Google Docs). Consider this a hidden cost.
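The peak-versus-average problem behind that 9 AM email rush can be made concrete with a toy load profile (all numbers invented for illustration):

```python
# A toy hourly load profile for a workday with a morning email rush.
hourly_load = [5, 5, 100, 60, 40, 40, 30, 20]  # requests/sec in each hour slot

peak = max(hourly_load)                        # capacity must cover this
average = sum(hourly_load) / len(hourly_load)  # what you'd need on average
utilization = average / peak                   # how busy a peak-sized fleet really is
```

Here a fleet sized for the peak sits at 37.5% utilization over the day -- the rest is the "hidden cost" of provisioning for 9 AM.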

Second - remote-delivered apps also make your network/Internet connection a single point of failure. Networks are pretty reliable but not 100%. Locally your network provider may have something approaching 100% uptime but understand this is not the case across North America and certainly isn't the case in most of Asia and Africa. Losing network connectivity may cause a productivity hit to any network site, but if you have a PC/locally installed apps you are usually capable of some work.

Third - a lot of hobbyists and futurists overestimate the amount of computer skills and willingness to make changes in the general population.

I'd say maybe 10% of computer users are very good: they understand the basic functionality of apps, networks and databases, can see where problems take place and how computer systems work, and have no problem adapting to something new. The second group is probably 70% of people: they understand the apps they use a lot very well, but face a learning curve when something new is put in front of them; these people figure stuff out, but it takes some time. Another 10% know only how to do the functions they use on a regular basis (usually by memorization), and anything you change on the system (even changing an icon) will throw them into turmoil that takes a significant amount of time to resolve. These users panic when they're notified that their systems are being patched overnight, and any major change virtually shuts them down and puts a huge load on technical support. The last 10% are the ones who try to avoid using computers at all -- usually corporate board members/senior managers who think of IT as overhead or something their admin does for them; they resist virtually any IT expense, and they are usually the decision makers. Remember those ratios, because virtually no company wants to take even a 5% hit to productivity. Any change to new systems has to be almost seamless.
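Taking those user-population ratios at face value and pairing each segment with an assumed productivity hit during a rollout (the hit figures are guesses for illustration, not from the comment):

```python
# User segments from the comment, paired with assumed productivity hits
# during a system change (the hit figures are illustrative guesses).
segments = [
    (0.10, 0.00),  # power users: adapt almost instantly
    (0.70, 0.05),  # typical users: short learning curve
    (0.10, 0.30),  # memorizers: any change means real turmoil
    (0.10, 0.02),  # avoiders: barely touch the system anyway
]
blended_hit = sum(share * hit for share, hit in segments)  # ~0.067, i.e. ~6.7%
```

Even modest per-group slowdowns blend to a hit above the 5% threshold mentioned above, which is why rollouts need to be nearly seamless.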

Fourth, the legal system really hasn't caught up to even existing technology. Consider that we don't have uniform privacy laws or consistent standards around what data can be stored or for how long; the further we spread out this computing, the more complicated maintaining legal standards/system compliance becomes. That will still be a patchwork.

Fifth, security would be an issue. I'm not talking system security - IT can design good systems, but the biggest problems will still be people not locking their machines when they walk away, forgetting their passwords and calling technical support, and leaving laptops on car seats in urban settings. Adding another layer of mobility on technology adds another layer of security issues.

We'll see some elements of cloud-computing for sure, but I certainly don't expect a revolution... I doubt planning for one is a good long term policy.
posted by Deep Dish at 8:46 AM on November 20, 2009


How should a non-profit organization with high tech needs (healthcare) but limited tech resources begin to position itself for the changes this might entail?
  • Stop using applications you have to "install". Use web-based alternatives, or have someone write web-based versions of your internal apps for you.
  • If you are using Internet Explorer 6 or 7, upgrade to version 8 or some other more standards compliant browser. (Firefox, Chrome, Safari, Opera)
  • Switch to VOIP (Voice Over Internet Protocol).
  • If you have applications that run on your desktop that you absolutely can't get rid of, look into virtualization (Citrix)
  • Where you can, switch to free versions of your applications or to web sites. Stop spending money on software when there are acceptable free alternatives. Open source can be good.
  • Become remote-worker friendly, meaning ensure that your employees can work off-site, from home, via telecommuting, etc.
  • Outsource your IT and stop running customized/made-in-house solutions or hardware.

posted by blue_beetle at 8:56 AM on November 20, 2009


I think we are at a watershed moment as far as business IT goes. Things are ripe for a rethinking of how services and data are hosted and delivered, and for how end-users interact with them.

It has been the better part of a decade since a lot of IT organizations made a major revision to their desktop computing infrastructure. The last big wave ended with WinXP, OfficeXP and IE6. It was driven by the rise of the web, the culmination of the merging of the Win95 and WinNT platforms, along with a big wave of upgrades driven by Y2K. A lot has happened in the time since.

Browser performance and capabilities have come a long way since IE6 was released. "AJAX" web applications approach the rich UIs of desktop applications. Ironically, it was a feature that Microsoft added to IE6 for Outlook Web Access that contributed the A in AJAX. Costs for bandwidth have come down dramatically. Notebooks are replacing desktops. Mobile devices capable of running full-featured web browsers are becoming widespread. As a result, people are using multiple devices.

Against all of this, Moore's law has continued to push the costs of computing down and the capabilities up. A major result of this is that, the costs of IT have increasingly shifted away from hardware, which puts the focus on the costs of software and operations/management.

You may be able to get a $500 computer from Dell preloaded with Office & Windows 7, but then how much will you spend to deploy and maintain it, along with servers for Exchange, Sharepoint, etc.? Price that against an equivalent to that Dell PC (probably $250 or so), only running ChromeOS and Google Apps, which is free to a small organization, or $50 per user/year for larger orgs.
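Plugging the figures from the comment into a simple three-year comparison (the annual support/deployment cost for the PC is an assumed placeholder; the $500, $250, and $50/user/year figures are from the comment):

```python
# Three-year total cost of ownership, per seat.
YEARS = 3

def pc_tco(hardware=500, annual_support=300):
    # $500 Dell from the comment; the support figure is an assumption
    return hardware + annual_support * YEARS

def chromeos_tco(hardware=250, per_user_fee=50):
    # ~$250 device and $50/user/year are the figures from the comment
    return hardware + per_user_fee * YEARS
```

Under these assumptions the thin-client seat costs $400 against $1,400 for the PC -- the gap is almost entirely the ongoing support line, not the hardware.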

People have listed other objections to "thin client" scenarios, but take them with a grain of salt. A lot of business computing is dependent on reliable networks right now; for one thing, there are a lot of old browser-based applications (which is why IE6 has been so slow to die off). If anything, modern web-based apps with some offline capabilities are more resilient in the face of network glitches than a traditional desktop app that has to use a fileserver on the opposite side of the continent, or some sort of remote desktop. People also underestimate the importance of these new computing models in emerging markets.

That's half the equation; the other half is what organizations will do about the central services they need. As others have noted, privacy and confidentiality requirements may limit what organizations trust to "the cloud," but "cloud computing" vendors will continue to chip away at those objections. A bigger issue is the vested interest of IT departments, but at some point it won't be their decision, once the CEO & CFO see that peers and/or competitors are saving a lot of money by finding new ways to meet their IT needs.

I think in the next 2-3 years, it will be increasingly obvious that a major change is afoot. Startups are already embracing a lot of cloud computing, both for their commodity computing needs (email, calendar, basic documents, etc), and to build and deploy their own services (Amazon S3, EC2, Rackspace Cloud, etc). I wouldn't be surprised if larger and more established businesses switch over with surprising speed, but I also wouldn't be surprised if it takes the better part of another decade for that to play out.

blue_beetle gives good advice. I think we could give better advice if you could tell us more about the current size of your organization, the expected growth, and what your computing needs are. I'll add a few others:
  • Understand your exit strategy.
    • I don't think Amazon's web services or Google Apps are going away any time soon, nor do I think they'll make major changes to their strategy or pricing that leave existing customers in an awkward spot, but it is still early in this era, and other options may emerge.
    • On the other hand, startups with compelling offerings, like Zoho, may not achieve viability, or they may be acquired by someone who doesn't really care about their existing customer base. Even if you don't know what you'd switch to, knowing how you'll get your data out is important, and it's even better to set up some sort of automation so you have a relatively fresh in-house backup.
  • Embrace the possibilities offered by something like Google Apps: Anyone, any time, any place, any device. How could your users take advantage of easier access and collaboration?
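One way to keep that "relatively fresh in-house backup" without the backup directory growing forever is a simple rotation routine; the export step itself (actually pulling documents out of the service) is deliberately left out here, since each service has its own export mechanism:

```python
import os

def rotate_backups(directory, keep=7):
    """Delete all but the `keep` newest files in `directory`.

    Assumes each export run drops one dated file into the directory;
    the export itself is service-specific and not shown.
    """
    files = sorted(
        (os.path.join(directory, name) for name in os.listdir(directory)),
        key=os.path.getmtime,
        reverse=True,  # newest first
    )
    for stale in files[keep:]:
        os.remove(stale)
```

Run from a nightly cron job after the export, this keeps a week of copies on hand, which covers the "vendor goes away or changes terms" scenario discussed above.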


posted by Good Brain at 12:28 PM on November 20, 2009 [1 favorite]


Cloud computing, as in Google Docs and Gmail, is definitely in the "early majority" phase of adoption, especially for smaller organizations that want to shed or avoid in-house IT infrastructure.

Cloud computing, as in virtualization like Citrix and VMware, is definitely making its way into mainstream IT.

Cloud computing, as in Amazon Web Services and Google App Engine, is still in the early adopter phase.
posted by kenliu at 4:12 PM on November 20, 2009


I think it's important to think incrementally. Look at something like Flickr. Nobody thinks they're "cloud computing" when they use it, they're just uploading their photos someplace convenient to use. Similarly with how Facebook and Twitter have become important communications media without most people thinking they're doing something sea-changing. That's going to continue to be the case -- online apps are going to sneak their way into organizations like yours without you even doing much to encourage it.

Clearly, Gmail is among the top cloud apps. In one swell foop you can dispense with Exchange/Outlook and big swathes of hardware you used to need to maintain. I think that's going to continue to be the case: customers are going to move to these things one by one, less often as an overall strategy. There have been quite a few disastrous all-at-once rollouts and people have gotten burned.

Mainly it's going to be more about flexibility and ability to turn on a dime rather than a massive plan to overhaul everything over X years. That's actually the cool part about it, although it's also one of the risks, because you are suddenly becoming dependent on a third party provider in a way you weren't before. With desktop apps, you can keep using something even if it's no longer supported. With the cloud, something can go away. So you do want exit strategies, backup plans, Plan Bs. I would hope that the mix of cloud apps will continue to be robust with multiple major offerings from reliable providers, the way you had Excel or 1-2-3 as viable spreadsheet choices for many years.

But certainly one of the great advantages is the elimination of software installation support and the need to think about things like licenses and seats. You don't need to install a spreadsheet on everyone's PC in the hopes you're covering them. They can just go to Google Docs and it's there if they need it. On the other hand, if you get into systems like Salesforce.com or Basecamp, you're going to be in a little trickier territory and you'll want to think more carefully about how they're used and how you can retain control of your data should you need to shift gears.
posted by dhartung at 7:24 PM on November 20, 2009


Please forgive this purely anecdotal answer:

Just this week I transitioned our small nonprofit off an in-house Exchange server and onto (free) Google Apps. There was some grumbling, ironically from some of the more sophisticated users, who didn't like having to change how they use calendars. Some of the least tech-savvy folks immediately 'got' the gmail interface and started using the built-in chat, and a couple days later folks were experimenting with Google Docs. We still have a couple folks using Outlook, but the Exchange server gets shut down in a couple days, and it's awesome no longer having to worry about maintaining that machine.

The nonprofits that I've seen generally need to cut every financial corner possible, because general operating support funds are very limited. Anything that allows a lower set of operating costs, while improving stability, is a serious win.

We still have some self-hosted databases and customized software, but we're definitely moving in a cloudwise direction. When I leave, they can replace me with someone much cheaper and dumber...

I don't know about the rest of the world, but I can certainly see how the weather looks from here, and it's very cloudy.
posted by Erroneous at 11:24 AM on March 26, 2010 [1 favorite]


This thread is closed to new comments.