Remote file management
November 6, 2013 5:12 AM   Subscribe

I've been tasked to find a solution for the following: We make software for different kinds of machines. The machines are all controlled from Windows PCs connected to each machine individually. Every developer who has to test something copies their newly built software on a flash drive, walks out to the lab, makes a backup of what was running on the PC connected to that particular machine, puts in their own software, runs the tests, and in the end copies everything back like it was before.

We'd like some kind of central storage of software builds (collection of files), where a developer can log on and make backups of a file set on a PC or deploy new files to a PC remotely. Does any kind of system like this exist already? This is partly for convenience but mostly to keep an eye on what's deployed where and reduce the scope for human error (mistakenly copying the wrong files etc).
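[A minimal sketch of the "backup and deploy new" step described above, assuming each control PC's software directory is reachable as a path (the directory names here are hypothetical, not from the question):]

```python
# Hypothetical sketch: back up a machine's current file set to a timestamped
# folder, then replace it with a freshly built file set. All paths are
# illustrative placeholders.
import shutil
from datetime import datetime
from pathlib import Path

def backup_and_deploy(target_dir: str, build_dir: str, backup_root: str) -> Path:
    """Back up target_dir to a timestamped folder under backup_root,
    then replace its contents with build_dir. Returns the backup path."""
    target = Path(target_dir)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    backup = Path(backup_root) / f"{target.name}-{stamp}"
    shutil.copytree(target, backup)     # 1. back up what is running now
    shutil.rmtree(target)               # 2. clear the old files
    shutil.copytree(build_dir, target)  # 3. deploy the new build
    return backup
```

[Keeping the backups under one `backup_root` on central storage is what gives you the audit trail of "what was deployed where, and when".]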
posted by Harald74 to Computers & Internet (6 answers total) 2 users marked this as a favorite
 
If you're using version control software, it's relatively straightforward to set up a build server with automatic deployments. We use TeamCity for that, but you'd want to get familiar with the software before the actual implementation. They have a free version if you want to evaluate it.
posted by blue_beetle at 5:30 AM on November 6, 2013


Response by poster: We're using Perforce and Jenkins for VCS and automatic builds. We don't really want to automatically deploy anything on the control PCs, though, as developers have to check that the machine is free for their use, and then trigger the "backup and deploy new" script.
posted by Harald74 at 5:42 AM on November 6, 2013


Is there a reason the software can't run directly from the flash drive, or in different directories, so you don't have to back up and restore the files every time?

Also I was thinking Jenkins tasks (or some other plugin) could be used to deploy files, maybe the "Publish Over CIFS Plugin"
posted by RobotVoodooPower at 6:11 AM on November 6, 2013


That almost sounds like my previous job.

We originally used the "bundle of files" approach as well, but our testing was hugely simplified when we implemented a proper installer for our software. We used WiX to reduce our software build to a single MSI file. This made installation pretty much impossible to screw up. (Admittedly it took a while before the installer worked correctly.)

Jenkins served as our central build repository. We configured it to archive the MSI file after every build, which gave us a nice list of downloads, each annotated with the corresponding commit message.

We didn't bother with backups as we never modified anything manually -- if you wanted to restore an older build, you could just fetch the old MSI from Jenkins and re-install it.
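[The re-install step above boils down to one `msiexec` invocation. A hedged sketch of how a deploy script might build that command line (file paths are made up for illustration):]

```python
# Sketch: construct a silent msiexec install command with verbose logging.
# /i = install, /qn = no UI, /l*v = verbose log. Paths are hypothetical.
def msiexec_install_cmd(msi_path: str, log_path: str) -> list[str]:
    return ["msiexec", "/i", msi_path, "/qn", "/l*v", log_path]

# e.g. pass the result to subprocess.run(...) on the target PC:
# subprocess.run(msiexec_install_cmd(r"C:\builds\app-1.2.msi",
#                                    r"C:\logs\install.log"), check=True)
```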

(Running from a flash drive was not an option for us -- we needed to install custom Windows drivers.)
posted by reynaert at 6:44 AM on November 6, 2013


Use some sort of virtual machine (VirtualBox is free and useful) on the PC hooked to each machine. Save off the default image of that machine. When you need to test, install the software and run tests. To reset, just restore the VM from the saved image. This process can be run remotely as well.

I'd definitely recommend setting up an MSI project using WIX as well. Windows Installer service is both a boon and a nightmare, but it does serve the purpose of wrapping a set of operations into an atomic transaction. This could also be automated.

In the end, then, you have a push button Jenkins task that will restore/boot a clean VM image to a VM. A second job (?) to push an MSI to the restored VM and run msiexec against it. A third job to kick off your test suite and do reporting. Then re-run the clean VM task to reset your environment.

We actually use BladeLogic (in the past, Altiris) to do these sorts of operations, with triggers in Jenkins jobs associated with our builds. This may be overkill for your needs. Other tasks are push-button jobs. There's a fair bit of homegrown architecture around these tools that has developed over the last five years or so as we continue to automate what was previously a lot of manual labor. Devoting some dev time to these sorts of problems actually increases efficiency and quality over the long term.

I'm an ESM/SCM guy who actually likes this "latrine duty" stuff. Until this job I'd never been in a shop that takes any of these things seriously, and the night/day difference in productivity, support, and quality in this place has been revelatory.
posted by Fezboy! at 9:59 AM on November 6, 2013 [1 favorite]


Response by poster: OK, thanks for your input. I'll update the question later when we've decided which way to go.
posted by Harald74 at 6:42 AM on November 12, 2013

