Back in 2003, I thought I had a bright idea.
I had been taking ever-increasing numbers of digital photos without hard copies. I had also been storing more and more documents on my computer, and I started to worry about 'the fire' scenario: even if I backed up regularly, all of my most valuable digital documents and memories could disappear in an instant.
I also hated the process of performing a manual backup. It was tedious and time-consuming, and at the time external 2.5 inch hard drives were too expensive, as were flash drives, so backups were performed to CDs or DVDs. I was paranoid about the quality of the media, so I would often double up the backup, and then whenever new files came in I had to remember which discs held what.
Which led me to my idea: transparently mirror one or more of my folders to a server on the Internet. All files would be streamed using my unused upload bandwidth and encrypted for transport. The server would instantly distribute the data to multiple physical locations, so the data would survive a simultaneous fire or disaster in n-1 of those locations. The software would be a simple install, invisible in normal use but open about what it was sending and receiving and which files were still pending distribution.
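For what it's worth, the change-detection half of that client is small enough to sketch. This is a hypothetical illustration, not anything I actually built: content hashes stand in for a real sync protocol, and all the names below are made up.

```python
import hashlib
from pathlib import Path

def snapshot(folder):
    """Hash every file under `folder` so changed files can be detected later."""
    state = {}
    for path in Path(folder).rglob("*"):
        if path.is_file():
            state[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()
    return state

def pending_uploads(old, new):
    """Files that are new or whose content hash changed since the last sync.

    In the imagined client, these are the files queued for encrypted upload
    during idle bandwidth.
    """
    return [p for p, h in new.items() if old.get(p) != h]
```

A background loop would simply take a fresh snapshot every few minutes, diff it against the last one, and feed the result to the uploader.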
The business model would be to charge a yearly subscription fee based on the amount of data stored.
The theory was that as more and more people started carrying digital cameras and digital video cameras, the amount of data to be backed up would increase exponentially. My pitch was "Digital Insurance". I was then, as I am now, sure that eventually the majority of Internet users would have one form of digital insurance or another. Not having it would be akin to not having house insurance. Risking all of your irreplaceable memories to save a nominal yearly fee for a zero-complexity service is simply not rational once adoption hits the tipping point.
There was one problem: I didn't own a huge server farm, and I couldn't find a way to make the idea work without a huge investment of capital, which frankly I had no experience in raising. The client and server software were relatively trivial to build; it was the infrastructure that was the sticking point. Eventually I gave up on the idea, as I felt that one day someone would implement the exact same service for me (albeit without me becoming a massive multi-billionaire by virtue of having had the idea).
At the time of giving up, I strongly felt that the best fit for this kind of service would be Google, thanks to their infrastructure. Years passed, and a few services appeared offering online backup in a way very similar to my original idea. Carbonite appeared, and there are others, but Google remained quiet. Carbonite is a strange service: it only backs up data on internal drives and will not back up from external USB drives. I store a lot of data on external USB drives, so it immediately excludes me as a customer.
I heard rumours about something called GDrive a few years back and I was sure Google was going to announce their version of an online backup service. But no news. Nothing.
What brought this all back to me was today's announcement that 20GB of storage on the Google service is now just $5 a year, and 80GB is just $20 a year.
Google Storage Upgrade
Unfortunately, this is just a file container for pictures and emails, not a synchronized network/local drive.
The cost is finally in the ballpark of what an online backup system should be charging. If the infrastructure is finally available to store data in these volumes, then GDrive cannot be far away. I suspect the software will treat the online copy of photos and documents as the master, and will allow users to stream files directly from the network drive from any computer, with any computer being able to add files to the drive (given the right login credentials).
The service must negate the possibility of user error when asking users to store all of their personal data in the cloud.
It is critically important that customers not forget their password or lose their USB thumb-key authentication, as any password-replacement policy implies that the photos are decryptable on the server side, which should not be tolerated.
Key management is the hardest part of the entire service and it will be interesting to find out how Google intends to deal with this.
If the data can be decrypted on the server side without your key phrase/password/thumb-stick/whatever, then all your private documents and photos are readable to Google and possibly to others online. Only if your documents and photos are entirely scrambled on the server side will users adopt the service. Security and key management are paramount.
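To make that concrete, here is a toy sketch of what client-side scrambling means: the key is derived from a passphrase that never leaves the machine, so the server only ever holds salt, nonce and ciphertext, none of which is enough to decrypt. (The stream cipher below is a deliberately simple stand-in for illustration only; a real client would use a vetted cipher like AES, and every function name here is made up.)

```python
import hashlib, os

def derive_key(passphrase, salt):
    # Stretch the passphrase into a key. The passphrase never leaves the client.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream(key, nonce, length):
    # Toy SHA-256 counter-mode keystream; a stand-in for a real cipher.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase, plaintext):
    salt, nonce = os.urandom(16), os.urandom(16)
    key = derive_key(passphrase, salt)
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    # Only salt, nonce and ciphertext are ever uploaded to the server.
    return salt, nonce, ciphertext

def decrypt(passphrase, salt, nonce, ciphertext):
    key = derive_key(passphrase, salt)
    return bytes(c ^ k for c, k in zip(ciphertext, keystream(key, nonce, len(ciphertext))))
```

The flip side is exactly the problem described above: lose the passphrase and decryption is impossible for everyone, the server operator included.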
Possibly, a copy of the key or password should be entrusted to a third party. If a thumb-key is required, then a backup of the key in a separate physical location is also required; otherwise the fire risk remains, because if the key is lost and Google does not hold a copy, you lose both your physical copy and your online copy (as it is scrambled). This is a real sticking point in my mind, and possibly one of the reasons this service is taking so long. Username-and-password authentication is too weak for this service, yet this service is unique in that all the data on the server MUST be scrambled, meaning that a user who forgets their password or security information potentially loses everything, which would hit the news and hit profits.
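The third-party escrow idea can be sketched with the simplest possible key split: generate a random share for the user and XOR it with the key to produce a share for the trusted third party. Either share alone reveals nothing about the key; both together recover it exactly. (A hypothetical illustration; a real scheme would use proper threshold secret sharing so that, say, any 2 of 3 shares suffice.)

```python
import os

def split_key(key):
    """Split `key` into two shares: one kept by the user, one escrowed
    with a trusted third party. Each share alone is indistinguishable
    from random bytes."""
    share_user = os.urandom(len(key))
    share_escrow = bytes(s ^ k for s, k in zip(share_user, key))
    return share_user, share_escrow

def recover_key(share_user, share_escrow):
    # XOR-ing the two shares back together reconstructs the original key.
    return bytes(a ^ b for a, b in zip(share_user, share_escrow))
```

With a split like this, neither a house fire nor a dishonest escrow agent alone is fatal: recovery needs both shares, and loss of one share can be survived only if the other party cooperates.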
To try to round off this ramble, online backup has been a long time coming but the technology is ready now. The pieces are in place and I look forward to being able to forget about backup and to be able to access my data from anywhere in a secure way.
Could GDrive finally be arriving soon? $20 a year for a backup of all my precious memories? Yes please.