Interoperability has not been a huge focus in the quickly emerging cloud computing space. Other than “we support interoperability” statements from the larger cloud computing providers, there is no detailed plan to be seen. I’ve brought it up several times at cloud user group meetings, with clients, and at vendor briefings, and I often feel like I’m the kid in class who reminds the teacher to assign homework.
Data interoperability is not that hard. You’re dealing with a few key concepts, such as semantic interoperability, or the way that data is defined and stored on one cloud versus another. Also, you need to consider the notions of transformation and translation, so the data appears native when it arrives at the target cloud, or clouds, from the source cloud (or clouds). Don’t forget to add data governance and data security to the mix; you’ll need those as well.
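To make the transformation-and-translation idea concrete, here is a minimal sketch in Python. The schemas and field names are hypothetical (no real provider's API is shown): the source cloud stores a customer record with a `cust_name` string and a Unix-epoch `created` timestamp, while the target cloud expects `customerName` and an ISO-8601 `createdAt`. A field-level mapping captures the semantic differences, and a translate step makes the record appear native to the target.

```python
# Hypothetical example: translating a record from one cloud's schema to
# another's. Field names and formats are assumptions for illustration only.
from datetime import datetime, timezone

# Semantic mapping: source field -> (target field, value converter)
FIELD_MAP = {
    "cust_name": ("customerName", str),
    "created": ("createdAt",
                lambda ts: datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()),
}

def translate(record: dict) -> dict:
    """Translate a source-cloud record so it appears native on the target cloud."""
    out = {}
    for src_field, value in record.items():
        if src_field in FIELD_MAP:
            target_field, convert = FIELD_MAP[src_field]
            out[target_field] = convert(value)
    return out

source_record = {"cust_name": "Acme Corp", "created": 1262304000}
print(translate(source_record))
# {'customerName': 'Acme Corp', 'createdAt': '2010-01-01T00:00:00+00:00'}
```

In a real system the mapping itself is the hard part to standardize; the governance and security controls mentioned above would wrap around a step like this, deciding which fields may move between clouds at all.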
There has been some talk of concepts such as the Intercloud, or a data exchange system running between major cloud computing providers. Also, a few cloud standards organizations, such as the Open Cloud Consortium, are looking to drive some interoperability standards, including a group working on standards and interoperability for “large data clouds.”
So how do we get down the path to data interoperability for the clouds? Don’t create yet another standards organization to look at this by committee. They take too long, and this is something that’s needed in 2010 to drive cloud computing adoption. Instead, the larger cloud computing providers should focus on this behind the scenes and create a working standard, along with the enabling technology, to solve the data interoperability problem. If the larger providers are all on the same page, believe me, the smaller providers will quickly follow.