If you’ve read my previous articles on using RightScale to manage Windows Azure cloud infrastructure, you've made sense of your Software Development Lifecycle (SDLC) and automated your development and test environment deployments with Windows Azure virtual machines and RightScale. But that's not the last step in successfully rolling out ongoing updates. In this third and final article, I'll show you how to leverage the cloud to perform environment- and configuration-level testing so you can stay ahead of the curve with the latest releases of the technologies you depend on.
Credentials within RightScale offer a convenient way to store sensitive or frequently used values in an easily managed interface. A RightScale credential consists of a Name, Value, and Description, and is intended to be used as an input for servers. Credentials are accessible via the RightScale dashboard as name/value pairs, and the value is passed into a script or other object just as any other text input would be.
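To make this concrete, here is a minimal sketch of how a script might consume a credential-backed input. It assumes the input is exposed to the script as an environment variable named DB_PASSWORD; that variable name and the placeholder default are illustrative, not from the article.

```shell
#!/bin/bash -e
# Sketch: consuming a credential-backed input in a server script.
# Assumes the platform exposes the input as the environment variable
# DB_PASSWORD (the name here is a hypothetical example).

# Fall back to a placeholder when the input is not set, so the script
# can be dry-run outside the platform.
: "${DB_PASSWORD:=changeme}"

# Use the value without echoing the secret itself into the logs.
echo "Configuring database connection (password length: ${#DB_PASSWORD})"
```

Because the credential value arrives like any other text input, the script stays identical whether the operator types the value by hand or selects a stored credential in the dashboard.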
Are you still trying to decide how to move your organization into the cloud? Do you already have servers in the cloud but can’t figure out how to best utilize them? At RightScale, we have these conversations with prospective customers all the time. I've outlined three of the most beneficial aspects of the RightScale solution that our customers use to manage their cloud deployments more quickly and efficiently.
Where Is Cloud Computing Going?
First, cloud computing management is evolving along two axes: automation (programming your apps and infrastructure to run themselves) and abstraction (working across heterogeneous OSes and clouds).
A recent O’Reilly Radar article on big data in the cloud included a mention of RightScale and its ability to orchestrate server management across multiple clouds. It prompted me to think about the role of RightScale in supporting the implementation of big data solutions for our users. But first, what exactly is big data? Edd Dumbill, a contributor to O’Reilly Radar, offers this explanation:
“Big data is data that exceeds the processing capacity of conventional database systems [because it] is too big, moves too fast, or doesn't fit the structures of your database architectures.”
A commonly accepted way to identify big data is to determine whether it meets any of these three key criteria:

- Volume: the data is too big to store and process with conventional database systems.
- Velocity: the data arrives or changes too fast for those systems to keep up.
- Variety: the data doesn't fit the structures of traditional database architectures.
When a systems or software engineer is learning a new language, a wealth of examples to learn from is invaluable. To many, the cloud currently feels like a new software language: new constructs, better tools, rewritten rules. RightScale has always provided training to help people jump into this new world, and this release continues that education.
In it, you will find concrete examples of how to maintain advanced database architectures in the cloud, how to auto-scale Windows .NET applications, and even how to move database information between clouds. With your feedback, these examples will become production solutions that you can extend and modify. Before you know it, you'll be a sysadmin rock star for your organization, and people will wonder how you accomplish such magic.
Don't hold back your secrets. ;-)