Cloud Optimized Storage Solutions: Tiering & Expectations
Part 3 of our ongoing series

Dave Graham's Blog

In Part One of this Cloud Optimized Storage Solutions (COSS) series, we took a look at the content being stored on COSS and in Part Two at how it is stored.

Storage within the cloud is meaningless without a measurable level of performance against which it can be compared. Since there are no established benchmarks for the performance of storage within a cloud infrastructure, it is reasonable to apply tiering metrics based on content valuation and service level agreements (SLAs), and to use these as an overarching methodology for judging COSS storage capabilities by application set.

Within the concept of Information Lifecycle Management (ILM), there exists the idea that storage can be “tiered” based on criticality of need, application storage performance, or other derived metrics that couple relative responsiveness and bandwidth to serviceability. Layered on top of this metric is the concept of aggrandized Service Level Agreements that cover compliance, data protection, and data access (to name a few). To bring things into more direct focus, there are four “Tiers” recognized (or promoted) within the storage community: Tier 0, Tier 1, Tier 2, and Tier 3. While it is worth noting that these Tiers are arbitrary in nature, they do provide a baseline framework upon which to build a more robust data service model.
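To make the four-tier framework concrete, the tiers can be modeled as a small data structure. This is a minimal sketch only; the latency and availability figures below are illustrative assumptions, not industry-standard numbers, and the `StorageTier` type is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class StorageTier:
    name: str
    max_latency_ms: float   # target response time (illustrative)
    availability: float     # fraction of uptime promised by the SLA (illustrative)
    description: str

# Illustrative tier definitions; the exact figures are assumptions
TIERS = [
    StorageTier("Tier 0", 1.0,    0.99999, "Solid state, performance-critical (OLTP)"),
    StorageTier("Tier 1", 20.0,   0.9999,  "High availability, low response time"),
    StorageTier("Tier 2", 100.0,  0.999,   "Transitional: online but latency-relaxed"),
    StorageTier("Tier 3", 1000.0, 0.99,    "Archive for stale, infrequently accessed data"),
]
```

Keeping tier definitions as data rather than hard-coded logic makes it easier to adjust thresholds as content changes, which matters given how fluid these tier boundaries are in practice.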

The Data Tier Models

Tier 0 & 1: Performance & Availability Guaranteed Storage

Tier 0 is a relatively recent addition to the ILM tiering schema and is based solely on the emergence of Solid State Devices (Fusion I/O, EMC/STEC EFDs, et al.). These storage devices feature sub-millisecond access times coupled with extremely high bandwidth to the storage subsystems, thus driving a higher level of storage access and bandwidth metrics for applications such as Online Transaction Processing (OLTP) and Data Warehousing. The criticality of response to these applications is of a higher priority than Tier 1.

Tier 1 was originally established to service high-availability, high-bandwidth, low-response-time application and storage needs that tie directly to OLTP/OLAP, DSS, data warehousing, and similar workloads. Typically, applications in this Tier have a sub-20ms response time requirement (or best practice) and are more sensitive to latency issues than, perhaps, Tier 2 workloads.
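The Tier 0/Tier 1 distinction above is essentially a latency cutoff, which can be expressed as a simple classification function. This is a sketch: the sub-millisecond and sub-20ms thresholds come from the discussion above, but the function itself is hypothetical.

```python
def classify_by_latency(p99_latency_ms: float) -> str:
    """Map an observed 99th-percentile response time to the
    performance-guaranteed tiers discussed above (illustrative)."""
    if p99_latency_ms < 1.0:      # sub-millisecond: solid-state (Tier 0) territory
        return "Tier 0"
    if p99_latency_ms < 20.0:     # the sub-20ms Tier 1 best practice
        return "Tier 1"
    return "Tier 2 or below"
```

For example, a workload measuring 0.3ms at the 99th percentile would land in Tier 0, while one measuring 5ms would land in Tier 1.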

Tier 2: Availability Guaranteed Storage

Tier 2 is commonly referred to as a transitional data tier, largely because of the nature of the data that lives within it. Data commonly placed in Tier 2 centers on file systems or data whose change rate is driven by occasional access within a fixed window of time (e.g. 30, 60, or 90 days). The tier is decidedly focused and tuned toward keeping data “online” and accessible, without stringent latency or performance criteria.
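The fixed-window placement rule for Tier 2 can be sketched as a simple policy check. The 90-day default below is one of the windows mentioned above; the function name and signature are hypothetical.

```python
from datetime import datetime, timedelta

def belongs_in_tier2(last_access, window_days=90, now=None):
    """True when data was last touched within the fixed window
    (e.g. 30/60/90 days) that characterizes Tier 2 placement.
    `now` may be injected for testing; defaults to the current time."""
    now = now or datetime.utcnow()
    return now - last_access <= timedelta(days=window_days)
```

Data falling outside the window would be a candidate for demotion to Tier 3 under the kind of policy-driven migration described below.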

Tier 3: Accessibility Guaranteed Storage

The last noted Tiering level for data is Tier 3. Tier 3 can best be described as an archive level for stagnant or “stale” data that is infrequently accessed. Historically, Tier 3 has been policy driven, that is, the recipient of data migrations rather than a primary storage Tier for end-user access. Within the COSS environment, however, Tier 3 becomes as crucial a storage target as Tier 1 or 2, simply due to the large preponderance of unstructured data within the cloud space.
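The policy-driven character of Tier 3, where data arrives by migration rather than direct placement, might be sketched as a selection pass over object access times. The 180-day staleness threshold and the function name are illustrative assumptions, not part of any particular product.

```python
from datetime import datetime, timedelta

def select_for_archive(last_access_by_key, stale_after_days=180, now=None):
    """Return the keys of objects whose last access is older than the
    staleness threshold and which are therefore candidates for
    policy-driven migration to Tier 3. The 180-day default is an
    illustrative assumption."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=stale_after_days)
    return [key for key, last_access in last_access_by_key.items()
            if last_access < cutoff]
```

A migration engine would run a pass like this periodically and move the selected objects down-tier, which is why Tier 3 is traditionally a destination rather than a tier applications write to directly.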

Expectations of Data Tiering

The somewhat open-ended definitions of these data tiering levels are purposeful. In defining the principles of a COSS, there is an inherent need to keep them somewhat fluid, especially as content continues to change and grow more complex. Additionally, while unstructured data currently holds the majority stake in the cloud, there is no reason why structured data (and its associated programmatic hooks and layers) cannot regain ground. As stated previously, these data tiers, while arbitrary, still provide an essential top-down view of how data can be categorized when planning a COSS implementation. As an extension of data tiering, it is also important to understand how global and particular Service Level Agreements (SLAs) can and will affect the data stored on COSS.

Author's Notes:

  • ILM, to the best of my knowledge, is not an EMC-designed concept. That storage tiers existed before EMC popularized the term is indisputable and inherently inconsequential to this discussion.
  • Disclaimer - The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by EMC and does not necessarily reflect the views and opinions of EMC.
About Dave Graham
Dave Graham is a Technical Consultant with EMC Corporation, where he focuses on designing and architecting private cloud solutions for commercial customers.
