Cloud Optimized Storage Solutions: Tiering & Expectations
Part 3 of our ongoing series

Dave Graham's Blog

In Part One of this Cloud Optimized Storage Solutions (COSS) series, we took a look at the content being stored on COSS and in Part Two at how it is stored.

Storage within the cloud is meaningless without a measurable level of performance against which it can be compared. Since there are no established benchmarks for measuring storage performance within a cloud infrastructure, it is reasonable to apply tiering metrics to storage based on content valuation and service level agreements (SLAs), and to use this as an overarching methodology for judging COSS storage capabilities by application set.

Within the concept of Information Lifecycle Management (ILM), there exists the idea that storage can be “tiered” based on criticality of need, application storage performance, or other derived metrics that couple relative responsiveness and bandwidth to serviceability. Layered on top of these metrics is the concept of aggrandized Service Level Agreements that cover compliance, data protection, and data access (to name a few). To bring things into more direct focus, four “Tiers” are recognized (or promoted) within the storage community: Tier 0, Tier 1, Tier 2, and Tier 3. While these Tiers are arbitrary in nature, they do provide a baseline framework upon which to build a more robust data service model.
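To make that framework easier to reason about, here is a minimal sketch (an editorial illustration, not from the original article) that models the four tiers as a simple data structure. The latency ceilings echo the figures quoted in the sections below; the descriptions are paraphrases and the structure itself is an assumption, not a published standard:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class StorageTier:
    name: str
    max_latency_ms: Optional[float]  # None = no strict latency target
    description: str

# The sub-millisecond and sub-20 ms ceilings come from the sections
# below; Tier 2 and Tier 3 deliberately carry no latency target.
TIERS = [
    StorageTier("Tier 0", 1.0, "Solid-state devices; sub-millisecond response"),
    StorageTier("Tier 1", 20.0, "High-availability OLTP/OLAP/DSS workloads"),
    StorageTier("Tier 2", None, "Online, occasionally accessed file data"),
    StorageTier("Tier 3", None, "Archive for stagnant or 'stale' data"),
]
```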

The Data Tier Models

Tier 0 & 1: Performance & Availability Guaranteed Storage

Tier 0 is a relatively recent addition to the ILM tiering schema and is based solely on the emergence of solid-state devices (Fusion-io, EMC/STEC EFDs, et al.). These devices feature sub-millisecond access times coupled with extremely high bandwidth to the storage subsystems, driving higher storage-access and bandwidth metrics for applications such as Online Transaction Processing (OLTP) and data warehousing. For these applications, responsiveness is even more critical than it is for Tier 1.

Tier 1 was originally established to service high-availability, high-bandwidth, low-response-time application and storage needs tied directly to OLTP/OLAP, DSS, data-warehousing, and similar workloads. Applications in this Tier typically carry a sub-20ms response-time requirement (or best practice) and are more sensitive to latency issues than, say, Tier 2 workloads.
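A minimal sketch of how those response-time requirements might drive tier selection, assuming the sub-millisecond (Tier 0) and sub-20 ms (Tier 1) thresholds quoted above; the function name and the fallback label are hypothetical:

```python
def select_performance_tier(required_latency_ms: float) -> str:
    """Map an application's response-time requirement to a tier.

    Thresholds follow the figures in the text: Tier 0 serves
    sub-millisecond workloads, Tier 1 targets sub-20 ms workloads,
    and anything slower can tolerate a lower tier.
    """
    if required_latency_ms < 1.0:
        return "Tier 0"
    if required_latency_ms < 20.0:
        return "Tier 1"
    return "Tier 2 or below"

# e.g. an OLTP workload with a 0.5 ms budget lands on Tier 0:
print(select_performance_tier(0.5))   # -> Tier 0
print(select_performance_tier(15.0))  # -> Tier 1
```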

Tier 2: Availability Guaranteed Storage

Tier 2 is commonly referred to as a transitional data tier, owing to the nature of the data that lives within it. Data placed in Tier 2 typically centers on file systems or data whose change rate reflects occasional access within a fixed window of time (e.g., 30, 60, or 90 days). The tier is decidedly focused and tuned toward keeping data “online” and accessible without stringent latency or performance criteria.
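As a hedged illustration of that fixed-window idea, a small helper (hypothetical name; the 90-day default is one of the example windows above) that tests whether data still belongs in Tier 2:

```python
from datetime import datetime, timedelta

def within_tier2_window(last_access: datetime, window_days: int = 90) -> bool:
    """True if data was accessed within the fixed window of time.

    The window length is a policy knob; the article cites 30-, 60-,
    and 90-day windows as typical examples.
    """
    return datetime.now() - last_access <= timedelta(days=window_days)
```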

Tier 3: Accessibility Guaranteed Storage

The last noted tiering level for data is Tier 3, which can best be described as an archive level for stagnant or “stale” data that is accessed infrequently. Historically, Tier 3 has been policy-driven; that is, it has been the recipient of data migration movements rather than a primary storage tier for end-user access. Within the COSS environment, however, Tier 3 becomes as crucial a storage target as Tier 1 or 2, simply due to the preponderance of unstructured data within the cloud space.
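Since Tier 3 placement has historically been policy-driven, here is a minimal sketch of such a demotion policy; the catalog feed and function name are assumptions, and the cutoff simply inverts the Tier 2 access window described above:

```python
from datetime import datetime, timedelta
from typing import Iterable, Iterator, Tuple

def archive_candidates(
    catalog: Iterable[Tuple[str, datetime]], window_days: int = 90
) -> Iterator[str]:
    """Yield names of objects whose last access predates the Tier 2 window.

    'catalog' is a hypothetical iterable of (name, last_access) pairs,
    e.g. drawn from a COSS metadata store; objects outside the window
    become candidates for policy-driven migration to Tier 3.
    """
    cutoff = datetime.now() - timedelta(days=window_days)
    for name, last_access in catalog:
        if last_access < cutoff:
            yield name
```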

Expectations of Data Tiering

The somewhat open-ended definitions of these data tiering levels are purposeful. In defining the principles of a COSS, there is an inherent need to keep them somewhat fluid, especially as content continues to change and grow more complex. Additionally, while unstructured data currently holds the majority stake in the cloud, there is no reason why structured data (and its associated programmatic hooks and layers) cannot regain ground. As stated previously, these data tiers, while arbitrary, still provide an essential top-down view of how data can be categorized when planning a COSS implementation. As an extension of data tiering, it is also important to understand how global and particular Service Level Agreements (SLAs) can and will affect the data stored on COSS.

Author's Notes:

  • ILM, to the best of my knowledge, is not an EMC-designed concept. Whether or not storage tiers existed before EMC popularized the term is debatable and, in any case, inconsequential to this discussion.
  • Disclaimer: The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by EMC and does not necessarily reflect the views and opinions of EMC.
About Dave Graham
Dave Graham is a Technical Consultant with EMC Corporation, where he focuses on designing and architecting private cloud solutions for commercial customers.
