Cloud Optimized Storage Solutions: Tiering & Expectations
Part 3 of our ongoing series

Dave Graham's Blog

In Part One of this Cloud Optimized Storage Solutions (COSS) series, we looked at the content being stored on a COSS; in Part Two, we looked at how it is stored.

Storage within the cloud is meaningless without a measurable level of performance against which it can be compared. Since there are no established benchmarks for the performance of storage within a cloud infrastructure, it is reasonable to apply tiering metrics to storage based on content valuation and service level agreements (SLAs), and to use this as an overarching methodology for judging COSS storage capabilities by application set.

Within the concept of Information Lifecycle Management (ILM), there exists the idea that storage can be “tiered” based on criticality of need, application storage performance, or other derived metrics that couple relative responsiveness and bandwidth to serviceability. Layered over the top of this metric is the concept of aggrandized Service Level Agreements that cover compliance, data protection, and data access (to name a few). To bring things into more direct focus, there are four “Tiers” recognized (or promoted) within the storage community: Tier 0, Tier 1, Tier 2, and Tier 3. While these Tiers are arbitrary in nature, they do provide a baseline framework upon which to build a more robust data service model.
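
To give that framework a more concrete shape, here is a minimal Python sketch that models each tier as a record of SLA targets. The latency and availability figures (and the field names) are illustrative assumptions, not vendor or standards numbers; they simply mirror the four-tier breakdown discussed below.

    # A minimal sketch of the four-tier framework as a lookup table.
    # All latency and availability figures are illustrative assumptions.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass(frozen=True)
    class TierSLA:
        name: str
        max_latency_ms: Optional[float]  # None = no latency guarantee
        availability: float              # guaranteed fraction of uptime
        guarantee: str

    TIERS = {
        0: TierSLA("Tier 0", 1.0, 0.99999, "performance + availability"),
        1: TierSLA("Tier 1", 20.0, 0.9999, "performance + availability"),
        2: TierSLA("Tier 2", None, 0.999, "availability"),
        3: TierSLA("Tier 3", None, 0.99, "accessibility"),
    }

    print(TIERS[1].max_latency_ms)  # 20.0 ms, per the sub-20ms Tier 1 guideline below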

The Data Tier Models

Tier 0 & 1: Performance & Availability Guaranteed Storage

Tier 0 is a relatively recent addition to the ILM tiering schema and is based solely on the emergence of Solid State Devices (Fusion I/O, EMC/STEC EFDs, et al.). These storage devices feature sub-millisecond access times coupled with extremely high bandwidth to the storage subsystems, thus driving higher storage access and bandwidth metrics for applications such as Online Transaction Processing (OLTP) and Data Warehousing. Response time is even more critical for these applications than for those in Tier 1.

Tier 1 was originally established to service high-availability, high-bandwidth, low-response-time application and storage needs tied directly to OLTP/OLAP, DSS, data warehousing, and similar workloads. Typically, applications in this Tier have a sub-20ms response time requirement (or best practice) and are more sensitive to latency issues than, say, Tier 2 workloads.
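
To make the Tier 0 versus Tier 1 distinction concrete, the Python sketch below maps an application's response-time requirement onto a tier, using the sub-millisecond and sub-20ms figures cited above. The function name and the fall-through to Tier 2 are assumptions for illustration only.

    # Illustrative mapping from a required response time (in milliseconds)
    # to a storage tier, using the figures cited in the text.
    def tier_for_latency(required_ms: float) -> int:
        if required_ms < 1.0:    # sub-millisecond: solid-state (Tier 0) territory
            return 0
        if required_ms < 20.0:   # sub-20ms: classic Tier 1 OLTP/OLAP/DSS guideline
            return 1
        return 2                 # latency-tolerant workloads fall to Tier 2 or below

    print(tier_for_latency(0.5))    # 0 - e.g., OLTP on EFDs
    print(tier_for_latency(15.0))   # 1
    print(tier_for_latency(250.0))  # 2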

Tier 2: Availability Guaranteed Storage

Tier 2 is commonly referred to as a transitional data tier, largely due to the nature of the data that lives within it. Common data placement within Tier 2 centers on file systems or data that is accessed only occasionally within a fixed window of time (e.g., 30, 60, or 90 days). The tier is decidedly focused and tuned towards making sure that data is “online” and accessible, without stringent latency or performance criteria.
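
One rough way to express that “occasional access within a fixed window” criterion is sketched below in Python. The 90-day default and the function name are assumptions, since the exact window (30, 60, or 90 days) is deliberately left open.

    from datetime import datetime, timedelta

    # Sketch of the Tier 2 placement rule: data last accessed within a fixed
    # window (30/60/90 days) stays "online" in Tier 2. The 90-day default is
    # an assumption for illustration.
    def belongs_in_tier2(last_access: datetime, window_days: int = 90) -> bool:
        return datetime.utcnow() - last_access <= timedelta(days=window_days)

    print(belongs_in_tier2(datetime.utcnow() - timedelta(days=45)))   # True
    print(belongs_in_tier2(datetime.utcnow() - timedelta(days=400)))  # False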

Tier 3: Accessibility Guaranteed Storage

The last noted Tiering level for data is Tier 3, which can best be described as an archive level for stagnant or “stale” data that is accessed infrequently. Historically, Tier 3 has been policy driven; that is, it has been the recipient of data migration movements rather than a primary storage Tier for end-user access. Within the COSS environment, however, Tier 3 becomes as crucial a storage target as Tier 1 or 2, simply due to the preponderance of unstructured data within the cloud space.
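
The policy-driven movement described above can be sketched as a simple demotion pass in Python. The catalog structure, object names, and 90-day window are hypothetical; the point is only to illustrate stale data being migrated from Tier 2 into the Tier 3 archive by policy rather than by end-user action.

    from datetime import datetime, timedelta
    from typing import Dict

    # Hypothetical demotion policy: objects untouched for longer than the
    # window are assigned to the Tier 3 archive; everything else stays in Tier 2.
    def demote_stale_objects(catalog: Dict[str, datetime],
                             window_days: int = 90) -> Dict[str, int]:
        cutoff = datetime.utcnow() - timedelta(days=window_days)
        return {name: (3 if last_access < cutoff else 2)
                for name, last_access in catalog.items()}

    catalog = {"q1_report.pdf": datetime.utcnow() - timedelta(days=300),
               "orders.db": datetime.utcnow()}
    print(demote_stale_objects(catalog))  # {'q1_report.pdf': 3, 'orders.db': 2}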

Expectations of Data Tiering

The somewhat open-ended definitions of these data tiering levels are purposeful. In defining the principles of a COSS, there is an inherent need to keep them somewhat fluid, especially as content continues to change and grow more complex. Additionally, while unstructured data currently holds the majority stake in the cloud, there is no reason why structured data (and its associated programmatic hooks and layers) cannot regain ground. As stated previously, these data tiers, while arbitrary, still provide an essential top-down view of how data can be categorized when planning for a COSS implementation. As an extension of data tiering, it’s also important to understand how global and particular Service Level Agreements (SLAs) can and will affect the data stored on a COSS.

Author's Notes:

  • ILM, to the best of my knowledge, is not an EMC-designed concept. Whether storage tiers existed before EMC popularized the term is debatable and, in any case, inconsequential to this discussion.
  • Disclaimer - The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by EMC and does not necessarily reflect the views and opinions of EMC.
About Dave Graham
Dave Graham is a Technical Consultant with EMC Corporation, where he focuses on designing and architecting private cloud solutions for commercial customers.
