Tech Expectations

A deeper look at disruptive business and personal technology

VMware should acquire EMC

I’m kicking myself for not completing the thought a year ago. But Arik Hesseldahl beat me to it with a great article about the EMC board thinking about a “downstream merger.” Mr. Chris Evans followed with a bit more industry color. Chris’s post inspired me to add a few more quick points, to avoid the “stuck blog” trap.

  • VMware has a more valuable and differentiated position in the data center. When folks think private cloud and future infrastructure, VMware has to be in the picture. EMC storage is one of many options.
  • Software and the public cloud are the future, not hardware appliances. VMware is in a better position to operate as a pure software company with optional appliances, not the other way around. Go-to-market strategies (e.g. selling hardware, giving away software) and everything supporting them are very difficult, if not impossible, to change.
  • Finally, while VMware still accounts for only 25% of the revenue of the “EMC Federation,” it is quickly becoming the profit engine, as seen below in the net income results over the last ten quarters.

I’ll be watching very closely to see what the EMC board decides!

Big Data, Big Infrastructure, And You

Superlatives are the norm when talking about Big Data. We all see Big predictions of the Big numbers and High growth of petabytes, connected devices, data containers, etc., and the Tidal changes they bring.

All true!

But I think the “Big” adjective best applies to Complexity. How do IT shops manage all these 1s and 0s and set the table for meaningful analytics?

At the most basic level, there are two approaches, seemingly contradictory, for laying the analytics foundation: consolidate data onto fewer platforms, or analyze it selectively across platforms in a decentralized fashion. These actually complement one another well as part of an iterative data management model in which IT continuously re-positions data sets to support evolving analytics needs.

Let’s examine each, then look at the squishy middle, where most companies will reside.

How much data does x store?


Being in the tech infrastructure industry, I often get the question, “How big is that service?” or “How much does x store?” Here is where I will keep track.

(updated 3/28/15)

Some interesting tidbits:

The list so far:

Photo “Warehouse” by Erik Söderström
Photo by r2Hox

2015 Data storage market review: continued disruption by flash, SDS, and cloud

(Updated 3/23 – version 3)

Twinstrata, Maginatics, and Amplidata get acquired. Riverbed exits the storage business. DataGravity and Primary Data launch. HGST and Seagate continue to move into the systems business. Nutanix, SimpliVity, Cleversafe, and Scality form alliances with global systems vendors like Dell, HP, and Cisco. Microsoft opens up its Office 365 ecosystem to other cloud storage providers like Dropbox. Qumulo and several stealth companies are continuing to raise millions of dollars and not telling us what they are doing. Box goes public (finally), the first cloud storage company to do so, and continues to trip up like Mr. Bean. Veritas, arguably the granddaddy of software-defined storage, returns as its own company. Storage unicorns run amok, with SimpliVity just joining the club. And this year, we’ll finally get a look at Amazon Web Services financials instead of just clever guessing. Ho hum, just another few months in the data storage market.

Collaboration: WTF is it, really? An attempt to map the Collaboration Market

OK, I’m expressing a little frustration. Every time I see the term “collaboration,” I shudder. Why? The term is actually worse than “cloud” or “big data” or other terms that end up obscuring the products and vendors in a market. Ask five people what they mean when they say “collaboration,” and you will end up with five completely different definitions. Am I taking this personally? Well, yes. I am trying to collaborate every day. I am creating content that needs to get out to peers and the public. I am trying to get the word out to colleagues in the UK, France, Germany, Belgium, Singapore, Tokyo, down the hall, next door, at the desk next to me – and failing. Or at the very least, spending about five times as much time as I think I should be.


Your Enterprise Tech cheat sheet

As you speculate on who’s buying whom, who’s breaking up, who’s on the rise, and who’s in trouble, consider the Enterprise Technology stack. Hats off to Geoffrey Moore and the EMC M&A team who first taught me to think about the tech landscape in this way.

Technology markets are often controlled by “gorillas,” or market leaders in particular parts of the stack. Consider the stranglehold Microsoft had for a long time in computer operating systems (dramatically reduced by Linux), or the continued dominance of Oracle in business databases (gradually being challenged by open source and the Cloud). Showing who has what in a stack orientation allows you to quickly scan for gaps in a portfolio (that may need to be filled), alliances that may need to be formed, niche players that are being marginalized, etc.

This is version 1, and I would appreciate input. I had a tough time deciding whether or not to split Cloud vs. packaged/appliance into its own set of layers. I kept them as options within particular layers (e.g. database), but it may be more helpful to split them. In any case, here’s version 1. Let the speculation continue!

99.8 percent of the world’s data was created in the last two years

I’ve recently come across the statistic that “90% of the world’s data has been produced in just the last two years.” However, I can’t find the real source for this statement, so I’ll try to quickly break it down below:

In 1997, Professor Michael Lesk, the Chair at the Department of Library and Information Science at Rutgers University, made the statement that all the world’s information amounted to about 12 Exabytes. This is from looking at “traditional” information in the form of cinema, images, broadcasting, sound, and telephony.

If we look at the last two years, the leading sources would be IDC with their Digital Universe study (sponsored by EMC), and the University of California, San Diego’s Global Information Industry Center. IDC is more of an apples-to-apples comparison, and they indicate that 2.8 and 4.4 zettabytes were created in 2012 and 2013, respectively.

Those two years add up to 7,200 exabytes. Compared with the 12 exabytes that existed before, that means 7,200 of 7,212 total exabytes – 99.8 percent of all information – was created in the last two years.
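For anyone who wants to check the arithmetic, here is a quick back-of-the-envelope sketch in Python. The figures are simply the ones cited above (Lesk’s ~12 EB estimate plus IDC’s 2.8 and 4.4 ZB), not official constants:

```python
# Share of all data created in the last two years, using the figures above.
# 1 zettabyte = 1,000 exabytes.
pre_existing_eb = 12                              # Lesk's 1997 estimate
created_last_two_years_eb = (2.8 + 4.4) * 1000    # IDC: 2012 + 2013, in EB

total_eb = pre_existing_eb + created_last_two_years_eb
share = created_last_two_years_eb / total_eb

print(f"{share:.1%}")  # → 99.8%
```

Note how insensitive the result is to the 12 EB baseline: even if Lesk’s estimate were off by a factor of two, the share would still round to 99.7 percent.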

This bears additional investigation, but at least we’re now talking numbers. 😉