Archive for the ‘Uncategorized’ Category

I prefer HaikuLeaks to WikiLeaks, everybody wants an app store, and more predictions for 2011.

14/01/2011

Sixty Second Snapshot

Data is at the very heart of the evolution of corporate IT and the revolution of consumer technology.  The coming year will see many customers evolve from a storage strategy …how they store what they create …to a data strategy …how and where they access data …by classifying the data being created. This classification will give rise to segmenting data into data for permanent in-house retention [structured] and data to be streamed off-site for retention [unstructured]. The biggest data question of 2011 will likely be ‘for what do I want to own the storage outright, and for what can I simply hire access?’

Firstly, a very Happy New Year if I haven’t seen you or had the opportunity to say HNY thus far …and this officially marks the last time I think we can get away with wishing each other a Happy New Year before we cross the Rubicon hurtling well into 2011 and beyond!

Secondly …I’d like you to go and find a newspaper article or favourite magazine, clip an article that you find meaningful or interesting, photocopy it, and post to 20 close friends and family.  It’s okay, I’ll wait.

Right …done?  Good!  Now, how long did it take to do that or, on balance, did you not bother because you reckoned the amount of effort and time was more than you were prepared to expend for little old me to illustrate a point?  Not to worry, I wouldn’t have done it either!

Over the past Christmas and New Year break I was thinking that, even a few short years ago, sharing information widely was far from easy.  Yet, flash forward to now and we probably don’t even think twice about sharing something on Facebook or an email or even Twitter with tens, hundreds, or even thousands of people …all in the blink of an eye, with the technology which enables this remaining largely invisible and transparent to you as a customer or consumer.

Equally, the worldwide distribution of sensitive information via something like WikiLeaks has highlighted a profound change towards almost effortless information distribution.  We could certainly argue the legality or even the sense of releasing secure diplomatic cables …and wouldn’t it be delightfully ironic if someone leaked Julian Assange’s book before it was published.  Interestingly, though, WikiLeaks would probably not have been possible even a few short years ago.   I must admit, I remain unconvinced regarding the execution of WikiLeaks’ stated purpose, and it would appear that perhaps some of Assange’s key staffers are as well given they are abandoning the Good Ship Julian and starting OpenLeaks.

What’s this got to do with data storage and protection?

Last year I gave you my predictions for 2010 so this isn’t a blog post about WikiLeaks …to be honest, I prefer HaikuLeaks …but, rather, my observations and predictions for 2011.

As a data guy I would say that data has changed the world forever.  Data is being set free at an incredible pace, whether through mediums such as WikiLeaks or ‘anytime/anywhere’ access on mobile devices …and data is being created at an even more alarming rate.  How much data?  How about we now create as much information in two days as we did from the dawn of man through 2003.  And it is this data …or, more importantly, strategies for data …which gives us some clues as to what 2011 holds.

1.  Do you have a data strategy or a storage strategy?

Up to 2011 we have advised, both as an industry and within Computacenter, that our customers have a storage strategy, with these strategies often driven by categorisation such as ‘what kind of RPO/RTO is required’ and ‘what throughput is needed’, and the strategy often defined solely by the speed and type of disk drives [e.g. Tier One = 300GB/15K fibre channel drives]. Whilst throughput and traditional categorisation remain important, we will see storage strategies evolve into data strategies, with key questions such as ‘what is the data?’ and ‘should the data reside internally or externally to the organisation?’ joining the more traditional speeds and feeds.  It is this data strategy which will seek to reduce OpEx and CapEx costs in both the immediate and long term whilst also increasing corporate agility by enabling secure ‘anytime/anywhere’ access for corporate users.
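A data strategy of this kind can be sketched as a simple classification rule.  To be clear, the categories, tier names, and records below are hypothetical illustrations of the idea, not any vendor’s actual policy engine.

```python
# Sketch: classify data items into placement tiers for a data strategy.
# The categories and rules here are illustrative assumptions only.

def classify(item):
    """Return where a data item should live, based on what it is."""
    if item["structured"] and item["business_critical"]:
        return "in-house, Tier 1 (SSD)"        # own the storage outright
    if item["structured"]:
        return "in-house, Tier 2 (SAS/SATA)"   # permanent in-house retention
    # unstructured data becomes a candidate for off-site, hired capacity
    return "external service provider"

records = [
    {"name": "customer_db",   "structured": True,  "business_critical": True},
    {"name": "billing",       "structured": True,  "business_critical": False},
    {"name": "email_archive", "structured": False, "business_critical": False},
]

for r in records:
    print(r["name"], "->", classify(r))
```

The point isn’t the code itself but the shift it represents: the placement decision is driven by what the data *is*, not by which spindles happen to be free.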

2. Virtualisation … and optimisation …of the entire datacentre continues at pace.

As data strategies begin to take shape, it will become more and more apparent that catering to data through intelligent and efficient storage devices alone will not be enough to fully realise a data strategy which seeks to reduce OpEx and CapEx costs in both the immediate and long term whilst also increasing corporate agility by enabling secure ‘anytime/anywhere’ access for corporate users.  As such, expect to see virtualisation of the datacentre continue to accelerate at considerable pace as the optimisation and cost reduction benefits are realised and advertised by early adopters of productised VDCs such as VCE vBlock, NetApp FlexPod, and Oracle Exadata.  Equally, 2011 will be about the realisation that this isn’t a zero-sum game where one is forced to select a single ‘stack’, as some environments and workloads will be suited to one or more of them …or a mixture of them all.  What is important is aligning the business to the data, which will determine how, when, and why to select one or more optimised datacentre stacks.  Besides, the ‘stack wars’ …if there is such a thing …may not matter much, as in the not too distant future there may be one chip to rule them all.

3. Intel Romley will have a profound impact on storage …and servers, hypervisors, networking …pretty much everything.

As we continue to optimise the datacentre, the offload of tasks to the chipset will gather pace.  Emulex showed the way forward and the possibilities with their OneConnect converged adapter product, and I think we’ll see things like RAID …formerly the sole domain of specialised data storage hardware controllers …move down to the Intel chipset.  But I don’t think it will stop there and within the next few years we’ll see the ability to run data storage, server compute, networking [routers, switches], and even hypervisors directly from Intel chips with software automating the self tuning/self healing required to distribute the load and ‘tell’ the chip which operation to perform …i.e. behave like a data storage controller today, but tomorrow we need you to perform like a server to help service the data access requirements.

4. Customers will demand their own internal corporate app stores which will ensure that their users remain productive anytime, anywhere.

The concept of the app store began in the consumer space with mobile devices such as the iPod, iPhone and iPad.  The iPhone and iPod Touch gained 85m users in 11 quarters …that’s 11 times faster than AOL added users in 1994 …and the app store is about to hit 10 billion downloads.  Add to that the iPad selling 1m units in 28 days and on target for 10m sold in 2011, not to mention the Google Android devices and smartphones …users will want access to their data anytime, anywhere for a truly mobile experience. In fact, mobile as a concept will probably cease to exist.

But on 11 January 2011 the app store entered the desktop space with the launch of the Mac App Store.  Not only does this point in the direction of a convergence of desktop and mobile operating systems in the not too distant future, it also points to a new and perhaps more efficient way for organisations to distribute and maintain corporate software as apps through a secure corporate app store.  The IT landscape is littered with instance upon instance of consumer technology being demanded by users at work to drive business agility increases and change, but it won’t be the device that enables this change for organisations …it will be the ability for organisations to federate their structured data housed internally [e.g. customer databases, ERP, and billing systems] with unstructured data housed externally with service providers [e.g. email] to provide a unified app space for their users.  Put another way, the device will become far less important than the apps you can run, and data federation, geotagging, and data container security will enable the corporate app store.  Don’t believe me?  It would seem companies like Apperian are trying to steal a march on the competition with their Enterprise App Services Environment [EASE] product.

5. Federation of data will be what customers will require to keep costs down permanently, but will dip their toes in the water with selected workloads.

Fantastic apps such as Evernote for note taking, Dropbox for data, and Kindle for reading make it possible to take notes, save data, and even read books on any device you happen to own …all whilst keeping any modifications made synced up automatically with generally no additional effort on your part.  How do they make this ‘magic’ work?  The full answer can be somewhat complicated, but the short answer is that data federation is at the heart of each of these solutions.  Customers will seek similar automated syncing and federation from their internal service provider [read, IT department] as well as external service providers such as Computacenter.  How significant a change will this be?  Well, let’s put it this way …I’m not sure I would want to be a USB flash drive manufacturer moving forward.

6. Data storage arrays will be able to be run as a virtual ‘instance’ on a virtual machine [VM].

The data storage innovations of the past ten years …deduplication, compression, automated tiering, the list goes on and on …are really software at their core and not hardware solutions.  Put simply …storage is really just software, after all.  Given this, I expect one of the data storage vendors to ‘break cover’ and allow users to run their data storage array software as a virtual instance …a VMware VM, if you will.   Indeed, NetApp have had an ONTAP filer simulator for quite a few years, so it doesn’t take a huge leap of imagination to see that going one step further and allowing users to run a data storage array as a virtual machine may not be far away.

7.  If we’re evolving our storage strategies to be data strategies, what will remain for data storage hardware tiering recommendations will be ‘flash and trash’; solid state drives [SSD] for performance intensive workloads and dense SATA/SAS drives for everything else.

Notice that you don’t see fibre channel [FC] drives in that equation?  No, I haven’t made a mistake …I think this is the year that FC drives drop out of the architecture.  Whilst they served an important purpose once upon a time, they have outlived their usefulness in the data storage architecture moving forward.  Automated storage tiering, such as FAST from EMC, means that we can now move data automatically at the block level from highly performant SSD to SATA/SAS as required, with no need for administrative intervention and whilst remaining transparent and seamless to the users.
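The ‘flash and trash’ movement described above can be sketched as a frequency-driven promotion/demotion pass.  The threshold and figures below are invented for illustration …a real implementation such as EMC FAST works on sub-LUN blocks with far more sophisticated heuristics.

```python
# Sketch: 'flash and trash' automated tiering by access frequency.
# The threshold is an illustrative assumption, not EMC FAST's algorithm.

HOT_THRESHOLD = 100   # accesses per day that justify SSD placement

def retier(blocks):
    """Promote hot blocks to SSD, demote cold ones to dense SATA/SAS."""
    for block in blocks:
        block["tier"] = "SSD" if block["accesses"] >= HOT_THRESHOLD else "SATA/SAS"
    return blocks

pool = [
    {"id": 1, "accesses": 850},   # performance-intensive workload -> SSD
    {"id": 2, "accesses": 3},     # dormant data -> dense 'trash' tier
]
print(retier(pool))
```

Because the pass runs continuously and below the filesystem, the demotion and promotion stay invisible to users …which is precisely what makes dropping the FC middle tier viable.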

I am convinced that 2011 is going to be an extremely interesting and perhaps even watershed year for data specifically and the datacentre generally.  I would expect virtualise, containerise, mobilise will be joined with monitor, deploy, automate as we seek to reduce storage and IT costs whilst increasing business agility.

I’ll be blogging in much more detail about all of the seven predictions I’ve made above, and tune in next week as EMC will be making a major product announcement which could help prove point number seven …as well as possibly a few more!


Yeah, I’m fat … but so is your data.

05/11/2010

Sixty Second Synopsis

Throwing technology at what is potentially a ‘people’ issue will have limited benefit long term.  We do need to deploy more intelligent and efficient storage devices, but we also need to help our users understand the negative impacts of unfettered data growth and storage. Transparent reporting of what users consume can help them understand what they are consuming/costing with a view to proactively reducing these costs.

Identity is a very important and, frequently, quite a private thing.

I have seen how our friends from NetApp try to hide the cringe when they hear someone incorrectly call them ‘NetApps’ and, whilst I don’t have this on good authority, I do wonder if Acadia’s recent decision to rebrand themselves as VCE had anything to do with constantly being called ‘Arcadia’.  I’ve no doubts that Michael Capellas would love to sell vBlocks as quickly as Philip Green sells t-shirts and jumpers, but there’s no getting away from the fact that they are different sales skillsets … not to mention price points!

Equally, I have lost count of the number of times I have had to explain to people why Computacenter has an ‘a’ in the middle and isn’t spelled with an ‘re’ at the end … even though we are a European company.  But Computacenter we are, and I am a proud member of the team ready to defend our name and strategy.

Now, anyone that knows me knows that I’m not exactly over the moon when someone calls me ‘Matt’ …  particularly when I introduce myself as Matthew, answer my email as Matthew, answer the phone as Matthew … you get the point.  There are a few personal reasons for this that I won’t bore you with here, but one of the big ones is that when I was younger I was called ‘fat Matt’.  Yes, kids can be cruel, and, whilst I have been blessed with a few attributes … svelte isn’t one of them.  Being reminded of always having to shop with my mother in the ‘husky’ section as a child, coupled with the thought that if I get any bigger I may have my own gravitational pull, is never fun.

I had two brief moments of sveltedom … once when I started university, and once just before I got married … but being married to a chef whose family owns a butcher shop, having an interest in wine, and being able to resist everything but temptation was probably never going to end well.

I’ve blogged previously about my struggles with my weight as well as my efforts to lose weight permanently and, unfortunately, there is no getting away from a simple and stark fact … I’ve failed miserably.

The definition of insanity is to do the exact same thing you did before but expect different results so, from today, I’m making a big change.

Liposuction?  No, too expensive and doesn’t change the underlying issues.  Gastric band?  Somehow I doubt swallowing a rubber band is going to change things for the better.

Nope … I’m going public.  That’s right, from today you can access my food diary where I’m logging everything I eat, every bit of exercise I take … open, transparent, no limitations.  No excuses.

What’s this got to do with Data Storage & Protection?

I suppose I believe that a problem shared is a problem halved, and really I have no one other than myself to blame at present for not losing weight.  Perhaps putting my diary online and making it open will help me to ‘control’ myself and, who knows … perhaps others with a similar issue will want to join me and we can make a bit of a contest out of this.  And it becomes that much more difficult to bend the truth when someone asks you how your diet is going when you know they can check for themselves!

The challenge is, just as I know that by remaining overweight I’m not doing my health any good and potentially shortening my life, so too we know that we are creating data at an alarming and, some would say, unsustainable rate.  Indeed, we’re creating as much data in three days as we did in all of 2003.  And much of the data growth is coming from ‘unstructured’ data … data that, in simple terms, is not likely to be business meaningful.

Whether this data creation is harmful to us and our world at large is open to examination and robust debate, and there is no doubt that storage vendors are doing everything they can to create ever more intelligent and efficient storage devices.

But this is only half the equation, in my opinion.

Short of shooting the users … not something I would condone, and I’m fairly certain it’s illegal just about, well, everywhere … we have to come up with a way in which we get users to be cognisant of the resources they consume and the costs thereof.  Perhaps if users know the cost of the data storage they are consuming they will think twice before saving yet another copy of the same PowerPoint, forwarding on a large spreadsheet they could have sent a link to, or sending on that high-def video I’m sure is just as hilarious as you think it is.

But how?

1. Baseline the cost of workloads in the organisation.

By workload I mean the application, server, hypervisor [if applicable], network, and data storage required to provide the service to the user.  If you don’t know what it costs, how to break it down by user, et al … I know a nice service provider in the UK who would be more than happy to help!

2. Decide what and how you would charge a user for usage.

I have spoken to some customers who are spending upwards of 50% of their IT budgets on data storage, so I would be surprised if lowering storage costs wouldn’t be of interest … but what would you look to meter?  Cost per gigabyte stored?  Over a month?  Over a year?  You get the point.  And the tools on the market today let you set a cost per gigabyte as well as having LDAP hooks such that they can report exactly what a user is creating … and costing … historically as well as currently.

3. Make the costs per user/department/division transparent.

This isn’t about ‘naming and shaming’ users, but I do think that if users saw what they were costing the company in a ‘leader table’ with their peers perhaps they would be more inclined to reduce those costs through more diligent data management.  Data growth isn’t the fault of the IT department, and their simply deploying more intelligent devices won’t solve the problem alone in my opinion.

4. Make it a competition, with an award for the user/department/division who can reduce their costs the most.

Weight loss is all about targets and realistic expectations, but we all need a reward for good behaviour now and again to help keep us moving in the right direction.  Reducing the amount of data stored is most certainly of value to an organisation, and awards help to show and reward the progress being made.  We’re all in this together, after all, and simply shouting at IT to get better storage stuff or at purchasing to get a better price for the storage thingey isn’t going to solve the underlying issues.
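The four steps above amount to a straightforward metering calculation …set a fully loaded rate, multiply by what each user stores, and publish the result as a leader table.  The per-gigabyte rate and usage figures below are made up purely for illustration.

```python
# Sketch: transparent per-user storage chargeback and 'leader table'.
# The rate and usage figures are hypothetical.

COST_PER_GB_MONTH = 0.25  # assumed fully loaded cost, pounds per GB per month

# Step 1/2: baseline usage per user (a real tool would pull this via LDAP
# hooks and historical reporting) and price it.
usage_gb = {"alice": 1200, "bob": 480, "carol": 2600}
costs = {user: gb * COST_PER_GB_MONTH for user, gb in usage_gb.items()}

# Step 3/4: make the costs transparent, highest consumer first.
for user, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
    print(f"{user:>6}: {usage_gb[user]:>5} GB  ->  {cost:,.2f}/month")
```

Run monthly, the same table doubles as the scoreboard for step 4 …the user or department whose line shrinks the most wins the award.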

As for me, time to go and take a brisk walk around the block so I can have a glass of wine or two with dinner.

Have a great weekend,

-Matthew

Click here to contact me.

We need business solutions to business problems.

09/10/2010

To a man with a beaker, everything is a solution.

Or so goes the sage advice of one of my university professors whose name I sadly seem to have forgotten, relegated to the mists of time and memory.

I am often reminded of this and, perhaps a more popular and well known version of the same sentiment, Abraham Maslow’s famous quote … ‘To a man with a hammer, everything looks like a nail.’ … when I’m meeting with customers as the conversation inevitably winds its way round to three common queries.

1. What do you think about storage vendor [x] versus vendor [y] versus vendor [z]?

2. We’re paying too much for storage specifically and/or IT generally.  We thought IT was meant to be a commodity which supported the business, yet costs feel wildly unpredictable. What would you recommend, and where should we start?

3. What made you think that that tie went with that shirt?

Oh dear.  Where to begin?

Number three is the easiest to deal with as I just need to ensure that I turn the lights on when I dress in the morning or, better still, let Mrs. PL choose my ensemble.

As for numbers one and two, well … they can be slightly more challenging.

When tackling number one, I am as candid as I can be about what we know from experience in deployment and testing, both in the field and in the Solution Centre, as well as my personal opinions based on personal research and visits to our vendor partners’ development facilities … understanding that the vendors frequently FUD one another, which is to be expected and always somewhat suspect.  Equally, it is worth bearing in mind that there are no silver bullets nor are there perfect solutions … at least I haven’t come across any in the 15+ years I’ve worked in technology and the 30+ years I’ve been around IT.  Indeed, when I used to go to work with my father it wasn’t called IT but ‘data processing’.

As for the second, the conversation will almost undoubtedly involve ‘Should I virtualise my servers?’, ‘Should I virtualise my storage?’, ‘Should I thin provision, deduplicate data, archive data, deploy grid storage, consider federated storage … ‘.  Unfortunately the answer is always …yes.  I do recognise it can be frustrating to hear … and I’m trying desperately to ensure that it doesn’t come across as flippant … when I know full well what many folks want is a direct answer and order to follow to solve what is arguably a three dimensional issue.

Ultimately what this all boils down to is that technology has largely become a three dimensional challenge, as I discussed last week, and that what our customers are asking us for is not technical jargon nor do they want to watch us throw technology at a business issue but, rather, proffer business solutions to business problems.

What’s this got to do with Data Storage & Protection?

I’m sometimes criticised for not getting to the point quickly enough, or for circular speech.  Fair enough.  But, in my defence, when faced with three dimensional business issues …if I recommend grid storage with a single storage media type but don’t take into account your future VDI and data mobility aspirations, for example …simply throwing a two dimensional solution at the problem is not going to get us where we need to be, no matter how pretty the PowerPoint slides.  These things need to be thought through and discussed, and that takes time …and frequently a glass of wine or two.

So what to do?

1. Magic quadrants are good, but working equations are better.

We do use a consultancy equation …ROI + CBA + DPB = CSS …which attempts to help solve the ‘what to do next’ and ‘what’s the best solution’ for three dimensional business issues.  The composite score then points us towards the storage and technology most appropriate to underpin said solution.

2. The Computacenter Virtual Datacentre [VDC] solution is a three dimensional business solution.

VDC seeks to solve business issues by increasing business agility, automating highly repeatable tasks, optimising all aspects of a datacentre to reduce CapEx/OpEx costs by 30% to 50% … and we’ve been working on and have experience with VDC for over 18 months, long before others were even thinking about such solutions.  Don’t believe me?  Have a look at the date stamp on the Automated Storage Provisioning demo video … it reads 25 March 2009 if you don’t feel like clicking the link.

3. Vendors are rushing to create silver bullets as quickly as they can.

VCE vBlock, Oracle Exadata/Exalogic, NTAP IVA, HDS UCP, IBM Dynamic Infrastructure, HP Converged Infrastructure … it doesn’t really matter what marketing decides to call it, at no point in my technology career have I seen vendors spend this amount of effort trying to create complete datacentre silver bullets for customer business issues.  I’m not saying this is good or bad as it is still too early to tell, but the concept does seem to be resonating with customers.

4. If you don’t want to go the VDC route just yet, introduce a 3D storage solution.  HDS is trying to create just such a 3D storage solution which scales out, up, and deep.

HDS announced their Virtual Storage Platform this week, effectively replacing the USPV.

HDS VSP page level tiering allows a customer to create a pool of storage which in turn creates 3D tiering; scale up, scale out, scale deep.

Scale up; pooled storage media [FC, SATA, SAS, SSD] allows the VSP to locate the data on the most appropriate tier based upon business needs [e.g. my workload needs faster response during our corporate end of quarter billing] in an automated fashion such that workloads remain performant with zero or minimal administrative action as well as zero intrusion to the users.

Scale out; expand workloads automatically to accommodate greater storage requirements [e.g. users are creating more data so we need to expand the workload container] again in an automated fashion with zero or minimal administrative action as well as zero intrusion to the users.

Scale deep; demote/move data to denser storage for long term retention at lower cost as long term retention has become more important than performance [e.g. move workload to dense but less performant SATA] again with zero or minimal administrative action as well as zero intrusion to the users.

Does the HDS VSP work and is this the 3D answer to data storage?  Is page level tiering better than block level automated tiering?

The page level tiering allows a customer to leverage existing storage arrays and their previous storage investments, so there is a valid business case for page tiering.  However, to be honest we haven’t received our test unit into the Solution Centre yet, so I don’t want to offer an opinion until Bill McGloin and the CP Data team have finished their evaluation putting the VSP through its paces.

But watch this space as I think 2011 is going to be very interesting indeed as 3D solutions such as VDC begin to find their way into corporate datacentres.

Have a great weekend,

-Matthew

Click here to contact me.

The future is 3D and the future is now.

03/10/2010

It feels like quite a long time since I blogged just prior to my going on holiday with Mrs. PL and PL Junior, so please do forgive this Weekly View being a) somewhat late and, b) out of sequence.  I am back and into the full swing of things, with the Weekly View commencing each Friday again from this week. 

Many interesting things have happened since I went on holiday; here are just some that caught my attention … PL Junior started reception which left me wondering where the time goes, Lloyds made the front page of the weekend FT by piloting iPads, Paul Maritz [CEO, VMware] stated that ‘in 2009 organisations deployed more virtual machines [VMs] than physical machines for the first time’, and let’s not forget the $2.4b tussle between HP and Dell over 3PAR … with HP winning the tug of war.

Now, I have to make a confession here … I hate packing and unpacking possibly more than anything I can think of and will do just about anything to not have to do it … to wit, when Mrs. PL and I got married I suggested we throw our honeymoon clothes into suitcases dirty and then have housekeeping launder them when we got there.  Made sense to me … no hassle packing, unpacking, and we’d have clean and neatly pressed clothes delivered to our room!  Mrs. PL was less enamoured with this idea and offered some suggestions of her own for my packing which I’m still not sure are anatomically possible.

What’s this got to do with Data Storage & Protection?

I don’t like packing for two primary reasons; 1) I’m rubbish at packing and can never seem to get things packed properly, and 2) what looks like a large suitcase never seems to hold what it should.

Enter Mrs. PL who has a PhD in packing, bringing order to chaos and filling every cubic centimetre of our suitcases.  Which got me thinking … packing is a three dimensional problem and what Mrs. PL does so exceptionally well is to bring three dimensional solutions to this three dimensional problem.

I believe the exact same thing … 3D solutions for 3D issues … is happening with technology generally and with data storage specifically.

1.  Customers tend to reap only 40% utilisation from their storage infrastructures.

Customers want to get 100% utilisation from their storage infrastructures every bit as much as Mrs. PL wants to ensure she has used every cubic centimetre of a suitcase wisely.  When it comes to storage inefficiencies, there are numerous reasons; fat provisioning, orphaned volumes, duplicate data, dormant data which hasn’t been accessed in days/weeks/months/years … even inefficiencies within the storage arrays themselves.  The past two years have seen quite a consolidation of technologies such that the vast majority of tier one storage vendors have worked to introduce thin provisioning, data deduplication/data compression, automated storage provisioning/tiering into their storage arrays and offerings to increase utilisation.  In essence, introducing what were once products in their own right as features into mainstream data storage.  Why?  Put simply, increasing utilisation decreases costs … and as customers continue to store more and more data they require the highest utilisation possible to avoid excessive storage costs.
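Utilisation here is simply useful, unique data divided by purchased capacity.  The figures below are invented to illustrate how fat provisioning, duplicate copies, and dormant data drag a 100TB purchase down to that 40% figure.

```python
# Sketch: why raw utilisation so often lands around 40%.
# All capacities are illustrative assumptions, in terabytes.

purchased_tb   = 100.0
provisioned_tb = 80.0   # allocated to volumes up front (fat provisioning)
written_tb     = 50.0   # actually written by applications
unique_tb      = 40.0   # left after discounting duplicate and dormant copies

utilisation = unique_tb / purchased_tb
print(f"Effective utilisation: {utilisation:.0%}")  # prints "Effective utilisation: 40%"
```

Thin provisioning attacks the gap between purchased and provisioned, deduplication the gap between written and unique …which is exactly why these features are being folded into the arrays themselves.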

2.  Three dimensional problems require three dimensional solutions.

Virtualising a server, introducing dedupe into backups, thin provisioning a few volumes … each on their own are two dimensional solutions.  Whilst two dimensional solutions will reduce costs somewhat, it is only when these solutions are coupled holistically into three dimensional solutions that true cost reductions both in the immediate and for future growth can be achieved.  The Computacenter Virtualised Datacentre [VDC] solution is a three dimensional solution, seeking to holistically optimise the network, platform, hypervisor, storage, and automation such that OpEx and CapEx costs can be reduced by as much as 30% to 50% or more both in the immediate and for future workload creation and data retention.

3.  3D solutions, whether at the datacentre level or storage level, are business solutions and not technical solutions.

It is true that VDC is made up of technical components such as hypervisors, universal compute, virtualised 10Gb Ethernet, grid storage, and automation … however VDC isn’t a technical solution.  It is a business solution which seeks to reduce costs by optimising and reducing wastage between components and automating highly repeatable tasks such as server and storage provisioning, all with a view to aligning technology [IT] to business value.  Why is this important?  Put succinctly, when calculating Total Cost of Ownership [TCO], only 30% of TCO is represented in acquisition cost … the remaining 70% is OpEx.  Many tools and technologies have focussed heavily on immediate return [e.g. TCO calculators, data dedupe], however the real long term cost savings remain in OpEx reduction.  Helping an IT department optimise OpEx should return significant long term value and cost reductions, and we’ve spent a lot of time putting science behind this such that we can underwrite the business benefits as opposed to peddling the marketing hype.  I’ve written at length about how and why VDCs help internal IT departments transition/evolve into internal service providers, so I won’t rehash that now but click here and here if you’d like to revisit these posts.
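The 30/70 split quoted above can be made concrete with some simple arithmetic …note the acquisition figure is invented purely for illustration.

```python
# Sketch: acquisition cost as ~30% of TCO, OpEx the remaining ~70%.
# The acquisition figure is hypothetical.

acquisition = 300_000.0        # one-off CapEx
tco  = acquisition / 0.30      # total cost of ownership implied by a 30% share
opex = tco - acquisition       # the remaining 70%, spread over the asset's life

print(f"TCO:  {tco:,.0f}")
print(f"OpEx: {opex:,.0f} ({opex / tco:.0%} of TCO)")
```

A solution that shaves even a modest slice off that OpEx line therefore outweighs a much larger discount on the purchase price …which is the whole argument for optimising the datacentre rather than haggling over tin.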

4.  Using commoditised hardware in data storage and treating data storage as a commodity are not the same thing.

Grid or ‘scale out’ systems such as IBM XIV and EMC VMAX form the basis of the Virtualised Datacentre … and cloud computing.  The secret to scale out systems is that they use commoditised hardware … Intel chips, SATA, SAS, and SSD drives … and use software to manage data placement and automated tiering.  However, this isn’t the same thing as treating data storage as a commodity and buying ‘good enough’ storage at the lowest price to store yet more data, as this strategy is, generally speaking, what leads to low storage utilisation and high IT costs.  These next generation arrays represent the introduction of true business intelligence into the storage fabric, and seek to store data as efficiently as possible from creation throughout the lifecycle.  Indeed, without scale out storage VDI, service catalogues, automated provisioning, cloud computing, et al wouldn’t be possible at costs low enough to help organisations overcome the inertia of how storage has traditionally been purchased and allocated.

5.  If three dimensional solutions are the future of technology and the key to significant cost reductions, why not introduce them into data storage directly?

A very good question, and certainly one that HDS have put to the market and customers with their recent Virtual Storage Platform [VSP] announcement.  Whereas the arrays mentioned in item 4 are more intelligent than traditional storage arrays, it could be argued that one must connect them to other components to achieve a 3D solution.  HDS would argue that their new VSP introduces 3D storage … scale out, scale up, scale deep … into the market for the first time.

I’ll be tackling the HDS VSP announcement in my next blog post, giving you my thoughts about how I think it stacks up against competitive technologies and solutions.

Shoot the users.

26/02/2010

Are geeks born or are they made?  Is technoweenism (def. ‘of or pertaining to being a technoweenie’; see Matthew Yeager) genetic or is it environmental?

To be sure, I’ve been a geek for as long as I can remember.  My first video game was Star Trek on an IBM mainframe whilst my father did the reel to reel backups.  Seriously.  Try as I might, I never did beat Khan nor the Klingons.  Khaaaan!  My father tells me that he was once called in to a parent/teacher conference to enquire where I had got the copy of Newsweek with which I was boring the kids to death in the playground.  Laugh if you must, but the Soviets shooting down KAL007 was a hugely important incident and led to Ronald Reagan ordering the military to allow the use of the US GPS system by civilians.  And I still believe to this day that’s waaay more important than kickball.

So, nothing much has changed although now I bore people at dinner parties as opposed to the playground, my video games tend to be of the PS3 variety, and instead of reading about planes I’d much rather be flying them.

But, as big a geek as I may be, I’ve never really been a petrol head.  Indeed, I hate driving and view it as a colossal waste of time …time which could be spent doing far more productive things.  As such, I don’t really care too much about cars.  This isn’t to say I don’t love Top Gear or would turn away a Maserati Quattroporte, but in the absence of someone giving me one I can’t really justify £100k on a hunk of metal …okay, a terribly fast hunk of metal with lovely leathery bits …and so plan to drive my humble BMW 320D until the wheels fall off.

Now, I like car maintenance almost as much as I like being called ‘Matt’ or ‘reseller’ but BMW have done something very clever.  They’ve put what amounts to a countdown timer in the car so that as the miles increase, the car tells you how many more you can travel before the next service.  Genius.  And what happens if you go over the service threshold?  It starts counting the miles in negative digits and ‘bongs’ every time you get in or out to remind you that you really must get a service organised.

Gentle nagging works, but the simplistic countdown is what drives me (no pun intended) to organise services regularly.  And at £180 or more a go, I reckon BMW continues to make a tidy profit from me.

What’s this got to do with data storage and protection?

I wrote in my year end post that, along with the march to the virtualised datacentre, subscription models would change the way we consume just about everything.

One of the questions I inevitably get asked by customers when discussing optimised storage and virtualised datacentres is ‘so what?’  What business benefit do I get by deploying the elements of optimised storage such as thin provisioning, storage virtualisation, storage compression, and so on?

It is a fair question and customers are right to ask it.  Let’s be clear, whilst I am a geek, deploying technology for technology’s sake is not something I would ever advocate.  Nor would I advocate just buying more ‘stuff’ in the absence of a proper strategy.  Frankly, this is partly how we ended up with unsolved Rubik’s cubes for infrastructures.

Now, whilst we will publish TCO metrics and projected cost savings for optimised infrastructures and/or virtualised datacentres …and in some cases underwrite these costs …one of the unintended consequences of optimising/virtualising a datacentre is it gives the IT guys room to breathe.

Why room to breathe?  By automating certain datacentre tasks and removing the need to deploy more data storage for even a few weeks …although it often ends up being months …the IT folks get time to manage data.  The elephant in most datacentres is not whether we should virtualise or optimise but, rather, how we manage data.  This is one of the biggest questions I get from customers …‘How do we manage our data?  How do we keep our internal customers from creating more?’ …and I generally give them the exact same answer.

Shoot the users.

Not trying to be flippant, but seriously …no users, no more data.  Job done, crack tubes.

‘But we can’t shoot the users!’

Indeed.  So how do we manage the data if users are going to keep creating it?  Yes, there are technical solutions available in archiving and data deduplication, but why throw technology at a people problem?  Why create the data in the first place?

What if we gave the users a yearly and monthly subscription for data storage?

Try phoning your mobile phone company when you’ve used up your minutes and demanding more now and for free because hey …your moby is fundamental to your job!

The challenge is not in having adequate tools for data management …three of the best are, in my opinion…

  1. Storage Fusion for enterprise data infrastructure.  SF is brilliant at showing you exactly what the storage devices are doing; how much raw storage is deployed/consumed, how much power they’re consuming, how the storage is tiered, and so on.
  2. Symantec CommandCentral Storage.  Symantec CommandCentral is fantastic for managing infrastructures comprised of more than one storage vendor, and lets you manage storage down to the file level.
  3. Northern Storage Suite.  I really love Northern because it has robust LDAP hooks, which is geekspeak for ‘it analyses the data and tells me who, specifically, created it’.  Imagine being able to run reports to tell you which users and departments are creating the most duplicate data, data which isn’t accessed often if ever …and what they are likely to create in future …and you can see why I love these guys.

But the tools are only half the solution.  The other half is what we do with the information.  In the past we’d try to nicely ask the user to delete duplicate or old data.  We’ve even tried ‘quotas’ which rarely, if ever, work.

In 2010 I think we should be asking ‘How much space does a user need, and how can I build a subscription model to monitor …and report to them …their usage?’

Shoot the users?  A bit harsh.  But perhaps charging them once they use up their subscription is the answer to user-managed data.
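As a thought experiment, the mobile-phone-style subscription might look something like this. The allowance, overage rate, and user figures below are all invented for illustration:

```python
# A hypothetical sketch of the 'storage subscription' idea: each user
# gets a monthly allowance and, like mobile minutes, pays for overage.
# The 50GB allowance, 40p/GB rate, and usage figures are all invented.

def monthly_charge(used_gb, allowance_gb=50, overage_rate_per_gb=0.40):
    """Chargeback for one user for one month, in pounds."""
    overage_gb = max(0.0, used_gb - allowance_gb)
    return round(overage_gb * overage_rate_per_gb, 2)

# Within allowance: nothing to pay; over it: a visible, reportable cost.
for name, used in {"alice": 42, "bob": 130, "carol": 50}.items():
    print(f"{name}: £{monthly_charge(used):.2f}")
```

The point isn’t the billing logic, which is trivial; it’s that a simple, countdown-style number per user changes behaviour in a way that polite requests to delete old data never have.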

Have a great weekend, and please contact me if I can be of any assistance in helping you manage data.

I promise I won’t shoot your users.

-Matthew

Click here to contact me.

Archived Post – On a slimmer me.

25/09/2009

ARCHIVE – Originally posted 24 April 2009.

If I am completely honest with myself, I always knew this day would come. It is difficult to see the Stay Puft Marshmallow Man-like image not only staring back at me in the mirror each day, but becoming ever bigger over the past five years. I suppose the tipping point was when Google phoned Mrs. PL to enquire as to when they could pop round to take my latitude and longitude for Google Earth. That’s right, campers …I’ve decided that I need to step up my diet and lose four and a half stone.

So what is going to be different this time? Well, I went round to see the doctor who helpfully also told me that I need to lose weight but, and here’s the interesting part, instead of sitting and having a qualitative conversation about losing weight so that I’m healthier and around to enjoy PL Junior as he grows older …we had a quantitative conversation. Whilst there are all kinds of different diets and related diet medication out there, it seems that people who count calories actually lose 50% more weight over the course of their diet …and also tend to keep it off for good. Why? There’s the obvious answer that they are consuming fewer calories, but scientists have discovered that humans, on average, need to do something at least eighteen times for it to become a permanent habit. Most crash or fad diets don’t actually change eating behaviour, whilst counting calories helps you not only understand what you are eating but also ‘recalibrate’ your eating habits such that you consume fewer calories even after you have ‘finished’ your diet.

Now, counting calories is right up there with Mrs. PL plucking my nose hairs on the list of things I would rather not do. It’s boring and, frankly, a bit of a pain in the backside. That’s when the doctor pointed me to a great website http://www.livestrong.com created by Lance Armstrong, seven-time winner of the Tour de France. Gotta tell you, love this site! You register for free and then begin counting calories by adding foods you’ve eaten as you consume them …with over 550,000 foods in the database, I’ve found it difficult to find any foods it doesn’t have and adding these to your ‘daily plate’ is as easy as a couple of clicks. Wait, it gets better! You tell the site how many pounds you want to lose in total and then how many you wish to lose per week and …presto! You now have the date when you will be at your target weight as well as how many calories you can consume per day to lose the desired weight. And there’s even an app for the iPod Touch, iPhone, BlackBerry etc. so that there’s no reason to not log your food …and so you can see how many calories you have left in the ‘bank’ as the day progresses. Fandabbydoozy! All being well, you’ll see a new lighter me on 09 September 2009.

What does this have to do with Storage and Software?

Gartner and IDC estimate that the average customer reaps just 40% total utilisation out of their storage infrastructure, and when calculating the Total Cost of Ownership [TCO] for storage we know that only 30% of the measurement is acquisition …the remaining 70% is comprised of OPEX [power, cooling, backup, physical management, etc.]. However, in my experience many [if not most] of our customers are unaware of the low utilisation and TCO measurements. Why? I don’t want to oversimplify this, but I believe that sometimes vendors, and certainly our competition, would prefer to have qualitative conversations with customers regarding storage.

Customer: ‘I need more storage for my new datacentre project.’

Our Competition: ‘Of course! What are you using now?’

Customer: ‘Ermmm, an EMC thingy.’

Our Competition: ‘Fantastic! And how much budget do you have for the project?’

Customer: ‘About £1.2 m’

Our Competition: ‘Wow, are you in luck …we have the new EMC VMAX for exactly £1.2m!’

Or something along those lines. Never really talked to the customer about what type of performance was required, never enquired as to how many people manage storage now and whether the customer would like to reduce OPEX …nope, not a sausage. And this type of engagement is what leads to low storage utilisation and hefty OPEX within storage infrastructures. So what should we do and how can we differentiate ourselves as Computacenter?

On average we can recoup upwards of 30% of unused storage in a fat provisioned storage infrastructure by implementing thin provisioning, we know that the industry standard for data deduplication ratios is 40:1, and automated storage provisioning can reduce the number of administrators required to manage storage significantly. There are also other technologies that we can look to implement and deploy, such as storage virtualisation, but none of these technologies are ‘silver bullets’ and we shouldn’t market them that way. In fact, you might liken them to ‘crash diets’ which deal with unstructured data when what we really want to do is to reduce the amount of unstructured data we’re creating to ensure that we’re only storing data that is useful to the customer in question.
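Taking the figures above at face value …the 30% recouped via thin provisioning, and the 40:1 dedupe ratio, which in practice is usually quoted for backup data rather than primary storage …the capacity maths can be sketched as follows; the capacities themselves are invented for illustration:

```python
# The capacity maths above, taking the post's figures at face value:
# ~30% of a fat-provisioned estate recouped by thin provisioning, and
# a 40:1 dedupe ratio (usually quoted for backup data in practice).
# The 100TB and 40TB capacities are made-up examples.

def recovered_by_thin_provisioning(allocated_tb, recoup=0.30):
    """Capacity handed back when fat-provisioned volumes go thin."""
    return allocated_tb * recoup

def physical_after_dedupe(logical_tb, ratio=40):
    """Physical capacity needed to hold logical_tb at a dedupe ratio."""
    return logical_tb / ratio

print(f"{recovered_by_thin_provisioning(100):.1f} TB recouped from 100 TB")
print(f"{physical_after_dedupe(40):.1f} TB physical for 40 TB of backups")
```

Impressive numbers, but note that both techniques shrink what is already stored; neither slows the rate at which users create new data, which is the crash-diet point being made here.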

But counting calories is a pain, and so is counting data …in fact, most users are just not that interested in reducing their unstructured data as they feel it impedes their productivity. In fairness, though, this is a qualitative argument and we need to be having quantitative discussions with our customers. How?

By baselining the customer storage infrastructure using our Storage Assessment and Strategy Service we can show customers exactly what their storage environment looks like today; how much data is there, what data is duplicate, when the data was last accessed, who is creating the data in question, and so on. This baseline also shows us how the environment has developed historically and, once we understand how we got here, we can make intelligent assumptions about what the environment is going to look like over the coming months and years. Most importantly, we can now have very productive and quantitative conversations about how we can implement technologies and strategies to reduce CAPEX and OPEX over time …and without being disruptive to the customer’s production business or leaving users feeling as if we are impeding their productivity.

We can help our customers lose weight and, in a difficult economy, I’m certain that is a message which will find resonance.

Please contact me if you feel a Storage Assessment and Strategy Service would be of benefit.

Have a great weekend.

-Matthew

Click here to contact me.

Indulging my inner geek.

18/09/2009

This week Computacenter opened a new branch office on a small nondescript street in the leafy north west London suburb of Mill Hill. There was no laying of a cornerstone nor was there any real fanfare …save for PL Junior screaming ‘hooray daddy!’ at the top of his lungs when the WiFi connection came back online and Mrs. PL stopped giving me the evils as she was trying to order more track suit bottoms from M&S.

To be sure, if you drive around Mill Hill looking for a new Computacenter logo board you won’t find one …if you haven’t guessed already, the new branch office is actually my home office within Casa PL.

Now, many folks have home offices and as the world becomes ever more mobile, home offices will become more and more de rigueur and less of a unique phenomenon. So what makes me special? Nothing really, other than I recently decided that prices had dropped sufficiently for me to ‘supercharge’ my home office’s IT capabilities.

*Warning: You already know I’m a technoweenie, and I’m about to geek out for a paragraph or so!

I have upgraded the sub 2Mb/s DSL internet connection, which uses copper, to a more stable and efficient 8Mb/s broadband connection which uses fibre optic cable. I have also upgraded my four year old Linksys 802.11b/g wireless router which transmitted at 2.4GHz …and was rather flaky at distance through Casa PL’s brick walls …to an Apple AirPort Extreme Base Station 802.11a/b/g/n which operates at 5GHz and slices through the brick walls like a hot knife through butter! I’m downloading at 8Mb/s from Mrs. PL’s garden! Also, the PS3 can stream from the new connection, and Mrs. PL and I were able to watch old Dragons’ Den episodes using the BBC iPlayer from the PS3 connected via an HDMI cable to our 1080p high definition telly. And, the pièce de résistance? I have connected a 2TB SATA drive kit to my wireless router with a USB 2.0 cable …et voilà! Casa PL now has NAS storage where we can centrally store all of our iPod music, videos, family photos, documents, backups, the lot …and with the MacBook Air I’m consistently getting 230Mb/s – 300Mb/s transfer speeds to the disk inside the house …and I have remote disk connectivity set up so that I can access our drives from anywhere in the world with an internet connection using a Mac, iPod Touch …or even a 3GS iPhone! Fandabbydoozy!

What has this got to do with Data Storage & Protection?

I could take the obvious route here and continue to indulge my inner geek by extolling the virtues of SATA drives and screaming hot wireless connectivity etc …but I won’t. Equally, some of you may know that I haven’t been feeling well this week and may rightly wonder where I found the time to do all of this. Well, thankfully I wasn’t squealing or snorting …and I don’t eat pork …so I think I’ve survived the man flu as opposed to swine flu, but would you believe me if I told you that I accomplished the above in a little under 37 minutes? Okay, I had a little help from our friends at Amazon.co.uk with delivery …but setup from start to finish, including opening the boxes …was 37 minutes. The speed at which we can now deploy IT, providing you know what you are doing more or less, is becoming frighteningly quick …and easy.

But why did I do all of this? Just to impress you, dear reader? No. You see, with the setup above …and my trusty and loyal friend the MacBook Air …I don’t think that I’ve ever been more efficient in my professional life. And, as practice leader in a business with 200+ sales personnel, 5+ tier one storage vendors, and 500+ hard deck customers alone …I need all the free cycles I can get! Yes, when it comes to efficiency I will admit that the shiny brushed aluminium and slim form factor alone of the MacBook Air are as difficult to justify to Mrs. PL as a pair of Manolo Blahniks would be to me …however, walking into a customer meeting and being able to open the clamshell of the MBA and work instantly? The native 802.11n wireless support? Not having to lug a 1.5 kilo hunk of metal on the plane? Access to the Windows corporate build via virtualisation software? Efficiency cubed. And the additional accoutrements of the new ‘supercharged’ home office add …nay, amplify …this efficiency.

In saying that, Mrs. PL really couldn’t care less. Yes, okay …she is supportive of any efficiency that means I don’t bring work home at night or weekends and spend more time with her and PL Junior …but 802.11a/b/g/n 5GHz NAS connected devices? She couldn’t give a Fig Newton …but when the wireless connection kept breaking down, or the internet was too slow for her to harvest her crops in FarmVille after PL Junior went to bed, or when the internet connection would reset every time the bleedin’ phone rang, or when we had to ‘sneaker net’ files with USB drives between her ‘puter and my CrackBook …you bet your patootie she cared, and I got an earful!

And so it is with our customers; I believe that the world has become almost evenly divided between those who view IT as a strategic enabler to their business and are prepared to invest sensibly (me, in the equation above) and those who aren’t at all interested in ‘speeds and feeds’ and view IT as a service (Mrs. PL in the equation above).

Who is right? We …and they …both are, and Computacenter solutions are designed to address both IT as a strategic enabler and as a service. But we aren’t stopping there …we are so confident of our ability to positively transform an IT environment to be more agile and a strategic business enabler that we are now prepared to have ‘risk / reward’ conversations with customers whereby we will underwrite the savings and risk associated with the transformation. Equally, Colin Bradford has me and others busy at work designing ‘Storage as a Service’ and other IT services which fit into managed offerings whereby we can sell a customer storage at £(x) per user per month.

The challenge is in knowing how to identify our customer’s business issues and ensuring that we articulate the answer as either a strategic enabler, service …or both.

Please feel free to contact me if you need assistance with such a journey now.

-Matthew

Click here to contact me.

Storage sushi?

07/08/2009

I’m delighted to be back at work after having spent a fortnight on holiday with Mrs. PL and PL Junior in Malta.  I hadn’t been away from work for a full fortnight since Mrs. PL and I went on honeymoon almost six years ago, so it was nice to get away from the hectic pace of London life for a while and relax in almost perpetual sunshine.  This was our second holiday to Malta as we’d had such a pleasant and relaxing time last year, and if you’ve not considered Malta as a holiday destination …I cannot recommend it highly enough.

Many interesting things happened whilst we were away …a nearly three year old PL Junior ensured endless entertainment …but one of the more interesting occurred when we decided to visit Malta’s premier sushi destination.  Now …I know what you’re thinking …on a 70 square mile island in the middle of nowhere, surely the sushi will be substandard?  Erm, no as it happens …it was stunning, possibly the best sushi I have had anywhere in many years.  The fish was incredibly fresh, the miso soup was lovely …and the service was second to none.  I suppose it helps that the owner is from Japan and almost all of the staff are Japanese …but the thing which really shocked me was PL Junior insisting on eating a bowl of miso soup …which he loved …followed by his swift devouring of all of Mrs. PL’s salmon and tuna nigiri!  Who knew PL Junior liked sushi?!  The sushi chef was so impressed that he taught PL Junior how to bow and say ‘arigato’ …which he now does just about everywhere.  So if you are ever in Malta and are looking for a good meal, stop by Zen restaurant in Portomaso, St. Julians and tell them the little boy from Mill Hill sent you.

What does this have to do with Data Storage & Protection?

Becoming a sushi chef is no trivial matter and can often take years.  When we tried to leave the restaurant, PL Junior refused as he was fascinated watching the sushi chef prepare the orders and wanted to watch more.  I have to be honest, I was pretty fascinated myself and was happy to stick around a bit longer, and I’m pleased I did, as the sushi chef explained something that I never knew about sushi …the sharpness of the knife and the way in which the fish is cut can make all the difference in the taste of the fish.  I know, I know …I thought he was having me on as well, but he cut me a piece of sashimi where he brought the blade straight down as well as one where he rocked the blade properly and believe me …there is a discernible difference!

I am sometimes asked by customers and Computacenter sales folks alike, ‘If we don’t make anything, why wouldn’t a customer just work directly with the vendor?’  A fair point, but I could try to make the same point with sushi.  It is, after all, just raw fish …and I’m quite sure that on an island surrounded by Mediterranean water such as Malta …I could find good quality ingredients.  But what happens when I want to make some mouth watering nigiri or sashimi?  And what happens if I want salmon and Mrs. PL wants yellow fin (as often happens)?  And what if there are other equally delectable types of sushi which I might love, but have never been exposed to yet …like PL Junior?  I suppose I could train to become a sushi chef …or find a nice restaurant like Zen.

Having read Malcolm Gladwell’s ‘Outliers’ recently on holiday, there is a very interesting chapter which talks about a study which showed that to truly become an expert at something requires, on average, about 10,000 hours of dedicated training and practice.  Now, our Computacenter storage consultants retain some of the highest accreditations in the industry …and to be fair I’ve never asked them if they’ve spent 10,000 hours studying and practicing data storage …however, I wouldn’t be surprised if many of them weren’t far off that number.  And here’s the critical point …they often have gained this experience across multiple vendor products and within customers spanning right the way from banks through to merchants to the public sector …and beyond.  Indeed, I have storage expertise which spans several countries [USA, Ireland, United Kingdom, Luxembourg] as well as several vendors [EMC, HP, IBM, NetApp, HDS] and can happily tell you what I think works well and perhaps not as well given different business issues and drivers.  And I’m not alone …we have dozens of such ‘storage masters’ within our organisation.  Sushi storage masters, if you will!

We will continue to work with vendor partners who make exceptional storage products which help our customers to align their data to business value in the pursuit of reduced costs and increased efficiencies.  The freshest fish, if we are using our sushi analogy.

Our job is to understand these technologies at an expert level to ensure that we produce solutions which not only demonstrably reduce costs …but also do so in a way which is non-disruptive to our customer’s business.  How to produce the best sushi at the right cost, if we are going to extend our sushi analogy.

To do that takes expertise and experience not easily gleaned without deciding to become a sushi master …to know which fish to select to produce the best sushi …to know how to cut in just the right way to produce the best tasting product you can.

Please feel free to contact me if I can be of any assistance in articulating how Computacenter adds significant value to vendor products in the pursuit of market leading solutions for our customers.