Musings on IT, data management, whitewater rafting, and backpacking

Tuesday, December 29, 2009

Massive disk failure

I have been a big fan of Sun's ZFS file system for many reasons. We have about 100 TB of active ZFS storage, mostly ZFS RAIDZ2, which can survive double drive failures.
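
For rough planning, the capacity math is simple. A quick sketch in Python (the vdev layout and drive sizes below are made-up examples, not our actual pool):

    # Rough usable-capacity estimate for RAIDZ2.
    # Each RAIDZ2 vdev stores two drives' worth of parity, so any two
    # drives in a vdev can fail without data loss; usable space is
    # roughly (drives - 2) * drive_size, ignoring metadata overhead.

    def raidz2_usable_tb(drives_per_vdev: int, drive_tb: float, vdevs: int) -> float:
        return (drives_per_vdev - 2) * drive_tb * vdevs

    # Hypothetical layout: 12 vdevs of 10 x 1 TB drives.
    print(raidz2_usable_tb(drives_per_vdev=10, drive_tb=1.0, vdevs=12))  # 96.0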

We recently suffered the biggest storage failure we've had in decades. The story is still unfolding, but the beginning and middle have already been written.

Tuesday, December 22, 2009

Comcast?

Heard from our building manager that Comcast had pulled cable into our building. So I called Comcast to find out what they might offer for our existing needs, and for building out our new space.

I was surprised.

Tapes are not for long term storage

Tapes are much more energy efficient than even the best MAID disk storage system.

However, until you've had to recover data from tens of thousands of rotting tapes that require tape drives no longer manufactured -- don't talk to me about energy efficiency. We need to keep lots of digital data "forever", but we rarely get enough money to do that well.

I can tell many strange stories about tape rot and recovery, but one story is especially absurd.

Saturday, December 19, 2009

Thoughts on cloud storage

We have hundreds of terabytes of active storage on our servers. Most of that storage is cold storage, so we've focused on inexpensive, reliable, slow, online storage. We use tape for offsite backups only, more on that later.

Lots of noise about cloud storage, so I started running the numbers using Amazon's published pricing.
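
A minimal sketch of that back-of-the-envelope math in Python; the per-GB rate below is a placeholder (roughly the 2009 first-tier list price), so substitute the published tiered rates:

    # Back-of-the-envelope cloud storage cost.
    # ASSUMPTION: a single flat per-GB-month rate; real S3 pricing is
    # tiered, so plug in Amazon's published numbers for each tier.

    def monthly_storage_cost(tb_stored: float, usd_per_gb_month: float) -> float:
        return tb_stored * 1024 * usd_per_gb_month

    # Illustrative: 300 TB at $0.15/GB-month.
    print(f"${monthly_storage_cost(300, 0.15):,.0f} per month")  # $46,080 per month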

Two problems popped out immediately:


Thursday, December 17, 2009

Cat 6 or 6A cabling?

The usual dilemma when installing a new network, especially alongside an existing network.

Our existing network (about half the total space) uses Cat 6 wiring installed 3-6 years ago, before Cat 6A was widely available. We have Gigabit Ethernet to the desktop. GbE is good enough most of the time, but some heavy data users and some network backups could use more bandwidth.

Which means I should put in Cat 6A in the new space to "future proof" the network. Which costs more. Which might not be in the budget. And we definitely don't have the money to rip and replace the existing Cat 6 cabling, creating bandwidth "haves" and "have nots" in the building.

Cat 6A cabling is much thicker than Cat 6, which will likely cause problems with conduits, ladder racks, cabinet punchouts, bend radii, etc.

By the way, POTS (Plain Old Telephone Service) runs just fine over 2 pairs of Cat 6 cabling terminated in RJ-45 jacks. RJ-11 cables plug right in with no problems. Any jack in any office can carry either POTS voice or Gigabit Ethernet data, with a couple of minutes in the telecom closet swapping jumper cables.
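
For the curious, here's why that works: an RJ-11 plug's center pins land on pins 4-5 of an RJ-45 jack, which carry the blue pair under T568A/B. A sketch of the mapping (standard pin positions, shown as plain data):

    # Why RJ-11 plugs seat in RJ-45 jacks: the RJ-11 center pins land on
    # RJ-45 pins 4-5, pair 1 (blue) under T568A/T568B. A second POTS
    # line maps to RJ-45 pins 3-6 (the green pair in T568B).

    POTS_ON_RJ45 = {
        "line 1": {"rj45_pins": (4, 5), "t568b_pair": "blue"},
        "line 2": {"rj45_pins": (3, 6), "t568b_pair": "green"},
    }

    for line, wiring in POTS_ON_RJ45.items():
        print(f"{line}: pins {wiring['rj45_pins']}, {wiring['t568b_pair']} pair")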

We probably won't decide between Cat 6 and Cat 6A until we get cost estimates in a few months.

Four standard answers

A few years ago, the people I raft with started circulating the Four Standard Answers that may be used by whitewater raft guides in response to nearly any question:

  • It depends.
  • About 45 minutes.
  • Just around the next bend.
  • Chest high on a duck.
Figuring out what these answers are good for is left as an exercise for the reader.

Especially the last one. Think about it.

Wednesday, December 16, 2009

No raised floors!

Been there, done that, have scars, never again.

Traditional data center design uses raised floors, perforated tiles, rows of cabinets surrounded by hulking HVAC units and giant PDUs, and a whole room dedicated to batteries. No thanks.  I have a better way.


Phones are a PITA!

Seriously, I've suggested ditching all but a handful of land lines since most of our staff have cell phones. Nobody's willing to cut the cord(s) just yet.

Our existing space has about 120 AT&T Centrex lines. Very old school, very cheap to install, very expensive monthly charges, very easy to administer, and the phones work when the power goes out -- which is frequent here. Centrex doesn't have a lot of fancy features, and you pay a lot to add them. We don't need much -- voicemail on about half the lines, a few people need Caller ID, that's about it.

Ideally, we'll add another 120 Centrex lines. Checking with AT&T to see if they have another 120 pairs available here. We're a couple of cable miles from the CO, so I'd be astounded if they pulled more pairs just for us. Last time we added lines here, it didn't look like they had many spare pairs coming into the MPOE, and other tenants have moved into previously-empty spaces since then.

Our only alternative is an Avaya VOIP system. Corporate standard, no other vendors allowed. We'd dump our existing Centrex and go all VOIP; we'd need about 240 lines. Lots of features, sexy new technology, security issues, high up-front cost, high administrative load, and we'd still need some POTS lines for backup when the power fails.

Still waiting for a quote from our contractor, but a pretty good estimate shows the VOIP system will cost about $100K more up front, with a payback time around 2 years. Regardless of the payback time, I might have a hard fight to get the extra $100K up front.
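
The payback arithmetic itself is trivial; a sketch with an illustrative monthly savings figure (an assumption for illustration, not our actual quote):

    # Payback time: extra up-front cost divided by monthly operating savings.

    def payback_months(extra_capex_usd: float, monthly_savings_usd: float) -> float:
        return extra_capex_usd / monthly_savings_usd

    # $100K extra capex; a ~2 year payback implies roughly $4,200/month
    # saved versus Centrex monthly charges (illustrative, not our quote).
    print(f"{payback_months(100_000, 4_200):.0f} months")  # 24 months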

If we can't get the Centrex lines, the decision is easy.

I'll know more in a few weeks.

It's not about square feet

For those of you new to data center design: it's not about square feet, or watts per square foot.

It's about total kilowatts, and cubic feet per minute.
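
To make that concrete: the standard sensible-heat formula converts a heat load and an allowed temperature rise into required airflow. A sketch, sized for a room like the one described below, with an assumed 20 F rise:

    # Required cooling airflow from the standard sensible-heat formula:
    #   CFM = BTU/hr / (1.08 * delta_T_F)
    # where 1 kW = 3412 BTU/hr and the 1.08 factor folds in air density
    # and specific heat at sea level.

    def required_cfm(load_kw: float, delta_t_f: float) -> float:
        return (load_kw * 3412) / (1.08 * delta_t_f)

    # A 60 kW room with an assumed 20 F rise across the equipment:
    print(f"{required_cfm(60, 20):,.0f} CFM")  # ~9,478 CFM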

New space, new data center, new networks, new phones

We're finally moving the other half of our group into newly remodeled space adjacent to the first half, about 70 people joining 70 others.

I'm designing a new data center (500 sf, 60 kW, not very big), a new network, maybe new WAN links (currently DS-3, 45 Mbps), and probably a new phone system for both the new and existing space. We're working through multiple levels of indirection, so this will be challenging.

Good news:
  • I've done this before, and I like doing this
  • I've had a couple of years for research and design, due to various delays
Bad news:
  • Tight budget and deadlines
  • Need to minimize operating costs, too
  • Big "green" push
I've learned a lot in the last couple of years.

Bottom line: like anything else, lots of vendors out there have "solutions" that make them rich and don't meet our needs, while claiming to be "green".

Details to follow.

Friday, December 11, 2009

Introduction

I know a lot about some topics:
  • Information Technology - security, storage, networking, data center design
  • Whitewater rafting
  • Lightweight backpacking and day hiking
  • Wilderness first aid
Why should you read what I write?
  • I've worked in IT and IT management for over 30 years
  • I've been a whitewater raft guide, guide trainer, and trainer-trainer for over 20 years
  • I've been backpacking and hiking for over 40 years
  • I've kept up my wilderness first aid training for over 30 years, and have been WFR certified for 15 years
And yes, I tend to think and write in outlines.