(Psst, hey you. You may want to go read Part 1 first)
After months and months of planning and prep work, GCC's main IT production gear moved offsite to a real datacenter just a few weeks ago on Sunday, Jan 22nd! w00t! We've always wished we could have our gear in sweet colocation (colo) space, but given the costs I never would have guessed we'd be there in early 2012, or even by 2020. How this came to be is a fun story, but it's not short … so that will be another blog post or 2 down the road :-)
Until then, here are a few details and pictures. Oh, and monster props to Justin, who did all the heavy lifting to plan, prep and execute the myriad complexities of this move, as a number of huge network changes and upgrades preceded the actual move. Thanks also to IT volunteers Ed Buford, Tom Templin, and Aaron Nush for their help Sunday relocating the gear. Additional thanks to Ed Buford and Pinnacle for Exchange changes the week prior to the move to minimize email downtime.
Where is the gear now?
We're part of a non-profit co-op sharing a micro suite at Global Access Point (GAP) in downtown South Bend, IN. We share a 42U rack with our long-time friend Josh from Center For Hospice Care for the price of … FREE! We're currently fully utilizing 20U of space (see below), and Josh says we can have a few of his U's if needed … right Josh? ;-)
The going rate for space at GAP is $50-70/U/month so we're getting $1000-$1400/mo of space at no cost. Booyah!
We have dedicated redundant 20A power feeds into our PDU in the rack, so if one 20A feed goes down the other takes over without a power drop. The power feeds are also on UPS and generator backups … it's sweet to no longer worry about power going out at GCC when thunderstorms roll into the area. The downside? It's a salty $350/mo for said power. Wonder if any of the top dogs at GAP go to GCC? Would love to find a way to lower that cost.
How does the data get between GCC and the colo?
This is part of the long story mentioned above, but we have a sweet Metronet dark fiber link between GCC and our router at the colo at whatever speed we can light it. For now we're lighting it at 1Gig since 10Gig lasers aren't cheap. We'll monitor the link for a while to see how congested it gets. We're using Cogent as our ISP as they have a presence at GAP and were priced very well. We pay for 100Mbps up/down burstable to 400Mbps, but we've seen it hit as high as 500Mbps! The kicker? The cost is LESS than what we were paying for our AT&T fiber at 10Mbps! Booyah! What's fun is when staff publicly cheer about how fast our internet speeds are, like this tweet from Dustin :-)
Honestly, it's a bit strange to think about all our gear being downtown and accessed over a small optical fiber pair the size of a human hair. According to the Metronet folks, our optical path to the colo is 11.4 miles while the driving distance is only 7 miles. So, while it takes 20mins to get to the colo by car, it's a mere 90 microseconds for data traveling down the fiber :-)
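If you're curious where that 90-microsecond number comes from, it's simple physics: light in glass travels at roughly the vacuum speed of light divided by the fiber's refractive index. Here's a quick back-of-the-envelope sketch — the ~1.47 refractive index is my assumption of a typical single-mode fiber value, not a Metronet spec:

```python
# Rough one-way latency estimate for an 11.4-mile fiber path.
# Assumes a refractive index of ~1.47, typical for single-mode fiber,
# so light travels at about c / 1.47 (roughly two-thirds of c).

SPEED_OF_LIGHT_VACUUM = 299_792_458   # meters per second
FIBER_REFRACTIVE_INDEX = 1.47         # assumed typical value
METERS_PER_MILE = 1609.344

path_meters = 11.4 * METERS_PER_MILE                       # ~18.3 km of glass
speed_in_fiber = SPEED_OF_LIGHT_VACUUM / FIBER_REFRACTIVE_INDEX

latency_us = path_meters / speed_in_fiber * 1e6
print(f"One-way latency: {latency_us:.0f} microseconds")   # ~90 microseconds
```

Round-trip, that's still under a fifth of a millisecond — effectively invisible next to the switching and server latency on either end.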
And yes, we do have a cheap Comcast business connection into GCC as our backup ISP. Should the fiber go down for some reason, we can reroute most things over the Comcast connection and limp along.
Guess I'll need a part 3 to go over a few more details like what gear is still at GCC, but until then here are some pics from the move.
Tom, Justin, Ed, Aaron ripping gear from the rack
Loading gear into cars. Thankfully the weather wasn't too bad considering the time of year.
All our junk in the suite hallway. Tom is looking into our micro-suite. Yes, we brought a table and 2 chairs along with the gear since there's no place to sit or set stuff :)
Justin and Ed hoist one of our Dell R710 ESXi hosts onto its rails
Justin and Ed starting to wire up gear at the rear of the rack
Tom is working on the rails for our 2nd and oldest PS100E EqualLogic array
Tom and Ed working(?) at the rear of the rack. Yes, it's a bit cramped back there.
Front view of the rack with all gear in place. It's a bummer that the top R510 bezel is black and thus doesn't match the R710s and EqualLogic PS6000E :-(
Oh, we also need to take out the sliding shelf above the KVM at some point.
Rear rack view. Cables are mostly nice and tidy and correctly color-coded based on function. Also note our rear door can actually close, unlike our neighbors'. Maybe they'll be inspired to clean up their cabling after seeing ours? #doubtful
If you have questions about any of this, leave a comment and I'll try to address them in a comment or include them in the part 3 post.
Very Impressive! God is definitely blessing GCC!
I honestly was a little confused with several of your tweets over the last couple months... GCC moving servers to COLO. I thought you were talking about a move to Colorado! Ha!! Still, moving all the gear to a datacenter makes a lot of sense.
Thanks for sharing details!
Posted by: JohnCStark | February 08, 2012 at 12:09 PM