South Louisiana - Putting my server in the attic or in a shed?

zovc

King of Cable Management
Original poster
Jan 5, 2017
852
603
Hey!

I know this doesn't have much to do with SFF, but I don't really know where I "should" ask about this. Google wasn't too helpful.

I'm going to be moving into a house with some friends soon, and I'm going to be repurposing an old server to be our router and NAS and maybe to run a gaming VM for our living room or company to remote into. I've been trying to fit it into desktop chassis and it's just not the right approach, it's time for me to get a proper server chassis and maybe a rack for the switch and everything else.

If I need to, I can put the server/rack inside the house, we'll all have our own bedroom and the office area is small and mostly out of the way. But, if that server ends up in the office and it's at all noisy, it'd be pretty noticeable and probably unpleasant for anyone wanting to do something in there. I was asking one of the guys about it, and I mentioned I thought about trying to put it in the attic, and he suggested I put it in the utilities "shed."

So, the house is laid out with the bedrooms and bathrooms on one side, all joined by a hall; the hall turns and exits near the office (the front "central" entrance of the house), then spills out into the living area with the kitchen off to the side. The kitchen exits into the garage (with attic access), and the garage has a regular door to a shared patio with the office's entrance. If you exit the garage in the opposite direction of the street/front entrance, you pass by a utilities shed that shares a roof with the house. Then, in the back yard, there's a second, large, detached shed.

So, the three out-of-the-way options are the attic, the "utilities shed," and the "tools shed."

I'm taking for granted that most activities in either shed will be pretty noisy, so 3+ fans running 100% shouldn't be much worse than white noise in one of those rooms. In the attic, I'm pretty sure the server could be positioned to where it never bothers anyone.

The attic will be most isolated from Louisiana's humidity, I think. Unfortunately, the attic will probably absorb and retain a lot of heat, probably staying over 100F in the summer. It would also technically have the shortest Ethernet run to each outlet in the home, and depending on the layout of the attic, could even be strategically positioned to have a roughly equal distance from each jack.

The utilities shed shares a roof with the rest of the house, so running Ethernet to and from it would be fairly easy. With the attic over its head and the somewhat neglected construction/windows, even if it gets up to outside temperatures during the day it should sink back down into the 80-90F range at night. I don't have any measure of the relative humidity, but I don't think there's any air circulation system or modern insulation to help keep the outside humidity out.

The tools shed is pretty much the same as the utilities shed, except that I'd have to run an estimated ~150ft of Ethernet to it; it'd almost certainly be worth just running one industrial (10Gb+) cable for each NIC in use on the server and using a separate switch inside the house. That could complicate or increase battery backup costs? But this would definitely be the most out-of-the-way solution. Because this shed isn't connected to the house, I'm taking for granted it does the least to fight off the elements in terms of heat and humidity.

I'm fairly sure that any of these options are sheltered from rain, but I could always rig something up to be even more sure.
 

jeshikat

Jessica. Wayward SFFn Founder
Silver Supporter
Feb 22, 2015
4,969
4,781
I've seen what Southern humidity does to computers left sitting in garages, sheds, and storage units so I wouldn't recommend it.

If it's just a file server it shouldn't be generating that much heat though, is making it quieter not an option?
 
  • Like
Reactions: zovc

zovc

King of Cable Management
Original poster
Jan 5, 2017
852
603
I've seen what Southern humidity does to computers left sitting in garages, sheds, and storage units so I wouldn't recommend it.

If it's just a file server it shouldn't be generating that much heat though, is making it quieter not an option?

I don't know what the relative noise level of the system is yet--I don't have a proper server chassis, and I need to get my hands on one. Do you have any recommendations?

I found one on Newegg for ~$400 that can house 20 drives, which seemed like an insanely good deal even if the case itself ended up kind of rickety. I'll edit this post with a link when I get home this evening. It's unlikely I'd put 20 HDDs in there any time soon, but it'd be more storage than we could ever need.

The case has, I think, two 80mm exhaust fans, and it's unclear whether it ships with the older or the newer mid-section plate. The older one fits 4 90mm(?) fans and the new one fits 3 120mm fans. Even without having seen the case myself, I could probably do some amount of modding (and fan shopping) to reduce the noise.

Edit: Here's the case.
 
Last edited:

jØrd

S̳C̳S̳I̳ ̳f̳o̳r̳ ̳l̳i̳f̳e̳
sudocide.dev
SFFn Staff
Gold Supporter
LOSIAS
Jul 19, 2015
818
1,359
It's probably worth figuring out what you want to put in it: number of drives and drive form factor (maybe work out what kind of stuff you want to store on it, how much redundancy you want, etc.), whether you want a backplane, EATX/mATX/ATX/mITX support, a hot-swap drive cage or not, etc. This should give you a fairly good idea of how deep your chassis will need to be, what sort of functionality it will need to have, and how many rack units tall it's going to be. That will essentially dictate what form factor PSU you're going to end up using and how big you can go on CPU cooling, expansion cards, etc. From there it gets a little easier to shop. Newegg has some good deals, as do AliExpress and (YMMV) eBay. Once you know what functionality you need from a chassis, it's really just a case of reading the spec sheets until you have a short list, then excluding by price. If you're willing to forgo things like SAS backplanes and hot-swap cages, the price comes down pretty quickly. Also, if you don't need something that's big enough to fit a couch-sized motherboard, prices tend to come down some too. It's also worth factoring in cost for things like a SAS HBA, multi-NIC cards, etc. if you think you're going to need them; they can add up quickly.

Both cases came from AliExpress; they're a bit rough around the edges, but they didn't cost $400 (or even half that). The bottom one is about 55cm deep (mATX compatible), 3U tall, and runs as a file / general use server. The one above it is 25cm deep (mITX compatible), 1U tall, and serves as my router / gateway box. They're not in a rack because I'm cheap and it's not going to bring me any more benefit in space reduction than I already have. Once I'm done living in a shoe box, that will likely get remedied.
 

neilhart

Cable-Tie Ninja
Apr 18, 2017
149
271
In South Louisiana a basement would literally be water. :cool:



I don't really know how to shop for server cases. They seem to jump from $400 to $1,000+ and that's a hard pill to swallow.

Owning an in-house server is a PITA, in my experience. One time I had a system drive crash while the attached RAID had one drive down with a spare drive mapped in place. Restoring the system drive left the RAID in an undefined state, as the tables for the RAID sparing were lost. And very unhappy users.

My point is that when other people are using your server, they expect ISP five-nines reliability… This means a system of backups and even a complete backup system that is a mirror of the online one.
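For a sense of scale, "five nines" is a brutally small downtime budget. A quick back-of-the-envelope calculation (plain Python; the availability targets are standard figures, not from this thread):

```python
# Downtime allowed per year for a given availability target.
# "Five nines" (99.999%) leaves only about 5 minutes per year,
# which is why home servers rarely meet users' expectations.

MINUTES_PER_YEAR = 365.25 * 24 * 60  # ~525,960

def downtime_minutes_per_year(availability: float) -> float:
    """Minutes of allowed downtime per year for an availability fraction."""
    return (1.0 - availability) * MINUTES_PER_YEAR

for label, avail in [("two nines", 0.99), ("three nines", 0.999), ("five nines", 0.99999)]:
    print(f"{label} ({avail}): {downtime_minutes_per_year(avail):.1f} min/year")
```

Even a single drive swap or OS reinstall blows through a year's five-nines budget many times over.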

IMHO the best solution is ISP provided WiFi for the house and all of the users to use the cloud for file storage as needed.
 
  • Like
Reactions: Biowarejak and zovc

wiretap

Average Stuffer
Apr 25, 2017
55
142
With my entire rack up and running, I can't even hear it from 6 feet away. It is whisper quiet. You just have to spec your components and fans for quiet operation. Putting everything in an enclosed rackmount cabinet helps cut down on noise as well. Here's mine, which sits in a spare bedroom. Wife approved and extremely low noise:

22U Rackmount Loadout:
Pro Audio Stash 22U, 32in depth Rackmount Enclosure with Glass Door
2U Belkin PureAV PF60
1U HP ProCurve 1810G-24 Gigabit Switch (all CAT6 cabling to clients)
3x 4U HTPCs
1x 4U ESXi Server
Netgear CM1000 DOCSIS 3.1 Modem + Comcast 125Mbps/25Mbps Service
pfSense Firewall/Router
Access Points: 4x Asus RP-AC68U 802.11ac
Cyberpower 1500VA Battery Backup
Obihai 200 w/ Google Voice over IP
17" Asus LCD for KVM Management


ESXi Server Orion:
Norco RPC-4220 4U Rackmount Case
1000w Corsair RM1000 Power Supply
Supermicro X9SCM Motherboard
Intel Xeon E3-1240 Processor
Supertalent 16GB (4x4GB) DDR3-1333 ECC RAM
Ceton InfiniTV4 PCI-e CableCARD Tuner
3x IBM m1015 SAS Controllers
2x 250GB Samsung Evo 850 SSDs (Datastore and Backup Datastore)
7x 4TB Western Digital SSHDs
10x 2TB Western Digital Green HDDs
5x SFF-8087 to SFF-8087 cables
3x 120mm Arctic Cooling F12 PWM Fans
2x 80mm Arctic Cooling F8 Rev.2 PWM Fans
1x 92mm Arctic Cooling Alpine 11 Plus PWM CPU Fan
---Running ESXi 6.0 Update 2
---VM1: Windows 8.1 x64 Pro w/ MCE
---VM2: Windows Server 2012 R2 x64 Essentials
---VM3: Windows XP Professional

3x HTPCs:
Rosewill RSV-R4000 4U Rackmount Case
Corsair CX430 V2 Power Supply
MSI P67A-G43 B2 Motherboard
Intel i3 2100 Processor
Rosewill RCX-ZAIO-92 92mm CPU Cooler
2x2GB PNY DDR3-1600 Memory
Galaxy GTS 450 GC Graphics Card
Corsair 60GB Force SATA III SSD
nMEDIAPC Blue Pro LCD
2x 120mm 1000RPM Silent Blue LED Fans (Front)
2x 80mm 1200RPM Silent Blue LED Fans (Rear)
---Running Windows 8.1 x64 Pro w/ MCE and Emby Theater
---USB over CAT6 for IR receivers below the TV's
---HDMI over CAT6 for audio/video to the TV's

pfSense Firewall:
Silverstone PT13 Slim-ITX Case, passive cooling / silent operation
Jetway NF9HG-2930 (Intel Celeron N2930 2.16 GHz Quad Core Processor, 4x Intel i211AT Gigabit LAN, mini-PCIe, mSATA)
8GB DDR3-1600 Kingston HyperX Impact 1.35v
128GB SanDisk x110 mSATA SSD
FSP 60w Power Supply
---Running pfSense with Snort, Squid, OpenVPN, nTop, ClamAV

 
Last edited:

Zeroth Alpha

Cable Smoosher
Jul 24, 2016
12
18
In South Louisiana a basement would literally be water. :cool:



I don't really know how to shop for server cases. They seem to jump from $400 to $1,000+ and that's a hard pill to swallow.
The case you linked to above does have an optional 120mm midsection fan mount. There is a review of it here: https://www.servethehome.com/120mm-fan-partition-norco-rpc4220-rpc4020/. Server cases are unfortunately very expensive, as the market isn't particularly price sensitive. I would totally recommend going on eBay and buying an old Supermicro server and just scrapping the electronic components, though; great quality used cases can be had for very cheap. Supermicro is definitely better than Norco.
 
  • Like
Reactions: Biowarejak and zovc

zovc

King of Cable Management
Original poster
Jan 5, 2017
852
603
It's probably worth figuring out what you want to put in it.

@j0rd! I know you're in the know on this sort of stuff!

My motherboard is SSI EEB size, which makes things a little difficult.

I'd like room for at least 12 (3.5") drives total. Hot swap isn't super important to me, but I've seen large server cases that load drives vertically and store ~48(?) drives like that, when you take the top off. That seems like a really good use of space and a ~24 drive unit like that would give me more space than I'd ever feasibly need. If I understand correctly, a lot of server racks (like even the Norco one I linked) have individual cables for several drives. I think my motherboard has 12 SATA/SAS ports, which is enough without needing an extra controller out the gates. I would like to be able to add a controller card in the future if I needed though.

I'd like to be able to fit my Zotac GTX 980 Amp! Edition in there for the time being. Somewhere down the line I'd probably upgrade it to a more robust card, but it's still quite powerful. Being able to move it away from the motherboard (or really, the other expansion slots) would be nice, since it is dual-slot and blocks a slot no matter where I install it. That's a small problem and it isn't holding me back at the moment.

I have 4 Ethernet ports on the motherboard, but installing an additional card to try to get 10Gb between my server and clients would be a fun project. The number of actual ports on the board isn't a huge deal since I can use a switch. I bought a cheap one on Amazon that is supposed to be rack mountable, but I have been having issues with it alongside my current consumer-grade router.

Owning an in house server is a PITA from my experience.

I can definitely imagine there being issues and quirks, but I can't really see it being more than an inconvenience for any of us. I'm going to make sure our data is backed up elsewhere, so any projects that hiccup from failures aren't totally lost.

With my entire rack up and running, I can't even hear it from 6 feet away.

That thing is pretty pretty! I'll have to consider something like that.
 

jØrd

S̳C̳S̳I̳ ̳f̳o̳r̳ ̳l̳i̳f̳e̳
sudocide.dev
SFFn Staff
Gold Supporter
LOSIAS
Jul 19, 2015
818
1,359
My motherboard is SSI EEB size, which makes things a little difficult.
Cases that support that size of mobo are (understandably) quite deep and usually more expensive than their short-depth brethren. That being said, there is still potential to pick up a cheaper unit from AliExpress or the like, but it would probably be missing features like a SAS backplane, build quality, etc.

a lot of server racks (like even the Norco one I linked) have individual cables for several drives.
Usually it's a PCB backplane. One side will have some kind of multi-lane SAS connector(s) and power connector(s); the other will have the drive connectors on it. As an example, my current card has two 4-lane SAS connectors on it. Because I'm cheap, I'm using a forward breakout cable that has 4 SATA connectors on it, but in a better case I could hook it up to a SAS backplane and have 8+ drives using only two data cables. Some backplanes also include port multipliers that enable you to run more storage. Also worth noting is that some backplanes and cards will include connectors for a sideband cable; these tend to provide for things like staggered spin-up, LED drive identification, etc.

I'd like room for at least 12 (3.5") drives total. Hot swap isn't super important to me, but I've seen large server cases that load drives vertically and store ~48(?) drives like that, when you take the top off.
I'm going to guess you're thinking of the Storinator. IIRC LTT has made a fuss about them a few times, and also IIRC Backblaze use them (they also release a quarterly drive reliability report; it's a good read if you're looking to buy lots of drives). There are probably other chassis with a similar layout, but that's the one people seem to know. I will say a front-facing hot-swap cage may not seem like a big deal, but it's super nice to have when you need to replace a drive and don't have to slide the chassis out, open it up, and dig around inside to get it done.

I'd like to be able to fit my Zotac GTX 980 Amp! Edition in there for the time being.
Off the top of my head, 3U is the minimum height if you want to put full-height PCIe cards in; you can get away with 2U if you're able to use half-height cards (I'm currently collecting parts for a few half-height cards to put into a 2U pfSense box).

Is the switch managed or unmanaged? Are you using both the rackmount switch and the switch built into your consumer router, or just one or the other? In theory an unmanaged switch is transparent and should run fine if you hang it off the router and then hang all your clients off it. If it's a managed unit, it may be doing things like DHCP / DNS / other weirdness, which would need to be disabled to prevent it screwing with whatever is providing those services currently. Also, if it's managed it will have an IP; if that's not on the same subnet as the device you're using to access it, then you will need to fiddle about to talk to it (say you're on 10.0.1.1 and the managed switch is on 192.168.2.1 out of the box, or whatever).
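The subnet mismatch described above is easy to sanity-check with Python's standard `ipaddress` module. The addresses here are the illustrative ones from the post, and the /24 masks are an assumption, since typical home gear defaults to /24:

```python
# Check whether a switch's management IP is directly reachable from a
# client's subnet. If not, you'd temporarily re-address the client (or add
# a route) to reach the switch's out-of-box management page.
from ipaddress import ip_address, ip_interface

def same_subnet(client_cidr: str, target_ip: str) -> bool:
    """True if target_ip falls inside the client's local network."""
    return ip_address(target_ip) in ip_interface(client_cidr).network

print(same_subnet("10.0.1.1/24", "10.0.1.50"))    # same LAN: directly reachable
print(same_subnet("10.0.1.1/24", "192.168.2.1"))  # different subnet: needs fiddling
```

If the second check is False, that's the "fiddle about" case: the managed switch's default management IP isn't on your LAN, so your client can't ARP for it directly.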

EDIT: Backblaze do not use Storinators. Apparently they designed their own storage pods then open sourced that design. It appears that the 45Drives Storinator is based on that open source design. Protocase will also sell you a chassis based on the backblaze design and have been doing so since at least 2011.
 
Last edited:
  • Like
Reactions: Biowarejak

zovc

King of Cable Management
Original poster
Jan 5, 2017
852
603
Cases that support that size of mobo are (understandably) quite deep and usually more expensive than their short-depth brethren. That being said, there is still potential to pick up a cheaper unit from AliExpress or the like, but it would probably be missing features like a SAS backplane, build quality, etc.

Do you have any recommendations on length I should be looking for?


Usually it's a PCB backplane.

Most of those features sound pretty neat, but they all sound sort of "gravy." I could probably live without a backplane if that's where the deal was.


I'm going to guess you're thinking of the Storinator.

Ah! That's where I first saw it. Yeah, I've seen LTT's Storinator but I was actually thinking of something from Supermicro that Level1Tech used in their "budget" ~150TB server setup. It involved three similar enclosures (one for parts) and a repurposed Google server. The enclosures I'm thinking of might not have had any computer inside of them and just been glorified hard drive controllers, though...


Off the top of my head, 3U is the minimum height if you want to put full-height PCIe cards in; you can get away with 2U if you're able to use half-height cards (I'm currently collecting parts for a few half-height cards to put into a 2U pfSense box).

From what I've seen, yeah, 3U is the shortest height that allows full-height cards. But I would still believe it if there was a 2U case that could mount a full-height card sideways to make it fit. In the future, it's possible I could upgrade my 980 and get a low profile card, but at the moment it wouldn't really be worth it.

Is the switch managed or unmanaged?

I endeavored to select an unmanaged switch. (I don't recall it having its own IP or subnet--and mine is not 192.168.1.X--so I took for granted it wasn't trying to manage anything.) I did continue to (also) use the router itself as a switch, which I didn't consider could be a problem.

I never got around to hassling with the switch because I didn't need it yet. Between my HTPC, my server, and my two VMs, I had enough ports for my day-to-day. I'll fool around with the switch some more and contact Rosewill when I'm trying to get our network up and running at the new house; I'll actually need the extra 16 ports with other users on the network. Originally I bought the switch mainly to drag to LAN parties. :cool:
 

jØrd

S̳C̳S̳I̳ ̳f̳o̳r̳ ̳l̳i̳f̳e̳
sudocide.dev
SFFn Staff
Gold Supporter
LOSIAS
Jul 19, 2015
818
1,359
Do you have any recommendations on length I should be looking for?
You're probably going to be looking at around 60-70+ cm. It's only really important insofar as making sure your rack is deep enough; if the spec sheet says the board will fit, then chances are it will fit. It's probably best not to get a 2-post rack (sometimes called a telco rack), and make sure you put the chassis on rock-solid rails or a rack shelf; they get real heavy real fast with a stack of drives in, and those front panel mounts aren't great at taking weight. A full-depth 4-post would be a solid bet, and chances are if you keep your eyes open on the 2nd-hand market you can find a good rack for cheap.
Most of those features sound pretty neat, but they all sound sort of "gravy." I could probably live without a backplate if that's where the deal was.
They are. That being said: digging around in a multi-drive cage full of power and data cables to find the one drive that died (for the love of god, label your drives so you can identify them), getting it out without pulling all the other drive cables at the same time, getting a replacement in when there is almost no room in the chassis to work, etc. I don't have any photos of the inside of my server, but needless to say pulling drives out is a PITA. The hot-swap cages in it are cheap and nasty, but they turn what would otherwise be a tedious task (+ time to make sure it's all hooked back up properly) into a 10-second task. It's probably worth labelling your data cables as well as your drives if you go down the all-internal-drives path.
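The labelling advice above amounts to keeping a record of which physical bay holds which drive serial. A minimal sketch of that record as data (the bay numbers and serials here are made up for illustration; on Linux, real serials can be read from `/dev/disk/by-id/`):

```python
# Map physical bay -> drive serial number, so a failed drive reported by
# SMART or the OS can be located without pulling cables at random.
# All serials below are fabricated examples.

bays = {
    1: "WD-WCC4E0001111",
    2: "WD-WCC4E0002222",
    3: "WD-WCC4E0003333",
}

def find_bay(serial, layout):
    """Return the bay number holding the drive with this serial, or None."""
    for bay, s in layout.items():
        if s == serial:
            return bay
    return None

print(find_bay("WD-WCC4E0002222", bays))  # → 2
```

A printed copy of this map taped inside the case (plus matching labels on the drives themselves) makes the "which drive died?" hunt a lookup instead of a teardown.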
Ah! That's where I first saw it. Yeah, I've seen LTT's Storinator but I was actually thinking of something from Supermicro that Level1Tech used in their "budget" ~150TB server setup. It involved three similar enclosures (one for parts) and a repurposed Google server. The enclosures I'm thinking of might not have had any computer inside of them and just been glorified hard drive controllers, though...
IIRC L1 were using JBOD disk shelves hooked up to their server's HBA(s).
I endeavored to select an unmanaged switch. (I don't recall it having its own IP or subnet--and mine is not 192.168.1.X--so I took for granted it wasn't trying to manage anything.) I did continue to (also) use the router itself as a switch, which I didn't consider could be a problem.
An unmanaged switch wouldn't have an IP address or a management interface; they're transparent to the rest of the network, and it shouldn't have any issues dealing with other switches on the network. That being said, I have nothing good to say about consumer routers (especially the bit where they're actually a router, gateway, switch, and wifi AP all in one, with an anaemic CPU, almost no RAM, and hot garbage software), so maybe it was causing problems. Seems unlikely, though.

EDIT: there are 2U chassis that will take full-height PCIe cards on a riser. IIRC you can get 3 slots into the back of a 2U horizontally. I'm not a fan of this solution because it tends to leave cards covering slots, but YMMV, etc.
 
  • Like
Reactions: zovc

zovc

King of Cable Management
Original poster
Jan 5, 2017
852
603
For whatever it's worth, my router is running custom firmware and has marginally better software. :p

... but it's definitely time to retire it and get a proper router! Which is why we're here!

@j0rd, you make some very compelling points. I'll definitely be looking for some sort of internal interface that helps with cable clutter. Thanks for all your input and insight.
 
  • Like
Reactions: jØrd