NUC shelf for datacenter rack - 3U - 12 servers by Guillaume_F
The NUC shelf ... or how to hold 12 NUCs in 3 rack units.
Real world
Tested and approved for more than 2 years now, with 3x96 NUCs running in a datacenter (cold aisle) in an HPC cluster and grid computing configuration.
No outage.
High density
2 shelves fit on a simple 19" rack tray.
You can fill a 42U bay with 96-120 servers drawing only 2 to 4 kVA, depending on the NUCs.
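As a rough sanity check, using the ~20 W per line-card budget given later in the PSU section: 120 nodes x 20 W is about 2.4 kW at the low end, and an assumed 30-35 W per node under sustained load gives roughly 4 kW, which matches the 2 to 4 kVA envelope quoted above.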
Leakage current is also greatly reduced by power sharing (one switching power supply for 12 NUCs).
Cost killer
This shelf brings the cost of a node down to nearly the price of the NUC + SSD + RAM.
PSU, fans, ABS material, cables: ~$30 per NUC.
This makes the setup the most efficient, dense, upgradeable and low-cost option on the market.
The cost of an Ultimaker 2 is amortized after completing the first row (2x6) when compared with the cost of existing solutions (x64 Intel or AMD).
Existing solutions are also often proprietary, built around proprietary boards.
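As a purely illustrative calculation (both figures are assumptions, not numbers from the project): with an Ultimaker 2 in the $2,000-2,500 range and a saving of a few hundred dollars per node compared with an entry-level commercial x64 server, the first row of 12 nodes is enough to pay for the printer.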
Other low-cost solutions
All other low-cost rack solutions based on NUCs use the original NUC box.
This approach is not suitable in a datacenter, for several reasons:
cooling is inefficient (warm air gets trapped between the boxes) and the fans are not industrial grade
no power sharing through an industrial PSU: many switching power supplies = high leakage current = risk of tripping the differential protection relay = going dark
no separation of cabling (network / power) is possible, a huge mess
Upgradeable
The project is parametric. You can adapt the linecard to a new generation of NUCs with minimal impact: you only have to reprint the linecard (about 1 hour on an Ultimaker).
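As a small illustration of what "parametric" means here (a minimal OpenSCAD sketch with assumed parameter names and dimensions, not the actual contents of linecard.scad): the board footprint and mounting-hole positions are plain variables, so a new NUC generation only means updating a few numbers before reprinting.

// Illustrative sketch only: parameter names and values are assumptions,
// not the project's real linecard geometry.
board_size     = 101.6;  // NUC board edge length in mm (assumed)
hole_pitch     = 90;     // mounting-hole spacing in mm (assumed)
hole_diameter  = 2.8;    // sized for the reused NUC screws (assumed)
margin         = 6;      // extra material around the board
base_thickness = 3;      // printed plate thickness

plate = board_size + 2*margin;

module linecard_base() {
    difference() {
        // flat carrier plate that the board and disk holder screw onto
        cube([plate, plate, base_thickness]);
        // four mounting holes on a square pitch, centred on the plate
        for (dx = [-1, 1], dy = [-1, 1])
            translate([plate/2 + dx*hole_pitch/2,
                       plate/2 + dy*hole_pitch/2,
                       -1])
                cylinder(d = hole_diameter, h = base_thickness + 2, $fn = 32);
    }
}

linecard_base();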
Efficiency and reliability
The 12 NUCs can be powered by a single power supply.
You can even make it redundant with isolated PSUs, although our tests showed this is not necessary.
Each shelf (6 units) is cooled by 2 fans.
This cooling is more efficient than the stock NUC fan, which can therefore be removed.
The airflow generated by each fan (industrial grade and durable) is evenly distributed across 3 NUCs. The line-card (with the raised 2.5" support) is also designed to maximise the effect on the NUC heat sink.
Design and printing
It is a classic and efficient design: front = cold, back = warm.
Network cabling is exclusively done at the front of the rack.
Power cabling is exclusively done at the back of the rack.
Parts are designed for fast printing.
You can lower the print quality (high speed) without affecting the strength of the parts or the overall structure of the shelf. With an Ultimaker 2 (Olsson block), we reached a production rate of one linecard per hour.
It is still ABS. Don't expect to sit on a shelf without destroying it.
The line-card is designed to be light and to use the rigidity of the hardware.
Each piece is already set up to print well without warping.
You don't need to add supports or a brim.
Where a brim was necessary, it is already included in the model (a single layer).
Once printed, simply remove the thin surrounding sheet from those parts by hand.
Files
linecard : board/disk holder, slides into the shelf. 6 per shelf.
shelfback : back of the shelf. 1 per shelf.
shelffront : front of the shelf. The line-cards enter here. 1 per shelf.
shelftopbottom : top/bottom sheets to glue onto the shelf-back/shelf-front. 2 per shelf.
shelfleftright : left/right sheets to glue onto the shelf-back/shelf-front. 2 per shelf.
fancase : back fan holder. Screwed onto the shelf-back. 1 per shelf.
All other files are dependencies (dim, pin, misc...) or for model evaluation.
You should download all the files and open all.scad to check the consistency of the sources.
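For orientation, an aggregator file like all.scad typically just pulls in every part and lays them out side by side, so a change to a shared dimension is immediately visible. The sketch below only illustrates that idea; the module names and offsets are assumptions, not the project's actual code.

// Hypothetical layout: file names match the project, module names and
// offsets are assumptions for illustration.
use <linecard.scad>
use <shelffront.scad>
use <shelfback.scad>
use <fancase.scad>

translate([0, 0, 0])   linecard();    // assumed module name
translate([200, 0, 0]) shelffront();  // assumed module name
translate([500, 0, 0]) shelfback();   // assumed module name
translate([800, 0, 0]) fancase();     // assumed module name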
To purchase
NUCs (e.g. NUC5i5MYHE, which is vPro-equipped, necessary for remote console access)
Fans, 92x92x25 mm (e.g. SUNON MagLev Motor Fan ME92252V2-0000-A99)
PSU, 12-24 VDC with enough power (20 W per line-card + fans), e.g. Tracopower TXH 600-124 (a rough sizing note follows this list)
NUC internal power connector 2x2 and pins (Molex Micro-Fit 3.0: 43030-0007, 43025-0400)
Electric cable (e.g. Alphawire 5012C SL005)
Crimping tool
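A rough sizing check using only the figures above (the fan draw of a few watts each is an assumption): one shelf is 6 line-cards x ~20 W = ~120 W plus 2 fans, so a full 3U row of two shelves stays around 250-300 W with margin; the referenced Tracopower unit (a 600 W class supply, judging by its part number) therefore leaves comfortable headroom for a row of 12 NUCs.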
NUC
Open the NUC case
Extract the board and the 2.5" disk holder
Remove the NUC fan
Screw the board and the disk holder onto the line-card, reusing the screws from the NUC; the line-card is designed for them.
Shelf assembly
Take a bottom plate and glue the front and back parts onto the large recessed area designed for them.
Glue the top plate on in the same way.
Glue the left / right plates.
Screw the fans to the fan holder part (desktop fan screws)
Screw the fan holder part to the back part (desktop fan screws)
For the glue, the best simple option is ABS dissolved in acetone (leave ABS and acetone in a closed jar for 2-3 hours until you get a viscous paste).
It provides 80-90% of the strength of printed ABS once cured.
This mixture is highly sticky and dries fast (10-20 seconds).
It instantly consolidates and merges the parts, like a weld.
Power distribution
The power cable has a crimped Molex Micro-Fit 3.0 connector on one side (the NUC internal power connector). The cable is fixed in the two line-card cable holders and passes through the fan case; there is one large hole for each line-card.
You can slide a line-card in or out in production without shutting anything else down; the power cable simply slips in and out through the back hole.
These power cables (in groups of 6) can be soldered onto a single 0.75 mm power cable to be properly attached to the PSU (we used U-shaped terminal connectors here).
PSU
It can be mounted at the back of the 19" rack. An example is provided in the psu.scad file, which fits the Tracopower PSU. The two identical pieces are screwed onto the PSU and inserted into a simple 19" 1U front plate (installed at the back of the 19" rack).
The third piece is a holder to secure the power cables going to the shelves.
Ecological footprint
NUCs are small (far fewer components than server boards) and highly power efficient.
The shelf itself is ultra-light to manufacture, with a minimal transport footprint.
The parts are upgradeable and can be fully recycled to make new filament and new servers.
When a motherboard is changed or upgraded, it can go back into its original Intel box and start a second life as a desktop or an HTPC.
