Server building and U

Subject: Systems | July 20, 2011 - 01:18 PM |
Tagged: rack mount, server, 3U

If you read PC Perspective regularly, the chances are very good that you have purchased individual components and assembled a PC from them. There is a lesser chance that you have built a server, especially one in a rack-mountable case. The terminology is different: less about ATX and more about rack units, or U. A U is a measure of height; a 1U case bears a remarkable resemblance to a pizza box at 1.75 inches (44.45 mm) tall, with its width depending on the style of rack it is installed in, 19" and 23" being the standards. In this article, OCC takes you through the assembly of both an ATX case and a 3U case, along with OS and software recommendations based on the intended use of your new server.
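Since rack sizing is just multiplication by a fixed unit height, here is a minimal sketch of the conversion; the function name and constants are my own, with the 1U height taken from the figures above:

```python
RACK_UNIT_IN = 1.75  # height of one rack unit (1U) in inches
MM_PER_IN = 25.4


def rack_height(u: int) -> tuple[float, float]:
    """Return the height of a case occupying `u` rack units
    as (inches, millimetres)."""
    inches = u * RACK_UNIT_IN
    return inches, inches * MM_PER_IN
```

So a 1U pizza-box case comes out to 1.75" (44.45 mm), and the 3U case assembled in the article stands 5.25" (133.35 mm) tall.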


"Hardware is one of the most important items in building a server, regardless of its overall purpose. Without enough power or memory to run the necessary applications, the server would subsequently be useless. Depending on its intended usage, however, the hardware may vary slightly. It is often a good idea to sit down and create a plan beforehand. Personally, I live by a rule of thumb that purchasing more than needed is better than having the server become congested or even fail when it is needed the most. For this guide, I will be putting together two different types of servers. The first will be a gaming server, designed to host multiple instances of LAN-based games. The second will be a file/web server, intended to back up data and make items accessible from any computer in a network. As always, hardware can vary depending on your needs; this is just a general overview of a configuration that I would use, based on the available hardware that I have on hand. Keep in mind that any computer can function as a server as long as the necessary software is installed. Therefore, you may even have old parts that can be reused for your server."


July 20, 2011 | 06:56 PM - Posted by Firekingdom (not verified)

The 3U looks more like a desktop than a server to me — what server has a DVD drive in it? It looks more like an HTPC going into a server room, which, by the way, is the best thing I ever did. I push 1080p over Cat6: HDMI > Cat6 > switch > 6x (Cat6 > HDMI > display), all over 50 feet apart. The one problem is that they all display the same thing, but it can expand. All of it runs in a closed loop. It can be connected to the internet, but pushing that much data plus everything else would hurt the picture quality, so the only way it connects to the internet is through another machine: internet > slave machine > HDMI machine > Cat6 switch. The HDMI machine can reach the internet, but only through the slave machine. Why it is set up this way: if anything goes wrong with any other server, the HDMI box is safe. An attacker would have to compromise the slave machine first, then the HDMI machine, which has only one port open — 22. I open other ports as I need them and close them when I log off or hit a time limit. I keep five switches in the server room: 1: the main one; 2: website and random data for IT; 3: slave machine and security feeds; 4: VoIP; 5: internet, printers, and all the other stuff. Everything past switch 5 is a mess, plus 2 or 4 ports just for Wi-Fi. I want a RADIUS server, but they won't let me, plus a white box to compact all of this down to 2 racks.
The security stuff takes up a rack by itself with its drives.
I wish the budget had $50k in it. I would get a white box or VM machine with 96 cores — AMD, or Intel if better ones come out — 8 CPUs with 12 cores each and 16 GB in 4 DIMMs per CPU, plus the RAID card and two video cards: one for the HDMI stuff and one for the VMs. The database would sit on its own machine, just linked to the VMs. I don't have the room for 300 GB every month in RAID 5; it gets cleaned out every six months, which leaves me with about 50–100 GB of documents and other stuff, encrypted and salted, on RAID. Plus one backup card for every RAID card, because if one fails I can recover the data and move it to better RAID cards.
That's the setup I have. I would go all Linux except the database, but they want things kept simple. I should do it anyway — it would free up another $20–30k, since Microsoft server licenses are not cheap, and they won't support them, which is crazy when they have IT personnel. For the white box I would get full support. VMware is not my favorite thing, but Proxmox VE I can handle.

I think the web server had been running for two years before I gave it a break for a day. Here's why it got a break: I'm punching out, and somebody says they are going to turn off the power at the main branch, or the main hub downtown, and our branch as well. Why, I didn't ask. I was like, you're going to turn off the power for a day and not tell me? I have a UPS that will last maybe an hour max if the generators don't start, and I didn't know whether the generators could handle the load of the building lights, the server-room AC, and the servers in low-power mode. So I go to the main server and fill out the remote shutdown dialog (shutdown -i on a Windows server). The main one shuts down every other PC on the network besides a select few, plus the cameras and VoIP, and puts the database in low-power mode. When the power goes out, the database sends a message to every computer still on the network to save its work, then shuts down in 20 minutes. VoIP and the main fan for the room run all night; the cameras run for 10 minutes and then ask whether to turn off or stay on. The only real PC left running is my ITX board, which holds a stripped-down copy of the website and has a camera on it so I can see whether the power is back: a light on the wall shows green when the power is on and red when it is off. If it stays green for 20 minutes and the UPS is half charged, I start turning the servers back on.
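The restart rule at the end of the comment — mains stable for 20 minutes and the UPS at least half charged — is simple enough to express as a predicate. This is a sketch of the commenter's rule of thumb only (the function name and thresholds are taken from the comment, not from any standard):

```python
def safe_to_restart(green_minutes: float, ups_charge: float) -> bool:
    """Return True when it is safe to power the servers back on:
    mains power has been stable (green light) for at least 20 minutes
    AND the UPS has recovered to at least 50% charge, so it could
    absorb another outage during startup."""
    return green_minutes >= 20 and ups_charge >= 0.5
```

For example, 25 minutes of green light with the UPS at 60% passes, while a fully charged UPS after only 10 minutes of stable power does not.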

July 20, 2011 | 06:59 PM - Posted by Firekingdom (not verified)

Sorry about all that — I just figured people would like to see what some IT setups look like. I can't give out the company name, but I can give the setup.
