JBC CONSULT DATA CENTER SERVICES
View our extensive list of services below:
Hot Aisle Containment Solutions.
- Open area of room is a cold environment.
- Leakage from raised floor openings in the larger area of the room goes into the cold space
- One redundant cooling unit can serve several aisle containments
- HAC with inrow cooling improves energy efficiency compared to traditional raised-floor cooling.
- Typically a rack hat style containment with end row doors.
- Minimizes cold and hot air mixing so the return temperature to the CRAC is higher
- HAC with Inrow cooling can cool more than 30kW heat load per rack
- Generally more effective.
- Hot aisle containment is more forgiving for network racks and stand-alone equipment, such as storage cabinets, that may have to live outside the containment architecture, i.e. in the lower-temperature area of the computer room.
- Hot aisle containment can perform well in a slab environment by merely flooding the data centre with an adequate volume of supply air and containing the exhaust air.
- Because the containment structures typically abut the ceiling where fire suppression is installed, hot aisle containment does not create separate volumes; it merely creates obstructions that must meet clearance requirements from sprinkler heads. In a well-designed space, a standard grid fire suppression system can conceivably be installed around an array of hot aisle containment barriers and still meet code.
New Data Centre design.
Effective data centre operation requires a balanced investment in both the facility and the housed equipment. The first step is to establish a baseline facility environment suitable for equipment installation. Standardization and modularity can yield savings and efficiencies in the design and construction of telecommunications data centres.
Standardization means integrated building and equipment engineering. Modularity has the benefits of scalability and easier growth, even when planning forecasts are less than optimal. For these reasons, telecommunications data centres should be planned in repetitive building blocks of equipment, and associated power and support (conditioning) equipment when practical.
Through consultation, we can design data centres from concept through to delivery of a completely finished Tier 4 centre.
We can and do supply all hardware and equipment needed in the data centre.
Cold Aisle Containment.
- Easier to implement; does not require additional architecture to contain exhaust air and return it to the cooling units (drop ceiling, air plenum etc.).
- Only requires doors at ends and cap at top.
- Generally less expensive.
- Cold aisle containment is typically easier to retrofit in an existing data centre, particularly when there are overhead obstructions to work around, such as power and network distribution, ductwork and lighting.
- Cold aisle containment doesn’t absolutely need to be on a raised floor, but it typically is because of challenges associated with delivering supply air to the contained space(s).
- Enables more surface area for “cold sinks” (with or without a raised floor), providing ride-through in the event of a power failure where the engine generators fail to start.
Electricity is widely regarded as the single biggest cost factor in running a data centre, and efficient usage can significantly reduce IT costs; some data centres consume as much power as a small city.
However, with rapid data growth rates the industry does not expect data centre power consumption to fall in the next few years. In fact, energy usage is set to increase substantially.
We supply the following:
- Power Distribution
- Rack PDUs
- Intelligent Rack PDUs
- Inline Meters
- Branch Circuit meters
- Transfer Switches (Both STS and ATS)
- DCIM Software (from various companies)
- UPS. UPSs and power conditioners play a vital role in ensuring IT reliability. Any time a UPS fails and becomes unavailable, mission-critical electrical loads are put at risk. The surest way to increase UPS availability is to use the correct design and high-quality components, ensure redundancy, and minimize MTTR (mean time to repair).
- Single Phase (1-80kVA)
- 3-Phase (10kVA-5MVA)
- Double-conversion online modular UPS
- Double Conversion Standalone UPS
- Single Conversion UPS
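The point above about minimizing MTTR can be made concrete with the standard steady-state availability formula, A = MTBF / (MTBF + MTTR). The sketch below is illustrative only; the MTBF figure is a placeholder, not a vendor specification.

```python
# Illustrative: how reducing MTTR raises steady-state UPS availability.
# The 100,000-hour MTBF used here is a hypothetical example value.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Halving repair time directly improves availability:
slow_repair = availability(100_000, 8.0)  # MTTR of 8 hours
fast_repair = availability(100_000, 4.0)  # MTTR halved to 4 hours
print(f"{slow_repair:.6f} -> {fast_repair:.6f}")
```

The same relation shows why redundancy helps: with parallel UPS modules, the load rides through a single-module failure, so the effective MTTR seen by the load approaches zero.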
Cooling and Humidity Solutions.
Recommended Computer Room Temperature
Operating expensive IT equipment for extended periods at high temperatures greatly reduces component reliability and longevity and will likely cause unplanned downtime. Maintaining an ambient temperature range of 68° to 75°F (20° to 24°C) is optimal for system reliability. This temperature range provides a safe buffer for equipment to operate in the event of air conditioning or HVAC equipment failure while making it easier to maintain a safe relative humidity level.
In today’s high-density data centres and computer rooms, measuring the ambient room temperature is often not enough. The temperature of the air where it enters the server can be measurably higher than the ambient room temperature, depending on the layout of the data centre and the concentration of heat-producing equipment such as blade servers. Measuring the temperature of the aisles in the data centre at multiple height levels can give an early indication of a potential temperature problem. For consistent and reliable temperature monitoring, place a temperature sensor at least every 25 feet in each aisle, with sensors placed closer together if high-temperature equipment like blade servers is in use.
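The 25-foot spacing guideline above translates into a simple sensor count per aisle. A minimal sketch, assuming a sensor is anchored at each end of the aisle with the remainder spread evenly between them:

```python
import math

# Minimum sensors per aisle for the 25 ft maximum spacing suggested above.
# Assumption: sensors sit at both ends of the aisle, with equal gaps between.

def sensors_per_aisle(aisle_length_ft: float, max_spacing_ft: float = 25.0) -> int:
    gaps = math.ceil(aisle_length_ft / max_spacing_ft)
    return gaps + 1  # one sensor at each end of every gap

print(sensors_per_aisle(100))  # a 100 ft aisle needs 5 sensors (one every 25 ft)
```

For aisles with blade servers, simply pass a smaller `max_spacing_ft` to tighten the grid.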
Recommended Computer Room Humidity
Relative humidity (RH) is defined as the amount of moisture in the air at a given temperature in relation to the maximum amount of moisture the air could hold at the same temperature. In a data centre or computer room, maintaining ambient relative humidity levels between 45% and 55% is recommended for optimal performance and reliability.
When relative humidity levels are too high, water condensation can occur which results in hardware corrosion and early system and component failure. If the relative humidity is too low, computer equipment becomes susceptible to electrostatic discharge (ESD) which can cause damage to sensitive components. When monitoring the relative humidity in the data centre, we recommend early warning alerts at 40% and 60% relative humidity, with critical alerts at 30% and 70% relative humidity. It is important to remember that the relative humidity is directly related to the current temperature, so monitoring temperature and humidity together is critical. As the value of IT equipment increases, the risk and associated costs can increase exponentially.
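The alert bands described above (early warnings at 40% and 60% RH, critical alerts at 30% and 70% RH) can be sketched as a small classifier. The function name and boundary handling are illustrative choices, not part of any monitoring product:

```python
# Alert bands from the text: warning at 40%/60% RH, critical at 30%/70% RH.
# Boundary readings are treated as already in the more severe band.

def humidity_alert(rh_percent: float) -> str:
    if rh_percent <= 30 or rh_percent >= 70:
        return "critical"
    if rh_percent <= 40 or rh_percent >= 60:
        return "warning"
    return "ok"

for rh in (25, 35, 50, 65, 75):
    print(f"{rh}% RH -> {humidity_alert(rh)}")
```

Because relative humidity moves with temperature, the same readings should be evaluated alongside the temperature thresholds from the previous section rather than in isolation.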
Inrow Cooling Solutions for Aisle Containment.
Inrow cooling conditions air in close proximity to the heat load, targeting cooling at the rows of server cabinets that fill the data centre. Units can be installed on the floor or suspended overhead, placing them closer to the actual racks. This setup offers both capacity and efficiency gains.
Whether floor-mounted or overhead, inrow cooling solutions consist of fans and a cooling coil. Depending on the product, the cooling coil will use chilled water or refrigerant as the cooling medium. Refrigerant-based units require a connection to a remote condenser system, while chilled-water units traditionally connect to chiller systems.
- Warm and cool air does not have far to travel
- Allows the units to dissipate high heat loads quicker
- Scalable approach to cooling your data centre
- Floor Mount. The floor-mount inrow air conditioner couples the functionality of a perimeter CRAC (computer room air conditioner) unit with a significantly smaller footprint. These products are embedded in rows of data centre cabinets, supporting a conventional hot aisle/cold aisle layout.
- Overhead. Overhead units have a unique value proposition. Suspended from the ceiling or housed on top of a cabinet, they conserve valuable floor space, leaving more room for servers, storage, and switches. Overhead systems allow cooling-strapped data centres to add capacity without new construction.
- A profiled wall utilises metal cladding with trapezoidal, sinusoidal or half round profiles fixed to the steel structure of a building. Cladding panels can be produced in different profiles which can be laid horizontally or vertically and can be manufactured from prefinished steel or aluminium in a vast array of colours (Painted, epoxy coated or Powder coated) providing a wide choice for an aesthetic finish.
- Cladding is always custom made to suit your requirements.
Humidifiers and Evaporative cooling.
Humidifiers are required in data centres to prevent electrostatic discharge from damaging servers, and they also offer high-capacity, low-cost evaporative cooling.
For existing data centres, ASHRAE recommends a humidity range from a 5.5°C dew point (lower limit) to 60% RH (upper limit), with an allowable range of 20-80% RH. In most parts of the world, at some time in the year, humidification will be needed to meet these internal conditions.
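Because the ASHRAE lower limit is expressed as a dew point rather than a relative humidity, it helps to be able to convert between the two. A minimal sketch using the Magnus approximation (with the common Alduchov-Eskridge constants); this is an illustrative calculation, not a compliance tool:

```python
import math

# Magnus approximation: dew point from dry-bulb temperature and RH.
# Constants are the Alduchov-Eskridge values; accuracy is adequate for
# room conditions, but this is a sketch, not a psychrometric library.
A, B = 17.625, 243.04  # dimensionless, °C

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    gamma = math.log(rh_percent / 100.0) + A * temp_c / (B + temp_c)
    return B * gamma / (A - gamma)

# A room at 22°C and 35% RH sits just above the 5.5°C dew-point limit:
print(round(dew_point_c(22.0, 35.0), 1))
```

Running the check across a day's temperature and humidity log shows how far a room drifts toward the humidification threshold as conditions change.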
Humidifiers are often used alongside free air cooling systems to either boost the cooling capacity with an evaporative cooling effect or provide high load, low cost humidification to the large volume of air flowing through the data centre’s ventilation system.
We supply and install Humidifiers from all the major International and local suppliers.