Quantum DXi4510 review

Quantum’s latest DXi4510 aims to deliver an affordable data deduplication solution that brings this space-saving technology within the reach of SMBs. Furthermore, it avoids using virtual tape libraries (VTLs) and presents its storage as CIFS and NFS network shares, making it very easy to deploy and use.

Data reduction is carried out on the appliance itself, which gives it a big advantage over source-based deduplication: it works with any backup software you choose. The main drawback is that all data selected at the source must travel to the appliance for processing, so network overheads are higher.
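The DXi4510's deduplication engine is proprietary, but the principle behind any target-side deduplicator is the same: hash incoming chunks and store each unique chunk only once. A minimal sketch (fixed whole chunks and SHA-256 digests stand in for the appliance's variable-block chunking and index; all names are illustrative):

```python
import hashlib

def dedupe(chunks):
    """Store each unique chunk once, keyed by its SHA-256 digest.
    Returns the chunk store and the per-chunk reference list."""
    store = {}   # digest -> chunk data (kept once)
    refs = []    # digests in arrival order (the "logical" stream)
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        refs.append(digest)
    return store, refs

# A backup stream with repeated blocks: only unique blocks are kept.
data = [b"block-a", b"block-b", b"block-a", b"block-c", b"block-b"]
store, refs = dedupe(data)
ratio = len(refs) / len(store)  # logical chunks vs physically stored chunks
```

Because the hashing happens on the appliance, the backup software never needs to know deduplication is taking place, which is exactly why any product that can write to a CIFS or NFS share can benefit.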

Quantum's choice of hardware gets our thumbs up: hiding behind the bezel is a top-quality Dell PowerEdge R510 2U rack server. Storage is handled by eight hot-swap SATA drives managed by a Dell PERC H700 controller as a single RAID6 array.

There are no hidden costs, as the price includes all data deduplication and compression features plus remote replication to another appliance. Along with NAS shares, the appliance supports Symantec’s OpenStorage (OST) API, allowing it to work with NetBackup’s data mover function.

Quantum DXi4510

Quantum also targets virtualised environments: esXpress Backup for VMware ESX software with support for four virtual backup appliances is included. This allows you to schedule automated, daily backups of virtual machines or run them on demand.

Creating NAS shares from the well-designed web interface is easy: choose a name, select the CIFS or NFS protocol, and define access permissions, with both workgroup and Active Directory (AD) modes supported. At this stage you also decide whether deduplication should be enabled on the share; note that this is a one-way trip and can't be reversed later on.
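Once a share has been created, mounting it from a backup server is standard NAS plumbing. A hedged sketch of the client side, assuming a hypothetical appliance hostname (`dxi4510.example.local`), share name (`backups`) and export path, none of which come from the review:

```shell
# /etc/fstab entries (hostname, share name and export path are hypothetical)

# CIFS share, with AD credentials kept in a root-only file:
//dxi4510.example.local/backups  /mnt/dxi  cifs  credentials=/root/.dxi-cred,_netdev  0  0

# Or the equivalent NFS export of a share:
# dxi4510.example.local:/backups  /mnt/dxi  nfs  defaults,_netdev  0  0
```

The `_netdev` option simply tells the boot sequence to delay the mount until networking is up, which matters for an appliance reached over the LAN.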

The console provides plenty of status information so you can keep track of network, RAID, deduplication and ingest activity on the appliance. You can view the deduplication ratio being achieved, check on storage usage, and set up alerts that can be emailed to multiple recipients.

To test deduplication ratios, we ran our own set of lab tests designed specifically to look at performance for file server operations. We used a 4GB data set consisting of 1,000 files and introduced controlled changes within a percentage of the files during a simulated standard backup strategy consisting of daily incrementals and weekly full backups.
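A backup strategy like the one above can be simulated to show why weekly fulls reward target-side deduplication so well: every unchanged file in a full backup hashes to a chunk the appliance has already stored. A minimal sketch, with whole-file hashes standing in for the appliance's chunk index and an assumed 2% daily change rate (all numbers illustrative, not our lab figures):

```python
import hashlib
import random

random.seed(0)
files = {i: f"file-{i}-v0".encode() for i in range(1000)}  # 1,000 files
store = set()            # digests the "appliance" has already stored
logical = physical = 0   # bytes sent vs bytes actually stored

def backup(selection):
    """Send a set of file contents to the dedup store."""
    global logical, physical
    for data in selection:
        digest = hashlib.sha256(data).hexdigest()
        logical += len(data)
        if digest not in store:     # only new content costs disk space
            store.add(digest)
            physical += len(data)

CHANGE_PCT = 0.02  # assumed daily change rate (hypothetical)
for week in range(4):
    backup(files.values())                  # weekly full backup
    for day in range(6):                    # daily incrementals
        changed = random.sample(list(files), int(len(files) * CHANGE_PCT))
        for i in changed:
            files[i] = f"file-{i}-v{week}-{day}".encode()
        backup([files[i] for i in changed])  # incremental: changed files only

ratio = logical / physical  # deduplication ratio achieved
```

The fulls after the first one contribute almost nothing to physical storage, since every changed file was already captured by an incremental, so the ratio climbs with each week retained.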

