Disclosure: Some links on this page are monetized by the Skimlinks, Amazon, Rakuten Advertising, and eBay affiliate programs, and Liliputing may earn a commission if you make a purchase after clicking on those links. All prices are subject to change, and this article only reflects the prices available at time of publication.
The LincStation N1 is a Network Attached Storage (NAS) device from a company called LincPlus. But it is different from most recent consumer-oriented NAS systems, in that it's not just a plain-looking box designed to hold a few hard drives and run a proprietary operating system.
Instead, it's a small and sleek-looking NAS with support for up to six storage devices, thanks to two 2.5 inch bays for SATA drives and four M.2 2280 slots for PCIe NVMe SSDs. And while the LincStation N1 ships with a licensed copy of Unraid, a proprietary operating system based on Linux, users are free to install other operating systems. This makes the $399 LincStation N1 an affordable option for anyone looking to build their own NAS using whatever compatible operating system they prefer.
In this review I'll look at the performance of the LincStation N1 under Unraid, since that's the software that ships with this NAS, but I'm also going to explore the compatibility of other operating systems running on this hardware, including Windows, TrueNAS SCALE, and Debian.
LincPlus sent me a LincStation N1 to test. This was provided to Liliputing for free, with no requirement that the NAS be returned upon completion of the review. This review is not sponsored by LincPlus, and the company did not modify or approve the content of this article in any way.
Design
The LincStation N1 consists of a silver metal base and a black plastic top. The exterior measures 210 x 152 x 40 mm (8.27 x 5.98 x 1.57 inches), which is very close to the size of a piece of A5 paper (or a piece of A4 folded in half).
The front of the NAS essentially consists of three layers. The upper layer, which is part of the plastic top, has an illuminated on/off button on the far right and seven LEDs in the centre that indicate the state of the six storage bays and the network connection.
The middle layer of the front is a metal flap that can be pulled down to reveal two storage bays with support for 2.5 inch SATA III drives. If you're considering hybrid storage (a mix of SSDs and hard drives), note that these SATA III drive bays only accept drives up to 7 mm tall.
Between those drive bays there's a USB 3.2 Gen 1×1 Type-C port (aka USB 3.0, 5 Gbit/s). When LincPlus first announced the LincStation N1, the company described this port as supporting faster speeds, but after I pointed out that they had transposed the speeds of their USB ports, they corrected the error on their website.
The lower level is an RGB strip. I'll cover the software that can be used to control the colour and patterns later in the review.
All of the system's ports are on the back of the computer, leaving the left and right sides bare.
From left to right, the ports are:
You can access the LincStation N1's four NVMe storage bays by flipping the computer over.
On either side of the metal base are metal plates, each of which covers two M.2 2280 NVMe bays. The underside of each plate has a thermal pad affixed to it, protected by a removable film when shipped.
LincPlus says the LincStation N1 supports up to 48 TB of total storage if you use four 8 TB NVMe SSDs and two 8 TB 2.5 inch SATA III SSDs.
You don't need any tools to remove either bay cover or to insert or remove an SSD: a plastic catch holds the cover and each drive in place.
Within the left bay there's also a removable USB 2.0 drive (F4 ARV21X) from Alcor Micro, preloaded with the Unraid operating system. A peculiarity of Unraid is that it must boot from a USB drive.
When you buy a LincStation N1, you also get a prepaid 1-year subscription for Unraid. To ensure you get your full year, the activation key is on a separate included card; this allows you to first experiment with Unraid using the free 30-day trial licence, which can be activated after booting the device.
While you can use the operating system indefinitely, this licence means that you'll get free OS updates for a year. If you'd like to continue to receive updates after that time, Unraid licence prices range from $49 per year for a Starter Licence to a $249 one-time fee for a Lifetime Licence.
Inside the LincStation N1 is the motherboard, but it's not designed to be user accessible. There's also not much reason why you'd want to get to it, since the processor and memory are both soldered to the motherboard and storage is easily accessible without disassembling the computer.
The NAS uses an Intel Celeron N5105, which Intel classifies as a mobile processor. It has a TDP of 10 watts and four cores with a base frequency of 2.0 GHz that can boost up to 2.9 GHz. It also includes an iGPU with Intel UHD Graphics featuring 24 execution units and support for 4K output at 60 Hz.
Now this is a slightly older processor from Intel's Jasper Lake family, which launched in 2021, and it only has eight PCIe Gen 3.0 lanes, which could bottleneck the read/write speeds of any NVMe SSDs you use with this system.
If you do the maths you might think each NVMe drive would get two PCIe lanes: each PCIe Gen 3.0 lane carries 8 GT/s, or roughly 985 MB/s after 128b/130b encoding overhead, so two lanes would offer a maximum theoretical speed of 1969 MB/s. However, at least one lane has to be allocated to networking to provide a fast, reliable wired Ethernet connection. That leaves each NVMe drive operating at a maximum theoretical speed of 985 MB/s.
For memory there is 16 GB of LPDDR4 configured to run at its maximum speed of 2933 MT/s. Since it's soldered, there's no easy way to expand the memory.
The motherboard does have a few other interesting features, as there are two further chips that are not used by Unraid but, as I'll show later, are very useful if you choose to use a different operating system.
The first is 128 GB of soldered eMMC storage. Embedded MultiMediaCard (eMMC) storage was first seen in mini PC sticks and boxes before M.2 storage was adopted. According to volume 1 of Intel's datasheet for Intel Pentium Silver and Intel Celeron processors, the eMMC conforms to JEDEC's Electrical Standard 5.1, which defines a maximum data transfer rate of 400 MB/s. As is common with this type of Intel processor, all the storage (NVMe, SATA and eMMC) is connected to the Platform Controller Hub (PCH), which is in turn connected to the CPU by the On Package Interface (OPI), known as the Direct Media Interface (DMI) on desktop processors. This can be a source of bottlenecks, but since the link is DMI 3.0, equivalent to four lanes of PCIe Gen 3.0 or 3938 MB/s, the bandwidth should be sufficient for this device.
The second chip of interest is a soldered Intel AX201 WiFi 6 card. Not only does this provide WiFi on the 2.4 and 5.0 GHz bands, but it also supports Bluetooth 5.2. The antennas are attached to the plastic frame within the device. As they are covered by the plastic top rather than a metal one, they can get good signal reception.
There's a heat sink with a small fan attached to the CPU on the underside of the motherboard. I haven't personally opened the device to measure the fan, but based on promotional pictures shared by LincPlus, it looks like a 40 mm fan that exhausts directly out of the rear of the case through the central vents.
The NVMe drives transfer their heat via the thermal pads to the metal base of the device, which acts as a heat sink. This provides sufficient cooling, as the drives can only run at PCIe Gen 3.0 speeds and so won't generate as much heat as the latest Gen 5.0 drives.
LincPlus also put a large metal heat sink on the underside of the system's plastic top. It's positioned so that it comes in contact with the upper side of the motherboard directly above where the CPU is located on the other side.
The motherboard is T-shaped to accommodate the two SSD drives on either side. The heat sink in the top also extends into each side to cover each SSD.
LincPlus has a promotional video that uses an animation to visually explain the internal construction.
To power the LincStation N1, a small 60 W (12.0 V, 5.0 A) power adapter (A6514-1205000D) is included, complete with a country-specific power cord with a C7 (figure-8) connector.
Operating System Compatibility
Whilst the Celeron N5105 processor is a few years old, it's still considered a current part that's supported by Intel. Other recent NAS systems that use the same processor include the Asustor AS5404T, Flashstor 6, and LOCKERSTOR Gen 2.
It's a 10 nm processor built on Intel's Tremont microarchitecture. More importantly, the Celeron N5105 supports Trusted Platform Module (TPM) 2.0, which is a requirement for Windows 11, and the processor is on Microsoft's list of approved CPUs.
Since the LincStation N1 also meets Windows 11's other requirements, including a minimum of 4 GB of memory and 64 GB of storage, there's nothing stopping you from installing Windows 11 on the computer. But one hurdle to overcome is finding the required drivers. Fortunately, Intel used this processor in its NUC 11 Essential Atlas Canyon (NUC11ATKC4), and all the drivers have been transferred to the ASUS support page for that model.
You don't actually need all of those drivers though: after installing the Chipset Device Software, GNA Scoring Accelerator and Serial IO drivers, everything else gets updated automatically. It may be optional, but I always install the Intel Management Engine driver for good measure.
Given that Unraid is based on Slackware, a Linux distribution, you might conclude that most other Linux distributions should also work. With a recent enough kernel that is probably true: the Celeron N5105 not only implements the 64-bit x86 instruction set but also boots via a 64-bit UEFI bootloader, which are the two requirements for installing recent versions of Debian, as well as other operating systems like TrueNAS SCALE (which is based on Debian) or TrueNAS CORE (which is based on FreeBSD).
While it can sometimes be difficult to get WiFi and Bluetooth to work with Linux on PCs with newer hardware, in this case it's a good thing that some of the hardware in the LincStation N1 is a few years old. The Intel AX201 wireless card was first launched in 2019, and it's well supported by most Linux distros.
Hardware performance
It might seem strange to start the review of a NAS that ships with Unraid by looking at Windows. However, installing Windows makes it easy to see how the LincStation N1 compares with other Windows PCs that use the same processor. Using Windows also lets me run the widely recognised CrystalDiskMark to look at the drive speeds.
Prior to installing Windows, or any other operating system for that matter, I recommend removing the Unraid USB drive to reduce the risk of accidentally overwriting it.
Installation is quite simple using a Windows ISO written to USB. After installing all available updates, I ran my tests on the computer using Windows 11 Pro version 23H2 OS build 22631.4112.
First, a quick look at the hardware information shows that the 10 watt processor has Power Limit 1 (PL1) set to 10 W and Power Limit 2 (PL2) set to 12 W, with a Tau of 28 seconds.
These power limits are defined in the UEFI (BIOS), where they can be changed. They are lower than on a typical Intel Celeron N5105 mini PC, but they're obviously set low for a good reason: the cooling is not particularly powerful given there's only a single 40 mm fan. Keeping the default values shouldn't have much impact on the LincStation N1's performance as a storage device. It does mean, however, that you can't expect to mimic an N5105 mini PC by running Windows as a virtual machine; if you try, you will quickly run into thermal throttling.
For testing, the actual make and model of NVMe drives is not important as long as they are fast enough to saturate the link width, which, as explained before, is one lane of PCIe Gen 3.0.
For my testing, I'm using drives that are complete overkill in terms of performance, simply because I didn't have anything slower available: four of my own 4 TB Lexar NM790 M.2 2280 PCIe Gen 4.0 NVMe SSDs, rated for sequential reads up to 7400 MB/s and sequential writes up to 6500 MB/s. As a reminder, the theoretical maximum speed of PCIe Gen 3.0 x1 is 985 MB/s.
The performance obtained is always going to be at least a little lower than the theoretical maximum. The results are still very respectable with a sequential read of 853.28 MB/s and a sequential write of 781.19 MB/s.
It's not actually possible for SATA III drives to saturate the link, since the theoretical maximum speed of SATA III is 600 MB/s. For the 2.5 inch drive bays I'm using two of my own 4 TB Samsung SSD 870 QVO drives, which are about the fastest you can get. According to Samsung, they support sequential reads up to 560 MB/s and sequential writes up to 530 MB/s.
Interestingly, the sequential read speed matched the specification at 560.90 MB/s. The sequential write speed, however, was much lower at 419.31 MB/s; I would have expected at least 450 MB/s. Still, the achieved speed is higher than the maximum throughput of the device's networking, so it shouldn't be a problem.
Testing the eMMC is straightforward as this is where I installed Windows. The results effectively give us the speed of the storage whilst running the operating system.
Again, it's not quite the 400 MB/s theoretical maximum that eMMC 5.1 is documented as supporting, but the sequential read speed of 297.49 MB/s and sequential write speed of 146.78 MB/s are in line with the speeds I've typically seen from this form of storage in other mini PC tests.
I also tested the USB ports. This time I was able to saturate the link by using an NVMe drive in an enclosure based on the ASMedia ASM2464PD controller, which is compatible with USB 3.2 interfaces.
The front USB port, whilst being Type-C, only runs at USB 3.2 Gen 1×1 speeds, or 5 Gbit/s.
The actual speeds obtained, a sequential read of 460.30 MB/s and a sequential write of 435.51 MB/s, are okay for what was once known as USB 3.0. It's just a bit disappointing that a Type-C port isn't running at 10 Gbit/s.
The rear USB ports, however, are 10 Gbit/s, and both reach around 1050 MB/s for sequential reads and 960 MB/s for sequential writes.
Finally I also tested the Unraid USB drive. Being only a USB 2.0 drive it is not particularly fast. I observed a sequential read speed of 28.29 MB/s and a sequential write speed of 11.09 MB/s. Of course this isn't an issue when running Unraid, as it loads its operating system into memory so the speed of the USB does not affect performance once booted.
Having established a baseline for storage performance, I can now look at CPU performance. I ran a couple of benchmarks: Maxon's Cinebench R23 and PassMark's PerformanceTest 11.0.
The Cinebench multi-core CPU score of 1727 pts is nothing special, and is a direct consequence of the lowered power limits.
The overall PassMark rating of 816.1 is also similarly affected by the power limits, and the CPU Mark only reached 3981.3. But this device isn't pitched as a mini PC where overall performance matters more.
If anyone remembers the MeLE PCG02 Pro, which looked a bit like an oversized stick PC, it was available in a version equipped with an N5105 processor. To keep it from overheating, it also relied on reduced power limits, albeit with slightly lower values of 8 W and 10 W for PL1 and PL2 respectively. As a result, the PCG02 Pro only managed a Cinebench R23 multi-core score of 1638 pts and a CPU Mark of 3891.6. It achieved a higher overall PassMark rating of 1442.3, but that was because it used an M.2 2280 SATA SSD for storage which, being slightly faster, gave it a Disk Mark of 1214.4 compared to the LincStation N1's eMMC Disk Mark of 993.3. Looking through all my old test data for N5105 mini PCs, the LincStation N1's performance is on par with what to expect from this processor.
Turning next to network performance, I tested the 2.5 Gb Ethernet port and also the WiFi throughput over the 5 GHz band, simply because the WiFi card works under Windows.
Windows (iperf3 3.16)   Ethernet    WiFi 5.0 GHz
Upload                  2.35 Gbps   1.65 Gbps
Download                2.35 Gbps   1.70 Gbps

The Ethernet performance is as good as you'd expect from a 2.5 Gb Ethernet port, measuring 2.35 Gbits/sec both up and down. The WiFi performance is actually quite impressive, with an upload speed of 1.65 Gbits/sec and a download speed of 1.70 Gbits/sec.
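If you want to reproduce these figures, iperf3 is easy to drive from the command line. A minimal sketch, assuming the NAS is reachable at the hypothetical address 192.168.1.100:

# on the LincStation N1: run iperf3 as a server
iperf3 -s

# on the client PC: upload test (client to NAS)
iperf3 -c 192.168.1.100

# download test (NAS to client), using reverse mode
iperf3 -c 192.168.1.100 -R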
I also confirmed that Bluetooth worked, should you ever have a need for it when not using Unraid.
A remaining area of concern was whether the cooling is actually effective with such a small fan. As a quick test I ran Cinebench R23 multi core whilst graphically monitoring the CPU Package temperature and power in HWiNFO64.
Prior to the test, the CPU was idling around 62°C which is quite high but not worryingly high. Once the test started, power usage went from around 4 watts to the PL2 limit of 12 watts before dropping to the PL1 limit of 10 watts after the Tau period. The CPU temperature initially climbed to 77°C before dropping slightly and then climbing gradually over the duration of the test back up to 77°C.
Perhaps if you left the test running continuously the CPU temperature would climb a little more, but there's still a long way to go to the Tjmax of 105°C, and thermal throttling would kick in to protect the CPU before it got there. Such a sustained load is also unrealistic for a NAS, but even under these conditions it's apparent that the cooling is sufficient.
Now I want to pause whilst I make a potentially controversial suggestion. A lot of people are reluctant to add a NAS to their home setup due to either the hardware cost or the perceived complexity of configuring one, given the amount of new terminology and potentially confusing jargon involved.
Installing Windows on the LincStation N1 actually addresses both of these factors, as the hardware is well-priced and there is little new to learn. If you simply want some storage attached to your home network that you can access from your wired and wireless devices, that's achievable through Windows. You can either span your NVMe drives into a single logical volume with no redundancy, or mirror your disks for extra data security, and then use Windows' file-sharing functionality (SMB) to make the volume, folders, or files available over the network. You can also tailor user access as required.
Plus, if you install Windows 11 Pro, you've also got Hyper-V for virtualisation, and WSL 2 gives you containers after installing Docker. Apps like Jellyfin and Plex have their own Windows installers, so the fundamentals of a NAS are all achievable.
But for anyone currently reaching for a pitchfork, let's now continue and look at Unraid on the LincStation N1.
Unraid Performance
LincPlus recommends that new users of Unraid start by learning how the operating system works using the free 30-day trial licence that comes with the LincStation N1.
Included with the device is a comprehensive user manual in English, German and French that covers how to activate both the trial licence and the included first-year licence.
I used the same four Lexar NVMe drives together with the same two Samsung SSDs in all my testing. With Unraid, having installed your drives, you first allocate them as storage (the array), optionally allocating drives for redundancy (parity) and performance (cache). Next you define your logical drives to be used as shares, configure the SMB settings and set up your users, perhaps tweaking other settings along the way as necessary.
For my testing I created an array from the first two NVMe drives, allocated the third NVMe drive as cache, and the fourth as parity. The result is that about half of the 16 TB of NVMe storage is available as NAS storage. I left the two SATA SSDs unassigned, and I'll explain why shortly.
With the storage established, if you want to install containers you should first install the Community Applications plugin.
This will give you access to numerous templates from which you can create containers.
In testing I used the template for Jellyfin, since using a NAS as a media server is a common requirement for NAS users.
Creating a container within Unraid is very straightforward and, compared with some NAS offerings, very simple.
Even accessing the container once created is made easy, and perfect for those new to using a NAS operating system.
Unraid applies the same ease-of-use to installing virtual machines (VMs).
Simply pick your flavour, complete any fields you're asked to enter values for, and modify any of the suggested values as required.
You're then up and running in no time.
Unraid comes with a single-screen dashboard that shows everything you need to know about your NAS system.
In terms of the Unraid software, I tested version 6.12.10.
Unraid is actually built on top of a Linux distro called Slackware, currently release 15.0, together with a modified kernel at version 6.1.79. This allows you to use standard Linux commands, and you can access a terminal window using the icon at the top right.
Earlier I mentioned that I'd left the two SSD drives unassigned. This is because I wanted to use them for an alternative backup solution, to demonstrate how flexible Unraid can be. From a Linux terminal I used some LVM commands to create a logical volume called backup on the SSDs. I then wrote a backup script, controlled by a crontab entry, that simply used rsync to copy the Unraid storage to the backup storage every night.
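As a sketch of that setup, with hypothetical device names (check yours with lsblk) and illustrative paths:

# pool the two SATA SSDs into a volume group, then one logical volume
pvcreate /dev/sdf /dev/sdg
vgcreate backupvg /dev/sdf /dev/sdg
lvcreate -l 100%FREE -n backup backupvg
mkfs.xfs /dev/backupvg/backup
mkdir -p /mnt/backup && mount /dev/backupvg/backup /mnt/backup

# nightly copy of the Unraid user shares, driven by a crontab entry:
# 0 2 * * * rsync -a --delete /mnt/user/ /mnt/backup/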
The underlying filesystem of the Unraid storage is XFS, with Docker containers using Btrfs.
As explained above, the physical NVMe drives are limited in speed to PCIe Gen 3.0 x 1 or 985 MB/s. This however is going to be fast enough given the network connectivity.
Unraid only supports Ethernet so unfortunately the WiFi and Bluetooth functionality is not available when using this operating system.
To test the Ethernet performance I transferred a large 100 GB file between a PC and the LincStation N1, with the PC connected through a 10 Gb Ethernet port.
In an upload test, when I transferred the file from my PC to the NAS, the upload speed averaged 284 MB/s.
Download was slightly slower at 244 MB/s.
Running CrystalDiskMark from the PC against a shared drive on the NAS confirmed this slight difference, with a sequential read speed of 261.44 MB/s and a slightly faster sequential write speed of 296.17 MB/s. It's worth remembering I configured my NAS with a dedicated NVMe cache drive to improve write performance, as parity calculations don't have to be performed on writes to the cache.
The LincStation N1 also comes with an RGB LED light strip that runs across the bottom of the front of the computer, as you can see in the rather poorly created GIF above.
You can control the RGB lighting settings through a webpage that is accessed on port 50000.
When you first access it you may get a shock if your Chinese is not up to scratch. One option is just to translate the webpage using your browser.
Or you can view the source of the webpage, which is in English; the variable names and values make it obvious what the Chinese page is actually saying. Below I'll cover how users who don't read Chinese can fix this issue.
Finally, if the included fan is too noisy for your liking, or you need extra cooling for some reason, you can control the fan speeds.
First you need to install a Linux driver for the fan controller, which can be done via the IT87 Driver plugin. You then have the option of installing the "Dynamix System Autofan" and "Dynamix System Temp" plugins, as described in this post, to control the fan. Alternatively you can use pwmconfig in Linux to configure the fan curve you want and then use fancontrol to apply it.
TrueNAS Compatibility
If you're not keen on paying an ongoing annual fee for Unraid updates and support, or there's some other reason you don't want to use Unraid, you could run a different NAS operating system on the LincStation N1. As an example of an alternative, open-source NAS operating system, I tested installing and using TrueNAS, specifically TrueNAS SCALE.
Installation was straightforward. I used a TrueNAS SCALE ISO written to a USB drive and installed the OS to the internal eMMC storage.
All the drives are visible to TrueNAS. The eMMC drive shows up as mmcblk1, the four NVMe drives are self-evident, the two SATA SSDs are sda and sdb, and the Unraid USB drive is sdc.
For my testing I chose to create two storage pools: the first combined all four NVMe drives in RAIDZ2 (comparable to RAID 6), while the second striped the two SSDs together with no redundancy. This combination allowed me to set up a replication task to back up the main NVMe pool to the SSD pool. Datasets were set up as shared drives over SMB, and a Jellyfin container and an Ubuntu VM were created to replicate the testing performed in Unraid. No issues were encountered during testing, and TrueNAS appears fully supported on the LincStation N1.
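TrueNAS creates its pools through the web UI, but under the hood they are plain ZFS. Expressed as CLI commands, my layout would look roughly like this (pool names and device paths are illustrative):

# four NVMe drives in RAIDZ2, giving two-disk redundancy
zpool create tank raidz2 /dev/nvme0n1 /dev/nvme1n1 /dev/nvme2n1 /dev/nvme3n1

# the two SATA SSDs striped together, with no redundancy
zpool create backup /dev/sda /dev/sdb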
Debian compatibility
Rather than use a graphical interface, I chose to install the text-based standard live ISO for Debian version 12.7, known as bookworm. Again I replicated a NAS environment similar to the Unraid one, using just the CLI. In terms of storage configuration, I used RAID 10 across the NVMe drives and RAID 0 across the SSDs.
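For anyone following along, the equivalent mdadm commands look roughly like this (device names are examples, and in practice you'd normally partition the drives first):

# RAID 10 across the four NVMe drives
mdadm --create /dev/md0 --level=10 --raid-devices=4 /dev/nvme0n1 /dev/nvme1n1 /dev/nvme2n1 /dev/nvme3n1

# RAID 0 across the two SATA SSDs
mdadm --create /dev/md1 --level=0 --raid-devices=2 /dev/sda /dev/sdb
mkfs.ext4 /dev/md0
mkfs.ext4 /dev/md1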
Unsurprisingly, Debian is fully supported on the LincStation N1.
LED control
If you choose not to use Unraid but still want to control the LEDs on the LincStation N1, you'll need to download the official LED "driver" software and then configure it.
You can get this LED software from the LincPlus support page under LincStation N1 driver.
Then you need to unrar the compressed file and copy the led directory, found under N1 Driver—LED/device_servers/servers, to /usr/local/sbin. Next, make sure that /usr/local/sbin/led/go_8130_led_linux is executable. You must then load the i2c_dev module so the software can access the System Management Bus (SMBus). Then create an alias called led for "(cd /usr/local/sbin/led && ./go_8130_led_linux)". Finally, edit the files index.html and rgb.html under /usr/local/sbin/led/static and change the Chinese characters to the corresponding English, or to your preferred text.
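Condensed into commands, the setup looks something like this (run as root; the archive name is a placeholder for whatever the downloaded file is called):

# unpack the download and install the LED server
unrar x lincstation-n1-driver.rar
cp -r "N1 Driver—LED/device_servers/servers/led" /usr/local/sbin/
chmod +x /usr/local/sbin/led/go_8130_led_linux

# expose the SMBus to userspace
modprobe i2c-dev

# convenience alias for starting the LED web server in the foreground
alias led='(cd /usr/local/sbin/led && ./go_8130_led_linux)'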
When you want to change the colours on your LincStation N1, first check with the command "i2cdetect -l" that the SMBus is on i2c-0, as this seems to be the only I2C bus number the LED software works with. If it is, you can run the led alias and go to your device's IP address on port 50000 in a browser. Colours are represented by their hexadecimal code, e.g. red is #FF0000. When finished, go back to the terminal running the aliased command and end it with Ctrl-C. If you want to leave the LED listener running in the background, skip the alias and just run "(cd /usr/local/sbin/led && ./go_8130_led_linux > /dev/null 2>&1) &".
In reality the software is just using i2c-tools to set registers for the LEDs. The chip address is 0x26, and if, for example, it is on bus 1, you can turn off the first NVMe light by writing the value 0x01 to the data address 0xb1. You should also disable interactive mode by including the -y flag, making the full command "i2cset -y 1 0x26 0xb1 0x01". The values for the other three NVMe LEDs are 0x04, 0x10 and 0x40 respectively. To turn them back on, use the data address 0xa1. Make sure you don't enter the wrong bus number or address, as i2cset can be extremely dangerous.
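For example, to toggle all four NVMe LEDs on a system where the SMBus is bus 1 (check yours first with i2cdetect -l):

# turn each NVMe LED off (data address 0xb1)
i2cset -y 1 0x26 0xb1 0x01
i2cset -y 1 0x26 0xb1 0x04
i2cset -y 1 0x26 0xb1 0x10
i2cset -y 1 0x26 0xb1 0x40

# turn them back on (data address 0xa1)
i2cset -y 1 0x26 0xa1 0x01
i2cset -y 1 0x26 0xa1 0x04
i2cset -y 1 0x26 0xa1 0x10
i2cset -y 1 0x26 0xa1 0x40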
Fan control
The fan curve is defined in the UEFI (BIOS), but it can be changed dynamically through the operating system. In practice I couldn't hear the fan unless I put my ear next to the device, so during my testing fan noise was never an issue.
By default the fan is not recognised by the operating system and runs according to the UEFI (BIOS) settings. So the first step, if you want to control it, is to install a modified IT87 driver. I've described the process for Unraid above. In Debian you can find a suitable driver by searching the internet; I downloaded a version from https://github.com/frankcrawford/it87 and built it using make, then installed it manually, although you could automate this if preferred.
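As a sketch, the build steps on Debian look like this (assuming build tools and matching kernel headers are installed):

apt install build-essential linux-headers-$(uname -r) git
git clone https://github.com/frankcrawford/it87
cd it87
make

# load the freshly built module by hand
insmod it87.ko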
Once installed you can then check the current CPU core temperatures and fan speed using the sensors command.
To control the fan you need to install the fancontrol package. You should then use pwmconfig to configure the fan curve.
Finally you need to start the fan control with "systemctl start fancontrol.service".
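Put together, the whole sequence is short. As a sketch, run as root on Debian:

# confirm the it87 sensors and fan are now visible
sensors

# interactively generate /etc/fancontrol with your chosen fan curve
pwmconfig

# start the daemon that applies it
systemctl start fancontrol.service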
Power consumption
I measured power consumption by connecting the LincStation N1 to a power meter, which was connected to a UPS power outlet to provide clean power.
I first measured the power consumption when using Unraid:
Once the LED listener process (go_8130_led_linux) starts running, the power consumption fluctuates in Unraid.
In Debian, without the LED daemon running, power consumption measured:
The LincStation N1 costs only $399. While a typical mini PC with an Intel Celeron N5105 processor would cost much less, this is a very competitive price for a 6-bay NAS with this processor, support for up to 48 TB of solid state storage, a 2.5 Gbps Ethernet port, and a compact design.
Obviously LincPlus had to make some compromises to keep the price low, but most don't feel like dealbreakers to me:
Even though the LincStation N1 comes with Unraid, including a year's subscription, it is not restricted to running just Unraid. This makes the device highly flexible for beginners and experts alike. You could just use Unraid, which works great. You could dual boot with another operating system. There's even the option of not using it as a NAS at all, and treating it as a mini PC with six drive bays.
Overall I'm very impressed with the LincStation N1 and see it as being great value with good functionality.
The LincStation N1 is available from Amazon or the LincPlus website. Customers who order directly from LincPlus can save $5 with the coupon code LDS2.
I'd like to thank LincPlus for providing the sample for review.