Central Storage for Home is a must

Yes, you read that right. Central storage, like in work networks, is a must at home.

You might have bought a laptop with an SSD. It is fast, but it only has 128 or 256 GB of space.

Maybe your desktop is old, and you don’t trust it.

You have a smartphone, a tablet, or both.

Most likely you take a ton of pictures of your kids, and some videos. Most phones now default to recording video in FHD or higher (1080p, 2K, even 4K). Soon you will run out of space on that phone or tablet.

You need to back up that old desktop, and the laptop does not have enough space.

Even if it did, it is a good idea to have that data available to all your devices.

Enter central storage for the home.

There are many ways to do it. The easiest is the NAS route. You can also use a spare desktop, but it needs to be always on, and you might need to add drives.

Small Net Builder also ranks and reviews NAS units, so if you already use it for wireless reviews, it is a good source of information here as well.

The other option is to convert a desktop that is always on, or add the function to it. You can just add the drives and share a folder.

A NAS has the advantage that it usually consumes less electricity, and since it will be on 24/7 that is something to keep in mind. Its interface is also generally simpler, and it comes with several functions that can be turned on with a switch in the setup page.

A computer has the advantage that you might already have one, and it might also be faster (a lot of NAS units are mini computers already), but it is more complicated to set up. It is also possible that you already have a computer that is always on. In the end, however, it will require more technical involvement and setup. It is not difficult though, and there are lots of guides. I personally have an OLD machine that cannot even be used to browse the web, but it has worked great running Linux to store files as a backup.

One thing people forget is that you will need two central storage devices.

Don’t think of your central storage as backup; think of it as the primary storage for your photos, videos, and documents. This way you keep space free on each device and can also access those files from any other device.

The second storage device is there to back up the first one.

The good news is that the second device only needs to handle backups, and often the main NAS/computer can run the backup job directly to it.

This second storage can then simply be a USB drive.

Most NAS devices (and all computers) have USB ports, so you can connect an external drive and run a backup job, and most NAS units have a built-in function for this. The important part is that you want this function to be automatic. WD My Cloud drives, for example, require that you log in to the web interface and run the backup manually. Not ideal, because you will forget, and when you need to recover something from backup, it could be several months old.

I haven’t worked with all NAS, so I don’t have a recommendation there.

With a computer this is a bit simpler, because there is a ton of backup software that can run incremental copies (which saves the most space).
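
To make "automatic and incremental" concrete, here is a minimal sketch of the idea on a Linux machine. It assumes rsync is installed; the source and destination paths are hypothetical placeholders, and real backup software adds retention rules, verification, and notifications on top of this.

```python
#!/usr/bin/env python3
"""Minimal incremental backup sketch (illustrative only).

Assumptions: rsync is installed, SOURCE is the shared data folder and
DEST is where the USB backup drive is mounted. Both paths are placeholders.
"""
import subprocess
from datetime import datetime
from pathlib import Path

SOURCE = Path("/srv/storage")      # hypothetical shared folder
DEST = Path("/mnt/usb_backup")     # hypothetical USB drive mount point


def run_backup() -> None:
    snapshot = DEST / datetime.now().strftime("%Y-%m-%d")
    latest = DEST / "latest"       # symlink pointing at the previous snapshot

    # Trailing slash on the source: copy the folder's contents, not the folder.
    cmd = ["rsync", "-a", "--delete", f"{SOURCE}/", str(snapshot)]
    if latest.exists():
        # Files that did not change become hard links into the previous
        # snapshot, so every dated folder looks complete but only the
        # changes take up new space.
        cmd.insert(1, f"--link-dest={latest.resolve()}")

    subprocess.run(cmd, check=True)

    # Repoint "latest" at the snapshot we just made.
    if latest.is_symlink():
        latest.unlink()
    latest.symlink_to(snapshot)


if __name__ == "__main__":
    run_backup()
```

Scheduled with cron (or the NAS's own scheduler), it runs without anyone having to remember it, which is the whole point.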

Let's say you get a Synology (which appears to support automatic backups) and you put in two 2 TB drives. Because you will most likely set the drives up as RAID 1 (where the content is mirrored to both drives, giving 2 TB usable), your USB drive should be at least 3 TB. You need the extra space so you can keep older copies of files (to recover different versions of the same file).

Ideally you don't fill the NAS either, so a 3 TB USB drive will hold roughly 4 or 5 copies of the same data (it depends on how full the NAS is and how the backup software works).
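
As a rough back-of-the-envelope check (every number here is an assumption you should replace with your own), this is how the "how many copies fit" question works out:

```python
# Rough space estimate for a versioned backup drive. All numbers are assumptions.
backup_drive_tb = 3.0   # the external USB drive
data_in_use_tb = 0.7    # how much of the NAS is actually filled
change_rate = 0.10      # rough guess: ~10% of the data changes between backups

# Plain full copies, if the backup software just copies everything each time.
full_copies = backup_drive_tb / data_in_use_tb
print(f"Full copies that fit: about {full_copies:.0f}")          # about 4

# Incremental backups: one full copy, then only the changes each run.
versions = (backup_drive_tb - data_in_use_tb) / (data_in_use_tb * change_rate)
print(f"Incremental versions that fit: about {versions:.0f}")    # about 33
```

That gap is why incremental backup software stretches the same USB drive so much further than plain full copies.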

You could instead set the drives up as RAID 0 and get 4 TB (read up on how RAID works before deciding). If you get the external drive and set up the backup from the start, this is fine, especially if you expect to fill the drives. Just be aware that you are much more likely to need that backup: with RAID 0, any single drive failure takes the whole volume with it, so the chance of losing everything is noticeably higher (see the rough numbers below). It is a valid deployment, mainly for people who have a lot to store and plan to replace drives within two years or so (drive prices keep dropping). Just treat backups as essential: they are strongly recommended with RAID 1 (the array or the NAS itself can still die), and they are a necessity with RAID 0.
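
To put rough numbers on that difference (the per-drive failure rate here is an assumed figure, purely for illustration):

```python
# Illustrative only: yearly data-loss odds for two drives, assuming failures
# are independent and ignoring rebuild windows.
p = 0.05  # assumed chance that one drive fails within a year

# RAID 0: losing ANY one drive loses the whole volume.
raid0_loss = 1 - (1 - p) ** 2

# RAID 1: data is lost only if BOTH drives fail before one is replaced.
raid1_loss = p ** 2

print(f"RAID 0 data-loss chance: {raid0_loss:.1%}")   # close to double the single-drive risk
print(f"RAID 1 data-loss chance: {raid1_loss:.2%}")   # a small fraction of it
```

Either way, the NAS itself can still die or get stolen, which is why the external backup matters in both setups.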

Bigger systems with four or more drives can get more complicated, and they all have tons of features: SMB (Windows/Linux/Apple file sharing), AFP (Apple file sharing), a web server (to host photo albums and share them with family members, for example), streaming protocols (store videos on the central storage and watch them from any device on the network, like the PS4 or Xbox), and even more advanced features (depending on the NAS) like auto-posting to Facebook, connecting a camera directly to the NAS to import pictures, and so on. Most people only use SMB and AFP.

Even if you only have one laptop and a phone, central storage is a great addition, and for a small scale like that there are cheaper, even single-drive, solutions.

Once you start using central storage you will wonder why you didn't do it before, and although it may look more complex than you need, with a little research or help you can get it running in no time.


Updates coming

Ok, so I have been neglecting the blog for a while, such is life.

However, I have been playing with some new technology and different projects, so I have new ideas I would like to share. Worst case scenario, this becomes my own documentation of those ideas.

Digital Signage Thoughts

Digital signage is what you see everywhere nowadays. The concept looks rather simple: a large-format TV displaying some kind of content, usually promotions. The implementation, though, is not as easy.

I have been working on a digital signage project for a few years now. I have put a lot of time into it, though probably not as much as the project needs.

This will be the first in a series of articles, but I wanted to start with some thoughts before I go more in depth. It should help anyone else considering digital signage.

  • Cost of Hardware: This is an easy thing to consider. The problem is that we should not put too much emphasis on it. For example, the cost would be the TV plus the player (usually a computer or an Android-based device).
  • Cost of Installation: Another part that needs to be accounted for properly. Mounting TVs on drywall is a challenge to do right; if you are not confident enough to do pull-ups from the mounting bracket, it is not mounted right. When possible, consider professional installation. After installing several, I was spending about an hour or more per TV (routing power and network cables, and using the right mounting hardware).
  • Cost of maintaining the project: This is the big part that gets overlooked, and it will vary according to your environment. Don't go with the cheapest solution if it will require more time to implement and keep running; staff time needs to be counted as well. I am constantly managing people because they do not know how to maintain the TVs for the digital signage. It sounds easy, but the truth is that people will not do basic steps like turning on the TV. This ties back to the Cost of Hardware point.
  • Distribution of content: This is how you will get content to the screens, either using a digital signage distribution system or something homebrew (like a simple HTML page; see the sketch after this list).
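
As an illustration of the homebrew route mentioned above, here is a minimal sketch that builds a rotating HTML page from whatever images sit in a content folder. The folder name and the 10-second interval are assumptions; a real distribution system adds scheduling, remote updates, and monitoring on top.

```python
#!/usr/bin/env python3
"""Homebrew signage sketch: build a page that cycles through images in a folder.

Illustrative only. The 'content/' folder and the 10-second interval are
assumptions; point the player's browser at the generated index.html.
"""
from pathlib import Path

CONTENT_DIR = Path("content")   # hypothetical folder holding the slides
INTERVAL_MS = 10_000            # show each slide for 10 seconds

images = sorted(p.name for p in CONTENT_DIR.glob("*")
                if p.suffix.lower() in {".png", ".jpg", ".jpeg"})

slides = ",".join(f'"{CONTENT_DIR}/{name}"' for name in images)

html = f"""<!DOCTYPE html>
<html><head><style>
  body {{ margin: 0; background: black; }}
  img  {{ width: 100vw; height: 100vh; object-fit: contain; }}
</style></head>
<body>
  <img id="slide" src="">
  <script>
    const slides = [{slides}];
    let i = 0;
    function next() {{
      document.getElementById("slide").src = slides[i];
      i = (i + 1) % slides.length;
    }}
    next();
    setInterval(next, {INTERVAL_MS});
  </script>
</body></html>
"""

Path("index.html").write_text(html, encoding="utf-8")
print(f"Wrote index.html with {len(images)} slides")
```

Run the player's browser in kiosk mode against the generated index.html and regenerate the page whenever the content folder changes.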

Now that you have these basic points, let me expand a little as a reference.

When considering the hardware, you have commercial digital signage displays and regular consumer TVs. After a bit of searching, we used LG consumer models. They were priced right and have pretty good quality. They were also half the cost of the digital signage models, which allowed us to use 55″ screens instead of 42″. For players we used Android-based sticks (these are really old now) and Lenovo M73 Tiny computers. The initial test worked great, so we deployed 50 of these.

And that is where the problems started surfacing.

We did not account for people not knowing how to use TVs. People forgot to turn them on, did not understand what an input is, and could not configure the TV.

Our next problem was misinformation. At the beginning we were going to use a smart stick powered by the TV's USB port, but we had to change it because the Android device was too slow. The users still thought that turning on the TV turned on the PC as well. The problem was that the information got simplified when it was first presented (almost a year before full deployment), and the updated information was not retained.

We also did not like Windows 7; Windows 8 (and later 8.1) was better for large displays, but installing it increased deployment time.

(I will expand on the deployment in another article)

What is the underlying problem?

Accountability. If you are using your own equipment, you make sure Wi-Fi is connected and the laptop is on. When it is the company’s, the user simply “does not want to be bothered with it”.

Sounds easy to fix. Simple enough… We covered the issue in meetings and all management agreed, but something still got lost when the communication was passed downstream. A few weeks later the same problems were back.

What is the solution?

Doing the proper research before the project and putting all the numbers together. We saved a lot of money on hardware, but we spent more on maintenance and on making sure everything kept running. Now the project runs smoother, but not as smoothly as it should.

Quick Recommendations

Use digital signage equipment. Not just because it is rated to run 24/7, but because it can be remotely controlled. For example, Samsung/LG/NEC/[take your pick] will have an RS232 port, or some connectivity for a controller card. This lets you turn the TV on and off automatically and lock out every input except the one you selected, which removes end-user errors. The cost per installation will roughly double, but the ongoing maintenance will be reduced.
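
As a rough idea of what that control channel looks like from a script, here is a sketch using the pyserial package. The serial port, baud rate, and command string are placeholders; the real command bytes come from the display's RS232 protocol manual and differ by brand.

```python
# Illustrative sketch: send a power-on command to a display over RS232.
# Requires pyserial (pip install pyserial). The port, baud rate, and the
# command bytes are placeholders; take the real values from your display's
# RS232 control manual, since every brand uses its own protocol.
import serial

PORT = "/dev/ttyUSB0"   # hypothetical USB-to-RS232 adapter
BAUD = 9600             # a common default, but check the manual

# Placeholder: replace with the actual power-on sequence from the manual.
POWER_ON = b"<power-on command>\r"

with serial.Serial(PORT, BAUD, timeout=2) as display:
    display.write(POWER_ON)
    reply = display.read(32)   # many displays acknowledge control commands
    print("Display replied:", reply)
```

Run from the player on a schedule, this is how "nobody turned the TV on" stops being a daily problem.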

Also, set the right expectations. We went crazy with what we wanted this project to do. 80% of what we were planning was never implemented, and we could have done a simpler job.

What is next?

In the next article I will describe what we did and how we did it, along with the challenges we discovered. After that, I will cover what I think we should have done, given what I learned. Hopefully you can skip the learning curve.

SteamOS and Steam Hardware thoughts

You might have heard of SteamOS or read an article about how it will impact consoles.

SteamOS is basically to PC games what ChromeOS is to web browsing. It is an operating system (the OS part) designed specifically to run Steam games.

Steam is a publishing platform for video games and other software. What made Steam successful was how easy it makes distributing games; however, there are other things in Steam's favor.

The average gamer will buy one game a month or so, usually at its $60 price tag. There are exceptions and variations, but this is what console makers base their business on. It is because of this that big franchise games, like Halo and Call of Duty, sell so well; Halo 4 grossed $220 million on its first day. It is a big industry, and because of this, most consoles center around the blockbusters.

However, there are a ton of other good games around, and not all gamers are willing to spend $60 on a game. This is where Steam shines.

PC games also have longevity going for them. I can still play Need for Speed 2, released in 1997, or Silent Hill 2, released in 2001. A remake of Silent Hill 2 was released, but I would have to pay $30 to get it (with graphics similar to the PC version) together with Silent Hill 3. While that is a great price for two games with better graphics, and I do recommend it, it is an extra cost when I already own both on PC.

Silent Hill 2 has some compatibility problems with Vista and up, but it runs great on XP, in a virtual machine, or even on Linux using Wine.

Steam has Final Fantasy VII available, so you can now play a game released in 1997 without having to tweak the PC. This is another point where Steam shines: the games usually just run.

GOG.com offers a similar experience, but it does not have the all-in-one interface.

There is another big win for Steam that consoles and other distribution platforms have not mastered: the PROMOTIONS. For example, at the moment of writing, Steam has a promotion for the FlatOut Complete Pack, which includes four FlatOut games for $9.99, 75% off the regular $39.99. While Xbox and PSN have similar promotions, the prices are usually not adjusted, so by comparison FlatOut would cost $39.99 on those networks. I could not find the games for sale online on those networks, but…

Let's take a look at Painkiller Hell & Damnation: $29.99 on the Xbox Marketplace, but $19.99 on Steam. That is a $10 difference. Considering Steam runs promotions, it would not be rare to find a Painkiller promotion with all the games for the price of one. The average one-year-old game is about $30 on Xbox 360 and PS3, but on PC that game would cost $15 or less.

Considering most distribution platforms require you to be online to play, Steam has a clear advantage here.

The problem they face is providing a consistent experience for players; not all computer games are enjoyable with a controller. PlayStation and Xbox lose money on the console but make it back on the games. PC gaming has a higher upfront cost for the computer, but the games are cheaper (there is no licensing fee to pay to the console makers).

However, considering a new Xbox One costs $500, which is the price of a decent PC, and you can get a fairly decent video card for $150, a PC is not so expensive compared to consoles.

Let's consider: even if you were to spend $1,000 on a PC (or Steam hardware), you could run any of the games you already own on PC, plus your other software (productivity programs, photo editing, etc.), watch online media like Hulu, Netflix, and Amazon on demand, and with Steam you can play your games on the TV (technically you can play any game on the TV, since it is just a big monitor). Depending on how many games you buy, and if you can stay away from brand-new games, you could save a ton of money and recover the cost of the PC quickly.

Twelve games at $60 each is $720. I make it a point not to spend more than $10 per game (I buy online, from GOG.com, Steam, Amazon, and GameStop). Even at my top price per game, that $720 would buy 72 games, and if I buy packs, I could have over 100 games in a year.

I currently have 170+ games on Steam, but over 700 games in total, and at least 400 of those are PC games. I have been playing PC games since 1992, and every once in a while it is nice to play an old one, for example X-Wing, which now sells for almost $100 thanks to being a collector's item. If you wanted to play an old console game, you would have to connect the old console, or buy the game again on the console's store (and it might not work on the next-generation console).

In PC gaming, there is always a way to make the game work.

So, I think the Steam hardware is a good idea. Because of the flexibility of PC gaming, it will be tough for Steam, but more choices cannot hurt players; they only give them more power.

BTW, I used to have a PC connected to the main TV. It is not connected anymore because the Xbox does what we need most of the time (Hulu and Netflix, with occasional gaming), and I end up doing most of my gaming downstairs on my main PC. However, PCs can now be small and still have enough power to run most games, so it is a real possibility.

PC gaming is a viable way to game, and it has a user base. Steam hardware will have its place as well, so I think that although it might have a rough start, it can be rewarding for people who want to try it. Since I have spare PCs at home, I will try SteamOS.

Let me know your thoughts