Central Storage for Home is a must

Yes, you read that right. Central storage, just like in work networks, is a must at home.

You might have bought a laptop with an SSD. It is fast, but it only has 128 or 256 GB of space.

Maybe your desktop is old, and you don’t trust it.

You have a smartphone and/or a tablet.

Most likely you take a ton of pictures of your kids, and some videos. Most phones record video at FHD or higher by default (1080p, QHD/2K, even 4K). Soon you will need space on that phone or tablet.

You need to back up that old desktop, and the laptop does not have enough space.

Even if it did, it is a good idea to have it available for all devices.

Enter the central storage for home.

There are many ways to do it. The easiest is the NAS route. You can also use a spare desktop, but it needs to be always on, and you might need drives.

SmallNetBuilder also ranks and reviews NAS devices, so it is a good source of information here as well, the same site I recommend for wireless research.

The other option is to convert, or add the function to, a desktop that is always on. You can just add the drives and share a folder.

A NAS has the advantage that it usually consumes less electricity, and since it will be on 24/7 that is something to keep in mind. Their interfaces are generally simpler as well, and they come with several functions that can be turned on with a switch in the setup page.

A computer has the advantage that you might already have one, and it might also be faster (a lot of NAS devices are mini computers already), but it is more complicated to set up. Another advantage is that you may already have a computer that is always on. In the end it will require more technical involvement, but it is not difficult, and there are lots of guides. I personally have an OLD machine that cannot even browse the web anymore, but it has worked great running Linux to store files as a backup.

One thing that people forget is that you will need two central storage devices.

Don’t think of your central storage as backup, think of it as storage for your photos, videos, documents. This way you can keep storage available in your device, and also access those files from any other device.

The second storage is to backup the first one.

The good news is that the second storage only needs to handle backups, and often the main NAS/computer can work directly with it.

This second storage can then simply be a USB drive.

Most NAS devices (and all computers) have USB ports, so you can connect an external drive to run a backup job, and most NAS devices have a built-in backup function. The important part is that you want this function to be automatic. WD MyCloud drives, for example, require that you log in to the web interface and run the backup manually. Not ideal, because you will forget, and when you need to recover something from backup, it could be several months old.

I haven’t worked with all NAS, so I don’t have a recommendation there.

With a computer this is a bit simpler, because there is a ton of backup software that can run incremental copies (and this saves the most space).
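As a rough sketch of the incremental idea (not a recommendation over dedicated backup software, just an illustration with GNU tar and made-up paths): a snapshot file records what was already backed up, so each run only archives what changed.

```shell
# Incremental backup sketch with GNU tar (example paths).
mkdir -p data backup
echo "photo1" > data/photo1.txt

# Level 0: full backup; this run also creates the snapshot file
tar --create --listed-incremental=backup/snap \
    --file=backup/level0.tar data

# Something new appears; the level 1 run only picks up the change
echo "photo2" > data/photo2.txt
tar --create --listed-incremental=backup/snap \
    --file=backup/level1.tar data

# List what each archive actually captured
tar --list --file=backup/level1.tar
```

Dedicated backup tools do the same bookkeeping for you, plus scheduling and retention.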

Let's say you get a Synology (which seems to support automatic backups) and you put in two 2 TB drives. Because you will most likely set the drives as RAID 1 (where the content is mirrored to each drive), your USB drive should be at least 3 TB. You need the extra space so you can keep multiple copies of files (to recover different versions of the same file).

Ideally you won't fill the NAS either, so a 3 TB USB drive can hold 4 or 5 versions of the same data (it depends on how full the NAS is and how the backup software works).

You could instead set up the drives in RAID 0 and get 4 TB (read up on how RAID works before making a decision). If you get the external drive and set up the backup from the get-go this would be fine, especially if you expect to fill the drives. Just be aware that you are much more likely to need that backup: in RAID 0 a single drive failure kills the whole array, so the risk of data loss grows with every drive you add rather than just doubling once. It is still a valid deployment, mainly for people who have a lot to store and plan to replace drives in two years or less (drive prices keep getting lower). Just treat backups as essential. They are recommended with RAID 1 (it can still fail, or the NAS could die), but they are a necessity with RAID 0.
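To make the capacity math concrete, here is the arithmetic for the two-drive example above (simplified; real usable space is a bit lower after formatting):

```shell
# Capacity arithmetic for two 2 TB drives (simplified).
DRIVES=2
SIZE_TB=2

RAID1_USABLE=$SIZE_TB                # mirrored: one drive's worth of space
RAID0_USABLE=$((DRIVES * SIZE_TB))   # striped: the sum of all drives

echo "RAID 1 usable: ${RAID1_USABLE} TB (survives one drive failure)"
echo "RAID 0 usable: ${RAID0_USABLE} TB (any drive failure kills the array)"
```

Same two drives, double the space with RAID 0, but none of the redundancy.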

Bigger systems with 4 drives can get more complicated, and they all have tons of features: SMB (Windows/Linux and Apple file sharing), AFP (Apple File Sharing), a web server (to host photo albums and share with family members, for example), streaming protocols (store videos in the central storage and watch them from any device on the network, like the PS4 or Xbox), and even more advanced features (depending on the NAS) like auto-posting to Facebook, or connecting a camera directly to the NAS to import pictures. Most people only use SMB and AFP.

Even if you only had one laptop and a phone, central storage is a great addition, and for a small scale like that there are cheaper, even single-drive, solutions.

Once you start using central storage you will wonder why you didn't use it before, and although it looks more complex than what you require, with a little research or help you can get it running in no time at all.

What is the ideal hardware for home?

This will surprise most people, but the lines between home and work are blurred. Most people do personal stuff at work, and some work at home.

Even if you don't work with computers, if you use computers at home, your setup will mimic a cheaper and simpler version of a work network.

The reason is because a work network has to be efficient, reliable and keep costs manageable. Doesn’t that sound like something you want at home too?

I already posted about the laptop you would want. But what else do we need? Let's make a list:

  • Reliable wireless router.
  • Good laptop
  • Maybe a desktop
  • Central storage device
  • Tablet
  • Phone

Tablet and phone? Well, yes! Most likely you already have them, and you use them. At a bare minimum you already have a smartphone, so it should be part of your network.

Now, here is what people miss, and it is important: a reliable wireless router. Most of the routers provided by the ISP (Internet Service Provider, like Comcast and Verizon) are plain trash. Even if the hardware itself is not trash, the software in the router (yes, it has a mini operating system) is usually outdated and crippled. I have a long-standing fight with Comcast about their modems: excellent hardware, bad software that crashes and forces you to reboot the modem.

https://www.smallnetbuilder.com/wireless/

It has tons of reviews and charts. It is more technical, but then again I always recommend doing research, and wireless is complicated enough already. Honestly, people underestimate the wireless function of the network, when it should be the heart of it.

If you use a wireless router, make sure it is at the center of the house and not hidden away. Do not put it in the basement; do not hide it behind other electronics. Everything the signal needs to pass through to serve you Wi-Fi will reduce the range and quality.

If you still would like to hide it, then use Access Points to provide Wi-Fi.

I personally use a Buffalo router for my home Wi-Fi, and Ubiquiti for the work Wi-Fi (at home). The Ubiquiti AP has stronger range and sometimes drowns out the Buffalo though, but the Buffalo has stronger range than most cheap Linksys and Verizon routers, and I used to live in an apartment where all the neighbors had Wi-Fi.

Lots of options, lots of products, but generally think $150 for the router and $100 for an AP, and be aware of 2×2 or 3×3 (how many antennas per band, so 2×2 means two antennas for 2.4 GHz and two for 5 GHz). How many antennas you need depends on how many devices you have and on the location (which is the reason you want the Wi-Fi antennas in the middle of the house, not hidden).

Also, don't just throw money at a 4×4 router if you will put it in a corner of the house. Since the antennas are semi-directional, the ones pointing away will not be serving Wi-Fi to the house. I am not going to expand on that; there are guides on SmallNetBuilder and other places on the web. But I want to reiterate that location affects Wi-Fi, as do the number of devices and the building materials (the common plastic tile in kitchens, for example, completely blocks Wi-Fi).

In the end your wireless and the router matter.

If you can, also use your own modem (possible on cable systems; you cannot with FiOS).

For a good laptop, I already posted an article.

The Desktop should be similar.

So now we are left with the central storage device. I will save this for a new article, because it is also something that most people don't think about, but most people need.

 

Zorin OS (Linux) source update

Today I finally decided to fix several
errors that were happening while I was updating a Zorin OS box.

I update over the network using ssh, but this will also work if you do it
from the terminal.

First, we will update the source list

sudo apt-get update

Now we will run the updates

sudo apt-get upgrade

All normal so far. I mean, we run these commands weekly, right? (Or at least you have your system set to auto-update.)

The difference today is that we will read the output.

Let's fix the easy part first. Most likely we have old kernels (upgrade would have notified you about them when it ran):

sudo apt-get autoremove

That was easy… but wait, it is possible that it will complain about running the boot loader because a symbolic link is damaged. Silly link. Usually this does not cause problems, but let's just fix it. The exact error is “The link /initrd.img.old is a damaged link”, “you may need to re-run your boot loader [grub]”.

sudo update-grub

OK, now that that is fixed, let's fix the other possible errors.

Running update we could get either or both of two errors: one about the Google Chrome repository and one about Opera's.

We could google the errors. That is the easiest way, and probably how you got
here, but let’s do the short version instead.

The Google error looks worse, so let's take a look at it first.

It is complaining that the package could not be found. This is because Google Chrome is not available in 32-bit anymore. So first let's check if your system is running a 64-bit OS:

uname -a

(you can also use arch)

If you would like more information, check this page:


http://www.howtogeek.com/198615/how-to-check-if-your-linux-system-is-32-bit-or-64-bit/

If your system is i386 or i686, then the best option is to uninstall Chrome, or re-install Linux using the 64-bit version. If you are running 64-bit (x86_64), then we can switch the repository.

To switch the repository to 64-bit you will have to modify the Google sources:

cd /etc/apt/sources.list.d/

Once inside the folder, we will edit the list using nano:

sudo nano google-chrome.list

Once we have the list open, we can see that most likely there is only one entry, and it is for Chrome. To this entry we will add [arch=amd64], right after deb and before the link. This will force apt to download the 64-bit binaries. The line should look like this:

deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main
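If you prefer a one-liner over nano, the same edit can be scripted with sed. Shown here against a local sample file; on the real system the file is /etc/apt/sources.list.d/google-chrome.list and the sed command needs sudo:

```shell
# Sample file for illustration (the real one lives in
# /etc/apt/sources.list.d/ and editing it requires sudo).
echo 'deb http://dl.google.com/linux/chrome/deb/ stable main' > google-chrome.list

# Insert [arch=amd64] right after "deb"
sed -i 's|^deb http|deb [arch=amd64] http|' google-chrome.list

cat google-chrome.list
# deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main
```

Either way, the end result is the same line shown above.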

Now we can update the repositories again and that error will be gone. Then run an upgrade to update Chrome. You can also force an install by entering:

sudo apt-get install google-chrome-stable

This is easier and faster than it seems, but I don't like to post commands without explaining what they do.

That is all. Good night… Just kidding, we still have an Opera issue.

Opera has been moved to the main repository, so the easiest thing to do is just remove the extra source. Remember that to fix Chrome we went to the sources.list.d folder? We should still be there. Run a directory listing (ls -la) and you will see two files:

  • opera.list
  • opera.list.save

You have two options: you can comment out the line, or just remove the file directly.

Before that, though, I would check whether you have it installed:

sudo dpkg --get-selections | grep opera

If nothing shows, then it is not installed. Simply comment out the line or delete the files (I suggest deleting them, to keep a clean system). That is it.

If it was installed, simply remove it, then comment out or delete the source file.

If you like Opera, download the .deb from their web page:


http://www.opera.com/computer/linux

It seems like a lot of work, but honestly I had not paid attention to these errors for quite a while, so it is not as bad as it looks. All of this can take less than 10 minutes to fix if you just enter the commands.

I had to search a bit, and it still took me about 30 minutes to get everything fixed, which is not bad.

I got the Chrome information mainly from a Reddit page, and the Opera fix from the Ubuntu forum. I had to customize the commands a bit because they were for vanilla Ubuntu, and I am working on Zorin OS (based on Ubuntu/Mint), but the commands should work on all flavors. This is the main reason I explained what they do.

At the end I also like to reboot to make sure everything is working peachy :). It is a good practice, because if something is not working as it should, the information is still fresh in your mind and it is easier to troubleshoot. To keep things clean I also always restart before I start.

Creating virtual Servers in Apache2 using Webmin

I like Webmin. I can edit configuration files by hand, but I have to confess that I don't memorize the syntax and options, so I only edit configuration files when I need to copy and change something quickly.

If I am going to start from scratch, then Webmin is my tool of choice.

I won’t cover in detail how to install Webmin, but here is the short version:

Using apt-get

sudo sh -c 'echo "deb http://download.webmin.com/download/repository sarge contrib" > /etc/apt/sources.list.d/webmin.list'
wget -qO - http://www.webmin.com/jcameron-key.asc | sudo apt-key add -
sudo apt-get update
sudo apt-get install webmin

If you use Yum:

(echo "[Webmin]
name=Webmin Distribution Neutral
baseurl=http://download.webmin.com/download/yum
enabled=1
gpgcheck=1
gpgkey=http://www.webmin.com/jcameron-key.asc" > /etc/yum.repos.d/webmin.repo
yum -y install webmin)

I recently created a few WordPress sites, and I had to create the virtual servers for them, and I ran into a tiny problem where it was not working correctly. It drove me crazy, but I took some notes.

Using Webmin is easy to create virtual Server. The process is documented in the Wiki at http://doxfer.webmin.com/Webmin/Apache_Webserver#Creating_a_new_virtual_host

However, we all know that we can’t read that much.

To create a virtual host (virtual server), go to Servers > Apache Webserver, then click the tab that says “Create virtual host” (yes, to create a virtual server, you create a virtual host).

The options here are pretty simple. VERY SIMPLE!!

Handle connections to address: basically this is for multi-homed servers more than anything. If you have an IP address per site, here is where you specify it.

Port: this is the tricky part. You would be tempted to pick Any, right? Most of the time you actually need to listen on a specific port, which will usually be port 80.

Why? Well, because in my case I am using a specific server name, and in most cases you would too, and it seems to work better when you set the port instead of just using the default (at least with WordPress).

If you need to use HTTPS, then you will have to have a certificate per site, and also an IP address per site, so that will change the previous setting. Adding SSL is similar, but the port would be 443, and you need mod_ssl enabled too.

Next enter the root location for the site (for WordPress for example it is /usr/share/wordpress).

After that, enter the server name. This would be thisismywebsite.com, for example.

The “Add virtual server to file” section should stay as it is. The default is to add the site to sites-available, and then it creates the link in sites-enabled.

Finally you can copy directives if needed (this is useful when you have special folders, or if using SSL with a wildcard certificate).

Then click on Create Now, and then apply changes. Your site should be working now.
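For reference, the file that ends up in sites-available looks roughly like this. A sketch using the example names from above (WordPress root and site name are the examples given earlier); it is written to a local file here so it can be inspected, while the real location is /etc/apache2/sites-available/ and needs sudo:

```shell
# Rough sketch of the vhost file Webmin generates (example names;
# the real location is /etc/apache2/sites-available/).
cat > thisismywebsite.conf <<'EOF'
<VirtualHost *:80>
    ServerName thisismywebsite.com
    DocumentRoot /usr/share/wordpress
</VirtualHost>
EOF

# Webmin's "apply changes" step is roughly equivalent to:
#   sudo a2ensite thisismywebsite && sudo service apache2 reload
cat thisismywebsite.conf
```

Knowing what the generated file looks like also helps when you use the Edit Directives screen later.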

There are more settings, and it is a good idea to refer to the Webmin or Apache documentation for them, but my most used settings are:

Aliases and Redirects: I use this when I have sites located under a master root, but the folders are in other parts of the system. I do this a lot to save files on a separate partition.

In Networking and Addresses you can add alternate virtual server names.

Directory indexing is another section you might change if you need to edit access to files.

Edit Directives is basically quick access to the configuration file for the site, so it is also useful.

That is it!!! Were you expecting something more complex? You can do more complex setups, but it is always good to start simple and then grow from there. The two things to watch out for are port 80 instead of Any, and a dedicated IP address per site when using SSL.

Of course there are a ton of configuration changes and mix-and-match settings, but this way you can quickly have multiple sites running.

What is RAID?

RAID is a technical term that is used a lot in IT. Sometimes it is difficult to explain what RAID is and why you should use it.

Spiceworks has a great video, only 4 minutes long that explains how RAID works, and even non IT people can understand it.

What is RAID?

Note: RAID does not refer to the bug spray, but to a storage technology usually used in business. Even an SMB (Small Business) should use RAID.

Another thing to note, is that for a ton of technical reasons I use RAID 10 (or RAID 1+0, or RAID 01). Worst case scenario I would use RAID 1 (when 10 is not supported).

The article One Big RAID 10: The new standard in server storage talks about the approach to RAID 10 and why it should be used.

But the main thing you should know is that I have actually had to restore broken RAID 5 arrays, and I have seen two RAID 5 arrays die during a rebuild (when replacing a damaged drive).

 

The value of Backup and Data

I have been bad….Very BAD!!!

When I started the blog, my second article was about home backups. I started writing it, and before I knew it, it was too long. My problem was that I was covering too much ground, and too much information was fresh in my mind.

Well, after my article on NMS I decided to write this one, which will start a series of backup articles. Why a series? Because there is too much information I need to give you, and I don't want to bore you too much, but backup is so important, and people give it so little importance, that I MUST convey the information.

What is the Value of Data?

This is, more than anything, a question you need to ask yourself. The data that is important to me might not be important to you. And value is even harder to assign. However, there are some easy points to quantify it:

  • Can you reproduce the data?
  • Is it easy to host someplace else?
  • How vital is it to have it available?
  • What is the damage if the data is lost (economic and emotional)?
  • How often do you need it?

None of these questions by themselves can put a value on data, but all together they can. For example, the perfect picture of the first birthday of your first child. You might not look at the picture all the time, but if it were lost forever, you would remember that picture and be sad about it (emotional value). However, because it is a picture, it is probably also hosted on Picasa, Flickr, Facebook or another social network. Maybe not at the same quality, but at least you could save it from there if it were lost from your computer. On the other hand, take the QuickBooks database for your small/home business. Losing it might not slow you down much day to day, but it could have serious repercussions when tax season arrives.

Now that we have an idea how to assign value to data we should…

Assigning Value to Backup

The value of the backup is proportional to the value of the data. That sounds very mathematical, but simply put: if the data is very valuable (or invaluable) to you, then the backup is as well. VERY SIMPLE, no?

Although this is simple, I am still amazed to find that most people don't back up regularly. I know I almost never back up most things at home; however, most of that data I don't care for (even though I have terabytes of it), and the data I do care about I keep on my desktop, my laptop, my work PC, and my old PC. It is a mess, but if needed I can recover most of it.

This mess of data also points to another value of backup: backing up will organize your data. It is more a consequence of backing up, but when you start to plan and put a backup plan into effect, you also end up organizing your data, which later saves you time when searching for it.

Extra benefits of backup

When setting up a backup you will get plenty of benefits, some more visible than others; these are the ones I can think of right now:

  • Data can be recovered
  • Data will be replicated
  • Data will be organized
  • Data will be centralized
  • Multiple versions of the same data (*1)
  • When replacing a computer, moving data is easier
  • The backup can be backed up again (maybe to the cloud, offline, or remote)

(*1) It depends on the backup, but at least you have one older version of a file. Some backups support more than one version. If a file becomes corrupted, or you save a change that you want to undo later, an older version is the solution.
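A minimal sketch of how versioned copies can work with nothing but coreutils: hard-linked snapshots, where unchanged files share disk space between versions. Example paths; real backup software automates this bookkeeping for you.

```shell
# Hard-linked snapshot sketch (example paths). Unchanged files are
# hard links, so each extra version costs almost no space.
mkdir -p data
echo "v1" > data/report.txt

cp -al data snapshot-monday          # -l: hard-link files instead of copying

# Replace the live file via a new file + mv, so the snapshot's hard
# link keeps pointing at the old content.
echo "v2" > data/report.txt.new
mv data/report.txt.new data/report.txt

cat snapshot-monday/report.txt       # still "v1"
cat data/report.txt                  # now "v2"
```

This is the trick tools built on rsync-style snapshots use to keep many versions without many full copies.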

Thinking about value

Unless you do all your work online and you don't care about data, there is always something on your computer you won't want to lose. Backing up is important, it has extra benefits beyond being able to recover data, and the value of the backup is the same as that of the data, or even greater in most cases.

So, do you value your data? And are you backing up your data?

Network Monitoring Service. Is it necessary?

If you are looking for the absolute answer, it is NO. Why? Because if your network is working without it, it will continue to work without it.

So, is that it? Absolutely not. Think about it: in life, as well as in IT, we do a lot of things not because they are necessary, but because they improve a condition, or are better in some sense.

For example, when you buy a computer, I can almost guarantee that regardless of the reason, you always get a better computer. As a concrete example: your laptop gets dropped on the floor and is completely destroyed. The laptop itself was about a year old. Do you need a new laptop? Most likely not, but it helps you so much with a ton of work, browsing the web, etc. We actually depend on our electronic devices, but they are not a necessity. To add to it, you most likely will get a better laptop, since in a year, for the same price, you can get something better. Here comes another point: you didn't need the faster laptop, but you reason that you NEED it. Confusing, right? You could buy a used laptop for way cheaper.

Network monitoring is a bit more abstract, but follows the same reasoning. You don't need it. Your network works without it, but if you have it, it will help you do so many things better and faster, just like a new laptop :)

However, there is a common misconception that network monitoring can be expensive. And it can be, but it can also be really cheap, almost free.

The most expensive part of network monitoring is the human resources behind it. If you want a third-party company to set it up and manage it, that can be expensive. However, if you limit the third-party involvement, the cost can be manageable even for a small company with no budget. Even a nonprofit organization should have network monitoring.

So maybe you are wondering why I say it is not necessary, but that it should still be deployed. Well, let's not get confused. Take this scenario as an example: a nonprofit organization with 15 people that does not have a Network Monitoring System (NMS), but is having problems with its computers. This organization does not have an IT person, but it has an unofficial “IT” person who, when needed, calls a real IT person, who either donates time or needs to be paid. The in-house “IT” is at a loss over what the problem could be, so he calls for support, but right away he is asked a ton of technical questions that he cannot answer. In the end, the “paid” IT person has to visit the site and troubleshoot from scratch. This troubleshooting can be time-consuming, and it is essentially information gathering. Regardless of whether the time was donated or paid, for the organization it is an expense. Donated time wasted in troubleshooting could have been used for something else, for example. With an NMS set up right, however, a quick look could have shown whether the problem happens at a specific time, and whether it is network, CPU or hard drive related, because the NMS keeps historical data on the status of the network.

An added advantage of an NMS is that it will show problems before they happen. For example, a failing hard drive will be recorded. A server with high CPU usage or low memory will be recorded as well, and it can be remediated before it becomes a real problem.

Cheap NMS setups

There are actually a lot of NMS products that can be set up cheaply, but I will list the ones I know and use.

  • Xymon/Hobbit: This is my favorite quick NMS. I can set it up from scratch at my work in about an hour. It does require a little more understanding about the network and what you want to monitor, but it is easy to set up once the configuration file is understood (and it is quite easy). The main advantage is that it is fast. Check the demo site. My Xymon setup runs in a 1 GB RAM virtual machine that also hosts a ton of intranet services. It monitors 88 hosts, in 6 pages, with 399 checks.
  • SpiceWorks: It tries to be an all-in-one IT service, and it does it pretty well. The NMS is quite inclusive, with reports and dashboards. Very easy to set up, but it is a bit more demanding and requires a Windows PC joined to the domain. I use it when I want to see more information. It supports Linux, VMware, OS X, and other types of clients. You can be up and running in 10 minutes, but it takes a little longer to configure the application to the network afterwards (fix labels, complete information, etc.). The community itself is the tool I use the most, though.
  • Nagios: A very powerful and popular NMS. It can also be very complicated. There are commercial and free editions, and the commercial one is also free for up to 7 devices.
  • Foglight NMS: Free for up to 100 devices. I haven't actually tested this software yet, but it looks interesting. Foglight does have a lot of different services; the NMS only monitors the network in general. There are Foglight applications for more in-depth monitoring of Exchange, SQL, Oracle, AD, etc.
  • BigBrother: Just like Foglight, Big Brother is owned by Quest Software; however, there is no more development for the free edition (last update in Dec 2005). It is still a great product, and the Professional version has attractive dashboards. I have a feeling, though, that Quest is moving towards the Foglight product instead. Xymon is a fork of Big Brother and is still being developed, so I recommend Xymon instead.

So here are five products that are free. The only requirements are a PC to run them on and a little time.

Observations about NMS

You can run the NMS on different PC types; however, keep in mind that they are network applications. To that end, consider these recommendations:

  • Spiceworks only runs on Windows, and if scanning a domain, the PC needs to be a domain member (this requirement might have been removed in version 5). This means you will need a Windows license for the machine running it. Also, it is not recommended to use the PC for something else: Spiceworks will make the PC slower, or the dashboard will be slow while the PC is in use.
  • In my case, we run Spiceworks on a PC that is used occasionally, by someone who is usually out of the office and only does web browsing.
  • Windows and Linux work differently on the network, Linux being much more efficient, mainly in the ping department.
  • BigBrother and Xymon run better on Linux.
  • Xymon has packages that are easy to install in rpm and deb format, making an Ubuntu or Mint server an easy choice (and my usual choice). Installing on Ubuntu takes 10 minutes (including dependencies) and is as easy as running “sudo apt-get install xymon hobbit-plugins”.
  • fping is a must when building the NMS, mainly on Windows. The Windows ping utility can easily freeze a NetJet network card from HP, for example. fping will ping faster and reduce the latency.
  • BigBrother and Xymon require a small client to check the local status of remote devices (CPU, RAM, event log), but can check other services without a client, for example SMTP, AD, DNS, etc.
  • Spiceworks does not require a client, but requires access to the machines using credentials. It is not recommended for a Bring Your Own Device (BYOD) network. BYOD is when the computer (usually a laptop) is owned by the user, so there is no centralized login.
  • A run of Xymon on a virtual Linux server is faster than on a Windows XP machine (and it is tons easier to set up).
  • This is not NMS-specific, but Linux has advanced a lot in the last couple of years in installation and maintenance, making it very easy, so it should not be avoided.

What about the big names and expensive solutions?

They are certainly not bad, and they are not a bad investment either, but if you are considering paying for those solutions, then there is no reason to wonder whether an NMS is for you after all.

When NMS becomes a necessity

As the network grows, the advantages of an NMS become easier to see. A network bigger than 50 devices should have an NMS regardless of budget. The reason is that you have at least 50 different points of impact on the network, and troubleshooting time alone will justify spending time, money or both to get a good NMS in place. And as I already listed, an NMS does not need to be expensive.

A local IT provider usually offers hosted (remote) and monitored NMS. These can be expensive, but if there is no time, or no willingness, to check the NMS yourself, then it is certainly an option to consider.

Monitoring a Network Monitoring System

Although it sounds redundant, the NMS needs to be monitored as well. After all, at heart the NMS is a data-gathering system. Sure, it can have rules, alerts and actions, but no system, no matter how expensive, is completely automated. Here is where the cost I described initially comes into play. The best scenario is to have someone in-house looking at it (over coffee, for example) every morning. An offsite NMS would usually include something like that. If you are looking to have it externally managed, I would recommend having it monitored at least once a week, with an option for an on-site check on demand. The value of the monitoring lies in the cost of losing service on the network and in how valuable the data is (another article on backup and data value is on the horizon).

Closing thoughts

  • Network Monitoring Systems (NMS) are not a necessity, but you should still use one.
  • NMS can be free if you dedicate a little time and love to it.
  • NMS is good even for small companies.
  • NMS can even be used at home.
  • There are bigger NMS packages available for free. For example, Nagios XI for up to seven devices, and Foglight for up to 100 devices.
  • An old Linux box with Xymon can be a great NMS, monitoring a lot of devices.
  • If more services and support are needed later on, there are packages available, but at minimum even a not-fully-configured NMS will save valuable time troubleshooting.

So, no excuses. In the time it took to read this article (even if you only read the intro and summary) you could have installed an NMS.

Write down your expectations for the NMS (don't go too crazy if no budget is allocated, but don't sell it short either), and list what is important to monitor.

Now for a shameless marketing plug: if help is required, or you need advice to put an NMS in place, please contact us.

By now you probably know that I am a fan of no-nonsense deployments, so rest assured that your NMS deployment will be in the expected range. This reminds me of a personal story…

The first time I looked into NMS, I contacted a third-party vendor. At the time I was serving a mid-size network with the requirements of a big company. However, this was more of a personal project (a project I thought was good for the company, but the company did not share my thoughts). The proposal was about $30K for a year, with a $50K option. Sure, all the bells and whistles were good, but I needed something cheaper at the time. I did not let the scary proposal scare me away, and I kept looking. A week later I had Big Brother running (it took me 4 hours to set up at the time), and within 2 months it had saved me over 16 hours in helpdesk tickets, because a lot of problems were noticed before they happened, or I was able to diagnose them quickly. Would the $30K solution have saved more time? Yes. Would I have been able to get it budgeted? No! In this case, even a little NMS is better than no NMS.