My colleague and I are having a debate at work. We are currently in the process of upgrading several of our remote branch sites from their old Cat5 spaghetti mess to Cat6/6A, and upgrading their switches to our enterprise standard at the same time. We typically do two drops per office: one for a computer and one for another device such as a printer, IP phone, etc. These are complete cabling overhauls where none of the old cabling will be reused. We're remote network support, so we aren't onsite at these locations unless absolutely necessary.
The question: would you buy enough switches to patch in every jack, regardless of whether an endpoint is plugged in, or would you patch in only what's needed to conserve port capacity?
I'm on the side of plugging everything in:
We usually don't have an onsite IT person, so patching everything in would keep local staff from adding their own patch cables and creating a future mess in the MDF/IDF.
It's easier to cut over from the old cabling to the new cabling when every jack is already active, instead of tracking which ports have endpoints on them and which don't.
We're already paying ~$300 per drop; if another switch is needed to support those unused drops, that's only an additional ~$46 per port ($2200 for a 48-port switch gives $2200/48 ≈ $46 per port, excluding switch support).
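For what it's worth, here's the per-port math spelled out (a quick sketch; the $2200 switch price and 48-port count are my estimates above, not quotes):

```python
# Rough per-port cost of fully patching every drop.
# Numbers are assumptions from my estimate, not vendor quotes.
switch_price = 2200        # 48-port enterprise switch, excluding support contract
ports_per_switch = 48
drop_cost = 300            # installed cost per cable drop

port_cost = switch_price / ports_per_switch
print(f"Switch cost per port: ${port_cost:.2f}")          # ~$45.83
print(f"Total per active drop: ${drop_cost + port_cost:.2f}")
```

So the switch port adds roughly 15% on top of what each drop already costs, which is why I see it as a rounding error rather than a real savings.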
His reasoning for only patching in what's needed:
Saves switch port capacity
Reduces security risk by not having live ports an outsider could just plug into (a moot point in my opinion, since we're simultaneously moving forward with an Aruba ClearPass NAC deployment)
What would you typically do for new cabling installs?