Transmission Media

For wired networks the cable is the medium through which information usually moves from one device, computer, or network to another. There are several types of cable commonly used with LANs (Local Area Networks), and today's topic will introduce the two most common cable transmission media: copper and fiber-optic cables. But first we will discuss why we need transmission media in the first place and how we got to where we are today.

The Need for a Transmission Medium

The first recognised computer network medium was known as sneaker-net. This was because, way back in the early days of computing, users who wanted to transfer or share data would copy it onto a floppy disk, get up, walk over to the intended recipient, and hand it to them.

The term sneaker-net came into being when observers, usually other university types, noted that the majority of these young computer types wore sneakers, and since the sneakers were the primary means of transporting the data, the term sneaker-net came into use. It was probably not meant in a complimentary way. The term "geek" hadn't even been thought of at the time, or else it might have been "geek-net", but we will never know.

Oh! And by the way, the floppy disks I am referring to here are the original floppy disks, which were large, flat, round, non-rigid discs of magnetic medium, not the small 3.5-inch types that were, and are still, used by some people today.

The capacity of these disks, which was considered quite large at the time, was around 320 kilobytes. The more modern varieties have a larger capacity, with the high-density varieties that can still be bought today having a standard capacity of 1.44 megabytes.

Today most new machines do not even include a legacy floppy drive, which is probably just as well. They were not the most reliable of devices, and by today's standards their storage capacity was minuscule. Today we might wonder just what use, if any, a floppy disk could be. Well, back then computers were primarily code-oriented, and the output of human-"friendly" text was a considerable and notable achievement. Then we started to see a new and wonderful device being attached to the computer. It was called a keyboard.

Up until then, input had been in the form of punch cards and tape, except for the multi-mega-resource-endowed who could afford magnetic tape reels. For the average enthusiast, the keyboard and floppy disk drives were all the rage. Then we saw another new device come into being: the monitor. Well, now things were really getting up to speed; we could read text (in monochrome, of course) on a television-type device. The world was really becoming a truly wondrous place.

Things only got better as hard disk drives that were "affordable" and compact enough for the smaller computer were mass-produced. With capacities ranging up to 60 megabytes, these were storage giants, and they were big and heavy and most definitely not portable as we know the term today. Now we could load a program into the computer, store it on the internal hard drive, and whenever we started the machine we had the option of running the program stored on the hard drive.

As more programs were developed and hard drives became larger, we saw the need to manage these programs appear in mainstream, although still enthusiast, computers, and this need resulted in the creation of what I will loosely call the early disk operating systems. They were refined and added to as time progressed, and they came to be known by the abbreviation DOS, which many of you may have heard of. It stands for Disk Operating System.

Initially this was either pre-installed or factory-upgraded, but later it became possible to install the disk operating system from floppy disk, which had also evolved into larger than 360-kilobyte capacity disks by then. In fact, 720-kilobyte disks were very popular by now. Well, as it has a habit of doing, time did not stand still, and software grew in its capabilities and size so that multiple disks were required to install the disk operating system, and most other programs as well.

Fortunately manufacturers were introducing larger capacity hard drives, and floppies were evolving into the 1.44 MB capacity disks we know today, and all was well in the world. Software evolved further and became ever larger, with ever-increasing storage requirements.

Major corporations were using mainframes with large-capacity magnetic reel storage, which was wonderful from a storage capacity point of view, but it operated in a linear fashion, which meant that much winding of tape was required to access various data when needed. The answer, of course, was to schedule access to increase the efficiency of use, but that had its limitations.
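Just to make that scheduling point concrete, here is a little sketch of my own (the block positions are made up) showing why servicing tape requests in sorted order beats servicing them in arrival order: the tape simply has to wind back and forth past the heads far less.

    # Toy illustration of tape scheduling: positions are hypothetical
    # block locations along a linear tape, measured from its start.

    def tape_travel(start, positions):
        """Total distance the tape winds to service requests in the given order."""
        travel, current = 0, start
        for pos in positions:
            travel += abs(pos - current)   # winding needed to reach this block
            current = pos
        return travel

    requests = [900, 50, 700, 120, 880]        # arrival order (invented)

    print(tape_travel(0, requests))            # arrival order:   3740
    print(tape_travel(0, sorted(requests)))    # scheduled order:   900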

In the meantime, some bright spark in the computer department was talking to their mother via long-distance telephone when the thought occurred to them: if they could converse over such a long distance, surely machines could do the same. Okay, maybe not over long distance, but across the room would be handy. So it's off to the communications department that our hero goes. Upon arrival he asked the guys down there the big question. The reply he got nearly made him wet his pants in excitement.

Machines had, in fact, been communicating quite well over long and short distances for quite some time. "Go down to Wall Street and check it out for yourself" was the comms guy's reply. No need; our hero knows that if he goes to executive accounts and finance, there are machines there that are always in touch with the money men downtown. So it's off to finance that our hero goes.

Once there, merely observing the ticker tape machines was inspiration enough: use the telephone lines. So off to the lab, and some time later our hero has rigged up a contraption using readily available parts (cheap was a prime motive here). And so the first basic modem came into being. At least we no longer needed to take disks out of one machine and carry them to the machine next to it by hand.

With the passage of time this novelty became a storm, and the military decided to get in on the act, establishing a special research agency known as ARPA. The network it created was called ARPANET (not much in the way of imagination here). The important thing, however, was that they created the protocols that permitted machines to communicate over very long distances with reasonable reliability.

The core of this set of protocols was the Transmission Control Protocol (TCP), which was responsible for getting the message through. With more devices joining the conversation, it became obvious that some way of identifying each machine was needed. Identification of each machine on a machine-to-machine level had, as with all electronic communications devices, been achieved using a hardware address, which we know as the Media Access Control address or MAC address, and this worked well for a while.
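For those who like something concrete, here is a minimal sketch (the address itself is invented) of what a MAC address actually is: a flat 48-bit hardware identifier, conventionally written as six colon-separated hexadecimal octets.

    # A MAC address is just six octets (48 bits) tied to the network
    # interface hardware; this value is made up for illustration.
    mac = bytes([0x00, 0x1A, 0x2B, 0x3C, 0x4D, 0x5E])

    # Conventional colon-separated hexadecimal notation:
    print(":".join(f"{octet:02x}" for octet in mac))   # 00:1a:2b:3c:4d:5e

    # The first three octets identify the manufacturer (the OUI);
    # the remaining three are assigned by that manufacturer.
    oui, serial = mac[:3], mac[3:]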

As networks scaled up, it was soon realised that using MAC addresses alone for machines permanently connected in a smallish local network was not a problem. The troubles began when trying to connect each of these small networks to one another on a sporadic, as-needed basis.

So the idea of letting the machines on the smaller local network sort out their MAC addresses among themselves, and using a different addressing structure to identify remote networks, gained impetus. This was the beginning of a logical addressing structure. Well, as the years went by and more and more machines were being networked, and more and more networks were being interconnected on a global scale, this logical addressing structure evolved considerably.
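Here is a minimal sketch of that logical addressing idea using Python's standard ipaddress module (the addresses are reserved documentation examples): unlike a flat MAC address, a logical address is hierarchical, so the network part alone tells you which network a machine lives on.

    import ipaddress

    # A host's logical address splits into a network part and a host part.
    host = ipaddress.ip_interface("192.0.2.42/24")

    print(host.network)   # 192.0.2.0/24 -- identifies the network
    print(host.ip)        # 192.0.2.42   -- identifies the host on that network

    # Machines on the same logical network can sort out each other's MAC
    # addresses locally; traffic for any other network goes to a router.
    other = ipaddress.ip_address("198.51.100.7")
    print(other in host.network)   # False -> hand it to the router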

The best part of it all was that the smart guys who created TCP did such a fine job that it has been used ever since. When the time came that the number of interconnected networks was becoming too large to manage manually, some way to overcome this had to be found. We were also seeing a number of different architectures being developed by different groups, more or less independently and in competition with one another. The result was that not all networks could connect with all of the other networks.

At around this point in time the number of home computers being sold was beginning to skyrocket, and many who used networked computers at work wanted the same for their home computers. Having already developed the Hypertext Transfer Protocol (HTTP) and the Hypertext Markup Language (HTML), academics and researchers could now confer with each other, and a greater degree of collaboration between these groups of individuals developed. Admittedly this may at times have been a little constrained and secretive, but nonetheless it happened.
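Just to show how simple HTTP kept things, here is a minimal sketch of a raw request sent over a TCP socket (example.com is a reserved demonstration domain): the whole exchange is plain, readable text.

    import socket

    # Open a TCP connection and send a bare-bones HTTP/1.1 request.
    with socket.create_connection(("example.com", 80)) as sock:
        sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        response = b""
        while chunk := sock.recv(4096):   # read until the server closes
            response += chunk

    print(response.split(b"\r\n", 1)[0].decode())   # e.g. "HTTP/1.1 200 OK"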

The military was also in the process of extending and evolving its networks. To cut the story short, it became obvious that something had to be done to make everything capable of talking to everything else. It was at around this point that the Internet Architecture Board (IAB) and the Institute of Electrical and Electronics Engineers (IEEE) really came to the fore.

The IAB took responsibility for overseeing and formulating standards relating to the development of the Internet. The IEEE met in February of 1980 to discuss what needed to be done to produce uniformity if the Internet was going to truly develop.

They came to the realization that there were a number of competing and incompatible network technologies and decided that, given the various different possible media available, it was all too much for just one group to deal with. So they created a number of subcommittees, each charged with responsibility for producing a set of standards covering a smaller area of technology that could be used by one and all. I guess they must have heard what the little general said: "Divide and conquer".

Thus we saw the formation of the 802 dot committees, and the standards they produced have all evolved over time. Some are still widely in use today, some aren't, some have gone to greener pastures, and others are only just beginning to become mainstream, while a few are yet to "arrive". For example, the network system originally developed by the Digital Equipment Corporation®, Intel® and the Xerox® Corporation was called Ethernet, and the committee formed to deal with the standards for this type of network architecture was the 802.3 subcommittee.

Another was formed to handle wireless networking communications: the 802.11 subcommittee. To illustrate the number of changes that have occurred as these architectures have evolved over the years, the 802.11 subcommittee appends a letter to "802.11" to distinguish one set of wireless networking technology standards from another. The first lettered standard was 802.11a, and we are currently seeing the finalisation and market implementation of the 802.11n standard today. That makes a total of 14 major standards of wireless networking technologies in less than 28 years.

Well, now we get to the point where we have all sorts of networks and a variety of transmission media. What is what, and which do I use? These are the questions that we will begin to answer now.

Transmission Media

The cables used in cabled networks are usually either copper-based or fiber-optic transmission media; the architecture, topology, protocol(s), and size of the network will determine which is chosen and which variety of each is preferable.

I would be remiss not to mention that there can be a mix and match of cabling and other transmission media such as wireless. Different segments of a network may consist of different transmission media. Generally it is easier to separate out segments with different transmission media so that each type of medium, and each category within that type, is contiguous for that local section of the network. There are many reasons why this is a good idea, and I will be discussing them as we progress.

An obvious example would be when mixing copper-based cable network sections with wireless or even fiber-optic ones. It is not too difficult to understand how each of these different transmission media will propagate a signal in different ways and hence will have different specifications and physical and electronic requirements that need to be satisfied for an effective transmission to take place.

With respect to the various varieties (or flavours, if you will) of each type of medium, for now let us just accept that the same holds true and consider it to be fact. I will explain the reasons in greater detail later.

In Part Three of the Network Cabling Guide we will dive into copper cabling technologies, so see you soon.
