Tuesday 5 April, 2011, 13:15 - Spectrum Management

Apparently, mobile phone operators are beginning to run out of capacity on their networks due to all the data traffic generated by smartphones and people using broadband dongles in their laptops. Whether this is true today, or is just an excuse for poor service quality, there will almost certainly come a time in the relatively near future when the squeeze on spectrum becomes reality.
Posted by Administrator
A typical mobile operator in an average European country will currently have access to something like 100 MHz of radio spectrum - 50 MHz for the uplink from phones to base stations and another 50 MHz in the opposite direction - more commonly written 2 x 50 MHz. This will usually be in two (or more) of the commonly available mobile bands, such as:
* 900 MHz (actually 880 - 915 and 925 - 960 MHz)
* 1800 MHz (1710 - 1785 and 1805 - 1880 MHz)
* 2100 MHz (1920 - 1980 and 2110 - 2170 MHz)
So when they do run out of spectrum, what can they do? Well help is at hand in the short term through two new bands which are being released. The first, known as the 'digital dividend', has become available due to the more efficient planning of television broadcast networks that has arisen as a result of the switch-over from analogue to digital broadcasting. The second, at 2.6 GHz, was (and in some countries still is) used for a multitude of purposes including wireless video cameras, wireless cable networks and fixed wireless broadband. Together, these two bands make just over another 200 MHz of spectrum available:
* 800 MHz (791 - 821 and 832 - 862 MHz) - the 'digital dividend'
* 2600 MHz (2500 - 2570 and 2620 - 2690 MHz) - note that the gap between 2570 and 2620 MHz is also available
If you assume that an average country has 3 or 4 mobile operators, this equates to something like another 60 MHz (2 x 30 MHz) each, resulting in a 60% increase in their capacity.
So what's the problem? Some observers (eg Cisco) claim that mobile data traffic is doubling roughly every year, so this 60% increase in capacity will amount to about 8 months of traffic growth, after which the problem starts all over again. New technology will deal with some growth: newer mobile technologies from HSPA+ to LTE and LTE-Advanced may offer a doubling in capacity over current 3G (UMTS) networks for each unit of spectrum. Another year dealt with, but only at the cost of changing over all of the network equipment and handsets!
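The arithmetic behind that '8 months' figure is easy to check. A short sketch (the 60% gain and yearly doubling are the figures quoted above; the function name is ours):

```python
import math

def months_absorbed(capacity_gain, annual_growth=2.0):
    """How many months of traffic growth a one-off capacity increase
    absorbs, if traffic multiplies by annual_growth every year."""
    return 12 * math.log(1 + capacity_gain) / math.log(annual_growth)

# 2 x 30 MHz of new spectrum on top of 2 x 50 MHz is a 60% gain;
# with traffic doubling every year that buys roughly 8 months.
print(round(months_absorbed(0.60)))
```

The same function shows that even a clean doubling of capacity (a 100% gain) buys exactly one year under these assumptions.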
On this front, it is perhaps no surprise that UK mobile operator O2 recently announced plans to offer free WiFi for all. Why is this no surprise? If the traffic from smartphones and laptops can be offloaded from the mobile network to WiFi hotspots, this will ease the burden on the mobile network. But this is a relatively short-term fix too. In the long term, the only way that mobile operators will be able to deal with the growth in data traffic is to get access to more spectrum. But where will this spectrum come from?
It has long been recognised that to offer a sensible (in terms of cost, coverage and capacity) mobile network, frequencies in the range 300 to 3000 MHz are best. Go any higher and things such as Doppler shift and cell handover become real problems. Go any lower and antennas become too large and unwieldy. The problem is that the remaining frequencies in this range are already being used. In general terms:
* 300 to 430 MHz is military territory
* 430 to 440 MHz is radio amateur land
* 440 to 470 MHz is full with PMR systems
* 470 to 790 MHz has UHF television broadcasters in it
* 790 to 862 MHz is already mobile
* 862 to 880 MHz houses all manner of low power devices
* 880 to 960 MHz is already mobile
* 960 to 1350 MHz is where aircraft radars and some radio amateurs live
* 1350 to 1710 MHz is for satellites (including GPS), broadcasters and more tanks, planes and guns
* 1710 to 1980 MHz is already mobile
* 1980 to 2110 MHz is partly mobile and partly full of military folk
* 2110 to 2170 MHz is already mobile
* 2170 to 2400 MHz is mostly military
* 2400 to 2500 MHz is WiFi and bluetooth land
* 2500 to 2690 MHz is already mobile
* 2690 to 2700 MHz is where radio astronomers hang out (mostly in cardigans)
* 2700 to 3100 MHz is aircraft and maritime radars
If any more space is going to be made for mobile services, someone else is therefore going to have to give up their claim to their territory. In some countries (eg Sweden) 2300 to 2400 MHz is being made available for mobile services but in the majority of European countries it is used by the defence services who, having already vacated other spectrum, are beginning to fight back.
Clearly, anyone who moves out for the benefit of mobile services will either have to stop doing what they do (unlikely) or go and do it somewhere else (costly). For any international service (eg boats and planes) this cannot be done unilaterally, and getting international agreement is probably too slow. What's more, radars and similar systems need a lot of spectrum due to the way they work, and removing them might cause planes to fall out of the sky, which would seriously disrupt tourism in many parts of Europe that aren't very close to where you live (though it might have a potential commercial upside for undertakers).
Wireless Waffle is therefore going to stick its neck out and make a proposal as to who should lose the battle for this important part of the spectrum, and that is ... the broadcasters!
On a completely different, but not unrelated tack, the amount of energy consumed by a terrestrial broadcasting network is, well, large. Not 'a whole power station' large, but still pretty big. The amount of energy consumed by a satellite is tiny. In fact, once it's up in the sky, it's zero (they are solar powered). A terrestrial broadcast network also delivers much less capacity than a satellite. So broadcasting by satellite consumes much less power (and therefore has a much lower carbon footprint) and offers much greater capacity (for services such as 'The Cartoon Network' in HD). Let's therefore turn off UHF broadcasting and give the spectrum to mobile networks - the broadcasters can go to cable and satellite and can continue to use the VHF band if they really want to.
What would a world without UHF broadcasting look like? In somewhere such as the Netherlands, where 90% or more of homes are on cable, not a lot different. It would mean that people's holiday homes might need a satellite dish, but these are so cheap and plentiful it should be no big deal. In the UK, where most homes still have a terrestrial UHF receiver, you might think this would be a bigger deal, but over 50% of homes have either satellite or cable already and again, having to buy a dish is no biggie, so other than the temporary inconvenience of swapping set top boxes and putting a dish up (or getting connected to cable) nothing much would change.
If they so desired, public service broadcasters could continue terrestrial television broadcasting using the VHF band - by switching off those ancient and largely unlistened-to DAB transmitters. DAB could be replaced by DRM and Bob's your uncle - no loss of anything important, just a bit of shuffling around.
If all this sounds far fetched, watch this space. Or, perhaps more accurately, watch outer space!
Tuesday 8 March, 2011, 19:22 - Radio Randomness

The front page of London's Metro newspaper today screams about 'Terror Fears over Dangerous Sat Nav Flaw'. The story highlights the extent to which the British economy relies on satellite navigation systems and claims that over GBP94 billion of domestic output would be affected by a loss of GPS caused, for example, through deliberate jamming. Not wishing to blow the Wireless Waffle trumpet (actually we have a flumpet not a trumpet), but we raised the issue of GPS jamming in February last year!
Posted by Administrator
What the article does not explain is why the loss of GPS would cause so much damage. Clearly if aircraft or boats lose GPS, they can fall back on their other safety devices such as radars and navigation beacons.
The crux of the problem is not in cases where GPS is used for its location abilities, but for its timing. Few people realise that, as well as providing accurate information to help you figure out where you are, GPS (and the other systems which do the same, such as the European Galileo and Russian GLONASS satellites) also provides very accurate time. Each satellite has its own very accurate atomic clock on-board - in fact, because of the way in which these satellites (collectively known as the Global Navigation Satellite System or GNSS) work, accurate time is a prerequisite for determining accurate location. To work out where you are, you determine the apparent time for each of the satellites you can see. This is the 'real' time plus a delay due to the radio waves travelling from the satellite to you at the speed of light. For a GPS satellite, this delay is around 66 milliseconds, but will vary depending on the position of the satellite in the sky relative to you. Your GPS receiver knows where each satellite is supposed to be (this information is stored in the receiver) and thus can 'triangulate' your position using the time differences to work out how far you are from each one. Complicated eh?
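That delay figure is easy to sanity-check from first principles. A sketch using round numbers (the 20,200 km altitude is the commonly quoted figure for the GPS constellation, not taken from the article):

```python
C = 299_792_458.0           # speed of light, metres per second
GPS_ALTITUDE_M = 20_200e3   # approximate GPS orbital altitude (round figure)

# One-way signal flight time for a satellite directly overhead.
# The slant range - and hence the delay - is greater for satellites
# lower in the sky, which is why the delay varies with position.
delay_ms = GPS_ALTITUDE_M / C * 1000
print(f"{delay_ms:.1f} ms")
```

This comes out at a little over 67 ms for the overhead case, in line with the 'around 66 milliseconds' quoted in the post.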
This timing information is accurate to a few nanoseconds (it has to be for you to be able to calculate your position to within a few metres) and thus can be used for other purposes, in particular for synchronising networks. No longer is the time in London different from that in Manchester, as it was in the days of the railway; the time in the two cities can be synchronised to within nanoseconds of each other. For telecommunications and broadcast networks, this synchronisation is essential when transmitting data at speeds of Gigabits per second, where each bit is only a few nanoseconds in duration itself. Similarly, the London Stock Exchange uses GNSS timing information to record the times of trades, and many other big businesses rely on GNSS timing of this kind. And this is the BIG problem: if the GPS signal is lost, the various network clocks will slowly drift apart and all of these networks will begin to fail.
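How slowly is 'slowly drift apart'? A back-of-the-envelope sketch (the oscillator figures are illustrative assumptions, not from the article): a free-running clock with a constant fractional frequency error accumulates time error at that rate, so the holdover time before exceeding a given offset is simply offset divided by the fractional error.

```python
def holdover_seconds(max_offset_s, frac_freq_error):
    """Seconds a free-running clock can run before its accumulated
    time error exceeds max_offset_s, given a constant fractional
    frequency error (first-order model; real oscillators also
    drift with age and temperature)."""
    return max_offset_s / frac_freq_error

# Illustrative: an oscillator good to 1 part in 1e9 drifts by a
# microsecond in 1000 seconds - under 17 minutes of GPS outage.
print(holdover_seconds(1e-6, 1e-9))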
This is, of course, a dream for terrorists. A GPS jammer can be made for a few pounds (the Metro article gets that right) and if correctly situated would cause untold havoc, though most new receivers are capable of dealing with basic jammers. In response, there are now designs for increasingly sophisticated jammers kicking around the internet and you can bet that as receivers get better at detecting jammers, the jammers will get increasingly complex. This is why the authorities are spending time and money putting in place solutions to identify jamming and locate the jammers as quickly as possible, as well as mechanisms to get around the problems of a local jamming source.
To date there have been few real examples of malicious GPS jamming. The military did manage to 'accidentally' jam all GPS receivers in San Diego, and James Bond did have to deal with a British boat being thrown off course by a GPS encoder (in 'Tomorrow Never Dies'), but these are isolated incidents. It is certain that GPS jammers are being used today, but so far, no serious damage has been done. So if you're driving along a country lane and your Sat Nav suddenly starts indicating that you are in a car park in Coventry, you know why!
Saturday 26 February, 2011, 02:18 - Amateur Radio

On a number of previous occasions, Wireless Waffle has commented on actions being taken by various regulatory authorities which seem to be attacking the use of the 70 centimetre band by radio amateurs. But in the USA, things have just gotten a whole lot worse, with the tabling of a bill which suggests that two thirds of the (admittedly large) US 70cm allocation be given over to broadband services for first responders (or the emergency or blue light services, as we call them over here).
Posted by Administrator
The bill, snappily entitled 'A bill to enhance public safety by making more spectrum available to public safety agencies, to facilitate the development of a wireless public safety broadband network, to provide standards for the spectrum needs of public safety agencies, and for other purposes.' or 'Broadband for First Responders Act of 2011' for short (hereinafter referred to as 'the six million dollar bill') states, in section 207, subsection (d):
Not later than 10 years after the date of enactment of this Act, the paired electromagnetic spectrum bands of 420–440 megahertz and 450–470 megahertz recovered as a result of the report and order required under subsection (c) shall be auctioned off by the Federal Communications Commission through a system of competitive bidding meeting the requirements of section 309 of the Communications Act of 1934.
The spectrum referred to in the aforementioned subsection (c) and in section 207, subsection (a) appears to refer to that which is freed by the cessation of use by public sector users who should migrate to the 700 and 800 MHz bands in the interim. However, there is little, if any, use of the frequency range 420 to 440 MHz by these users in the first place, as it is part of the amateur band. Indeed, according to the United States Frequency Allocations the band is only to be used by either radio amateurs (on a secondary basis), or by Government radiolocation services (eg the PAVE PAWS radar installations at the Clear, Beale and Cape Cod air force bases). So, if taken literally, the frequencies in the range 420 to 440 MHz which would become available are - none, because none of them are used and thus none have been cleared! However, things are never that straightforward and it could be argued conversely that the fact that the first responders are not using that range of frequencies is a clear indication that they have cleared out of the band!
Either way, this assault on the 70cm band suggests it might be time for US amateurs to think carefully and propose something in their mutual interest. North of the border in Canada, the 70cm band stretches only from 430 to 450 MHz, and in most of the rest of the world it is only 430 to 440 MHz (and as indicated before, there were moves afoot in Europe to further constrict this to 432 to 438 MHz). It may be, therefore, that agreeing a reduction in the US 70cm band to be in line with Canada at 430 to 450 MHz, whilst in return guaranteeing some certainty of tenure, would be a good way ahead. In Europe, for example, a reduction to 432 to 438 MHz, in return for clearing out all those annoying low power devices around 433 and 434 MHz and guaranteeing primary status for amateurs in the band, would be a fair compromise. Or what about making the 70cm band 430 - 433 and 435 - 440 MHz, thereby leaving the low power devices to wallow in their own crapulence? Sure, this would require the re-tuning of many repeaters in the UK and elsewhere, but this is already having to be done in many cases due to the incoming and outgoing interference problems caused by the self-same low power devices.
In many countries around the world, the frequency range 410 to 430 MHz is used for digital mobile radio systems which support blue-light activities. In the UK, some spectrum in this range is available for use by the ambulance service, for example. This means there is equipment available. Equally, the band 450 to 470 MHz is similarly used. Therefore, giving US first responders the ranges 410 to 430 and 450 to 470 MHz makes a lot of sense and, for once, would mean that US frequency usage was in harmony with that of most of the rest of the world - and harmonisation leads to economies of scale and cost reductions and so forth, as its proponents are always keen to argue.
Of course, any change in usage or allocation in a band leads to the need to re-plan, re-tune and re-think, but if the rest of the world's radio amateurs can cope with just 10 MHz of 70cm spectrum, and Canada can cope with 20 MHz, perhaps it is time for American amateurs to relinquish part of the band in the common good and figure out what they would like in return?
Thursday 13 January, 2011, 04:22 - Spectrum Management

Previously on Wireless Waffle we have discussed ways of checking and even gaining some knowledge of the state of propagation of the short-wave bands. But for truly advanced users, there is a way to find out the actual state of propagation for a particular location in real time. Scattered around the world are a series of ionosondes. These ionosondes are rather like radars, in that they transmit a signal to the ionosphere and measure the time taken to get a response. They do this across a range of short-wave frequencies.
Posted by Administrator
The result is a chart called an ionogram. An ionogram is effectively a radar picture of the height of the ionosphere at the location immediately above the ionosonde as well as providing an indication of its refractivity, over a range of frequencies. An example ionogram taken from the ionosonde in Dourbes, Belgium, is shown below.
The ionogram is the ultimate way of assessing short-wave propagation: it tells us exactly what is going on. To help interpret it, a set of figures giving some very useful information is also provided in the diagram. So... how do we interpret the ionogram to help understand HF propagation?
In the ionogram above, the strong red/pink line extending from just below 3 MHz to just above 6 MHz shows that the ionosphere above Belgium was refracting radio signals in that frequency range straight back down again (ie at an angle of 180 degrees) - it was acting like a mirror for radio frequencies in this range. As the frequency goes above 6 MHz, the line bends upwards until eventually it goes off the top of the chart. This is the point at which the ionosphere stops refracting signals back down (at 180 degrees), however it will continue to refract signals at higher frequencies which hit it at lower angles (less than 180 degrees).
From this simple data, together with the height of the ionosphere (the scale up the left hand side of the chart) it is possible to calculate a number of very useful figures, and this is done for us.
Firstly, we have the maximum usable frequency (MUF). This is shown amongst the figures to the top right of the chart (in this case 27.62 MHz) and is also repeated at the bottom of the chart (under the label 3000 km). The MUF is the highest frequency at which the ionosphere will reliably reflect radio signals; it is also the one with the lowest refraction angle. What this means is that signals at this frequency will be refracted by the ionosphere (above Belgium in this case), but only where the path between the ends of the link hits it at a low angle, which equates to a path length of around 3000 km. Two stations, each 1500 km away from Belgium, the centre of whose path is above Belgium, will therefore be able to communicate at a frequency of 27.6 MHz. So a station in western Ireland and one in Romania are likely to be able to communicate on this frequency. Equally, one in Spain and one in Sweden might too.
The second useful frequency is the one shown as 'foF2' in the diagram (top right). In this example foF2 is 7.15 MHz. foF2 is the highest frequency at which the ionosphere above Belgium will refract signals at an angle of 180 degrees, ie straight back down. If you therefore want to communicate from somewhere in Belgium to the same place in Belgium using the ionosphere, this is the highest frequency you can use. How useful! But the best bit is the interpolations between foF2 and the MUF. These are the figures shown at the bottom of the chart under the various distances (from 100 km to 3000 km), and are the maximum frequencies that can be used to communicate over the distance shown.
In this example, if my path length is 100 km, the highest frequency I can use is 7.9 MHz. If my path length is 1000 km, the highest frequency I can use is 11.7 MHz. Now this is really useful. If I want to communicate from London to Stuttgart, a distance of approximately 800 km, of which Belgium is roughly half way (in the centre of the path) the highest frequency I could use, in this instance, is 10.2 MHz.
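The way these maximum frequencies grow with distance follows, approximately, a secant law: the longer the hop, the shallower the angle at which signals strike the layer, and the higher the frequency it will still refract. A crude flat-earth sketch (the 400 km reflection height is our assumption; real prediction tools also account for earth curvature) reproduces the chart's figures reasonably well:

```python
import math

def muf_estimate(fof2_mhz, hop_km, hmf2_km=400.0):
    """Flat-earth secant-law estimate of the maximum usable frequency
    for a single-hop path of length hop_km, given the vertical-incidence
    critical frequency foF2 and an assumed reflection height hmF2."""
    return fof2_mhz * math.sqrt(1.0 + (hop_km / (2.0 * hmf2_km)) ** 2)

fof2 = 7.15  # MHz, from the Dourbes ionogram discussed above
for d_km in (100, 800, 1000, 3000):
    print(d_km, "km ->", round(muf_estimate(fof2, d_km), 1), "MHz")
```

For the 3000 km hop this gives about 27.7 MHz against the ionogram's 27.62 MHz MUF, and the shorter distances come out within roughly 10% of the figures printed along the bottom of the chart.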
What is the lowest frequency I could use? That is more difficult. What the diagram does tell us, however, is that for short paths (ie from Belgium to Belgium) the ionosphere was successfully refracting signals at frequencies as low as 3 MHz. How do we know this? There is a nice red/pink reflection on the chart at this frequency. Below it, the picture becomes rather scattered, indicating that the refracted signal was not reliable.
So, what can we ascertain:
- The highest frequency being refracted by the ionosphere above Belgium is around 27.6 MHz. This is the highest frequency at which two stations separated by 3000 km for whom Belgium is in the centre of their path, will be able to communicate - the MUF.
- The highest frequency which can be used to communicate from one location to the same location (in Belgium) using the ionosphere is 7.15 MHz - foF2.
- For a range of distances, we can work out the maximum frequency which can be used.
- For short paths, we can find out the lowest possible frequency being refracted by the ionosphere (around 3 MHz in this case) and thus the lowest frequency which can be used.
We can also take a stab at assessing how strongly the ionosphere is refracting. The phantom reflections shown at around 450 km height are signals which were refracted from the ionosphere, then reflected by the earth and then refracted again by the ionosphere. These phantom reflections would tend to suggest that the strength of refracted signals is particularly good, as it has been strong enough to rebound from the earth and refract again! Sometimes, three or even four phantoms can be seen, indicating very strong refractions which would suggest that short wave signals would be very strong.
The ionosondes in Europe include:
- Dourbes, Belgium
- Juliusruh, Germany
- Chilton, United Kingdom (registration is required but is free)
- Warsaw, Poland
- Rome, Italy