I have been asked about the relationship between "signal strength", "signal quality" and the pixelation and stuttering a reader experienced when trying to record various Freeview channels onto a personal video recorder (PVR), a computer (via a USB Freeview adaptor) and a DVD-R recorder.
In summary, the power of the transmissions has nothing to do with it at all. This simply affects the distance from the transmitter at which the services can be received.

Each multiplex provides either 18Mb/s (16QAM: multiplexes 1, B, C and D) or 24Mb/s (64QAM: multiplexes 2 and A), which is used to provide up to eight TV channels by "statistical multiplexing". This means each TV channel is provided using around 3Mb/s, which is the value you found. (Mb/s is megabits per second, so 1Mb/s is a million bits per second and so on. Computers usually measure file sizes in bytes, which are 8 bits each, so recording one second of an 8Mb/s transmission requires 1MB, or one megabyte.)
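The bits-to-bytes arithmetic above can be sketched in a few lines of Python (the function name is mine, not anything standard):

```python
# Sanity check of the bit-rate arithmetic: 8 bits = 1 byte, so an
# 8 Mb/s stream needs 1 MB of storage for every second recorded.

def megabytes_for(bitrate_mbps: float, seconds: float) -> float:
    """Storage needed, in megabytes, to record `seconds` of a stream."""
    return bitrate_mbps * seconds / 8  # divide by 8 bits per byte

# One second of an 8 Mb/s transmission:
print(megabytes_for(8, 1))           # 1.0 MB

# Five minutes of a typical 3 Mb/s Freeview channel:
print(megabytes_for(3, 5 * 60))      # 112.5 MB
```

So a five-minute recording of a 3Mb/s channel comes out at a little over 100MB, which matches the file sizes in the trial table.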
When the signals are increased in strength at switchover, they will all be changed to 64QAM, providing 24Mb/s.
Trial recordings were made 8km from the Angus transmitter.
For each channel, the table recorded the signal strength (max 6), the average bit rate (Mb/s), the file size of a five-minute recording (MB), the multiplex and mode, and the picture size. The picture sizes were:

720 x 576
720 x 576
704 x 576
704 x 576
720 x 576
720 x 576
544 x 576
720 x 576
544 x 576
Let me explain why these results happen.
There is a huge difference between the way analogue and digital transmission systems work.
On analogue, the picture is transmitted by a system that scans from left to right, row by row, down the TV screen. At the start of each scan line, a "very high" synchronisation signal is transmitted. This means that even if the signal suffers some interference, the TV can recover what it was doing at the start of the next line.
The digital TV pictures are first broken into three components (one brightness, two colour), each broken into 8x8 blocks of pixels. The blocks are then transformed using the discrete cosine transform (a close relative of the Fourier transform), scaled down by a compression value, and run-length encoded to create a stream of 1s and 0s. Then up to eight of these streams of 1s and 0s are multiplexed together to produce no more than (but as close as possible, by dynamically adjusting the compression value) 18Mb/s or 24Mb/s.
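Here is a toy sketch of that transform-scale-encode pipeline in pure Python. It is far from a real MPEG-2 encoder (no zig-zag scan, no quantisation matrix, no entropy coding), but it shows why a flat block of picture compresses to almost nothing:

```python
import math

def dct2(block):
    """2-D discrete cosine transform (DCT-II) of an 8x8 block."""
    N = 8
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = sum(block[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * N))
                    for x in range(N) for y in range(N))
            cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            out[u][v] = cu * cv * s
    return out

def quantise(coeffs, q):
    """Scale by the 'compression value' q; bigger q -> more zeros."""
    return [[round(c / q) for c in row] for row in coeffs]

def run_length(values):
    """Encode a flat list as [value, repeat-count] pairs."""
    pairs = []
    for v in values:
        if pairs and pairs[-1][0] == v:
            pairs[-1][1] += 1
        else:
            pairs.append([v, 1])
    return pairs

# A flat grey block: 64 pixels collapse to one value and a run of zeros.
flat = [[128] * 8 for _ in range(8)]
coeffs = quantise(dct2(flat), 16)
print(run_length([c for row in coeffs for c in row]))
# [[64, 1], [0, 63]] -- one DC value, then 63 zeros
```

A busy, detailed block would produce many non-zero coefficients instead, which is why unpredictable material needs more bits.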
These are then transmitted using a system called COFDM, which uses "forward error correction" to deliver, it is hoped, the bitstream to the receiver without a single error. However, if a SINGLE error does get through, it can be quite some time before the decoder recovers. This is because a single bit can represent many different things: the TV channel, a value of 1, 2, 4, 8, 16, 32, 64 or 128 within a byte, the number of repeats of another value and so on.
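A small made-up example shows why one wrong bit does so much damage. In this toy format (alternating count and value bytes, nothing like real MPEG-2 syntax) a flipped bit in a repeat count makes the decoder lose its place entirely:

```python
# Toy run-length format: bytes alternate (count, value).
def rle_decode(data: bytes) -> list:
    out = []
    for i in range(0, len(data), 2):
        count, value = data[i], data[i + 1]
        out.extend([value] * count)
    return out

stream = bytes([5, 10, 3, 200])        # 5 tens, then 3 two-hundreds
print(rle_decode(stream))              # [10, 10, 10, 10, 10, 200, 200, 200]

# Flip one bit in the first count byte (5 becomes 69):
corrupt = bytes([5 ^ 0b1000000, 10, 3, 200])
print(len(rle_decode(corrupt)))        # 72 pixels instead of 8
```

One bit wrong, and 72 pixels come out where 8 were meant: the decoder cannot tell anything is wrong until some later synchronisation point.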
This means that if you CAN get error-free reception of the MPEG-2 bitstream (as it is known), recording and playing it back is a simple matter of copying it to storage and then reading it back again.
However, if you do not get PERFECT reception, then recording this bitstream onto a DVD or the hard disk drive in a PVR preserves the damage: the picture will pixelate and stutter at the points where the bitstream was corrupted.
The "standard" picture size for MPEG-2 (called Main Profile at Main Level, MP@ML) is 720x576. This can represent a picture in the old-fashioned 4:3 TV screen, or the modern 16:9 widescreen layout. On a 625-line analogue TV system, only 576 lines are used for picture, the rest is for Ceefax/Teletext and other signalling.
Of the 720x576 pixels, only 702 horizontal pixels are designated for viewing, the other 18 being a "bleed" so the recording can be scaled for broadcast purposes. ITV (and sometimes Channel 4) use an "off-standard" system of only 544 horizontal pixels (ie, three-quarters width), which provides additional data-compression benefits, but at a cost in picture quality.
The bitrate is determined by several factors.
Firstly, some channels cannot be statistically multiplexed: BBC ONE, BBC TWO, ITV1 and Channel 4. This is because they are different on each transmitter, so these channels have a "reserved" (ie, maximum) bandwidth.
Secondly, there is the nature of the material. Static pictures and cartoons compress much better than unpredictable material. Strobe effects are the worst, followed by panning shots of crowds watching football.
Thirdly, material that is live and has to be compressed in real time produces more data than archive material that can be compressed with more computing time. This is why films on DVD look better, for the same bandwidth, than something you might record.
Fourth, there is the amount of bandwidth the broadcaster wishes to use: higher compression results in a greater choice of channels, but with reduced picture quality.
Fifth, there is the ratio between the three types of frame: the "full picture" frames that are broadcast periodically (from every second frame to several seconds apart) and the "forward prediction" and "backward prediction" frames that consist only of the differences from the "full picture" frames.
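The prediction-frame idea can be sketched very simply (a toy model, not real MPEG-2 motion prediction): send one "full picture", then only the per-pixel differences, and let the decoder add them back on. Frames where little changes cost almost nothing:

```python
# Toy prediction frames: one full picture, then difference frames.

def encode(frames):
    full = frames[0]
    diffs = [[b - a for a, b in zip(prev, cur)]
             for prev, cur in zip(frames, frames[1:])]
    return full, diffs

def decode(full, diffs):
    frames = [full]
    for d in diffs:
        frames.append([p + dv for p, dv in zip(frames[-1], d)])
    return frames

frames = [[10, 10, 10, 10],   # the "full picture" frame
          [10, 10, 12, 10],   # one pixel brightens slightly
          [10, 10, 12, 10]]   # nothing changes at all
full, diffs = encode(frames)
print(diffs)                          # [[0, 0, 2, 0], [0, 0, 0, 0]]
print(decode(full, diffs) == frames)  # True
```

It also shows why an error in a "full picture" frame is so damaging: every following frame is rebuilt on top of it, so the corruption persists until the next full picture arrives.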
So, this means:
Your conclusion that "the claim that switching to digital TV is an improvement in viewing quality is not substantiated" is quite correct. If you had perfect or near-perfect analogue reception, digital TV will decrease the picture quality, because digital TV provides more services using the same transmission frequency.
However, many people are unable to get PERFECT analogue reception, but will get PERFECT (uninterrupted bitstream) Freeview reception. For these people, the picture quality will improve.
But Freeview reception where the "bit error rate" is high will result in pixelation or picture freezes.
So, in conclusion, Freeview recording will only work well if you can receive the bitstream with no errors, and this depends more on the level of interference than on the signal strength.
I hope this explains everything.