T.R | Title | User | Personal Name | Date | Lines |
---|---|---|---|---|---|
4471.1 | Compression impressions? | STAR::ROBINSON | | Wed Jan 30 1991 10:07 | 9 |
| >you tell me!
No John. You tell us. ;-) What is, or what is the significance of,
a "compression chip"?
I think I know the significance of an A4000...
Thanks,
Dave
|
4471.2 | "Wicked quick" image processing | TLE::TLET8::ASHFORTH | The Lord is my light | Wed Jan 30 1991 10:19 | 10 |
| Re compression chips:
Woodja believe wicked-fast-coprocessor-driven-support-fer-industry-standard-
image-compression-formats?
I think John is referring to the approach of "CCITT-on-a-chip," which would
make a lot of new telecomm (and other) applications possible and/or easier to
develop.
Bob
|
4471.3 | | WJG::GUINEAU | the number 42 comes to mind | Wed Jan 30 1991 10:47 | 5 |
| Not to mention the possibilities of on-the-fly data compression for storing
data on floppies and hard disks - potentially doubling capacities on
standard devices.
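To make that concrete, here's a toy sketch of a "squeeze each block
before it hits the disk" layer. zlib stands in for the compression
chip, and the sector size and buffers are just illustrative, not from
any real driver:

    #include <stdio.h>
    #include <string.h>
    #include <zlib.h>

    /* Toy "compressed sector write": pack a buffer before it goes to
       disk.  A real driver would do this per block, below the
       filesystem, with the chip doing the work instead of zlib. */
    int main(void)
    {
        unsigned char sector[512];
        unsigned char packed[1024];          /* worst case is > 512! */
        uLongf packed_len = sizeof packed;

        memset(sector, ' ', sizeof sector);  /* typical sparse block */
        strcpy((char *)sector, "hello, disk block");

        if (compress(packed, &packed_len, sector, sizeof sector) != Z_OK)
            return 1;
        printf("512-byte sector stored in %lu bytes\n",
               (unsigned long)packed_len);
        return 0;
    }

A mostly-blank block like that one packs down to a few dozen bytes;
whether you really get 2x on mixed data is another question.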
john
|
4471.4 | Of course! | TLE::TLET8::ASHFORTH | The Lord is my light | Wed Jan 30 1991 11:29 | 3 |
| Right- that's why I didn't mention it!
(nyuk)
|
4471.5 | | STAR::ROBINSON | | Wed Jan 30 1991 13:00 | 5 |
| >on-the-fly data compression
Oh big deal! You mean another Power Packer ;-);-);-);-)
Dave
|
4471.6 | | LEDS::ACCIARDI | | Wed Jan 30 1991 15:39 | 9 |
|
EE Times has, for several months, been reporting on emerging real-time
compression hardware that achieves 200-to-1 packing! This means that
even AT-class computers with slow bus speeds could play back real-time
video from a floppy disk.
I believe the Intel version is called DVI.
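Back-of-the-envelope check (the frame size and rate are my guesses,
not EE Times' numbers):

    #include <stdio.h>

    int main(void)
    {
        /* assume a modest 320x240 frame, 24-bit color, 30 frames/s */
        long raw = 320L * 240 * 3 * 30;   /* bytes/s, uncompressed */
        long ratio = 200;                 /* the claimed packing   */

        printf("raw:        %ld bytes/s\n", raw);         /* 6912000 */
        printf("compressed: %ld bytes/s\n", raw / ratio); /* 34560   */
        return 0;
    }

About 34K bytes/second after compression is floppy territory, so the
claim is at least arithmetically plausible.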
Ed.
|
4471.7 | | ADOV02::MCGHIE | Thank Heaven for small Murphys ! | Wed Jan 30 1991 19:33 | 13 |
| In the latest issue of Byte we got here Downunder (actually the
December issue, so an old issue for most of you), there was
an article dealing with the new compression chips.
It suggested they could be used to effectively increase the
capacity of secondary storage devices, etc. The important
point it stressed was the real-time nature of the compression and
decompression.
Sounds good, especially if standards are introduced for
the compression techniques.
Mike
|
4471.8 | doubting thomas | SAUTER::SAUTER | John Sauter | Thu Jan 31 1991 08:19 | 38 |
| re: .6, .7
This sounds too good to be true. The problem with clever compression
techniques is that they are easy to demo to upper management, but hard
to make reliable in practice.
The demo is easy because you just pick a picture or data file that
compresses well. Upper management doesn't know enough to ask the
hard questions, so they cheerfully announce that they will soon have
a product.
In practice, compression is chancy. Depending on the data pattern,
compression techniques may actually _increase_ the number of bits that
must be stored or transmitted. If you have a hard limit on your
transmission bandwidth or storage capacity you have to decide what
you are going to do when this happens. One solution is to decrease the
quality of the picture. In some applications this might be quite
acceptable, but the more you have to do it the more customers will
reject your product. I predict that having a hard limit 100 times less
than the uncompressed data rate will cause you to do it a lot.
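A toy demonstration of the expansion problem, using a naive
run-length encoder as a strawman (my own illustration, not any
product's algorithm):

    #include <stdio.h>

    /* Naive RLE: emit one (count, byte) pair per run.
       Returns the encoded length without building the output. */
    long rle_len(const unsigned char *in, long n)
    {
        long i = 0, out = 0;
        while (i < n) {
            long run = 1;
            while (i + run < n && in[i + run] == in[i] && run < 255)
                run++;
            out += 2;              /* one (count, byte) pair */
            i += run;
        }
        return out;
    }

    int main(void)
    {
        unsigned char friendly[1000], hostile[1000];
        long i;
        for (i = 0; i < 1000; i++) {
            friendly[i] = 'A';                /* one long run   */
            hostile[i]  = (unsigned char)i;   /* no runs at all */
        }
        printf("friendly: 1000 -> %ld bytes\n", rle_len(friendly, 1000));
        printf("hostile:  1000 -> %ld bytes\n", rle_len(hostile, 1000));
        return 0;
    }

The friendly buffer shrinks to 8 bytes; the hostile one _doubles_ to
2000. Real compressors are smarter, but a counting argument
guarantees that some input always expands.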
Data compression onto media is a harder problem, because you can't
afford to drop a single bit. Some clever user will try to write an
archive that has already been LZ compressed onto your fancy disk, and
you will be embarrassed to discover that nothing your compressor can
do will decrease the number of bits significantly.
The only solution I can think of involves making the number of bits per
sector dependent on the pattern of bits. This, of course, makes the
number of sectors per track data-dependent. All software work on file
systems, since the early 60s, has assumed that the capacity of a track
does not change when you modify the data on it---invalidating that
assumption means that 30 years of software has to be done over.
It would be an interesting task, but I wouldn't want to wait until it
is done for my next paycheck.
In summary: I am skeptical.
John Sauter
|
4471.9 | Oh, yeah? | TLE::TLET8::ASHFORTH | The Lord is my light | Thu Jan 31 1991 08:28 | 19 |
| I'd agree with the doubting thomas of .8 *iff* the issue were totally lossless
compression of *all* data. However, I believe the major data type in mind for
compression is specifically *image* data. This form of information is
notorious for storing relatively small amounts of information in relatively huge
amounts of media/silicon. There are currently on the market commercial products
which have excellent compression ratios, dependent on the user's stated
tolerance for data loss. Even with relatively little loss, the ratios are good-
(yes, I feel stupid, but no, to be honest, I don't recall the ratios!).
I believe that, like graphics and math coprocessors, data compression
coprocessors will end up being deliberately invoked by specific code, rather
than becoming a transparent part of any storage/retrieval device drivers.
If I missed the point of .8, feel free to clarify. Since I think we're a teensy-
weensy while away from mass market availability of these little suckers, I'd
say we have time to chat at a rather leisurely pace...
Cheers,
Bob
|
4471.10 | data | VICE::JANZEN | Tom MLO21-4/E10 223-5140 | Thu Jan 31 1991 09:03 | 3 |
| One proprietary technique uses fractal math. You could compress a
Mandelbrot image with Mandelbrot math!
Tom
|
4471.11 | JPEG! | KALI::PLOUFF | Ahhh... cider! | Fri Feb 01 1991 10:41 | 17 |
| FWIW, there's a move on Usenet to use JPEG as the standard for
exchanging images, replacing GIF. JPEG uses lossy coding of 24-bit
images, with tons of tweakable compression parameters, and heavy
interest among chip makers. There is likely to be portable JPEG
compression software on the net by spring, and Amiga/IFF is one of the
target platforms/formats under development.
The jury is still out on how much loss is noticeable in a picture that
has been through the compression-decompression cycle, except for two
points. One is that more highly compressed versions of an image have
noticeably less detail than less compressed versions. The other is
that dithered images come out looking lousy.
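For the curious, the lossy heart of JPEG is a cosine transform
followed by quantization. A one-dimensional sketch of the idea (my
illustration only; real JPEG works on 8x8 pixel blocks with full
quantization tables):

    #include <stdio.h>
    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    #define N 8

    /* Orthonormal DCT-II of an 8-sample block. */
    void dct(const double *x, double *X)
    {
        int k, n;
        for (k = 0; k < N; k++) {
            double sum = 0.0;
            for (n = 0; n < N; n++)
                sum += x[n] * cos(M_PI * (n + 0.5) * k / N);
            X[k] = sum * (k == 0 ? sqrt(1.0 / N) : sqrt(2.0 / N));
        }
    }

    int main(void)
    {
        /* a smooth ramp: energy piles up in the low coefficients */
        double x[N] = { 10, 12, 14, 16, 18, 20, 22, 24 };
        double X[N];
        double q = 8.0;    /* quantizer step - the "quality knob" */
        int k;

        dct(x, X);
        for (k = 0; k < N; k++)
            printf("coef %d: %7.2f -> quantized %3.0f\n",
                   k, X[k], floor(X[k] / q + 0.5));
        return 0;
    }

Crank q up and more coefficients round away to zero - smaller files,
less detail, which is exactly the trade-off described above.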
If I were a betting man I would put my money on Commodore adopting this
technology, not DVI or data compression such as LZW.
wes
|
4471.12 | | WJG::GUINEAU | the number 42 comes to mind | Fri Feb 01 1991 10:43 | 5 |
| What happens when the same image is compressed/decompressed multiple times?
I mean, are the losses cumulatively destructive?
john
|
4471.13 | Yes, it gets worse | FROCKY::BALZER | Christian Balzer DTN:785-1029 | Fri Feb 01 1991 12:57 | 11 |
| Re: .12
| Yes, due to the nature of the process (trying to preserve the "look" and not
the exact contents of the picture).
However, for the intended market (multimedia), the process is very well
suited, provided you start with "good" source pictures.
Cheers,
<CB>
|
4471.14 | | BAGELS::BRANNON | Dave Brannon | Mon Feb 04 1991 18:49 | 17 |
| For a look at a real world application of compression...
I stopped by the Memory Location, they have a DCTV hooked up for
demos. It uses a .DCTV format which appears to be picture info compressed
to reduce the time needed to tranfer data to/from computer and
DCTV box. Didn't see any description of the algorithm used.
Seemed to run at a reasonable speed on a 2000.
re: compression methods
The GIF format seems to be taking care of the question of how to compress
picture data by doing it at the picture file format level - that way
you don't need an unpacking utility before running the picture viewer.
When is IFF going to add LZW encoding as an option?
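Re LZW, the scheme GIF uses: the encoder builds a phrase dictionary
as it goes. A bare-bones sketch (codes printed as plain integers;
real GIF packs them into variable-width bit codes):

    #include <stdio.h>

    #define DICT_MAX 4096

    /* Each entry is a previous code plus one appended byte. */
    static int prev[DICT_MAX];
    static unsigned char last[DICT_MAX];
    static int dict_size = 256;    /* codes 0..255 = single bytes */

    /* Linear search for the code of phrase (p, c); -1 if absent. */
    static int find(int p, int c)
    {
        int i;
        for (i = 256; i < dict_size; i++)
            if (prev[i] == p && last[i] == (unsigned char)c)
                return i;
        return -1;
    }

    int main(void)
    {
        const char *s = "TOBEORNOTTOBEORTOBEORNOT";
        int p = (unsigned char)s[0];
        int i, c, code;

        for (i = 1; s[i] != '\0'; i++) {
            c = (unsigned char)s[i];
            code = find(p, c);
            if (code >= 0) {
                p = code;                 /* grow the current phrase  */
            } else {
                printf("%d ", p);         /* emit longest known phrase */
                if (dict_size < DICT_MAX) {
                    prev[dict_size] = p;  /* remember phrase + c */
                    last[dict_size] = (unsigned char)c;
                    dict_size++;
                }
                p = c;                    /* restart from c */
            }
        }
        printf("%d\n", p);                /* flush final phrase */
        return 0;
    }

Repeated phrases come out as single codes, which is why GIF does so
well on images with long runs and repeating patterns.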
Dave
|