
Conference hydra::amiga_v1

Title:AMIGA NOTES
Notice:Join us in the *NEW* conference - HYDRA::AMIGA_V2
Moderator:HYDRA::MOORE
Created:Sat Apr 26 1986
Last Modified:Wed Feb 05 1992
Last Successful Update:Fri Jun 06 1997
Number of topics:5378
Total number of notes:38326

4471.0. "Hmm - is Dave Haynie hinting, teasing or both?" by WJG::GUINEAU () Wed Jan 30 1991 08:15


Compression chip on an 040 card? A4000?  You tell me!

john




Newsgroups: comp.sys.amiga.hardware
Subject: Re: 040 boards
Reply-To: [email protected] (Dave Haynie)
In article <[email protected]> [email protected] (Richard Blewitt) writes:

>  Since it is possible to have both the 040 and the 030 running at
>the same time, would it be possible to make an 040 board that has
>an expansion slot for the addition of more 040 boards all running
>together?  

Sounds like a good idea, but you would be extremely hard pressed to fit more
than a 68040, interface logic, and your choice of RAM, Cache, or possibly 
a Yet Undefined Goody (like the PP&S compression chip) in the relatively tight
space allocated for a coprocessor board.

If you want to think in more general multiprocessing terms, you could 
theoretically build Zorro III boards with multiple processors on them.  The
Zorro III bus doesn't currently have any hardware cache consistency protocol,
but it does provide for bus locking and other things you would need in a
multiprocessor environment.  The real trick is software.

The main reasons for the Coprocessor Slot:

	[a] A Coprocessor Slot device can completely take over the Amiga 
	    system itself, without any special software.  A Zorro III device
	    would at the very least need the main CPU to handle interrupts.
	[b] A Coprocessor Slot device can be very tightly coupled to the A3000
	    motherboard.  So it's generally more efficient than a Zorro III
	    device.  But don't expect it to plug into your A4000 or whatever 
	    new thing comes along someday, whereas the Zorro III board would
	    travel just fine to a more advanced 32 bit Amiga system.

>Rick


-- 
Dave Haynie Commodore-Amiga (Amiga 3000) "The Crew That Never Rests"
   {uunet|pyramid|rutgers}!cbmvax!daveh      PLINK: hazy     BIX: hazy
	"What works for me might work for you"	-Jimmy Buffett
4471.1. "Compression impressions?" by STAR::ROBINSON () Wed Jan 30 1991 10:07
     >you tell me!
     
     No John. You tell us. ;-) What is, or what is the significance
     of, a "compression chip"?
     
     I think I know the significance of an A4000...
     
     Thanks,
     Dave
4471.2"Wicked quick" image processingTLE::TLET8::ASHFORTHThe Lord is my lightWed Jan 30 1991 10:1910
Re compression chips:

Woodja believe wicked-fast-coprocessor-driven-support-fer-industry-standard-
image-compression-formats?

I think John is referring to the approach of "CCITT-on-a-chip," which would
make a lot of new telecomm (and other) applications possible and/or easier to
develop.

Bob
4471.3. by WJG::GUINEAU (the number 42 comes to mind) Wed Jan 30 1991 10:47
Not to mention the possibilities of on-the-fly data compression for storing
data on floppies and hard disks - potentially doubling capacities on 
standard devices.

john
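
For a rough feel of what that buys you, here is a minimal sketch (present-day
Python and gzip standing in for a hardware compressor; the filenames are made
up for the example):

# Transparent compress-on-write, to see how much "extra" capacity you get.
import gzip
import os

text = b"The quick brown fox jumps over the lazy dog.\n" * 2000

with open("plain.txt", "wb") as f:           # uncompressed copy
    f.write(text)

with gzip.open("packed.txt.gz", "wb") as f:  # compressed on the fly
    f.write(text)

plain = os.path.getsize("plain.txt")
packed = os.path.getsize("packed.txt.gz")
print(f"plain: {plain} bytes, packed: {packed} bytes, "
      f"ratio: {plain / packed:.1f}:1")
# Repetitive text like this shrinks dramatically; typical mixed data is
# closer to 2:1, and already-compressed data hardly shrinks at all.
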
4471.4. "Of course!" by TLE::TLET8::ASHFORTH (The Lord is my light) Wed Jan 30 1991 11:29
Right- that's why I didn't mention it!

(nyuk)
4471.5. by STAR::ROBINSON () Wed Jan 30 1991 13:00
>on-the-fly data compression 

Oh big deal! You mean another Power Packer  ;-);-);-);-)

Dave
4471.6. by LEDS::ACCIARDI () Wed Jan 30 1991 15:39
    
    EE Times has, for several months, been reporting on emerging real-time
    compression hardware that achieves 200 to 1 packing!  This means that
    even AT-class computers with slow bus speeds could play back real-time
    video even from a floppy disk.
    
    I believe the Intel version is called DVI.
    
    	Ed.
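
Back-of-the-envelope arithmetic for that claim, with purely illustrative
numbers (not DVI's actual frame size, depth, or transfer rates):

# Assumed figures: 320x240 frames, 16 bits/pixel, 30 frames/sec, and a
# floppy that can sustain roughly 60 KB/s.
width, height, bytes_per_pixel, fps = 320, 240, 2, 30

raw_rate = width * height * bytes_per_pixel * fps   # bytes per second
compressed_rate = raw_rate / 200                     # with 200:1 packing
floppy_rate = 60 * 1024                              # ~60 KB/s sustained

print(f"raw video:     {raw_rate / 1024:.0f} KB/s")        # ~4500 KB/s
print(f"after 200:1:   {compressed_rate / 1024:.0f} KB/s") # ~23 KB/s
print(f"floppy can do: {floppy_rate / 1024:.0f} KB/s")
# ~23 KB/s is comfortably under what a floppy can deliver, which is why a
# 200:1 ratio makes "video from a floppy" at least arithmetically plausible.
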
4471.7. by ADOV02::MCGHIE (Thank Heaven for small Murphys !) Wed Jan 30 1991 19:33
In the latest issue of Byte we got here Downunder (actually the
December issue, so an old issue for most of you), there was
an article dealing with the new compression chips.

It suggested they could be used to effectively increase the
capacity of secondary storage devices, etc. The important
part they stressed was the real-time nature of the compression and
decompression.

Sounds good, especially if standards are introduced for
the compression techniques.

Mike
4471.8. "doubting thomas" by SAUTER::SAUTER (John Sauter) Thu Jan 31 1991 08:19
    re: .6, .7
    
    This sounds too good to be true.  The problem with clever compression
    techniques is that they are easy to demo to upper management, but hard
    to make reliable in practice.
    
    The demo is easy because you just pick a picture or data file that
    compresses well.  Upper management doesn't know enough to ask the
    hard questions, so they cheerfully announce that they will soon have
    a product.
    
    In practice, compression is chancy.  Depending on the data pattern, 
    compression techniques may actually _increase_ the number of bits that
    must be stored or transmitted.  If you have a hard limit on your
    transmission bandwidth or storage capacity you have to decide what
    you are going to do when this happens.  One solution is to decrease the
    quality of the picture.  In some applications this might be quite
    acceptable, but the more you have to do it the more customers will
    reject your product.  I predict that having a hard limit 100 times less
    than the uncompressed data rate will cause you to do it a lot.
    
    Data compression onto media is a harder problem, because you can't
    afford to drop a single bit.  Some clever user will try to write an
    archive that has already been LZ compressed onto your fancy disk, and
    you will be embarrassed to discover that nothing your compressor can
    do will decrease the number of bits significantly.
    
    The only solution I can think of involves making the number of bits per
    sector dependent on the pattern of bits.  This, of course, makes the
    number of sectors per track data-dependent.  All software work on file
    systems, since the early 60s, has assumed that the capacity of a track
    does not change when you modify the data on it---invalidating that
    assumption means that 30 years of software has to be done over.
    It would be an interesting task, but I wouldn't want to wait until it
    is done for my next paycheck.
    
    In summary: I am skeptical.
        John Sauter
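
That point about already-compressed data is easy to demonstrate with any
general-purpose compressor; a quick sketch, with present-day Python's zlib
standing in for a compression chip:

import os
import zlib

text = b"AMIGA " * 10_000             # highly repetitive: compresses very well
packed_once = zlib.compress(text)
packed_twice = zlib.compress(packed_once)
random_data = os.urandom(len(text))   # stands in for an LZ-compressed archive

print("text:        ", len(text), "->", len(packed_once))
print("packed again:", len(packed_once), "->", len(packed_twice))
print("random bytes:", len(random_data), "->", len(zlib.compress(random_data)))
# Typical result: the text shrinks enormously, but compressing the already
# compressed output again makes it slightly *larger*, and the random-looking
# data comes out essentially unchanged plus a few bytes of overhead.
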
4471.9. "Oh, yeah?" by TLE::TLET8::ASHFORTH (The Lord is my light) Thu Jan 31 1991 08:28
I'd agree with the doubting thomas of .8 *iff* the issue were totally lossless
compression of *all* data. However, I believe the major data type in mind for
compression is specifically *image* data. This form of information is
notorious for storing relatively small amounts of information in relatively huge
amounts of media/silicon. There are currently on the market commercial products
which have excellent compression ratios, dependent on the user's stated
tolerance for data loss. Even with relatively little loss, the ratios are good-
(yes, I feel stupid, but no, to be honest, I don't recall the ratios!).

I believe that, like graphics and math coprocessors, data compression
coprocessors will end up being deliberately invoked by specific code, rather
than becoming a transparent part of any storage/retrieval device drivers.

If I missed the point of .8, feel free to clarify. Since I think we're a teensy-
weensy while away from mass market availability of these little suckers, I'd
say we have time to chat at a rather leisurely pace...

Cheers,
	Bob
4471.10. "data" by VICE::JANZEN (Tom MLO21-4/E10 223-5140) Thu Jan 31 1991 09:03
	One proprietary technique uses fractal math.  You could compress a
	Mandelbrot image with Mandelbrot math!
Tom
4471.11. "JPEG!" by KALI::PLOUFF (Ahhh... cider!) Fri Feb 01 1991 10:41
    FWIW, there's a move on Usenet to use JPEG as the standard for
    exchanging images, replacing GIF.  JPEG uses lossy coding of 24-bit
    images, with tons of tweakable compression parameters, and heavy
    interest among chip makers.  There is likely to be portable JPEG
    compression software on the net by spring, and Amiga/IFF is one of the
    target platforms/formats under development.
    
    The jury is still out on how much loss is noticeable in a picture that
    has been through the compression-decompression cycle, except for two
    points.  One is that more highly compressed versions of an image have
    noticeably less detail than less compressed versions.  The other is
    that dithered images come out looking lousy.
    
    If I were a betting man I would put my money on Commodore adopting this
    technology, not DVI or data compression such as LZW.
    
    wes
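
To see what those tweakable parameters amount to in practice, here is a
minimal quality-versus-size sketch (present-day Pillow library and a made-up
filename, not the portable JPEG code mentioned above):

import io
from PIL import Image

img = Image.open("photo.png").convert("RGB")   # 24-bit source image

for quality in (95, 75, 50, 25, 10):
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    print(f"quality={quality:3d}: {buf.tell()} bytes")
# Lower quality settings give dramatically smaller files; the visible cost is
# lost fine detail, and dithered source images fare particularly badly.
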
4471.12. by WJG::GUINEAU (the number 42 comes to mind) Fri Feb 01 1991 10:43
What happens when the same image is compressed/decompressed multiple times?

I mean, are the losses cumulatively destructive?

john
4471.13. "Yes, it gets worse" by FROCKY::BALZER (Christian Balzer DTN:785-1029) Fri Feb 01 1991 12:57
Re: .12

Yes, due to the nature of the process (trying to preserve the "look" and not
the exact contents of the picture).

However, for the intended market (multimedia), this process is very well
suited, provided it starts with "good" source pictures.

Cheers,

<CB>
4471.14. by BAGELS::BRANNON (Dave Brannon) Mon Feb 04 1991 18:49
    For a look at a real world application of compression...
    
    I stopped by the Memory Location, they have a DCTV hooked up for
    demos.  It uses a .DCTV format, which appears to be picture info compressed
    to reduce the time needed to transfer data between the computer and the
    DCTV box.  Didn't see any description of the algorithm used.
    
    Seemed to run at a reasonable speed on a 2000.
    
    re: compression methods
    GIF format seems to be taking care of the question of how to compress
    picture data by doing it at the picture file format level - that way
    you don't need an unpacking utility before running the picture viewer.
    
    When is IFF going to add lzw encoding as an option?
    
    Dave
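
For what "lzw encoding as an option" would involve, here is a textbook-level
sketch of an LZW encoder, the same family of scheme GIF uses (illustrative
only; a real GIF or IFF implementation would also need variable-width codes
and dictionary resets):

def lzw_encode(data: bytes) -> list[int]:
    table = {bytes([i]): i for i in range(256)}  # dictionary seeded with all single bytes
    next_code = 256
    w = b""
    codes = []
    for b in data:
        wb = w + bytes([b])
        if wb in table:
            w = wb                               # keep extending the current match
        else:
            codes.append(table[w])               # emit code for the longest known match
            table[wb] = next_code                # add the new phrase to the dictionary
            next_code += 1
            w = bytes([b])
    if w:
        codes.append(table[w])
    return codes

sample = b"ABABABABABABABAB"
print(len(sample), "bytes ->", len(lzw_encode(sample)), "codes")
# Repetitive input collapses into far fewer output codes than input bytes;
# the decoder rebuilds the same dictionary as it reads the code stream.
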