I can't speak for settings in Unix or in consoles. But:
> If not, can someone give a brief explanation as to why auto sensing
> does not or cannot work?
Some facts are in note 1177.4 and .5. It is quite possible that
the autosensing fails because of an actual programming error, if you can
believe that! The innards of a DE500's Ethernet chip have these
functional units:
             +-----------+
   ##########|    PCI    |##########
   #         | Interface |         #   Ethernet chip
   #         +-----------+         #
   #            ^     v            #
   #      +----------------+       #
   #      |      DMA       |       #
   #      |     Engine     |       #
   #      +----------------+       #
   #       ^             v         #
   #    +------+     +------+      #
   #    |  Rx  |     |  Tx  |      #
   #    | Fifo |     | Fifo |      #
   #    +------+     +------+      #
   #     ^    ^      |    |        #
   #     |    |     /     |        #
   #     |    |    /      |        #
   #     |    +------+    |        #
   #     |        /  |    |        #
   #     |  +----/   |    |        #
   #     |  |        |    V        #
   #     |  |      +-------------+ #
   #     |  |      | Coders and  | #
   #     |  |      | Scramblers  | #
   #     |  |      +-------------+ #
   #     |  V            ^    V    #
   #  +-----------+   +-----------+#
   ###|  Serial   |###|  MII/SYM  |#
      +-----------+   +-----------+
         10Mb/s      10Mb/s or 100Mb/s
The host "driver" must set up the adapter to use the right protocol
over the right 10 or 10/100 port. "Autosensing" involves trying,
say, NWAY over the 100Mb/s port and either working or timing out.
If it times out, it tries 10Mb/s. If that works, fine; but if it
times out, it tries the 10Mb/s serial port. That will either
work or time out. Repeat as necessary.
Autosense must work at initial power-on, but it also runs all the time
in the background to handle cable-pull events. When you pull a
cable, the adapters at *both ends* of the link start running their
autosense algorithms. If my adapter is testing the 10Mb/s serial port
and timing out while my peer is testing 100Mb/s NWAY and timing out,
it is easy to get into trouble.
You can see why forcing an adapter to a single speed and mode might
be more reliable?
Regards,
Chuck
I once did a driver for a 10/100 DE500 card. I was given the autosense
algorithm and I never had to understand it or debug it. ;-)  Hats off
to the code writers who attempt to code autosense for these adapters!