
Conference ulysse::rdb_vms_competition

Title:DEC Rdb against the World
Moderator:HERON::GODFRIND
Created:Fri Jun 12 1987
Last Modified:Thu Feb 23 1995
Last Successful Update:Fri Jun 06 1997
Number of topics:1348
Total number of notes:5438

170.0. "Oracle V6/ VMS V5 benchmark 62xx,88xx results" by DEDO03::MURRAY () Wed Aug 10 1988 16:57

                                                     
Below is an internal memo which may be of general interest....    
                                         
    
Oracle on VAX

We have spent some time speaking with senior members of Oracle's Technical 
Support and Marketing staff. The results of these conversations are detailed 
below. In summary, Oracle are happy to substantiate a statement of 10 
Oracle users/VAX mip.

SQL*Forms would be expected to support 8-15 users/VAX mip (8 users/mip would 
represent extreme conditions).

Normal commercial mix would be expected to support 10-12 users/VAX mip.

8700 Benchmark

As an example, Oracle undertook some benchmarks which returned the following 
results:

8700 (6 VAX Mips) -- 70-72 concurrent users.
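
As a sanity check, the 8700 figure is roughly the per-mip range above multiplied
out. A throwaway Python sketch of that arithmetic (the mip rating and users/mip
ranges are the ones quoted in this memo, not measured capacity):

    # Rough sizing from the users-per-mip figures quoted above; treat the
    # ranges as rules of thumb, not measurements.
    def user_range(vax_mips, users_per_mip=(10, 12)):
        low, high = users_per_mip
        return vax_mips * low, vax_mips * high

    print(user_range(6))           # (60, 72) -- brackets the 70-72 users above
    print(user_range(6, (8, 15)))  # (48, 90) -- the wider SQL*Forms range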

Database 6200 

Database tests were conducted on a 500 MB database containing 5 million rows, on 
VAX/VMS v5.0 (pre-release B5.0) and Oracle v6.0 (to be released in the UK in 
Aug/Sept 88), audited by Codd & Date.

The hardware used was the 6200 series with 128 MB RAM, and the benchmark run 
was TP1.
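
TP1 is only loosely specified, but the transaction profile quoted for the 8820
test further down (3 updates and 1 insert per transaction) is the classic
debit/credit shape. Purely for illustration, a minimal Python/SQLite sketch of
one such transaction; the schema and names here are invented and are not taken
from Oracle's benchmark kit:

    import sqlite3

    # Invented debit/credit-style schema, for illustration only.
    def setup(conn):
        conn.executescript("""
            CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER);
            CREATE TABLE tellers  (id INTEGER PRIMARY KEY, balance INTEGER);
            CREATE TABLE branches (id INTEGER PRIMARY KEY, balance INTEGER);
            CREATE TABLE history  (account_id INTEGER, teller_id INTEGER,
                                   branch_id INTEGER, amount INTEGER);
            INSERT INTO accounts VALUES (1, 0);
            INSERT INTO tellers  VALUES (1, 0);
            INSERT INTO branches VALUES (1, 0);
        """)

    # One transaction: three updates and one insert, then commit.
    def tp1_transaction(conn, account_id, teller_id, branch_id, amount):
        conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                     (amount, account_id))
        conn.execute("UPDATE tellers SET balance = balance + ? WHERE id = ?",
                     (amount, teller_id))
        conn.execute("UPDATE branches SET balance = balance + ? WHERE id = ?",
                     (amount, branch_id))
        conn.execute("INSERT INTO history VALUES (?, ?, ?, ?)",
                     (account_id, teller_id, branch_id, amount))
        conn.commit()

    conn = sqlite3.connect(":memory:")
    setup(conn)
    tp1_transaction(conn, 1, 1, 1, 100)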

The results were as follows:

Model         Throughput      Response time

6240          42.6 tps        99.9% less than 1 second
6230          32 tps          99.9% less than 1 second
6220          21.7 tps        98.9% less than 1 second
6210          11 tps          87.9% less than 1 second

Database 8820

During the same test, the VAX 8820 (12 VAX mips), 8830 and 8840 systems were 
benchmarked using the TP1 benchmark (very similar to Digital's Debit/Credit 
benchmark). The test was 3 updates and 1 insert to a 360 MB database. The results 
were as follows:

Note: There are two 'flavours' of Oracle; the basic version can be enhanced with 
      the Transaction Processing Sub-module. These figures assume this module is 
      invoked. The figures are based on simulated users and therefore do not 
      include an overhead for the communications associated with each user. Codd 
      & Date believe that the figures are still realistic since the 
      communications CPU overhead is replaced by the activity of the software 
      simulating the users.

Model         Throughput      Users           Response time

8810          -               -               -

8820          32.8 tps        450 users       99.9% less than 1 second
8820          31.4 tps        600 users       95.9% less than 1 second
8820          28.4 tps        2700 users      No meaningful response time

8830          -               -               Not much better than the 8820
8840          -               -               Not much better than the 8820

Note: The 8830 and 8840 performance is no better than the 8820's because it is 
believed that the memory bus on these machines saturates. The 8820 fully 
exercises the memory bus to obtain the above results. Digital in the USA is 
aware of the problem; no resolution is anticipated. The 62xx has a very high bus 
speed, and the 62xx figures above support this statement.
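
On the 'simulated users' note above: with no terminals and no network, each
'user' is simply a loop in the benchmark machine firing transactions back to
back, so the response-time percentages measure in-machine latency only. A toy
Python sketch of what such a driver amounts to (purely illustrative; the actual
Oracle harness is not described in this memo):

    import random
    import time

    # Toy "simulated user" driver: no terminals, no comms overhead -- just
    # loops in the same process timing each transaction call directly.
    def run_simulated_users(do_transaction, n_users=10, txns_per_user=100):
        response_times = []
        for _ in range(n_users):
            for _ in range(txns_per_user):
                start = time.perf_counter()
                do_transaction()
                response_times.append(time.perf_counter() - start)
        under_1s = sum(1 for t in response_times if t < 1.0) / len(response_times)
        return len(response_times), under_1s

    # Stand-in transaction: a small random delay instead of real database work.
    count, share = run_simulated_users(lambda: time.sleep(random.uniform(0, 0.002)))
    print(count, "transactions,", f"{share:.1%} under 1 second")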

Comparative figures

The same testing sessions also covered IBM and Amdahl:

IBM

IBM 4381 Model 14  supported 21.7 tps running the TP1 benchmark.

AMDAHL

An Amdahl 5890 Model 600E (I think 6 tightly coupled CPUs) supported 246 tps. A 
'stripped down' version of the TP1 benchmark supported 265 tps (which is 1-2 
tps better than Tandem's tps record).
  
    
    Colin Murray
    SWAS, Livingston
        
    
170.1. "Look Closely" by CREDIT::BOOTH (Bang the Conundrum Slowly) Wed Aug 10 1988 17:42
    The Oracle benchmark methodology is suspect at best. Many US-based
    consultants are having a field day kicking Oracle's new test results.
    Among them is Omri Serlin, one of the most respected consultants
    on benchmark methodology.
    
    What you have entered is essentially the Oracle "party line". It
    sounds great, but it won't wash. If they get their new product out
    the door in August, I would be amazed. They promised 90 days in
    their 18 July announcement. I think the release is closer to October.
    
    For their benchmarks, the new PL/SQL language wasn't used. It wasn't
    ready. The databases were memory-resident. TPS was used, but it's
    one hell of an expensive method for obtaining row-level locking.
    Gee, isn't there a database that comes with row-level locking?
    
    Oracle has a track record of over-estimating the number of users
    they can support on a system. They used to say 12-15 on a MicroVAX.
    That could only be obtained with weeks of fine-tuning. It was virtually
    impossible if SQL*Forms was used, and almost every Oracle customer uses
    it.    
    
    The Oracle numbers are interesting. It will be more interesting
    to see if they in any way resemble reality.
    
    It is always pleasant to see internal memos transmitting the party
    lines of competitors. Since all their numbers are being made public,
    there is nothing "sensitive" in this memo. However, the line about
    Oracle's TP1 being very similar to Debit/Credit is a complete joke.
    
    Oracle's "audit" appears flimsy. Its test methodology is suspect
    at best. The entire benchmark is being subjected to intense criticism.
    I hope no one takes their numbers to heart without looking closely.
    
    ---- Michael Booth
170.2. "Opposing Viewpoint" by CREDIT::BOOTH (Bang the Conundrum Slowly) Wed Aug 10 1988 17:47
    An external article for general interest:
    
        From this week's Computerworld (pages 23,28):

                       ON YOUR MARK, GET SET, FOUL!
                           By Charles Babcock

      Oracle Corp.'s claim of shattering records for transaction processing 
    includes a curious contradiction:  Right after Oracle's stated "world 
    record for performance" of 265 transactions/sec., it noted that the 
    results "have been audited and verified by the highly reputable and expert 
    firm of Codd and Date Consulting Group."     
  
      But when you turn to the auditor's report, no such result appears.  As a 
    matter of fact, the report isn't available yet, although the brief, two-
    page summary signed by Tom Sawyer of Codd and Date was labeled "Auditor's
    Report" by Oracle.  Sawyer signed off on the DEC VAX and Sequent benchmarks;
    possibly for that reason, their results remain somewhere below the 
    stratosphere.  But whoever audited the mainframe results that make Oracle
    look better than Tandem and IBM has yet to step forward.

      But wait---there's another contradiction.  Oracle states "the industry-
    standard TP1 benchmark" was used in achieving its results.  As Omri    
    Serlin, Rich Finkelstein, Frederic Withington and other independent    
    experts have been saying, TP1 is still loosely defined and there is no 
    TP1 standard. Serlin is trying to establish such a standard through a  
    Debit/Credit Council, but Oracle refuses to join.

      As if we weren't confused enough, Oracle President Lawrence J.
    Ellison has started talking about Oracle's claimed TP1 benchmarks as a 
    "Debit/Credit benchmark, absolutely the same as Tandem's."

      We thought Debit/Credit was our one clear transaction processing     
    benchmark, defined in an April 1985 DATAMATION article and documented  
    by Tandem in March 1987.  There is nothing in the Oracle performance   
    report that indicates it included a communications front-end or did    
    mirrored journaling, as would be required in a Debit/Credit test. Is   
    Lawrence J. Ellison confused?

      But hold on a minute---Oracle's claimed TP1 also contains a major    
    contradiction. It has been agreed by all previous implementors of TP1  
    that the transaction begins with a simulated terminal network
    generating transactions.

      In Cullinet's test of IDMS/SQL, the transactions were generated on
    the same machine that was running the benchmark. Sybase avoided this   
    overhead by running the transaction-generating application on a
    separate VAX and feeding the simulated transactions into the benchmark 
    machine. Oracle criticized Sybase for that step at the time, but in its 
    mainframe tests, Oracle has carried the maneuver to its ultimate
    illogical step: It measured transactions fired off in main memory
    without any attempt to mimic a network.
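
      To make the separate-driver arrangement concrete, here is a toy Python
    sketch (not from the column; names, port and protocol are invented): the
    driver plays the Sybase-style role, pushing each simulated transaction over
    a network connection to the benchmark machine, which here is simply another
    thread on the same host.

        import socket
        import threading

        def benchmark_machine(listener, expected=5):
            # Stand-in benchmark machine: take transaction requests off the wire.
            conn, _ = listener.accept()
            with conn:
                for _ in range(expected):
                    conn.recv(1)          # a one-byte transaction request arrives
                    conn.sendall(b"+")    # the debit/credit work would go here

        listener = socket.socket()
        listener.bind(("127.0.0.1", 0))   # any free port
        listener.listen(1)
        port = listener.getsockname()[1]

        server = threading.Thread(target=benchmark_machine, args=(listener,))
        server.start()

        # Driver side: simulated users push transactions over the connection,
        # so the benchmark machine pays the communications cost at issue here.
        with socket.socket() as driver:
            driver.connect(("127.0.0.1", port))
            for _ in range(5):
                driver.sendall(b"t")      # one simulated transaction request
                driver.recv(1)            # wait for the reply before the next
            print("drove 5 transactions over the wire")

        server.join()
        listener.close()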

      "Oracle has gone to such an extreme in what they have done.  It had  
    no on-line flavor whatsoever," Serlin says.

      Let's try to get this straight. In the Oracle version of a benchmark 
    that simulates banking transactions, all of the bank's customers,
    tellers and branches are located inside the bank's central computer.

      Not surprisingly, analysts on Wall Street found this to be an        
    acceptable test. "I was impressed with the quality of the benchmarks,"  
    Scott Smith of DLJ Securities told COMPUTERWORLD before the flight on
    the British Airways Concorde arranged for the announcement.

      "The benchmarks were plentiful thorough and well documented," said   
    Roxanne Googin, and analyst at Needham & Co. in New York.
 

    HOW NOT TO JUDGE A BENCHMARK
      Yes, ladies and gentlemen, it was a thick document, but we should
    read it as well as weigh it.

      We should also read the full auditor's report if, indeed, it ever    
    becomes available. An auditor reports his exceptions as well as the    
    aspects of the test he can verify.  Sawyer's summary letter states
    simply, "These attributes of the benchmark were verified," listing
    seven items. 

      This summary is silent on exceptions.  When Sawyer audited the widely 
    hailed Tandem benchmark in 1987, he was able to find exceptions, and I 
    suspect his final report on Oracle will read differently from this summary.

      Until that report becomes available, I would urge Oracle customers to 
    ignore the alleged TP1 benchmarks and measure Oracle with their own 
    applications.  Any real-world results will be better than getting taken 
    on a flight of fancy.