T.R | Title | User | Personal Name | Date | Lines |
---|---|---|---|---|---|
3500.1 | | VAXCPU::michaud | Jeff Michaud - ObjectBroker | Tue Mar 18 1997 12:22 | 9 |
| This has recently been discussed in the Digital_UNIX notesfile.
Summary: the minimum binary size is 16K due to the executable file format.
Also, the size of the file is not the same as the amount of code and data;
use the "size" command to find the size of the code and data.
Also use "strip" if you want a smaller executable; it strips unneeded
symbol table information.  (Assuming you don't want to debug using that
binary, you can keep the original unstripped version; then if the stripped
version core dumps, you can use the original to look at the core.)
|
3500.2 | ... | KERNEL::PULLEY | Come! while living waters flow | Thu Mar 20 1997 12:33 | 15 |
| Well, he reckons there's a problem with the STL files.
Apparently the include files we provide don't have the declarations and
definitions of the templates in separate files.
Also, when they're instantiated, the files aren't split up either.
He believes this is why he is getting such large binaries.
He's already using strip on his binary files.
I tried the same silly example as in .0 on VMS Alpha, and got an .exe
of only 8 blocks (4K), and an object of 5K.
He also mentioned that we recommend splitting up files as above, but
I've not read up enough yet to know what advantages that would
have.
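For the record, my understanding of the kind of split he means is roughly
this (the names here are made up, not from his code):

    // mylist.hxx - a made-up header: declaration only, what client code includes
    template <class T> class MyList {
    public:
        void append(const T& item);   // declared here, not defined
    private:
        T*  items;
        int count;
    };

    // mylist.cxx - the definitions, compiled once on their own
    #include "mylist.hxx"
    template <class T> void MyList<T>::append(const T& item)
    {
        items[count++] = item;        // simplified body
    }
    // explicit instantiation for the types actually needed
    template class MyList<int>;

Presumably the idea is that the template definitions end up in one object file
instead of in every module that includes the header, but I may have that wrong.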
Thanks,
Steve.
|
3500.3 | Unused code added to every module | DECCXX::RMEYERS | Randy Meyers | Fri Mar 28 1997 20:34 | 21 |
| Re: .0
I've been able to reproduce the larger than expected files from a simple
file containing nothing but:
#include <vector>
The STL contains many classes with inline member functions.  Until a recent
change made by ANSI C++ (which has not yet been implemented by the compiler),
inline member functions had internal linkage (they were local functions to
whatever module needed them).  This means that if the functions are used in a
module, a copy of the function's code is generated for use by that module, and
that copy can never be used by a different module.
It appears that the compiler, for no good reason, decides that one of the
member functions is used, and this implies that 23 other functions from
the 15,500 lines of source included by #include <vector> are also needed.
The net effect is that any module that includes <vector> gets an overhead of
about 4K of code and 1K of data even if the module doesn't use any features
of <vector>.
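To illustrate what internal linkage means here (this is a made-up header, not
the actual STL code):

    // widget.hxx -- hypothetical class with an inline member function
    class Widget {
    public:
        int size() const { return count; }   // defined inline in the class
    private:
        int count;
    };

Under the old rules, if a.cxx and b.cxx both include widget.hxx and the
compiler decides Widget::size is "used", each object file gets its own private
copy of the generated code for it, and neither copy can be shared.  Multiply
that by the two dozen or so functions dragged in from <vector> and you get the
4K/1K per-module overhead.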
|
3500.4 | ? | KERNEL::PULLEY | Come! while living waters flow | Tue Apr 01 1997 12:34 | 14 |
| I've just tried another example.
#include <vector.hxx>
int main(void){}
That gave an object file of 4K (I think) and a binary of 23K.
This might sound a foolish question, but is he missing much by doing it
that way?
I guess he'd not be using the standard library, but he'd be able to
template away, and instantiate, wouldn't he?
This could be my noviceness coming out here.
Thanks,
Steve.
|
3500.5 | you would only need to include vector | DECC::J_WARD | | Wed Apr 02 1997 09:52 | 5 |
|
You would only need to include <vector> if you are actually using it, i.e.,
if you are creating a vector container in your own code somewhere.
You can use templates in general without including it...
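For example (a made-up template, nothing to do with the STL):

    // a made-up template -- no #include <vector> needed to write and use it
    template <class T> T max_of(T a, T b)
    {
        return a > b ? a : b;
    }

    int main()
    {
        int m = max_of(1, 2);   // instantiates max_of<int>
        return m == 2 ? 0 : 1;
    }

So if his code never actually creates a standard container, he can leave the
header out and avoid the per-module overhead described in .3.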
|
3500.6 | | KERNEL::PULLEY | Come! while living waters flow | Mon Apr 14 1997 10:26 | 18 |
| I don't know whether this points me back to UNIX, but:-
csh> size jv.o
text data bss dec hex
3936 4832 48 8816 2270
csh> ls -l jv.o
-rw-r--r-- 1 sjp system 23528 Apr 11 15:48 jv.o
So does that mean C++ reckons it needs to use about 9k of the
object file, and UNIX gives it 23k?
His supposition is that if he's getting sizeable object files, that
might be why he's getting large binary files.
He tried compiling the #include <vector> int main(){} example, and on
Digital UNIX gets something like the figures above.
Running GNU on the Alpha, and another compiler on Sun, only produces 1K
object files.
.3, what's the extra function?
|
3500.7 | Probably we're not looking at the right thing | DECCXX::MITCHELL | | Mon Apr 28 1997 16:00 | 38 |
| I think we might be chasing down a rabbit hole here.
DIGITAL UNIX 9633 says:
<<< TURRIS::DISK$NOTES_PACK2:[NOTES$LIBRARY]DIGITAL_UNIX.NOTE;1 >>>
-< DIGITAL UNIX >-
===========================================================================
Note 9633.0 Large object/binary executeables, dynamic?? No re
KERNEL::PULLEY "Come! while living waters flow" 59 lines 28-APR-1997
---------------------------------------------------------------------------
    Hi,
    I've a customer using Digital UNIX v4.0b, C++ v5.5-004.
    This is based on notes c_plus_plus 3500.*.
    He gets large object and binary files, e.g., >30 meg for a binary file for
    one of his small programs.
    Rather than sending us lots of code, he's cut it down to a small example,
    but in doing so has made an assumption.
    I.e., real but not very big programs give him large files, so a really
    small example giving a sizeable object and executable could be showing
    the same problem.

This probably isn't a good assumption.  I think it would make more sense to
try to find out what's going on with the large application.

I noticed in DIGITAL UNIX 9633 that the compilation is done with -g0.  What
sizes does the customer see when compiling with and without -g?

DIGITAL UNIX 9633 also says:

    Also they've come out not stripped, does that simply mean I didn't use the
    compress on the compilation or is there another way of stripping files?

-compress affects the size of object files, not executables.  Executables can
be stripped using strip.  What this does is remove the debug symbol table
information.
|
3500.8 | Logged part of the problem | DECCXL::RMEYERS | Randy Meyers | Mon Apr 28 1997 17:55 | 8 |
| Re: .0
Since the C++ compiler does seem to generate about 4K of code and 1K of data
for templates that are not needed, I've logged this problem as note
CXXC_BUGS 4319.0 in the compiler group's private tracking conference.
It is not clear, however, that this problem is the cause of the extreme file
sizes reported in the base note.
|