Thread: My view of the world

  1. #46
    Mike Mitchell Guest

    Re: My view of the world

    On 19 Feb 2002 14:17:38 -0800, "Rob Teixeira" <RobTeixeira@@msn.com>
    wrote:

    >It's a good thing people like you don't design operating systems and development
    >environments.


    Yeah, yeah, whatever. The fact is, as a counter argument you can only
    come up with 25 MB here, or 10 * 2.5 MB there -- basically, it's all
    small beer, memory wise. We'll have gigabytes of RAM soon, so there's
    no problem fitting in just a few MB here and there.

    However, I see you brought up my criticism of allowing multiple 20 MB
    runtimes to just hang around unused on the hard disk, implying that I
    am now going back on my words. But I am not doing any such thing. I
    was complaining before of *wastage*. My multiple Exe's scenario, while
    admittedly taking up some more memory, would not be *wasting* memory,
    since if you had an app loaded, one would assume you wanted to do
    useful work with it. Moreover, with an Exe containing *only* the
    functionality it actually uses, many Exe's would be more compact than
    you might assume. Contrast that with the various side-by-side runtimes
    that could be redundant and never loaded again, and hopefully
    you'll see my point.

    As for updating existing apps, you'd download a tiny delta file to
    modify the current Exe in situ. This technology has been around for
    years. You see, you are affecting only *one* file -- the Exe on *my*
    hard disk. Where you upgrade DLLs, you have to keep in mind that those
    DLLs could be used with a myriad selection of other vendors' Exe's.
    That is not the case with having everything in a single Exe. You
    update my Exe and it involves only my PC. This is utterly no different
    from how apps used to be updated, except that they were considerably
    smaller in the days of MS-DOS, so it was often less hassle to just
    overwrite the existing Exe (or Com).
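    To make the delta idea concrete, here's a rough sketch in C# of applying
    such a patch in situ. The delta format is invented purely for illustration
    (records of offset, length, replacement bytes) and only handles same-size
    overwrites; a real patcher would also handle insertions, size changes, and
    checksums.

        // Hypothetical delta format: repeated records of
        //   offset (Int64), length (Int32), replacement bytes.
        // Patches the Exe in place; no insertions or deletions.
        using System;
        using System.IO;

        class DeltaPatcher
        {
            static void Main(string[] args)
            {
                string exePath = args[0];    // e.g. MyApp.exe
                string deltaPath = args[1];  // e.g. MyApp.delta

                using (FileStream exe = new FileStream(exePath, FileMode.Open, FileAccess.ReadWrite))
                using (BinaryReader delta = new BinaryReader(File.OpenRead(deltaPath)))
                {
                    while (delta.BaseStream.Position < delta.BaseStream.Length)
                    {
                        long offset = delta.ReadInt64();   // where in the Exe to patch
                        int length = delta.ReadInt32();    // how many bytes change
                        byte[] bytes = delta.ReadBytes(length);

                        exe.Seek(offset, SeekOrigin.Begin);
                        exe.Write(bytes, 0, length);       // overwrite in situ
                    }
                }
            }
        }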

    MM

  2. #47
    Robert Lantry Guest

    Re: My view of the world



    >As for updating existing apps, you'd download a tiny delta file to
    >modify the current Exe in situ. This technology has been around for
    >years. You see, you are affecting only *one* file -- the Exe on *my*
    >hard disk. Where you upgrade DLLs, you have to keep in mind that those
    >DLLs could be used with a myriad selection of other vendors' Exe's.
    >That is not the case with having everything in a single Exe. You
    >update my Exe and it involves only my PC. This is utterly no different
    >from how apps used to be updated, except that they were considerably
    >smaller in the days of MS-DOS, so it was often less hassle to just
    >overwrite the existing Exe (or Com).
    >
    >MM

    Modify it against what? Against which dependencies? You would rather have
    monolithic applications with no run-time vs tiny applications with a single
    large run time? If you haven't tried it, go ahead and whip up a simple C++
    MFC app and statically link in the libraries. Wow, it's 5 megs! And that's
    if I use NO external libraries past the MFCs. Wow, just 4 apps installed
    and I'm at 20 Megs. Let's put in 20 apps...Holy crap I'm at 100 megabytes!
    And, using that model, count up the exes on your machine, or just the ones
    that are currently in memory...Okay, hate MFC? Think that's not valid?
    Use Borland C++ Builder and statically link. Oops! It's huge too! Heck,
    throw in some libraries like say serial communications or Winsock or Inet
    or anything you please...before you know it, the exe compiles to 10 megs!
    Let's use Delphi! Ohmigawd, look at the size of the executables it creates!
    And that's just for hello world! Heaven forbid you should need data access...

    Sign me up!

  3. #48
    Rob Teixeira Guest

    Re: My view of the world



    Forget it, Robert -- logic is wasted on Mike.
    He's stuck in 1984, so let him go play with his DOS and be done with it.
    Better yet, I am hoping he moves to Delphi/Kylix. I've used Borland products
    all through my DOS programming days, and even had two Windows projects done
    in Delphi, so I know what it's like. I'd like to see the look on his face
    as he realizes what the grass on the other side really looks like. In the
    meantime, he's just a mass of hot air.

    -Rob

    "Robert Lantry" <mirth@mirthy.com> wrote:
    >
    >Modify it against what? Against which dependencies? You would rather have
    >monolithic applications with no run-time vs tiny applications with a single
    >large run time? If you haven't tried it, go ahead and whip up a simple C++
    >MFC app and statically link in the libraries. Wow, it's 5 megs! And that's
    >if I use NO external libraries past the MFCs. Wow, just 4 apps installed
    >and I'm at 20 Megs. Let's put in 20 apps...Holy crap I'm at 100 megabytes!
    >And, using that model, count up the exes on your machine, or just the ones
    >that are currently in memory...Okay, hate MFC? Think that's not valid?
    >Use Borland C++ Builder and statically link. Oops! It's huge too! Heck,
    >throw in some libraries like say serial communications or Winsock or Inet
    >or anything you please...before you know it, the exe compiles to 10 megs!
    >Let's use Delphi! Ohmigawd, look at the size of the executables it creates!
    >And that's just for hello world! Heaven forbid you should need data access...
    >
    >Sign me up!



  4. #49
    Josh Guest

    Re: My view of the world


    As far as static vs. dynamic goes, I've seen apps that will take a .NET assembly
    and decompile it into C# or VB.net. It really seems to work.

    If the reason that static linking appeals to you is versioning (as opposed
    to app size), then can't you just decompile the .NET framework classes that
    your apps use and include them in your .exe?

    I know this isn't going to solve your problem (because the CLR isn't a .NET
    assembly and you will still have all that wasted space), but it would work
    for the grid control that you talk about. Of course, so would .NET's private/public
    and side-by-side assembly scheme.

    Josh

  5. #50
    Mike Mitchell Guest

    Re: My view of the world

    On 21 Feb 2002 04:32:31 -0800, "Robert Lantry" <mirth@mirthy.com>
    wrote:

    >...Holy crap I'm at 100 megabytes!


    How big your Exe is depends entirely on the granularity of the libs.
    You pull in *only* the functionality you require. Most every
    third-party control, most every Windows intrinsic control has a bunch
    of functionality that some may never want to use. So don't link in the
    bits you don't use! Sure, if you're using masses of features, then you
    have to support them. But many apps could be made a lot more compact
    by omitting functionality they're not using.

    By the way, this is all old hat. We did this in the days of DOS all
    the time. An application was built by linking standard libs with
    object files produced by, say, the Basic 7 compiler. You had stub
    files to omit parts you didn't want.

    MM

  6. #51
    George Guest

    Re: My view of the world


    kylix_is@yahoo.co.uk (Mike Mitchell) wrote:
    >On 19 Feb 2002 14:17:38 -0800, "Rob Teixeira" <RobTeixeira@@msn.com>
    >wrote:
    >
    >>It's a good thing people like you don't design operating systems and development
    >>environments.

    >
    >Yeah, yeah, whatever. The fact is, as a counter argument you can only
    >come up with 25 MB here, or 10 * 2.5 MB there -- basically, it's all
    >small beer, memory wise. We'll have gigabytes of RAM soon, so there's
    >no problem fitting in just a few MB here and there.
    >
    >However, I see you brought up my criticism of allowing multiple 20 MB
    >runtimes to just hang around unused on the hard disk, implying that I
    >am now going back on my words. But I am not doing any such thing. I
    >was complaining before of *wastage*. My multiple Exe's scenario, while
    >admittedly taking up some more memory, would not be *wasting* memory,
    >since if you had an app loaded, one would assume you wanted to do
    >useful work with it. Moreover, with an Exe containing *only* the
    >functionality it actually uses, many Exe's would be more compact than
    >you might assume. Contrast that with the various side-by-side runtimes
    >that could be redundant and never loaded again, and hopefully
    >you'll see my point.
    >
    >As for updating existing apps, you'd download a tiny delta file to
    >modify the current Exe in situ. This technology has been around for
    >years. You see, you are affecting only *one* file -- the Exe on *my*
    >hard disk. Where you upgrade DLLs, you have to keep in mind that those
    >DLLs could be used with a myriad selection of other vendors' Exe's.
    >That is not the case with having everything in a single Exe. You
    >update my Exe and it involves only my PC. This is utterly no different
    >from how apps used to be updated, except that they were considerably
    >smaller in the days of MS-DOS, so it was often less hassle to just
    >overwrite the existing Exe (or Com).
    >
    >MM


    Too much time is spent in this discussion on the disk size of the CLR. It
    is a moot point, since the CLR will be ubiquitous on Windows machines soon. It
    is my guess that the next version of Microsoft anything (Office, Windows,
    Visio, etc.) will be built in part on the CLR, so users will have it installed
    whether they know it or not. Just as Apple's OS X comes with the Java VM,
    Windows will come with the CLR, so Windows developers will probably not have
    to worry in the near future about including it in a distribution.

    There is a lot of processor waste in the multiple EXE idea, which makes it
    an intriguingly goofy idea. With the separate EXE's each program that is
    executed would take up a much larger RAM footprint than the smaller programs
    that share loaded libraries (CLR or DLL). Even in the VB6 world would you
    want multiple copies of the exact same DLL loaded in RAM, one for each program
    in memory, or would you want all program threads to share a single library?
    Think about which is more expensive in computing: RAM and processor time, or
    disk space?
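    (To put rough, purely illustrative numbers on it: twenty statically linked
    5 MB programs resident at once is on the order of 100 MB of code in RAM,
    whereas twenty 500 KB programs sharing one 20 MB runtime come to roughly
    30 MB, because the shared library's code pages are mapped into memory only
    once.)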

    The only valid argument I've seen on this thread so far is the question
    of CLR versioning pertaining to compatibility. I'm hoping Microsoft follows
    Sun's example of bending over backward to make sure new versions of the
    Java VM are backward compatible. And why not? They pretty much borrowed
    the entire .NET idea from Java anyway.




  7. #52
    Mike Mitchell Guest

    Re: My view of the world

    On 21 Feb 2002 16:09:05 -0800, "George" <george@coller.com> wrote:

    >There is a lot of processor waste in the multiple EXE idea, which makes it
    >an intriguingly goofy idea.


    How can it be "processor" waste? The CPU couldn't care less where the
    code sits that it's executing. In memory, all code is equal.

    > With the separate EXE's each program that is
    >executed would take up a much larger RAM footprint than the smaller programs
    >that share loaded libraries (CLR or DLL).


    I don't know about "much" larger. Larger, certainly, but by containing
    *only* the needed functionality, your baggage goes down, perhaps quite
    considerably. The only reason DLLs were invented was that Windows
    allowed multiple programs to run simultaneously while memory was
    expensive and disk space very limited, so sharing had to be
    invented. See http://www.winsupersite.com/showcase...ler_fusion.asp
    if you don't believe me. The article includes the following paragraph:

    "DLLs were a good idea for the time. But as Windows grew in complexity
    and size, DLLs started to wear out their welcome and have increasingly
    become a key source of system instability."

    Therefore, I say, supposing we soon have a gigabyte of RAM at a
    reasonable cost? (It's pretty much the case already.) The whole raison
    d'être for sharing functionality, with its accompanying DLL hell, falls
    away.

    > Even in the VB6 world would you
    >want multiple copies of the exact same DLL loaded in RAM,


    You wouldn't have the exact same DLL loaded multiple times, but only
    the bits your programs, in toto, require. Not every program needs file
    access. Therefore, the code to open files is not present. Not every
    program needs graphic access (painting, etc), so only the bog-standard
    graphics routines, and not the advanced ones, would be included.
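    A rough sketch of what I mean, in C#, using a hypothetical build symbol
    NEEDS_FILE_IO: if the symbol isn't defined when the app is compiled, the
    file-access code simply isn't in the Exe at all.

        // Build with file support:    csc /d:NEEDS_FILE_IO SlimApp.cs
        // Build without file support: csc SlimApp.cs
        using System;
        #if NEEDS_FILE_IO
        using System.IO;
        #endif

        class SlimApp
        {
            static void Main()
            {
                Console.WriteLine("Core functionality only.");
        #if NEEDS_FILE_IO
                // Compiled in only when the build actually needs file access
                // (expects a settings.txt next to the Exe).
                StreamReader reader = new StreamReader("settings.txt");
                Console.WriteLine(reader.ReadToEnd());
                reader.Close();
        #endif
            }
        }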

    MM

  8. #53
    george Guest

    Re: My view of the world


    You know, I kind of see where you are coming from, but I feel that there is
    something fundamental you are missing. Maybe I'm not eloquent enough to
    state it correctly.

    Correct me if I'm wrong, but even if you did program your way, you'd still
    get into trouble in the following scenario:

    Most if not all Windows programs at some point must communicate with the Windows
    OS through the Windows API (even if this is hidden from the developer; a
    la Visual Basic). The Windows API is itself a ton of C++ functions. So
    all the user has to do to mess up your exe program is upgrade to a newer
    version of Windows, assuming Microsoft changed an API call's functionality
    between versions. Your answer may be that a user would need to simply install
    another exe -- but that to me isn't any different than upgrading to a new
    version-compatible library.

    To me, and I think I heard this in a recent devX article with a .NET architect,
    the .NET library is intended to eventually replace the Windows API as the
    standard way for programmers to interact with underlying Windows functionality.
    A big benefit of this would be that this new 'API' is object-based instead
    of function-based, allowing for better maintenance opportunities for Microsoft
    (isn't that why everyone shifted to OO programming?)

    My defense of Microsoft, which I don't do often, is that even though they
    have screwed developers in the past with changes to their DLLs, they have
    been pretty consistent through upgrades as far as the API is concerned.
    So if Microsoft treats the CLR as carefully as their API, the problems of
    DLL hell should not occur. In fact, it is more likely that the .NET code
    you write today will work on a CLR written five years from now than it is
    for a compiled exe to do the same. Code written against the CLR could (should?)
    even survive a major Windows OS upgrade (think 64-bit OS). Theory or reality?
    I point to Java as proof that it could work this way.

    My point is that no matter what programming language you use or how you compile
    and distribute your programs, you still need to trust Microsoft not to screw
    around with the underlying architecture too much. This is true of any OS.





    kylix_is@yahoo.co.uk (Mike Mitchell) wrote:
    >On 21 Feb 2002 16:09:05 -0800, "George" <george@coller.com> wrote:
    >
    >>There is a lot of processor waste in the multiple EXE idea, which makes it
    >>an intriguingly goofy idea.

    >
    >How can it be "processor" waste? The CPU couldn't care less where the
    >code sits that it's executing. In memory, all code is equal.
    >
    >> With the separate EXE's each program that is
    >>executed would take up a much larger RAM footprint than the smaller programs
    >>that share loaded libraries (CLR or DLL).

    >
    >I don't know about "much" larger. Larger, certainly, but by containing
    >*only* the needed functionality, your baggage goes down, perhaps quite
    >considerably. The only reason DLLs were invented was that Windows
    >allowed multiple programs to run simultaneously while memory was
    >expensive and disk space very limited, so sharing had to be
    >invented. See http://www.winsupersite.com/showcase...ler_fusion.asp
    >if you don't believe me. The article includes the following paragraph:
    >
    >"DLLs were a good idea for the time. But as Windows grew in complexity
    >and size, DLLs started to wear out their welcome and have increasingly
    >become a key source of system instability."
    >
    >Therefore, I say, supposing we soon have a gigabyte of RAM at a
    >reasonable cost? (It's pretty much the case already.) The whole raison
    >d'être for sharing functionality, with its accompanying DLL hell, falls
    >away.
    >
    >> Even in the VB6 world would you
    >>want multiple copies of the exact same DLL loaded in RAM,

    >
    >You wouldn't have the exact same DLL loaded multiple times, but only
    >the bits your programs, in toto, require. Not every program needs file
    >access. Therefore, the code to open files is not present. Not every
    >program needs graphic access (painting, etc), so only the bog-standard
    >graphics routines, and not the advanced ones, would be included.
    >
    >MM



  9. #54
    Rob Teixeira Guest

    Re: My view of the world


    "george" <george@coller.com> wrote:
    >
    >Most if not all Windows programs at some point must communicate with the Windows
    >OS through the Windows API (even if this is hidden from the developer; a
    >la Visual Basic). The Windows API is itself a ton of C++ functions. So
    >all the user has to do to mess up your exe program is upgrade to a newer
    >version of Windows, assuming Microsoft changed an API call's functionality
    >between versions. Your answer may be that a user would need to simply install
    >another exe -- but that to me isn't any different than upgrading to a new
    >version-compatible library.


    A lot of people here have been equating Windows with a giant runtime -- which,
    from a certain point of view, it is.

    >To me, and I think I heard this in a recent devX article with a .NET architect,
    >the .NET library is intended to eventually replace the Windows API as the
    >standard way for programmers to interact with underlying Windows functionality.


    I've heard bits and pieces of MS replacing "win32" programming. That to me,
    doesn't really equate to replacing the OS infrastructure, rather it means
    that MS has a better way for programmers to interface with Windows (as you
    say), and that (old style) API programming is becoming a thing of the past.
    Another thing I *have* seen quite clearly is that MS is saying that enhancements
    to the OS going forward will be accessible from .NET, if not entirely written
    in managed code. This is very significant in a number of ways. For example,
    in the past, a lot of enhancements to COM, peripheral services (media, for
    example), shell extensions, security, enterprise services, etc. were made
    with the assumption of C++ code only. If VB programmers were lucky, at some
    point (hopefully) MS or some third party would write a VB-COM-friendly entry
    point to the new functionality.
    Now, if what they say is true, you can pretty much count on having enhancements
    available from day one.
    Another incentive is that MS doesn't have to do split-development for different
    languages. What works in one will work in the other. That means that you'll
    have fewer guys working on tools and libraries specifically for one environment,
    while ignoring another entirely.

    > A big benefit of this would be that this new 'API' is object-based instead
    >of function-based, allowing for better maintenance opportunities for Microsoft
    >(isn't that why everyone shifted to OO programming?)


    Exactly.

    >My defense of Microsoft, which I don't do often, is that even though they
    >have screwed developers in the past with changes to their DLLs, they have
    >been pretty consistent through upgrades as far as the API is concerned.


    >So if Microsoft treats the CLR as carefully as their API, the problems of
    >DLL hell should not occur. In fact, it is more likely that the .NET code
    >you write today will work on a CLR written five years from now than it is
    >for a compiled exe to do the same.


    Well, that remains to be seen. I am, however, optimistic that Fusion linking
    (side-by-side model) and the CLR will allow programs to remain stable for
    the most part.

    >Code written against the CLR could (should?)
    >even survive a major Windows OS upgrade (think 64-bit OS). Theory or reality?
    > I point to Java as proof that it could work this way.


    Yes, it is feasible. In fact, barring the programmer doing something dumb
    or taking shortcuts in his own code, there's no reason this can't work.
    All the numerical bits that can change (handles and pointers, particularly)
    are well isolated and encapsulated (the IntPtr type, for example). The JIT
    engine/runtime would then take care of things like the different addressing
    schemes, leaving your code intact.
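    A small C# sketch of that point. GetModuleHandle is an ordinary Win32 API;
    because the handle comes back as an IntPtr rather than a fixed-size integer,
    the same source is valid whether the runtime underneath uses 32-bit or
    64-bit pointers.

        using System;
        using System.Runtime.InteropServices;

        class PointerSizeDemo
        {
            // Standard P/Invoke declaration; the handle is an IntPtr, not an int.
            [DllImport("kernel32.dll", CharSet = CharSet.Auto)]
            static extern IntPtr GetModuleHandle(string moduleName);

            static void Main()
            {
                // 4 on a 32-bit runtime, 8 on a 64-bit runtime -- the code doesn't care.
                Console.WriteLine("Pointer size: {0} bytes", IntPtr.Size);

                // Passing null asks for a handle to the current executable.
                IntPtr handle = GetModuleHandle(null);
                Console.WriteLine("Module handle: 0x{0:X}", handle.ToInt64());
            }
        }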

    -Rob

  10. #55
    Mike Mitchell Guest

    Re: My view of the world

    On 27 Feb 2002 10:27:14 -0800, "Rob Teixeira" <RobTeixeira@@msn.com>
    wrote:

    >I've heard bits and pieces of MS replacing "win32" programming. That to me,
    >doesn't really equate to replacing the OS infrastructure, rather it means
    >that MS has a better way for programmers to interface with Windows (as you
    >say), and that (old style) API programming is becoming a thing of the past.


    This sounds ominous to me. It sounds as if they could be planning on
    replacing the Win32 API completely with .Net long term, thus making it
    mandatory for *any* Windows application to be done via .Net, and only
    via .Net. Any "old" Win32 app, including all VB6 apps, would then
    require some kind of "thunking" layer for their calls to be translated
    to the .Net equivalents.

    MM

  11. #56
    Rob Teixeira Guest

    Re: My view of the world



    Doubt it.
    Not in this decade anyway.

    -Rob

    kylix_is@yahoo.co.uk (Mike Mitchell) wrote:
    >On 27 Feb 2002 10:27:14 -0800, "Rob Teixeira" <RobTeixeira@@msn.com>
    >wrote:
    >
    >This sounds ominous to me. It sounds as if they could be planning on
    >replacing the Win32 API completely with .Net long term, thus making it
    >mandatory for *any* Windows application to be done via .Net, and only
    >via .Net. Any "old" Win32 app, including all VB6 apps, would then
    >require some kind of "thunking" layer for their calls to be translated
    >to the .Net equivalents.
    >
    >MM


