David,

> Because in a language strictly tied to a platform,


In a world tending toward platform-independent languages, that argument carries no weight. So when Windows 64 comes out, do we change all the definitions again? Not a good idea.

> Integer should be the optimal integer size on the platform


Why?

> (and no, Jonathan, implementing
> Integers as 32-bit integers behind the scene and making them act like
> 16-bit integers would not be a good idea; you'd have a base integer type
> with very strange behavior, so experienced programmers would use Long
> for everything).


Excuse me, but that is absurd. No one is talking about making them work
behind the scenes as something else. What we are talking about is avoiding
confusion. For example, using Int16, Int32, and Int64 is easy to understand and
creates no confusion. But if I have a routine in VB6 that does binary file
access, reading and writing Integers to the file, and I then convert that program
to VB.Net, I've got to make sure that every time I access the binary file in
the VB.Net program I am no longer reading and writing Integers but
Shorts. That can be very dangerous, because if just one gets by it'll blow
my database.
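
To make that concrete, here is a rough sketch of my own (not anything from the thread; the file name and record layout are made up) of what the converted VB.Net read has to look like to stay compatible with data the old VB6 program wrote:

Imports System.IO

Module LegacyFileDemo
    Sub Main()
        Using reader As New BinaryReader(File.OpenRead("records.dat"))
            ' VB6 wrote 2-byte Integers, so the .NET side must read Int16 (Short).
            Dim legacyValue As Short = reader.ReadInt16()

            ' Reading a .NET Integer here (reader.ReadInt32) would consume 4 bytes,
            ' shift every field after it, and quietly corrupt the rest of the record.
            Console.WriteLine(legacyValue)
        End Using
    End Sub
End Module

Miss that change in even one place and the file gets mangled without a single compile-time warning.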

> Lots of people claim that they never had any problems with GOTOs,
> GOSUBs, C-style pointers, or self-modifying code, but it doesn't keep
> them from being errors waiting to happen either.


As if there are no errors waiting to happen in VB.Net.

And, by the way, I am not a Notter. I have VB.Net installed, and it has a lot
of things I would have liked to have seen in VB6. In fact, I've created
several classes in VB6 that emulate things in VB.Net, which, of course, will
make the transition easier when I get there. But it sure is a shame that MS
didn't do more to smooth that transition itself. Integer is just another one of those
unnecessary changes.

Gary