What's wrong with Microsoft?

I recently had someone I don't know too well at my job say something like "You just don't like anything made by Microsoft!" after I criticized MS SourceSafe. Now it's true I don't have any great love for Microsoft, but I call it how I see it. Technically speaking, MS SourceSafe is way behind the times when compared to Subversion or Mercurial. In fact, its main advantage is that it is supported out-of-the-box by MS Visual Studio. That's it.

It feels strange having to defend myself to someone like that, because it seems to me that anyone who has studied computer science would know what makes computer systems good, and realize that Microsoft fails on most counts. They have won in the marketplace through savvy marketing, dirty business tricks, the occasional good product, and sheer momentum.

The problem is that a lot of people working in the IT industry only know IT. IT and Computer Science are very different. IT majors learn things like how to install Windows and configure TCP/IP addresses. Computer Science majors learn things like "Big O notation", and how to design operating systems and compilers. Many IT people spend their lives learning .NET or Java just well enough to do their job, instead of thinking about how things should be implemented.

Some people also think I am a Mac fanatic just because I own one. I don't especially love Apple. They make great products, but nothing is perfect. People don't realize that there are dozens of operating systems; Mac OS and Windows are just the two most popular for consumer applications.

Examples:
File Locking
In almost every operating system I know of, you can read, write, rename, or delete a file while it is in use (unless it has been specifically locked). This is not true in Windows. Most people who know about system architecture consider this the "right" way to do things. Why can't you do this in Windows? Because you couldn't do it in DOS. Why couldn't you do it in DOS? Because DOS was designed to be a simple single-user, single-tasking OS, and tracking file access adds overhead. It was easier just to draw a line and say "You can't do anything to the file when someone has it open." Now you might think this is something only a computer geek would pick up on. That's true in a sense, but what computer specialists think about computer stuff matters; it's their field, after all. Building an operating system is like building the foundation for a building. If the foundation is poorly done, the building on top will have issues that are very hard to fix. In the case of an operating system, the "building" is the upper layers of the OS itself and the applications running on top.
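You can see the difference for yourself with a few lines of Python (a minimal sketch; the exact behavior on Windows depends on how the file was opened):

    import os
    import tempfile

    # Create a scratch file and keep a handle open to it.
    path = os.path.join(tempfile.mkdtemp(), "in_use.txt")
    with open(path, "w") as f:
        f.write("still being written\n")
        f.flush()
        try:
            # On Linux or macOS this succeeds: the directory entry disappears,
            # but the open handle keeps the data alive until it is closed.
            os.remove(path)
            print("deleted while open; the handle still works")
        except PermissionError as err:
            # On Windows the default sharing mode refuses the delete while
            # the file is held open by this (or any other) process.
            print("delete refused while the file is open:", err)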

But why does this file locking limitation matter? Well, it matters partially because if files can't be deleted while they are open, then most OS components can't be changed while the OS is running. This is especially true in Windows due to its relatively monolithic design. In a typical Unix system, almost all of the files can be swapped out for different ones while in use, if you have the proper security access. One practical use of this is system updates. Since Windows can't modify EXE files that are running, and especially DLL/OCX files that are in use, you usually get the dreaded "you must reboot to continue" message. The computer can only replace those files during a reboot, because that's the only time it can guarantee they won't be in use. Poor design.
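For contrast, this is roughly how a Unix-style tool can swap out a file that may be in use (a sketch under POSIX assumptions; replace_in_place is my name, not any real updater's):

    import os
    import tempfile

    def replace_in_place(target_path, new_bytes):
        """Unix-style update (sketch): write the new version alongside the
        target, then atomically rename it over the old one. Any process that
        already has the old file open keeps reading the old copy; new opens
        see the new contents. On Windows the final rename fails if anything
        holds the target open, which is one reason updates wait for a reboot."""
        directory = os.path.dirname(target_path) or "."
        fd, tmp_path = tempfile.mkstemp(dir=directory)
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(new_bytes)
        os.replace(tmp_path, target_path)  # atomic on POSIX filesystems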

Ok, so that's just one thing... What else?

Downloads in Internet Explorer
Did you ever notice that large downloads take a very long time in Internet Explorer? Not just to download, but to land where they are supposed to be once the download is finished. This is because IE downloads them to some temporary folder in the meantime. When the download is done, it copies the file to where you actually told it to save it. For large files, this copy can take quite some time (more about that later). Worse yet, since it copies the file instead of moving it, it takes twice as much space on the disk until the copy is complete (at which point, I assume, IE deletes the temporary file). So we take extra time to copy, require twice as much space, and put wear-and-tear on the hard disk - what's the reason? I mean, why not just download the file to where you asked for it to be? There could be several reasons. Perhaps Microsoft wanted to make sure incomplete files didn't end up where users could access them, or didn't want users clicking on them before the download finished. Perhaps there is some technical reason why the virus scan must be done in the temporary folder. That's all well and good, but why not just move the file to the destination when it's complete? That would be almost instant, and wouldn't require all the extra space. The best reason I can think of is that cross-volume moves don't work in Windows. That is, moving a file from C: to D: doesn't work as a simple rename. If you had saved the file to a different partition or another device entirely, the move would fail. Still, they could have tested for this and moved when possible. Also, you might be saving to another drive precisely because your primary drive (where the temp folder almost certainly lives) is already full. In the more than 10 years that IE has been in development, apparently nobody at Microsoft thought about that. (By the way, cross-device moves typically do work at the OS level on other modern operating systems.)
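The "move when possible, copy only when you must" logic is only a few lines. Here is a minimal Python sketch (finish_download is a hypothetical helper, not anything IE actually calls):

    import os
    import shutil

    def finish_download(temp_path, destination):
        """Hypothetical helper showing the 'try the cheap move first' idea:
        a rename is nearly instant when both paths are on the same volume;
        only if that fails (e.g. C: -> D:) do we fall back to the slower
        copy-and-delete."""
        try:
            os.replace(temp_path, destination)    # same volume: instant rename
        except OSError:
            shutil.copy2(temp_path, destination)  # different volume: real copy
            os.remove(temp_path)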

Copying Files
Copying files is something that pretty much everyone does, and something people have done since before Windows, even before DOS. I would expect this to be a task that Windows handles exceptionally well. I would be wrong. In fact, it seems to have gotten worse with time. Windows 7, for example, seems to be extremely slow at copying large numbers of files. As an example, a few weeks ago at work we copied about 18GB of data from a computer's internal hard drive to a USB2 external hard disk on a Windows 7 machine. The process took all day long. It's true that USB2 isn't the best or fastest way to connect a hard disk - but what got me was that for the first 5 or 6 hours, Windows just showed a message like "preparing to copy". We didn't know what it was doing, we didn't know how long it would take, and there was no useful status. It makes me long for the good ol' days of Norton Commander. That was only DOS-based, but you knew exactly what was going on, and you had the power to do a lot of things you still can't approach in the Windows desktop shell (Explorer).

At any rate, the file copying finally started, and it was done after another few hours. We brought the disk to another office and started to copy the data onto the destination server. My co-worker asked me if I had any suggestions to speed up the copy. Windows Explorer had obviously been the main bottleneck before, so I suggested he use XCOPY, which is Microsoft's command-line utility to... well, copy large amounts of data. I can even forgive Microsoft for the Explorer copy function being anemic. Most users probably copy only a few dozen files, comprising a few hundred megabytes at once, especially to external drives. The XCOPY went well for about 20 minutes, then started throwing errors and stopped with "out of memory". My co-worker mentioned that the destination disk wasn't "out of memory". I don't know why people who supposedly know about computers think "memory" means "disk space", when it always refers to RAM, but whatever. The computer had gigabytes of RAM available, so I figured it must be some DOS limitation or something. Even so, it shouldn't require much memory to copy files: you make a buffer of a few MB and use it over and over. I looked up the problem. Apparently it's a "known issue". It's been a "known issue" for years. Microsoft still ships XCOPY with its OS, but it can't be bothered to fix such a basic utility. The error actually occurs when the path names become too long (more than 255 characters). Since the paths on the USB disk were already long, and my co-worker was copying them into a location something like "d:\the_place_to_copy_the_stuff\", the new names went past the limit, and XCOPY crashed with that cryptic error. It's lucky I found the cause so quickly on MSDN, but if they don't want to fix it, they could at least have it spit out a warning when you run it - "File paths longer than 255 chars won't work!", or something. We changed the destination path to something like "d:\p\", and sure enough, it worked fine - and it ended up completing the copy in much less time than Explorer took.
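To put a number on the "buffer of a few MB" point above: a whole-file copy only needs one small, reusable buffer. A minimal sketch (buffered_copy is my name; real tools also copy attributes and handle errors):

    def buffered_copy(src_path, dst_path, buf_size=4 * 1024 * 1024):
        """Copy a file of any size using one fixed 4 MB buffer, so memory
        use stays constant no matter how large the file is."""
        with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
            while True:
                chunk = src.read(buf_size)
                if not chunk:
                    break
                dst.write(chunk)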

The entertaining thing was that Microsoft's recommended solution was to use the "Robocopy" utility from the Windows 2003 Resource Kit. Interesting. Still, if I was going to go to the trouble of downloading and installing a new utility, I would just use the excellent rsync and be done with it. Should I really have to go through all this trouble just to copy a directory tree?
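For the record, copying a directory tree really shouldn't be a research project; Python's standard library, for example, does it in one call (the paths here are placeholders):

    import shutil

    # Copying a whole directory tree is a one-liner with the standard
    # library. dirs_exist_ok (Python 3.8+) lets it merge into a
    # destination folder that already exists.
    shutil.copytree(r"E:\usb_disk\data", r"D:\p\data", dirs_exist_ok=True)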

Let's go for just one more example.

Instant Messenger Wars
First there were multi-user chat rooms like IRC. Later, AOL and ICQ came along. AOL released AIM for non-subscribers, and later bought ICQ. Eventually Yahoo came up with their own chat, and Microsoft released MSN Messenger. Now, a lot of people don't realize this, but a chat *program* and a chat *protocol* are two different things. A *protocol* is a method of connecting, while a *program* is the application you use to do it. If you use AOL's AIM program to access AIM, and MSN Messenger to access MSN, and so on, then that's all well and good, but understanding that there is a clear separation matters for a few reasons:
1. Third party applications can be developed to talk to existing chat services.
2. These third party tools may be available for operating systems that the chat services don't want to support officially.
3. These third party applications might let you do cool things the official programs don't.
4. One of these cool things is to collect all your "buddies" into one list in one program, instead of having 10 different chat programs open all the time.

ICQ was open from the start. AIM was unofficially open from the start; AOL even quietly produced a Unix client in addition to their Windows and Mac clients. Microsoft was very closed, not allowing anyone else to access their network and thwarting 3rd-party tools at every turn. Nobody much cared, since most people savvier than the average consumer didn't use MSN chat to begin with. But AIM combined with ICQ had a huge market share compared with MSN at the time. Microsoft wanted to be able to connect MSN to AIM/ICQ, since they were the underdog. AOL blocked them at every turn, and then Microsoft, of all companies, started whining to the FCC and everyone who would listen about how AOL was a, wait for it... monopoly! They said that the internet should be free and open, that services should be interoperable, etc. All of this just after they finished crushing Netscape and trying to get every web developer hooked on proprietary IE HTML and ActiveX so that you would need Windows to look at web pages. I agree that the internet should be open and interoperable, but hearing this cry from Microsoft while MSN remained one of the most difficult services to connect to outside of their official applications was laughable at best.

And... one more, for history's sake.
DOS vs DOS
Long, long ago, in a place far, far away, Microsoft was paid by IBM to develop PC-DOS, the operating system for the IBM PC. Microsoft bought a fledgling OS called QDOS, which was a rip-off of CP/M (which was in turn a simplified look-alike of Unix for computers with little capability for things like multitasking or multi-user security). Anyway, Microsoft bought QDOS, branded it PC-DOS, and licensed it to IBM. Then they turned around and started selling their own version, called MS-DOS. It turns out they had the legal right to do so, and someone at IBM probably got fired for not reviewing the contract very carefully. Not too long afterwards, they released MS Windows as an optional add-on that ran on top of DOS. Later, a company called Digital Research released their own version of DOS, called DR-DOS. DR-DOS was better than MS-DOS and PC-DOS in almost every technical respect, and the price was quite low. How did Microsoft respond? They put special undocumented calls into MS-DOS and had Windows check for them before it would load. This ensured that Windows wouldn't work on DR-DOS, and thus anyone who wanted to run Windows had no choice but to buy MS-DOS instead of a competitor's product. Digital Research sued, but the case dragged on in court, and by the time it was settled, it was far too late.

I could continue on, but I think you get the main points by now:
1. Microsoft Windows is not an amazing technical achievement by any means.
2. Some of their business practices to date have been less than scrupulous.

Now, I don't hate the company. It's hard to love or hate anything as large as Microsoft. They are like the government. Any government is very large, has multiple departments, and has done lots of good and bad things in the past.

What I really want to say is that if they come out with some truly great piece of software, I will be the first to recognize it as such. But I'm sorry, as it is, SourceSafe is a piece of junk.
