Take away developers' PCs and send them to Walmart

Why is it so hard to get developers to realize that the software they design is slow, bloated and lacking the "fast and fluid" experience we all would like? End users may not appreciate this about programmers and their mindset, but most developers don't like to use old computers. They love their technology and demand leading-edge PCs.

Maybe it's time to take away their computers and send them to Walmart to buy new ones.


Developers and Their Computers

When programmers start working on software, they have to deal with two problems.

The first is speed of development. If it takes 20 minutes to compile an application before you can even run it, then it takes forever to write software. Programming is often a cycle of write some code, compile, test, then write some more code, compile and test, over and over. I can understand the need to be productive, so the speed of the development cycle is critical to getting software finished in a reasonable amount of time.

The second problem is that today's software often tends to be sluggish because it is so CPU intensive. The 32-bit color displays we use, high-resolution monitors, heavy emphasis on graphics and the large databases we access all put heavy loads on applications. Software has to do a lot of work just to handle the basics, so everything else we add as programmers slows it down even more.

The solution for most programmers to these problems is to buy bigger and better computers.

Simply put (please correct me if I am wrong): developers are rarely satisfied with a cheap PC. Most likely they have the leading edge in computers. Just give them a multicore CPU, enough RAM for three computers and a mean gamer's graphics card, and a programmer is happy and productive. But this is the solution from a programmer's point of view, and there is one big problem with it.

Everybody Else Buys PCs from Walmart

The people who use their software often don't buy bleeding-edge computers (unless they do CAD, 3D games or video production). The average end user, even in business today, buys from Walmart or from manufacturers like Dell -- the more mass-market, lower-cost models.

These computers have lower-end CPUs, and even if the processors are dual-core, they are lower-end dual-core. Integrated GPUs are the name of the game today for mass-market computers, and simply put, they just don't perform like a decent discrete graphics card does.

The Programming Paradox

Programmers develop their software on leading-edge computers, while end users run it on computers with perhaps one-quarter the performance of the developer's machine. This is the programming paradox. Hopefully most programmers have the sense to at least test their software on a low-end computer to make sure it runs well, but even this creates a problem. When the software does not run well, it is easy for a developer to say, "That's what you get for having a cheap computer", rather than take a serious look at ways to improve the software's performance.

The Solution, but You Won't Like It

While I say this partly in jest, part of me wants to say it in all seriousness: why not put away your computers, go to Walmart, buy the cheapest PCs you can find and use them for the next few months? Let's see how well they do then.

OK, Be Realistic

Most programmers will likely have a good laugh at the suggestion, chuckle and then move on. But guess what? I am partly serious. Not only do I suggest this, I live by it. I have been developing for a long time now (I started in the 1980s), and I design and develop all my software on mass-market PCs. I do all my work on a low-end computer running a previous version of Windows, and then test the software on newer, faster PCs with a more current operating system.

When I tell other programmers this, they usually laugh, since most would not work this way. But I have done it for years, and it changes how you look at the software you develop. Performance becomes a key issue. Currently, I work on a Windows XP computer with a 2.5GHz Celeron CPU and 768MB of RAM.

I then test my software on my Windows 7 (and Windows 8 beta) computers. But even the more current computers I have are not leading edge; they are in the mass-market category. I did add 3D graphics cards to two computers recently (Windows 7, Vista Home Basic/Windows 8) so I can test some OpenGL 3D work I have been doing, but even there I only purchased as inexpensive a 3D card as I could find. Both were in the $50 range (normal retail price), so they are very low-end graphics cards.

Why? Really! Why?

Developing software, especially if you lean towards a more agile style of development, involves a lot of coding, compiling, running and testing -- over and over again. I need to know how my software will run on the average computer immediately, not after some development time and later testing on a low-end computer.

I feel that if my software runs well on a low-end computer, then guess what happens when end users run it on better computers. Simply put, "it flies" -- "fast and fluid" or whatever you want to call it. If programmers spend a day working on their software on a high-end computer and only test it at the end of the day on a low-end, mass-market computer, it is much more difficult to go back and find what slows things down.
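To make that concrete, here is a minimal sketch in Python (not the language I work in, and the "report" workload is purely hypothetical) of the kind of quick timing check a developer can run during the code-compile-test loop, rather than discovering the slowdown at the end of the day:

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn once and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Hypothetical workload: building a large text report line by line.
def build_report_slow(rows):
    out = ""
    for r in rows:  # repeated concatenation can re-copy the growing buffer
        out += f"row {r}\n"
    return out

def build_report_fast(rows):
    # single pass, one final join
    return "".join(f"row {r}\n" for r in rows)

rows = range(50_000)
slow, t_slow = timed(build_report_slow, rows)
fast, t_fast = timed(build_report_fast, rows)
assert slow == fast  # same output, different cost
print(f"slow: {t_slow:.4f}s  fast: {t_fast:.4f}s")
```

A check like this takes seconds to run on any machine, which is the point: on a low-end PC the difference between the two versions is felt immediately, not at release time.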

The Lesson

While I honestly don't expect most programmers to work the way I do (I am realistic), there is a lesson in this, which may benefit them.

Performance in software has a great deal to do with the mindset of the programmer developing it. Can developers put themselves into the shoes of those who will use their software? Even the choices we make in how (and on what computers) we develop the software can make a big difference. Performance should not be an afterthought, but instead needs to be a mindset.
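One way to turn that mindset into practice, even without swapping computers, is to write the target down: pick a performance budget for the slowest machine you intend to support and check it as you develop. A hedged sketch (the budget value and the workload are made-up placeholders, purely for illustration):

```python
import time

# Hypothetical budget: the operation should finish within half a second
# on the slowest machine we intend to support.
PERF_BUDGET_SECONDS = 0.5

def within_budget(fn, *args):
    """Return (result, elapsed seconds, ok) where ok means the budget was met."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed, elapsed <= PERF_BUDGET_SECONDS

def load_summary(n):
    # Stand-in for real work, such as loading and summarizing records.
    return sum(i * i for i in range(n))

result, elapsed, ok = within_budget(load_summary, 100_000)
print(f"load_summary: {elapsed:.4f}s ({'ok' if ok else 'OVER BUDGET'})")
```

The number itself matters less than having one: a stated budget forces the "will this run well on the end user's PC?" question into every development session.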

Photo Credit: trekandshoot/Shutterstock

Chris Boss is an advanced Windows API programmer and developer of the 10-year-old EZGUI, now in version 5. He owns The Computer Workshop, which opened for business in the late 1980s. He originally developed custom software for local businesses. Now he develops programming tools for use with the PowerBasic compiler.

58 Responses to Take away developers' PCs and send them to Walmart

  1. MMurcek says:

    Like anything that makes total, crystal clear sense, this column is going to draw a lot of flak, but, Chris, you hit a home run here...

    • chrisboss says:

      Thank you. It appears only 1 in a hundred appreciate my thoughts on programming, but that's life I guess.

      • olivierc says:

        Then maybe you should ask yourself whether your thoughts are good, or whether you are presenting them the right way.
        I kind of agree with you for the most part; it's just the way you are saying it that sounds completely from the past. You are mixing everything together, and not in a good way, which makes it sound like you actually have no idea what you are talking about.
        Not all software is bloatware. You can find poorly programmed native software that runs a lot slower than managed software.

  2. smist08 says:

    Looking at Walmart's website, I see many quad-core systems, some with 16GB of RAM. These new computers look a lot better than my couple-of-years-old developer computer (which wasn't even that good back then).

  3. jfcarr says:

    I'm really surprised by the tone of this article, since Mr. Boss is a software developer himself and should know better. I absolutely agree that software should be tested on low-end hardware, but the notion of developing on the same is ridiculous. Software developers are running resource-intensive IDEs along with debugging tools, local database instances, etc. They have high-end machines because they need them, not because they're sloppy or lazy.

    • StockportJambo says:

      I agree, but Mr Boss has shown himself to be a bedroom hacker with little or no current experience of software development in the real world. He doesn't work in teams, he doesn't understand the benefits of a modern IDE, and he has no concept that an application is composed of many parts which may exist outside of the sandbox, parts the developer has no control over. Mr Boss is out of touch (as this article once again proves), so in all honesty his opinions mean little.

      • olivierc says:

        Well, to be fair, he has a point about testing on a lower-end machine (or the minimum-spec computer you decided on before building your software). And I do not think "bedroom hackers" are using BASIC anyway :)

  4. psycros says:

    Actually, if you're talking about high-end games or graphic design apps, those people do NOT buy the $500 Wally World special. Most serious gamers either build their own machines or go to Cyberpower or a similar outlet. Photoshoppers also aim high when it comes to PC purchases, typically getting a top-end rig from Dell or another major label. Still, for the majority of users you're absolutely correct - many developers have loooong been writing their newest versions of everything around the next version of Windows rather than the most current one... and they often do little or no testing on previous versions. As you also point out, they usually develop on hardware that is well beyond what the typical user will have. In their defense I will offer this: the price-to-performance ratio of PCs is getting a little better every year. This is mostly because newer machines from major manufacturers are being equipped with more serious innards than they once were. Ten years ago a laptop was unsuitable for virtually any sort of gaming beyond Solitaire; now they come with mobile versions of mid-range AMD and Nvidia cards, SSDs and 4GB of RAM or more. 64-bit OSes are the norm now on most new machines. Still, there's no excuse for inefficient code that's inevitably developed and benchmarked on bleeding-edge computers.

    • woe says:

      "Photoshoppers also aim high when it comes to PC purchases, typically getting a top-end rig from Dell or another major label."
      Photoshop users don't need anything special, basically more RAM on a PC with a quad core. Quad-core PCs are practically the norm these days. Best Buy has cheap desktop PCs that come with i5s and 6GB of RAM to start off with.

  5. Adas Weber says:

    Wow, it's so easy to put all developers into the same pot!

    I will comment from my own experience in terms of how I work...

    Firstly, I don't use a cutting edge computer for development work. It has standard onboard graphics and nowhere near the fastest processor. It has 8GB RAM because I often need to manipulate large images when processing photos.

    I also have several older computers which I use for testing my applications. This is because most of my applications are for businesses that run old hardware and operating systems, so I need to make sure my applications perform well.

  6. Mark Archer says:

    What a totally backwards way to identify and address bottlenecks in an application. It's like trying to build better roads by using less powerful tractors. Incorporate performance analysis into the development cycle and the same kind of benefits will be achieved (probably to a much greater extent) without the hit to productivity that the developers would take.

    I'm a software developer and I've been working on a Dell laptop for the past two years because the local government I work for is too broke to buy anything else.  I make it work but I also spend at least 5% of my time waiting on my laptop.  That's a whole lot of wasted money. 

  7. It blows my mind how little developers generally know about proper testing, myself included.

    Don't unis teach this stuff?

    • StockportJambo says:

      Developers are never the best people to test their own code. Beyond basic functionality tests, that task is best left to a QA department & some pilot users. The reason is that a developer will know how to make their software work and go through a workflow on that basis, subconsciously or otherwise. A 3rd party will do things out of the "norm", which is what ultimately uncovers bugs.

    • woe says:

      Users are beta testers... I use Microsoft enterprise software every day.

      Just this last week I got to download an "undocumented" hotfix for DPM 2012 from Microsoft after being on the phone with them for an hour. We could not back up SQL databases with mixed-case names... even though it worked just fine with DPM 2010. Beta testing!

  8. chrisboss says:

    What I suggest in this article is "partly in jest", and I really don't expect programmers to use less powerful computers (if you wait too long while compiling, valuable time is obviously lost, plus Visual Studio surely won't run too well on a low-end PC). Yet there is a lesson in this story. I should note that the reason I can work on a lower-end PC is that the compiler I use compiles so fast that I rarely have to wait more than 30 seconds. The compiler itself was written in assembler and it doesn't waste any time. I know some compile source code files in the hundreds of thousands of lines, so don't feel bad if you need a fast PC. The largest source code (a DLL) I have to compile is about 63,000 lines, and it compiles in about 10 seconds on my PC (my 9-year-old eMachines running XP).

  9. malin says:

    Performance probably matters in fewer than 1 in 50 apps.

    There is some truth here - nearly half of users still have XP, and many have ancient machines. More and more users are using slow tablets and smartphones. Some software performs terribly on typical machines - it is more fun to bang your head on a wall than to run OpenOffice or LibreOffice on a typical user's computer, and the majority of antivirus programs seem to be written by people who have no idea that a user might care if their machine runs like molasses.

    But those programs are the exception, not the rule. Most software, even written with a bloated compiler, runs much faster than anyone needs it to, even on a 10-year-old XP machine. An efficient language could help smartphones and tablets, but you don't have anything to code for Android anyway, right? In most cases any language and IDE is fine, and a developer may need a super-powered machine to get it to compile fast. Obviously they should test the app on slow machines.

  10. StockportJambo says:

    Once again you are totally out of touch & once again your arguments have more holes than a paper doily.

    The computer I use at work is a very basic Dell desktop, at least 4 years old, with on-board graphics. The only thing that is unusual is that I have two monitors - that's simply so I can have the IDE open in one and the website I'm writing / debugging open on the other. That's it.

    Guess what?

    I've never had a client complain that "it's too slow". And I don't really put much thought into it either.

    Know why?

    Because it's "fast enough" on its own, and there are far more important things to worry about when developing software.

    Stability, functionality, ease-of-use, scalability (the sites I'm involved in typically have around 3,000 users hammering it constantly for 8 hours a day, sometimes 24 hours a day if it's a global client)... these are the things that actually matter, both to us and the end user.

    I'd love a new computer from "Walmart" (not that we have such a beast here), because it would undoubtedly be faster & better than what I'm currently using, but it doesn't matter if I don't. My 4+ year old bottom of the range Dell machine works fine.

  11. olivierc says:

    It is good that you understand (finally) that a modern computer has way more to do than your beloved CPC / DOS / Win95 or whatever you were (are) using, but you still miss the data transfer rates (PCIe, USB, Bluetooth, whatever connects to your computer) and the security (sandboxing, antivirus, firewall, etc.). This stuff, whatever people say, actually puts a lot of stress on the CPU, thus making everything slower.

    The rest of the article kind of makes sense. It is very common practice to define the minimum requirements for your software, and start testing on them.

    For the rest, well, it is your usual "look at me, I'm so good" and "BASIC is the language of the future of the past".

  12. chrisboss says:

    It should be noted that there are many different scenarios in the programming world, and no description of how it is done would fit all of them. Some work in a manufacturing environment, or a large organization, or on teams, while others work alone. I am a sole developer who writes tools used by a variety of other programmers, so my experience will obviously differ from many others'. That said, all one has to do is look at the minimum requirements of much of today's software and it is obvious that most applications are resource hungry. Anti-virus software is actually a good example, but a good bit of mass-market software demonstrates this problem. Even if an application runs reasonably well on a mass-market PC when run by itself, Windows is for multitasking, and when running multiple apps (or even multiple instances of the same app) one can quickly see how much of the PC's power is lost to resource-hungry apps. Maybe it's because I have been programming since the DOS days (can you say 25MHz 386 CPU?), but I don't see as significant an improvement as there should be in the performance of some of today's software, now that we have computers which are 80 to 200 times more powerful than those I used then. Even Herb Sutter, in his talk "Why C++?", used the term "pathetic" when referring to the industry's lack of concern for performance. This is why many at Microsoft are getting back to C++ rather than managed languages. Personally, I feel that one can still write software in a productive way (RAD) and still get performance.

  13. Douglas says:

    Everyone buys their PCs from Walmart huh? Then who is keeping Tigerdirect afloat, buying Alienware, or all of those graphics cards and extra sticks of RAM?

    Sure, there are people out there with 6-year-old PCs that were simple, baseline models even at the time and have never been upgraded since. But that's not your average user, and that's not a user who is likely buying much new software either.

    Even without the people who commented here, who are actual developers, poking holes in your presumptions about their business -- why would you want software to be developed only for the lowest common denominator?

  14. MMurcek says:

    Well, if this column is so off base, why do so many of you pee yourselves over Raspberry (got that part right) Pi? How many of you brag about running Linux on an old 386SX with 256MB of RAM? Everyone can't wait for quad-core and more ARM CPUs for their phones; I guess that single core ain't getting it done, eh? Tablets (much less powerful than an old Core 2 Duo CPU) are going to kill off PCs. Which is it? Probably all of the above. I noticed that the supercomputer guys have not packed up and gone home, so maybe the iron has to be up to snuff for the job? Just a guess...

    • olivierc says:

      What kind of software are we talking about? You are mixing supercomputers, which are used for highly demanding computational software (solving complex equations that require power), desktop software, and mobile software. Granted, the latter two will merge at some point, but I really do not see your point here.

  15. chrisboss says:

    First, Walmart in this article is simply a name everyone is familiar with which exudes low-cost, mass-market stuff. No matter where one buys their computers, price matters today, so end users are more likely to be purchasing lower-cost, less powerful computers, or for some, simply stuck using the same computer for many years and unable to buy the latest PC. Wonder why Windows XP is so popular today, despite being a 10-year-old OS? Most businesses likely have new computers, but also a good number of PCs with some age on them. In a difficult economy, businesses are also more likely to look for cheaper hardware. Also, many reading this article are likely "techies" -- in IT, programmers or the like. There are millions of end users of software today who are just ordinary people, whether home users or small business users. In many small towns today (I live in a rural area) the local computer store has disappeared (can't make it), so believe it or not, many people, including businesses, are buying their computers at the local Walmart. If you check out your local Walmart (and not online, which has more choices), computers range in the $300 to $700 price bracket and usually come with 2 to 4GB of memory. Also, the tablet PCs of today, because of the restrictions forced by the form factor, have comparably the same power as a low- to middle-end Walmart PC or laptop.

    Lastly, even in business environments with slightly better PCs, is it really true that the majority of software today performs extremely well? Or is it possible that "acceptable" performance is what most are getting? Just because software may be "acceptable" in performance does not mean it is what end users deserve or desire. There are many computerized tasks in business which, by their nature, just take time. But if an app does a complex task in 10 minutes when better software could do it in 1 minute, does that not improve business? Call me old fashioned, but I do believe performance is vital.

    • olivierc says:

      OK, can you give examples of software that does not perform as it should?

      • chrisboss says:

        Windows XP service packs!

        Windows XP was pretty fast. Service packs 1 and 2, improved it and only slowed it down a bit. Now service pack 3, that is a different story. XP users upgraded to service pack 3, so they can continue to get updates. Yet, it significantly slows down a PC and adds significant size to the operating system.

        On another note: does anyone remember what a CD is? There was a time when software came on floppies. The CD was appreciated, since you could have just one disc rather than a dozen floppies. But when did software have to start coming on DVDs? And it isn't all just a bunch of graphics and videos and stuff. The Visual Studio download is huge compared to anything I usually get. I can see the operating system being so big it may require a DVD, but our programming languages? Does that not say anything about the state of software development?

        Now when I downloaded the VS 11 beta, I installed it on the computer I had installed Windows 8 on. This had been a Vista computer with 512MB of RAM and a Celeron CPU, so I figured it would be good to upgrade it before putting Windows 8 on it. I put a decent Pentium Dual-Core CPU in it and 2GB of RAM (and later a 3D video card). I figured I had the minimum specs for Windows 7/8 (2GB of RAM) and a decent dual-core CPU. Visual Studio runs terribly slowly on this computer.

        Good old Visual Basic 5.0 Pro (which I have) would fly on this PC.

      • olivierc says:

        x Visual Studio 2011: it is beta software, probably built in debug.
        x Windows XP SP3: I have no idea.
        x Size of Visual Studio: it actually contains debug symbols and source code for the CRT and other stuff, documentation for everything, and depending on the install it also contains SQL Server, Team Foundation Server, etc. You can choose, when installing, what you want to have.

  16. MMurcek says:

    Uh, here's an example of Chris's point, I think...


    • olivierc says:

      Well, this article is mainly about old hardware being rock solid, not really about software (sounds pretty basic). So yes, today we have something called "forced obsolescence", where companies set the lifetime of a component to the lowest possible value so that they can sell more. This is a very basic capitalist mechanism and has nothing to do with software.

      • chrisboss says:

        Actually it does have a lot to do with software. Computers and software go hand in hand; one does not live without the other. Computers (and software) are an investment. The longer a computer can be used, the more one gets from that investment. If buying a new computer increases productivity, then it may be worth the new investment (if it does the job twice as fast, the investment is offset by the increase in productivity), but if it does not, then why throw away the previous investment?

        A business can succeed or fail, and how it uses its investments can drastically affect the outcome. Many companies who still use Windows XP likely don't do it because they are "old fashioned", "not up to the times" or "just don't get it"; it is likely because they have an investment in these computers and it is still paying off (the computers do the necessary work, and do it well). Why invest in the "latest" computer when the old one still works great and does a great job?

        There is value in "a penny saved is a penny earned". If programmers can extend the life of the current PC base in a company, then the company benefits. New PCs can then be added when it really makes sense and truly adds value. Saving money now, by designing software which performs well even on legacy computers, frees up money so it can be spent where it really counts (maybe the CAD dept. could really benefit from the latest leading-edge PC). Embrace the "old" and "valuable", and know when is the best time to use the latest "leading-edge" technology.

      • olivierc says:

        Yes, companies are not switching OS for no reason, which sounds logical, but what does it have to do with what you were saying previously?
        Generally, companies are not upgrading their software either, so what runs on a given PC will mostly remain the same for its lifetime.
        I am not even talking about company IT policies that forbid users from upgrading their software (even for security reasons).

        It has really nothing to do with optimization.

        And you still did not give an example of software you find clunky.

      • MMurcek says:

        OK, let me turn your request around. Not a piece of software I find clunky, but a piece of software I found so useful that I kept on using it until newer versions of Windows stopped being compatible with it: a Wang scanner package called ImageIn. It worked with lots of scanners and output several formats. It was still useful after Wang went out of business; then Windows moved on and it no longer ran.

      • MMurcek says:

        Some day that hardware will fail. I'm sure they'd rather keep running the software that does exactly what they need on a DEC emulator than pay through the no$e to have new software written.

      • olivierc says:

        OK for your two replies, but what do they have to do with this article?

        I understand that companies do not want to pay to replace software, or that you did not find any software with the same capabilities as the one you mention, but did you try a lot of them?

        I used to work with embedded software, and I can tell you that some of the software I had to fix, made back in the 90s and running on DOS, was slow as hell and very buggy.

  17. So don't use their software until they make it run right on your machines. If no one uses it, they will abandon it, or by the time they fix it, it will be too late and you won't need it, because there are alternatives to all software -- faster, cheaper, free or whatever. We do not have to settle for bloatware anymore. I do not download anymore from places that wrap their downloader around software, like CNET or Brothersoft -- they can kiss my ----- Well, that's enough. If you don't want it the way it is, look for alternatives. Don't use Google either; I don't use Google for anything.

  18. GoustiFruit says:

    Or use VirtualBox and test your software on different specs.

  19. LRN says:

    This won't work for me, at least as far as the time spent running development tools is concerned. If anyone forces me to use a less powerful machine, I will either:
    A) Switch to Python (no compilation)
    B) Switch to Go (ungodly fast compilation)

    As for detecting performance issues early... well, it's a good point. OTOH, if you wrote some code that is not optimal (performance-wise), and you wrote it in a way that prevents you from optimizing it later, or if you've developed a suboptimal architecture -- well, you suck, man. Starting on a slow machine will just make you aware of your suckiness somewhat earlier (if you're working on a powerful PC, running your software on a low-end PC is the first thing you should do in the alpha/beta stage, so this moment WILL come somewhere before the release).
    Also, remember that premature optimization is evil.
    So I still think:
    1) Make smart architectural decisions (keep performance in mind).
    2) Make right design decisions (abstract correctly to allow you to change underlying implementation later).
    3) Implement decently (don't do obviously stupid and suboptimal things; or do them, but have a clear idea about fixing them, put it into a comment near the suboptimal code).
    4) Release early, release often, fix performance issues, use profiling tools (hell, even QueryPerformanceCounter will do the trick).

    And finally -- there's a difference between targeting a 15-year-old OS and targeting 15-year-old hardware. Although I guess it depends on which side of the barricades you're on.

  20. DatabaseBen says:

    You do have a point. Developers use state-of-the-art computers only because these make them more proficient during the engineering of software.

    but for the most part, most consumer-ware is ultimately designed and compiled to be used by the average computer user and the average computer. 

    it is the consumer's responsibility to read the specs on the software package to ensure their computer meets the minimum requirements. most of the time, an average computer can become more powerful for running higher-end consumer-ware by simply adding more RAM or a state-of-the-art video board.

  21. 1DaveN says:

    IMO a big part of the problem is that there is no one to give consumers accurate, unbiased information about PC purchases, so they base their decisions on price.

    A better PC will cost you more initially, but you will enjoy it more and be more productive every minute of its life.  Better quality components will mean fewer hardware issues, and a higher end PC is likely to last two years longer than a cheapo (maybe even three).

    I just talked to a guy who told me that after using a Mac at work, he really knows how bad his home Windows PC is, and that he'll never again buy anything but a Mac.  Digging in a little, it turns out the comparison is between a $350 PC from Staples and an $1800 Mac.  If this guy had had a better understanding of how to buy a PC, he would have spent $700 or $800 to be happy, instead of $350 to be miserable.

  22. Hall9000 says:

    Maybe I'll be off track, but here goes. Remember the first PCs we had? Most programs didn't even fill a 5-inch single-sided floppy! Yet they were complete standalone programs, from games to utilities like spreadsheets and such. Memory was basically a computer brain fart. :P So, how did those people even start writing software that actually worked and was fast? It's that kind of talent that is missing today. Today's developers see computers with huge RAM and feel they have to use it all just because they can. Same for storage. 10TB of storage available? Then why not write something that will fill almost all of the damn hard drive. Don't send them to Walmart; send them an old CoCo 3 or an old Amiga and make them learn how to write compact, fast programs on them.

    • olivierc says:

      Programs nowadays use a lot of graphics, at different resolutions.

      Take a basic example: a mobile device running Android. Each icon or graphic has to be supplied in four different sizes to cover all the existing devices. Android does not natively support SVG (or any vector format), so you need to use PNG.

      It is the same on iOS. Take the iPad, for example: the latest one has a resolution of
      2048x1536, so you need to re-make your icons and graphics to support this resolution as well. You can go for an all-text application, but then users complain about how poor your application looks. Old software used no icons, or very simple ones at a fairly low DPI. This is a tradeoff: users want cool-looking applications, so as long as vector formats are not supported, applications will get bigger and bigger.
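      To make the size math concrete, here is a small sketch in Python. The scale factors are Android's standard density buckets relative to the mdpi baseline; the function name is my own, used purely for illustration:

```python
# Android's standard density buckets and their scale factors
# relative to the mdpi baseline (per the Android documentation).
DENSITY_SCALE = {"mdpi": 1.0, "hdpi": 1.5, "xhdpi": 2.0, "xxhdpi": 3.0}

def icon_sizes(base_px):
    """Pixel dimensions one square icon needs in each density bucket,
    given its baseline (mdpi) size."""
    return {bucket: round(base_px * scale)
            for bucket, scale in DENSITY_SCALE.items()}

# A standard 48px launcher icon becomes four separate PNG assets:
print(icon_sizes(48))  # {'mdpi': 48, 'hdpi': 72, 'xhdpi': 96, 'xxhdpi': 144}
```

      Every bitmap asset in the app gets this same multiplication, which is one reason PNG-based apps grow with each new screen density.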

  23. Kushan says:

    There is a world of difference between a machine capable of running software and a machine capable of developing it. Besides, it's not the programmer's job to decide how efficient the development process should be; that's down to their manager or the company as a whole. If you give them slower PCs, development will take longer and thus be more expensive for the software house. That's money the company might not have, or money that could be put toward testing. Any software house worth its salt has an independent testing team that tests the application on a variety of hardware. If some machines run it too slowly, it's up to the software lead to decide where the "cutoff" point is -- after all, if the software doesn't run on someone's computer, it's a lost sale.
    Or to put it another way: for someone with that much experience in software development, the author doesn't seem to know a great deal about it.

    • chrisboss says:

      The point is not what PC one develops on. It is the mindset of remembering what kind of PCs the software will run on. The lesson here is the proper "mindset," not the PC.

  24. gerryf says:

    I absolutely agree with the point of this article -- and I have this conversation at least once a week when it comes to antivirus/security software. Each year Norton, McAfee, Kaspersky, Trendnet, etc. release a new version of their latest, greatest malware-defeating software, and each year it gets bigger and more resource intensive. There is still a huge install base of Windows XP computers with 512 MB of RAM and old single-core Pentium 4 and early AMD processors that cannot hope to run this software well. Even first-generation Vista machines were often running with 1 GB of RAM, and that is not even enough for the OS, let alone a bloated CPU and RAM pig that kills the performance of everyday tasks.

    • olivierc says:

      Antivirus software is bound to burn CPU cycles and bandwidth; it basically hooks the kernel for every single file operation you do. Some of these products also scan memory, the registry, etc. in real time.
      There is no way to do that without impacting performance, at least a little. This is the tradeoff for security.
      Granted, the UI on this software can be a little clunky (even if it is getting better and better).

    • Bob Grant says:

      That's part of why I don't use those and go for Avast instead.

  25. DigitalSin says:

    This is a really good point Chris, and I hope this mindset gains a bit more traction across the development community. 

    My development machine is a beast, and while it's partly like that because I do a lot of gaming on it, the larger reason is because Visual Studio is a resource hog in the first degree. Every iteration of VS gets more memory intensive, when 95% of the time the only thing I'm really using it for is the code editor. 

    Your development tools should be the fastest apps on the planet, since we go through the write -> F5 -> test-and-fix process about a thousand times a day during heavy development cycles. Instead of making the tools faster, Microsoft lets them get more bloated and expects you to just buy a faster machine. Such a wasteful mindset, and it's sad we put up with it.

    • olivierc says:

      Which language are you using? I find the C++ compiler in VS2010 quite fast.
      The environment can be tweaked as well: for example, you can disable IntelliSense, which takes quite a lot of resources (it is better in the latest service pack, as it seems they had a pretty nasty bug).

  26. ToeKneeC67 says:

    Depending on the programmer, they are using cutting-edge hardware. We used to joke around with our programmers at one company (60 of them) that we were just going to replace their hardware with a more common end-user-spec PC. However, when you are paying 60-80k a year (plus benefits), time is money, and it never happened.

    On that note, it's the one advantage of controlled hardware, like an Xbox or iPad -- maybe this will change if tablets take off the way I predict.

  27. bibleverse1 says:

    The overall article makes a good point. Would this mindset reduce development costs?

    • Robin Watt says:

    No. The nearsightedness of this article overlooks the business model, the future path, and the costs associated with it.

      • chrisboss says:

        Maybe this principle won't work in every situation; I can give you that. But common sense suggests that making software which can run on less powerful hardware can produce significant savings for a business.

      • chrisboss says:

        (1) If it takes 20 minutes for a development tool like Visual Studio to start up on a reasonable PC, then something is definitely wrong.
        (2) Do not think I don't care about productivity. Programmers should be able to "code fast," "compile fast," and have excellent high-level libraries at their disposal (never having to reinvent the wheel). "Time is money," as they say.
        (3) I strongly feel that RAD tools are important. But when a RAD tool becomes a distraction rather than a help, then again something is wrong.
        (4) I don't currently use Microsoft programming languages. The last one I used to any degree was Visual Basic 5.0 Pro. When I recently downloaded the Visual Studio beta for use on Windows 8, I was not very pleased. It was so distracting and slow that I could not see how a programmer can be productive (of course, those who know Visual Studio like the back of their hand are likely to be quite productive).
        (5) I write software so it can be used on multiple versions of Windows, rather than just the current one. What I write can run on anything from Windows 95 to Windows 8, a larger range of OSes than most programmers write for.
        (6) In my own work, I will not accept anything but "small (small footprint), fast and reliable." This has proven to be a real benefit when writing software which runs well on the current generation of Windows tablet PCs. You don't need a powerful CPU (an Atom will do), a lot of RAM, or a lot of valuable disk space (SSDs tend to be small compared to hard drives).
        (7) Code reusability is a high priority for me as well. I write library code which other programmers use.
        (8) I believe in short (fast) development cycles. Rather than taking months or years to develop an app, my goal is to make it possible in days or weeks. My own customers depend on the tools I write so they can develop apps fast.

    • chrisboss says:

      Development costs have more to do with the development system as a whole. For example, a programmer who is an excellent debugger is well worth their pay. Debugging is an art, a skill, and no development tool can eliminate the human component. But development tools which require less PC power to run, and which produce apps which require less PC power, can save large amounts of money. Large companies would have to spend less on computers, both to develop apps and to run them. Computers would have a longer life span, because they wouldn't have to keep up with development tools that outgrow them.

      I have been using my trusty Windows XP computer for 9 years now, and I can still develop all the software I work on with it. Because I don't have to keep buying a new PC every few years, when the time comes that I really do need one (i.e. I needed a tablet PC for current testing with Windows 8), it is more affordable. A good example of big savings for me was the ExoPC tablet I purchased. I got a decent Atom-based tablet for $399, while developers who want to work with Windows 8 may opt for powerhouse tablets in the $1000 to $1200 range. All my software runs great on Windows 8 on this tablet.

      Leading-edge computers always command a premium price. If companies can get by with slightly-above-mass-market specs, they can buy more hardware than if they are forced to buy leading-edge computers just to run (or develop) their software.

  28. Robin Watt says:

    Really? You do have some valid perspectives; however, let me explain something you've overlooked. Most developer tools are not low-end products; they have heavy RAM and CPU requirements. Developers cannot run these tools on a low-end system, and if they tried, they'd eat up more development time than it would take to write a decent application.

    No, I'm not a developer. I'm a system administrator who provides for them, and when it takes a developer over 20 minutes to start up Visual Studio 2010 because some manager thought like you, I then have to spend my time going back to that manager, explaining the idea of Total Cost of Ownership, and showing them how much of the company's time and money they are wasting because they chose to save a grand here or there by providing substandard systems to develop on.

    It's a nice notion; however, you can't seem to see the forest for the trees. You've lost sight of the core reason why developers work: to make money, not to waste it along with company time and resources.

    Now, I can see testing products throughout the life span of development on target end-user equipment; however, there is no need to cripple your staff with substandard equipment unless you (the end users and upper management) are willing to wait and pay the cost of either not delivering on schedule or losing customers to the excessively long wait to bring products to market.

    • chrisboss says:

      The real question is: "Why are development tools such resource hogs to begin with?" I recognize that programmers need more powerful PCs because of the development tools they use, and I would never suggest that someone who uses Visual Studio, for example, try to run it on a low-end machine. My real question is why they get stuck with development tools like this in the first place. I use a programming language which is only a few megabytes in size, compiles amazingly fast (the compiler was written in assembler), and produces apps which can fit on a floppy disk. One does not have to give up RAD tools and powerful libraries to have a powerful development system. The point of this article is the "mindset," not the tools or even the computers we use. Why do we as programmers think "bloat" is acceptable, in our development tools and in the software we develop?

  29. chrisboss says:

    I should point out something which was not touched on in this article: the quality of software has a lot to do with testing by the developer. It is easy to pass testing on to beta testers, but I am a firm believer that the programmer should do the primary testing. The way I write code is to write small sections and immediately compile, run, and test. I don't want to build new code on previously buggy code; I aim for bug-free code if possible. The speed of editing, compiling, and running tests is critical for fast and reliable development. When a developer has a development environment which doesn't get in the way of coding but enhances it, encourages writing code quickly, and compiles lightning fast -- so you can code, compile, run, and test over and over to produce bug-free code -- then you have a winning tool.

© 1998-2020 BetaNews, Inc. All Rights Reserved.