  • A Troubling Look at Web Standards and Desktop Publishing

    November 28th, 2007

    First-time Web developers are probably surprised to learn about all the silliness you have to endure to get anything more than the simplest site to look good in the various browsers. Sometimes it becomes a total nightmare, where one browser makes everything look perfect, while another mangles your tables, and text and pictures overlap or don’t appear at all.

    Worse, fixing one problem creates yet another in the original browser. It’s a vicious circle, and one I have confronted often. In fact, when our original Webmaster, Brent Lee, updated all our sites last year, he often had to write separate code strictly to accommodate the eccentricities of Internet Explorer. You see, Microsoft has its own bright ideas about how Web standards ought to be implemented, and they don’t always follow the rest of the industry. In a sense, they want you to accede to their point of view, rather than the other way around.
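
    One common example of such special-casing, though I can’t say it’s the one Brent used, is Internet Explorer’s own “conditional comment” syntax, which every other browser treats as an ordinary HTML comment. A minimal sketch, with invented file names:

      <!-- Every browser loads the standard stylesheet -->
      <link rel="stylesheet" type="text/css" href="main.css">

      <!-- Only Internet Explorer (here, versions before 7) reads what is
           inside a conditional comment; everyone else skips right past it -->
      <!--[if lt IE 7]>
        <link rel="stylesheet" type="text/css" href="ie-fixes.css">
      <![endif]-->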

    Even if you overcome the Internet Explorer equation, it’s a huge juggling act to get everything else to work properly among Firefox (and its derivatives), Opera, Safari and lesser applications. At the end of the day, you have to compromise big-time to get things looking as good as possible, and I have to say I’m not always happy with the end results.

    To make matters all the more confusing, some sites are designed to work strictly in Internet Explorer, because they depend on some Microsoft-only feature, such as the ever-insecure ActiveX. In one case close to me, a client who worked in the real estate industry was forced to use Windows in order to access a multiple listing site to which the firm that employed her subscribed. Fortunately, that site eventually added support for Opera (but still not the other browsers), and she was able to abandon Windows for good.

    Despite all these headaches, each browser developer touts its fidelity to Web standards and how well it does at various canned performance and rendering tests. Now maybe that’s true, but things don’t always play out that way in the real world. You see, as soon as you add special applications, such as the ever-popular WordPress blogging software and various forum systems, all bets are off. Install a few modifications and special themes, and the situation becomes even more complicated.

    Let’s compare that to the desktop publishing world.

    Back in the 1980s, Adobe introduced the PostScript page description language, which defined the characteristics of the printed page in mathematical terms. The end result was that, save for an output device’s limitations in print resolution and color quality, documents would almost always reproduce with near-perfect fidelity. Everything was predictable; well, so long as everyone used the same fonts, meaning the same versions from the same vendor.

    The PDF format that also lies at the core of Mac OS X even allows you to embed fonts and illustrations in a document. It’s also an industry standard, so that near-perfect precision can be consistent even across computing platforms.

    Indeed, Apple and Adobe, together, made the desktop publishing revolution possible over two decades ago. While there had to be a few compromises along the way, the day that traditional typographers and graphic artists gave up their old-fashioned tools and bought Macs was the day the publishing world changed for good.

    Contrast that with the Web, where everything is approximate, and absolute precision largely remains an unfulfilled dream. Sometimes I wonder if the various browser developers and Webmasters understand that every user, from a Web-based business to the consumer, suffers big time because of this absolute standards disaster.

    Sure, it’s possible to ensure fairly straightforward compatibility if you lower your standards and keep your sites simple, without the flourishes that separate greatness from mediocrity. That, however, would reduce your presentation to the lowest common denominator, and not allow you to take advantage of the best so-called “Web 2.0” features that everyone’s touting. If you do choose to embrace them anyway, you’ll work ten times as hard to make everything function among all the browsers without breaking too many things.

    So is there a real solution to this mess, or just more excuses?

    Personally, I think the Web industry needs to support a true PostScript for the Web. That means a precise mathematical language that allows sites to render identically across all browsers and computing platforms that support the standard.

    You wouldn’t even need to master text-based coding, which ought to be a relic of the 1970s, to ensure absolute precision; just use your favorite desktop publishing application. And it would mean that text, graphics, tables, Flash banners and all the other goodies we put on our sites would always look good, and exactly the same, regardless of which browser we prefer.

    Even better, you could use just one document, unaltered, for both online and print content. Instead of reinventing the wheel to provide the online analogue of a printed page, you’d prepare your document once and deploy it anywhere without changes, except, perhaps, to insert links and special online banners. Even then, the links could be a native part of the document that would simply not appear in the printed version.
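
    To be fair, today’s HTML and CSS already gesture in this direction with media-specific stylesheets, though the rendering still varies browser to browser. A minimal sketch, with invented file names:

      <!-- One document, two presentations: screen.css applies on a
           monitor, print.css applies when the very same page is printed -->
      <link rel="stylesheet" media="screen" href="screen.css" type="text/css">
      <link rel="stylesheet" media="print" href="print.css" type="text/css">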

    In fact, you can get a good idea of how this would work from the online version of a PDF file.

    Do I make myself clear? Or am I whistling in the dark here? Is a variant of PDF the practical solution to total print and Web integration? Or will Microsoft fight Adobe and the rest of the industry tooth and nail if their proprietary standards aren’t adopted? I wonder.




    33 Responses to “A Troubling Look at Web Standards and Desktop Publishing”

    1. Tim says:

      I wouldn’t mind the Explorer-only pages so much if Microsoft would resume support for OS X Explorer. The “I.E. only” page I need no longer finds I.E. 5.2.3 acceptable. Opera seems to have dropped the “self-identify as I.E.” option. Sigh. I can still go to the library.

    2. Dana Sutton says:

      The idea of “a true PostScript for the Web” is a great one — at least if it can be implemented in such a way that browsers would be backwards-compatible and still able to display our current pages. Unicode serves as a good illustration of what happens when a single standard is universally adopted: thanks to Unicode, it is now possible to display data such as non-Roman alphabets and mathematical, logical and musical notation on the Web in such a way that it can be read by any browser on any platform, no matter where it was created, and be exchanged between different platforms (so that, say, a Word document with some Classical Greek embedded in it, created on a Mac, can be opened and read by Word on a peecee). Before Unicode came along, the situation was chaotic: it was impossible to display such data on the Web accurately, and documents containing such material could only be read on the platform on which they were created. This improvement is hugely beneficial for all hands — and it only came about because a lot of corporations (not least Microsoft) managed to suppress their egos and private ambitions and pull together to make it happen. Unicode serves as a kind of model for the universal page-description language Gene is suggesting.
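
      A minimal illustration: declare the encoding once, and any Unicode-aware browser on any platform renders the same characters (the sample text is arbitrary):

        <!-- With UTF-8 declared, the same bytes display identically on a
             Mac, a peecee, or anything else Unicode-aware -->
        <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
        <p>Greek: Μῆνιν ἄειδε θεά · Logic: ∀x ∃y (x ≤ y)</p>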

    3. kenh says:

      No question Microsoft will try to enforce a monopoly on the web, but because of the increased popularity of other platforms, will they be able to pull it off?

      Whenever I see a site that is really bad, I refer them to http://validator.w3.org/, the W3C Markup Validation Service. Of course, whenever I send the results to these sites, I never get a reply.

      Based upon my experience as a non-traditional college student a couple of years ago, most students are not even aware of any web creation software other than FrontPage, and my guess is that many web developers are the same.

      So I guess the answer is still a mixed bag.

    4. Tom B says:

      PDF for the web might be a decent solution in this age of broadband if PDF support on Windows was anywhere near as good as it is on the Mac.

    5. Henry Rzepa says:

      The comments about “postscript for the Web” I think miss the point entirely. The Web was designed (and I was there, listening to Berners-Lee, Raggett, et al in 1994 setting out their vision) to endeavour to separate content from presentation. Something, I might add, that neither PostScript nor PDF does at all well (indeed, at all). The Web, some of us hope, is heading towards a symbiosis between machines and humans, where the former handle the boring bits of devouring millions of pages, and the humans the creative bit of thinking about only a few of them. The way the web developed was also to avoid “postscript for the web” by allowing Web pages to be converted to, e.g., Braille for the visually impaired, and so forth. I wonder whether “postscript for the web” would allow that? Finally, as a masterstroke, they mandated that the web be non-proprietary. PostScript? That belongs to Adobe, does it not? “PostScript for the Web” would probably mean Flash, or some such, also proprietary.

      The bottom line is that the Web is not just a pretty visual medium for human enjoyment, but is heading towards a “semantic” medium where both machines and humans co-exist. PostScript for the Web addresses only the (sight-enabled) human.

    6. Richard says:

      Actually, I think that you’ll find your wish is already provided: there’s nothing stopping you from constructing your ‘perfect’ design, and saving it as a JPEG/GIF/PNG image. That will get perfectly reproduced on all machines. Well, except those using Lynx. Or who have to increase the font size, to make things readable. Or who use a screen reader. Or who are accessing your site from a phone, with limited screen size. Or those people who want to be able to quote sections of text easily.

    7. Henry Rzepa wrote: “The comments about ‘postscript for the Web’ I think miss the point entirely… The Web was designed to endeavour to separate content from presentation… PostScript for the Web addresses only the (sight-enabled) human.”

      My entire argument here relates to rendering fidelity. All browsers should be able to reproduce the same content as it was intended, without alteration, by adhering to a basic display technology.

      It doesn’t necessarily have to be PostScript or PDF. Whatever it is must be an open standard and must allow the same level of user flexibility we have now. PostScript is presented as an example because it allowed for device independence. The next step is browser independence. So if you want to use Internet Explorer, fine, but you shouldn’t have to suffer if a Webmaster doesn’t get around to providing a few tricks so that you can see the content properly. Nor should you suffer because what you do for Internet Explorer makes your site incompatible with Safari, etc.

      Peace,
      Gene

    8. gopher says:

      The real problem I see is websites that don’t conform to W3C standards, and older web browsers that don’t support all the standards. The compromise is http://www.anybrowser.org/, which suggests graceful degradation of code. When web publishing software allows this to be done easily, we’ll have the holy grail. Right now, it can only realistically be done if you learn HTML. My FAQ

      http://www.macmaps.com/browser.html

      has additional resources to help you out. I compose my entire website with the Mac-only software BBEdit, and I swear by it. It is available from: http://www.barebones.com/

      I suggest that anyone with a Mac who is interested in publishing to the web learn how to use it. Not only does it let you write true HTML, it also makes the job a lot easier by giving you precise tools to build HTML code without having to memorize the commands.

      Another thing I have learned is that PHP provides a kind of buffer with which to do server-based rendering of HTML. By doing your rendering on the server side, you remain free to explore rendering options that are more compatible with the client, without having to ask the client to use one browser over another.
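
      A minimal sketch of the idea, with invented stylesheet names; the crude string match just stands in for whatever detection logic you prefer:

        <?php
        // Server-side sketch: inspect the User-Agent header and emit
        // markup suited to the requesting browser, so the client never
        // has to be told which browser to use.
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

        if (strpos($ua, 'MSIE') !== false) {
            // Internet Explorer announced itself: serve the workaround styles
            echo '<link rel="stylesheet" href="ie.css" type="text/css">';
        } else {
            echo '<link rel="stylesheet" href="standards.css" type="text/css">';
        }
        ?>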

    9. Dan Knight says:

      Gene, I know your pain. Solutions that work with one generation of browsers may break with the next one, especially Internet Explorer. And W3C standards are sometimes as clear as mud – the Safari team interprets them one way, Mozilla another, and so forth. The problem with HTML and CSS “standards” is their lack of precision: defining exactly where an offset should be measured from, for instance.

      My solution used to be a table-based layout, and our switch to a CSS-based layout was a nightmare. It took weeks, if not months, and the cost of a Windows laptop to get Low End Mac displaying properly on the abomination that is/was Internet Explorer 6.

      If you want pixel-perfect rendering, your options are to create an image, create a PDF, or use Flash. You simply can’t do it with HTML and Cascading Style Sheets, as different platforms (Macs, Windows, Linux) render type differently and different browsers have their own idiosyncrasies.

      The real question: Is HTML good enough? I believe it is, as it lets the reader do things like make text bigger and smaller to fit their screen and their visual needs. Flexibility isn’t a problem, it’s a benefit designed into HTML from the start.

    10. PXLated says:

      You’re whistling in the wind. The web is N O T print nor was it intended to be. And, if you code properly, you can get a far more complicated site than MacNightOwl to display correctly (pixel perfect), and the same in every (5.0+) browser, without a single hack for IE, and even validate xhtml-strict. Code to web standards, validate your code, and as a developer/coder truly understand CSS. Generally, the only problem is…amateur developers.

      Run MacNightOwl through the W3C validator and you’ll find 105 reported errors and that’s for the less strict “transitional” doctype. Hard to get pixel-perfect with that.

    11. PXLated wrote: “You’re whistling in the wind. The web is N O T print nor was it intended to be… Generally, the only problem is…amateur developers. Run MacNightOwl through the W3C validator and you’ll find 105 reported errors and that’s for the less strict ‘transitional’ doctype. Hard to get pixel-perfect with that.”

      Actually, we do pretty well with most browsers I’ve tested. 🙂

      The other point is that, obviously, we don’t generate our own HTML and PHP. That’s done via WordPress and its various plugins on this site, and I doubt we could change code errors unless those components were modified. So I presume, then, that the people who develop WordPress and its add-ons are “amateur developers.”

      No, in the real world it doesn’t work that way. I’m not saying the Web must be the same as print. I’m saying that, in terms of precision, there’s a crying need for it here too, and that making every site perfect otherwise is one horrendous chore, and usually an unsuccessful one.

      Peace,
      Gene

    12. PXLated says:

      — we do pretty well with most browsers I’ve tested —
      Yes, using a transitional doctype puts a browser in quirks mode. The rendering engine tries to fix things like unclosed tags, etc.
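
      For reference, the two doctypes in question, copied from the W3C’s XHTML 1.0 specification:

        <!-- Transitional: the looser ruleset -->
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
          "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

        <!-- Strict: what xhtml-strict validation holds you to -->
        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
          "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">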

      — done via WordPress and its various plugins on this site…“amateur developers.” —
      That’s a problem with a lot of blogging platforms. I personally prefer and use ExpressionEngine. Clean, clean code, and content is totally separated from presentation. One’s site can be as standards-compliant as one is capable of making it.

      — there’s a crying need for it here too —
      I’d agree that it needs to get better. The web was started by geeks for geeks (or scientists), and then we had the browser wars, and now, finally, some web standards. The print world has the advantage of being really, really old, with well-established rules for what works, what doesn’t, and how one does things. And you’re working with a fixed size. PostScript came along and implemented those well-refined processes. The web is only a few years old and is still developing, but with web standards, and with the various browsers (even IE7) implementing them, things have gotten better and will continue to. We’re in the hot-metal type or quill pen days by comparison.

      — precision —
      I’m not sure one will ever get the same precision as in print, nor should we. They are two totally different animals with totally different variables. If one embraces the differences (and I know print people have a hard time with that), one can take advantage of them. And with the mobile web finally starting to get hot, the variables are going to become even more abundant.

    13. Kaleberg says:

      The web isn’t print. If you want paper, use PDF. Most people want to be able to adjust their window size, width and height. They want to control the base font and font size. Sure, graphics designers still relish the opening to the old Outer Limits show. You know, the weird video and the “Don’t adjust your television set. We control the horizontal, and the vertical” and so on. There have always been evil space aliens, and there probably always will be, but I’ll adjust my set when I want to.

      Besides, most of the problems are not precision rendering, but weirdness in Javascript features. At least all of the JS code I wind up reading and debugging is constantly checking for the presence or absence of various mechanisms for accessing blocks of html by name, the contents of HTML blocks and so on. That’s the stuff that bites back.

    14. gopher says:

      The problem is not just JavaScript, Cascading Style Sheets, Java, or any other client-side language. Websites have to remain accessible to those on text-only web browsers, to those who prefer not to have formatting forced on them, and to those who want formatting. In short, HTML is a markup language, not a layout language. You have to conceive of text being marked up. Anything beyond that is bells and whistles, some of which work for some users and not for others. Unless you develop for the common denominator, you will always have dissatisfied clients using the web. There is a lot you can do with that common denominator, but there is a lot you can’t. Unless you can suddenly give everyone 20/20 vision, there will always be someone trying to read with a text-only web browser.

    15. Sam Elowitch says:

      I agree that the limitations of the web as currently implemented are sometimes ridiculous, particularly with regard to typography and layout — it’s so limited, even with the advent of Cascading Style Sheets (CSS) and all the wonderful work so many have done in that area. However, any radical restructuring of the web away from HTML would have to figure out some way of making the new code search-engine compliant. The other factor is speed. Rendering a PDF for every web page would put a tremendous bandwidth burden on servers and clients everywhere — it would really gum things up. Hence there would have to be something much more lightweight.

    16. Sam Elowitch wrote: “I agree that the limitations of the web as currently implemented are sometimes ridiculous… Rendering a PDF for every web page would put a tremendous bandwidth burden on servers and clients everywhere… Hence there would have to be something much more lightweight.”

      PDF is presented not so much as a final solution, but as a direction in terms of results. Yes, I agree that any solution has to be lightweight and not cause excessive rendering times and heavy loads on Web servers. We have to respect the people who buy $5 Web hosting and expect it to just work.

      Peace,
      Gene

    17. Sam Elowitch says:

      One thing’s for sure — I think everybody in the web design community is sick to death of the endless parade of the same handful of cross-platform fonts (Times New Roman, Arial, Helvetica, Tahoma) that are predictably available for use on the computer of visitors to a website.

      The W3C should come up with a standard for uploading/embedding fonts, and that would mean close coordination with the makers of web server software like Apache, Tomcat and IIS. Fonts should be rendered server-side instead of client-side, but how exactly this would be done I honestly don’t know.

      Also, for crying out loud, would the browser makers (ahem, Microsoft) please fix the broken box model for layout, in pixel-true fashion?
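
      For what it’s worth, CSS already sketches a client-side embedding mechanism in @font-face, though hardly any browser honors it today. The font name and URL below are invented for illustration:

        /* From the CSS specs: the browser fetches the font if it honors
           the rule, and falls back to the listed alternates if not */
        @font-face {
          font-family: "HouseFace";
          src: url("http://www.example.com/fonts/houseface.ttf");
        }

        h1 { font-family: "HouseFace", Georgia, serif; }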

    18. gopher says:

      Sam Elowitch wrote: “One thing’s for sure — I think everybody in the web design community is sick to death of the endless parade of the same handful of cross-platform fonts (Times New Roman, Arial, Helvetica, Tahoma) that are predictably available for use on the computer of visitors to a website… would the browser makers (ahem, Microsoft) please fix the broken box model for layout, in pixel-true fashion?”

      Predictably? That’s a laugh. Some people are still running Netscape 2 or 3. Some are running Lynx because they are legally blind. The result is that no font other than those that shipped with their system will display. We can’t depend on ANY font.

      My current browser statistics still show:

      Firefox 2: 1%
      I.E. 6: 11.1%
      Opera 9.X 2%
      Firefox 1.x 35.4%
      I.E. 5.0: 1%
      Safari 2: 2%
      I.E. 7.X 6.1%
      Mozilla 1.x 2%
      Safari 1.X 39.4%

      But I’ve seen Cyberdog and Lynx on there on occasion.

    19. Indeed, there are lots of reasons why people don’t update, and the common one is that they don’t know there’s a new version available. It’s not like Apple’s Software Update mechanism, which delivers a new version of Safari, or at least a new WebKit, in a system update. Take 10.4.11 as an example, where Safari 3 was included for Tiger users.

      It doesn’t always work that way for other browsers.

      Peace,
      Gene

    20. Sprocket999 says:

      Yes, Gene, the concept of a ‘PDF-like’ (not actually using PDFs, though) cross-platform standard would be welcomed by everyone except one company — unless they were to create it, which would then mean only their client software would work correctly. (Have we seen this before??!!)

      In a perfect world, though, I believe this concept would have to come from a non-commercial venture that has nothing to gain except goodwill, for whatever altruistic reason. It would also have to be so good that developers and users would be drooling to get involved. The minute a ‘for-profit’ gets involved — all bets are off.

      Personally, as a Web Designer/Ad Agency Owner, I like it. Will it happen? I don’t think so, but I WISH it would. Cheers!

    21. Sam Elowitch says:

      I definitely agree that one of the cardinal rules of the Web ought to remain that there should always be a separation between content (HTML/XHTML), behavior (JavaScript/ECMAScript), and presentation (CSS). That way, even folks who are using Lynx, or who have disabled JavaScript, or who are running Internet Explorer 😉 can still find the material accessible. For now, I think the W3C has got to move forward with CSS version 3, and the browser community has got to follow suit.

      My personal pet peeve is how despite there being plenty of defined Unicode entities for things like fractions (one-third, three-eighths and the like) they still don’t display properly in most browsers. Hence there’s no way to present fractional data in a semantically correct way that will also present properly in the visual sense. That is so maddening!
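
      The code points themselves are all there; it’s the browser rendering and font coverage that fall down. For instance, every reference below is valid:

        <!-- Named entity for one half; numeric references for the
             vulgar-fraction code points U+2153 and U+215C -->
        <p>One half: &frac12; · One third: &#8531; · Three eighths: &#8540;</p>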

      In short, I say we stick with HTML/XHTML because of its simplicity and its near-universal acceptance, but really push the development of CSS 3 forward.

      Thank you, Gene, for raising this issue. You’ve tackled it in a very interesting way. Well done!

    22. Part of the problem is how long these things take to take effect. Look at the dates at the World Wide Web Consortium: http://www.w3.org/Style/CSS/current-work#background. It takes half a decade or more for the committee to approve anything, and then we have to give Microsoft another ten years to partially comply with any of the standards. It is much harder to get things to work in Internet Explorer than in any other browser. Internet Explorer 7 finally handles PNG properly, a format approved in 1998, and we are still waiting for SVG support of any kind in Internet Explorer, a standard that has been out since 2001 and that every other major browser has already implemented with many fewer resources.

      It is possible to get pixel-perfect layout and still allow people to resize things. People who want no formatting can turn off stylesheets and images, so all they will see is text, and that should make them happy, while the other 99% of the world is happy with the absolute positioning of CSS.

      As for people who don’t like PDFs: get the browsers – any of them – to support page breaks in CSS and most of the PDFs will disappear. I, like many others, prefer PDFs over web pages for more than two screenfuls of text, as they are much easier to download, browse, and print.

      And as far as hand coding is concerned, using a good WYSIWYG editor will result in fewer mistakes in coding rather than the other way around. Remember, there are many more options than FrontPage and vi. As for those who have no better thing to do than to validate HTML, that has nothing to do with whether or not the page displays correctly. And as far as using PHP, not all designers are also programmers. Most programmers’ web pages, as well as their CMSes, are terribly designed, very hard to use, allow virtually no formatting options, and are insecure and difficult to update.
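
      The pagination properties in question do already exist on paper in CSS 2; a minimal sketch (the selectors are arbitrary):

        /* CSS 2 print pagination: it's the inconsistent browser support,
           not the specification, that keeps the PDFs alive */
        @media print {
          h2    { page-break-before: always; }  /* each section starts a new page */
          table { page-break-inside: avoid; }   /* don't split a table across pages */
        }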

    23. Dana Sutton says:

      Thinking over this thread, it seems to me that we may be talking at cross purposes. Some contributors are primarily concerned with the question of whether HTML rendering is sufficiently precise and accurate. Others (including Gene in his original article) are really more concerned with the fact that different browsers sometimes display the same HTML, CSS and JavaScript differently, with the result that a Web designer can have difficulty predicting how the page he designs will display on different browsers, or getting his page to display equally well on all of them. Is the fault really in the quality of the code designers use, or in the diversity and defects of the various browser engines out there? I think that for the immediate practical purposes of Web designers (and users) the second question is more interesting and worth discussing.

    24. Andy says:

      Surely we already have standards, in HTML, XHTML and CSS? The problem is the browser vendors who ignore those standards or don’t implement them properly.

    25. gopher says:

      Andy wrote: “Surely we already have standards, in HTML, XHTML and CSS? The problem is the browser vendors who ignore those standards or don’t implement them properly.”

      Not just that, it is a problem that most users choose not to update their web browsers.

      As I say, I’ve seen people today still using web browsers that are 11 years old. How can you expect to broadcast to users with old browsers? You have to work with the lowest common denominator; otherwise, all you give them is clock cursors, hourglasses, and spinning beachballs. That’s no way to treat someone you hope will be your customer.

      Keep the high-end new content off the main page of a website and direct people who have higher-end browsers and connections to a separate page, or make it possible to use the website with the lowest common denominator. http://www.anybrowser.org/ is a good place to learn what that denominator is. Demand that the publishing companies that make website creation programs for the non-HTML-literate offer a base HTML that follows those guidelines.
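
      A taste of what that base HTML looks like in practice, with invented file names and figures; every enhancement carries a plain-HTML fallback:

        <!-- The image degrades to its alt text on Lynx and other
             text-only browsers -->
        <img src="chart.gif" alt="Sales rose 40 percent from 2006 to 2007">

        <!-- Scripted navigation degrades to an ordinary link -->
        <script type="text/javascript" src="menu.js"></script>
        <noscript>
          <p><a href="sitemap.html">Browse the plain-HTML site map</a></p>
        </noscript>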

    26. PDF would have probably been the choice in the Web’s early days if there had been more bandwidth.

    27. Sam Elowitch says:

      PDF would have probably been the choice in the Web’s early days if there had been more bandwidth.

      Perhaps. But I see a lot of potential problems with that. Adobe has already shown itself to be willing to use very bloated code for the web, such as that generated by GoLive. They’ve also bought out Macromedia basically in order to kill it; they are hardly advancing that product line at all. Where we would end up going vis-à-vis the behavior layer (JavaScript/ECMAScript or whatever) under that scenario is anybody’s guess.

    28. PXLated says:

      Benjamine wrote…”as far as hand coding is concerned, using a good WYSIWYG editor will result in fewer mistakes in coding rather than the other way around”… That’s totally dependent on how good the coder is.
      “As for those who have no better thing to do than to validate HTML, that has nothing to do with whether or not the page displays correctly.”…Yes and no, but it may in the future, and it isn’t just about display. It can result in a lighter-weight page and, if semantically correct, better Google juice. It also speaks to professionalism. If one bills themselves as a web designer/developer, shouldn’t they advance their skills and develop to standards? Do you accept a dentist using an old drill and filling technique? A lawyer who doesn’t keep up on the law?

      Dana wrote…”Is the fault really in the quality of the code designers use, or in the diversity and defects of the various browser engines out there?”… It’s both, but one can write clean, well-formed code that displays cross-browser with no hacks.

      Gopher wrote… “still use web browsers that are 11 years old…That’s no way to treat someone you hope to be your customer”… On the other hand, someone living with technology that old may not be a potential customer, or one worth catering to. One needs to know, or determine, one’s target market, just as Mercedes isn’t targeting people with an 11-year-old Chevy.

    29. gopher says:

      PXLated wrote: “…someone living with technology that old may not be a potential customer, or one worth catering to. One needs to know, or determine, one’s target market, just as Mercedes isn’t targeting people with an 11-year-old Chevy.”

      That is elitist and biased. You have to realize that the vast majority of the world does not have the financial resources to buy the most current software and hardware. The people who do are less than 1% of the population. Some people who are computer illiterate have no time to learn a new operating system. I know a very smart social worker who didn’t even know how to copy and paste, or how to follow the links provided in AOL e-mail, and it still isn’t easy for them. What’s more, they are stuck on dialup, because that’s all that is available on a social worker’s income. Are you going to close off social workers just because you can’t or won’t develop websites that are commonly coded? They shop at JJill and Coldwater Creek. It is time for the web publishing community to recognize that there is a common low denominator, sometimes lower than one would desire, but it is there, and it is not going away anytime soon, unless everyone is given free broadband and a machine simpler than a Mac to use.

      Do all people have the financial resources to buy new software every two years, or new hardware every 5? Not everyone does, nor should we expect them to.

    30. PXLated says:

      Gopher wrote…”That is elitist and biased”… Bullcrap! I said you need to know your target audience. If it’s absolutely everyone, you are pretty much screwed, but I doubt most sites cater to “everyone.” If you’re doing a govt. site, maybe.
