  Newsletter Issue #912

    May 22nd, 2017

    THIS WEEK’S TECH NIGHT OWL RADIO UPDATE

    One thing is sure: when a rival company brings out something new, the critics make Apple look second best unless it duplicates the feature. And no matter how often responsible tech pundits correct the record, the claim keeps getting repeated.

    A typical example is the claim that Apple lost because it did not originate something. Whether it was digital music players, smartphones, or LTE support on the iPhone, Apple allegedly took its sweet time. But when its solution arrived, it usually worked without the downsides.

    Before the iPod appeared in 2001, music players were little more than clumsy successors to the original Sony Walkman. They were all about grabbing digital files from your computer, but no matter how quickly the files transferred, the user interface remained an afterthought. In those days, I reviewed such gear for ZDNet, sister site to CNET. Every single product I received was virtually unusable, so after the review was done, I sent it back and never thought about it again.

    Not that the iPod was perfect. Those teeny tiny hard drives were fragile, and I bet there are millions of people out there with dead iPods, or iPods in which the drive was replaced. Apple waited until flash memory was capacious and cheap enough to make the switch, although the iPod Classic, with a conventional hard drive, stayed in production for several years.

    The iPhone success story? Well, before Apple introduced its game-changer in 2007, many smartphones were derived, more or less, from the BlackBerry, with clumsy physical keyboards. I realize some of you became quite adept with them, and it’s also true that there are still BlackBerrys, with the traditional keyboards, on sale. Only they run Android these days. Even though Android handsets are far more plentiful than iPhones, no single model outsells Apple’s.

    LTE? Well, it’s simple. The first chips supporting the faster Internet speeds weren’t power efficient, and adding them would have severely reduced battery life. Apple waited for new generations of cellular modems that wouldn’t drain the battery so quickly, and then joined the LTE revolution. But for most people, the higher broadband speeds offer only a modest advantage.

    Now on this week’s episode of The Tech Night Owl LIVE, we featured commentator/podcaster Jeff Gamet, Managing Editor for The Mac Observer. Gene and Jeff did a pop culture segment, anticipating the movie version of “Wonder Woman,” the introduction of General Zod on the “Supergirl” TV show, and other movie and TV-related topics. The tech segment covered expectations for Mac notebook upgrades at the 2017 WWDC in June, whether actor Jeff Goldblum might have become the voice of Siri, the Microsoft Surface Laptop, and whether you can trust the cloud.

    You also heard from ethical hacker Dr. Timothy Summers, President of Summers & Company, a cyber strategy and organizational design consulting firm. Tim delivered a comprehensive look at the recent WannaCry ransomware attack, which hit hundreds of thousands of computers at institutions and businesses around the world, many of them running Windows XP. The attack exploited a Windows flaw that Microsoft has patched. You also learned more about the ongoing prospects of bitcoin, the controversial digital currency that some still regard as a viable alternative payment method. The ransomware demanded bitcoin payments to unlock compromised PCs.

    On this week’s episode of our other radio show, The Paracast: Gene and guest co-host Randall Murphy present a return visit from Walter Bosley, an author, blogger, former AFOSI agent and former FBI counterintelligence specialist. He has researched mass shootings, breakaway civilizations, lost civilizations and more. On this episode, Walter discusses one of his books, “Shimmering Light: Lost In An MKULTRA House of Anu.” You’ll learn about his father’s bizarre story involving Roswell and a 1958 UFO retrieval operation in Arizona, and the curious role Operation Paperclip and the subsequent CIA MKULTRA mind control program may have played behind the scenes. Walter also covers some of the mysteries of Antarctica.

    THE 4K TV REVOLUTION: FULL STOP!

    You may not recall this, but HDTV was actually demonstrated in the U.S. in the late 1980s. After the standard became official, it took a while for broadcast stations to begin to adopt the technology. The first was WRAL-TV, a CBS affiliate in Raleigh, North Carolina, which began transmitting digital HD on July 23, 1996. But it took until November 1998 for HDTV sets to go on sale.

    It must have seemed strange for a TV station to spend 28 months transmitting a signal that benefited nobody except manufacturers and professionals. The original HD sets were CRTs, and they were very expensive, but over the next decade, sets offering 720p and, later, 1080i and 1080p resolution blanketed the country. They got cheaper and cheaper until you could buy a decent set with a huge flat screen for only a few hundred dollars.

    Once HDTV was ever-present in people’s homes, and many households had more than one high-definition set, manufacturers had to find ways to persuade you to buy new ones. But a well-designed TV can easily survive for eight or ten years before requiring major repairs, meaning a long replacement cycle. A standard-definition CRT set that I bought around 1994 lasted 20 years before it was put out to pasture.

    In addition to racing to the bottom with cheaper and cheaper sets, emulating what happened in the world of Windows PCs, manufacturers added extra features, such as full-array backlighting and digital enhancements, to improve picture quality. By and large, the differences were minor in the scheme of things unless you looked very closely.

    That takes us to 4K, also known as Ultra HD or UHD, which offers a resolution of 3840 × 2160, four times the pixels of 1080p HD. The format dates back to 2003, when the first 4K digital cinema camera was introduced, but it took a number of years before the technology filtered down to commercially available TV sets.

    As with flat screens and HDTV, early 4K gear was expensive, with prices above $5,000 commonplace, but you can bet that the TV makers worked hard to make the technology affordable. These days, 4K sets aren’t much more expensive than decent quality HDTV gear of just a few years ago. Indeed, when you buy a new set, you’re very likely to choose 4K unless you want something really cheap.

    But 4K has some problems.

    With HDTV, the difference between high definition and standard definition is crystal clear. Of course, that assumes you’re watching HD fare. If you’re watching a regular DVD, or a cable channel that isn’t HD, the content will be scaled up, but it’ll still look inferior to the real thing.

    However, after spending a bundle for 4K, you may bring your spanking new TV home and want to spank it. Or perhaps you’ll feel cheated, because you may not see a difference. The reason is similar to how Apple’s Retina displays work: resolution is increased to the point where you won’t see the pixels at a normal viewing distance. This is why some Android smartphones, despite offering more pixels than an iPhone, don’t really look much different. You can’t see the extra pixels, but the handsets need more powerful graphics hardware to push them.

    So if you watch a relatively small 4K set, say below 55 inches, at a normal viewing distance, it won’t look any better than HDTV. To see the advantage, you have to sit closer or get a larger screen, as the quick calculation below shows. This shortcoming may help the industry move bigger displays, and thus earn higher profits.
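
    If you want to run the numbers yourself, here’s a rough back-of-the-envelope sketch in Python. It leans on the common rule of thumb that the eye resolves detail down to about one arcminute, and it assumes a 16:9 panel; the acuity figure and the 55-inch example are simplifying assumptions, not lab measurements:

        import math

        def pixel_blend_distance(diagonal_in, horizontal_px, aspect=(16, 9)):
            # Distance (inches) beyond which individual pixels can no longer
            # be resolved, assuming ~1 arcminute of visual acuity.
            w, h = aspect
            screen_width = diagonal_in * w / math.hypot(w, h)
            pixel_pitch = screen_width / horizontal_px  # inches per pixel
            one_arcmin = math.radians(1 / 60)           # ~0.00029 radians
            return pixel_pitch / math.tan(one_arcmin)

        for label, px in (("1080p", 1920), ("4K", 3840)):
            feet = pixel_blend_distance(55, px) / 12
            print(f"55-inch {label}: pixels blend beyond ~{feet:.1f} feet")
        # 55-inch 1080p: pixels blend beyond ~7.2 feet
        # 55-inch 4K: pixels blend beyond ~3.6 feet

    In other words, on a 55-inch screen you’d have to sit within about four feet to see what 4K adds; from a typical eight-to-ten-foot couch, even 1080p pixels have already blended together.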

    In order to drive home the 4K advantage, more and more sets support high dynamic range (HDR) and wider color gamuts. There are two main HDR standards, HDR10 and Dolby Vision, and to make matters more confusing, some sets offer one but not the other. Samsung has a supposedly enhanced variation, HDR10+. An Ultra HD Premium label on a set is supposed to ensure that it properly supports the various HDR standards.

    As with the latest iPhone, the 9.7-inch iPad Pro, and recent iMacs and MacBook Pros, a wider color gamut means richer, more lifelike color reproduction. So a 4K set can offer a visibly improved picture even if the higher resolution advantage isn’t readily apparent in your home.

    Then there’s the programming. It took a while before most TV channels were offered in HD. The arrival of Blu-ray and streaming video made high-definition fare commonplace. There’s not a whole lot of standard definition left, although TV stations and cable and satellite systems still offer lower resolution content. Curiously, you may have to pay the cable and satellite companies a little extra for HD even though it took over years ago. Yes, it’s profiteering, pure and simple.

    Now when it comes to 4K fare, there’s not much. TV stations are experimenting with Ultra HD broadcast transmissions, and sets will require a digital tuner that supports the forthcoming ATSC 3.0 standard. Meantime, such streaming services as Amazon Instant Video and Netflix offer some programming in 4K, but you’ll need a pretty fast broadband connection to receive such content reliably. Figure on a consistent speed of at least 25 megabits per second, and probably a lot more if you have a family actively viewing other content at the same time. Otherwise you’ll have to shut everybody else down to reduce buffering.
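
    To put that in perspective, here’s a trivial Python sketch of a household bandwidth budget. The per-stream figures are the commonly cited service recommendations (Netflix, for instance, suggests 25 megabits for Ultra HD); the example household is hypothetical:

        # Commonly cited per-stream bandwidth recommendations, in Mbps.
        # Actual rates vary by service, codec, and scene complexity.
        STREAM_MBPS = {"4K/UHD": 25, "HD": 5, "SD": 3}

        # A hypothetical evening: one 4K stream plus two HD streams.
        household = ["4K/UHD", "HD", "HD"]

        total = sum(STREAM_MBPS[s] for s in household)
        print(f"Sustained bandwidth needed: ~{total} Mbps")
        # Sustained bandwidth needed: ~35 Mbps

    Add 20 to 50 percent headroom for Wi-Fi overhead, downloads and other traffic, and a nominal 50-megabit plan starts to look like the practical minimum for a 4K household.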

    The first Ultra HD Blu-ray players are on sale, but you have to check the specs carefully. Many players merely upscale HD content to 4K resolution; they don’t offer the real thing. Even when you buy a new player from Samsung or another company, with prices starting at around $200, there aren’t a whole lot of 4K Ultra HD discs to be had. When they are available, they’re normally priced slightly higher than regular Blu-rays, with the 4K version on its own disc. The packages I’ve seen also include a standard Blu-ray disc, so you can future-proof your library if you like.

    Does 4K have any downsides, other than not being visible on smaller sets unless you sit really close? Well, much of what you watch may just be upscaled. When you watch HDTV content, it is scaled up to 4K; in other words, the extra pixels are interpolated, and the picture looks a little bit better. But the conversion may also add picture noise on some sets.

    But what about standard definition, which also has to be upscaled to 4K? Well, according to a USA Today article from my friend and colleague Rob Pegoraro, a regular guest on The Tech Night Owl LIVE, those old DVDs may “look awful.” He offers ways to configure the set to reduce the pixelation, at the cost of turning off key image processing features. Many sets let you customize the picture and save it as a preset, so if you store the DVD-friendly tweaks as an alternate setting, you can switch back and forth with a quick visit to the settings menu.

    The movie companies would rather you buy Blu-ray or Ultra HD Blu-ray versions of your favorite movies. The entertainment industry is happy to resell content as newer standards appear, just as it did when DVDs took over from videotape.

    What this all means is that, assuming my aging VIZIO TV holds up, I don’t see any compelling reason to buy a 4K set, even if I had the extra cash for one. Besides, the industry is already working on 8K; some of your favorite blockbusters are already being shot with 8K digital cameras.

    THE FINAL WORD

    The Tech Night Owl Newsletter is a weekly information service of Making The Impossible, Inc.

    Publisher/Editor: Gene Steinberg
    Managing Editor: Grayson Steinberg
    Marketing and Public Relations: Barbara Kaplan
    Sales and Marketing: Andy Schopick
    Worldwide Licensing: Sharon Jarvis



    2 Responses to “Newsletter Issue #912”

    1. Dfs says:

      I have to disagree about upscaling. On my set (a Sony) it is entirely problem-free, and although the difference between upscaled HD and genuine 4K is barely detectable, the difference between standard HD and upscaled HD is enormous. As far as I’m concerned, that alone justified the purchase of the set.

      That being said, in my part of the world, where my previous carrier has been gobbled up by Spectrum, we are constantly being bombarded with ads touting the virtues of Spectrum and how great it is going to make our lives, although the difference between it and its predecessor is minimal (well, it has a klutzier interface and you have to go through more screens to get where you want to go, so there is a visible difference in human engineering, although it is not a good one). Upgrading to deliver 4K content is one way they could deliver on their promises, if for no other reason than that 4K sets are flying off the shelves and their price has come way down to the point where they are becoming the new standard (I bet eighteen months from now it will be impossible to buy anything else). Meanwhile a few outlets, notably Netflix, do deliver it, and for those who want to squeeze the last ounce of value out of their new sets this may be perceived as yet another reason for cord-cutting. If cable services get a bad rep for not keeping up with technology, this won’t exactly help the industry, so they have a good economic reason for giving us 4K content.

      • gene says:

        The issue mentioned in Rob’s article is upscaled standard definition fare, such as regular DVDs.

        In any case, I agree that 4K is taking over. The next set I buy will be 4K, no doubt, but that won’t happen for quite a while.

        Peace,
        Gene
