  • Listen to The Tech Night Owl LIVE

    Last Episode — August 24: Gene presents regular guest, tech podcaster and commentator Kirk McElhearn, who comes aboard to talk about the impact of the ongoing outbreak of data hacks and ways to protect your stuff with strong passwords. He’ll also provide a common-sense, if unexpected, tip for setting one up. Also on the agenda: rumors about the next Mac mini from Apple. Will it, as rumored, be a visual clone of the Apple TV, and what are the limitations of such a form factor? As a sci-fi and fantasy fan, Kirk will also talk about some of his favorite stories and more. In his regular life, Kirk is a lapsed New Yorker living in Shakespeare’s home town, Stratford-upon-Avon, in the United Kingdom. He writes about things, records podcasts, makes photos, practices zen, and cohabits with cats. He’s an amateur photographer, and shoots with Leica cameras and iPhones. His writings include regular contributions to The Mac Security Blog, The Literature & Latte Blog, and TidBITS, and he has written for Popular Photography and MusicWeb International, as well as several other web sites and magazines. Kirk has also written more than two dozen books and documentation for dozens of popular Mac apps, as well as press releases, web content, reports, white papers, and more.

    For more episodes, click here to visit the show’s home page.

    Apple, In-house GPUs, and the Consequences

    June 23rd, 2017

    Apple has made huge progress in using its own silicon to power its mobile gear. The A-series processors, based on ARM silicon, are capable of performance that rivals traditional notebook personal computers with Intel Inside. Recent benchmarks of the 2017 iPad Pro reveal numbers that are faster than the latest MacBook and at least competitive with a MacBook Pro.

    Indeed, some suggest that Apple is missing the boat by not ditching Intel and switching to its own chips. But this is a really complicated issue, and it’s also true that Intel is making progress in getting its chips to run a decent amount faster. The Kaby Lake CPUs used in the latest Mac notebooks exceed the performance of their predecessors by decent margins. So it may not be time now — or ever — for Apple to consider another processor migration.

    That said, up till now, Apple has licensed designs from Imagination Technologies to provide the GPUs for such gear as iPhones and iPads. The UK firm receives royalties from the sale of these products. That these devices score well in benchmarks does indicate that the association has been productive. Until now.

    In recent years, it has been rumored that Apple would develop its own GPUs in-house, and the company has hired people, including some from Imagination, to move in that direction.

    In April, the bottom fell out for Imagination, when Apple informed the company that it would begin to move to its own GPUs over the next 15 months to two years. Since Apple provides half of the company’s business, the loss will have a huge impact.

    In fact, it’s been reported that Imagination has lost at least 70% of its market value since Apple’s decision was reported. Now it appears that Imagination is trying to find a buyer.  Indeed, there are even reports that Apple itself may attempt to pick up the pieces, although that would seem a curious move. It would mean that Apple reached its divorce decision as part of a scheme to reduce Imagination’s value and thus get a good price. But that’s not Apple’s way.

    But don’t forget that Apple will be spending many millions of dollars to switch to its own silicon, so any savings would take a while to be achieved. It would be more about having greater control over its hardware. That need for control is, of course, why some suggest the A-series CPUs are destined for Macs.

    But they are actually there already, at least for the MacBook Pro with Touch Bar. One of Apple’s own systems-on-a-chip is employed to power that feature, and it’s rumored that other low-level functions, such as Power Nap, will make the switch.

    But Apple has already denied that there will be a Mac on ARM in our future, even though some continue to suggest it’s inevitable.

    I briefly thought Apple might even want to use its own technology to power Mac GPUs. But that would have its own complications, since games and other apps are optimized to do their best on existing graphics hardware from Intel, AMD and NVIDIA. So, in effect, an Apple-built GPU might create extra headaches for developers.

    Or maybe not. I’m not a graphics chip designer, nor do I play one on TV.

    Now I suppose the folks at Imagination ought to feel betrayed by Apple, although the fear that they’d be ditched certainly loomed after those A-series CPUs were launched.

    But if Apple can continue to fold additional functions into its own chips, other chip makers might be losing lots of business before long too. Apple is embroiled in a patent dispute with Qualcomm, which provides some of its baseband chips. These are the parts that allow your iPhone and certain iPads to connect to a cellular network. Apple also uses baseband chips from Intel, and may actually be capping the performance of the Qualcomm parts to match the limitations of the Intel chips.

    All right, this all may be getting more involved than I had hoped. Few who buy these gadgets really care about the manufacturers of the parts Apple uses, or the consequences of using those parts, unless they create problems. So even if an iPhone’s cellular data speeds could be a little better, they are surely fast enough as it is for most people.

    So what if Apple took baseband components in-house too?

    What all this in-house development does mean is that it allows Apple to further differentiate its hardware from the pack. Putting its own CPUs into iPhones and iPads results in products that are more difficult for other companies to compete with. The rest of the industry largely uses commodity hardware, although Samsung sometimes uses its own Exynos SoCs, which appear to deliver mediocre performance.

    And even though Macs run Intel processors, and discrete GPUs from AMD and NVIDIA, the decision to pass off functions to A-series silicon can give Apple further control over innovative technology. It’s the sort of thing that’s bound to give other companies conniptions.

    While Apple is often egged on by the media to buy this, that, or another company, the acquisitions it actually makes are meant to expand its own technology portfolio. Remember that Touch ID came through Apple’s purchase of AuthenTec, which means nobody else can use its fingerprint sensors. We all know about Siri’s origins, although some of that company’s former executives are working with Samsung on its nascent Bixby digital assistant. And, yes, I know that Bixby is just out of the starting gate, so it has a long way to go.

    Obviously, Samsung did that to follow Apple, but when Siri arrived — even though it was in beta and fairly buggy — you didn’t have to wait for a software update to use it. Regardless, Apple’s efforts to move more hardware components in-house will only continue.


    The World of the Hackintosh Revisited

    June 22nd, 2017

    So eight years ago, Macworld columnist Rob Griffiths decided to take a stab at building an unofficial Mac clone. With Apple’s switch to Intel processors in 2006, it seemed logical that you could take generic PC parts and somehow induce them to run Apple’s OS, and many have tried. Rob called his completed computer a “FrankenMac.” Overall it worked, well, mostly, but the setup process required lots of babysitting and false starts.

    But that was early in the game. Over the years, building a Hackintosh has become easier to manage, largely because there are online communities that specialize in testing PC hardware for compatibility and devising the best ways to install macOS. So if you choose the recommended hardware, you stand a decent chance of building a mostly usable computer.

    I say mostly, and you’ll see why in a moment. You see, the big problem is that macOS is tightly integrated with a specific set of Macs with certain hardware configurations. Where you have the option — and it’s one not often available anymore — you can install third-party RAM and maybe even a third-party drive. None of that should present a compatibility problem.

    But when it comes to using third-party motherboards, processors, graphics cards and other parts, all bets are off. That the process works at all might be a near-miracle. But when you consider the limitations — and I’ll get to them shortly — why should you bother?

    Now Rob hit a wall with his current Mac, a Late 2014 27-inch iMac with 5K Retina display, when he tried to run his favorite game, the X-Plane flight simulator. You might not believe it, because, when well equipped, that iMac is actually a powerhouse for the most part. But Rob encountered subpar frame rates, ranging from “decent to slow.” Worse, the iMac’s cooling fan accelerated to full speed — and it’s audible — whenever the app was running.

    Now I sort of feel his pain. I just added a YouTube channel for my other radio show, The Paracast, and I prepare the content in iMovie, combining an illustration with the audio file. The process creates a multi-gigabyte file that, at the start of the Share process, sends my iMac’s fan roaring at full speed. After the initial encoding process, it quiets down.

    The real problem I confront is relatively slow upload speeds, and that won’t change until I live in a place where I can actually set up a regular ISP. This housing complex offers “amenity Internet,” meaning merely adequate: 15 megabits down and 3 megabits up. It’s all right most of the time, and Netflix streams are pretty solid, but there’s no hope for 4K. But since I don’t have a 4K set, it doesn’t matter.
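    To put that uplink in perspective, here’s a quick back-of-the-envelope sketch in Python. The 2GB figure is a hypothetical stand-in for one of those multi-gigabyte iMovie exports, not the actual size of my files:

        # Rough upload-time estimate for a large video file on a slow uplink.
        # The 2 GB file size is a hypothetical example.
        file_size_gb = 2.0            # file size in gigabytes
        uplink_mbps = 3.0             # upstream speed in megabits per second

        file_size_megabits = file_size_gb * 8 * 1000   # GB -> megabits (decimal units)
        upload_seconds = file_size_megabits / uplink_mbps
        print(f"About {upload_seconds / 60:.0f} minutes to upload")   # roughly 89 minutes

    By the same arithmetic, the 15 megabit downlink would pull that file back in under 20 minutes, which is why the asymmetry stings.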

    Rob recounts his Hackintosh installation in a Macworld article, and I recommend you pay close attention to the problems he encountered if you hope to use one of these boxes to replace a work Mac.

    You’ll see what I’m getting at in a moment.

    Now the biggest problem for Rob was the fact that Apple’s choice of graphics cards in that Late 2014 iMac was more about pushing millions of pixels than about pushing them fast. The top-of-the-line option for this model is an AMD Radeon R9 M295X with 4GB of video RAM. The most powerful and expensive component for Rob’s Hackintosh was an NVIDIA GeForce GTX 1080 graphics card, which sold for $550 at Amazon. It helps that NVIDIA had the good sense to release Mac drivers, which serve two types of customers: those using an older Mac Pro, from before 2013, when you could still easily swap out the graphics card, and, of course, Hackintosh builders.

    The good part is that, after his new computer was set up, Rob reported an improvement of 70% in frame rates over his iMac, and the graphics card’s three cooling fans didn’t intrude. All told, Rob spent $1,567 on parts, but it required countless hours of work to make this faux Mac run almost acceptably.

    He got it working to the point where he could play his favorite game reliably. But that’s where it ended.

    Routine macOS functions, however, were hit or miss.

    Rob concludes:

    “After way too many hours of effort, I have given up on making my Hackintosh my every day Mac. I never got Messages working. My audio works until the machine sleeps, then it won’t work until I reboot. Sometimes the machine won’t even wake from sleep. Handoff and Continuity do work…sometimes (but hey, that’s no different than a ‘real’ Mac!). I never got iTunes protected video to play. And on and on goes the list of ‘little things.’”

    But even before you decide to give it a try, prepare to have lots of patience. The debugging needed just to get a basic subset of functions working takes plenty of time, and you’ll confront roadblocks along the way that may never be solved. Obviously, Rob is still going to hold onto his iMac.

    That said, maybe Rob’s best move now is to consider buying the 2017 version of the iMac. According to some of the benchmarks I’ve read online, its graphics card, an AMD Radeon Pro 580 using the Polaris architecture, soars way past its 2014 and 2015 counterparts. It may well be fast enough to satisfy Rob’s needs, assuming he is willing to sell off the computers he has.

    In fact, I wouldn’t be surprised if he got enough money for his 2014 iMac, and his Hackintosh, to cover that purchase of a new computer. Rob: What do you think?


    Looking Down at an iPhone: It Changed My Life

    June 21st, 2017

    When Steve Jobs demonstrated the first iPhone at a Macworld Expo on January 9, 2007, I was only half listening. To me, a cell phone was all about making phone calls. Their web browsers and email tools were clunky and clumsy, and how could you type quickly on a telephone keypad?

    At the time, I would cast a curious look at my son, Grayson, while his thumbs busily typed text messages to his friends. Clearly he knew something, but I wasn’t sure it was worth taking seriously. He’d grow out of it, I thought. But that’s what parents always say about their children.

    When the iPhone 3G arrived in 2008, I had the chance to get one from Apple to review. In addition to supporting 3G cellular networks, it arrived alongside the debut of the App Store. When the first iPhone arrived, Jobs talked in terms of web apps, which went precisely nowhere. Having a real app store, with software that ran natively on the iPhone, created a revolution for developers.

    Since iOS was derived from macOS, you could use the same developer tools to create apps for both. On a Mac.

    While other mobile handset makers offered their own app stores over the years, the apps they sold usually performed poorly and cost a lot more than they were worth. In Apple’s App Store, many apps were free, while others were no more than a dollar or two. So it was easy to try them out, see if you liked them, and, if not, delete them without any great loss.

    Where Apple blew past other smartphones was in making the iPhone a real mobile computer, a genuine counterpart to a Mac, and its touchscreen and simple interface were easy for people to grasp. Up until then, the standard bearer, the BlackBerry, and similar devices had physical keyboards. But the keys were tiny, and you almost had to learn to type all over again to master one. They were mostly toys for executives or politicians, and regular people weren’t apt to pay attention.

    It almost harkens back to the days of DOS, and text-based PCs. The Mac arrived with a graphical interface, and it was regarded as little more than a toy, not a serious work computer. Well, until Microsoft delivered a version of Windows that was mostly usable. Suddenly such complaints no longer mattered, although Apple was forever regarded as beleaguered.

    Now about that iPhone from Apple: It changed my evening computing habits almost overnight. Until then, I had the habit of bringing my MacBook Pro into the bedroom to keep tabs on email and to do online research. I’d stick it on a chest and reach for it every so often to stay up to date.

    It didn’t take long until I was able to set the notebook aside. I could just place the iPhone on the night table, or hold onto it while watching the family TV. Either way, it made staying in touch via email or messaging easy. I wasn’t a great consumer of apps — I’m still not — but I found a few that were indispensable.

    I asked Apple if I could keep it a little longer. They allowed me two more weeks. Before I returned it, I ended up buying one.

    Since I signed one of those two-year contracts, I ended up upgrading on a fairly typical schedule. Sometimes I’d get a loaner from Apple in an alternate year, and a couple of times they allowed me to keep it longer if I transferred my own phone number to the device. That way, they didn’t have to pay for the cellular account.

    As you may recall, when you signed a cell phone contract, you were locked in for two years unless you paid a hefty early termination penalty. In a sense, you were buying the handset and paying it off over two years, but your monthly bill never went down after that period.

    Things changed with T-Mobile’s “Un-carrier” pricing scheme, where the purchase of the handset was separated from the cellular plan. You may still take two years to pay off that loan, but once it’s paid off, your monthly bill goes down. Maybe you end up keeping that phone a little longer. But you can also pay a few dollars more and upgrade every year. Just return the old phone, take home the new phone, and you’ll keep paying forever. AT&T Next is an example of such a sales scheme.
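    To see how the installment math shakes out, here’s a rough sketch in Python with made-up numbers; the $650 price and 24-month term are illustrative assumptions, not any carrier’s actual rates:

        # Hypothetical handset-financing math; the figures are illustrative only.
        handset_price = 650.00      # assumed full retail price of the phone
        term_months = 24            # typical installment term

        monthly_installment = handset_price / term_months
        print(f"${monthly_installment:.2f} per month for {term_months} months")   # about $27.08

        # Under the old subsidy model, a comparable charge was baked into the service
        # plan and the bill stayed the same even after month 24. With separated pricing,
        # the installment line item simply disappears once the phone is paid off.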

    At least that’s what the carriers hope.

    Now back to the iPhone: In some ways the history of the macOS is being replayed, but it’s more in Apple’s favor.

    Apple releases a revolutionary operating system and computer, and another company comes out with something that’s almost as good. But cheaper, and built by loads of manufacturers. Apple nearly got buried by Microsoft in the move to Windows.

    But that hasn’t quite happened this time. Although Apple’s iOS retains a small portion of the mobile handset market, it represents the sale of hundreds of millions of costly devices around the world. Only one manufacturer, Samsung, sells more. But the largest proportion of Samsung’s sales is in the lower price ranges. The cheap stuff doesn’t make a lot of profit, but it builds market share.

    Google’s Android OS, in turn, dominates the mobile OS race. Its market share isn’t quite at the level of Windows, but it’s high enough. In turn, Android handsets heavily resemble iPhones. The influences are clear. Apple set the pace and made smartphones warm and fuzzy, and now even people in third-world countries have one. If the iPhone hadn’t arrived in 2007, would there have been a smartphone revolution? Or would smartphones remain expensive executive toys with tiny physical keyboards?

    A smartphone is an important work tool in ways never anticipated even by Apple. Many jobs depend on one. So imagine riding with or driving for such ride-sharing services as Lyft and Uber, which require their navigation apps, and consider whether that would have been possible without what Apple brought to the industry.


    Consumer Reports and its Samsung Disconnect

    June 20th, 2017

    So Consumer Reports has finally issued its verdict on the Samsung Galaxy S8 and the Galaxy S8+. Both score 82 on the current ratings scale, placing them at the top of the heap. Plenty of praise was heaped on the phones, except for the lack of a removable battery, a difficult-to-access memory card, and then there’s that fingerprint sensor!

    Instead of being on the front, in the spirit of Apple’s Touch ID, it’s at the rear. The reason appears to be the difficulty of embedding such a sensor beneath the AMOLED display. This design decision fueled unconfirmed rumors that Apple had encountered the same problem and would be forced to make the same placement decision for Touch ID on the rumored iPhone 8, which is also expected to sport an edge-to-edge display.

    But recent iPhone 8 rumors indicate Apple won’t have that problem, and that Touch ID will be embedded in its usual spot.

    CR gave the iPhone 7 a 76, and the iPhone 7 Plus a 77. Both products were also denigrated for the lack of a removable battery. But the smaller iPhone gets a mediocre rating for battery life, whereas the larger battery in the iPhone 7 Plus gets a rating similar to that of the Galaxy S8.

    Now as to that CR review, it does appear that the magazine’s reviewers were asleep at the wheel. I’m not saying the Galaxy S8 and its bigger brother/sister aren’t deserving of a high rating. But there are some real question marks in the review, serious omissions, which do not make any sense unless the tech staff is living in a bubble.

    So there are published reports of a red tint on some of the Galaxy S8 screens. Supposedly you can fix it with a manual calibration on your unit. That such a step should be required at all is curious, although a carrier update appears to fix it, at least for the carriers who are pushing that update. Still, CR isn’t doing its research if it’s unaware of ongoing problems with this device. At the very least, customers should be warned about the problem and the possible solution.

    Then there are the biometrics. So it doesn’t appear that CR worried so much about the awkward positioning of the rear-mounted fingerprint sensor, or the fact that you might accidentally smudge the camera lens when trying to unlock your phone. This is a poor design decision, one that might discourage people from using it.

    But the real lapse in the CR review is the failure of the magazine’s editors to understand the flaws in the facial and iris recognition sensors. The problems aren’t even mentioned, or weren’t in the version of the review I read at CR’s site.

    Consider that, on the day the Galaxy S8 was rolled out to the media, someone already found a way to defeat the facial recognition scanner with a photo. Do I need to go on? That makes the feature fatally flawed, useless as a security feature.

    But what about the iris sensor? Surely it’s going to be real hard to fool that, even if you remove someone’s eye, as you see on some TV shows and movies, right?

    Well, it won’t take any gruesome surgery to defeat the iris sensor. According to a published report, all that’s required is a photo and a contact lens. That’s how hackers managed to defeat that system.

    Now all of this information is online, so anyone researching these devices can learn about the limits. But CR seems blissfully unaware of such problems. Well, at least the units survived the dunk tests.

    Compare that to one of the Galaxy S8’s predecessors. A Galaxy S7 Active failed CR’s dunk tests. Supposedly Samsung agreed to replace any defective handsets, but only if they have actually sustained water damage. So do you even want to take the chance?

    Now I’m not suggesting that CR needs to write favorable reviews of iPhones, or that the ratings should be changed. But I am concerned that significant facts about lapses in the security features of the new Samsung smartphones were ignored. So there’s an awkward-to-use fingerprint sensor, which, of course, will discourage users from trying it. The iris and facial recognition sensors are seriously flawed and easy to crack.

    Another example of CR’s lack of attention to detail is the way it appears to accept the Samsung phones’ features without considering their usability, or whether they’re even necessary.

    So you know how Apple’s Retina displays work: at a normal viewing distance, you can’t see the individual pixels that make up the image. Add even more pixels and you still won’t see them. Other than bragging rights, what’s the point?
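    If you want to put numbers on that, here’s a quick sketch in Python. The one-arcminute figure for the eye’s resolving power is the usual rule of thumb behind the Retina claim, and the 2960-by-1440 panel resolution is the commonly cited spec for these models, not anything taken from CR’s review:

        import math

        # Rough resolving limit of the eye: about one arcminute per pixel is the
        # rule of thumb behind Apple's "Retina" threshold.
        def max_visible_ppi(viewing_distance_inches, arcminutes=1.0):
            radians = math.radians(arcminutes / 60.0)
            return 1.0 / (viewing_distance_inches * math.tan(radians))

        # Pixel density of a display from its resolution and diagonal size.
        def ppi(width_px, height_px, diagonal_inches):
            return math.hypot(width_px, height_px) / diagonal_inches

        print(round(max_visible_ppi(12)))      # ~286 ppi resolvable at a 12-inch viewing distance
        print(round(ppi(2960, 1440, 5.8)))     # ~568 ppi for the 5.8-inch model
        print(round(ppi(2960, 1440, 6.2)))     # ~531 ppi for the 6.2-inch model

    Either way, both panels land far beyond what the eye can resolve at arm’s length, which is exactly the point.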

    So CR’s reviewers write that the Samsung’s display, which is either 5.8 inches or 6.2 inches, depending on the model, promises “more than 520 pixels per inch of detail, though you may not notice the benefits of those extra pixels in everyday use.”

    You think?

    Clearly CR doesn’t understand what a Retina display is. But since the magazine wasn’t aware of the shortcomings in the Samsung’s biometrics, I’m not surprised.

    Again, this doesn’t mean these handsets don’t deserve high ratings, but the failure to mention the flaws and unnecessary frills is yet more evidence of CR’s serious shortcomings in reviewing tech gear.