Our Impression of How Apple is Doing is a Lagging Indicator of How Apple is Doing

Apple software honcho Craig Federighi cheerfully waves to the WWDC audience as he takes the stage in San Francisco on June 2, 2014

Lots of people–journalists, bloggers, analysts, random bystanders–love to make grand pronouncements on where Apple is going. Very few are good at it. And part of the problem is that it isn’t even all that easy to understand the state of Apple right this very minute.

I’ve been thinking about that as I’ve read coverage of the news which the company has made over the last couple of weeks–which has included its acquisition of Beats and a WWDC keynote which, while devoid of new hardware, was bursting at the seams with wildly ambitious plans for software and services. Apple as of mid-June of 2014 is interesting in ways which I don’t think anyone was predicting even in late April, before the Beats scuttlebutt emerged and it became clear that WWDC wasn’t going to involve major hardware announcements.

Which means that even the best commentary on recent Apple developments–such as Joshua Topolsky’s “Meet the New Apple“–is playing catch-up with developments which Apple has been secretly working on for months or, in some cases, years. (The new Swift programming language began as a personal project in 2010.)

Topolsky’s piece is full of words which very few observers would have applied to Apple even the week before WWDC: fun, confidence, buoyant, giddy and even open. If it’s reasonable to apply them to the company now–and I believe that it is–it’s not because a switch flipped at the WWDC keynote. It’s because Apple was already changing in ways we didn’t yet understand.

Of course, the same basic dynamic is an issue with nearly all analysis of almost every company: When Google holds its I/O conference later this month, it’s entirely possible that it will reveal something which will render some of our current impressions of the company obsolete.

But perception lagging reality is a bigger factor with Apple than with most companies, for several reasons:

  • Apple really is going through a big, unpredictable shift, not just because Tim Cook isn’t Steve Jobs but because he has a new team. (As Ben Thompson of Stratechery points out, nearly 60 percent of the company’s current top managers weren’t in their jobs in 2010.)
  • Apple spends less time talking about its future–even in broad strokes–than most companies, which sometimes leaves those of us on the outside blissfully ignorant of where it’s headed until it’s well on its way to getting there.
  • People tend to have deeply-held attitudes toward Apple–be they positive or negative–which they have trouble putting aside even when the facts suggest such attitudes may need reassessing.

I don’t mean any of this as a knock on Apple commentators. (At least the smartest ones, a group which certainly includes Topolsky and Thompson.) There’s no shame in only being able to articulate things about a company once the evidence is in.

Actually, that’s a far better way to shed light on Apple than the blustery predictions which so often pass for analysis–and which, to the extent they’re taken seriously, mostly serve to damage the world’s understanding of Apple rather than increase it.


The Turing Test Has Finally Been Passed, and I’m Not Impressed

I swear I’m not trying to be a wet blanket. But I’m reading lots of giddy reaction to the news that a chatbot is the first software to pass the Turing Test–the notion, first proposed by legendary mathematician Alan Turing in 1950, which posits that computers will have achieved intelligence when one of them is capable of convincing humans that it’s a real person. And I’m not convinced that what just happened is a defining moment for either computers or humans.

Traditionally, a computer will have passed the Turing Test when it fools 30 percent of the judges who converse with it into thinking it’s human. The chatbot in question was devised by Vladimir Veselov, a Russian living in the U.S., and Eugene Demchenko, a Ukrainian living in Russia. It duped 33 percent of the judges of a contest held by the University of Reading in the UK into thinking that it was man rather than machine.

Or, to be more precise, that it was boy rather than machine–a 13-year-old Ukrainian kid named Eugene Goostman. The creators of the chatbot, which has been around since 2001, dramatically simplified the challenge they were tackling by building software which posed as a youngster who isn’t entirely proficient in English.

I haven’t seen any transcripts of the conversations which convinced the judges that Eugene was flesh and blood. But RT Today published an “interview” with Eugene. If this is as good as the chatbot’s artificial intelligence gets, I’m not astounded. And if I’d been a judge, I can’t imagine that I would have been snookered.

A sample:

RT: Do you work?

EG: I am only 13, so I’m attending school so far.

RT: What do you study?

EG: Is it “the third degree”? I’m not going to answer such a provocative question!

RT: Do you like science?

EG: I wouldn’t want to be a scientist – in science the credit goes to the man who convinces the world, not to the man to whom the idea first occurs. So that, science is unfair thing a-priori.

RT: Do you have any heroes?

EG: No, I haven’t any heroes! How could you imagine such a trash! Wonna ask me something more?

Even if you don’t expect a Ukrainian teenager speaking in English to be the most dazzling of conversationalists, this comes off as laughably crude and synthetic. It seems to deflect questions it can’t parse by refusing to answer them or shiftily parroting back a snippet–just like the Eliza programs I wrote in BASIC on a TRS-80 computer when I was in high school.
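That keyword-and-parrot trick is easy to sketch. Here’s a toy version in Python rather than TRS-80 BASIC; the patterns and canned deflections are my own invented examples, not taken from any actual Eliza listing:

```python
import re

# A handful of invented Eliza-style rules: a regex that captures a snippet,
# paired with a template that parrots the snippet back as a question.
RULES = [
    (re.compile(r"\bi am (.+)", re.IGNORECASE),
     r"Why do you say you are \1?"),
    (re.compile(r"\bi (?:want|need) (.+)", re.IGNORECASE),
     r"What would it mean to you to get \1?"),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE),
     r"Tell me more about your \1."),
]

# When nothing matches, deflect instead of answering, the same dodge the
# Goostman bot appears to rely on. Cycled deterministically for simplicity.
DEFLECTIONS = ["Why do you ask?", "Let's talk about something else.", "Go on."]
_next_deflection = 0

def respond(line: str) -> str:
    """Reply to one line of input: parrot a matched snippet, or deflect."""
    global _next_deflection
    cleaned = line.strip().rstrip(".!?")   # drop trailing punctuation
    for pattern, template in RULES:
        match = pattern.search(cleaned)
        if match:
            return match.expand(template)  # substitutes \1 with the snippet
    reply = DEFLECTIONS[_next_deflection % len(DEFLECTIONS)]
    _next_deflection += 1
    return reply
```

Three rules and three dodges are enough to sustain a few lines of small talk, which is roughly the level of sophistication the Goostman transcript suggests.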

So I wonder: If this is the first software to pass the Turing Test, is it possible that the victory had less to do with Eugene being a brilliant work of computer science–and more to do with some of the human judges in this particular competition being a bit thick-witted?


At Last, Apple’s Wearable Enters the “Yes, It’s Coming Soon” Phase


Exclusive Technologizer visualization of likely appearance of Apple wearable device

At first, people idly wonder whether Apple might enter an emerging category of gadgets. Then there are rumors that the company is entering the category–but they come from sources without a great track record for accuracy, or involve alleged facts which don’t ring true.

And then at some point someone trustworthy reports that the new Apple product really is on its way within the foreseeable future. That’s when it’s reasonable to assume that it’s not a mass hallucination or a hoax.

The cycle happened with the iPhone and iPad. And now it’s happened with the wearable gizmo that everybody calls the iWatch.

John Paczkowski of Re/code, who’s on my exceedingly short list of reporters whose Apple scuttlebutt I reflexively believe to be true, says that Apple is planning to ship a wearable device in October. He doesn’t have a lot of detail beyond that, but does say–although it scarcely needs saying–that said device will hook into the HealthKit fitness platform which Apple announced during its WWDC keynote.

Yuichiro Kanematsu of Nikkei Asian Review is also reporting that the wearable will arrive in October, and provides some more color–although as is often the case with stories about unannounced Apple products, it isn’t always clear where the reporting stops and the speculation starts:

Though the details of services have yet to be released, specs for the new product are being finalized, according to industry sources. It will likely use a curved organic light-emitting diode (OLED) touchscreen and collect health-related data, such as calorie consumption, sleep activity, blood glucose and blood oxygen levels. It will also allow users to read messages sent by smartphones.

Apple appears confident of the new product. According to a parts manufacturer, it plans monthly commercial output of about 3-5 million units, which exceeds the total global sales of watch-like devices last year.

I’m not convinced that everything in Kanematsu’s piece is automatically accurate. For instance, that “likely” gives me pause, in part because it’s not clear whether it’s Kanematsu’s sources who are deeming the tidbits that follow to be probable, or whether it’s Kanematsu’s own guess. And it isn’t even clear which of the tidbits the “likely” applies to.

I am, however, officially assuming henceforth that Apple plans to unveil a wearable device this fall. Almost everything about the product remains mysterious, but even just feeling confident that it’s fact rather than fantasy is a big deal.


The Ghostbusters Polaroids

Four instant pieces of movie-making history

Any interesting photograph is more interesting still–or at least far more evocative of a particular era–if it happens to be a Polaroid.

That’s my deeply-held belief, anyhow–which is why I’ve blogged in the past about Polaroid images of John F. Kennedy and the Apple-1 computer. (I also told the story of Polaroid’s SX-70 camera in the longest article I’ve ever written about anything.)

And now my Facebook friend Michael Gross–who made the world a much better place as the art director of National Lampoon in its golden age and then as a movie producer–has used his feed to share some vintage Polaroids taken on the set of Ghostbusters, on which he worked as an associate producer. (Among other things, he contributed the movie’s unforgettable logo.)

Michael was nice enough to let me borrow his Facebook photo of the Polaroid photos, which show stars Bill Murray, Dan Aykroyd, Rick Moranis, and Ernie Hudson. Here they are:

Ghostbusters Polaroids

All four snapshots are great, but the one of Murray–well, it may be one of the best photos of Bill Murray I’ve ever seen. Which is saying a lot.

Ghostbusters was released thirty years ago last Sunday; back when it was made, a Polaroid camera was an utterly indispensable tool for moviemakers who needed to document their work as it was going on. (Michael says these particular pictures were continuity shots for the wardrobe department.) Even if many of them didn’t get saved, there must be an awful lot of Hollywood Polaroids which survive. Wouldn’t they make an incredible coffee-table book?


Where Have You Gone, Peter Norton?

The man who made PC utilities famous–and vice versa

Recently on Facebook, my friend, nerd extraordinaire Esther Schindler, shared a photograph of herself wearing an old T-shirt and challenged her followers to identify it:

Esther Schindler

Either you have no idea what that image means, or you know exactly what it is.

It’s the torso, rolled-up sleeves and folded arms of Peter Norton, the man who was once synonymous with PC utility software, on a vintage shirt produced to promote one of his products. I found seeing him again–even without his own head–to be a surprisingly Proustian experience.

Norton was a mainframe and minicomputer programmer who bought an IBM PC soon after its 1981 release and published an enormously successful suite of software tools, the Norton Utilities, in 1982. Its killer app: UnErase, which could recover lost files back before trashcan-style deletion let you change your mind after getting rid of a file.

Norton’s empire grew to include multiple software products, articles (including a long-running PC Magazine column), and books. He was everywhere that PCs were. And then, in 1990, he sold Peter Norton Computing to Symantec, which made the Norton line of software even more successful.

After the sale, Peter Norton himself retained a high profile as a living symbol of PC maintenance; his personal brand was so powerful that it transcended his actual involvement in products which bore his name. (In the 1990s, a friend of mine wrote a book with Norton: By then, I gathered, writing a book with Peter Norton involved…well, pretty much writing a book.)

And all along, that image of a thoughtful-looking computer nerd with crossed arms was instantly recognizable. Here it is in an early incarnation, on a best-selling 1985 tome which Wikipedia informs me was known as the “pink shirt book.”

Peter Norton

Both the formality of the necktie and the rolled-up sleeves of the classic Norton pose are meaningful. He was a pro, but he was also ready to get to work on whatever ailed your computer.

Here, on a Norton manual cover, is the folded-arm Norton as he’s remained burned into my brain all these years. (I’d forgotten that he didn’t wear glasses all along, at least when posing for photographs: Once he did, it added to his authoritative air.)


And here, in an image which I find vaguely unsettling, is a folded-arm, pink-shirt Norton in a very early (1991) ad for Norton AntiVirus, which eventually became the best-known Norton-branded product:

Peter Norton

Esther, who seems to have done a better job of holding onto interesting computer-industry tchotchkes than I have, still has her Peter Norton Mug:

Peter Norton Mug

As the image below, which I borrowed from this blog post, shows, cross-armed Norton was not only iconic, but also a computer icon. (Don’t hold me to this, but I could swear that at least one version of the Norton Utilities sported an interface featuring an animated version of Peter.)

Peter Norton

I started using MS-DOS PCs on a regular basis in 1991, and that’s when I became a user of Norton software. I was particularly fond of Disk Doctor, which repaired corrupted hard drives; and Disk Editor, which let you view and edit the data on your disk byte by byte. I used both of them more than once to recover from disaster, back when hard disks crashed a lot more often than they do today. I also swore by NCACHE and Speed Disk, two utilities which were superior to their Microsoft equivalents.

No disrespect meant to later, Windows-based Norton products–they’ve rescued my computer on more than one occasion–but for me, the golden age of Peter Norton’s software was when it was mostly DOS-based, lightning fast, and let you dig deeply into your computer. When Norton software switched to Windows, along with the entire utility industry, it got more bloated and tended to go after a larger, more consumery, less nerdy audience. It had less to do with the programs which Peter Norton himself had begun writing in the early 1980s. It felt less real.

Still, even in the Windows age, the crossed-arm Norton was so famous that I remembered it as appearing on all his software products and books.

Not so. Poking around the Web, I found images of him just sort of standing there (in a pink shirt), leaning on computers, hanging out with co-authors, brandishing toolboxes and hourglasses, wearing a stethoscope, and performing magic tricks with gears. And, on one particularly entertaining package, garbed in a garage-style uniform with a “Peter” label, working on a humongous floppy disk jacked up in the air.

Norton Boxes

That bottom row shows boxes dating from the turn of the century, when Symantec doubled down on Peter Norton imagery–right before it permanently removed the man altogether from product packaging in 2001. It was the end of an era, even though I’m not sure if anyone noticed at the time.

Why did Norton products no longer carry pictures of their founder? I don’t know. Maybe Symantec did extensive market testing before it made the move; maybe not.

But these later-era packages, with photos of happy, confident computer users rather than a problem-solving computer geek, hint at the company’s thinking.


I don’t care what the rationale was: Depicting someone other than Peter Norton on a Norton box was like Planters decorating a can of nuts with an anthropomorphic legume who wasn’t Mr. Peanut.

Today, Norton is probably still the best-known name in utility software; it’s even used on products for iOS and Android. But Symantec has completely disassociated the brand from Peter. Just as nobody remembers anything about Duncan Hines other than that he licensed his name to a cake-mix company, it’s possible–maybe even dead certain–that most people who use Norton products don’t have a clue who Peter Norton is.

Here’s the current Norton Utilities package, which seems particularly interested in reminding us that Symantec is the company behind Norton:

Norton Utilities

I’m glad that the Norton Utilities still exist–they even include modern versions of some of the programs I once loved, such as Disk Doctor. But so much has changed about PCs that it’s tough to remember how essential early versions of the package were. Creating a really good data-recovery program is not the road to fame and riches that it was in the 1980s.

Of course, an awful lot of people who buy Norton products these days never see a box at all. Shrinkwrapped software of the sort which once brought Peter Norton glory is largely a thing of the past; Symantec has migrated much of its line to a downloadable, subscription-based model.

Bottom line: If there’s a young Peter Norton out there today, he’s not going to become famous by appearing on utility-software boxes sold in retail stores.

As for Peter Norton himself, he may have been synonymous with PC utilities, but they weren’t his sole obsession. After selling Peter Norton Computing to Symantec, he went on to spend a sizable chunk of his loot on philanthropy and collecting modern art, two entirely admirable pursuits. I met him a few years ago when we were on a panel together, had a pleasant chat, and got the sense that he was perfectly happy not to be the guy on the software box anymore.

And yes, meeting Peter Norton did feel a little like encountering Betty Crocker or Mr. Clean in person. I hope I had the presence of mind to thank him for all the times he saved my bacon…


Kickstarter is Loosening Up. Let’s Hope It Still Feels Like Kickstarter

Back in 2012, I wrote a feature about Kickstarter–the site which kickstarted the crowdfunding phenomenon–for TIME. (Here the article is, lurking behind a paywall.)

One of the things I found fascinating as I reported the story was that Kickstarter’s founders–Perry Chen, Yancey Strickler, and Charles Adler–weren’t in love with the site’s public image, which at the time had a lot to do with the giant sums of money raised by gadget-y projects such as the Pebble smartwatch and Ouya game console.

For its founders, Kickstarter was about finding funding for creative endeavors–the more creative the better, and regardless of whether the endeavor in question required a lot of money or just a little. When they told me about its success stories they brought up small-time, deeply personal stuff like games and artisanal jam, not Pebble or Ouya. And they didn’t like people thinking of their creation as an online marketplace or a new-age form of venture capital for consumer electronics. (In fact, later that year, they tightened their rules to discourage some campaigns and announced the restrictions with a blog post titled “Kickstarter is Not a Store.”)

Fast forward to today. The site is announcing that it’s loosening up its guidelines for project creators, eliminating many of the restrictions on what a crowdfunding campaign can involve–though there are still bans on such items as weapons, medical products, pornography, hate speech, charity requests, and–go figure!–anything that’s illegal. Creators will also be able to launch campaigns without having them pre-approved by the site, a measure which will presumably help it ramp up the volume of projects. (For all of Kickstarter’s influence, it still operates on a relatively small scale by web standards–there have been a total of around 63,000 successfully funded campaigns in its first five years.)

Over at The Verge, Adrianne Jeffries has a good story on the changes. They still leave Kickstarter with a more narrowly-defined mission than its older archrival, Indiegogo; as far as I can tell, that site doesn’t explicitly ban medical products, porn, or charity campaigns, for instance. That has led to some odd undertakings over there, including a campaign for a wristband which can allegedly monitor your calorie intake, the subject of an excellent investigation by Pando Daily’s James Robinson.

Still, the new Kickstarter will embrace people beyond the “artists, designers, filmmakers, musicians, journalists, inventors [and] explorers” who it originally said it was designed to serve. And I confess to being a little nervous about the transition.

I came away from my time with the company’s creators impressed by the clarity of their vision, and their willingness to err on the side of preserving it. It was a very different story than you usually hear from the founders of venture-backed startups, who are usually under intense pressure to get as big as possible as fast as possible.

Then again, the logic of why some Kickstarter campaigns were kosher and others weren’t was always fuzzy. In the past, underwear was welcome but bath products were forbidden; social-networking software was fine but an actual social network was not. My guess is that the revised rules will simply allow a bunch of efforts which everybody (mistakenly) assumed were legit all along, and therefore won’t radically change the game.

Or so I hope. The best Kickstarter projects have always been quirky and creative–a fact which surely has something to do with the quirky, creative nature of the site itself. I’m keeping my fingers crossed that it emerges from these changes still recognizable as its idiosyncratic, lovable self.

UPDATE: Kickstarter CEO Yancey Strickler responded to my fretting with an encouraging tweet:


Flappy Bird Arrives, and Swift is Officially a Serious Programming Language

In the world of computer languages, maybe the most famous program of them all is “Hello world”–the code which displays that two-word greeting. You can write it on the first day you learn a new language, and it’s often used to verify that the system is working properly.
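For the record, the program in question really is as small as programs get. A Python rendition, for instance (every language has its own one- or two-line version):

```python
# The canonical first program: verify the toolchain works by printing
# a two-word greeting.
def greeting() -> str:
    return "Hello, world!"

print(greeting())
```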

Maybe we should formally declare Flappy Bird to be a sort of more ambitious version of “Hello world.”

Yesterday, Apple startled attendees at its WWDC keynote by announcing that a new language, Swift, would replace Objective-C as the way to write apps for iOS devices. And within hours, someone had implemented Flappy Bird in Swift. (This version is officially called FlappySwift, and I learned about it from the Twitter feed of Apple’s Chris Espinosa.)

When Flappy Bird first became wildly popular a few months ago, I assumed that it was the very definition of a flash in the pan–something which would obsess the world for a brief period, and then disappear. But even though its creator, Nguyen Ha Dong, yanked it off of Apple and Google’s app stores, it’s still burbling around in the world’s subconscious. (Of course, the yanking only served to raise the game’s profile.)

Could Flappy Bird turn out to be like Pac-Man: a fad which was huge, then subsided, but never really vanished from the culture? If the children of 2024 can identify Flappy at a glance, we’ll know that the little guy isn’t going anywhere.


The Best Parts of Apple’s WWDC 2014 Keynote Sailed Right Over the Heads of Consumers. Good!

Tim Cook

Tim Cook wraps up an unusually dense WWDC keynote

If you were Apple, and you were trying to construct a WWDC keynote with the principal aim of whipping consumers into a frenzy, you’d make sure it was about sexy new devices, first and foremost–and ideally devices in the new categories which so many people are assuming the company will soon enter. You’d want to show off products so obviously lustworthy that folks couldn’t wait to get their hands on them, and then have them in stock at the Apple Store as soon as possible.

Here’s what you probably wouldn’t do: Skip hardware altogether in your keynote in favor of software, with a presentation that got more and more technical and abstract as it went along, until a person quite literally needed to be an engineer to understand it.

But the thing is, anyone who’s paying attention already knows that the primary aim of a WWDC keynote is to get software developers excited about the future of Apple’s platforms–maybe now more than ever, given the widespread (albeit silly) received wisdom that Android has defeated iOS.

iOS 8’s new Photos on an iPad and an iPhone

So nobody should have been startled by the fact that the WWDC 2014 keynote did not involve an iWatch, an Apple HDTV, a Retina MacBook Air, or, as it turned out, any new hardware whatsoever. Tim Cook, software honcho Craig Federighi and company talked only about operating systems and related matters–and after the preview of OS X Yosemite and the first part of the discussion of iOS 8, they spoke only to developers, developers, developers.

Even without the developer-focused crescendo, the keynote included plenty of news about features which will make Macs, iPhones and iPads more useful. Much of it involved Apple’s versions of features which have existed in Android and/or in other companies’ apps and services for years, such as:

  • iCloud Drive (file-storage features similar to Dropbox and OneDrive);
  • Features which make iCloud a repository for all your photos rather than just temporary storage for recent pictures;
  • Both predictive typing in iOS’s standard keyboard and the ability to install third-party keyboards such as SwiftKey and Swype;
  • New iMessage features reminiscent of WhatsApp and SnapChat;
  • The ability to wake Siri by saying “Hey Siri,” much like Android’s use of “OK Google.”

Apple won’t win any awards for pure creativity for any of these updates, but that doesn’t make them less useful; some of the most important features an operating system can add are the ones which are overdue. And in some instances, Apple’s take on existing concepts does look innovative–such as Continuity, a suite of features designed to let you use a Mac and iOS device together, including the ability to use a Mac as a speakerphone for an iPhone.

Then there were two much-rumored iOS additions: Health (the app we thought was going to be named Healthbook) and new home-automation features known as HomeKit. Both are potentially huge deals, but got explained only in elevator-pitch form. We simply need more information before we can come to any firm conclusions about them.

If Health and HomeKit got short shrift, it was to free up more room in a very dense keynote for discussion of other new developer tools, most of which are closer to iOS’s core than health and home automation are:

  • Extensibility is a framework which allows apps to talk to each other, opening up opportunities for integration at least vaguely akin to those offered by a technology called AppLinks which Facebook is spearheading. It will also let third-party developers insert full-blown widgets into the Notification Center, giving Apple devices a long-awaited answer to the widgets which have long been a defining feature of Android;
  • CloudKit provides developers with tools for hosting apps and data on Apple’s iCloud platform, taking the company into territory currently ruled by Amazon Web Services and Microsoft’s Azure.
  • Metal lets games leverage the power of Apple’s A7 processor with as little overhead as possible.

The last announcement at the keynote–unquestionably the “one more thing,” though nobody called it that explicitly–was a new programming language called Swift. No consumer will ever see Swift, and few will even know what it is. But if it lives up to Apple’s sales pitch for it, it’s a sea change–a modern way to create iOS apps that’s designed to be faster, more powerful and more approachable than Objective-C, iOS’s current language.

When I talk to app developers, they have no problem finding things about the iOS ecosystem to praise, but I don’t think I’ve ever heard any of them list Objective-C as a plus for the platform. Instead, its learning curve is often mentioned as an obstacle. And as the Apple executives onstage showed off Swift–including some features which I cheerfully acknowledge I didn’t comprehend–the engineers in the audience went wild.

I do know enough about software development to understand that the news which Apple announced today is going to make iOS a much richer development platform. Apple isn’t dismantling the sandboxing which limits what apps can do without permission, thereby making it tougher for buggy ones to screw up your phone or malware to do intentional damage to your data or privacy. But it’s giving developers the ability to build experiences which don’t feel constrained in the way that iOS apps sometimes do.

Once that happens, people who own Apple hardware will reap the rewards of WWDC 2014. Including the ones who caught the livestream of the keynote and considered it to be a snoozefest.


Time For Some TWiT

I had fun spending my Sunday afternoon guesting on This Week in Tech–with guest host Mike Elgan, CNET’s Lindsey Turrentine and Katie Benner of The Information. We talked about a dizzying array of stuff: WWDC, Google self-driving cars, NSA facial recognition, Glenn Greenwald’s book, Android TV, Amazon vs. Hachette, the right to erase bad stuff from the Web, and–oh yeah–this new independent version of Technologizer.

Here’s the show:


WWDC 2014 Live Coverage: Come Join Me on Twitter

In many cases, the only way to learn what Apple is announcing at a media event in real time is to read liveblogs or Twitter. This time around, for the WWDC keynote on Monday morning, the company is doing a live webcast–so in theory, you don’t need someone like me.

Still, I’ll be in the audience at Moscone West at 10am PDT on Monday to share news and insta-reactions. I’ve decided to do my live coverage over on Twitter, and will report back here with future thoughts once I’ve had at least a few fleeting moments to reflect on whatever gets announced.

See you on Twitter in the morning–and then again back here, I hope.
