Like 5Words? Subscribe via RSS.
More on Trim’s (possible) future.
Labels’ new format for albums.
Gizmodo: Don’t preorder Windows 7.
Will Sony Ericsson make netbooks?
Nintendo patent: ridable, inflatable horses!
Roku, the Internet video box that’s simple and fun to use, with a near-impulse price ($99.95), has a new source of content: Major League Baseball. The Roku folks have signed a deal with the MLB to put live broadcasts of all games on the player, starting with the rest of the season. The games are available to folks who subscribe to the MLB.TV Premium service, which runs $19.95 a month or $109.95 for the year (or $34.95 for the remainder of the 2009 season). The gamecasts join Netflix Watch Instantly and Amazon on Demand video among Roku’s offerings.
The service includes DVR-like rewind, fast-forward, and pause features and HD when available, and you can watch a week’s worth of archived shows. It’s the latest evidence that the MLB is the most progressive organization in sports when it comes to tech savvy; earlier examples include its release of the wonderful At Bat app for the iPhone and its deal to put MLB.TV in Boxee’s media-center software.
Why start showing ballgames well past the All-Star break? Roku’s Brian Jaquet told me that the new service is in beta, and that it “opens avenues” to put other sports on the Roku player. Roku says that an update to the player’s software that enables MLB.TV should be ready for Roku owners starting tonight; it hasn’t shown up when I check for updates on the Roku box I’m using, but I’m looking forward to giving it a try. I don’t need access to every game, but when your favorite team is 3,000 miles away, getting to watch any game you choose sounds mighty appealing.
Facebook, the planet’s largest social-networking site, is buying one that’s relatively small but extremely influential, FriendFeed. The deal involves FriendFeed living on as a standalone site for now, but it sounds like the long-term idea is to build new FriendFeed-like features and technologies into Facebook itself. It’s clearly a major move in Facebook’s chess game with Twitter, and presumably reduces (but doesn’t eliminate) the possibility of Facebook buying Twitter itself at some point.
Reaction among serious FriendFeed fans to the news seems to be largely guarded-to-negative (although FriendFeed überenthusiast Robert Scoble is guardedly optimistic, and here’s Louis Gray’s thoughtful take). Me, I’m basically a FriendFeed dabbler/lurker at best, and I’m keeping my mind open. FriendFeed is impressive in many ways, but it’s complex enough that it’s remained kind of a secret weapon of serious geeks. If there’s one thing Facebook has done well, it’s figured out how to make a complicated (confusing, even) service appealing to millions of people of all sorts. Maybe it’ll be able to work some magic with the FriendFeed team that’ll make all of this make sense.
I’m not personally a chick, but many of the smartest people I know in technology are. And I’m happy to report that the folks at Chicks Who Click, a social networking organization, have asked me to moderate a panel on women in high-tech startups at their conference in San Jose on August 22nd. The panelists are Tara Anderson of Lijit, Emily Olson of Foodzie, and Suzanne Xie of Weardrobe. The whole conference should be both productive and a good time; more details, including registration info, are here.
I’ve contributed another guest post over at WePC.com–this one’s on the question of backup and storage, and whether you want to keep your data in your home or on a remote server. (Actually, I think the answer is clearly “both, for at least your most important stuff,” but I’d be skirting the truth if I told you I’m doing a very good job of backing up files to the cloud, where they’ll be safe and sound even if earthquake, mudslide, wildfire, or attacks by rabid OS/2 holdouts put my local backups at risk.)
Anyhow, I called my post “PC Storage: Your Desk vs. the Cloud.” Check it out, and lemme know what your personal desk/cloud storage strategy is…especially if you’ve found a remote backup service you love.
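(For the curious, here’s a minimal sketch of what that “both” strategy can look like in practice: mirror a folder of must-keep files to a local external drive and to a folder that a cloud-sync client watches. The paths and folder names below are hypothetical placeholders, not an endorsement of any particular drive or service.)

```python
import shutil
from pathlib import Path

# Hypothetical locations: the must-keep files, a local external drive,
# and a folder watched by whatever cloud-sync client you use.
IMPORTANT = Path.home() / "Documents" / "Important"
LOCAL_BACKUP = Path("/Volumes/ExternalDrive/Backups/Important")
CLOUD_SYNCED = Path.home() / "CloudSync" / "Backups" / "Important"

def mirror(source: Path, destination: Path) -> None:
    """Replace the destination tree with a fresh copy of the source tree."""
    if destination.exists():
        shutil.rmtree(destination)
    shutil.copytree(source, destination)

for target in (LOCAL_BACKUP, CLOUD_SYNCED):
    mirror(IMPORTANT, target)
    print(f"Backed up {IMPORTANT} to {target}")
```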
The New York Times has noticed a trend that’s been going on for…well, for decades, probably, but it’s now entirely mainstream: Folks sometimes leap online first thing in the morning, before they’ve so much as brushed their teeth. (Back in the 1990s at the height of AOL addiction, I’m sure plenty of people started their day by checking to see if They Had Mail; as a high school student in the late 1970s and early 1980s, I sometimes took the idea to its logical conclusion by staying up all night on the computer, so there was no waking up involved the next morning.)
Here’s today’s T-Poll–I’m refusing to take this one myself, but you can probably guess what my answer would be…
Sometimes it takes tech companies an amazingly long time to confront the inevitable. The whole war between Blu-Ray and HD-DVD was a rotten idea from the start (both formats were announced in 2002). But all parties involved in both camps insisted on wasting billions developing two competing HD formats. Then it took ages before HD-DVD prime mover Toshiba accepted that it had lost the conflict and discontinued the format. That was in February of last year.
And then it took another eighteen months for Toshiba to announce the inevitable conclusion to the whole saga: It’s joining the Blu-Ray Disc Association and will be selling Blu-Ray players and laptops with Blu-Ray drives. I feel for the company: I don’t think its stance that HD-DVD was the superior format was utterly irrational, and if and when the day comes that I buy a Blu-Ray player, it’s as likely to come from Toshiba as any other company.
But a panel of relatively well-informed consumers could have figured out the likely outcome years before Toshiba ditched HD-DVD and embraced Blu-Ray. Wouldn’t it have made sense for everyone involved to get here more quickly?
(Note: The Toshiba Blu-Ray Disc logo shown above is my quick mockup–I wonder if Toshiba still winces when it sees its name and “Blu-Ray” in close proximity, or if it’s over it?)
In a small way, this is a significant post: It’s the first one in which I’m going to refer to Windows Vista in the past tense. Which might be premature and/or unreasonable–Windows 7 won’t reach consumers until October 22nd, and millions of copies of Vista will be in use for years to come. But last week, I was writing a piece on Windows 7 for PC World, and started to refer to “the Windows Vista era”–and then I realized that it’s hard to make the case that the Vista age ever started. (Even today, two and a half years after Vista’s release, 63 percent of the people who visit Technologizer on a Windows PC do so on Windows XP, versus 27 percent who use Vista–and if anything, you guys should be more likely than the world at large to have adopted Vista.) Already, I’m thinking of Vista as part of the past–in part because I’m looking forward to Windows 7.
More than most technology products, Vista seems to be entirely different things to different perfectly intelligent people. Some say its bad rep is unfair. Others continue to trash it. But you’ll have trouble finding many people outside of Redmond city limits who’ll contend that Vista has been a hit.
What happened? It wasn’t one issue that hobbled Vista, it was all kinds of mishaps, none of which would have been a disaster if it had been the only thing wrong. (In fact, most of them mirrored problems that had happened with earlier, far more successful versions of the OS, such as deadline problems and driver glitches.) Taken as a group, however, they confronted Windows Vista with both karmic and all-too-real difficulties that it never came close to resolving.
I love doing my thing on the Web, but I’d be a liar if I told you there weren’t things I missed about being a magazine journalist. And one of the top three things I pine for is the fun (and challenge) of creating a cover every month. So I love, love, love this video which shows how my pals at Macworld put together the cover of their new issue. And as much hard work as it shows, it doesn’t capture everything involved in doing a cover, since it focuses on the photography and layout aspect of a process that also involves story ideas and wordsmithing, and sometimes multiple, significantly different mockups of different ideas.
Here’s Macworld’s story about the video.
(Side note: I may have implied that I don’t do magazine stuff anymore, but I lied–in fact, this very issue of Macworld has a piece I wrote, on the war between Macs and Windows and why it doesn’t make anybody happier or healthier. It’s the first article I’ve written for the magazine after more than twenty years of reading it…and eventually sitting next to the people who make it.)
(Additional side note: If we’d been clever enough to do time-lapse photography of the complete PC World cover process when I was there, we’d have been required to include footage of me sweating for several weeks as I waited for newsstand sales reports to come in and tell us whether we had a hit or a flop on our hands.)
Tr.im, one of umpteen URL-shortening services used by Twitter fans and other people who need to compress long URLs into as little space as possible, is now the first major player among those umpteen services to call it quits–it’s being shuttered by parent company Nambu. The company says it couldn’t figure out how to make money with Tr.im, and couldn’t find anyone interested in taking it over–and that Bit.ly’s status as the default URL-shortener used by Twitter itself means that Tr.im would fail in the long run no matter what.
Tr.im was a worthy contender, but there are plenty of other perfectly good competitors out there, so its closure won’t be a huge issue for Tr.im users who have new URLs to shorten. What’s worrisome is the status of existing Tr.immed URLs–of which there are scads all over the Web, and which people are continuing to create right now even though the service is closing. If Nambu shuts down the servers that forward the short URLs to the original long ones, the Tr.immed versions won’t work. The company doesn’t say what its long-term plans are for existing URLs, but it does A) guarantee that they’ll still work through the end of 2009; and B) say that running the servers is prohibitively expensive. I assume that’s a hint, at least, that Tr.immed URLs will likely stop working sometime next year. (Unless someone else steps in to save the service–which doesn’t seem unthinkable given the attention the shutdown is getting.)
If Tr.im does go away completely, it’s a wake-up call we all knew would come eventually, if we gave the matter any thought. Non-shortened URLs will work forever–as long as the page they’re in and the page they link to exist, they’re good. Shortened ones live and die at the discretion of the company that shortened them for you, assuming it doesn’t go out of business. And nearly everybody in the URL-shortening game is a very small company without a proven plan for economic sustainability.
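(If you’re curious about the plumbing, the “forwarding” in question is just an HTTP redirect: request the short URL and the shortener’s server answers with the long one. Here’s a minimal Python sketch of how you could resolve and stash the original addresses before a shortener’s servers go dark; the function name and the tr.im link are hypothetical examples, and servers that dislike HEAD requests may need a regular GET instead.)

```python
import urllib.request

def expand_short_url(short_url: str) -> str:
    """Follow the shortener's redirect and return the original long URL."""
    # urlopen follows HTTP redirects automatically, so the URL recorded on
    # the response object is the page the short link ultimately points to.
    request = urllib.request.Request(short_url, method="HEAD")
    with urllib.request.urlopen(request) as response:
        return response.url

# Hypothetical example; substitute the Tr.immed links you want to preserve.
# print(expand_short_url("http://tr.im/example"))
```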
All the information contained in millions of tweets with shortened URLs is tremendously valuable–but many of them simply don’t make sense if you can’t click through to the URL that’s been shortened. Sooner or later, Tr.im’s vanishing act is going to remove all the context from vast numbers of tweets, and the folks who suffer won’t be the people who shortened the URLs, but the ones who want to read those tweets.
I don’t have an inkling what Twitter’s long-term URL-shortening strategy is–hey, are there any clues in those stolen documents?–but I hope it intends to start squeezing down its own URLs. For one thing, I have more faith in Twitter being around for the long haul than I do in the viability of existing URL-shortening services. Also, if Twitter goes out of business, then all those tweets containing shortened URLs may disappear anyhow…