Updating to iOS 5 can disable notifications in RE.minder, GAME.minder, Better Clock

Did you update to iOS 5 in the last 2 weeks? Chances are you did. We've received many reports of RE.minders not going off after the update, and after some research, we've determined that Apple took it upon themselves to (randomly?) change the notification options for various apps after the update, RE.minder, GAME.minder and Better Clock included. Read on for the fix.

iOS 5 introduces the new Notification Center, which handles all notifications, system-wide, a little differently. Notifications now default to a strip across the top of the screen that goes away after a short time. Without any other changes, this is how RE.minder will display notifications under iOS 5. You can see whatever notifications you have waiting by swiping down from the top of the screen.

It's possible that Apple has set RE.minder's notifications to "off" on your device in the upgrade to iOS 5. To fix the issue, go into the Settings app and select Notifications, then scroll down until you see RE.minder (or RE.minder PRO+, GAME.minder or Better Clock). If any of these apps have been disabled, they may be all the way at the bottom of the list under "Not In Notification Center".

Once you select the app, you will see the options for notifications: None, Banners (the new way) and Alerts (the old way). Just select the notification type you want to use and all your RE.minders should show up again. Also make sure to switch Badges and Sounds back to "on" if they have been switched to "off".
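For the developers among you who are curious why those toggles matter, here's a minimal sketch (in Swift, using the UILocalNotification API of that era; this is not RE.minder's actual code) of how a reminder app schedules a local notification. The alert style, sound and badge settings you pick in Settings decide how, or whether, a notification like this ever appears:

```swift
import UIKit

// A minimal sketch of how a reminder app might schedule a local notification
// (illustrative only, not RE.minder's code). Whether it shows as a banner or
// an alert, and whether the sound and badge fire at all, is decided by the
// user's choices in Settings > Notifications.
func scheduleReminder(text: String, at fireDate: Date) {
    let note = UILocalNotification()
    note.fireDate = fireDate                               // when the reminder goes off
    note.alertBody = text                                  // text shown in the banner/alert
    note.soundName = UILocalNotificationDefaultSoundName   // only heard if Sounds is on
    note.applicationIconBadgeNumber = 1                    // only shown if Badges is on
    UIApplication.shared.scheduleLocalNotification(note)
}
```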

We're still trying to figure out why Apple would change these settings without letting us (the developer) or you (the user) know, but for the moment, we just have to fix it manually on a case-by-case basis.

If that doesn't fix the issue, please do let us know by contacting us.

Thanks!

You are your quality control team

It's hard having pride in your work. It's hard taking the extra time needed to make something you are proud of instead of just something close enough. In a perfect world, we would always have time to make a product completely perfect before sending it out the door. But as we all know, we live quite a ways down the road from a perfect world.

Over the weekend, I had an experience via the App Store that surprised me but probably shouldn't have - I downloaded a truly bad app.

The realities of the App Store and the current economy mean that we as developers constantly try to live in an uneasy Lagrangian point between speed, quality and cost. Being first with an idea can mean the difference between success and failure, but being first with a bad implementation can be worse than losing the race.

Here at Handelabra, we do our best to make sure our products are worth using. When we make mistakes, and we do, we work our butts off to fix them. But not everyone does. In the new world of indie development and self-publishing via the App Store, there's something interesting happening - the breakdown of quality control.

Yes, Apple must approve any app before it is released, and yes, they have a list of rules that ostensibly guard against the bad eggs, but my experience this weekend reminded me that the role of publisher is not completely vestigial.

Software development is an interesting thing. It's an incredibly technically precise endeavor that requires a wonderfully artful touch to be done well. And as with any complex undertaking, it's hard to find people that have all the right skills in a single body, or even to assemble a team with all those skills in only a few bodies. But the absence of certain aspects doesn't make them any less necessary. And unfortunately, some small developers, when pulled too far by the gravity of costs, lean on Apple to fill the roles they don't have the time or the money to fill themselves. But the problem is that Apple, via the App Store, is not in a great position to fill that role for us.

We all love Apple; that's why we develop for Apple platforms. But Apple's goals with the App Store approval process are very different than ours with our products. Their goals are to maintain a consistent experience, to protect iPhone and iPad customers from "objectionable content" and to provide customers with incentives to stay with the platform (and to make sure we're not using any of those sweet, sweet private APIs). Our goal (and I hope I'm not speaking out of turn) is to create compelling, useful and bug-free experiences for our customers. Letting Apple be your quality control department is a mistake. Sure, they may catch some truly egregious bugs (or not), but the factors they are controlling for are probably different from those that will most affect the customer experience of our software.

The app I used this weekend clearly passed Apple's gatekeeper (although it's questionable whether it should have). But before even getting there, it should have faced a much more meticulous gatekeeper who was concerned with the app and its experience, and not simply whether Apple would approve it.

And cards on the table - I'll admit that I'm as guilty of this as anyone. When we've been polishing an app or an update for a month, I want it on the store so bad I can taste it. I'm constantly being reminded by my team that it's more important to get it right and make it solid before letting it out, short-term revenue be damned.

But then, I'm lucky enough to have a team that takes pride in their work. My job in this case is to get out of the way and let those inner quality assurance beasts come out to play.

Expo Bingo, the game to play during the Keynote

Some of you Apple fans, you know who you are, will remember the keynote bingo cards of years past. They would travel around the internet the week before a big reveal and were a fun way to participate in the speculation. Well, we were shocked to discover that there was nary a keynote bingo game to be had on the App Store! A situation we felt needed to be remedied.

And late last week, Expo Bingo was approved. Free for all and universal for all iOS devices, Expo Bingo is a game we whipped up for the community that is just like the keynote bingo cards of yore, only even better.

When you launch Expo Bingo, it will check in and download the latest bingo squares so we can make sure the rumors are as up to date as possible. It also gives you a totally random bingo card, so you really are playing against the world. And since Apple isn't the only company with big presentations and huge announcements, we'll go ahead and set up a game for just about anything if you just ask. Aside from WWDC tomorrow, we also have games set up for the 3 big E3 presentations from Microsoft, Sony and Nintendo this week.
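To give a flavor of what that looks like under the hood, here's a rough sketch of the idea - hypothetical URL, made-up data shape, definitely not the app's real code - showing how a client could download the latest squares and deal out a random 5x5 card:

```swift
import Foundation

// Illustrative sketch only, not Expo Bingo's real code. The URL and the idea
// of a flat JSON array of square strings are assumptions for the example.
func buildRandomCard(from allSquares: [String]) -> [String] {
    // Assumes the pool has at least 24 squares. Shuffle the full pool of
    // rumor squares and deal out 24, so every player gets a different board.
    var card = Array(allSquares.shuffled().prefix(24))
    card.insert("FREE", at: 12)  // classic free space in the center of the 5x5 grid
    return card
}

// Fetch the latest squares at launch so the rumors stay current.
let url = URL(string: "https://example.com/expobingo/wwdc2011.json")!  // hypothetical endpoint
URLSession.shared.dataTask(with: url) { data, _, _ in
    guard let data = data,
          let squares = try? JSONDecoder().decode([String].self, from: data) else { return }
    let card = buildRandomCard(from: squares)
    print(card)  // 25 squares, row by row
}.resume()
```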

And because a game like this is only fun when you're competing against others, you can share your winning board for all to see via Twitter or email. Go ahead and gloat!

Get it for free right now!


Every App is Multi-touch (even if it's not)

Back then…

In the mid-to-late '90s, there was something spreading across the internet like herpes. It promised freedom from the tyranny of table-based layouts, rich animations, vector graphics that could scale to any size and pixel-perfect reproduction on any machine, regardless of browser, OS or platform.

That infection was (and is) Flash (now Adobe, then Macromedia, previously FutureSplash).

What made this technology so appealing to designers was the promise that they could have complete and utter control over the presentation of their designs. No more worrying about how IE4 would render that table vs Netscape 3.2. No more sticking with Arial, Times New Roman and Comic Sans. Build your Flash file at 400x600 and everything will always be exactly where you want it. But more than that, you are free to completely re-imagine the entire concept of web navigation. Forget about that back button, forget about users deep-linking to a specific page; your website is now a black box within which you, the designer, are god - usability be damned. In the immortal words of Jeff Goldblum in Jurassic Park, "We were so busy figuring out if we could, we didn't stop to think about whether we SHOULD".

As with most new technologies, it took some time for people to learn what Flash was good at and what it wasn't, when to use it and when it was overkill, and probably most importantly, WHY to use Flash (some are still fighting to learn this lesson). Flash brought a bunch of new functionality to interface design. For instance, JavaScript offered rollovers, but now Flash could give you animated rollovers with dynamic hit areas. What this meant to the overall goal of usable interfaces is still up for debate, but one thing that DIDN'T change through this r/evolution was the method of interaction - an onscreen cursor, driven by a mouse.

Today

With the growing ubiquity of touch-based interfaces, we're seeing the first real paradigm shift in user interfaces since Steve Jobs visited Xerox PARC back in 1979. While Flash helped us to learn that interfaces could be fluid, living and changing things, touch is teaching us new lessons.

What makes touch such an interesting development is where it's being used primarily - mobile devices. In the mouse and cursor world, the interface can do anything, as long as it can be manipulated with a single point traveling across the screen. Those who maintain this thinking moving into the touch world do so at their peril. Sure, there will always be software that just needs a series of clicks (now taps) to function, but in the mobile world, those too are multi-touch apps.

Why? Because possibly more important than simply incorporating more than one finger on the screen is remembering a touch point that many seem to forget - the hand holding the device. On smartphone handsets, where it's possible to effectively hold the device in one hand and operate it with the thumb of that same hand, this is less of an issue than it is with the new, larger devices like the iPad and Galaxy Tab. On these devices, it's non-trivial to plan for how users will hold them in physical space.

The quintessential multi-touch experience for the iPad is Uzu, a particle/acid trip generator that can track all 10 fingers simultaneously. Obviously, if you are using this app by yourself, the only way to do so is to lay it on your lap or a table. Once you do so, its two-handed nature is a wonder to behold. Yet as fun as it is to play with, it can be awkward if there's no convenient place to lay it down. This becomes even more apparent if you try to thumb-type while holding an iPad in landscape orientation.
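For developers wondering what tracking that many fingers actually involves, here's a bare-bones UIKit sketch (my own illustration, not Uzu's code) of a view that follows every active touch at once:

```swift
import UIKit

// A bare-bones sketch of multi-touch tracking in UIKit (illustrative only).
// Enable multi-touch and keep a dictionary of every active finger's position.
class TouchCanvas: UIView {
    private var activeTouches: [UITouch: CGPoint] = [:]

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true   // off by default; required to see more than one finger
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { activeTouches[touch] = touch.location(in: self) }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { activeTouches[touch] = touch.location(in: self) }
        // e.g. feed activeTouches.values into a particle emitter here
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { activeTouches[touch] = nil }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { activeTouches[touch] = nil }
    }
}
```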

Then look at a game like Snood, a game that has historically used interfaces from controllers to mice and keyboards. The touch and drag mechanic works for aiming but the firing mechanic requires you to tap directly on the cannon. During development, it was probably assumed that most people would hold the device with one hand and manipulate the game with the other hand. But in practice, I have found that firing with an index finger is far less accurate than a thumb. Why? Because when held as you see in the second photo below, the thumb is anchored to the device. An index finger is essentially floating over the device. As you then move in to tap, your aim can shift and you tap (or even tap and drag) in a way you didn't intend. Most attribute this to some sort of "fat finger syndrome". Another way to say this is that touch interfaces have no state. When you stop moving a mouse, the cursor stays where you left it. When you finish a tap, the cursor disappears (if it ever existed in the first place).
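One common way to soften that "floating finger" accuracy problem - a general technique, not anything Snood actually does - is to give a small control a tappable area bigger than what you see on screen. A quick sketch:

```swift
import UIKit

// Illustrative sketch: expand a small control's hit area beyond its visible
// bounds so a tap that lands slightly off-target still registers.
class ForgivingButton: UIButton {
    /// Extra points of slop added around the visible button on every side.
    var touchSlop: CGFloat = 20

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // Grow the bounds outward by `touchSlop` before testing the touch.
        return bounds.insetBy(dx: -touchSlop, dy: -touchSlop).contains(point)
    }
}
```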

I often play simple games like Snood while I "watch" TV and I can tell you, holding the device like this for an hour leads to quite the cramp in my "firing hand". The designers of Snood probably don't think of that game as "multi-touch", and that is why it's a game I can only play in short bursts. They've forgotten (or failed to learn) that in the world of mobile devices, EVERY app is a multi-touch app.

Congratulations - you are now a hardware designer

What this all means for the future of software interface design is that the lines between software and hardware are going to become VERY blurry. The world of Flash began to teach us that just because you CAN put the navigation in a spiral around the center of the screen, that doesn't mean you should. Similarly, the touch world is beginning to teach us that EVERY piece of software is multi-touch, even if it's just a series of single taps, because the hand holding the device is just another touch point.

This is why it's so awkward to do full typing on the iPad. Apple (paragons of usability though they may sometimes be) completely failed to plan for MOBILE typing on their MOBILE device. When it came time to tackle typing, maybe in an effort to avoid the "big iPod touch" moniker, maybe because it just didn't occur to them, they completely threw out everything they learned about thumb typing from the iPhone and instead tried to build a touch-based laptop keyboard. If you are in portrait and need to type something on your iPad, your options are simple: double the length of your thumbs, find a table, or contort your body into the "iPad crunch", as I call it (knees together, back hunched - see below).

In a world where the software designer has planned for the hardware, you instead get something like this (click for a larger version):

As we move into 2011, there will undoubtedly be a number of cool innovations in the multi-touch space. But the most important innovation has already happened, and it's simply time for everyone involved in interface design to remember -

Every App is Multi-Touch.

When iterative development goes wrong - the new Digg.com

Iteration. It is, in many ways, the lifeblood of software development. It tends to be understood these days that, once you release a product, your next step is to start working on version 2 immediately. The goal is usually to take what you've done, see what's working and what's not, learn from it, and iterate. Take out the stuff people don't like, add in new stuff that they will. This doesn't happen only in software, either. The iPhone is a great example of iterative development. Each new iPhone has added things to the equation, driving people to a new version. Even if they were happy with the one they have, the new iteration often has enough value to convince people to upgrade.

It is rare but possible that the "next iteration" could abandon a current product for something better, but completely different. Apple has done this many times with varying degrees of success. The example of success came when, at the height of its popularity, they end-of-lifed the iPod mini only to replace it with the iPod nano. The new version was, at its heart, still a small iPod, but the form factor changed (it got thinner and lighter) and the internal tech also changed (it went from a tiny hard disk drive to flash memory). It was a resounding success, but a step many in the tech world would have been afraid to try.

The less-than-successful example was iMovie '08. After bringing dead-simple home movie editing to the masses with the timeline-based iMovie, and iterating it for several popular versions, Apple chose to move consumer video editing to a completely different paradigm. I'm not going to discuss the merits of the new approach (I hated it with a passion at first), but the fallout was a perfect example of what happens when you rock the boat just a bit too much. The people revolted, and Apple was forced to keep iMovie 6 available as a download to appease those who just didn't dig the new paradigm.

So into which of these examples does the new Digg.com fall? If you ask me (and a lot of other people), it falls squarely into the latter. Digg is trying to remake itself into a "social news" site, and while that may be a noble goal for Digg as a company, the problem is that they are doing it at the expense of what Digg truly is - a crowdsourced news aggregation service. While there are, I'm sure, a lot of users who will gravitate to the new features like following the Diggs of friends (I have none on Digg) or following certain news sources (I don't), it shocks me that Digg has so completely misread its core functionality. But more important than misreading it, they've apparently abandoned it.

Here's how I used Digg:

  • first thing in the morning, load Digg.com
  • scan down the page, looking for articles that interest me
  • middle click to open links in background tabs
  • arrive at headlines I've already seen several pages in
  • start reading the articles in the tabs
  • once done, head back to the front page to see if anything new has come along
  • repeat throughout the day.

This use case is, unfortunately, no longer a viable way to use the service. For one, the new Web 2.0 "load the next page below the current page" behavior makes the entire site essentially one big page. How do I get to page 4 quickly without loading pages 1-3 first? You can't. For two, there seem to be far fewer, and far less interesting, stories making it to the front page now, which is, BTW, no longer the front page. If you are logged in, the front page is actually your new social news page. Confused yet? I am. What was a useful tool for quickly finding the latest "popular" news is now apparently a tool about popularity that might contain news. Maybe.

So what does a company do when the iterative process goes so wrong? Following the Apple example, maybe Digg will come around in the next week or so and offer a way to switch the view you get when logged in, allowing users to "just read the news", which is all I really want. However, since the news that appears on Digg is user-driven, the ultimate danger is that this disruptive moment has driven enough users away that the new Digg can never become what the old Digg was.

I won't begin to speculate as to what research was done before making such a big transition. I also don't fault Digg for trying something new. Innovation is a great thing. But if eBay closed up shop tomorrow and reopened as MySpace, users would be generally confused. The problem comes from abandoning the core function to attempt to create a new core. If the risk pays off, you could be selling iPod nanos. If it doesn't, you're stuck with iMovie '08.

So how's reddit doing?

Legal and in the wild, on the current iDevice Jailbreak

So the web is all abuzz today as a new jailbreak has been officially released that allows ALL (according to what I've read) iDevices to be jailbroken with a simple trip to a website. And following the DMCA rule changes handed down last week, the legality of iPhone jailbreaking is no longer grey but pretty black and white.

When I first jumped on the iDevice bandwagon a few years back with a first-gen iPod touch, I did the jailbreak just to see what was out there. At the time, there wasn't much that interested me, and I soon after took it back to "blessed". When I got my first iPhone (a 3G), there was one, and only one, feature that tempted me to jailbreak - custom SMS ringtones. I once again waded into the jailbreak waters and found that, while I did manage to get the custom ringtone I wanted, the battery life of the device suffered overall, so I once again went back to "blessed". This was more than a year and a half ago.

Now that jailbreaking has been declared legal, and is now easier than ever for end users, I find that the one feature I still want has yet to make it into the official version of iOS - custom SMS ringtones. So do I jailbreak or not? While the reasons behind the decision to keep this oft-requested feature out are undoubtedly typical Apple "we just don't want to" fare, I find that I have no real desire to follow the masses down the rabbit hole anymore. Part of it comes from a desire to keep things on the up and up on the developer side of things, but honestly, the multitasking brought by iOS 4 answers the only other real complaint I have, so for the time being, I'll sit tight.

What about you? Are you going to jump on this most recent and simplest jailbreak yet? If so, what features are drawing you to it?

Weighing in on "Antennagate"

I've been avoiding weighing in on the iPhone antenna problems for one simple reason - I have no antenna problems. I said it the day I got my iPhone 4, and I'll say it again today - the iPhone 4 is the best iPhone Apple's ever shipped. I can now make calls in my basement, which is something I could never reliably do with the 3G or 3GS.

In my humble opinion, the Antennagate phenomenon is simply the media taking the first legitimate problem with the iPhone and running with it. Headlines like "New iPhone still great!" and "iPhone has ANOTHER record-breaking launch!" only draw so many clicks after you've run them 3 or 4 times. I chalk it up to the "teach the controversy" philosophy. Everyone knows that there are Apple haters and Apple lovers, and if you can just find a way to stir the pot a bit, the twitterverse, blogosphere and facebookalaxy (to coin a neologism) will grab ahold of it and run.

Most of the (completely and totally non-scientific) polls I have seen indicate that a very slim minority of users are experiencing serious problems using the new iPhone. What is driving the controversy is the fact that it is easily reproducible by gripping the phone in a way most people never will.

Some people will inevitably jump ship to Android or others, and that's fine. The truth is, what will keep Apple on their toes is good, healthy competition. And given the current production constraints, the traditional idea of "one more person buying the competition is one less person buying the iPhone" is frankly just not accurate. Apple will continue to sell every iPhone that comes off the line, and the iPhone market will continue to grow by millions a month.

If you are having legitimate antenna problems and decide that now's the time to jump ship to Android, I wish you well. There are lots of cool things happening in mobile, and the iPhone is just one of them. But the media circus is just that, a circus. And at the end of the day, a circus is just an entertaining event that passes through town, takes your money, and moves on to the next town.