Launching The Bat Player on the Roku Store and its aftermath

After a year of development, The Bat Player went live on Roku's Channel Store.

The Bat Player first went public in September of last year.  This means if you knew the right Roku Channel Code you could install it on your device.  I put up a web site to make this easier, and shared it with a couple of specific communities like the Roku forums and Reddit's Roku subreddit.

My goal all along was to get it into their mainstream directory.  But between their approval delays (it can take months for them to even get you into their queue) and several rounds of requested changes, it took some time.  Add a period early this year when I took a break from trying to get it approved, and you'll see why it took so long.

But that's not what this post is about.  It's about what happened after it went live.

But first let's quickly go over how The Bat Player works.

As far as the audio aspects go it's a pretty dumb application.  It doesn't really know much.  Due to Roku's limitations with streaming protocols it doesn't even know when a song changes.  To overcome this I built a service, known as The Bat Server, that can be pinged with a station's URL to find the title of what's playing via a variety of different methods.  This part I actually broke out into my first public Node.js module.
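From the Roku side that whole exchange is just one HTTP call.  A minimal sketch of what the client does (the endpoint and JSON shape here are illustrative, not The Bat Server's actual API):

stationUrl = "http://some.station.example/stream"
request = CreateObject("roUrlTransfer")
request.SetUrl("http://batserver.example.com/nowplaying?url=" + request.Escape(stationUrl))
response = request.GetToString()  ' synchronous GET, for simplicity
info = ParseJson(response)        ' e.g. { "artist": "...", "title": "..." }
if info <> invalid then
    print "Now playing: " + info.artist + " - " + info.title
end if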

Easy enough, but that's not all I wanted The Bat Server for.  I wanted contextual, useful information.  Knowing the artist name allows me to grab some relevant information from Last.FM's API.  That, too, is easy enough.

Song information is more difficult.  I query a few different services (Last.FM, Discogs, Gracenote, Musicbrainz) to get an array of possible albums a track could be from.  Then I try to make an educated guess based on available information.  This whole song and dance is pretty expensive.

On top of this, the client makes separate calls to The Bat Server to do some image manipulation for a few things (dynamic header images, station icon resizing, building backgrounds and artist images) via ImageMagick on the fly.

All this happens every time a new song is seen by The Bat Server.  It's not efficient, it doesn't work at scale, and it was living on one tiny server.  But I only had about 800 installs.  So I figured that eventually, when I got approved for the Channel Store, I'd get a couple hundred more installs, half of those people would try it, things would level off, and I could reassess down the road if needed.  I underestimated the Roku Channel Store.

Once it was available, thousands of new users brought The Bat Server to its knees the first day.  My single t2.small EC2 instance was not up to the task, though in retrospect I did have a few things in place early on that I'm really thankful for now.

I cached everything.  Data gets saved to Memcache, so an artist, album, or song only gets processed once.  Images get pulled through Amazon CloudFront, so those crazy dynamic images eventually make it to the edge servers and never touch my server again.  And lastly, I put Varnish in front of the Node.js service to control how long a previous request can be served as a cached result before we go back to the internet radio station to determine what's playing again.  If all else fails, I can dial up the cache duration.

I also added centralized logging and analytics, not just from the service but also from the clients themselves.  I did this with an Amplitude library I released and a syslog client I built into the client, both written in BrightScript.  I wanted insight into what was going on out in the field.
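The syslog side is only a few lines of BrightScript.  A sketch (the collector host and the message are placeholders):

addr = CreateObject("roSocketAddress")
addr.SetAddress("logs.example.com:514")  ' placeholder collector
udp = CreateObject("roDatagramSocket")
udp.SetSendToAddress(addr)
udp.SendStr("<134>BatPlayer: stream started")  ' <134> = facility local0, severity info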

But no amount of caching or logging was going to save this.  Luckily I was already using EC2 for my existing instance, but now it was time to put myself through a self-run AWS Bootcamp.

First off, I needed to handle the new load.  So I created a new t2.large and an Elastic Load Balancer.  I pointed DNS for the existing server to the load balancer and put the two servers behind it.  Things became much happier quickly, but I knew this was only a stop-gap measure.  I needed some auto-scaling action.

Adding the t2.large bought me some time to figure out which AWS CloudWatch metrics would work best for scaling capacity in my situation.  At first my alarms were a bit too aggressive, causing servers to spin up and down pretty frequently, but I eventually leveled them out and finally felt confident enough to pull the t2.large out of the load balancer and let the system manage itself.

With this new infrastructure I needed some insight into it, so between Runscope for API monitoring (I so love this service), New Relic for server monitoring, Papertrail for the logging above, Rollbar for Node.js application error reporting, and Datadog, I was able to put it all into one nice little interface.  Add Zapier and Pushover with some webhooks and I now get custom alerts right on my Apple Watch.  But man, am I paying for a lot of services now.  I hope that once some time passes without any incident I can get rid of some of them.

So now what?  Well, obviously things aren't perfect yet.  Each server has its own instance of Varnish, which is pretty stupid, but that's a side effect of my reactive scaling.  So maybe I'll move to [Varnish] -> ELB -> [Servers], but that would require an additional, full-time EC2 instance just for caching, plus I've read that putting an ELB behind Varnish isn't terribly awesome.  I never thought I'd miss managing F5 BIG-IPs!

So what did I learn?  The obvious, really.  If you're going to build a single-homed, power-hungry service, be prepared to scale horizontally.  What I did to fix it wasn't novel; it's what these AWS resources were created for.  I also learned that the extra effort I put in up front for analytics and logging made it much easier to know what was going on.  And lastly, using the Right Tools For The Job™, in this case Memcache, Varnish, and the array of services from AWS, let me take something that initially would have handled only a few hundred users and turn it into something I'm confident I can scale out to any size.

Enjoy the tunes! 

Developing for Roku

Ask yourself, "Do I know anybody who's developed anything for the Roku Media Player?"  More than likely the answer is no.  If you ask around you might get some response, but more than likely nobody you know has done it either.

And that's why I wanted to write up a little piece about building for the Roku.

Chapter 1: I want to listen to Streaming Internet Radio.

Marantz audio receiver

For me it started innocently enough.  I wanted to listen to streaming internet radio.  I had been listening using my Marantz home theater receiver.  This was ok, I guess.  The interface looked like this:

Not awesome.

My next thought was the AppleTV.  I use my AppleTV for everything else.  The interface is pretty good, but not great.

AppleTV

But then I learned of a major deal breaker: You can't add your own stations.

So I looked at other options.  Run an iOS app and AirPlay mirror it to the AppleTV?  That works, but it's not elegant by any stretch of the imagination.  Build an HTPC?  Get a Raspberry Pi?  All valid options.  But then I remembered that Roku had a ton of available content.  Surely it must have what I was looking for.  Next thing you know I have a Roku 3 on order.

It turns out I was right.  The Roku had a ton of applications (they call them channels) that focused on music and streaming audio.  Here are some of them:

The SHOUTcast Radio application was the only one I could find that let me add my own stations, so I was a step further than I was before.  But, as you can see, a lot of Roku apps have a very templated feel to them.  I later learned why this is.

I started thinking of other features I'd really like: Last.FM scrobbling, adding to Rdio playlists, updating my room lighting with my Philips Hue system.

Chapter 2: Why Don't I Just Build My Own?

Before I knew it I was browsing around the Roku Developer portal.  Things I learned right away:

  1. They have an open SDK.  Cool!
  2. You write in a language called BrightScript.  WTF is BrightScript!?
  3. Callbacks and async stuff happens through a "message port".
  4. You have to support SD and non-widescreen TVs.
  5. They have an approval process like Apple to get on their channel store.

Sounds fair enough.  Let's go through this.

But here's my disclaimer: I've developed one Roku channel, and there are people who know much more about it than I do.  I did about as well as anybody building something with a technology for the first time would, and I'm only giving my initial thoughts.

BrightScript is fundamentally a bastardized version of BASIC.  It's a language so obscure that a Twitter search for the word will often bring up zero results.  Roku's SDK provides you with a limited set of native objects like arrays, associative arrays, an HTTP client, bitmaps, etc.  You create one of these objects with a simple syntax:

myArray = CreateObject("roArray")

Everything starts with ro, and it's weird to create an object by its string name.  But whatever.

But those objects are it.  You can't subclass, you can't create your own first-class citizens, and you are at the mercy of the SDK.

So without being able to specify custom objects, it seems any object-oriented approaches are out the window.  Oh no, my friend.

This is where your new best friend, the roAssociativeArray, comes into play.  You'll use it for everything.  And you'll use it for this too.

Let's say we wanted to build a simple cache class.  We'd do something like this:
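A minimal sketch of the pattern (the function names here are mine, just for illustration):

function NewCache() as Object
    cache = {}              ' the "class" is just an associative array
    cache.items = {}
    cache.Add = cache_add   ' plain functions attached as "methods"
    cache.Get = cache_get
    return cache
end function

sub cache_add(key as String, value as Dynamic)
    m.items[key] = value    ' m is the AA this function was called through
end sub

function cache_get(key as String) as Dynamic
    return m.items[key]
end function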

So let's go over this.

  • Your "class" is really just an associative array.  ( {} is the AA shortcut, like in other languages).
  • m is a weird, magical variable that points to your associative array, as long as the function was fired via the declaration in your "class".  In this case you obviously can't just call "cache_add" directly; you'd have to call YourCacheInstance.Add in order to get the magical m.
  • But cache_add is still completely public and accessible, and totally confusing to people who may be reading your code.  You shouldn't access it directly, but that doesn't mean you can't.

Ok, so what's this port?  It's roMessagePort.  And it's stupid.

Have you ever written a video game using a game engine?  You have the game loop, where on every cycle of the loop you can do things like move a sprite, check for collisions, make the blue box turn red... that kind of thing.  Where modern application development abstracts that away from you, this is mostly how Roku applications work: with one big event loop.

So if you have a screen you expect user interaction with, you first create a roMessagePort object, assign it to your screen's port, and then start the loop, waiting for something to happen.

Here's a simple example:
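(A sketch using one of the SDK's canned screens; the header text and button are placeholders.)

screen = CreateObject("roParagraphScreen")
port = CreateObject("roMessagePort")
screen.SetMessagePort(port)
screen.AddHeaderText("The Bat Player")
screen.AddButton(1, "Play")
screen.Show()

while true
    msg = wait(0, port)  ' block until something happens
    if type(msg) = "roParagraphScreenEvent" then
        if msg.isScreenClosed() then
            exit while
        else if msg.isButtonPressed() then
            print "Button pressed: "; msg.GetIndex()
        end if
    end if
end while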

And yes, you don't use double equals for comparison.

"But what if you make multiple roUrlTransfer requests and they start coming through a port.  How do you know what each is connected to!?" you're currently saying in disgust.  Well our friends at Roku has come with a way to work around that obvious oversight.  They added a GetIdentity to its interface so you can compare request data coming in to the original request to see who owns it.  Otherwise you'll be saving FancyPhoto.jpg's data to a file called importantdata.json.  So you'll have to keep the original request data around somewhere while you wait for responses to come in.  (If it's an async request you have to keep an active reference somewhere or it'll die in transfer anyway, but that's besides the point).

So with the basics of how you work with data out of the way let's talk about the UI.

Remember seeing a bunch of screenshots above that all look similar?  Say hello to roSpringboardScreen.  Just another piece of the SDK.  You pass it some data and it lays it out to look like that.  It's easy, it's fast, and it gets the job done.  Roku supplies a ton of screens that do this type of thing.  roParagraphScreen, roPosterScreen, etc.  The obvious downside is almost every Roku application looks the same.
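You hand it a content-meta-data associative array and it does the rest.  A minimal sketch, with placeholder data:

port = CreateObject("roMessagePort")
sb = CreateObject("roSpringboardScreen")
sb.SetMessagePort(port)
sb.SetContent({
    ContentType: "audio"
    Title: "Some Radio Station"
    Description: "Streaming internet radio"
    SDPosterUrl: "http://example.com/station-sd.jpg"
    HDPosterUrl: "http://example.com/station-hd.jpg"
})
sb.AddButton(1, "Listen")
sb.Show()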

So I took the approach early on that I wanted to do something different, especially for the "Now Playing" screen.  So I jumped at roImageCanvas.  It does what it sounds like.  You can put images on the screen wherever you want.

roImageCanvas has a lot of niceties.  A built-in HTTP client, so you can point at URLs instead of having to download things and handle local files yourself, and the ability to pass a simple array of things to draw are two of them.  For most things this really should be good enough.  But once I implemented my Now Playing screen with it I found I wanted even more control and flexibility.  This is when things got dark.

Working with Roku's roScreen takes that game-loop comparison to the next level, because it's literally for building games on the Roku.  But there's no middle ground between roImageCanvas and roScreen.  If you want to do any kind of dynamic visuals on your screen, you're giving up doing things Roku's easy way.

So let's talk about how to do some simple things with roScreen.

  1. Want to move a box across the screen?  There's no way to ask an object for its location, so when you draw something to the screen, keep note of its coordinates.  Next time the loop comes around, you increment that number.  Then redraw the object at the new coordinates.
  2. Want to fade something in?  Draw it to the screen completely transparent.  Then, much like the previous example, keep incrementing the alpha value at the rate you want until it reaches the alpha you want.

But you'd better watch out.  Things like drawing text and drawing alpha-blended regions are expensive for the Roku to handle.  Luckily roScreen does double buffering, so you don't see a flicker while it's cranking through all of that heavy work, like making words appear.

For example, you'd want to do something like this on every loop if you want any movement.
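A rough sketch (the box, coordinates, and colors are arbitrary):

screen = CreateObject("roScreen", true)  ' true = double buffered
screen.SetAlphaEnable(true)
port = CreateObject("roMessagePort")
screen.SetMessagePort(port)

x = 0
while true
    screen.Clear(&h000000FF)                     ' wipe the back buffer (RGBA)
    screen.DrawRect(x, 100, 50, 50, &hFF0000FF)  ' the box, at its current position
    screen.SwapBuffers()                         ' flip buffers: no flicker
    x = x + 2                                    ' we track the position ourselves

    msg = port.GetMessage()  ' non-blocking, so the loop keeps spinning
    if type(msg) = "roUniversalControlEvent" then exit while
end while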

But if it's static, and you have no reason to keep redrawing the screen, you don't have to.

This is where that requirement of supporting SD and non-widescreen displays comes into play.  You have to make sure your UI lays out properly within this screen, whereas if you were using any of the SDK's canned screens, like roSpringboardScreen, all this would happen magically for you.  Even roImageCanvas has support for this kind of thing.
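With roScreen you're on your own: you ask the screen for its size and compute positions from it.  A sketch:

screen = CreateObject("roScreen", true)
w = screen.GetWidth()   ' e.g. 1280 on an HD display, 720 on SD
h = screen.GetHeight()
boxSize = 50
' center the box regardless of resolution
screen.DrawRect(Int((w - boxSize) / 2), Int((h - boxSize) / 2), boxSize, boxSize, &hFF0000FF)
screen.SwapBuffers()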

Chapter 3: Testing Your Channel

And how do you test on old screens?  Well, if you have a Roku 3, it doesn't support analog/non-widescreen/SD, so you'd be like me and purchase a second Roku, a Roku 2.  After discovering there are major performance differences between the Roku 2 and Roku 3, you'll probably also want to purchase a Roku Streaming Stick in order to verify your application on all the hardware.

But after all that hard work you'll want others to try it out.  And much like the Apple App Store, having your work publicly available is a source of personal pride.

Unlike Apple, anybody who builds a Roku channel can make it publicly available.  You upload it to the Roku developer portal and they give you a link.  Via that link anybody can install it on their device.  This is great for getting a small group to test all your hard work.

The packaging process is simple.  You zip it up, you upload it to your personal Roku, you type in a password that was generated previously, and you download it from your Roku to upload to their portal.

Chapter 4: Distribute On The Roku Channel Store

But just having it available isn't enough.  You want it on their Roku Channel Store.  Just like the Apple App Store that's where people discover new applications and where you'll get the users.  But it'll have to get approved first.

With Apple, a worst-case scenario is about two weeks unless there's some special reason they aren't approving you.  With Roku, let me put it this way: I uploaded my application on September 11th.  That's right, it's been 88 days since I submitted for approval.

I did hear from them once, about a month in, asking if I could upload a new version that listened on a different HTTP port number.  I did, and then I never heard from them again.  I email them about once a week asking for an update, but never get a response.  This kind of treatment is enough to make me say "Screw Roku!" and is why I'd probably never build a second application on their platform.

All this is on top of the small things you might forget about when building for an obscure platform.  Anything you've built for before probably has a drop-in analytics library, so you can get your Google Analytics or whatever going without any effort.  Here I ended up building a BrightScript analytics package for Segment.IO.  Want other third-party features?  The SDKs those services provide aren't going to work here.  I integrated Rdio, the above Segment analytics, Philips Hue, and Last.FM on top of my own custom APIs.
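Under the hood a package like that is mostly just POSTing JSON at Segment's HTTP API with roUrlTransfer.  A rough sketch (the payload is simplified and the write key is a placeholder):

writeKey = "YOUR_SEGMENT_WRITE_KEY"
deviceId = CreateObject("roDeviceInfo").GetDeviceUniqueId()

req = CreateObject("roUrlTransfer")
req.SetUrl("https://api.segment.io/v1/track")
req.SetCertificatesFile("common:/certs/ca-bundle.crt")
req.InitClientCertificates()
req.AddHeader("Content-Type", "application/json")
req.SetUserAndPassword(writeKey, "")  ' Segment takes the write key as the basic-auth user
responseCode = req.PostFromString(FormatJson({ userId: deviceId, event: "Station Played" }))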

So in closing, I'm certainly not saying you shouldn't develop for the Roku if you're thinking about it.  On the contrary, it's a really fun challenge to build using an environment that's probably completely different from what you're used to.  And if it's a pure numbers game, there are more Rokus attached to TVs than Nexus Players and Fire TV boxes combined, by an order of magnitude.  It's not even close.  But keep your expectations in check, keep your feature set small, and expect to get creative with the solutions you come up with.

Or just wait until you can write for the AppleTV.

 

Building TastemakerX for iOS

It's still in beta, and I'm still happily iterating on all of the features of the application, but I wanted to share the tools I used, the services I take advantage of, and overall tidbits about how I've been building TastemakerX that might be useful to fellow developers.  Maybe you can in turn share some things with me.

Tools that made my life easier.

Spark Inspector

Both it and Reveal popped up around the same time, but I've been using Spark Inspector as my tool to debug view-related things.  We've all done it: made a view's background bright green or whatever so it'll stand out.  Now I skip that, as Spark Inspector lets you visually troubleshoot, move views' frames around, change colors... basic stuff, but very helpful.

We've all had the "that view is supposed to be right there!" moment.  So you jump into 3D mode, see where it actually is, and go from there.

 

CocoaPods

I was pretty hesitant to jump into CocoaPods land.  If you're not yet familiar, it's a manager for dependencies, somewhat akin to a "package manager" for Objective-C libraries, like gem, npm, or pear in other languages.  I saw it as exchanging Objective-C dependency hell for Ruby dependency hell.  And frankly, I prefer the Objective-C version.  But I experimented with it on some test projects and saw that I had no issues, so I made the decision to use it with TastemakerX.  I'm glad I did.  I do a "pod update" after each release, and I know I'm always up to date.  It also makes experimenting with third-party libraries simple.  Add one to your Podfile, "pod install", play with it.  If you don't like it, remove it from the Podfile and move on.

Cocoa JSON Editor

If you're building the client but not the API, there's often a stage of experimentation.  This is where my experiments took place.  I was able to quickly make requests against the APIs and figure out exactly what it was I was trying to get back.  Not just for our internal API, but for third-party services as well.

It also really helps when you need a mocked-up API endpoint.  It lets you create data models, and the built-in web server will map them to a path.  Point your client at it and off you go.  Within seconds you've gone from a non-existent API to an endpoint with mocked-up data.


Colorbot

I kind of wish this were a built-in developer tool in Xcode.  I'm sure you're like me and have your UIColor+CustomColors category, but it's really nice to be able to visually organize the color scheme you have for your app.  You use the colors everywhere, so they should be available, definable, and exportable.

So your UIColor category works great in your app, but where do you go when you need that color in your image editor?  Or a CSS file?  Colorbot makes it easy to export a color as an NSColor, a UIColor, hex, and more.

 

Third party tools for working with Core Data

MagicalRecord

Both this and mogenerator were suggested to me by a friend.  While I was hesitant to throw a layer on top of Core Data, I experimented a bit and saw the advantages.  The big win with MagicalRecord is simplifying Core Data across multiple threads.  You really shouldn't be doing heavy Core Data work on the main thread, so anything that makes your life easier here is nice.  It likes using blocks, and so do I, so we became fast friends.  It'll give you the managed object context for the current thread you're in, and a block to do work in.  The downside?  Documentation.  It's kind of all over the place.  Definitely read through the header files.

mogenerator

A nice little utility that takes your Core Data model and creates two class files from it: one with all the stuff a managed object should have, and one subclass where you get to put all your custom code.  It really does make things more manageable.

 

External Services

Runscope

Run, don't walk, to get yourself an account with Runscope.  At its core, it works as a proxy between you and your API.  Don't let that scare you.  Because it's the middleman, it's able to capture the activity of your application in detail.  Each request is shareable, so if you see something weird in testing you can simply log into Runscope, grab the URL of that request, and share it with whoever might be interested, even if they're not a Runscope user.

Other really useful things include bringing attention to all requests that resulted in an error or took too long to execute.  Now with Runscope Radar you can turn a request into an API test template to make sure that thing that was broken stays fixed.  Oh, and it pretty-prints the JSON of each request.  Could you do all of this with in-house unit tests, a copy of Charles, and your clipboard?  Sure, but not this well.  It just works.  I wouldn't be overestimating if I said it has saved me days of work.  And the same can probably be said for the teammates I send Runscope URLs to, saying "Could you look at this?"


Crashlytics

If you're like me you've tried a few crash reporting services.  Also if you're like me you've kind of been annoyed by all of them in some way or another.  Crashlytics wasn't really on my radar until I went to the Twitter mobile dev event a while back.  They gave a demo of Crashlytics and it caught my attention.  It works just like any other crash reporting service, but the difference is it runs a service in the background on your build machine.  This may bother some people, but it's nice that it takes care of grabbing the dSYM and keeping track of archived builds without me taking extra steps.  Because of this I haven't had to spend a single second manually symbolicating crash logs because Crashlytics couldn't handle them or because I forgot to upload the dSYM.

Another nice benefit: it allows you to set arbitrary keys as your user progresses through the application, so I know things like the last artist ID a user viewed, or the chart they were checking out.  Oh, and it's free.

Product Development

When I started at TastemakerX, one of the first things we decided was that we'd be making a rather large change from the idea of the v1 product they had already built.  So we spent some time really nailing down what we wanted this product to be.  During this time I still wanted to be working on some useful code, so I built some standalone sandbox apps to work out things like networking, image handling, Facebook Open Graph, playlist generation, etc., as reusable classes I could seamlessly move over to a production app once we knew what the app would entail.

I know most projects don't have the luxury to start working on the application before they start working on the application, but it gave me a nice head start.  Not to mention it enforces solid design patterns when you're writing code not knowing how you'll be using it later.  There's no UI to contaminate your thinking.

 

Some things I've learned

I'd hope that on every project I work on I learn something new.  This was no exception.  Maybe some of these will jog something you might want to look at.

  • Storyboards are fine.  Use them, or don't.  I don't care.  This was my first from-scratch storyboard application that wasn't a side project, and I made the mistake early on of trying to *only* use the storyboard.  Know when it doesn't make sense, and go create your nib files.
  • Don't mix up objectWithID, objectRegisteredForID and existingObjectWithID.  Just saying.
  • You're probably not using NSSet/NSOrderedSet as much as you could be.
  • You're also probably not using NSCache as much as you could be.
  • Take advantage of iOS 7's performFetchWithCompletionHandler.  You have a background window of time to do some work.  So instead of just fetching new data, maybe also do some of the post-processing you would normally have to do when the UI is up.  Maybe pre-calculate the height of UITableView cells for this data and cache that for later, or if you downloaded some images that need to be resized take care of some of that.  But there's only so much you can do in this window, so be smart about it.
  • Speaking of UITableView, check out iOS 7's estimatedHeightForRowAtIndexPath if it makes sense for you.  It'll postpone the height calculation for a row until it's rendered, instead of doing it all up front.  This makes the most sense on older devices, but on the flip side you might see some frames drop the first time heightForRowAtIndexPath is called to calculate the real height as the row is being built.
  • Aside from the audio playback manager, almost every instance of KVO I originally implemented eventually got ripped out.  What seems like an elegant solution using KVO is probably just making things overly complex.  Think about whether it's really the solution to the problem.
  • I have all network calls go through a singleton and fire off completion blocks.  That's worked really well, and in retrospect I wish I'd done the same thing with Core Data.  On the same topic, if I were to go back (and I might!) I would never pass managed objects around.  Use Core Data for storage, and build standalone, separate objects out of the managed objects.  Each object can keep a reference to its Core Data counterpart for CRUD operations.

So that's my little rundown on things I found useful or interesting so far with TastemakerX.  I'd love to hear things that you're using to make your life easier or things you wish you would have known when you started as well.  Have fun, mobile friends!

Iterate!

Boulder Startup Week 2011

So, on the recommendation of my friend @sethhwilson, in the hour before the deadline I half-assed an email to @ryanwanger of Boulder Startup Week for their "We'll fly you to Boulder free" offer.  They were going to hand-pick a handful of people and bring them in for the event.  I honestly kind of forgot about it; I really only did it as one of those "take advantage of opportunity because you never know" things that I'm trying to be proactive about.  Logic says nothing will come of it with so many other people involved, but lo and behold, I got a call a few days later asking if I'd like a free flight to Boulder, Colorado for Boulder Startup Week.  I had already planned a trip to San Francisco in a few weeks, and work was piling up, so logically I should have said that too many things were coming up for me to go.  But much like the reason I sent the email in the first place, I went ahead and took advantage of the opportunity and said "Really?  Me?  Well, ok.  I'm in."  The trip was booked and they found someone for me to stay with while I was there.

I didn't know what to expect, as I didn't know anyone who lived there, and I had no idea what was going on there.  I knew it as the home of TechStars, and that's really about it.  So with no expectations I figured I had very little chance of disappointment.  Little did I know there was no way for me to be disappointed by Boulder and its people.

The events for the week were loosely organized.  There was a central schedule, but anyone could organize an event and have it added.  Some were very tech-focused, like "The Mobile Web and Why it Sucks".  Others had nothing to do with tech, like the "Pizzeria Tour and Tasting".  And then there was the immensely fun, 1,300-person packed-house edition of "Ignite Boulder".  I'd heard of Ignite events in other cities, but this was my first chance to attend one.  It was so much fun.  The whole city was excited for it, and I came out understanding what the fuss was about.

But to be honest the events had little to do with my Boulder adventure.  It was the people and the culture that made it special.

Everyone was happy to meet me.  I think there was a certain novelty when they found out I was one of the few chosen for the "free flight" deal.  But the people were extremely welcoming to me from the second I set foot on Boulder soil.

The first thing I noticed: when I met new people, they all asked the same thing: "Are you moving here?"  I found that very odd.  Why would they expect that because I'm visiting, I must be moving there?  I'd simply respond with a "No... I'm just visiting.  I have no intention of moving anywhere."

People were genuinely excited when I told them about Hollrback and the other work that I do.  They wanted to know more.  They wanted to be involved.  They made me feel special.

Speaking of Hollrback: Boulder likes Hollrback.  They've actually heard of it before.  Can you imagine the smile on my face when I walked into the TechStars bunker and someone from the TechStars class said "Oh yeah, I know you guys"?  Something about Boulder made me feel like a founder of a "real" startup, not someone pretending to be someone he once read about in TechCrunch.

At this point I'm not going to go into detail about everyone I met, or everything I did.  That would make this post kind of lame.  However, some things:

  • I was in the Denver airport when Britney Riley texted me with "Brad McCarty just told the pitch session attendees that Hollrback is one of his new favorite products."  Holy shit.  He told that group... a group of people who are doing awesome things... that he respects Hollrback?  Wow.  He also mentioned Hollrback in a couple of tweets over the startup week.  Seriously, the US editor of The Next Web telling people he influences that Hollrback is rad.  Mind. Blown.  Also, this tweet.
  • Andrew Hyde seeing my tweets online and replying to me because he knew I was in Boulder.  I also got to have lunch with him and hang out and chat quite a few times.
  • People like Chris Vieville, Marissa Berlin, Cali Harris, Ryan Angilly and so many others would see me at events and make sure to say hi to me.
  • Having Dave Taylor sit next to me at Atlas Purveyors and then realizing afterward, "Hey, that was the askdavetaylor.com guy!"
  • I got quoted in Huffington Post after I spoke to a journalist at the Startup Week opening party.

Anyway, that's enough of that.  I could spend all day talking about the culture, the environment and opportunities for startups such as mine and the atmosphere that the people create.  I could compare and contrast to Omaha, but instead I'll thank those who made the week possible:  Elaine Ellis, Andrew Hyde, Ef Rodriguez, and Ryan Wanger who coordinated us out-of-towners.

Oh, and after a while when asked "Are you moving here?" I started to say "Maybe".  And I meant it.

Here's a video I threw together of some things I captured while at the event.  I put no time into it, and it's not very exciting.  I didn't grab as much content digitally as I should have.  It features some clips and photos from Ignite Boulder, Boulder Open Coffee Club, and some of the other events.  Here you go!

Video: http://player.vimeo.com/video/24066710

Lost and Found: A video experiment started in 2006

In 2006 I was diligently working on my (still unreleased) new album "The Longest Post".  With that, I decided to create a couple of music videos, both as an experiment to see if I could make something fun and as a visual way to distribute some songs.  A couple of apartment moves later, the hard drives with the music videos were lost.

Until now.  In October 2010 I found the hard drives while cleaning out my music studio and decided it was time to show them to the world.

They weren't completely finished, however.  But a couple of nights of polish and fixes meant the first one, "Got Skillz?", could be released.  And here it is.

Some things of note: you can tell it's from 2007.  Old-school web pages and apps.  You see an old version of YouTube, complete with a Greasemonkey script changing its background color, running within an old version of Firefox.  I was running a lot more extensions then than I remembered.

Also, given the lack of video footage I had to work with, I experimented with using the game/app/world of "Second Life" for some of the character stuff to emulate the "party in the computer" vibe.

The video "Got Skillz?" tells the story of the party inside the computer. The song says "Get off the floor, and get on the net. The dance you see there is unbelievable I bet." and the video shows it to be true.

It also mocks the world of rap music by being a rap song itself, while showing how the character Real-ity doesn't belong in the ghetto with the homies and would probably just find the nearest computer and get online.

See below for a "behind the scenes" video from when I was working on this originally (February 2007).

Check out facebook.com/pages/Real-ity/192311904989 for updates and new tunes. And as always gabekangas.com to see what Gabe is up to.

The video

Real-ity - Got Skillz?

"Behind the scenes, February 2007"

#TheCabin Mobile for iPhone

I've always said that if I took all the time and effort I put into side projects and spent it on something else, I'd have a lot of other cool things.  But I've spent some nights and weekends working on this new little app, my first submission to the iTunes App Store.  (Approved in 8 days, FYI.)  As some of you may know, for the past 15 years I've been helping manage an internet relay chat (IRC) channel I started called #TheCabin.  Recently I finished up an app called #TheCabin Mobile for iOS.

The project, as most of my projects are, was to create something to do the things I couldn't do easily before.

The first feature I really wanted: easy sharing of photos directly to the channel.  Previously, in order to share a photo, I'd do the following:

  • Open up the camera app
  • Take a photo
  • Open up Dropbox or CloudApp
  • Upload the photo to a public place and copy the URL of the image
  • Open the IRC client
  • Paste the URL into the channel

Now, within the app there are two options: take a new photo, or use an old one.  You select one, type something about it, and off it goes for all your (my) friends to enjoy.  I already find myself using and loving it.

Of course I had to integrate a quick way to jump directly into the channel.  Most of us hardcore cabiners (those who hang out there) already have our multitude of mobile IRC options, but I wanted something that lets a first-timer, just downloading the app from the App Store, jump into the channel as easily as possible.  To do this I use a few different web-based IRC clients.  None of them are perfect, unfortunately, so the one that's best for you at the time can be selected from the options in the app.

The main screen lists who's currently in the channel, and you can view tweets and Facebook updates from cabiners who've been posting.  You're also one tap away from reading #TheCabin Wiki (the r33tipedia).

So is this app for everyone?  Or anyone but me?  Probably not.  But it's up on the iTunes App Store if you want to come hang out with my friends and me.  I don't recommend it.