AirPlay-enabled car

I solved a #firstworldproblem today – wireless audio in my car.

TL;DR – use an AirPort Express & replace its power supply with a 5V -> 3.3V step-down module so it can be powered off the USB/cigarette-lighter jack in the car.

Basically I was fed up with having so many cables floating around and getting tangled, & I wasn’t happy with the compressed audio quality over Bluetooth, so I decided to come up with a wireless audio solution.

I did a bit of reading, and most people recommended an AirPort Express (APE) with an inverter running off the cigarette-lighter jack. The idea is that you connect to the APE over wireless, then stream music from your phone to the APE via AirPlay. The APE’s audio line-out is then connected to your car’s audio line-in with a headphone cable. I felt this solution was a bit wasteful, not to mention the fact that it would be converting 12V DC up to 110V AC, then back down again going into the APE. After waiting a few months, I did a bit more research and came across this post:

http://forums.nasioc.com/forums/showthread.php?p=39379974

In summary, the user enjoiful describes a way of opening up the APE, pulling out the PSU and replacing it with an alternative power source – USB. The great thing about this is that USB provides 5V DC, while the APE requires 3.3V DC. The end result is that you just need to step the 5V down to 3.3V so that you don’t overload the APE.

This was the perfect solution, as my car already had a USB port as well as an audio line-in.


Google Chrome: array evals return out of order

I just want to make others aware of a little quirk in Google Chrome’s JavaScript engine (V8): eval’d associative JSON arrays (aka objects) are returned in an unexpected order. The engine doesn’t do what a programmer might expect, so it’s worth knowing that this happens and why.

Basically, take the following pseudo associative array/object and its corresponding JSON version:

Pseudo associative array
Array (
 3503 => '',
 3847 => '',
 6852 => ''
);

JSON Array
var data = {3503:'',3847:'',6852:''};

Pretty basic, huh? But what happens when we loop over this array/object? In Firefox, Safari and IE we get the same result: the array elements in the order listed above. Chrome, on the other hand, returns the items out of order. Now I know you’re probably thinking, “it’s an array/object, order doesn’t matter”. That’s technically true, but if you are relying on the order for some reason, you might find bugs cropping up. Check out the code below:

var data = {3503:'',3847:'',6852:''};
var s = '';
for(var i in data) {
	s += i + ',';
}
alert("Expected order: 3503,3847,6852\nOrder observed: " + s);

Firefox, Safari and IE all return the following alert:

Expected order: 3503,3847,6852
Order observed: 3503,3847,6852

Chrome on the other hand returns this:

Expected order: 3503,3847,6852
Order observed: 6852,3503,3847


Weird! Give it a try in your current browser.
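If your code does depend on a particular key order, the safest workaround is to carry the order in a plain array and iterate over that, rather than relying on for..in enumeration. A minimal sketch:

```javascript
// The same data as above, plus a plain array recording the order we want.
var data = {3503: '', 3847: '', 6852: ''};
var keys = [3503, 3847, 6852];

// Iterating over the array (not the object) guarantees the order in
// every browser, regardless of how the engine enumerates object keys.
var s = '';
for (var i = 0; i < keys.length; i++) {
	s += keys[i] + ',';
}
// s is "3503,3847,6852," in every browser
```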

JavaScript guru John Resig has posted a note about this:
http://ejohn.org/blog/javascript-in-chrome/

Or for the official bug reports:
http://code.google.com/p/v8/issues/detail?id=6
http://code.google.com/p/chromium/issues/detail?id=883

As always with JavaScript programming, expect the unexpected… and…
Be Warned!

Trac ticket: reset to new

We have recently moved from Mantis to Trac at work for our bug/task tracking, but we have encountered a slight issue with Trac’s workflow management: we wanted to be able to move a ticket from the ‘assigned’ state back to the ‘new’ state. This is a common scenario when a team member leaves or goes on holiday while they still have tasks assigned to them. Quite often you wouldn’t want to blindly ‘reassign’ these issues to another member, but rather reset them to the ‘new’ status so another member can pick the task up when they have the time.

We decided to introduce a new action called ‘reset’ in the trac.ini [ticket-workflow] block, which allows us to easily reset a ticket to the ‘new’ status as if it had just been created in the system. The new block looks like this:

reset = * -> new
reset.operations = del_resolution
reset.permissions = TICKET_MODIFY

Now at the bottom of each ticket, we have an option which says:

[ ] reset  Next status will be 'new'

Hope this works for you too :)

jQuery fadeIn/fadeOut IE ClearType glitch

While using the jQuery JavaScript library at work today, I noticed a glitch under IE7. When fading an HTML node with jQuery’s .fadeIn() and .fadeOut() functions, IE drops Windows ClearType rendering, which results in very ugly text. This problem appears to be very common, but no one has a nice solution for it.

The most common way to solve this problem is by removing the filter CSS attribute. In normal javascript, it would look like this:

document.getElementById('node').style.removeAttribute('filter');

and in jQuery, it would look like this:

$('#node').fadeOut('slow', function() {
   this.style.removeAttribute('filter');
});

This means that every single time we want to fade an element, we need to remove the filter attribute, which makes our code look messy.

A simple, more elegant solution would be to wrap the .fadeIn() and .fadeOut() functions with a custom function via the plugin interface of jQuery. The code would be exactly the same, but instead of directly calling the fade functions, we call the wrapper. Like so:

$('#node').customFadeOut('slow', function() {
   //no more fiddling with attributes here
});

So, how do you get this working? Just include the following code after you include the jQuery library for the added functionality.

(function($) {
	// Wrapper around fadeIn() that removes the 'filter' style jQuery
	// leaves behind in IE, restoring ClearType once the fade completes.
	$.fn.customFadeIn = function(speed, callback) {
		return this.fadeIn(speed, function() {
			// Inside the fade callback, 'this' is the DOM element.
			if (jQuery.browser.msie)
				this.style.removeAttribute('filter');
			if (typeof callback === 'function')
				callback.call(this);
		});
	};
	// Same idea for fadeOut().
	$.fn.customFadeOut = function(speed, callback) {
		return this.fadeOut(speed, function() {
			if (jQuery.browser.msie)
				this.style.removeAttribute('filter');
			if (typeof callback === 'function')
				callback.call(this);
		});
	};
})(jQuery);

I have been informed by Steve Reynolds that the US White House website is using some of the JS documented in this blog post. I would just like to say thanks to everyone who contributed in the comments. :)

Rails gem install mysql throws error: *** extconf.rb failed ***

Whilst reading guides on how to set up a Rails dev server under OS X (both Tiger and Leopard), I kept running into issues installing the mysql gem. The guides instruct you to install the MySQL client libraries from the MySQL website, which is fine. But to get improved performance under Rails you must also install the mysql gem, and that is where things kept going wrong. The issue normally arises after running the following command:

sudo gem install mysql -- --with-mysql-dir=/usr/local/mysql

And the resulting error is:

checking for mysql_query() in -lmysqlclient... no
*** extconf.rb failed ***
Could not create Makefile due to some reason, probably lack of
necessary libraries and/or headers.  Check the mkmf.log file for more
details.  You may need configuration options.

Basically, the gem compile fails because it can’t find the correct MySQL files, despite the command-line switch pointing at the MySQL folder.

To fix this problem, use a line like the following:

sudo gem install mysql -- --with-mysql-config=/usr/local/mysql/bin/mysql_config

Note that instead of pointing at the MySQL folder, we point at the mysql_config binary that is bundled with the MySQL client install.

APML2JSON Script/Service

This script takes an APML feed and parses it into valid APML-JSON based on the APML-JSON spec on the APML wiki. Instead of manually parsing the APML into JSON, I have used the XSLT file attached to the aforementioned spec page, along with xsltproc, to generate the JSON data. The idea behind this script is based on John Resig’s RSS2JSON script. At the moment the script is too hacky for release, so I have provided a REST interface that can be accessed via a GET request.

A request to the interface would take the following form:
http://bmn.name/examples/apml2json/?url=URL&callback=CALLBACK

The callback parameter is optional. If specified, the resulting JSON is wrapped in the callback for easy parsing at the client end, otherwise the resulting JS Object is assigned to a variable which can be accessed via JS. The results from the call are cached hourly to reduce the load on the server. :)

Example Interface call: http://bmn.name/examples/apml2json/?url=http://blog.bmn.name/index.php?apml=apml&callback=parseFeed
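To actually consume the interface from a page, you can inject a script tag whose src is the interface URL, so the callback fires with the parsed data once it loads. A sketch, assuming a callback named parseFeed (the helper function names here are just illustrative, and URL-encoding the APML URL is a safety tweak over the raw example above):

```javascript
// Hypothetical helper: builds the interface URL described above.
// Encoding the APML URL keeps its own query string from being
// mistaken for parameters of the interface itself.
function apml2jsonUrl(apmlUrl, callbackName) {
	var url = 'http://bmn.name/examples/apml2json/?url=' + encodeURIComponent(apmlUrl);
	if (callbackName) {
		url += '&callback=' + encodeURIComponent(callbackName);
	}
	return url;
}

// In the browser, load the URL as a <script> tag; the service wraps the
// JSON in the callback, so parseFeed() runs with the APML-JSON object.
function loadApml(apmlUrl, callbackName) {
	var script = document.createElement('script');
	script.src = apml2jsonUrl(apmlUrl, callbackName);
	document.getElementsByTagName('head')[0].appendChild(script);
}

function parseFeed(apmlJson) {
	// Do something with the APML-JSON object here.
}
```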

For more information on how to use the resulting JS, head over to John’s RSS2JSON page as he provides some sample JS.

Blu-ray PS3 stuttering playback woes… Solved!

This blog post is for anyone who owns a PS3 and has discovered that *some* Blu-ray discs have stuttering/dropped frames playback.

I purchased a PlayStation 3 back in April 2007 and received a free copy of Casino Royale on Blu-ray as part of a PS3 promotion. Needless to say, I opened the PS3 and stuck the Blu-ray disc into the drive to see what all the Blu-ray fuss was about. The movie looked fantastic, even on my standard-def TV. It looked even better on my brother’s HDTV. Now you may be wondering why I’m saying this. Well, as I later found out, this initial experience was not indicative of the PS3’s Blu-ray performance on every disc.

Fast-forward ahead 8 months.

I decided to go halves in a brand new Samsung Full HD TV (1080p) for Christmas. The picture quality of this baby is phenomenal on both HD FTA and the PS3. As a present to myself, I went to JB Hi-Fi and purchased three Blu-ray movies to try on the brand new TV. I bought Kingdom of Heaven, Gone in 60 Seconds and Behind Enemy Lines, all of which are fantastic movies. I tried out Behind Enemy Lines, and it looked spectacular. I then tried Kingdom of Heaven and had a similar reaction. Then, as you would have guessed, I tried Gone in 60 Seconds…

Arrrgghhhh! What the hell did they do to make this movie so painful to watch? The movie started out fine, but after about 40 seconds the video started to stutter. It appeared as though video frames were being dropped, as the audio wasn’t affected.

I continued to watch once the stuttering had stopped to see if it was a one-off, but I was disappointed to find that it kept happening every minute or so, for about 10 seconds at a time. I became very frustrated, as Blu-ray discs aren’t cheap. So I went back to JB Hi-Fi and swapped the disc for another copy of Gone in 60 Seconds so I could work out whether it was a disc problem or a PS3 problem.

I slid the disc into the console and held my breath…

Right on cue, it started to stutter. Argh! At this point I was very angry. I decided to have a hunt around in the BD/DVD settings section of the PS3. One option I found seemed to fit the bill:

“BD 1080p 24Hz Output (HDMI) – Sets the playback method for BDs recorded at 24Hz (frames/second)”

By default this option is set to ‘Automatic’, so I decided to fiddle with it and set it to ‘Off’. I started the Blu-ray disc up again and held my breath…

40 seconds passed…. 50 seconds passed… 5 minutes passed…

Nothing! Not a single stutter or anything! The problem was fixed. There is nothing on the Blu-ray disc’s cover to suggest that the movie is encoded at 24fps, so I guess it’s trial and error for each disc. But at least you now know how to fix the problem!

I’d like to hear from other people who have had the same problem.

EDIT: I have found a few resources on the net from other people experiencing the same problem, so if you wish to know the exact reason for this stuttering, read this thread.