angularjs adding controllers at runtime

Angular is fantastic. I’ve been using it for a couple of projects lately and have been learning a lot.  I had to solve a problem today that took quite a while to figure out, so I thought I’d post the solution here.

Normally you need to define all of your controllers, services, etc. when the Angular app is bootstrapped – i.e. you need to pre-load all of the scripts, regardless of what the current view is.

It’s easy enough to load additional scripts at runtime, but a .controller(...) call made after the app has loaded will not take effect – so your controller code won’t run. (Apparently this is not a bug – it’s by design.)
I’m working on a kind of plugin module system, where I wanted to inject the controller into the system at run-time. The idea is that the controller code should load at the same time as the view code – in fact I want to define both the view and the controller in the same file:
<div ng-controller="mycontroller">
  <p>this is the view</p>
  <p>some value from controller: {{somevalue}}</p>
</div>
<script>
  // the 'app' has been defined globally (see below)
  app.controller('mycontroller', function ($scope) {
    console.log('mycontroller loaded');
    $scope.somevalue = 'some value';
  });
</script>
where that file is included into the current page with an ng-include:
<div ng-include="'myfile.html'"></div>
(actually in the future I’ll be loading the content from the database).
Out of the box, that doesn’t work in Angular. But by overriding the .controller method during the .config phase, it’s possible to have calls made to .controller work correctly _after_ Angular has completely loaded.
// ensure the 'app' is available globally
var app = angular.module('myapp', [ /* ... modules ... */ ]);

app.config(function ( /* ... other dependencies ... */ $controllerProvider) {
  app._controller = app.controller;
  app.controller = function (name, constructor) {
    $controllerProvider.register(name, constructor);
    return this;
  };
});
The above code is not a fully working sample – just a simplified JavaScript translation of my original CoffeeScript code – but it’s got the relevant bits correct. :)
The same approach works for services and directives, although you’d use the $provide service instead of $controllerProvider.
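Stripped of the framework, the trick is just swapping a post-bootstrap no-op for a call into the provider’s registry. Here’s a runnable miniature of the pattern (no Angular involved – `registry` stands in for $controllerProvider, and all the names are illustrative):

```javascript
// A miniature of the deferred-registration pattern, Angular stripped away.
// 'registry' plays the role of $controllerProvider's internal table.
const registry = {};

const app = {
  // after bootstrap, the stock .controller() call no longer registers anything
  controller(name, ctor) { return this; }
};

// inside .config(), swap in a version that registers with the provider directly
app._controller = app.controller;
app.controller = function (name, ctor) {
  registry[name] = ctor;            // i.e. $controllerProvider.register(name, ctor)
  return this;
};

// a script loaded later can now register a controller at runtime
app.controller('mycontroller', function ($scope) {
  $scope.somevalue = 'some value';
});

// simulate the injector instantiating the controller against a scope
const scope = {};
registry['mycontroller'](scope);
console.log(scope.somevalue);       // → 'some value'
```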


Email is hard.

At least SENDING emails from websites is.

Don’t feel bad if you’ve had trouble ‘getting it right’. You’re not alone. Many companies have made fantastically successful businesses out of making it easier and/or more reliable (mailchimp, etc). However, you may not have the volume of emails that would justify paying for a 3rd party service to deliver your emails. Perhaps you’re just sending signup confirmations or receipts for your small website.

You can easily send emails in almost any web programming language in a single line of code – what’s so hard about it? The hard part is actually getting those emails delivered to your recipients! :)

I’ve had to address this issue with some of my clients for one of my products, which allows the customer to configure their email settings – i.e. the FROM email address and the optional use of an SMTP server to send the emails. I wound up writing a primer on email services to help them understand their options and why they mattered. That primer is now part of that product’s documentation, but I thought it might be useful to others who might stumble across it here.

Email Settings

What email to use as the FROM address is entirely a marketing decision, but it has technical implications (spam blocking).

The confusing part is that an email FROM [email protected] does not NEED to be sent from an official email server. ANY computer can send an email from [email protected] directly TO anyone@someotherdomain – the message does not need to pass through the email server of the ‘from’ address at all (unless the person sending it chooses to). The core function of email servers is to RECEIVE emails; sending emails through official email servers is optional. This is the root of the email spam problem: email is not secure in any way.

However, most email SERVICES (such as Hotmail or Gmail) will block or flag emails coming TO their customers if the IP address of the machine sending the email doesn’t match the official email server of the sender’s domain (this is the case if the web server itself is used to send the email without using SMTP). Whether Hotmail or Gmail actually flags or blocks the email then depends on whether the computer that sent it is listed on one of several ‘blacklists’ of computers known to send spam. If the sending computer is on a blacklist, the recipient won’t even see the email. If it is not on a blacklist, they may get the email, but it will most likely be flagged as possible spam. Many shared hosting web servers are included in those blacklists.

Using an SMTP server to send an email with matching FROM address avoids this blocking and flagging by being more trustworthy to the large email service providers.
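For the curious: the “does the sending IP match the domain” check is commonly implemented through DNS records such as SPF. A hypothetical record for an example domain might look like this (the domain and IP here are placeholders, not real infrastructure):

```
; SPF record published in DNS by the owner of example.com
; receivers use it to decide whether a sending IP is authorized
; to use FROM addresses @example.com
example.com.  IN  TXT  "v=spf1 mx ip4:203.0.113.10 -all"
```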

Another flavor of the problem: even if you do use an SMTP server, you may still be able to send emails using a different FROM address than the SMTP server’s domain name. Some SMTP servers will not allow this, but some will – there is nothing in the email specifications that says the FROM address needs to match the account of the user on the email server. So if you use a different FROM address than the SMTP account used to send the email, two problems may arise: 1) the SMTP server may refuse to send the email; and if it does send it, 2) email services such as Hotmail and Gmail may still flag or block the email as spam.

An additional complication is that some web hosting companies will not allow websites to SEND emails (via smtp or directly), unless they pass through their own email proxy servers. They do this to limit the amount of spam emails being sent by hacked websites (this is a good thing, although it complicates life for the rest of us). GoDaddy is one such service. You need to configure your email settings to match the GoDaddy documentation.

blocked 3rd party session cookies in iframes

If you use iFrames on your websites, you may have encountered the infamous ‘blocked 3rd party cookies’ issue that occurs in Safari – particularly on iOS 7. Safari has defaults that are arguably more secure than most other browsers, but this winds up breaking some websites hosted in iFrames. The sessions those websites rely on do not work (users cannot log in, etc.), because the session cookie is not ‘trusted’ by the browser when the website inside the iframe is hosted on a different domain (or subdomain) than the parent website. In some cases, simply changing the protocol (http vs https) can cause the same issue. There is no shortage of discussions online of people trying to address this problem, and most of the solutions I found were fairly complicated, requiring you to change the architecture of your site a fair bit.

However, in a lot of cases, the solution can be pretty easy: temporarily send the ‘parent’ frame or browser window to the second domain to set a session cookie for that domain, then redirect the user back to the page on the first domain that hosts the iFrame. Once the browser has accepted the cookie from the second domain, that domain is no longer considered ‘3rd party’ by the browser.

This can be done very easily and transparently to the user, with the use of a single file on the second domain which sets that session cookie and redirects the user back. Here’s a php example. This php file would be hosted on the same domain as the content of the iframe:

<?php
// startsession.php
session_start();                          // this was missing: it actually sets the session cookie
$_SESSION['ensure_session'] = true;
header('Location: ' . $_GET['return']);
exit;

Note that this file uses a ‘get’ parameter to decide where to redirect the user to. This is just for convenience – this could have been hard-coded, and you may need to handle url encoding of the parameter or deal with other security concerns. Those concerns are not related directly to this solution.
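On the encoding point: the return target should be URL-encoded before it goes into the query string. A minimal JavaScript sketch (buildBounceUrl is a hypothetical helper name, not something from the post):

```javascript
// Build the link target that bounces the parent window through the second
// domain's startsession.php and back. Encoding the return URL keeps any
// query string it contains from being swallowed by the outer URL.
function buildBounceUrl(startSessionUrl, returnUrl) {
  return startSessionUrl + '?return=' + encodeURIComponent(returnUrl);
}

buildBounceUrl('https://domain2/startsession.php',
               'http://domain1/pageWithiFrame.html');
```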

On the page hosted on the first domain (the same domain as the one hosting the iFrame), create a link that sends the user through startsession.php on the second domain and back, like so:

<a href="https://domain2/startsession.php?return=http://domain1/pageWithiFrame.html">page with iFrame</a>

On the first domain, the page with the iframe:

<p>Page hosted on domain1, with iframe content from domain2.</p>
<iframe src="https://domain2/index.php"></iframe>

At this point, the website hosted on domain2 will be able to set/use session cookies, because the user has explicitly authorized this on the parent frame by clicking on the link.

I’ve tested this approach successfully on iOS 7. It works whether the parent domain is http or https.

This post was thrown together pretty quickly – let me know if you have any questions or have feedback on this solution.



html trick for wrapping long urls

These days, I spend a lot of my time working on mobile development.

In mobile development, screen space and layout are huge concerns. One challenge I’ve seen is how to display a long URL on a mobile device. In most cases you can just create a link and use some text-overflow techniques (text-overflow: ellipsis). However, if you really want to show the entire text of the URL, but have the inevitable word-wrapping occur at the most visually appealing spots (after the forward slash character), it can be tricky – not all browsers interpret the word-break properties the same way.

I came across a wonderful technique that solves this.

Simply put, it uses the technique of adding a ‘non-visible space’ character (&#8203;) after each forward slash in the URL. The browser will happily wrap the text on those invisible spaces. This can be done in javascript something like so:

url = url.split('/').join('/&#8203;')

Just make sure you only add this to the _visible_ portion of the text, not the actual href attribute.
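The same idea can also be expressed with the zero-width space character itself (U+200B, the character that &#8203; encodes), which avoids relying on HTML entity processing. A small sketch – makeWrappable is an illustrative name:

```javascript
// Insert a zero-width space after each '/' so long URLs wrap at slashes.
// Apply the result to the link *text* only; keep the href untouched.
function makeWrappable(url) {
  return url.split('/').join('/\u200B');
}

var href = 'https://example.com/a/very/long/path/segment';
// linkEl.href = href;
// linkEl.textContent = makeWrappable(href);
```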

It works like a charm, breaking text after each / character when needed.

*note* this technique does not work out of the box with a WordPress site like this one, as WordPress mangles/processes the URLs when rendering the page, attempting to encode the ampersand character in the URL.

Ubuntu on the desktop – my experience


Approximately three months ago, I decided to take the dive and run Ubuntu as my primary desktop. I did it as an experiment, but have really quite liked the experience and I don’t expect to move back to windows, at least for my regular day-to-day use. I’ll likely keep a virtual instance of windows available for the times when I can’t get a windows program to run correctly on Ubuntu, but so far I haven’t missed windows at all.

Don’t get me wrong – it’s not been a perfect experience. But I’m an experienced software developer with a reasonable amount of Linux knowledge, so when faced with problems I had the tools to figure things out. That being said, I think for a lot of folks Ubuntu would be a really great alternative. So much of our computer usage these days is Web-based, and the modern browsers these days provide a really stable cross-platform environment for virtually all popular websites and needs. For those times when a windows program is your only alternative (or you just want to check something out), the ‘WINE’ windows compatibility layer does a remarkable job of getting a LOT of windows programs running natively on linux/Ubuntu.

One thing I quite like about the Ubuntu experience is the Unity desktop/launcher – it has some great easy-to-use features, such as multiple desktops, and easy task switching with previews. When I’m doing web development, it’s not unusual for me to have 10 or more windows open at the same time, so those features really help me organize my workspace.

I still occasionally find myself ‘searching’ for the right way to accomplish some minor task (like restoring a minimized window), but I recently found this great ‘cheat sheet’ for Ubuntu, which I highly recommend Ubuntu users review while experimenting with the features it highlights. Here’s a direct link to the document – I couldn’t find a link to it on the author’s blog or I’d have sent you to his blog post directly…


New blog platform

Well, I finally gave in and migrated the old blog to WordPress. I was able to export the old blog posts into WordPress, but it did require a fair bit of editing of things like post dates and statuses (draft, published, etc.). It also did not export comments. Seeing as I only had a few comments :), I wound up adding those by hand, which didn’t preserve the date/timestamps. And since I’ve already taken down the old blog, it’s kind of tricky to recover the old timestamps for those comments…

The biggest remaining issue is that not all of the URLs and slugs match the old posts perfectly. Many of them are fine after tweaking the permalink settings in WordPress to match the old blog format, but WordPress has renamed some of the article names/slugs, and resetting those looks like a manual process…

ubuntu printer install

I got a new Lexmark Pro715 printer yesterday, but had some problems installing it in ubuntu. I finally got it working and thought I’d drop a note here for future reference.

tl;dr version

install the printer utility from Lexmark’s support site – don’t bother looking for printer drivers. After install, search for ‘lexmark’ in the Dash GUI, as the command-line install does not indicate how to run the utility. After install you must:

sudo chown root /usr/lib/cups/backend; sudo chown root /usr/lib/cups/filter


Initially I was confused about what printer driver I needed to download. It seems that there are ‘printer drivers’ listed for the Red Hat/SUSE linux editions, but only a ‘printer utility’ and ‘scanner drivers’ available for Ubuntu. All of the documentation I could find indicated that I needed the generic Debian driver (seemingly the same as the SUSE one, except packaged as .deb).

I successfully installed the printer utility (sudo dpkg -i filename) in case it included the drivers, but there was nothing to indicate what binary to run after the install. Running a few of the candidates on the command line led to cryptic error messages. It turns out that you need to search for the utility in the Ubuntu Dash and launch it from there. After the setup wizard completes, the printer will be installed and available in the list of printers.

Printing failed when I first tried printing the test page, with a cups-insecure-filter error. I solved this by:

$ sudo chown root /usr/lib/cups/backend
$ sudo chown root /usr/lib/cups/filter

All is working fine now!

Dreamhost Trac misconfiguration – how to get authentication working for Trac on Dreamhost

Dreamhost is a great hosting company, and provides a lot of very nice ‘one click installs’ of common software packages. I sometimes use Trac for managing hobby development projects, and the Dreamhost one click install worked great – except when it came to setting up authentication (requiring login).

It is simple enough to set up .htaccess and .htpasswd files based on the Trac documentation, but authorization fails for all javascript, css and related files (prompting the user with multiple login dialogs). After much searching about, I finally found the solution to the problem.

Simply put: the installer misconfigures the trac.ini file for the htdocs_location setting. Rather than using a relative path, it uses an absolute path with full domain name, which causes issues with the authentication configuration when using .htaccess files.

The solution is to change:

htdocs_location =

to:

htdocs_location = /trac/htdocs

Of course, substitute the correct/actual path to the htdocs folder if you chose a custom name/path for your Trac install.

Works like a charm!

Comet or Long Loop message pattern – put an end to polling!

A while back I posted a possible solution for dealing with long-running processes in a web application. While that solution works for very basic processes, the use of threading in an application can cause a lot of grief (there are just too many ways outside of your control for those threads to be aborted prematurely).

I did a little research and came up with a MUCH better solution – simply execute the ajax request for the long-running process, and then listen for messages on another ajax request. The key to making this work in IIS/.NET, however, is to ensure that your long-running process is a SESSIONLESS request; otherwise it will block further ajax requests until it has completed.
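In outline, the client side of the pattern can be sketched like this (startJob and nextMessage are hypothetical stand-ins for the two ajax endpoints; in a browser they would be fetch() calls against your own API):

```javascript
// Fire the long-running request, then wait on a message channel instead of
// polling on a timer. Each nextMessage() call blocks server-side until the
// server has something to say (the 'long loop').
async function runLongJob(startJob, nextMessage, onProgress) {
  const result = startJob();          // kick off the long-running request
  let msg;
  do {
    msg = await nextMessage();        // held open by the server until a message arrives
    onProgress(msg);
  } while (msg.status !== 'done');
  return result;                      // resolves once the job itself completes
}
```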

Javascript videos – by Douglas Crockford of YAHOO

If you are a web developer, you almost certainly need to program in Javascript. If you need to program in Javascript, you need to watch this series of video presentations by Douglas Crockford.

Hopefully most of you (web developers) know who Mr. Crockford is, but for those that don’t recognize his name: he works for Yahoo, and is a well-known author and presenter on Javascript topics. He is a member of the ECMAScript standards body and a general Javascript guru – developer of JSLint, author of the JSON spec and of JavaScript: The Good Parts.

These lectures were given to (some members of) the Yahoo development team. The first lecture is a fascinating history of computing and language development which is really informative and sets up the other lectures (on Javascript) really well. If you don’t have the time for the first lecture you can dive in on the second one and get right into the language implementation details, but I really do recommend you start with the first video. Each presentation is about 2hrs long, so make sure you’ve set aside enough time – it will be worth it!

I can’t recommend this enough – if you’re serious about your professional development as a web developer, Mr. Crockford’s material is must-read and must-watch.