boogdesign posts

Longer posts on standards based web design, portable web development and Linux, intermingled with some stuff on my other nerd interests.

Rob Crowther, London, UK based Blogger, Web Developer, Web Designer and System Administrator - read my Curriculum Vitae


Category: Management and Communication

03/04/10

10:11:26 pm London Web Standards: JavaScript with Frances Berriman and Jake Archibald

Categories: Front End Web Development, Management and Communication

Review: LWS March: JavaScript - The events that get left behind & Pro-bunfighting at The Square Pig, 30 - 32 Procter Street, Holborn, London, WC1V 6NX 19:00 to 20:30

This event was focussed on JavaScript, specifically the Glow 2 library from the BBC which both speakers are working on. Frances talked about coordinating work on a JavaScript project with a geographically distributed team, discussing some teamwork strategies and demonstrating some useful tools. Jake talked about the nightmare that is DOM Level 2 keyboard events and how he'd worked around the issues in Glow 2. Both of them were funny and engaging speakers, so much so that I feel I could hardly do them justice with a textual repetition of their talks. That and I didn't really take detailed notes this time, and had to leave before the Q&A... So, for a summary of what was said check out the usual live blog of the event from Jeff, meanwhile I'm going to have a play around with three of the JavaScript tools discussed: JSDoc, QUnit and Glow 2 keyboard events.

JSDoc

In team environments developers are supposed to liberally comment their code and also document the requirements before coding, and the APIs afterwards. Inevitably anything that developers are asked to do which is not strictly coding tends to take a back seat, especially when deadlines are approaching. Documentation requirements therefore get skimped or skipped, and often are not kept in sync as the code evolves. In the Enterprise development world this issue led to the development of tools like JavaDoc which, if you write your comments in a particular format, will automatically generate nicely formatted documentation for you when you've finished. This halves the amount of documentation you have to write, increasing the chance developers will do it, and keeps that documentation close to the source code to which it pertains, improving the chance it's kept up to date. JavaDoc has been much imitated, and the JavaScript equivalent is JSDoc.

So, how do you use JSDoc? First you have to download and extract the latest version; you'll also need a Java runtime installed, version 1.5 or later. Now you have to adjust your comments slightly, assuming you have comments in your code already :) I didn't, so I grabbed the complex.js file I'd copied out of the Rhino book for my post on the canvas element. You can run JSDoc on that file as it stands and it will generate some documentation. Assuming you're in a command prompt in the directory where you extracted JSDoc (and, er, you're on Linux - adjust the path separators if you're on Windows), issue a command like this:

java -jar jsrun.jar app/run.js -a -t=templates/jsdoc/ ../code/complex.js
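On Windows the equivalent is the same jsrun.jar invocation with the path separators swapped - something like this (an untested sketch):

java -jar jsrun.jar app\run.js -a -t=templates\jsdoc\ ..\code\complex.js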

The command will produce a directory called out containing the generated documentation. This is what the basic documentation looks like:

The results of JSDoc on a standard JavaScript file

As you can see (check out the full results), it's found the constructor function but not much else - everything is in the global namespace. Let's annotate the comment before the constructor function and see what happens. This is what it looked like before:

/*
 * The first step in defining a class is defining the constructor
And here's what it looks like when it's JSDoc enabled:
/**
 * The first step in defining a class is defining the constructor
Can you see the difference? Look closely at the first line - there's one extra asterisk. That extra asterisk is what tells JSDoc to look in this comment for 'special' stuff. At the end of the comment I'm going to add an annotation:
 * @constructor
 */
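Put those pieces together and the complete, JSDoc-enabled comment in front of the constructor looks like this:

/**
 * The first step in defining a class is defining the constructor
 * @constructor
 */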

Run the command again and suddenly there's a whole load more stuff:

The results of JSDoc on a JavaScript file with a single annotation

In the new version of the documentation there's a Complex class, and it's found all the methods. However, your fellow coders may appreciate a bit more than the basics - perhaps you want to document what the expected parameters and return values are? Additional annotations follow the same pattern as @constructor - an at sign followed by a keyword. Some of the other keywords let you provide additional information; here's what parameters and return values look like:

/**
 * Add two complex numbers and return the result.
 * @param a {Complex} A complex number
 * @param b {Complex} A second complex number
 * @returns {Complex} The sum of a and b
 */

Now JSDoc adds your comments to the output, and additionally provides links to any other types you've defined:

The results of JSDoc on a JavaScript file with several annotations

You can have a look at the final output here.

QUnit

Unit testing is something I keep thinking I should learn how to do. The BBC Glow team is using the jQuery unit testing framework QUnit, so this seems like a good excuse to investigate it.

QUnit is very easy to set up: you just need to link to jQuery and the two QUnit files in your document head:

<script src="http://code.jquery.com/jquery-latest.js"></script>
<link rel="stylesheet" href="http://github.com/jquery/qunit/raw/master/qunit/qunit.css" type="text/css" media="screen" />
<script type="text/javascript" src="http://github.com/jquery/qunit/raw/master/qunit/qunit.js"></script>
Then provide some HTML framework for the results to appear in:
<h1 id="qunit-header">QUnit example</h1>
<h2 id="qunit-banner"></h2>
<h2 id="qunit-userAgent"></h2>
<ol id="qunit-tests"></ol>
Now you need to write some tests. To do this, simply add a function to $(document).ready that calls the test function:
test("Passing tests", function() {
  ok(true, "The truth is true");
  equals(1,1, "One is one");
});
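For completeness, here's that test wrapped in jQuery's ready handler, as described above - a minimal sketch of the whole script block:

<script type="text/javascript">
$(document).ready(function() {
  test("Passing tests", function() {
    ok(true, "The truth is true");
    equals(1,1, "One is one");
  });
});
</script>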

These are obviously quite basic tests, and not actually testing anything, but they demonstrate how easy it is. The ok test accepts a 'truthy' value and a string and succeeds if the value is truthy; the equals test accepts two values and a string and succeeds if they're equal.

Basic QUnit tests
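If you want to see what a failure looks like before testing anything real, an assertion that's guaranteed to fail will do - a hypothetical example, not one from my test page:

test("Failing tests", function() {
  ok(false, "This assertion always fails");
});

QUnit marks the failed test in red in the results list and flags the whole run as failed in the banner.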

To try some less basic tests I once again dragged out the complex.js file. The test functions can contain more than just assertions; you can put as much JavaScript in there as you need to set up your test. Here's what I ended up with:

test("Basic class functionality", function() {
  expect(4);
  var real = 1.0;
  var imaginary = 1.0;
  var c;
  ok( c = new Complex(real,imaginary), "Class created" );
  equals( c, real, "Simple value comparison" );
  equals( c.toString(), "{" + real + "," + imaginary + "}", "String comparison");
  equals( c.magnitude(), Math.sqrt(2.0), "Magnitude comparison");
});

You can see I've used the ok test to confirm the object gets created, then done some simple comparisons to make sure the object created is what I expect. Unsurprisingly, all my tests pass:

Slightly less basic QUnit tests

Of course, this just demonstrates that QUnit is very straightforward to use rather than that some code I nicked out of a book works perfectly. There's a whole art to writing unit tests over and above the simple mechanics of the framework you're using, but I'm definitely not the person to be telling you about that. If anyone knows of any good, JS-oriented tutorials for doing test-driven development, please leave a comment.

Glow 2 Keyboard Events

Jake talked about the mess that is keyboard events in current browsers, and how he set about fixing it in Glow 2. The problem with keyboard events in browsers can be summed up by a quick look at the W3C DOM Level 2 Spec:

A visual comparison of the size of the mouse event spec with the much shorter keyboard event spec

Given the lack of spec it's hardly surprising that the browsers have all implemented keyboard events slightly differently. Not only have the browsers implemented things differently from each other, there are also differences between the same browser on different operating systems. Throw in the fact that keyboards in different countries have different sets of keys, and it all gets a bit messy. In Glow 2, the keyboard events have been normalised to keydown, keyup and keypress, with the same properties on the event object across browsers.
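To get a feel for why that normalisation is needed, here's roughly the sort of juggling you end up doing by hand without a library - a sketch of the general cross-browser problem, not Glow's actual code:

// A sketch of the cross-browser mess, not Glow's implementation:
// extracting the typed character from a keypress event.
function getChar(event) {
  event = event || window.event;                 // older IE puts the event on window.event
  if (event.which == null) {
    return String.fromCharCode(event.keyCode);   // IE: character code lives in keyCode
  }
  if (event.which != 0 && event.charCode != 0) {
    return String.fromCharCode(event.which);     // most other browsers
  }
  return null;                                   // a special (non-printable) key
}

Glow 2 hides all of that behind consistent event objects, so you only have to write the handler once.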

I've downloaded the Glow 2 source to have a go with these keyboard events. After downloading the extra libraries I've managed to get it to build with Ant, but I've not got the build itself working in a browser yet. It could be me, it could be a bug, I haven't figured it out yet - when I do I'll fix up my keyboard event example application and update this post.

Another excellent event - the two best speakers of the three events I've been to. 5 out of 5. Watch out for the next one on 26th April.


17/10/07

06:27:48 pm IT'S A MASHUP: The End of Business as Usual

Categories: Blogging and Internet Culture, Management and Communication

Review: Andy Mulholland - The End of Business as Usual at BCS, 5 Southampton Street, London WC2 October 15th, 18:15 to 20:15

I went to this BCS North London branch event because they usually have an 'enterprisey' slant and this one was supposed to be about Web 2.0 and mashups, which is not something I regularly associated with enterprise IT. Andy Mulholland was a very good speaker; it seems like we got the same presentation he regularly gives to boards of directors, and the slides are available from the link in the previous paragraph. From now on I'm going to assume you've looked at them and list some of the things Andy discussed while he was showing the slides that stuck in my mind (i.e. for an overview of the whole talk, read the slides).

The key trend affecting enterprise IT in the drive to web 2.0 is that users and consumers are now driving technology adoption: they get used to things at home and start to ask themselves why they can't use similar tools at work. As the proportion of tech-literate vs tech-illiterate clients and workers shifts in each industry, we pass an 'inflection point' past which businesses have to change to remain competitive. There are some businesses where this has already happened: travel, retail, music. The traditional business view of IT products is characterised by "If I purchase this, I can work more cheaply"; the user-led change of priority comes from the perspective "If I purchase this, I can work more effectively."

There are some common traits of businesses which 'get it', which can easily be contrasted with more traditional business practices:

New: Amazon leads with the most popular items responding to external demand
Old: Barnes and Noble leads with its internally defined offers

Right: eBay allows external demand to create new markets and indexes
Wrong: CommerceOne failed as it defined the markets that it would make available

Aware: Google business model continuously improves, people explore for the new
Adaptive: Traditional Software business model depends on set upgrade offers periodically

Innovative & Money Making: Second Life participants create over 7 million lines of code a week to improve the environment. As of December 2006, 456 people earn over $500, 29 over $5,000 and 2 over $25,000 every month from participating in Second Life. About 500,000 Chinese work in 'gold farms' creating superior players and selling them.

Web 1.0 was characterised by content; web 2.0 is characterised by contacts, or community. This reflects a general shift for the knowledge worker: 20 years ago, 80% of the knowledge they needed to do their job was in their heads; now only 20% is in their heads and the rest depends on them exploiting the vast information resources available - far more than they can sift through by themselves.

Finally Andy discussed how to build a business case for mashups (and web 2.0):

  • Not all valuable business interactions involve a transaction
  • Front office to back office integration depends on open standards
  • We are fixated on productization. Move the value proposition from the box to the knowledge.
  • Wrong question: "If I had Google Apps, what would I save over MS Office?"
  • Right question: "What can I do with Google Apps that I can't do with MS Office?"

Overall this was an excellent talk, 5 out of 5, which may not be obvious from my potted summary. If you have a chance to see Andy Mulholland speaking in person I recommend you take it.


06/06/07

11:55:13 pm In-the-Brain of Erik Doernenburg: Why Agile Teams Fail

Categories: General, Management and Communication

Review: Why Agile Teams Fail at Skills Matter, Sekforde Street, London 18:30 to 20:30

Erik started with a discussion of what motivates people towards agile practices. He quoted a recent study (I'm not sure, but I think he meant this one from the IEEE) which arrived at the conclusion that 15% of 'waterfall' projects 'succeeded', whereas 40% of 'agile' projects 'succeeded' (gratuitous quotes because he wasn't sure how the words were actually defined in the context of the study). It's nice that agile practices can result in such a big improvement in success rates, but Erik didn't think this was the main reason they were embraced; he thought the motivation came more from self-interest as a result of the threat of outsourcing. If coding is a commodity, then it can be outsourced, and it makes sense to outsource to the cheapest supplier. But if software development is a process which relies on regular face-to-face meetings between developers and customers (i.e. agile) then that's much more difficult to outsource.

After this short introduction, he had ten slides for his top ten (in no particular order!) reasons why agile teams fail. It's fair to say that a lot of the reasons, like agile practices themselves, reinforce one another:

#10 Believing in Myths
There are a lot of myths which have built up around agile methodologies - pairing costs twice as much; no documentation is required; agile practices are 'cowboy coding'; Scrum is agile. Erik went through each item one by one and dismissed them: reducing the amount of documentation is not the same as having no documentation - just produce the documents which will actually get read; there is more to pair programming than one person coding while the other does nothing; and if you're following a recipe out of a book then you're not really doing agile. His main point was that believing in the myth is likely to cause the myth to become reality.

#9 Using Controversial Terminology
Common agile buzzwords are actually quite frightening to managers - say 'extreme development' and they hear 'risk'. The founding fathers of the agile movement chose the terminology quite badly from the viewpoint of selling the practices to conservative middle management (originally they wanted to call it 'Adaptive Development' but apparently there was already a product using that name). The buzzwords don't really capture the value proposition involved in the practice; for instance, if 'pair programming' was instead called 'constant code reviews' there would be less opposition to the practice. Unit testing is not extra code which has to be written, but is actually part of your software design and specification, and one of its main benefits is encouraging you to think about the external interface of your class before you implement it, reducing the risk of producing a tightly coupled class structure without realising it. There was a short discussion on refactoring: how do you justify to a manager re-writing code which already works instead of writing new code? Erik thought this objection was a result of a blinkered view of the software development life cycle - in the real world software is never finished, and maintenance is important. He also pointed out that if you have separate development and maintenance teams then there's no incentive for the development team to produce maintainable code. The main point of the slide was that you don't need to use the standard terminology and frighten off managers; find your own terms which describe the value proposition involved in the practice.

#8 Missing Key Roles
Agile development doesn't mean you can do away with business analysts and testers; they are still needed just as much as developers. Erik had a slide which showed how a typical one-week iteration would actually last three weeks with the involvement of the full team: while the testers work on iteration six, the developers can be working on iteration seven and the business analysts preparing iteration eight. From this slide we got into an interesting discussion about how to estimate how many stories should go into an iteration. Erik talked about 'T-shirt sizing' for stories - triaging them into small, medium and large, then working out how many stories you can fit into the next iteration from how long, on average, it took to do each size of story in previous iterations. He also talked about having 'iteration level stories' and 'release level stories', as well as how to stop the testers (working on the previous iteration) destroying the momentum of the developers (working on the current iteration) - basically, do the small ten or twenty minute fixes but write up new stories for the more involved defects for consideration in a future iteration.

#7 Overdoing It
This echoed an earlier slide, so I'll just quote it:

  • Make sure every document has an audience
  • Make sure every feature solves a problem
  • Drop anything without value
  • Don't micromanage
  • Don't draw a state diagram

The bottom one was slightly tongue in cheek, but the overall point was don't do any unnecessary work which no-one is going to look at.

#6 Cherry-picking Practices
The agile practices support and reinforce one another; you don't gain much benefit from only doing one or two of them. Having said that, the point of agile methodology is to adapt it to suit your environment, but that shouldn't be haphazard. Decisions should be made by the people most involved, and you should stick with your changes for at least three iterations before evaluating whether or not they're working for you. If you flip-flop back and forth you only end up invalidating your tracking data.

#5 Lacking Discipline or Courage
Following on from the last slide - if you don't commit fully to agile practices you're not going to get the benefits. Skipping writing unit tests can be good in the short term, but makes refactoring harder and means you're likely to miss some of the effects of your later changes, having the discipline to stick with it will reap rewards in the longer term. Similarly, have the courage to admit when something isn't working and change it.

#4 Failing to Ask for Help
Again following on - don't be afraid to ask for help (and not just from Erik's company, ThoughtWorks!). Experience counts for a great deal in agile methodology. There's a reason why the master-apprentice model of passing on expertise lasted for a long time, and it's only recently that we've started to entertain the notion that we can learn everything we need to know from books. Remember, most process books are describing what worked for the author, not necessarily what will work for you.

#3 Using Iterations that are Too Long
The key feature of an iteration is not how much you can get done in one, but the fact that it offers you a decision point. Nothing to do with velocity, everything to do with direction. If you have three-month iterations then you only get four chances a year to change direction, and if you've spent two months going in the wrong direction that's wasted work. Having said that, if your average story takes four days of development time to complete then one week is probably too short - the keyword, once again, is adapt.

#2 Failing to Find a Good Sponsor
There's no point even attempting agile methods if there's no support for them higher up the management chain. The environment needs to support it - from basic things like build servers for continuous integration to things like having desks where it is easy for two people to sit and work together without getting disturbed or disturbing others.

#1 Failing to Rally the Team
In this last point Erik talked about some of the fun things you could do to keep the developers interested - this centred around lava lamps and wireless bunnies attached to build servers - but the general point was to ensure everyone was having fun and stayed committed to the project. If you have an uninterested development team then no methodology is going to save your project.

An excellent talk and discussion, the audience really added value to the presentation with their questions and comments I thought, and Erik really encouraged this level of interaction, 5 out of 5. [Slides]


21/05/07

09:53:16 pm More communicating with geeks

Categories: General, Management and Communication

After my post two weeks ago, where I exhorted non-technical managers to improve their communication skills rather than distract their technical staff with improving theirs, I came across two articles this week that struck me as being very related.

First was an article on eWEEK (Where Will All the Cowboys Go?) on a strongly related subject:

Over the next five years, business will become so deeply embodied in technology - and technology so embedded in the business - that it will change the way IT is managed by an organization ... Forrester said that CEOs will now be less clueless about technology because this changing dynamic will demand from them a fighting level of technology know-how.

The basic argument, from Forrester Research no less, is that companies will become so embroiled in their IT that they will be at a competitive disadvantage if the high level executives have no understanding of it. No arguments from me there, better understanding of technology would undoubtedly allow management to communicate better with their technical staff.

Unfortunately they spoil it all in the second half of the article:

While it used to be that 100 percent of the people in IT had IT skills, this may fall all the way to 20 percent, said Cameron. IT itself will be pushed to the fringe, where there are highly fluid jobs, and the innovation and the invention will be there.

Aside from the obviously stupid assumption that the possession of IT skills is an entirely binary proposition, if only 20 percent of your IT people have IT skills, then only 20 percent of those people are actually in IT; the other 80 percent can only be described as 'support staff', or perhaps 'management'. Think about that for a minute - you have a department dedicated to IT, and yet only 20 percent of that department can actually do IT work. How productive is that likely to be? And I'm not sure how this conclusion relates to the first part of the article, which was claiming that executives need to gain IT skills; it doesn't seem the obvious conclusion, quite the reverse in fact. I suspect this may be a case of Forrester vice-president and principal Bobby Cameron not having enough IT skills to understand the implications of his own company's research, but I'd have to purchase the report to be sure...

The second article I came across was A Security Market for Lemons, which later led me to the similar Web Hosting - A Market for Lemons:

The term comes from a paper by economist George Akerlof dealing with information asymmetry. If that's a dry sounding term, the concept is actually simple: in some markets, the seller knows more than the buyer does, or vice-versa ... Shady dealers will try and sell low-quality products at a higher price, and sellers with high-quality products will face buyers who are less willing to pay high prices because of the risk they might get a "lemon".

I don't think it's a coincidence that the same economic theory applies so neatly to two disparate areas of IT - I think it applies generically to almost anything which is part of modern corporate IT. Let's apply it to recruitment: the buyer in this case is the manager who, as it happens, has little understanding of IT. He can't tell the difference between candidates' levels of technical proficiency, so he chooses a variety of mediocre signals to differentiate the candidates - such as their communication skills. For this job it's not in the interest of a prospective candidate to have good technical skills; he's better off having good communication skills. So the manager, over time, ends up with a set of staff who have less than optimal technical skills, and candidates, over time, become less technically skilled, because that's not what gets them hired. How do we mitigate the effects? The best way is to eliminate the disparity of information. Which, in this example, means improving the technical skills of the manager.



08/05/07

11:26:13 pm Communicate with your geeks

Categories: General, Management and Communication

Last week I was reading a blog post on Angry Aussie - 6 essential communication tips for IT workers and it set me to thinking:

So why are communications skills being focused on in IT roles where this has not traditionally been a prime requisite. In short, this is one of the biggest problems in the IT industry worldwide.

It's certainly a popular topic among the sort of blogs I read regularly (which, basically, is stuff that appears on programming reddit), and the prevailing wisdom seems to be that geeks need to improve their communication skills so that their ideas are better represented within the business they work for. I think this is good advice for an individual in today's employment marketplace - if you want to get promoted (or even hired), improving your communication skills is more likely to bring results than improving your technical skills. What this blog post is about, however, is why this individual behaviour shouldn't be enshrined as a business-wide strategy.

Let us consider the stereotypical 'geek' and the stereotypical 'suit'. The 'geek' has great analytical skills and devotes his time to solving technical problems and generally tinkering with technology in order to better understand it. The 'suit' has great communication skills which he uses to schmooze fellow suits into beneficial business relationships (I'll leave the question of who it's beneficial to open for now). It turns out that the 'geek' and the 'suit' have trouble communicating with each other, as documented frequently across many blog posts, and, as I mentioned above, the solution is apparently for the 'geek' to learn better communication skills. So what we're really saying is that the 'geek' needs to sacrifice a portion of his technical skills so that he can learn to communicate with the 'suit'. Let me now introduce the things I started thinking after I read Angry Aussie's blog post:

  • Do we really want our technical experts becoming less technical?
  • If the business people already have such great communication skills, why can't they communicate with the technical experts?

To expand on the first point - the simplifying assumption is that your knowledge and skills are a zero-sum type thing, to become good at something new you have to devote time to it, which means less time devoted to keeping up with things you already know. So if our 'geek' wants to devote time to improving his communication skills this is going to take away from the time he spends tinkering with technology. You might want to argue that the time our 'geek' spends spouting off into the blogosphere echo chamber is time he could devote to something else with no impact on his technical skills, but I'm going to assume he gets some small benefit even from that.

The second point is the crux of my argument. It seems to be accepted wisdom that our 'suit' has great communication skills whereas our 'geek' doesn't. But what it looks like to me is that we have technical experts who can only communicate with other technical experts, and business people who can only communicate with other business people. That's not one group of people with great communication skills and one group of people with poor communication skills; that's two groups of people with equally bad communication skills. If a business's only solution to this problem is to turn the 'geek' half way into a 'suit' then their technology is going to suffer. Eventually a competitor is going to come along whose business people really do have great communication skills and they're going to get eaten for lunch. This new business is going to be able to leverage both ends - they have better technical people, because they're allowed to concentrate, and better business people, because their business people really are good communicators.

I realize this is something of a straw man argument, and in real life people's abilities are spread over a spectrum rather than the black and white examples above, but I think it's also clear that while improved communication skills can be advantageous for individuals in technical roles this doesn't absolve those in business roles from also making an effort.



27/03/07

11:44:50 pm Learning and Knowing

Categories: General, Management and Communication

After yesterday's post - which I will summarise as "look how much I still have left to learn" - I wasn't really satisfied with the ending. You can tell I wasn't because I added a further ending in a comment later on, but I still had the feeling that I hadn't really said what I wanted to say and that stuck in my mind a bit. This afternoon I was browsing through sys-con.com's archive of magazines and I came across an editorial by Joe Mitchko - 'Are you a WebLogic Expert':

The question is so difficult to respond to because the term "WebLogic" covers a lot of ground. I am often left puzzled for a day or two, asking myself, should I be a WebLogic expert? If I were an expert, what would I know beyond what I already know?

This chimed fairly strongly with the nascent and nebulous thoughts which had been swirling around in my head since the previous day - you can take the term "WebLogic" out of the question and the quote, leaving aside for a moment the definition of 'expert', and replace it with almost any web technology or buzzword and the comment above will still apply.

Try it with CSS. Accessibility. Ajax. Web 2.0. Pick one where you feel fairly comfortable that you have the knowledge to 'get by'.

If I were a JavaScript expert, what would I know beyond what I already know?

It's impossible for me (and, in fact, anyone else) to know the answer to that question; the only way to find out what there is to know beyond what I already know is to actually go out there and look for it. That's going to be the same whatever term you plug in there - to find the limits of your knowledge you've got to embrace the unknown.

Dealing with the unknown can actually be pretty scary; in a sense you're choosing to put yourself in a position of weakness and potentially setting yourself up for failure. And this is where my thoughts ran into some blog posts I was reading yesterday - Worse Than Failure published an article from Coding Horror's Jeff Atwood, who posted a follow-up on his own blog:

I can absolutely guarantee that the kinds of developers who could benefit most from reading WTF simply do not-- and never will-- read the WTF website

It struck me that (massive generalisation warning) the sort of programmers who don't need to read WTF (and Coding Horror) are the ones who're exploring the frontiers of their own ignorance by themselves; the ones who need to read it and don't are the ones who want to feel safe and secure in their current set of knowledge. The example I posted yesterday - the previous solution to the '08' problem, the function removeLeadingZeroes - was a non-exploratory solution: you know how to manipulate strings, so use that knowledge to strip the zeros off the front and solve the problem. The exploratory solution, though admittedly not an awful lot of exploration in this case, involved first realising the need to explore the unknown and then doing it.

It sounds much grander than it actually was in this case, but I think the general principle will hold up in more exacting cases. Scientific studies have shown that nearly everyone believes their ability to be "above average" - a comforting lie we tell ourselves, and one we can keep telling ourselves as long as we don't venture out into the unknown. The path to being an 'expert' is to always assume there's something you don't know, and then go and find out what it is.

