
Plain Writing = better ROI – Internationally

July 5, 2011

My recent trip to the Plain 2011 conference in Stockholm, Sweden was well worth it. And that's saying a lot, since I covered my own costs for this three-day international conference. I attended because the USA now has a Plain Writing Act, which requires all federal agencies' public-facing published content, especially online, to be written in Plain Language as of October 2011. I wanted to learn what other countries have learned from their experience with Plain Writing laws. The sessions I enjoyed most were:

Annette Cheek (USA), presentation – Google Docs

Annette gave an insightful and amusing history of Plain Language, and an overview of how a small group of people decided to create a federal law – and how they did it. That alone was worth the trip across the Atlantic. I asked how agencies will be measured on whether their websites are written in Plain Language. Maybe the ACSI survey could ask users to rate the simplicity of the writing? Maybe it will 🙂
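
User ratings are one way to measure simplicity; an automated readability formula is another that some writers use to check their own work. As a rough illustration only (the Act does not mandate any particular formula, and the syllable counter below is a crude heuristic of my own), here is a minimal Flesch Reading Ease sketch in Python:

```python
import re

def count_syllables(word):
    """Rough syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    """Flesch Reading Ease: higher scores mean easier text.
    Plain-language guidance often targets scores of 60 or above."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

dense = ("Pursuant to the aforementioned regulatory requirements, "
         "remittance of documentation is obligatory.")
plain = "You must send us the documents. The law requires it."
print(flesch_reading_ease(dense))  # low score: hard to read
print(flesch_reading_ease(plain))  # higher score: easier to read
```

The plain version scores far higher than the bureaucratic one – which is the whole point of the Act.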

Helena Haapio (Finland) – Communicating Contracts – when text alone is not enough.

Because so many of my agency's staff are contracting officers, I was eager to learn more. Helena's handout had many tips on using tables and illustrations, plus examples of succinct, simply written phrases. She also talked about the well-known example of The Comma That Costs 1 Million Dollars (Canadian). Rogers Communications and Aliant had a dispute over a contract containing a 45-word sentence with a comma that a court interpreted as meaning that Aliant could renegotiate its five-year contract after only one year. My contracting officer coworkers have found this useful and interesting.

Thomas Mueller (Germany), of Siegel+Gale – The Cost of Complexity

This presentation stated that brands that provide customers with simple communications and processes see a better ROI. Also available is the 2010 Global Brand Simplicity Index: United States. An alternative title could be: if you aren't writing in plain language and providing excellent usability to your customers, you could be leaving money on the table. The lowest-ranking brands for simplicity were utilities and financial institutions. If you had invested your own money in the companies rated highest for simplicity in their communications and interactions, your investment would have outperformed all others. Siegel+Gale has a blog with entries on simplicity – worth reading.

Sue Owen (Australia) – Using plain language in emergency warnings.

Sue spoke of how – tragically – many people did not understand that they should evacuate during the bushfires and floods in the state of Victoria, and subsequently lost their lives. A hearing determined that the warnings were written in dense, unclear, and ambiguous terms. Afterwards, new guidance was issued requiring disaster warnings in Victoria to use clear language, avoid euphemisms, and contain specific information in relation to:
• the severity, location, predicted direction and likely time of impact of bushfires on specific communities and locations; and
• the predicted severity of impact of the bushfire and whether a specific fire poses a threat to human life.

The warnings must address and answer these questions:

  • where is the threat?
  • when will it be here?
  • how bad will it be?
  • what can/should they do?

Cathy Baskerfield (Australia) – A synopsis of research on writing for people with limited literacy skills.

Her presentation covered research on how to better reach readers with low literacy. In most countries, at least 20% of the population copes with low literacy (immigrants, people with disabilities, the elderly, indigenous people). She postulates that Plain Language and information design are inherently interdisciplinary. Slides 26–33 show that combining good graphics with text:

• Can improve comprehension
• Can support relationships among ideas
• Can show spatial relationships
• Can change adherence to information – for example, taking prescription tablets


Usability vs. UX: analysis of case studies

April 12, 2011
1. Sullivan, Patricia. "Beyond a Narrow Conception of Usability Testing," IEEE Transactions on Professional Communication, 32, 4 (December 1989): 256–264.
Results: Suggests new frameworks for viewing usability study methods and interpreting the validity of their results. Postulates that "a growing number of psychologists, engineers, and technical communicators want to make the user more integral to the whole development process."
Tools: An analysis of others' methods and questions.
Notes: The Plain Language movement probably had some influence on her too, although it is not cited.
2. Hassenzahl, M. and Tractinsky, N. "User Experience – a Research Agenda." Behaviour and Information Technology, 25, 2 (March–April 2006): 91–97.
Results: Suggest a new theory of UX in which designers exert control to ensure that a positive experience becomes certain. UX is about contributing to our quality of life by designing for pleasure rather than the absence of pain.
Tools: Conducted a literature review of proposals received.
Notes: One can sense the rhetorician at work, crafting the pleasing experience and downplaying any lack of quality – the iPhone antenna problems, for example.
3. Nielsen, Jakob. Writing for the Web.
Results: Suggests many best practices to follow, suggests further study of papers and books – and then finally recommends that one enroll in his courses.
Tools: Years of usability studies and analysis drawn from that body of work.
Notes: A website's rhetoric will be less effective if users find it difficult to read. Notable in that, in a brief space, it covers many salient points about how people read – and how writers should take this into account when creating online communications.
4. Obrist, M., Roto, V., and Väänänen-Vainio-Mattila, K. "User experience evaluation: do you know which method to use?" CHI 2009, April 4–9, 2009, Boston, Massachusetts, USA. Extended Abstracts 2009: 2763–2766.
Results: Unknown – this was an abstract. However, the questions were particularly illuminating.
Tools: Contributions from conference attendees on currently known methods; creation of a Special Interest Group (SIG) to identify and gather people interested in UX evaluation in different application areas and contexts.
Notes: Can we ever really know how the user feels? Do they even know? Or can we only influence positive feelings and minimize negative ones?
5. Bevan, Nigel. "What is the difference between the purpose of usability and user experience evaluation methods?" Internet paper.
Results: Bevan notes a weakness in the methods – no metrics or requirements. He states that "user experience seems to . . . focus on evaluation [which] has preceded a concern with establishing criteria for what would be acceptable results of evaluation." That comment was useful, as I, too, wondered where the UX standards were.
Tools: Rigorous analysis of the UX methods and creation of a categorization of the usability measures reported. He then compares and contrasts each method as to how it measures UX or usability.
Notes: Usable as a roadmap of what one is measuring and how to do it better.
6. Rodden et al., "Measuring the User Experience on a Large Scale: User-Centered Metrics for Web Applications," Proceedings of CHI 2010.
Results: Creation of a UX framework – HEART (Happiness, Engagement, Adoption, Retention, Task success). This was used to measure user satisfaction for a major redesign of iGoogle. They reported an initial decline in their user satisfaction metric (measured on a 7-point bipolar scale). However, the metric recovered over time, indicating that change aversion was probably the cause and that once users got used to the new design, they liked it. With this information, the team was able to make a more confident decision to keep the new design.
Tools:
• Happiness – measured via a weekly survey on a 7-point bipolar scale.
• Engagement – % of active users who visited 5 or more days in the last week.
• Adoption – how many new users (i.e., # of accounts created in a week).
• Retention – how many users are still present (i.e., % of 7-day active users in a given week still active 3 months later).
• Task success – efficiency (e.g., time to complete a task), effectiveness (e.g., % of tasks completed), and error rates.
Notes: It makes sense to add a scale to UX measurements. Couldn't it go to 11? Is it wrong to apply usability metrics to UX?
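
The Engagement and Retention definitions are concrete enough to sketch in code. This is a minimal illustration assuming a hypothetical event log of (user, date) pairs – the log format, names, and dates are my own invention, not from the paper:

```python
from datetime import date

# Hypothetical event log: (user_id, activity_date) pairs,
# invented for illustration.
events = [
    ("ann", date(2011, 4, 4)), ("ann", date(2011, 4, 5)),
    ("ann", date(2011, 4, 6)), ("ann", date(2011, 4, 7)),
    ("ann", date(2011, 4, 8)),
    ("bob", date(2011, 4, 4)),
    ("ann", date(2011, 7, 6)),
]

def active_users(events, start, end):
    """Users with at least one event between start and end, inclusive."""
    return {u for u, d in events if start <= d <= end}

def engagement(events, week_start, week_end, min_days=5):
    """Engagement: share of the week's active users who were
    active on min_days or more distinct days that week."""
    days_per_user = {}
    for u, d in events:
        if week_start <= d <= week_end:
            days_per_user.setdefault(u, set()).add(d)
    heavy = [u for u, ds in days_per_user.items() if len(ds) >= min_days]
    return len(heavy) / len(days_per_user)

def retention(events, week_start, week_end, later_start, later_end):
    """Retention: share of one week's active users who are
    still active in a later window (e.g., three months on)."""
    cohort = active_users(events, week_start, week_end)
    still_here = cohort & active_users(events, later_start, later_end)
    return len(still_here) / len(cohort)
```

For the week of April 4–10 above, engagement is 0.5 (only "ann" hit five distinct days) and retention against a July window is also 0.5 (only "ann" came back).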
7. Large organizations need to track and compare their online sales, customers, and trends such as shopping cart abandonment.
Results: Creation of an overall framework measuring several factors to better identify causality – the PULSE metrics: Page views, Uptime, Latency, Seven-day active users (i.e., the number of unique users who used the product at least once in the last week), and Earnings.
Tools: Most of this data is proprietary and unavailable.
Notes: Large e-commerce firms (Amazon, eBay, Facebook) do have in-house models and ongoing studies, but this data is not shared or publicly available.
8. How can Blackboard, Inc. better capture feedback and improve the UX of its web pages and software products? Presented at UX BarCamp DC in Jan. 2011.
Results: Blackboard created a framework for capturing user feedback – RUDES (Reliable, Useful, Delightful, Engaging, Simple). Users rate each experience on the RUDES components and are asked whether each component exceeds, meets, or misses expectations.
Tools: Unknown – appears to be a work in progress.
Notes: Blackboard staff stated that scaling factors were necessary to make better design decisions. They did not disclose how this data would be collected, analyzed, or used. Worth noting that the desired answer is positioned first – how good is a survey if one tries to influence it so strongly?
9. Fornell, Claes. (2011) "Citizen Satisfaction with Federal Government Services Plummets While Satisfaction With Government Websites Remains Strong." News release and commentary.
Results: Nonsensical – agencies' missions vary so widely that comparing satisfaction rates means nothing. Can one compare NASA to the IRS? TSA to DOI?
Tools: The pop-up ACSI survey reports scores, on a national-level scale, for more than 225 companies and over 200 federal or local government services, covering the causes and consequences of customer satisfaction.
Notes: The surveys vary among websites, so one federal agency's score is not comparable to another's – yet such comparisons are widely made.

Yet another misguided website re-do.

April 30, 2007

Last week, the company that manages my beach house launched a new version of their website, at

I looked at the new website and was surprised by the problems – all of them easy to avoid. I've listed them below.

1. The website doesn’t have a liquid layout. Note the funky blank margins on the right and left of the page. Why would anyone want to waste prime screen space?

2. The domain formerly was a URL for an adult website, so it is blocked by content filters. Here at work we use McAfee's content filters, and I can't view the new website at all. Why would anyone want to use a domain that many people may not be able to reach – or one that tells them they are trying to access an adult site?

3. The new site has pretty Flash pictures on the home page. The pretty pictures slow the loading of the page. Why would anyone want to make their potential renters wait? They can always bail and go to another rental agency’s page – the beach house rental market in North Carolina is quite competitive.

4. The “Rentals” section of the site now has a drop-down sliding menu that the user must navigate – very carefully – through over 400 homes. Why would anyone think this is better than simply listing them on a webpage, as was done on their old website, at

My beach house, Triple Play, is already rented for most of the summer. Good thing – because I would be really, really worried if I were counting on this new website for rentals.
