Archive for March, 2010

First Experiment with MAGic for Web Accessibility Testing

2010/03/30

Another developer and I recently performed a preliminary experiment using MAGic with Speech to determine its suitability for Web accessibility testing.  This work is part of my effort to find alternatives to using JAWS for the same purpose.  See my previous, related post, “Stop Using JAWS for Web Accessibility Testing?”.

[Note: I would like to hire, as a consultant, an experienced user of MAGic with Speech. Please contact me.]

Description of MAGic with Speech

MAGic is a screen magnifier for people with low vision or learning disabilities.  It not only enlarges screen imagery up to 36 times, but also allows adjustment of the tinting, brightness, and contrast of foreground and background colors.

Three important features for our test:

  • many of the same reading commands as JAWS;
  • reads aloud, in a synthesized voice, the text displayed on the screen; and
  • can highlight words as they are read aloud.

Specific Test Purpose

I want to determine whether using MAGic would solve a significant problem JAWS presents for accessibility testing: the Web content JAWS reads cannot be visually tracked.  This confounds sighted developers and people to whom JAWS is being demonstrated.

Background & Setup

I am sighted.  Rich, the other developer and a long-time JAWS user, is not.  We conducted the test in a quiet office on the MIT campus.  Installed on Rich’s computer were MAGic Standard with Speech 11 and NVDA.  On mine were JAWS 11 and, later, the same version of MAGic that Rich was using.  Both of us had previously tried MAGic, with Rich being the more familiar with its functions and use.

Procedures

We focused our test on three home pages: www.mit.edu, www.disabilityinfo.org and www.clearhelper.org.  We are familiar with them and know their accessibility to be good.  We simultaneously navigated each page multiple times.

The test had four phases:

  1. Rich used MAGic while I watched;
  2. Rich used MAGic as I used JAWS;
  3. Rich used JAWS while I used MAGic; and
  4. Rich used JAWS and MAGic together as I used MAGic.

With MAGic’s configuration tool, we enabled and disabled sets of primary functions.  Two we invoked often were speech only and speech with word highlighting.

Results

The first two phases proved problematic because Rich was unable to navigate page headings with MAGic.  To find a page’s main content, screen-reader users can look for a level-one heading.  They can then move from heading to heading to find sections of important content.  Though MAGic produces a list of headings, we could find no way to make it navigate from heading to heading as Rich is accustomed to doing with JAWS.  We then switched computers so I would be the primary MAGic user / tester.
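Heading navigation works only when pages expose a sound heading outline.  As an aside, the outline a screen reader presents can be approximated mechanically; this Python sketch (my own illustration, not something we used in the test) flags the two outline problems screen-reader users hit most often:

```python
from html.parser import HTMLParser

class HeadingLister(HTMLParser):
    """Collect heading levels (h1-h6) in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag names, so "h1".."h6" match here
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def heading_outline(html):
    parser = HeadingLister()
    parser.feed(html)
    return parser.levels

def outline_problems(levels):
    """Flag a missing level-one heading and skipped heading levels."""
    problems = []
    if 1 not in levels:
        problems.append("no level-one heading")
    for prev, cur in zip(levels, levels[1:]):
        if cur > prev + 1:
            problems.append(f"h{prev} jumps to h{cur}")
    return problems

page = "<h1>Home</h1><h2>News</h2><h4>Archive</h4>"
print(outline_problems(heading_outline(page)))  # ['h2 jumps to h4']
```

A page that passes both checks still needs human review, of course; the sketch only shows why a list of headings is not the same thing as a navigable outline.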

In the third phase, as in the previous two, we tried to invoke MAGic’s word highlighting.  It did not work consistently.  When it did work, I could visually track the content MAGic was reading.  Because MAGic also spoke the content, Rich could track where I was in a page while he navigated the same one with JAWS.

For the fourth phase, I installed MAGic on my computer running JAWS.  We conjectured that using them in concert might enable extra functionality in MAGic, based on statements by Freedom Scientific, maker of both products:

  • “MAGic adds visual enhancements when used in conjunction with JAWS.”
  • “MAGic is fully compatible with our JAWS screen reader …”.

Retrieved from: http://www.freedomscientific.com/products/lv/magic-bl-product-page.asp

We had hoped the extra functionality would include page navigation via headings.  If it did, we could not find it.

Conclusion

Results from the third phase were promising when the word-highlighting feature functioned.  It is likely our testing would have been more fruitful had we been more familiar with MAGic.  Additional testing will be needed to determine how well MAGic mimics a screen reader’s page navigation.

Next Steps

  • I plan to hire, as a consultant, an experienced user of MAGic with Speech.  That person and I will conduct future tests.
  • I will contact Freedom Scientific to address directly the issues encountered.

Notes

  • Rich had NVDA rather than JAWS installed on the computer he used because he was borrowing it for our testing.
  • The issue of JAWS cost I noted in my previous, related post would be partially ameliorated with MAGic.  Its cost is about 55% that of JAWS.
  • Eric Damery of Freedom Scientific, at a JAWS 11 demonstration I attended, suggested using MAGic instead of JAWS for accessibility testing.
  • No endorsement of Freedom Scientific or any of its products is expressed or implied.

Cognitive Web Accessibility Assessment: First Attempt, Part 3 of 3

2010/03/26

This post is the third part of my first structured attempt to evaluate cognitive Web accessibility.  I am using WebAIM’s Cognitive Web Accessibility Checklist and its WAVE accessibility evaluation toolbar to assess the Web site of Down’s Syndrome Scotland.  See Part 1 and Part 2.

This post covers the checklist sections of:

  • Orientation and Error Prevention/Recovery;
  • Assistive Technology Compatibility.

Assessment Related to Checklist

  • Checklist Section: Orientation and Error Prevention/Recovery
    • Guideline: Give users control over time sensitive content changes
      • This guideline is not applicable.
    • Guideline: Provide adequate instructions and cues for forms
      • Title attributes of tags are used to provide instructions.  There are cues for required fields.  Form labels are inconsistently used. Example: Feedback. Fieldsets are not used, but are not required.
    • Guideline: Give users clear and accessible form error messages and provide mechanisms for resolving form errors and resubmitting the form
      • There are accessible form-error messages.  They could be more clear.  Perhaps “The field ‘Your name’ is required” could be “Type your name” or “Enter your name”.  A submitted form without text in a required field reproduces text entered in other fields when it is refreshed.  Example: Make a Donation.
    • Guideline: Give feedback on a user’s actions
      • Field-specific error messages are prefaced by “Please correct the following errors before trying to submit this form:”.  Example: Make a Donation.
    • Guideline: Provide instructions for unfamiliar or complex interfaces
      • It is possible people with intellectual disabilities would find the site’s short forms complex.  User testing would indicate this.  (It may already have been done.)  One way to reduce any perceived complexity would be to present each field to users step by step.
    • Guideline: Use breadcrumbs, indicators, or cues to indicate location or progress
      • Breadcrumbs are used throughout the site.  One point is recorded.
    • Guideline: Allow critical functions to be confirmed and/or canceled/reversed
      • This guideline is not applicable.
    • Guideline: Provide adequately-sized clickable targets and ensure functional elements appear clickable
      • Many links, including those of the sidebar menu, are not underlined.  Some button images are clickable, some are not, and there is no visual differentiation between them.
    • Guideline: Use underline for links only
      • This guideline is met throughout the site.
    • Guideline: Provide multiple methods for finding content
      • There are a top menu, a sidebar menu, a site search feature, a site map, and links within body text.
  • Checklist Section: Assistive Technology Compatibility
    • Guideline: Appropriate alternative text
      • Some images lack alternative text.  Others have it, but it does not describe their content well.  Example: Fantastic Fundraisers (body of page).  Whether alternative text describes an image well is subjective; it cannot be tested by WAVE.
    • Guideline: Form labels
      • I am ignoring this guideline because its criteria are the same as those for the one (above): “Provide adequate instructions and cues for forms”.
    • Guideline: Tables and table headers
    • Guideline: Logical heading structure
      • A level-one heading is used on all assessed pages except the home page.  Of those with additional headings, many have a logical structure. Some do not.
    • Guideline: Links make sense out of context (avoid “click here”, etc.)
      • This guideline is met throughout the site.  One point is recorded.
    • Guideline: A logical, intuitive reading and navigation order
      • On assessed pages, this guideline is met.
    • Guideline: Full keyboard accessibility
      • Access keys are implemented.  On assessed pages, structure is not missing and event handlers are keyboard accessible.  Tabindexes are not required, but one should have been employed, for instance, to make the “Viewing Options” accessibility feature the first link on pages.  A skip link is on all pages, but it would have been better if it were visible.  There are empty links.
    • Guideline: Descriptive and informative page titles
      • This guideline is met throughout the site.
    • Guideline: Frame titles
      • This guideline is not applicable.
    • Guideline: Captions and transcripts
      • This guideline is not applicable.
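Several of the simpler checks above, such as empty links and missing alternative text, are the kind WAVE automates.  As a rough illustration only, and not a substitute for WAVE’s far more thorough tests, a minimal version of two of them might look like this:

```python
from html.parser import HTMLParser

class LinkImageAudit(HTMLParser):
    """Count links with no text content and images with no alt attribute.
    (Illustration only -- a real checker such as WAVE does much more.)"""
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.link_text = ""
        self.empty_links = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.link_text = ""
        # alt="" (decorative image) counts as present; only a missing
        # attribute is flagged
        elif tag == "img" and "alt" not in dict(attrs):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "a":
            if not self.link_text.strip():
                self.empty_links += 1
            self.in_link = False

    def handle_data(self, data):
        if self.in_link:
            self.link_text += data

page = '<a href="/"></a> <img src="logo.png"> <a href="/x">About</a>'
audit = LinkImageAudit()
audit.feed(page)
print(audit.empty_links, audit.images_missing_alt)  # 1 1
```

This naive version misses cases WAVE handles, such as a link whose only content is an image with alt text, which is why automated tools and human review are both needed.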

General Accessibility Assessment

  • The site does attempt to meet W3C accessibility standards. Many pages have no accessibility errors detected by WAVE.  One point is recorded.
  • The site does not have an accessibility statement.
  • No explanation is provided about how to use accessibility features, such as Viewing Options or access keys.

Results

Three of five possible points are recorded.  For the entire, three-part assessment, the total is seven of ten points.

Conclusion

Down’s Syndrome Scotland has made a readily-apparent effort for its Web site to be accessible to its constituency.

Notes

  • All “Viewing Options” function in Internet Explorer 8.  All but “Large Text” do in Firefox 3.6.
  • E-mail Link To Page employs an inaccessible CAPTCHA.
  • Some of my descriptions are disjointed.  This is due to my attempt to address, generally, all the potential errors listed for each guideline of WebAIM’s checklist.  It is also because I tried to make the descriptions brief.

Cognitive Web Accessibility Assessment: First Attempt, Part 2 of 3

2010/03/23

This post is the second part of my first structured attempt to evaluate cognitive Web accessibility.  I am using WebAIM’s Cognitive Web Accessibility Checklist and its WAVE accessibility evaluation toolbar to assess the Web site of Down’s Syndrome Scotland.  For details, see Part 1.

This post covers the checklist sections of:

  • Multi-Modality;
  • Focus and Structure;
  • Readability and Language.

Assessment

  • Checklist Section: Multi-Modality
    • Guideline: Provide content in multiple mediums
      • I could find no instances of video or audio alternatives to textual content.
    • Guideline: Use contextually-relevant images to enhance content
      • Many images, particularly the header images, are not contextually relevant to pages’ textual content.  There is some contextually relevant imagery.  Examples: Meet Keith, Titan Abseil.
    • Guideline: Pair icons or graphics with text to provide contextual cues and help with content comprehension
  • Checklist Section: Focus and Structure
    • Guideline: Use white space and visual design elements to focus user attention
      • The header images focus user attention on themselves, not on the content of page bodies.  An example (pictured) is the Resources Information page, whose header image is disproportionately large.
    • Guideline: Avoid distractions
      • On assessed pages, the header images pull attention away from page-body content.  The home page has an animated text element in both the site’s default view and its optional views.
    • Guideline: Use stylistic differences to highlight important content, but do so conservatively
      • Important textual content is bold, and frequently large and red.  One point is recorded.
    • Guideline: Organize content into well-defined groups or chunks, using headings, lists, and other visual mechanisms
      • Pages have short paragraphs.  Headings are used, but incorrectly on some pages.
    • Guideline: Use white space for separation
      • White space is used to separate page elements.
    • Guideline: Avoid background sounds
      • There are no background sounds.
  • Checklist Section: Readability and Language
    • Guideline: Use language that is as simple as is appropriate for the content
      • I am ignoring this guideline. I do not understand how it is different from the one (below): “Maintain a reading level that is adequate for the audience”.
    • Guideline: Avoid tangential, extraneous, or non-relevant information
      • This guideline is met throughout the site.  One point is recorded.
    • Guideline: Use correct grammar and spelling
    • Guideline: Maintain a reading level that is adequate for the audience
    • Guideline: Be careful with colloquialisms, non-literal text, and jargon
      • This guideline is met throughout the site.
    • Guideline: Expand abbreviations and acronyms
    • Guideline: Provide summaries, introductions, or a table of contents for complex or lengthy content
      • This guideline is not applicable.
    • Guideline: Be succinct
      • This guideline is met throughout the site.
    • Guideline: Ensure text readability
      • The site meets these criteria of the guideline: line height; text spacing and justification; sans-serif fonts; adequate text size; content-appropriate fonts; paragraph length; and adequate color contrast.
      • It does not meet these: line length (exceeds 80 characters); and horizontal scrolling (required when text size is increased by 200% to 300%).
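Reading-level guidelines such as “Maintain a reading level that is adequate for the audience” can also be estimated mechanically.  One common measure is the Flesch Reading Ease score; below is a rough sketch of it, with a naive syllable counter standing in for the dictionaries real readability tools use:

```python
def count_syllables(word):
    """Very rough syllable estimate: count runs of vowels, minimum one."""
    word = word.lower().strip(".,;:!?\"'")
    vowels = "aeiouy"
    count, prev_was_vowel = 0, False
    for ch in word:
        is_vowel = ch in vowels
        if is_vowel and not prev_was_vowel:
            count += 1
        prev_was_vowel = is_vowel
    if word.endswith("e") and count > 1:
        count -= 1  # silent final e, usually
    return max(count, 1)

def flesch_reading_ease(text):
    """206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text; 90+ is very easy reading."""
    sentences = max(text.count(".") + text.count("!") + text.count("?"), 1)
    words = text.split()
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (syllables / len(words)))

simple = "The cat sat. The dog ran."
print(round(flesch_reading_ease(simple), 1))  # 119.2
```

Scores from a heuristic like this are only indicative; for an audience that includes people with intellectual disabilities, user testing remains the real measure.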

Results

Two of three possible points are recorded.  Combined with the points from Part 1, the subtotal is 4 of 5 points.

Notes

  • A point is recorded only if a site or a significant part of it consistently follows a guideline.  The Down’s Syndrome Scotland site did not meet this criterion for any of the Multi-Modality guidelines, so no related point is recorded.
  • I assessed Web pages only, not the many linked PDFs.

Cognitive Web Accessibility Assessment: First Attempt, Part 1 of 3

2010/03/18

This post describes my first structured attempt to evaluate cognitive Web accessibility.  I expect to learn about related best practices with my Plan to Assess Web Accessibility of 100 Cognitive Disability Organizations.  My working hypothesis is that their Web sites are more likely than those of any other type of organization to implement accessibility features for people with cognitive disabilities.

Assessment Tools

Summary of Assessment

My assessment uses a ten-point scale.  I record a point if even one guideline in each of the seven sections of WebAIM’s checklist has been met.  Because I am just starting, though, I also want to see how practical it is to find and evaluate a feature representative of every guideline.

I have thus divided the assessment into three parts.  This blog post is the first.  It covers the checklist sections of:

  • Consistency; and
  • Transformability

In future blog posts, the remaining checklist sections will be assessed:

  • Multi-Modality;
  • Focus and Structure;
  • Readability and Language;
  • Orientation and Error Prevention/Recovery;
  • Assistive Technology Compatibility.

As well, I will record up to three points if the Web site attempts to meet W3C accessibility standards, if it has an accessibility statement, and if it explains how to use accessibility features.

Web Site Description

The Web site of Down’s Syndrome Scotland is the subject of my first review.  I chose it simply because I have noticed many U.K. Web sites make an effort to be usable by and accessible to people with cognitive disabilities, particularly intellectual disabilities.  The Web site is bright and cheery. Pages have big photos and colorful elements. The home page is pictured below.

[Pictured: the home page, with photos of children and young women with Down’s Syndrome.]

Assessment: Consistency & Transformability

  • Checklist Section: Consistency
    • Guideline: Ensure that navigation is consistent throughout a site
      • On every page, the options of the top menu are the same.  It, the site search box, and the sidebar menu are always in the same place.  The sidebar menu’s options necessarily change because they reflect each site section’s content.
    • Guideline: Similar interface elements and similar interactions should produce predictably similar results
      • Such elements include “Print this page”, “Email to a friend” and the search box. All produced predictably similar results. I could find no related inconsistencies elsewhere.  One point is recorded.
  • Checklist Section: Transformability
    • Guideline: Support increased text sizes
      • When text size is increased to 200% and to 300%, the text in the content section of the site’s pages looks fine.  Menu, header, and footer text overlap, becoming illegible.
    • Guideline: Ensure images are readable and comprehensible when enlarged
      • Most images are big.  All of them, and all the smaller ones except a Scottish Consortium logo, meet this guideline.  One point is recorded.
    • Guideline: Ensure color alone is not used to convey content
      • A red underline in the navigation menu is the only visible site-section indicator. There is no such indicator when styles are disabled.
    • Guideline: Support the disabling of images and/or styles
      • Site-content layout is logical and navigable with images and/or styles disabled.

Results

Two of two possible points are recorded.

Notes

  • I tested each guideline on at least three pages with Firefox.  I used Internet Explorer a few times to determine if effects were Firefox-specific.  (None were.)  To take less time, I will likely use Firefox exclusively for subsequent assessments unless a Web site or a feature is incompatible with it.
  • I tested the text-size and the image-enlargement guidelines with the Default Full Zoom Level 4.3 Firefox extension to ensure that text sizes and zoom levels were increased to precisely 200% and 300%.

Stop Using JAWS for Web Accessibility Testing?

2010/03/16

Since before Web-accessibility evaluation tools were available, developers have used JAWS to test their Web sites.  I have done so for so long that not using it never occurred to me.  I was spurred to consider the possibility at a recent demo of the newest version, where Eric Damery of Freedom Scientific, the maker of JAWS, proposed it.  This post details the reasons I am considering no longer using JAWS for Web accessibility testing.  Future posts will discuss alternatives.

Description of JAWS

JAWS is wonderful software.  For a person with a significant visual disability, it reads aloud or converts to Braille the content sighted people view on their computer screens.  Without the JAWS screen reader or one of its competitors, hundreds of thousands of people would be unable to use computers or access the Web.

Using JAWS For Accessibility Testing

People who are blind have long been considered the most excluded from the Web.  This is the central reason Web accessibility standards have been focused on making sites accessible to them.  (An even larger excluded population is people with cognitive disabilities, but that is a topic for another day.)  Thus accessibility-minded developers have always considered it important to test sites with a screen reader.  JAWS is chosen for this because people with visual disabilities use it more, by far, than any other screen reader.

Another benefit of accessibility testing with JAWS is that sites made compatible with it also often work well for people with physical disabilities.  People who cannot use a mouse, and/or who use a single-switch device instead of a keyboard, can navigate an accessible Web site in a way similar to that of JAWS users.  A simple approximation of this experience is to visit a Web site and attempt to navigate it using only the Tab key.

Reasons I Am Considering No Longer Using JAWS

  1. Though I have used JAWS to help test Web accessibility since the time (1995) it was first available, I know only enough about it for such testing.  Like all screen readers, JAWS is complicated.  For people who must use it to access a computer, many months are typically needed to learn it well.  Over the years, as I have hired developers and introduced them to Web accessibility and JAWS, it has been difficult for each to master it sufficiently for accessibility testing.  In part this is because they don’t have to use it all the time as people who are blind do.
  2. Another reason developers find JAWS troublesome is that the Web content it reads can not be visually tracked.  The temptation for sighted developers to watch JAWS is too great.  They are so dependent upon their sight that trying to get them to test Web sites while their screens are off, for instance, is difficult.
  3. Sighted developers inexperienced with JAWS, and sighted people to whom JAWS is being demonstrated, are often confounded by the way it reads Web content.  They expect it to read an entire Web page straight through, as they believe they read it themselves.  Instead, JAWS reads Web pages in chunks and pauses before links.  This behavior closely mimics what sighted people really do, which is to skim Web page content.
  4. JAWS is expensive.  At the time of this writing, it costs $895 for most people, plus an annual software maintenance agreement (SMA) of $120.  Cost alone can be an initial barrier for people who are blind and, as a population, are chronically underemployed.  For developers who use JAWS to test accessibility, Freedom Scientific requires the more expensive professional version.  Its cost of $1,095 plus a $200 yearly SMA can be a barrier for developers without institutional backing.  To evangelize accessibility testing to other developers, I am interested in lower-cost or free alternatives.
  5. “Ninety percent of blind people don’t use a screen reader.”  Kevin Carey, Chairman of the Royal National Institute of Blind People (RNIB), said that recently at an INMD seminar for Web developers.  He also said that is the reason Web sites should be made “self-voice”.  I imagine he was referring to people who are legally blind.  To access Web sites, they must use screen magnifiers, text-to-speech software, and/or Web site widgets.

Notes

  • JAWS is only one of many tools my team has used to test Web site accessibility.  Most importantly, people with disabilities have always been hired to vet the accessibility of our Web sites as much as possible.
  • Reason 5 may mean I should place more emphasis on my experiments with text-size enlargement and incorporation of text-to-speech features.
  • The quotes attributed to Kevin Carey were provided to me by a person who attended the seminar.
  • I am interested in feedback.  Please comment.

[Editor’s Note: Readers may be interested in a follow-up post, “First Experiment with MAGic for Web Accessibility Testing”.]

Defining A Good Accessibility Statement

2010/03/09

This post lists the recommendations of eight Web articles I found that describe what makes a good accessibility statement.  I also found three that advocate using a site-help page instead.  All are referenced below.

Common Recommendations

The survey results are from articles published on the Web by accessibility-focused organizations or by site developers.  One, “Evaluating the Usability of Online Accessibility Information”, is based upon research.  Publication years range from 2002 to 2009.

Mentioned In Most Articles:

  • Agreement
    • Do not just list accessibility features; explain how site visitors can use them.
    • Detail any site barriers to accessibility.
    • Provide contact info for people who experience accessibility problems.
    • Make the accessibility statement easy to locate on the Web site.
  • Disagreement
    • Don’t refer to how the site conforms to accessibility standards (50%).
    • Do refer to how the site conforms (50%). Place the info at the bottom of the statement.

Mentioned in Half of Articles:

  • Explain the site’s or the organization’s commitment to accessibility.
  • Do not use jargon.  Use clear, plain language targeted to the site’s audience.
    • Use an alternative to the term “accessibility” because many visitors do not know what it means.

Mentioned in at Least 2 Articles:

  • Separate accessibility-statement content into sections.
  • Reference authoritative, stable accessibility help, e.g., the BBC’s My Web My Way.
  • Do not limit accessibility information to a specific impairment.
  • Do not assume knowledge visitors may not have, e.g., which browser they use.
  • Do not claim accessibility features if they are not present.

Relevancy To Current Assessment Plan

I decided to investigate this in preparation for my plan to assess the Web accessibility of 100 cognitive disability organizations.  Specifically, I considered awarding a point not just for the presence of an accessibility statement, but for the presence of a good one.  To do that, I needed to determine its agreed-upon characteristics.  Now that I have, I realize it would take too much time to assess both the accessibility of the Web sites and whether their accessibility statements, if present, are good.

Referenced Articles

Articles That Advocate Site-Help Pages Instead

Notes

  • I searched for recommendations by people who identified themselves as having a disability, but found none.
  • Did I miss an important resource?  Please comment or contact me.

E-mail Usage Monitoring for People with Intellectual Disabilities

2010/03/08

This post is about reports from CogLink, e-mail software designed for people with intellectual disabilities.  I have been receiving them in my capacity as the “Helper” of the person using CogLink (also me).

In my review of CogLink, the last of three posts about it (see list below), I explained the CogLink term “Helper”: someone who installs the software, provides assistance during automated training on how to use it, and manages its advanced features.  One of those features is an option to receive monthly usage reports.

Report Contents

Reports have two sections: monthly and weekly statistics.  Each has two subsections.

Social Measures

Number of:

  • times e-mail is checked;
  • hours spent using CogLink;
  • messages received; and
  • messages sent

About this section, the report explains:

“These items reflect the user’s initiation of and overall engagement in email, including time spent emailing and the level of social exchange (i.e. messages sent vs. received) with their email partners. Decreases in email activity over time may indicate the need for the support person to contact the user and/or their partners concerning possible reasons for these changes (e.g. technical problems, change in email address, out-of-town).”

Skill-based Measures

Number of:

  • hours spent composing email;
  • words per minute;
  • words per sentence; and
  • characters per message

About this section, the report states:

“These items reflect the user’s email message composition skills. For example, examining trends in the average # words per sentence or sentences per message may reveal changing skills in using email.”
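Measures like words per sentence and characters per message are straightforward to derive from message text alone.  The sketch below is a hypothetical reconstruction; CogLink’s actual calculations are not documented publicly, so the function and its sample inputs are my own invention:

```python
def message_stats(messages):
    """Compute skill-based measures of the kind the report describes,
    from a list of sent-message bodies.  (Hypothetical reconstruction --
    not CogLink's actual, undocumented method.)"""
    total_chars = sum(len(m) for m in messages)
    total_words = sum(len(m.split()) for m in messages)
    # Count sentence-ending punctuation; assume at least one sentence
    total_sentences = sum(
        max(m.count(".") + m.count("!") + m.count("?"), 1) for m in messages
    )
    return {
        "characters_per_message": total_chars / len(messages),
        "words_per_sentence": total_words / total_sentences,
    }

sent = ["Hi Rich. See you soon.", "Thanks for the help!"]
print(message_stats(sent))
# {'characters_per_message': 21.0, 'words_per_sentence': 3.0}
```

Note that a Helper tracking these numbers would care about trends over months, as the report text suggests, not about any single month’s values.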

Conclusion

I think these reports are useful in helping people with intellectual disabilities learn how to use e-mail.  Because repeated, consistent training is likely needed, these reports are a good way to track related problems over time.

The reports do not include confidential information, such as the content of e-mail messages, nor statistics broken down by e-mail “Buddies” (CogLink’s term).  In my opinion, the reports provide just enough information to be helpful without invading privacy.

Related Posts

Note: No endorsement of CogLink is intended or implied.

50+ Readability Resources Related To Cognitive Web Accessibility

2010/03/04

I have created an index of readability resources related to plain language; measurement tools; guidelines; research; content; symbols; and free and commercial products and services.  At the time of this writing, there are over fifty.  I will add more as I find them.

Characteristics Of Readability Listings

  • All have links to the original sources.
  • All are annotated with related information, primarily edited quotes from source pages.
  • The majority are free and commercial products and services.  The rest are research articles.
  • The publication dates of original studies and articles range from 2001 to the present (2009).

Links to Readability Index & RSS Feed


Plan to Assess Web Accessibility of 100 Cognitive Disability Organizations

2010/03/02

I will assess the efforts of 100 cognitive disability organizations to make their Web sites accessible to their constituencies.  This post is a description of my current plan.  I am open to suggestions for improvement.

Evaluation Criteria

I will base the assessment upon WebAIM’s latest Cognitive Web Accessibility Checklist, which has these sections:

  1. Consistency (of navigation);
  2. Transformability (increased text and image sizes, etc.);
  3. Multi-Modality (of content);
  4. Focus and Structure (use of elements to focus attention, not distract it, etc.);
  5. Readability and Language (clear display of text and use of plain language);
  6. Orientation and Error Prevention/Recovery (adequate instructions, feedback, and error recovery); and
  7. Assistive Technology Compatibility (use of alternative text, labels, headings, keyboard accessibility, etc.).

10-Point Measurement

On each Web site, I will look for the features described in the checklist. I will record a point if I find even one feature included in a checklist section. Thus up to seven such points could be recorded.

One point will be recorded if a site attempts to meet W3C accessibility standards (1.0 or 2.0).  I will judge this based upon a related site statement, or by a positive result from running WebAIM’s WAVE against up to three site pages.

One point will be recorded if a site has an accessibility statement.

One point will be recorded if a site explains how to use accessibility features.
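The ten-point tally described above is simple enough to express directly.  In this sketch, the section names follow WebAIM’s checklist, while the function name and the boolean arguments for the three general criteria are my own shorthand:

```python
# A sketch of the 10-point tally described above.  Section names follow
# WebAIM's Cognitive Web Accessibility Checklist; everything else here
# is illustrative shorthand, not an official scoring tool.
CHECKLIST_SECTIONS = [
    "Consistency",
    "Transformability",
    "Multi-Modality",
    "Focus and Structure",
    "Readability and Language",
    "Orientation and Error Prevention/Recovery",
    "Assistive Technology Compatibility",
]

def score_site(sections_met, meets_w3c, has_statement, explains_features):
    """One point per checklist section in which at least one feature was
    found, plus up to three points for the general criteria."""
    points = sum(1 for s in CHECKLIST_SECTIONS if s in sections_met)
    points += sum([meets_w3c, has_statement, explains_features])
    return points

# For example, a site with features in every section except Multi-Modality,
# which also attempts to meet W3C standards, would tally seven points:
met = set(CHECKLIST_SECTIONS) - {"Multi-Modality"}
print(score_site(met, meets_w3c=True, has_statement=False,
                 explains_features=False))  # 7
```

Keeping the tally explicit like this should also make any later methodology revisions, mentioned below, easy to track.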

Assessment-Progress Tracking

Upcoming blog posts will describe the assessment as I undertake each step.  It may well be that I revise my methodology after a few initial evaluations.

Index of Web Sites for 100+ Cognitive Disability Organizations

For this assessment, I created an index of Web sites of over 100 cognitive disability organizations.  To identify them, I used the same criteria listed in my previous blog post.

Notes

  • WebAIM is engaged in an effort to incorporate cognitive Web accessibility evaluation into WAVE.  It may be that WebAIM would find this assessment useful, so I will solicit feedback from Jared Smith, Associate Director of WebAIM.
  • Have a suggestion? Please post a comment or contact me.
