In a recent discussion with Senator Conroy’s media advisors, I was given the privilege of being politely declined any more information about the composition of the proposed URL ‘block’ page or what options people have to check if their Web site is caught up in the filter.
Here’s the short version of what the ACMA proposes when it comes to notifying people that their Web content (local or offshore) has been refused classification: A URL blacklist will be kept under lock and key by ACMA and when a content infringement occurs it will notify “readily identifiable and contactable website owners that their content is to be added to the RC content list”.
Then, instead of being presented with an offending Web site, the URL will reveal “a standardised block page that enables users to seek review of any material that they find blocked”.
Is this not a guilty-until-proven-innocent mindset? It would be good to hear the public's view on what level of proactive measures should be provided to avoid having to appeal an instant judgement made by the ACMA.
Another of ACMA’s proposed transparency and accountability measures is the use of a ‘block’ page. Under this option, a standardised block page will be used to advise people trying to access a filtered URL, including end-users, content owners, or content service providers, that the content they have attempted to access is blocked by the filter because it is on the RC Content List.
A spokesperson for Senator Conroy said the block page will provide a link to the relevant part of the ACMA Web site that will include information about the classification system and how it applies to the internet, avenues for appeal or review and statistics and information on the current composition of the RC Content List.
The “statistics and information on the current composition of the RC Content List” would be an eye-opener indeed. How much transparency are we talking about here?
When asked for more details about the composition of the 'block' page, the silence was deafening. We all wait on.
Senator Conroy has said repeatedly it makes no sense to publish a list of banned content and to that end I agree with him; it doesn’t make sense. But that doesn’t mean people shouldn’t be given the opportunity to be more proactive in determining whether any content they publish will end up on a blacklist.
That is, of course, if you insist on trying to regulate an inherently unregulated medium in the first place.
And before anyone jumps and yells “the content classification guidelines are well documented and should be obvious”, I understand that fully. My point is the way in which the RC Content List will be enforced is neither obvious nor well documented.
With Senator Conroy admitting that URL blacklist “creep” is a legitimate subject of political debate, Web content publishers may find themselves inadvertently infringing content acceptability guidelines.
The blacklist may start off with obvious RC material, but slowly move, or “creep”, into otherwise legal content, including gaming, commerce, politics and the media.
Recently, a reader asked me if any URL filtering scheme would provide a mechanism for “looking up” or otherwise querying whether a Web site contained RC content in a way that does not require ACMA to publish the list.
The technology to allow it is certainly available. You enter a URL into a Web form and receive a response along the lines of “the Web site address you entered contains content that does not infringe any classification guidelines”. The blacklist items are never exposed in this “blind” query system.
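As a rough illustration, such a blind query could be built by storing only cryptographic hashes of blacklisted URLs rather than the URLs themselves. This is a minimal sketch under that assumption; the URLs, messages, and function names are hypothetical, not anything ACMA has proposed:

```python
import hashlib

def normalise(url: str) -> str:
    """Trivially canonicalise so equivalent URLs hash identically."""
    return url.strip().lower().rstrip("/")

def url_hash(url: str) -> str:
    """Return the SHA-256 digest of a normalised URL."""
    return hashlib.sha256(normalise(url).encode("utf-8")).hexdigest()

# Server side: the operator keeps only hashes, so the list itself is
# never exposed. "banned-page" here is a made-up example entry.
BLACKLIST_HASHES = {url_hash("http://example.org/banned-page")}

def check(url: str) -> str:
    """Answer a single yes/no query without revealing any other entry."""
    if url_hash(url) in BLACKLIST_HASHES:
        return "The address you entered is on the RC Content List."
    return ("The Web site address you entered contains content that "
            "does not infringe any classification guidelines.")
```

A publisher could then query `check("http://example.org/shop")` and get the all-clear message, while the blacklist entries themselves remain opaque hashes. (A determined party could still probe the service URL by URL, but that only confirms addresses the prober already knows, rather than disclosing the list.)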
Beyond its stated concern for an unencumbered Internet, ACMA also appears worried about possible reverse engineering of the RC Content List. It’s also possible to reverse engineer the Windows 7 source code if you are so inclined, but the practicality of “reverse engineering” the ever-changing content universe that is the Internet – and the methods used to try and police it – diminishes the more you think about it.
The concept of proactive URL checking may be a trivial (or even moot) argument, but at the very least it is one way concerned business people can check that their Web site is not infringing any classification guidelines before waking up one morning to a shiny new block page where a product order page once stood.
Again, it would be good to get some public opinion on what other methods the DBCDE has at its disposal to allow people to be more proactive without publishing the list.
When I asked whether the department will categorically rule out any form of active URL query mechanism, my question remained unanswered.
Having some level of “pre-approval” would also go a long way toward preventing the accidental blocking of a Web site. There has already been one case where a business Web site was blacklisted for inadvertently hosting RC material.
Lies, damned lies and statistics – soon we may witness the clandestine Internet filter, the lack of proactive blacklist prevention and an abrupt block page as a veritable exhibition in all three.