Will Automated Content Filtering Improve Online Safety?

The UK Government wants Internet Service Providers to automatically filter adult content and improve online safety. What is behind this initiative, will it work, and will our children be safer online?

Online Safety – The Case for Change

An OFCOM survey (2013) of Internet usage in the UK found that 82% of 5 to 7 year olds, 96% of 8 to 11 year olds and 99% of 12 to 15 year olds are online. 12 to 15 year olds spend as much time online as they do watching television.

The Internet is not a safe place: a ChildLine survey of 13 to 18 year olds found that 60% had been asked to create a sexual image or video of themselves, 40% had created such images and 25% had sent them.

Access to devices has completely changed over the past decade. Children use mobile phones, tablets, e-Readers, games consoles and many other connected devices. The days of controlling web filtering on a single home PC are gone. Parents lacking IT skills and the ability to keep pace with relentless technical change struggle to implement appropriate safety controls. In 2011, Microsoft surveyed parents across 10 countries: 64% had applied no filters at all to their home networks or devices.

The ubiquity of mobile makes the challenge even more complex. By 2015 the mobile adult content industry is forecast to reach $3bn, and one in five mobile searches is for adult content. The production, distribution and consumption of adult content have never been higher, and the devices and access points to reach that content are everywhere.

Is automating adult content filtering the answer to this problem?

How does Content Filtering Work?

Content filtering is the prevention of access to inappropriate material online. This may be ‘situational’, such as filtering certain websites from work networks, or age-related, as with filtering adult content. There are different types of filters and they can be applied in different ways: on a device (such as a home PC, laptop or tablet), on a shared broadband connection (usually the single point coming into the home), at the Internet Service Provider (preventing content from ever reaching your home router) or by the search provider (for example, Google filtering inappropriate content out of search results).
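
At the device level, the simplest possible filter is a hosts-file override. The sketch below is a minimal illustration only, assuming a Unix-like machine and using invented domain names; real parental-control products are far more sophisticated.

    # Minimal sketch of device-level blocking via the hosts file.
    # Domains are illustrative, not a real blocklist; modifying
    # /etc/hosts requires root privileges.
    BLOCKED_DOMAINS = ["adult-example.com", "www.adult-example.com"]

    def block_on_this_device(hosts_path="/etc/hosts"):
        # Mapping a domain to 0.0.0.0 makes it unreachable on this
        # machine only; other devices in the home are unaffected.
        with open(hosts_path, "a") as hosts:
            for domain in BLOCKED_DOMAINS:
                hosts.write(f"0.0.0.0 {domain}\n")

    if __name__ == "__main__":
        block_on_this_device()

This illustrates why per-device filtering does not scale in a multi-device home: the same change has to be made, and maintained, on every gadget a child can pick up.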

Content filtering works in different ways depending on the type of filter being used. Techniques include blacklisting (blocking access to specific sites), whitelisting (allowing access only to approved sites), search filtering (removing sites from search results), domain filtering and content analysis. The simpler techniques are based on static filtering (blocking access to domains, IP addresses and so on); dynamic filtering is more sophisticated and aims to block parts of websites, such as individual pages, based on what they contain.
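
To make the distinction concrete, here is a minimal sketch in Python contrasting static filtering with a crude dynamic filter; the domains and keywords are invented for illustration.

    # Static filtering decides on the destination alone, before any
    # content is fetched; dynamic filtering inspects the content itself.
    BLACKLIST = {"adult-example.com"}        # illustrative domains
    WHITELIST = {"bbc.co.uk", "gov.uk"}
    FLAGGED_KEYWORDS = {"explicit", "xxx"}   # illustrative keywords

    def static_filter(domain, mode="blacklist"):
        """Return True if the request should be blocked outright."""
        if mode == "blacklist":
            return domain in BLACKLIST       # block known-bad sites
        return domain not in WHITELIST       # block all but approved sites

    def dynamic_filter(page_text):
        """Inspect the fetched page, so parts of a site can be blocked."""
        words = set(page_text.lower().split())
        return bool(words & FLAGGED_KEYWORDS)

    print(static_filter("adult-example.com"))            # True
    print(static_filter("bbc.co.uk", mode="whitelist"))  # False
    print(dynamic_filter("some explicit material"))      # True

Note how the dynamic filter can pass one page of a site and block another, which is exactly what makes it more expensive to build and run at scale.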

Voluntary labelling codes, including RTA (Restricted To Adults) and the Safe Internet Rating Standard, enable content providers to tag their sites and make the filtering task easier.
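
As an illustration of how labelling simplifies filtering: an RTA-labelled page carries a fixed, publicly documented label string, so a filter only needs to look for it. A minimal sketch, assuming the widely published RTA meta-tag string and using only Python's standard library:

    import urllib.request

    # Participating sites embed a single fixed RTA label string in a
    # meta tag; finding it anywhere in the page is sufficient.
    RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

    def carries_rta_label(url):
        with urllib.request.urlopen(url, timeout=10) as response:
            html = response.read().decode("utf-8", errors="ignore")
        return RTA_LABEL in html

    # A filter can then block any page that declares itself
    # Restricted To Adults:
    # if carries_rta_label("https://example.com"): block_request()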

Content Filtering Challenges for Internet Service Providers

The UK Government wants Internet Service Providers to filter adult content by default. This means that even if you have set no content filters yourself, the Internet provider will filter out adult content for you. This is a compelling option as you do not need to know how to implement the filters yourself, and it protects all of the connected devices in your home (note, however, that mobile connections over 3G or 4G may go via a different service provider and might not be under the same controls).

This change would require consumers wishing to access adult content to explicitly opt-in. On the face of it, this looks like an improvement to online safety, but there are challenges for the Internet Service Providers and some key points which parents should be aware of:

  1. Managing the opt-in/opt-out – as this process is new, the mechanics of the opt-in/opt-out process need to be designed and tested. I would argue that the process needs to be standardised across Internet Service Providers and backed by legislation and mandatory codes of practice. The agility of the Internet Service Providers in processing opt-in/opt-out requests might also come into question. How long will it take to provision the change? Most big providers in the UK are currently only asking new customers to opt in or out; pre-existing customers are not yet affected.
  2. Privacy of the data – as the opt-in becomes an explicit request to access certain types of content, any associated data is sensitive and must be secured and anonymised. Consumption of certain types of adult content has been loosely correlated with marriage breakdown and promiscuity. Making inferences from opt-in data would be ethically questionable and deeply unpopular. This could be a PR nightmare for the Internet companies.
  3. Managing the filter – as sites pop up, others die and techniques are found to bypass filtering, management and testing of filters will become more complex and costly. This is a shift in ‘burden of responsibility’ and Service Providers should be justifiably nervous. They will also need to share information as there will be judgement calls to make in terms of whether a site or content should be blocked or not.
  4. Over and under blocking – content filtering often leads to over blocking, that is, erring on the side of caution and blocking more content than strictly necessary. A legitimate concern is that over blocking will result in useful education sites being inaccessible. The opposite is under blocking, where the filter is more ‘relaxed’ and lets through some questionable content. A parent trusting the Internet Service Provider to make the right decision every time will be unforgiving if their child is exposed to inappropriate content.
  5. Implementation and operational costs – the additional complexity of managing the opt-in/opt-out processes and filters means higher operating costs. The Internet Service Providers are unlikely to bear all of this cost, so expect it to be transferred to the consumer through price rises. Price rises are likely to be applied to all customers, not just those that really want to make use of a filtering service.
  6. Switching between ISPs – a great deal has been done over the past decade (local loop unbundling etc.) to open up the broadband services market. Competition between suppliers and the ability to easily switch providers is essential. It is unclear how opt-in/opt-out data would be carried over when accounts move between service providers. Process standardisation would help remove some of the uncertainty.
  7. Filter variance across Internet Service Providers – a related point is the degree of standardisation of filtering applied by competing Internet Service Providers. It is conceivable that some will over block and some will under block. As you move between service providers you may be exposed to ‘filter frustration’. Service Providers should collaborate closely and share information to ensure their filtering services are standardised.
  8. Multichannel filtering – Internet Service Providers will need to ensure that filtering works across different access channels. ADSL, 3G, 4G and emerging services are likely to have to conform to any government legislation. Could filtering regulation stifle development and rollout of services across different communications technologies?
  9. Bad Press and Filter Lobbying – bad news stories are never far away. Internet Service Providers will face ridicule in the press if they block access to reputable sites or enable access to dubious content. There is no real commercial pay-off for the Internet Service Provider, yet they bear more reputational risk. In conjunction with the new challenges of managing their filters, they could face ‘filter lobbying’, where those filtered out argue for re-inclusion.
  10. A precedent for further ‘censorship’ – The changes call into question the government’s role in Internet censorship. The Internet Service Providers risk being seen as ‘censorship enablers’ and critics will no doubt suggest that this could lead to further and more far reaching government intervention.

How easy are Content Filters to Bypass?

The easiest way to bypass a content filter is to use an unfiltered access point. Microsoft’s 2011 survey highlighted the lack of filtering applied to existing home networks. The government’s initiative helps by enabling default filtering, but unfiltered routes will still exist. To bypass the filter, find an unfiltered route.

With a little ingenuity, encryption and a proxy, the filter could also be tricked into passing the traffic. See this change for what it is: a means of ‘flipping the default’, not a means of providing undefeatable content filters.
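
A minimal sketch of why that works, assuming a simple domain-level filter and an invented proxy hostname: the filter can only act on the hostname it sees on the wire, while the real destination travels inside an encrypted tunnel.

    # Why domain-level filters are blind to proxies: they act on the
    # visible hostname, not the tunnelled destination.
    BLOCKLIST = {"adult-example.com"}  # illustrative

    def isp_filter_allows(visible_host):
        return visible_host not in BLOCKLIST

    # Direct request: the filter sees the real destination and blocks it.
    print(isp_filter_allows("adult-example.com"))  # False - blocked

    # Via an unfiltered, encrypted proxy (hypothetical hostname): the
    # filter sees only the proxy; the destination hides in the payload.
    print(isp_filter_allows("proxy-example.net"))  # True - slips through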

The even more ingenious might establish peer-to-peer or other sharing networks, and simply move dubious content onto unfiltered sites.

I am concerned that driving Internet users towards peer-to-peer sharing and the even shadier side of the web (such as The Onion Router (Tor)) might be an unintended consequence.

Weaknesses in the Content Filtering Approach

My concerns include:

  • It is unclear who will ultimately bear the costs of this and further change.
  • Content filters can and will be defeated. The big risk is that parents think these filters are foolproof.
  • The delegation of trust to the Internet Service Provider has consequences. Parents might avoid having important (and difficult) conversations with their children, assuming that safety has been ‘covered elsewhere’.
  • Automatic content filters will not keep children safe from cyber-bullies on Social Networks, or necessarily from other predators. Online safety is a very broad topic and there is a risk that parents will think this legislation is a ‘fix all’. According to a survey by the Anti-bullying Alliance, 40% of parents and 44% of teachers do not know how to ‘react’ to cyber-bullying, and 55% of Internet users surveyed accepted that cyber-bullying was ‘part of everyday life’.

Children and young adults engage in risky online behaviour; the ChildLine survey cited above bears this out. Automated content filters will not solve every online safety issue. That will only come from education, awareness and being ‘web savvy’.

Alternatives to Content Filtering

Automated content filtering is not a perfect solution, but there are clear benefits for parents who do not want (or are unable) to tackle the technical challenge of implementing filters themselves. A holistic approach to online safety must be implemented. Topics this needs to cover include:

  • Education and debate – providing young people with the resources and support they need to make good choices online and maximise the benefits that the Internet and Social Media provides.
  • Holistic approach – tackling reputation protection, identity protection, cyber-bullying, trolls, self-harm etc. as part of the overall online safety debate.
  • Self-censorship – as young people mature, examining how to apply self-censorship and self-regulation to online activities.
  • Accountability software such as Safe Eyes or Covenant Eyes – an interesting concept aimed at helping users (mostly adults) overcome online addictions and unhealthy behaviours.


By Steve Nimmons

Steve is a Certified European Engineer, Chartered Engineer, Chartered Fellow of the British Computer Society, Fellow of the Institution of Engineering and Technology, Royal Society of Arts, Linnean Society and Society of Antiquaries of Scotland. He is an Electric Circle Patron of the Royal Institution of Great Britain, a Liveryman and Freeman of London and serves on numerous industry panels. He is a member of Chatham House, the Royal United Services Institute and the Chartered Institute of Journalists.
