09 April 2014

Data protection

Posts relating to the category tag "data protection" are listed below.

10 May 2013

IP Address Sharing and Individual Identification

BT has announced a trial of its Carrier-Grade Network Address Translation (CGNAT) where Internet Protocol (IP) addresses will be shared between subscribers.

organisations [will] generally have to treat IP addresses as personal data

Concerns have been expressed about whether some applications will continue to work if they rely on the assumption that each subscriber has a unique IP address, and also about how the change affects the identification of individual people.

Out-law.com provides a good review of the issues and of information from BT, although links to the sources are not provided. BT has apparently stated that it will still be able to identify individual subscribers despite using CGNAT.

But the issue of identification does not only relate to newsworthy "illegal online activity"; it also matters for the wider privacy protection of completely legal activity, where it is clear that IP addresses really must be treated as personal identifiers, especially when they can be combined with other data sets. Something to be considered in privacy impact assessments.
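
As a small illustration of treating IP addresses as personal data in practice, the sketch below pseudonymises client addresses with a keyed hash before they reach an analytics log, so the raw address is never retained. This is my own minimal example: the key handling and log usage are assumptions, not anything described by BT or Out-law.com.

    import hmac
    import hashlib
    import ipaddress

    # Hypothetical secret held only by the logging organisation; rotating it
    # breaks linkability between old and new pseudonyms.
    PSEUDONYMISATION_KEY = b"replace-with-a-managed-secret"

    def pseudonymise_ip(raw_ip: str) -> str:
        """Return a stable pseudonym for an IP address.

        HMAC-SHA256 with a secret key means the pseudonym cannot be reversed
        or recomputed without the key, while the same address always maps to
        the same value, so per-visitor analysis remains possible.
        """
        ip = ipaddress.ip_address(raw_ip)  # validates IPv4 and IPv6 input
        return hmac.new(PSEUDONYMISATION_KEY, ip.packed, hashlib.sha256).hexdigest()[:16]

    if __name__ == "__main__":
        print(pseudonymise_ip("203.0.113.7"))  # same 16-character pseudonym each run

Under CGNAT a single pseudonym may of course cover many subscribers, which is exactly why combination with other data sets becomes the deciding factor.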

Posted on: 10 May 2013 at 09:48 hrs


05 April 2013

Fair Data?

At the end of January, the Market Research Society (MRS) launched an initiative called Fair Data.

Photograph from the London Shard at dusk looking towards Canary Wharf

Existing MRS Company Partners (who are already subject to the MRS Code of Conduct), and others who apply and pass an assessment by the MRS of their "policies and procedures", must firstly adhere to the 10 principles and secondly "use the Fair Data mark in all relevant dealings with customers and respondents". The 10 principles relate to the following topics:

  1. Consent
  2. Purpose
  3. Access
  4. Security
  5. Respect
  6. Sensitive personal data
  7. Supply chain
  8. Ethics
  9. Staff training
  10. Default to not using personal data unless there is adherence to the above nine principles

So the scheme does not cover all eight data protection principles, but it does add some extra business process requirements. Perhaps this is because the trust mark has been designed "to be used internationally".

The scheme seems to have some initial endorsements, but these types of scheme won't work unless there is widespread adoption so that consumers and others recognise the mark, backed up by verifiable evidence that it makes a difference. I am not sure whether this "kite mark" or "trust seal" is the one that will make everyone confident about the use of their personal data.

Posted on: 05 April 2013 at 18:32 hrs


19 March 2013

Data Protection Risks for Mobile Apps

The European Commission's Article 29 Working Party on the Protection of Individuals has published its opinion on data protection risks for mobile apps.

Partial image of the recommendations for app developers in the European Commission's Article 29 Working Party on the Protection of Individuals 'Opinion 02/2013 on Apps on Smart Devices'

Opinion 02/2013 on Apps on Smart Devices describes the roles, risks and responsibilities of four groups: app developers, operating systems/device manufacturers, app stores and third parties. It seems slightly odd to lump together "organisations that outsource the app development and those companies and individuals building and deploying apps", and I think this will lead to confusion.

However, in the conclusions on pages 27-28, there are two useful lists under the headings "App developers must" and "The Working Party recommends that app developers", which would be worth considering for inclusion within internal compliance standards for app development.
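
As a sketch of what inclusion within internal compliance standards might look like day to day, the checklist below captures a few obligations of the kind the Opinion lists and gates a release on them; the item wording and the gate function are my assumptions, not the Working Party's text.

    # A minimal sketch of an internal release checklist; the item wording is
    # illustrative, not the Working Party's exact text.
    APP_PRIVACY_CHECKLIST = {
        "consent_obtained_before_processing": False,
        "privacy_policy_linked_in_store_listing": False,
        "data_collection_minimised_to_stated_purpose": False,
        "retention_period_defined_and_enforced": False,
        "third_party_sharing_disclosed": False,
    }

    def release_gate(checklist: dict) -> bool:
        """Return True only if every compliance item has been signed off."""
        outstanding = [item for item, done in checklist.items() if not done]
        for item in outstanding:
            print(f"Outstanding before release: {item}")
        return not outstanding

    if __name__ == "__main__":
        if not release_gate(APP_PRIVACY_CHECKLIST):
            raise SystemExit("Release blocked pending privacy sign-off")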

Posted on: 19 March 2013 at 08:28 hrs


08 March 2013

Mobile App Privacy Labelling

Stakeholders in the US Department of Commerce's National Telecommunications and Information Administration (NTIA) have been discussing visibility of privacy notices and opt ins for mobile applications.

Mock-ups of proposed privacy labelling for mobile phone applications

A meeting of the Privacy Multistakeholder Process: Mobile Application Transparency group in February reviewed feedback on a discussion draft on proposals for voluntary [privacy] transparency screens (VTS) in mobile apps. Mock-ups of the short notice screens were also presented.

The Mobile App VTS requires mobile app developers and publishers to [voluntarily] provide information about what data is collected from consumers, and details of any data sharing with third parties.
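
To make that concrete, here is one possible way a developer might hold the short-notice content as structured data from which a transparency screen could be rendered; the field names and categories are assumptions for illustration, not the actual schema in the NTIA discussion draft.

    import json

    # Hypothetical short-notice record; treat the field names as assumptions.
    short_notice = {
        "app": "ExampleWeatherApp",
        "data_collected": ["approximate location", "device identifier"],
        "data_not_collected": ["contacts", "browsing history", "health data"],
        "shared_with": [
            {"recipient_type": "advertising network", "purpose": "ad targeting"},
        ],
        "choices": {"opt_out_url": "https://example.com/privacy/opt-out"},
    }

    if __name__ == "__main__":
        # A build step could validate and embed this alongside the app so the
        # transparency screen always reflects what has been declared.
        print(json.dumps(short_notice, indent=2))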

The group's next meeting is on 14th March 2013.

See also previous posts about Privacy and Terms of Use Labelling, Privacy, Labelling and Legislation, Privacy Labelling, Security Labelling, Software Assurance Labelling, Adverts and Privacy Notices, and Privacy Notices Code of Practice.

Posted on: 08 March 2013 at 20:53 hrs


01 March 2013

Privacy Legislation for Mobile Apps

With the publication of a report by the US Federal Trade Commission, new proposed privacy legislation is gaining support in the United States.

Photograph of an airline's mobile check-in app offering a £99 upgrade with '99999999' displayed where a date is expected

The FTC's report Mobile Privacy Disclosures: Building Trust Through Transparency made recommendations for platform developers, app developers and third parties including advertising networks. The report also commented on how app developer trade associations, academics, usability experts and privacy researchers can contribute.

The Application Privacy, Protection and Security Act of 2013 (APPS Act) discussion draft proposes requirements for user consent and the protection of personal and de-identified data, with enforcement by the FTC.

It will be interesting to see where this goes.

Posted on: 01 March 2013 at 07:44 hrs


25 January 2013

ICO Fines Sony Over PlayStation Network Compromise

Sony Computer Entertainment Europe Limited (SCEE) has received a monetary penalty of £250,000 from the UK's Information Commissioner's Office (ICO).

...the attack could have been prevented if the software had been up-to-date, while technical developments also meant passwords were not secure.

The monetary penalty notice describes the background and the ICO's reasoning but is heavily redacted. Apparently the intrusion and theft of data occurred as a result of an attack that exploited unpatched software to gain access to personal and business data, including insecurely stored passwords. It is a great pity the monetary penalty notice has been redacted, since other similar ICO notices and undertakings do not seem to receive this benefit, and neither do organisations issued with enforcement notices by the FSA.

SCEE are allowed an early payment discount of 20% if the monetary penalty is paid by 14th February 2013, but it is widely reported that Sony are to appeal against the decision. I am not sure, though, that whether or not it was "a focused and determined criminal attack" makes any difference to the requirement for baseline security measures. Similarly, the facts that "there is no evidence that encrypted payment card details were accessed" and that "personal data is unlikely to have been used for fraudulent purposes" do not mean there was no breach of the Data Protection Act 1998.

Posted on: 25 January 2013 at 08:35 hrs


18 January 2013

Proposed Amendments to EU Data Protection Framework

MEP Jan Philipp Albrecht, Rapporteur to the European Parliament's Committee on Civil Liberties, Justice and Home Affairs, has published a report with suggested amendments to the EU Data Protection Framework proposals.

These might well add to the concerns of the UK's Justice Committee, and certainly to those of the advertising industry, around the issue of explicit consent and a widening of the definition of personal data to include, in some circumstances, "Internet Protocol addresses, cookie identifiers and other unique identifiers".

For each item, the report sets out the current text proposed by the Commission, the proposed amendment and the justification for the change. Apologies for the length of this post, but some of the more important suggested amendments for web site and web service operators are quoted below to give a flavour of what might be expected.

  • 14 "... The principles of data protection should not apply to data rendered anonymous in such a way that the data subject is no longer identifiable"
    changed to
    "... This Regulation should not apply to anonymous data, meaning any data that can not be related, directly or indirectly, alone or in combination with associated data, to a natural person or where establishing such a relation would require a disproportionate amount of time, expense, and effort, taking into account the state of the art in technology at the time of the processing and the possibilities for development during the period for which the data will be processed."
  • 15 "When using online services, individuals may be associated with online identifiers provided by their devices, applications, tools and protocols, such as Internet Protocol addresses or cookie identifiers. This may leave traces which, combined with unique identifiers and other information received by the servers, may be used to create profiles of the individuals and identify them. It follows that identification numbers, location data, online identifiers or other specific factors as such need not necessarily be considered as personal data in all circumstances."
    changed to
    "When using online services, individuals may be associated with one or more online identifiers provided by their devices, applications, tools and protocols, such as Internet Protocol addresses, cookie identifiers and other unique identifiers. Since such identifiers leave traces and can be used to single out natural persons, this Regulation should be applicable to processing involving such data, unless those identifiers demonstrably do no relate to natural persons, such as for example the IP addresses used by companies, which cannot be considered as 'personal data' as defined in this Regulation."
  • 31 "In order for processing to be lawful, personal data should be processed on the basis of the consent of the person concerned or some other legitimate basis, laid down by law, either in this Regulation or in other Union or Member State law as referred to in this Regulation."
    changed to
    "In order for processing to be lawful, personal data should be processed on the basis of the specific, informed and explicit consent of the person concerned or some other legitimate basis, laid down by law, either in this Regulation or in other Union or Member State law as referred to in this Regulation."
  • 19 "In order to ensure free consent, it should be clarified that consent does not provide a valid legal ground where the individual has no genuine and free choice and is subsequently not able to refuse or withdraw consent without detriment."
    changed to
    "In order to ensure free consent, it should be clarified that consent does not provide a valid legal ground where the individual has no genuine and free choice and is subsequently not able to refuse or withdraw consent without detriment. The use of default options which the data subject is required to modify to object to the processing, such as pre-ticked boxes, does not express free consent."
  • 25 New "The interests and fundamental rights of the data subject override the interest of the data controller where personal data are processed in circumstances where data subjects do not expect further processing, for instance when a data subject enters a search query, composes and sends an electronic mail or uses another electronic private messaging service. Any processing of such data, other than for the purposes of performing the service requested by the data subject, should not be considered in the legitimate interest of the controller."
  • 45 New "The right to the protection of personal data is based on the right of the data subject to exert the control over the personal data that are being processed. To this end the data subject should be granted clear and unambiguous rights to the provision of transparent, clear and easily understandable information regarding the processing of his or her personal data, the right of access, rectification and erasure of their personal data, the right to data portability and the right to object to profiling. Moreover the data subject should have also the right to lodge a complaint with regard to the processing of personal data by a controller or processor with the competent data protection authority and to bring legal proceedings in order to enforce his or her rights as well as the right to compensation and damages resulting of an unlawful processing operation or from an action incompatible with this Regulation. The provisions of this Regulation should strengthen, clarify, guarantee and where appropriate, codify those rights."
  • 54 "To strengthen the 'right to be forgotten' in the online environment, the right to erasure should also be extended in such a way that a controller who has made the personal data public should be obliged to inform third parties which are processing such data that a data subject requests them to erase any links to, or copies or replications of that personal data. To ensure this information, the controller should take all reasonable steps, including technical measures, in relation to data for the publication of which the controller is responsible. In relation to a third party publication of personal data, the controller should be considered responsible for the publication, where the controller has authorised the publication by the third party."
    changed to
    "To strengthen the 'right to erasure and to be forgotten' in the online environment, the right to erasure should also be extended in such a way that a controller who has made the personal data public without legal justification should be obliged to take all necessary steps to have the data erased, but without prejudice to the right of the data subject to claim compensation."
  • 61 "The protection of the rights and freedoms of data subjects with regard to the processing of personal data require that appropriate technical and organisational measures are taken, both at the time of the design of the processing and at the time of the processing itself, to ensure that the requirements of this Regulation are met. In order to ensure and demonstrate compliance with this Regulation, the controller should adopt internal policies and implement appropriate measures, which meet in particular the principles of data protection by design and data protection by default."
    changed to
    "The protection of the rights and freedoms of data subjects with regard to the processing of personal data require that appropriate technical and organizational measures are taken, both at the time of the design of the processing and at the time of the processing itself, to ensure that the requirements of this Regulation are met. In order to ensure and demonstrate compliance with this Regulation, the controller should adopt internal policies and implement appropriate measures, which meet in particular the principles of data protection by design and data protection by default. The principle of data protection by design require data protection to be embedded within the entire life cycle of the technology, from the very early design stage, right through to its ultimate deployment, use and final disposal. The principle of data protection by default requires privacy settings on services and products which should by default comply with the general principles of data protection, such as data minimisation and purpose limitation."
  • 84 "'data subject' means an identified natural person or a natural person who can be identified, directly or indirectly, by means reasonably likely to be used by the controller or by any other natural or legal person, in particular by reference to an identification number, location data, online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person;"
    changed to
    "'data subject' means an identified natural person or a natural person who can be identified or singled out, directly or indirectly, alone or in combination with associated data, by means reasonably likely to be used by the controller or by any other natural or legal person, in particular by reference to a unique identifier, location data, online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural, social or gender identity or sexual orientation of that person;"
  • 106 New "4a. Consent looses its effectiveness as soon as the processing of personal data is no longer necessary for carrying out the purpose for which they were collected. "

The topic of information security is also addressed:

  • 39 "The processing of data to the extent strictly necessary for the purposes of ensuring network and information security, i.e. the ability of a network or an information system to resist, at a given level of confidence, accidental events or unlawful or malicious actions that compromise the availability, authenticity, integrity and confidentiality of stored or transmitted data, and the security of the related services offered by, or accessible via, these networks and systems, by public authorities, Computer Emergency Response Teams - CERTs, Computer Security Incident Response Teams - CSIRTs, providers of electronic communications networks and services and by providers of security technologies and services, constitutes a legitimate interest of the concerned data controller. This could, for example, include preventing unauthorised access to electronic communications networks and malicious code distribution and stopping 'denial of service' attacks and damage to computer and electronic communication systems."
    changed to
    "The processing of data to the extent strictly necessary for the purposes of ensuring network and information security, i.e. the ability of a network or an information system to resist accidental events or malicious actions that compromise the availability, authenticity, integrity and confidentiality of stored or transmitted data, and the security of the related services offered by these networks and systems, by public authorities, Computer Emergency Response Teams - CERTs, Computer Security Incident Response Teams - CSIRTs, providers of electronic communications networks and services and by providers of security technologies and services, in specific incidents, constitutes a legitimate interest of the concerned data controller. This could, for example, include preventing unauthorised access to electronic communications networks and malicious code distribution and stopping 'denial of service' attacks and damage to computer and electronic communication systems. The processing of personal data to restrict abusive access to and use of publicly available network or information systems, such as the blacklisting of Media Access Control (MAC) addresses or electronic mail addresses by the operator of the system, also constitutes a legitimate interest."

While not all of these amendments (or the rest of the draft framework itself) will come into law, it would be a brave organisation that did not start taking these types of consideration into account in its planning and upcoming projects.
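
As one illustration of the "data protection by default" principle in the amendment numbered 61 above, the sketch below shows account settings that start in their most privacy-protective state and only change on an explicit user action; the setting names are assumptions chosen for illustration.

    from dataclasses import dataclass

    @dataclass
    class PrivacySettings:
        # Data protection by default: every optional form of processing starts
        # switched off and is only enabled by an explicit user action.
        share_profile_publicly: bool = False
        allow_behavioural_advertising: bool = False
        share_data_with_partners: bool = False
        retain_search_history: bool = False

    def enable(settings: PrivacySettings, option: str) -> None:
        """Record an explicit, specific opt-in; nothing is pre-ticked."""
        if not hasattr(settings, option):
            raise ValueError(f"Unknown privacy option: {option}")
        setattr(settings, option, True)

    if __name__ == "__main__":
        settings = PrivacySettings()               # new accounts default to maximum privacy
        enable(settings, "retain_search_history")  # only after the user asks for it
        print(settings)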

Posted on: 18 January 2013 at 08:00 hrs


04 January 2013

Online Behavioural Advertising Rule Changes

The UK Code of Non-broadcast Advertising, Sales Promotion and Direct Marketing (CAP Code) will include new rules in a month's time (February 4th 2013) relating to greater transparency and choice for consumers around Online Behavioural Advertising (OBA).

Photograph of a hand-written notice taped to the pavement with the words 'Please mind the hole!!' written on it - there appears to be an uncovered inspection chamber below

The Online Behavioural Advertising Regulatory Statement, published by the Committee of Advertising Practice (CAP) in November 2012, describes how notices must be provided to web users, in or around online display advertisements, informing them that OBA is being undertaken, together with a mechanism to opt out. The rules are based upon the pan-European, industry-wide agreed self-regulatory standards: the European Advertising Standards Alliance (EASA) Best Practice Recommendation and the IAB Europe Self-Regulation Framework.
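
Purely as an illustration of the opt-out mechanism, a server-side ad selection step might check a stored preference before applying any behavioural targeting; the cookie name and helper below are hypothetical and are not defined by the CAP Code or the EASA framework.

    # Hypothetical opt-out check before behavioural targeting; the cookie name
    # "oba_opt_out" and the ad-selection helper are assumptions for this sketch.
    OPT_OUT_COOKIE = "oba_opt_out"

    def select_advert(cookies: dict, interest_profile: list) -> str:
        if cookies.get(OPT_OUT_COOKIE) == "1":
            # The user has opted out: serve a contextual (non-behavioural) advert
            # and do not consult or update the interest profile.
            return "contextual-advert"
        # Otherwise behavioural targeting may be used, with the required notice
        # shown in or around the advert itself.
        if interest_profile:
            return f"targeted-advert-for-{interest_profile[0]}"
        return "contextual-advert"

    if __name__ == "__main__":
        print(select_advert({"oba_opt_out": "1"}, ["gardening"]))  # contextual-advert
        print(select_advert({}, ["gardening"]))                    # targeted-advert-for-gardening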

The rules are defined in a new Appendix 3 of the CAP Code, and will be enforced by the Advertising Standards Authority. The rules will be reviewed again later in 2013.

Posted on: 04 January 2013 at 08:39 hrs


18 December 2012

Disposal

Just in case you have the development process running slickly and operations going smoothly, have you thought about asset disposal, and the related data destruction, at the end of life?

Photograph of a few autumnal-coloured leaves on a lichen-covered tree

Well, the UK's Information Commissioner has produced a short guide IT Asset Disposal for Organisations.

As it states in the guide "if personal data is compromised during the asset disposal process, even after it has left your organisation, you may still be responsible for breaching the DPA so it is important to manage the process correctly". This is of course relevant for other types of data too. And not just your own equipment, but that of organisations processing data on your behalf (and in the cloud).

Who is your "asset disposal champion"?
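
On the data destruction side specifically, the sketch below shows the idea of overwriting data rather than merely deleting references to it; the file name is illustrative, and on SSDs, journaling file systems and cloud storage an overwrite like this may not reach the underlying blocks, so certified erasure tools or physical destruction remain necessary for real disposals.

    import os
    import secrets

    def overwrite_and_delete(path: str, passes: int = 1) -> None:
        """Overwrite a file with random data before deleting it (sketch only)."""
        size = os.path.getsize(path)
        with open(path, "r+b") as handle:
            for _ in range(passes):
                handle.seek(0)
                handle.write(secrets.token_bytes(size))
                handle.flush()
                os.fsync(handle.fileno())
        os.remove(path)

    if __name__ == "__main__":
        with open("customer-export.csv", "w") as f:  # illustrative file only
            f.write("name,email\nAlice,alice@example.com\n")
        overwrite_and_delete("customer-export.csv")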

Posted on: 18 December 2012 at 06:58 hrs


27 November 2012

Personal Data Anonymisation Code of Practice

The UK's Information Commissioner's Office (ICO) Head of Policy, Steve Wood, recently discussed the issues around data anonymisation on the ICO blog. Anonymised data is information that does not identify any individuals, either in isolation or when cross-referenced with other available data, and he suggested the need to develop an effective and balanced risk framework for personal data anonymisation to protect privacy and yet provide opportunities to exploit the data.

the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA

Anonymisation is another technique that can be used to reduce the risk from the loss of, or unauthorised access to, personal data, along with data minimisation, pseudonymisation, aggregation, masking, encryption and tokenisation.
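
To show how two of those techniques differ, the short sketch below applies masking (removing direct identifiers and blurring quasi-identifiers) and then aggregation (publishing counts rather than rows) to some invented records; the field names and rules are assumptions for illustration.

    from collections import Counter

    # Invented records; the fields and masking rules are illustrative assumptions.
    records = [
        {"name": "Alice Smith", "postcode": "EC1A 1BB", "age": 34},
        {"name": "Bob Jones",   "postcode": "EC1A 4JU", "age": 36},
        {"name": "Cara Patel",  "postcode": "SW1A 2AA", "age": 52},
    ]

    def mask(record: dict) -> dict:
        """Masking: drop direct identifiers and blur quasi-identifiers."""
        return {
            "postcode_district": record["postcode"].split()[0],  # EC1A, SW1A
            "age_band": f"{record['age'] // 10 * 10}s",          # 30s, 50s
        }

    def aggregate(masked_rows: list) -> Counter:
        """Aggregation: publish counts rather than individual-level rows."""
        return Counter((m["postcode_district"], m["age_band"]) for m in masked_rows)

    if __name__ == "__main__":
        print(aggregate([mask(r) for r in records]))
        # Counter({('EC1A', '30s'): 2, ('SW1A', '50s'): 1})

Even aggregated counts can be identifying when the groups are small, which is one reason a risk-based decision is needed before release.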

Following the ICO's public consultation earlier in 2012, a new code of practice has been issued under the Data Protection Act that focuses on managing the data protection risks related to anonymisation. Anonymisation: Managing Data Protection Risk Code of Practice is intended to assist organisations that need to anonymise personal data; it identifies the issues to consider, discusses whether consent is required, confirms there are fewer legal restrictions on anonymised data, and describes the legal tests required under the Data Protection Act.

The code provides guidance on a decision-making process to help when considering the release of anonymised data, which includes establishing a process to take into account the:

  • likelihood of re-identification being attempted
  • likelihood the re-identification would be successful
  • anonymisation techniques which are available to use
  • quality of the data after anonymisation has taken place and whether this will meet the needs of the organisation using the anonymised information.

The key point behind the code is the need to make a risk-based decision, and this could form part of a privacy impact assessment.

I very much like the examples and case studies in the three annexes. The case study in Annex 1 includes an example of how the "scope of personal data" can be minimised in the same way the "scope for PCI DSS" can be. In the latter, the storage of encrypted cardholder data by an organisation that does not have access to the encryption keys can be deemed out of scope of the PCI DSS requirements. In the code's case study, the partial redaction of data means the originating organisation must still treat the information as personal data (because it has the full version of the data, and the key to reverse the redaction), but another party that only holds the redacted data set does not need to treat the information as personal data. The two compliance regimes are nicely parallel.
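
The sketch below illustrates that two-party position: the originating organisation holds the token map needed to reverse the pseudonyms, while a recipient who only ever sees the tokenised rows cannot resolve them. The data and token format are illustrative assumptions rather than the annex's actual example.

    import secrets

    class Tokeniser:
        """Holds the mapping needed to reverse tokens; only the originator has this."""

        def __init__(self) -> None:
            self._token_to_value = {}
            self._value_to_token = {}

        def tokenise(self, value: str) -> str:
            if value not in self._value_to_token:
                token = secrets.token_hex(8)
                self._value_to_token[value] = token
                self._token_to_value[token] = value
            return self._value_to_token[value]

        def detokenise(self, token: str) -> str:
            return self._token_to_value[token]

    if __name__ == "__main__":
        originator = Tokeniser()
        shared_row = {"patient": originator.tokenise("Alice Smith"), "result": "negative"}

        # The recipient only ever sees shared_row; without the Tokeniser's maps
        # the pseudonym cannot be resolved back to a person.
        print(shared_row)

        # The originator can still reverse it, so for them it remains personal data.
        print(originator.detokenise(shared_row["patient"]))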

The section on governance discusses the need for assigning responsibilities, providing staff training, having procedures to help identify difficult cases, keeping up to date with legislation, using privacy impact assessments, being transparent with the individuals concerned, reviewing possible consequences, and preparing for an incident in which re-identification has occurred.

Posted on: 27 November 2012 at 21:33 hrs

