09 April 2014

Privacy

Posts relating to the category tag "privacy" are listed below.

07 June 2013

User Profiling and "Significant Impact"

Do you profile your customers, clients and citizens with data from your applications?

"Profiling" means any form of automated processing of personal data, intended to analyse or predict the personality or certain personal aspects relating to a natural person, in particular the analysis and prediction of the person's health, economic situation, performance at work, personal preferences or interests, reliability or behaviour, location or movements.

The European Commission's Article 29 Working Party has published an opinion, in the form of an advice paper, to provide input into the current discussions on European data protection reform.

The paper supports the scope of Article 20 covering processing of personal data for the purpose of profiling, or measures based on profiling, and argues that there should be greater transparency and control for data subjects over profiling and any subsequent measures based upon the profile generated; it acknowledges that this creates more responsibility and accountability for data controllers.

However, the paper suggests profiling and measures should only be subject to additional control if they significantly affect the interests, rights or freedoms of the data subject.

See further discussion here and here.

Posted on: 07 June 2013 at 19:03 hrs


05 April 2013

Fair Data?

At the end of January, the Market Research Society (MRS) launched an initiative called Fair Data.

Photograph from the London Shard at dusk looking towards Canary Wharf

Existing MRS Company Partners (who are already subject to the MRS Code of Conduct), and others who apply and pass an assessment by the MRS of their "policies and procedures", must firstly adhere to the 10 principles and secondly must "use the Fair Data mark in all relevant dealings with customers and respondents". The 10 principles relate to the following topics:

  1. Consent
  2. Purpose
  3. Access
  4. Security
  5. Respect
  6. Sensitive personal data
  7. Supply chain
  8. Ethics
  9. Staff training
  10. Default to not using personal data unless there is adherence to the above nine principles

So the scheme does not include all eight data protection principles, but does add some extra business process requirements. Perhaps this is because the trust mark has been designed "to be used internationally".

The scheme seems to have some initial endorsements, but these types of scheme won't work unless there is widespread adoption, so that consumers and others recognise the mark, backed up by verifiable evidence that it makes a difference. I am not sure whether this "kite mark" or "trust seal" is the one to make everyone confident about the use of their personal data.

Posted on: 05 April 2013 at 18:32 hrs


19 March 2013

Data Protection Risks for Mobile Apps

The European Commission's Article 29 Working Party on the Protection of Individuals has published its opinion on data protection risks for mobile apps.

Partial image of the recommendations for app developers in the European Commission's Article 29 Working Party on the Protection of Individuals 'Opinion 02/2013 on Apps on Smart Devices'

Opinion 02/2013 on Apps on Smart Devices describes the roles, risks and responsibilities of four groups: app developers, operating systems/device manufacturers, app stores and third parties. It seems slightly odd to lump together "organisations that outsource the app development and those companies and individuals building and deploying apps", and I think this will lead to confusion.

However, in the conclusions on pages 27-28, there are two useful lists under the headings "App developers must" and "The Working Party recommends that app developers", which would be worth considering for inclusion within internal compliance standards for app development.

Posted on: 19 March 2013 at 08:28 hrs


08 March 2013

Mobile App Privacy Labelling

Stakeholders in the US Department of Commerce's National Telecommunications and Information Administration (NTIA) have been discussing visibility of privacy notices and opt ins for mobile applications.

Mock-ups of proposed privacy labelling for mobile phone applications

A meeting of the Privacy Multistakeholder Process: Mobile Application Transparency group in February reviewed feedback on a discussion draft on proposals for voluntary [privacy] transparency screens (VTS) in mobile apps. Mock-ups of the short notice screens were also presented.

The Mobile App VTS requires mobile app developers and publishers to [voluntarily] provide information about what data is collected from consumers, and details of any data sharing with third parties.
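
To illustrate the kind of information such a short notice surfaces, here is a hypothetical structure sketched in TypeScript; the field names and category lists are my own simplification and are not taken from the NTIA discussion draft.

```typescript
// Hypothetical shape for a short-form transparency notice; the field
// names and categories are illustrative, not from the NTIA draft itself.
interface AppTransparencyNotice {
  appName: string;
  dataCollected: Array<
    | "contacts" | "location" | "browsing history"
    | "health" | "financial" | "biometrics" | "user files"
  >;
  sharedWith: Array<
    | "ad networks" | "data resellers" | "analytics providers"
    | "operating systems" | "social networks" | "other apps"
  >;
  privacyPolicyUrl?: string;
}

// Example notice for a fictitious app.
const exampleNotice: AppTransparencyNotice = {
  appName: "Example Airline Check-in",
  dataCollected: ["location", "contacts"],
  sharedWith: ["analytics providers", "ad networks"],
  privacyPolicyUrl: "https://example.com/privacy",
};
```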

The group's next meeting is on 14th March 2013.

See also previous posts about Privacy and Terms of Use Labelling, Privacy, Labelling and Legislation, Privacy Labelling, Security Labelling, Software Assurance Labelling, Adverts and Privacy Notices, and Privacy Notices Code of Practice.

Posted on: 08 March 2013 at 20:53 hrs


01 March 2013

Privacy Legislation for Mobile Apps

With the publication of a report by the US Federal Trade Commission, new proposed privacy legislation is gaining support in the United States.

Photograph of an airline's mobile check-in app offering a £99 upgrade with '99999999' displayed where a date is expected

The FTC's report Mobile Privacy Disclosures: Building Trust Through Transparency made recommendations for platform developers, app developers and third parties including advertising networks. The report also commented on how app developer trade associations, academics, usability experts and privacy researchers can contribute.

The Application Privacy, Protection and Security Act of 2013 (APPS Act) discussion draft proposes requirements for user consent, the protection of personal and de-identified data, with enforcement by the FTC.

It will be interesting to see where this goes.

Posted on: 01 March 2013 at 07:44 hrs


27 November 2012

Personal Data Anonymisation Code of Practice

The UK's Information Commissioner's Office (ICO) Head of Policy, Steve Wood, recently discussed the issues around data anonymisation on the ICO blog. Anonymised data is information that does not identify any individual, either in isolation or when cross-referenced with other available data. He suggested the need to develop an effective and balanced risk framework for personal data anonymisation, to protect privacy yet still provide opportunities to exploit the data.

the risk of identification must be greater than remote and reasonably likely for information to be classed as personal data under the DPA

Anonymisation is another technique that can be used to reduce the risk from the loss or unauthorised access to personal data, along with data minimisation, pseudonymisation, aggregation, masking, encryption and tokenisation.
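
As an illustration of how a couple of these techniques look in practice, here is a minimal sketch of my own (in TypeScript on Node; the key handling, function names and fields are assumptions, not taken from the ICO material) showing pseudonymisation of an identifier with a keyed hash and masking of a postcode before release.

```typescript
import { createHmac } from "node:crypto";

// Pseudonymisation: replace an identifier with a keyed hash so that only
// the holder of the secret key can link records back to the individual.
function pseudonymise(identifier: string, secretKey: string): string {
  return createHmac("sha256", secretKey).update(identifier).digest("hex");
}

// Masking: retain only as much of a value as the recipient needs.
function maskPostcode(postcode: string): string {
  // Keep the outward code (e.g. "EC1A") and drop the rest.
  return postcode.trim().split(/\s+/)[0];
}

// Example: a record prepared for release to an analytics team.
const record = { email: "alice@example.com", postcode: "EC1A 1BB" };
const released = {
  personRef: pseudonymise(record.email, process.env.PSEUDO_KEY ?? "dev-only-key"),
  area: maskPostcode(record.postcode),
};
console.log(released);
```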

Following the ICO's public consultation earlier in 2012, a new code of practice has been issued under the Data Protection Act that focuses on managing the data protection risks related to anonymisation. Anonymisation: Managing Data Protection Risk Code of Practice intends to assist organisations that need to anonymise personal data, identifies the issues to consider, discusses whether consent is required, confirms there are fewer legal restrictions on anonymised data, and describes the legal tests required under the Data Protection Act.

The code provides guidance on a decision making process to help when considering the release of anonymised data that includes establishing a process to take into account the:

  • likelihood of re-identification being attempted
  • likelihood the re-identification would be successful
  • anonymisation techniques which are available to use
  • quality of the data after anonymisation has taken place and whether this will meet the needs of the organisation using the anonymised information.

The key point behind the code is the need to make a risk-based decision, and this could form part of a privacy impact assessment.

I very much like the examples and case studies in the three annexes. The case study in Annex 1 includes an example of how the "scope of personal data" can be minimised in the same way that the scope of PCI DSS can be. In the latter, the storage of encrypted cardholder data by an organisation that does not have access to the encryption keys can be deemed out of scope of the PCI DSS requirements. In the code's case study, the partial redaction of data means the originating organisation must still treat the information as personal data (because it holds the full version of the data, and the key to reverse the redaction), but another party that only has the redacted data set does not need to treat the information as personal data. Parallel compliance examples.
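
A minimal sketch of that parallel (my own illustration in TypeScript, not from the code of practice) might use tokenisation, with the originating organisation alone holding the reverse mapping:

```typescript
import { randomUUID } from "node:crypto";

// The originating organisation keeps this map, so for it the tokenised
// data set is still personal data; a recipient holding only the tokens
// has no practical means of re-identification.
const tokenMap = new Map<string, string>(); // token -> original value

function tokenise(value: string): string {
  const token = randomUUID();
  tokenMap.set(token, value);
  return token;
}

// Shared with the third party: tokens only, no reverse mapping.
const sharedRow = { customer: tokenise("alice@example.com"), spend: 42.5 };
console.log(sharedRow);
```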

The section on governance discusses the need to assign responsibilities, provide staff training, have procedures to help identify difficult cases, keep up to date with legislation, use privacy impact assessments, be transparent with the individuals concerned, review possible consequences, and prepare for an incident where re-identification has occurred.

Posted on: 27 November 2012 at 21:33 hrs


16 November 2012

Digital Identity for Winners

A very comprehensive report by the Boston Consulting Group, which assesses the value of digital identity, has been published by Liberty Global.

Examples of the charts included within 'The Value of Our Digital Identity'

The Value of Our Digital Identity describes consumers' increasing awareness of and desire for control, and how user control increases the willingness of users to share data. The report highlights how, unlike some commodities, digital data grows in value as its volume and variety grow. This data explosion is being driven by digital services and media, online data transactions, the internet of things and the current boom in social media; in turn this can fuel economic growth.

The report attempts to define what digital identity is, quantifies the current and potential economic value of digital identity for organisations and consumers, identifies important trends and offers a set of guiding principles that could help responsible organisations benefit from the value of digital identity.

Topics that may be of particular interest to those involved with application design and implementation include:

  • Problems when there is a lack of transparency for users about how their personal data is collected and used
  • The benefits of offering the right to be forgotten
  • How the form of consent should be based on the type of data requested
  • The need for convenience (usability)
  • Sector-specific variations in user behaviour
  • The requirement to increase data security (and not just using technical controls)
  • Why there should be flexibility in regulation to allow users to make their own choices
  • How digital identity can be used to provide differentiation from competitors

The report suggests that organisations need to establish and promote a trusted flow of data, or otherwise there are significant lost opportunities for value generation. Read, digest and implement.

Posted on: 16 November 2012 at 20:51 hrs


13 June 2012

Privacy and Terms of Use Labelling

In previous posts, I have mentioned labelling in A Software Security Kitemark?, Trust and E-commerce Trustmarks, Privacy Labelling, Trust .UK, Security Labelling, and Software Assurance Labelling. There are some impressive developments in ideas for privacy and terms of use labelling.

Screen capture of the sample CommonTerms prototype terms preview

Coming to Terms on the Project VRM blog describes the work at StandardLabel.org, CommonTerms and BiggestLie.

There are some great insights into rights, user behaviour and clarity of expression, which could contribute to formulating better, more understandable, descriptions of software security quality for users. Could security be meaningfully summed up in a single statement, or even just a small number of icons?

It's a challenging problem to produce something of value to a consumer that takes a minimal amount of effort to digest. I like the approach of the OWASP Application Security Verification Standard, but even this has a degree of complexity in manual versus automated testing, and I am not sure software security (from the end user's viewpoint) can be entirely divorced from the security of the underlying infrastructure.

Any thoughts?

Posted on: 13 June 2012 at 07:25 hrs


29 May 2012

Cookies Etc Law v3

The Information Commissioner's Office (ICO) has updated its guidance relating to the use of tracking technologies under changes to the UK's Privacy and Electronic Communications Regulations (PECR), which came into force last year but only began to be enforced last Saturday, 26th May 2012.

Implied consent is certainly a valid form of consent but those who seek to rely on it should not see it as an easy way out or use the term as a euphemism for "doing nothing"

Version 3 is an update to the version issued last December, and provides further information on "implied consent". The guidance is accompanied by a blog posting and video presentation.
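
As a minimal sketch of the practical point (my own illustration; the cookie name, function names and analytics URL are hypothetical, not from the ICO guidance), a site relying on consent should only set non-essential cookies or load tracking scripts once a consent signal has been recorded:

```typescript
// Check whether the visitor has recorded consent for analytics cookies.
function hasConsentedToAnalytics(): boolean {
  return document.cookie
    .split("; ")
    .some((c) => c === "analytics_consent=yes");
}

// Record the visitor's choice; this consent cookie itself is needed
// to honour that choice.
function recordAnalyticsConsent(): void {
  document.cookie = "analytics_consent=yes; path=/; max-age=31536000";
}

// Only load the analytics script once consent has been given.
function maybeStartAnalytics(): void {
  if (hasConsentedToAnalytics()) {
    const s = document.createElement("script");
    s.src = "https://analytics.example.com/tracker.js"; // hypothetical URL
    document.head.appendChild(s);
  }
}
```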

Posted on: 29 May 2012 at 20:09 hrs


18 May 2012

Client-Side Storage in HTML5

Client-side, or local, storage is an area of concern for privacy and security. Therefore I was keen to attend the latest meeting of the London Web Performance Group titled HTML5 and Localstorage - Storage in the Browser at the Lamb Tavern (building c1780, but on the same site since 1309) in Leadenhall Market on Wednesday evening.

Photograph of many drawers in a filing cabinet labelled with journal dates

I almost changed my mind as I was also tempted to attend another local event on the same evening about NoSQL for Java Developers. Anyway I was very pleased I went to the client-side storage event, but it was so well-attended I almost did not have a seat. As usual, Stephen Thair (@TheOpsMgr) had done a great job organising the event.

Andrew Betts (@triblondon) described his experiences developing HTML5 applications for mobile devices, avoiding native code whenever possible, and using client-side storage so that content remains available when the device is offline or in poor signal areas. He described the pros and cons of HTTP cookies, the Indexed Database API (IndexedDB), Web SQL Database (WebSQL), local storage (a key/value store) and the Application Cache (AppCache). The answer to which of these to use turned out to be "all of them".

Andrew explained how the FT.com application exploits each type's advantages, combining them into a responsive and network-robust application suitable for the most frequent and demanding users. Cookies are used for session management, AppCache for a default fallback page, local storage for static content such as HTML scaffolding, JavaScript and style sheets, and IndexedDB/WebSQL for the HTML content of pages. In this way they manage to fit the application into the HTML5 storage constraints imposed by different operating systems.
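
As a rough illustration of that division of responsibilities (my own sketch in TypeScript, not FT.com's code; the function and key names are hypothetical), small static assets can go into the key/value store while larger structured content goes into IndexedDB where it is available:

```typescript
// Small, rarely-changing items (templates, CSS, JS) fit key/value storage.
function storeStaticAsset(name: string, content: string): void {
  try {
    localStorage.setItem(`asset:${name}`, content);
  } catch {
    // Quota exceeded or storage disabled: fall back to refetching later.
  }
}

// Larger, structured content (article HTML) suits IndexedDB where available.
function openArticleStore(): IDBOpenDBRequest | null {
  if ("indexedDB" in window) {
    const request = indexedDB.open("articles", 1);
    request.onupgradeneeded = () => {
      request.result.createObjectStore("pages", { keyPath: "url" });
    };
    return request;
  }
  // Older WebKit-only devices may offer only Web SQL; omitted here.
  return null;
}
```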

He explained many of the techniques used to circumvent mobile network and device-specific issues, including how they managed to squeeze extra storage out of the quota by packing ASCII or base64 encoded content into JavaScript's UTF-16 double-byte string encoding. It is a very clever piece of optimisation, which could also be used for code obfuscation. Details are in the presentation slides.
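
The general packing idea can be sketched as follows (my own illustration, not the FT Labs implementation); a production version would also need to avoid producing invalid surrogate code units (0xD800-0xDFFF) and to watch storage quotas.

```typescript
// Pack two 8-bit (ASCII/base64) characters into one 16-bit code unit,
// roughly halving the character count charged against a storage quota.
function pack(ascii: string): string {
  let out = "";
  for (let i = 0; i < ascii.length - 1; i += 2) {
    out += String.fromCharCode((ascii.charCodeAt(i) << 8) | ascii.charCodeAt(i + 1));
  }
  if (ascii.length % 2 === 1) {
    // Odd-length input: pad the final code unit's low byte with zero.
    out += String.fromCharCode(ascii.charCodeAt(ascii.length - 1) << 8);
  }
  return out;
}

// Reverse the packing; a zero low byte marks the padding added above
// (safe because ASCII/base64 content never contains NUL characters).
function unpack(packed: string): string {
  let out = "";
  for (let i = 0; i < packed.length; i++) {
    const code = packed.charCodeAt(i);
    out += String.fromCharCode(code >> 8);
    if ((code & 0xff) !== 0) out += String.fromCharCode(code & 0xff);
  }
  return out;
}
```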

I think users of client-side storage will have to be careful where it might be determined to be a tracking technology. In the FT.com application's case, this client storage is not offered to casual web site visitors, but only to those who have installed the app, registered and logged in. Thus there are opportunities to obtain consent, over and above any warning the device may offer. We are expecting to hear more about the ICO's plans for enforcement of the new regulations at a press conference this morning. Other HTML5 security issues are of course still a concern here. I was slightly troubled by one feature mentioned.

The presenter's slides are now available.

Posted on: 18 May 2012 at 09:05 hrs

