Surveillance issues in smart cities

Smart cities seek to implement information and communication technologies (ICT) in order to improve the efficiency and sustainability of urban spaces while reducing costs and resource consumption.[1] In the context of surveillance, smart cities monitor citizens through strategically placed sensors around the urban landscape, which collect data on many aspects of urban living. This data is transmitted, aggregated and analysed by governments and other local authorities to extrapolate information about the challenges the city faces in sectors such as crime prevention,[2][3][4] traffic management,[5][6] energy use[6] and waste reduction. This serves to facilitate better urban planning[7] and allows governments to tailor their services to the local population.[8][9]

Such technology has been implemented in a number of cities, including Santa Cruz, Barcelona, Amsterdam and Stockholm. Smart city technology has developed practical applications in improving effective law enforcement, optimizing transportation services[10] and improving essential infrastructure systems,[10] including the provision of local government services through e-governance platforms.[11]

This constant and omnipresent transmission of data[7] from disparate sources into a single government entity has led to concerns that these systems are turning into ‘electronic panopticons’,[1] where governments exploit data-driven technologies in order to maximize effective surveillance of their citizens. Such criticism centres on privacy,[10] as the information flows operate vertically between citizens and the government on a scale that undermines the concept of urban anonymity.[10]

Law enforcement

The most discernible use of smart city technology for government surveillance arises in law enforcement, where critics consider the accumulation of intelligence through data collection strategies key to intelligence-based policing.[12] The technology available in smart cities includes extensive CCTV installations (such as in London and Dubai),[10][13] smart traffic sensors in New York[14] and crime prediction software in Santa Cruz, California.[2] This technology holds the potential to significantly improve the type and volume of information that law enforcement authorities may rely upon when dealing with crimes. Most policing technologies developed within smart cities appear to have shifted law enforcement from "disciplinary" to "actuarial",[12] with less focus on identifying individual criminals in order to ascribe guilt and a tendency to classify and manage groups based on levels of dangerousness.

Policing techniques

Proactive policing

Traffic management is a major focus of proactive policing technologies.

Garland's culture of control theory has been used to describe the trend towards proactive policing in smart cities.[12] In Palestine, there have been proposals to introduce GPS-based tracking systems to cars for the purposes of law enforcement within a modern urban environment.[15] Here the location and speed of every vehicle are recorded and transmitted to the local authority, with a fine issued if the speed of the car exceeds the limit for more than 10 seconds.[15] The technology also holds the potential to relay information regarding accidents and traffic jams,[15] allowing traffic to be rerouted. An extensive camera system in Amsterdam relays data regarding the traffic situation to a central control point,[5] allowing authorities to warn motorists of incidents ahead or of adverse weather conditions.
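
For illustration, the rule described above can be expressed as a simple check over a vehicle's reported speed samples. The following is a minimal sketch only: the speed limit, the 1 Hz sampling rate and the function and variable names are assumptions for illustration, not a specification of the proposed system.

```python
# Hypothetical sketch of the speeding rule described above: a fine is issued only
# when a vehicle's reported speed stays above the limit for more than 10
# consecutive seconds. Limit, sampling rate and names are illustrative assumptions.

SPEED_LIMIT_KMH = 50.0   # assumed urban limit
GRACE_SECONDS = 10       # threshold described in the proposal

def should_fine(speed_samples_kmh, sample_interval_s=1.0):
    """speed_samples_kmh: chronological GPS speed readings from one vehicle."""
    consecutive = 0.0
    for speed in speed_samples_kmh:
        if speed > SPEED_LIMIT_KMH:
            consecutive += sample_interval_s
            if consecutive > GRACE_SECONDS:
                return True      # sustained violation: issue a fine
        else:
            consecutive = 0.0    # brief excursions above the limit are tolerated
    return False

print(should_fine([65.0] * 12))  # True: over the limit for 12 seconds
print(should_fine([65.0] * 8))   # False: over the limit for only 8 seconds
```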

Such technology has a combined preventative and deterrent effect on traffic violations. By controlling the speed of vehicles, authorities may minimize one of the more common risk factors in vehicular crashes.[16] Similarly, by monitoring the location of vehicles through a mix of GPS and camera technology, authorities are able to react in real time to minimize heavy traffic incidents and therefore the likelihood of crashes.[5] Such technology also enables police and emergency authorities to respond instantly to accidents as they occur. The extended ‘reach’ of the ‘long arm of the law’ could thus improve traffic management and efficiency, reducing energy consumption and improving civilians’ safety.

There is criticism of the use of smart city technology for proactive policing. The constant monitoring of every vehicle's location echoes the panopticon-like concept of continuous law enforcement[10] and introduces a level of individualistic paternalism,[10] where citizens are deemed incapable of obeying traffic laws voluntarily. More controversially, GPS tracking and camera monitoring may be poorly suited to detecting other high-risk behaviours (such as drunk driving and fatigue),[16] which are also major factors in traffic accidents. There are also implementation difficulties, as older vehicles lacking GPS equipment would not appear in data streams, severely reducing the accuracy of potential analyses. There is also a risk of arbitrariness within proactive policing: GPS-based speeding enforcement would treat a driver who exceeds the speed limit for 9 seconds as innocent, whilst exceeding the limit for 10 seconds would constitute an offence. Such arbitrary measures do not account for differences in car performance and remove discretion from law enforcement. If this lack of discretion were extrapolated across multiple areas of criminal law, with automatic enforcement becoming the norm, the potential for unfair outcomes and public dissatisfaction with such technology becomes evident, given the relatively high risk of non-accountability by governments using these methods.[17]

Predictive policing

Predictive techniques in policing are not new; search warrants are a pre-existing example of authorities acting on the basis of suspicion and prediction in contemporary communities.[18] In the smart cities context, predictive policing is the use of data analytics to determine potential locations of future crime.[18] This data collection often occurs through the smartphones carried by urban populations. Through location-based services on smartphones, the movements of individuals can be tracked and scrutinized by authorities. This has the potential to be particularly effective in crowd control. By comparing the velocities of individual smartphone users within a certain location, it is possible for law enforcement authorities to ascertain crowd density.[19] This allows for targeted crowd management and the prediction of dangers related to excessive crowding.[19] Police are thus able to take appropriate action (such as information broadcasting) in order to reduce the threat of injuries from incidents (such as crowd stampedes), as well as to prevent crowd-related crime (such as theft).[19]
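
A minimal sketch of this technique, under stated assumptions, is shown below: because pedestrians slow down as crowds thicken, a large number of devices moving slowly within the same area can serve as a proxy for crowd density. The grid size, thresholds and field names are illustrative assumptions rather than parameters of any deployed system.

```python
# Illustrative crowd-density proxy from smartphone location pings.
# All parameters and field names are assumptions for the sketch.

from collections import defaultdict
from statistics import mean

def crowd_density_flags(pings, cell_size_m=50.0, min_devices=20, slow_speed_mps=0.5):
    """pings: iterable of (x_m, y_m, speed_mps) tuples from location-based services."""
    speeds_by_cell = defaultdict(list)
    for x, y, speed in pings:
        cell = (int(x // cell_size_m), int(y // cell_size_m))
        speeds_by_cell[cell].append(speed)
    # Many devices moving slowly in the same cell suggests a dense crowd.
    return {cell: (len(speeds) >= min_devices and mean(speeds) < slow_speed_mps)
            for cell, speeds in speeds_by_cell.items()}
```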

This type of policing also allows law enforcement agencies to ‘predict’ where, when or by whom a crime may occur in the future and to respond accordingly. Big data analytical tools are used to identify patterns in crime,[18] allowing authorities to map high-risk areas, times and days for certain types of crime. Through such software, police are also able to create profiles of potential criminals and associated behaviours.[18] Developments in technology within smart cities allow the scope of predictions to be increased, as well as the types of responses available to law enforcement bodies.

Santa Cruz has been the site of a number of predictive policing experiments.[2]

Experiments conducted with a ‘predictive policing algorithm’ based on crime data in Santa Cruz, California, enabled police officers to identify the most likely time and place within a certain locality for a particular crime to be committed.[2] This allowed targeted patrols to be made, with a 4 percent decline in burglaries and 13 additional arrests recorded within the first 6 months.[2] These figures are preliminary, however, and do not account for unreported crime or crime that was prevented through increased police presence.
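
The kind of pattern-mapping described above can be illustrated with a deliberately simple frequency count; the actual algorithms reportedly trialled in Santa Cruz are considerably more sophisticated. The grid resolution, time bucketing and field names below are assumptions for illustration.

```python
# Illustrative hotspot ranking: count past incidents per coarse location/time cell
# and return the busiest cells. Not the algorithm used in any real deployment.

from collections import Counter

def rank_hotspots(incidents, top_n=5):
    """incidents: iterable of (latitude, longitude, hour_of_day) tuples for past crimes."""
    buckets = Counter()
    for lat, lon, hour in incidents:
        cell = (round(lat, 2), round(lon, 2), hour)  # ~1 km grid cells, bucketed by hour
        buckets[cell] += 1
    return buckets.most_common(top_n)  # (cell, count) pairs that could guide patrols
```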

Although it is possible to envisage such law enforcement intervention becoming the norm where smart city surveillance technologies have been adopted and implemented, predictive policing has raised a number of legal and non-legal controversies.[20] Firstly, when predicting the commission of offences, it is unclear what level of criminal activity in a particular area is sufficient to warrant extra patrols. The point at which the probability of crime becomes statistically significant is one that legal scholars and courts alike have had trouble defining.[10] Within this framework there is a degree of arbitrariness in how much weight the predictive data analysis should be given, as high crime areas can only be defined with reference to “low levels of crime”.[10]

Further, in the United States, searches and arrests must be made on grounds of reasonable suspicion under the Fourth Amendment. This means that officers must be able to “point to specific and articulable facts” that “warrant the intrusion”, or make a predictive judgment that the person is in possession of an item related to the commission of an offence. Similar protections, though not constitutionally based, exist in Australia[21] and the UK.[22] The latter was confirmed as binding by the European Court of Human Rights[23] on a number of European nations, including civil law states. The ability to formulate such “reasonable suspicion” on the basis of big data algorithms is controversial, with some critics arguing that, in the absence of active police corroboration of predictive forecasts, there are insufficient grounds to warrant an arrest.[18] Further, the general nature of predictive forecasts is arguably incompatible with the acceptable standards outlined by the United States Supreme Court[18][24] with respect to specific individuals. Patterns of crime generated through data analytics are unlikely to generate the level of accurate predictive detail required for police officers to effect an arrest, when compared to informed tip-offs.[18] While US courts have allowed profiling to be used in stopping and searching persons in the right context,[18] notable judicial dissents[25] and academic research[18] highlight that profiling lacks probative value. In the UK, a House of Lords report[26] recommended that such technology be prohibited from use by local authorities unless tied to the investigation of serious criminal offences. In addition, a major factor in Europe is that predictive policing technology must be exercised in accordance with legislation that is sufficiently clear on the scope of use (foreseeability) and affords persons adequate legal protection from arbitrary uses of predictive data algorithms.[12]

A data-driven stop and frisk program in New York was found to constitute racial profiling.

Non-legal controversies also arise over the passive discrimination that predictive policing programs can generate. In New York, a data-driven stop and frisk program was aborted after a US District Court found that the program constituted racial profiling.[27] Roughly 83% of persons stopped under the program were persons of colour.[10] This discrimination was masked through the noise generated by mass data analysis,[10] leading some academics to state that the number of factors within predictive policing algorithms may result in conflicting data and biased sampling.[10] The European Court of Human Rights has also acknowledged the disproportionate targeting of search powers against persons of colour in the UK,[23] highlighting the dangers of smart city technology in predictive policing.

Mass surveillance

The concept of smart cities is inherently tied to mass surveillance. The benefits derived from smart city technology depend on constant data flows captured and aggregated by sensors, cameras and tracking applications.[10] This persistent surveillance, however, raises a number of privacy issues. Mass surveillance through big data reduces urban anonymity,[10] due to the breadth of information and the potential uses that can be extrapolated when multiple data streams are analysed together by a single governmental entity. Advocates of smart cities (such as Vint Cerf) state that this is akin to the level of privacy experienced in small towns.[28] In contrast, critics state that information sharing in smart cities has shifted from horizontal information flows between citizens to a vertical, unilateral process between citizen and government, reflecting concerns about panopticism.[10]

Data collection

Smart city applications often collate and analyse distinct sources of data in order to help government services operate more efficiently and effectively. Urban residents have few alternatives other than to subscribe to these services, particularly when using essential infrastructure, and thus indirectly and involuntarily consent to the sensors and surveillance technologies deployed throughout the urban environment through the mere act of residency.[10] In Amsterdam, wireless meters collect data about energy usage,[29] while the Mobypark app allows for the advertisement and renting of available parking spaces.[30] The information collected across these and over 70 other projects in Amsterdam is stored by the City of Amsterdam via a common IP infrastructure.[31] Since data from these services is accessible to a single governmental body, data collected from these ‘distinct’ sources can be aggregated.[32]
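
The aggregation concern can be made concrete with a short sketch: once separately collected service records share a common resident or address identifier, a single authority can join them into one profile. All dataset names, identifiers and fields below are hypothetical.

```python
# Hypothetical joining of 'distinct' municipal datasets on a shared identifier.

energy_readings = {"resident-42": {"avg_daily_kwh": 11.3}}
parking_rentals = {"resident-42": {"spaces_rented": 3}}
transit_trips   = {"resident-42": {"trips_last_month": 58}}

def aggregate_profile(resident_id):
    """Merge records held by different services into a single profile."""
    profile = {"resident_id": resident_id}
    for source in (energy_readings, parking_rentals, transit_trips):
        profile.update(source.get(resident_id, {}))
    return profile

print(aggregate_profile("resident-42"))
# {'resident_id': 'resident-42', 'avg_daily_kwh': 11.3, 'spaces_rented': 3, 'trips_last_month': 58}
```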

Big data analysis

Big data often refers to the use of data analysis and mapping algorithms to generate valuable insights from seemingly disparate datasets.[33] Applying such analysis to aggregated data sets allows a more holistic view of the needs of a particular community to be formed. Within smart cities, this data can be used as a reflexive tool when implemented within the urban ICT framework,[34] allowing the Government to better meet the goals of smart cities: improved livability, efficiency and sustainability.[1] Such benefits were found in Barcelona, where tracking of residents’ commuting patterns led to a revamp and simplification of the city’s bus routes.[9] Combined with the implementation of smart traffic lights[35] that allow for central control, buses in Barcelona now run to a schedule that attempts to minimize the amount of time spent waiting at traffic lights.[36]

Big data analysis is not without flaws in its approach. This is particularly true when it is applied to law enforcement, or where data is collected without the willing cooperation and consent of the parties involved. Critics argue that there is an element of "mythology" surrounding big data: that larger data sets offer deeper insights into urban issues with higher levels of accuracy and objectivity.[17]

Reliability

The increasing significance attributed to big data analytics, particularly within smart cities, gives rise to a situation where government bodies place an ‘almost faith-based’ reliance upon the veracity of results that have been predicted by analyzing surveilled data.[37]

In the absence of critical insight, however, reliance on data alone has little support, as seen in the legal doctrine of reasonable suspicion.[18] Traditionally, decisions to apprehend or search an individual in sole reliance upon personal “hunches” were deemed to fail the legal standard of reasonable cause.[18] In this regard, it is difficult to see how data-driven hunches could be considered more reliable.[18] Both rest on inferences drawn from observable data, which can be falsified or otherwise inaccurate, undermining the integrity of the process.[38]

Critics of the increasing role played by data-based surveillance for the purposes of law enforcement foresee that such reliance could lead to issues in prosecuting individuals based on a probability-based crime system.[18] Furthermore, such a system holds the potential for conclusions to be drawn by attributing weight to certain characteristics of an individual, an approach which could inadvertently mask any discriminatory agendas held by law enforcement bodies potentially targeting certain minorities.[39] Adding to the potential for discrimination, many big data algorithms create new categories that exceed the scope of regulations designed to prevent the unfair or discriminatory use of data.[38]

Outside law enforcement, critics argue that smart cities facilitate a shift to e-governance platforms, often at the expense of physical interactions with citizens.[40] While e-governance can improve service delivery and expand the ability to collect data from a single platform,[11] such processes may come at the expense of competitiveness and be based merely on a technology push for more data sources and aggregation mechanisms.[31] As a result, the desire for increased surveillance can undermine a fundamental aim of most smart cities, improving efficiency and effectiveness, as citizens’ desire for certain ICT applications is ignored in favour of further data aggregation. An example of this controversy has arisen in the UK, where proposals for a Scottish identity card were met with public outcry,[41] while similar cards have been implemented in Southampton[8] with little trouble, as many city services are provided in exchange for data collection.

Privacy and autonomy

In some situations, privacy may be lessened by surveillance.

The normalization of the collection and aggregation of big data[10] by Governments raises issues of privacy and autonomy. Part of this is fuelled by feature creep, where technologies and applications that were perceived to be ‘creepy’ until recently have now become socially acceptable.[10] Much of the concern surrounds the inconvenience and inability for citizens to opt out of new technologies where they form part of essential government services, as there are few alternatives.[10] Should an individual wish to appear “off the grid” they are forced to employ a range of tedious measures (such as paying in cash only and not utilizing a mobile phone) in order to reduce their data footprint.[42] Despite this, such tactics would only minimize and not eliminate their collectable data.[42]

Privacy concerns are raised where the data collected may be capable of linking to or identifying an individual,[43] particularly when collated from multiple information sources. The storage of data by governments remains opaque, while the potential for cross-sharing of data across government services often means that data is accessible by parties the provider did not intend to share it with.[10] By mere participation as a member of an urban community, particularly through the use of essential urban services and infrastructure, an individual is placed at risk of having their data shared amongst multiple platforms and users. While such data may not identify the person providing it on its own, when combined with other data in the set it may be considered personally identifiable information (PII) and thus fall under strict privacy laws.[43] The constantly evolving uses of smart city technology do not often fit neatly into privacy law frameworks,[43] which may be extremely broad, as in Australia,[44] where a discussion paper published by the Australian Law Reform Commission confirmed that anonymised data may still be PII.[44] Similar regimes exist in the United States[45] and the European Union (see: Data Protection Directive).

In Europe, government technology that interferes with privacy must be based on a "pressing social need" or otherwise be "necessary in a democratic society" and be proportional to the legitimate aims espoused.[46] This means that authorities implementing smart city regimes are at risk of violating privacy laws if appropriate safeguards are not taken. The European Court of Human Rights has held that surveillance mechanisms (including those implemented in smart city technologies) can violate the right to privacy, especially where domestic legislation does not define the scope or manner of surveillance.[47] Conversely, individuals may find that their data has been used illegally in the implementation of smart city technology. As much smart city technology is based on open platforms that are often outsourced[10] to private citizens and corporations, there are significant risks that PII may be unlawfully shared with third parties. Compounded with the relative opaqueness of data storage by governments, critics argue that individual privacy can be curtailed massively through residence in a smart city, with little recourse for individuals.[10]
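
The re-identification risk described above can be sketched as a simple record-linkage step: a record that carries no name may still become identifiable when its quasi-identifiers (such as postcode and birth year) are matched against a separately held, identified dataset. All records and field names below are hypothetical.

```python
# Hypothetical linkage of an 'anonymised' record to an identified register
# via shared quasi-identifiers.

transport_records = [   # anonymised smart-card data
    {"card": "A91", "postcode": "EH1 2", "birth_year": 1984, "home_stop": "Stop 17"},
]
service_register = [    # separately held, identified registration data
    {"name": "J. Smith", "postcode": "EH1 2", "birth_year": 1984},
]

def link(anon_records, identified_records):
    """A unique match on quasi-identifiers re-identifies the 'anonymous' record."""
    linked = []
    for anon in anon_records:
        matches = [rec for rec in identified_records
                   if rec["postcode"] == anon["postcode"]
                   and rec["birth_year"] == anon["birth_year"]]
        if len(matches) == 1:
            linked.append({**anon, "name": matches[0]["name"]})
    return linked

print(link(transport_records, service_register))
```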

Government surveillance is arguably driven by paternalistic desires to protect citizens,[10] but the individualistic and tailor-made benefits delivered by smart city technology may reduce autonomy. This holds particularly true in light of the shift towards predictive policing that occurs within the smart city environment. Whilst nobly intended, such unilateral actions by a Government may be seen as oppressive,[10] with the omnipotent role assumed by the Government giving rise to that of a panoptic institution.[10] Modern cities increasingly value privacy and digital security, as evidenced by The Economist's Safe Cities Index 2015,[48] which incorporated a Digital Security metric alongside traditional measures of safety such as Personal Security and Health.

Panopticism

Elevation, section and plan of Jeremy Bentham's Panopticon penitentiary, drawn by Willey Reveley, 1791

The English philosopher Jeremy Bentham created a circular prison design, known as the Panopticon, whereby prisoners knew that they could be observed at any time without knowing when, affording the prison officers a position of omnipresence.[49]

The French philosopher Michel Foucault re-conceptualized the notion of a panopticon as a metaphor for a ‘disciplinary society’, wherein power relations (and imbalances) can be defined and reinforced.[50] In such a society power is seen to approach its ideal form by increasing the number of people who can be controlled.[50]

In this regard, the development of smart cities and the resulting increase in the surveillance capacity of the Government give rise to conditions which mirror those of the disciplinary society described by Foucault. To this end, the development of smart cities is seen by its critics to foreshadow a larger societal shift, particularly in the role adopted by the Government, towards mass surveillance, paternalism, discipline and punishment as a means to attain social order,[50] particularly in the United States, where the “Internet of Things” is being used to collect increasingly specific data.[10] The commodification of surveillance in exchange for services has tended to normalise data collection and create indifference to panoptic developments in technology.[51] One of the major issues with panopticism in the smart cities context is that the 'surveillance gaze' is mediated by the selective biases of the operators of any application or technology, as was shown by a study on the use of CCTV cameras in the UK, where the "usual suspects" tended to be targeted more frequently.[12] In Durban, this panoptic "gaze" extends based on CCTV operator intuition due to a normalisation of the characteristics of criminals.[52] Compounding these issues, digitally based panopticism usually views the "visibility" of undesirable characteristics as the problem, and often fails to adequately address matters that are invisible to the surveillance gaze.[52]

Police state

If a shift toward mass surveillance came to fruition, it could give rise to an electronic police state as a result of increased surveillance capabilities and law enforcement activities. This represents a distinct narrowing of the purpose of surveillance to that of maintaining social order via improved law enforcement. Van Brakel argues that this shift has already taken place, and that the focus of police has gradually moved towards "front-loading" their intelligence systems with relevant knowledge that can later be sorted and used.[12] Supporting this institutionalised shift, the House of Lords in the UK argued in 2009 that an advantage of surveillance activities is the ability for the government to provide a more tailored approach to governance,[26] and by extension, law enforcement.

Solutions

In seeking a middle ground between the societal benefits afforded by big data and the resulting loss of privacy and autonomy, academics have proposed a number of solutions.[10] Deakin argues that “smart cities” are not simply those that utilize ICT, but those where such intelligence is tailored to meet the needs of citizens through community and environmental drivers.[53] Komninos refers to the three layers of intelligence in smart cities[31] as the artificial intelligence of smart city infrastructure, the collective intelligence of the city’s institutions and the intelligence of the city’s populations. By integrating these layers in the implementation process, smart cities may be able to overcome the issues of government opacity that plague them. One of the issues with establishing a legal framework for smart city technology is determining whether to take a technology-specific or technology-neutral approach.[54] Many technologies have developed too rapidly to be covered by a single technology-specific regime, while a technology-neutral approach risks being too ambiguous to encourage use or development of the regulated technology.[54] Further, most applications are too benign to be regulated, while other, more controversial technologies tend to be enabled by the creation of legislation, such as the Regulation of Investigatory Powers Act 2000, which established scenarios in which police were able to carry out surveillance, with or without authorisation.[12] A challenge to these laws is currently pending in the European Court of Human Rights,[55] reinforcing the difficulty of establishing a suitable legal regime. One potential legal solution in the UK has been the development of the tort of misuse of private information,[56] which the English Court of Appeal has held may be committed through data collection and for which damages may be claimed.[57]

Studies conducted by Deakin and Campbell in 2005 identified three types of interaction between citizens and smart cities.[58] They concluded that citizens desire accessible and reliable information and seamless and responsive governments during transactions.[58] Further, any consultation with the community needed to be transparent and based on democratic engagement and accountability.[58] Bennett Moses et al. hold that the success of data-driven technologies is based on technical, social and normative dimensions.[17] This means that smart city technologies must satisfy citizens of their effectiveness, have a major beneficial impact that encourages uptake and align with generally acceptable ethics and values.[17]

Access

A potential solution to bridge the divide between the competing benefits and costs of big data surveillance is to turn the management of personal information into a ‘joint venture’.[59] Increasing awareness of how, where and why data is collected by the Government establishes the groundwork for a non-adversarial approach to the use of data within smart cities.[59]

Barcelona is a city that has embraced smart city technology while maintaining public access.

This process minimizes perceptions of secrecy,[10] and cities that invest in multiple points of access, such as Barcelona with its Open Government platform,[60] have seen growth in the use of smart city applications.[61]

Furthermore, this process has developed to allow individuals to access their own data in a usable format,[59] as seen through Barcelona’s Open Data project.[62] In this way, autonomy is regained both through awareness of how an individual is affected by the collection of data and through participation in the application of this data to generate information as new technologies are developed.

Accountability

In addition to general awareness of the intended purpose of data collection ‘before the fact’, accountability processes ‘after the fact’ are also required.[10] A potential measure is for responsible parties to be notified when a discriminatory decision is made, thus allowing appropriate action to be taken.[63] In data-driven processes, particularly in the field of law enforcement, it is difficult to attribute responsibility to a single body or source, as the information is often derived from a number of different locations.[12] Further, opacity is often essential to predictive policing technologies, as transparency may encourage potential offenders to alter their behaviour to avoid detection.[17]

Transparency processes nevertheless remain crucial to ensure that a panoptic view or electronic police state cannot be imposed, as they allow individuals to review how decisions are made about them and what criteria those decisions are based upon. Accountability is particularly relevant in the implementation stage.

Implementation

The implementation stage of smart city technology is considered to be crucial, as the applications and platforms must be grounded in the “social capital, environmental and cultural attributes of the communities they represent”.[64] Paskaleva notes that e-governance platforms are particularly suited to democratically generating community support where residents are able to participate in the decision making and implementation process.[11] Confirming this, studies by Deakin et al. highlight that community backlash to smart city technology is minimized where e-government services are co-designed by governments and communities.[58] An example of collaboration at an extreme level was seen in Bletchley Park, where the Nazi Enigma cypher was decoded in what is often referred to as the first smart city.[31] More recently, citizen participation has been encouraged in Edinburgh,[65] where citizens are invited to ICT ‘taster’ sessions in local venues, enabling them to learn about the planning, development and design of new smart city technologies.[64] Such partnerships incorporate elements of democracy[64] and highlight how digitally inclusive decision making generates the requisite level of trust to support the implementation of smart city technology. Trust acts as an empowering and engaging mechanism for citizens, according to Finch and Tene.[10] Such empowerment allows citizens to upskill[31] and assist in the development of innovative smart city networks, addressing areas not contemplated by authorities. In Hong Kong, such development takes place in the Cyberport Zone,[66] while in Amsterdam, “Smart Citizens Labs”[67] are designed for interaction between citizens and government. These mechanisms have resulted in high levels of enthusiasm for smart city technology,[11] as evidenced by the numerous crowd-sourced Amsterdam Smart City projects to date.[68]

Kista has implemented smart city technology using the Triple Helix Model with positive outcomes.

The Triple Helix Model for smart cities, combining university, industry and government[34] in the development process, is regarded as a potential benchmark for smart city development and implementation. Kourtit et al. argue that this model applies the knowledge generated from collaboration to tailor smart city applications to market needs.[69] Empirical studies conducted on smart cities in the Netherlands compared the level of ICT penetration to each city’s level of smartness under the Triple Helix metric, finding a strong positive correlation. A live example of the Triple Helix Model in practice can be seen in the Kista Science City business cluster in Stockholm.[70] Underpinned by the Stokab model of government-provisioned dark fibre,[71] more than 1,000 companies,[72] including the multinational Ericsson,[73] as well as the Royal Institute of Technology (KTH) and Stockholm University, reside in Kista,[72] which has grown to become the largest corporate area in Sweden. The success of Kista highlights the usefulness of the Triple Helix Model in smart city implementation and provides a potential platform for cities seeking to introduce smart city technology in a manner that optimizes resident uptake.

Anonymity

Given the potential for breaches of privacy law, particularly within the smart cities context where a vast scope of data is available to the Government, data may often need to be de-identified in order to maintain privacy. Whilst this may make it difficult to reconcile data collected from multiple services, it could still allow for the useful collection and aggregation of data for defined purposes. The E-CAF system (Common Assessment Framework),[74] a database of all children assessed by government services (including police, social services and schools) maintained by the UK Government, highlights how anonymity is fading due to data-driven technologies.[12] The system allows authorities to predict which children will commit crime in the future and to intervene, based on a number of risk factors and profiling.[12] It is evident that citizens captured by the database as children will no longer be "anonymous" members of society. Given the potential Government presumption that parties unwilling to share their information are inherently suspicious,[12] the difficulty of maintaining anonymity in modern smart cities is clearly quite high.
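
One common de-identification approach consistent with the idea above is to replace direct identifiers with a keyed pseudonym and to coarsen quasi-identifiers before data leaves the collecting service, so that aggregation for defined purposes remains possible without exposing individuals. The sketch below is illustrative only; the key handling, field names and banding thresholds are assumptions.

```python
# Illustrative pseudonymisation: a keyed hash replaces the direct identifier and
# quasi-identifiers are coarsened. Key management and fields are assumptions.

import hashlib
import hmac

SECRET_KEY = b"held-only-by-the-collecting-service"

def pseudonymise(record):
    token = hmac.new(SECRET_KEY, record["resident_id"].encode(), hashlib.sha256)
    return {
        "pseudonym": token.hexdigest()[:16],        # stable within one service/key
        "postcode_prefix": record["postcode"][:3],  # coarsened location
        "energy_band": "high" if record["daily_kwh"] > 15 else "normal",
    }

print(pseudonymise({"resident_id": "resident-42", "postcode": "EH1 2AB", "daily_kwh": 11.3}))
```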

References

  1. 1.0 1.1 1.2 Lua error in package.lua at line 80: module 'strict' not found.
  2. 2.0 2.1 2.2 2.3 2.4 Lua error in package.lua at line 80: module 'strict' not found.
  3. Lua error in package.lua at line 80: module 'strict' not found.
  4. Lua error in package.lua at line 80: module 'strict' not found.
  5. 5.0 5.1 5.2 Lua error in package.lua at line 80: module 'strict' not found.
  6. 6.0 6.1 Lua error in package.lua at line 80: module 'strict' not found.
  7. 7.0 7.1 Lua error in package.lua at line 80: module 'strict' not found.
  8. 8.0 8.1 Lua error in package.lua at line 80: module 'strict' not found.
  9. 9.0 9.1 Lua error in package.lua at line 80: module 'strict' not found.
  10. 10.00 10.01 10.02 10.03 10.04 10.05 10.06 10.07 10.08 10.09 10.10 10.11 10.12 10.13 10.14 10.15 10.16 10.17 10.18 10.19 10.20 10.21 10.22 10.23 10.24 10.25 10.26 10.27 10.28 10.29 10.30 Lua error in package.lua at line 80: module 'strict' not found.
  11. 11.0 11.1 11.2 11.3 Lua error in package.lua at line 80: module 'strict' not found.
  12. 12.00 12.01 12.02 12.03 12.04 12.05 12.06 12.07 12.08 12.09 12.10 Lua error in package.lua at line 80: module 'strict' not found.
  13. Lua error in package.lua at line 80: module 'strict' not found.
  14. Lua error in package.lua at line 80: module 'strict' not found.
  15. 15.0 15.1 15.2 Lua error in package.lua at line 80: module 'strict' not found.
  16. 16.0 16.1 Lua error in package.lua at line 80: module 'strict' not found.
  17. 17.0 17.1 17.2 17.3 17.4 Lua error in package.lua at line 80: module 'strict' not found.
  18. 18.00 18.01 18.02 18.03 18.04 18.05 18.06 18.07 18.08 18.09 18.10 18.11 18.12 Lua error in package.lua at line 80: module 'strict' not found.
  19. 19.0 19.1 19.2 Lua error in package.lua at line 80: module 'strict' not found.
  20. Lua error in package.lua at line 80: module 'strict' not found.
  21. Lua error in package.lua at line 80: module 'strict' not found.
  22. O'Hara v Chief Constable of the RUC [1997] 2 WLR 1 (House of Lords).
  23. 23.0 23.1 GILLAN AND QUINTON v. THE UNITED KINGDOM (The European Court of Human Rights 2010-01-12). Text
  24. Draper v. United States, 358 U.S. 307 (US Supreme Court 1959-01-26).
  25. United States v Sokolow, 490 U.S. 1 (US Supreme Court 1989-04-03).
  26. 26.0 26.1 Lua error in package.lua at line 80: module 'strict' not found.
  27. Floyd v. City of New York, 959 F. Supp. 2d 540, 562 (United States District Court, S.D.N.Y. 2013).
  28. Lua error in package.lua at line 80: module 'strict' not found.
  29. Lua error in package.lua at line 80: module 'strict' not found.
  30. Lua error in package.lua at line 80: module 'strict' not found.
  31. 31.0 31.1 31.2 31.3 31.4 Lua error in package.lua at line 80: module 'strict' not found.
  32. Lua error in package.lua at line 80: module 'strict' not found.
  33. Lua error in package.lua at line 80: module 'strict' not found.
  34. 34.0 34.1 Lua error in package.lua at line 80: module 'strict' not found.
  35. Lua error in package.lua at line 80: module 'strict' not found.
  36. Lua error in package.lua at line 80: module 'strict' not found.
  37. Lua error in package.lua at line 80: module 'strict' not found.
  38. 38.0 38.1 Lua error in package.lua at line 80: module 'strict' not found.
  39. Lua error in package.lua at line 80: module 'strict' not found.
  40. Lua error in package.lua at line 80: module 'strict' not found.
  41. Lua error in package.lua at line 80: module 'strict' not found.
  42. 42.0 42.1 Lua error in package.lua at line 80: module 'strict' not found.
  43. 43.0 43.1 43.2 Lua error in package.lua at line 80: module 'strict' not found.
  44. 44.0 44.1 Lua error in package.lua at line 80: module 'strict' not found.
  45. Lua error in package.lua at line 80: module 'strict' not found.
  46. S. AND MARPER V. THE UNITED KINGDOM (The European Court of Human Rights 2008-12-04). Text
  47. LIBERTY AND OTHERS v. THE UNITED KINGDOM (The European Court of Human Rights 2008-07-01). Text
  48. Lua error in package.lua at line 80: module 'strict' not found.
  49. Lua error in package.lua at line 80: module 'strict' not found.
  50. 50.0 50.1 50.2 Lua error in package.lua at line 80: module 'strict' not found.
  51. Lua error in package.lua at line 80: module 'strict' not found.
  52. 52.0 52.1 Lua error in package.lua at line 80: module 'strict' not found.
  53. Lua error in package.lua at line 80: module 'strict' not found.
  54. 54.0 54.1 Lua error in package.lua at line 80: module 'strict' not found.
  55. BIG BROTHER WATCH AND OTHERS V. THE UNITED KINGDOM (The European Court of Human Rights 2013-07-04). Text
  56. CAMPBELL V MGN LTD (House of Lords 2004). Text
  57. Vidal-Hall v Google Inc (England and Wales Court of Appeal 2015).
  58. 58.0 58.1 58.2 58.3 Lua error in package.lua at line 80: module 'strict' not found.
  59. 59.0 59.1 59.2 Lua error in package.lua at line 80: module 'strict' not found.
  60. Lua error in package.lua at line 80: module 'strict' not found.
  61. Lua error in package.lua at line 80: module 'strict' not found.
  62. Lua error in package.lua at line 80: module 'strict' not found.
  63. Lua error in package.lua at line 80: module 'strict' not found.
  64. 64.0 64.1 64.2 Lua error in package.lua at line 80: module 'strict' not found.
  65. Lua error in package.lua at line 80: module 'strict' not found.
  66. Lua error in package.lua at line 80: module 'strict' not found.
  67. Lua error in package.lua at line 80: module 'strict' not found.
  68. Lua error in package.lua at line 80: module 'strict' not found.
  69. Lua error in package.lua at line 80: module 'strict' not found.
  70. Lua error in package.lua at line 80: module 'strict' not found.
  71. Lua error in package.lua at line 80: module 'strict' not found.
  72. 72.0 72.1 Lua error in package.lua at line 80: module 'strict' not found.
  73. Lua error in package.lua at line 80: module 'strict' not found.
  74. Lua error in package.lua at line 80: module 'strict' not found.