Technology is not neutral, as facial recognition algorithms and predictive policing have shown us. Algorithms discriminate by design, reflecting and reinforcing pre-existing biases.
Susan Orr, The College at Brockport, State University of New York and James Johnson, University of Rochester
Voting at home is safe from fraud and disease, but gives up a key advantage of in-person voting at official polling places: a secure, safe environment where everyone can cast their ballot secretly.
A national health plan that uses data to assess individual risk and control disease outbreaks would have created less disruption than the current coronavirus pandemic response.
Canada’s COVID Alert app may be privacy-safe, but the government has failed to release any information about what effect it expects the app to have on COVID-19 transmission.
Taming Big Tech’s market power requires addressing their monopoly over user-related data collection instead of employing traditional antitrust measures such as breaking up the firms.
In response to the COVID-19 pandemic, more than 50 countries have developed tracing applications to help alert citizens and authorities when outbreaks occur. But the process is anything but simple.
National MP Hamish Walker and political powerbroker Michelle Boag have admitted leaking confidential patient information – but does that make them legally liable too?
In a country marked by systemic discrimination and continued social marginalisation, particular consideration needs to be given to the measures being used to contain the spread of COVID-19.