Educational technology used in schools, including artificial intelligence, digital surveillance, and content filters, threatens the civil rights of students with disabilities, LGBTQ students, and students of colour, according to a report released on Wednesday.
The report, by the Centre for Democracy and Technology, a nonprofit, nonpartisan organisation that promotes civil rights in the digital age, found that some technology schools use to block explicit adult content and to flag students at risk of self-harm or of harming others has also had serious consequences for already vulnerable students.
The study is based on a comprehensive national online survey of how schools, students, and teachers use technology. This summer, the Centre for Democracy and Technology polled 1,029 ninth through twelfth graders, 1,018 parents of sixth through twelfth graders, and 1,005 sixth through twelfth grade teachers. The organisation said the sample was weighted to be “nationally representative.”
The surveys show widespread confusion about the use of artificial intelligence in the classroom, according to the Centre for Democracy and Technology, and the majority of parents, students, and teachers say they want more information and training on its proper use.
Report describes risks of education technology for students with disabilities, LGBTQ students, and students of colour
The study describes how students with disabilities and LGBTQ students can be harmed by school technology, often unintentionally. According to the Centre for Democracy and Technology, these harms fall most heavily on already vulnerable students.
Students have reported LGBTQ classmates being outed through digital monitoring, a potentially traumatic event in which their sexual identity or orientation is shared without their knowledge.
Students with disabilities reported using artificial intelligence most frequently, and those same students were also more likely to report being punished for using it.
One-third of teachers say content pertaining to race or the LGBTQ community is more likely to be filtered out, which the Centre says “amounts to a digital book ban.”
Some schools have encountered resistance because of how they used technology. A school district in Texas was forced to remove a filter that was blocking the website of the LGBTQ youth-focused Trevor Project after the American Civil Liberties Union filed a lawsuit against the district.
According to Elizabeth Laird, director of equity in civic technology for the Centre for Democracy and Technology, “There are some students who should already be protected by existing civil rights laws, but they are still experiencing disproportionate and negative consequences because of the use of this education data and technology.”
Laird said the survey shows that schools have not fully grappled with how education technology affects the promise of an equitable education, and that civil rights and technology are still treated as separate issues. That is despite the fact that schools frequently have dedicated staff and established practices to ensure that students’ civil rights are protected.
Those two areas have been kept apart, Laird said, and the moment has come to bring them together.
Civil rights organisations demand more federal direction
The poll reveals that student and parent worries about data and privacy have grown over the past year, even though schools have been doing more outreach than in previous years. As in survey data gathered in prior years, parents and students say they need more information about the technologies that schools choose and use.
Civil rights organisations, including the ACLU, the American Association of School Librarians, the American Library Association, the Disability Rights Education and Defense Fund, and the Electronic Frontier Foundation, signed a letter accompanying the Centre for Democracy and Technology’s report, urging the federal Department of Education to issue additional guidance. The request follows the White House’s release of a Blueprint for an AI Bill of Rights in October of last year.
The letter states that “a year after the Blueprint’s publication, the need for education-related protections remains and, if anything, is even more urgent given the explosive rise of generative AI.”
In the poll, 57% of teachers said they had received no formal training in AI, while 24% said they had been trained to spot improper use of AI.
The poll also revealed that 58% of students had used ChatGPT or other generative AI programmes, and 19% said they had turned in a paper produced with AI. Students say they use AI for homework, personal concerns involving family and friends, and mental health issues.
The use of generative AI is especially prevalent among students with disabilities: 72% reported using the technology. Parents of children with disabilities are also more likely to report that their kids have been disciplined for using AI. The report describes these higher disciplinary rates among marginalised groups as “particularly worrisome.”
These students and their parents are also more concerned than other groups about the privacy and security of the data that schools collect and store, with 71% of students with disabilities and 79% of their parents expressing that concern.
The Centre for Democracy and Technology notes that licensed special education teachers are more likely to discuss student privacy and equity concerns around technology with students and their parents, which it calls a “promising practice that could be extended to the rest of the school population.”
Long arm of school surveillance
The civil rights concerns may extend beyond the school’s boundaries. Some students, especially students of colour and those from lower-income households, are more likely to rely on school-issued devices at home, which means surveillance and tracking can follow them home.
For those pupils, “their learning environment is quite different than those who can essentially opt out of some of this tracking,” Laird said.
School-issued devices may also scan and monitor the personal phones that students plug into them to charge: 51% of students who had charged a personal device on their school device said that school software began downloading content to, or syncing with, the personal device.
According to 88% of teachers, schools have continued using monitoring technology since the pandemic era of remote learning. The White House’s blueprint prioritised preventing the unauthorised monitoring of students. According to the Centre for Democracy and Technology, the use of surveillance technology can create a variety of problems for students with disabilities and LGBTQ students.
Students with disabilities and LGBTQ students are more likely to report being disciplined because of technology that tracks them. According to Laird, students are sometimes punished for whatever the technology flagged, and other times for how they reacted when they were flagged.
Schools occasionally give law enforcement access to their data, even outside school hours. Fifty-three percent of special education teachers and 46% of teachers in Title I schools reported that their schools share data with law enforcement after hours. In an interview with the Centre for Democracy and Technology, the parent of a ninth grader said that when the school’s monitoring software flagged something on her child’s tablet, law enforcement was contacted before she was even informed. Her son was questioned for an hour without her permission.
“All of those things may lead to the removal of students from the classroom and the loss of instructional time,” Laird said. If students with disabilities are being flagged and intervened with at disproportionate rates, she added, it could also infringe on their right to a free and appropriate public education, which is protected by laws preventing discrimination on the basis of disability.