The Mozilla Foundation has released a report looking back at Internet-related events in 2020. Reviewing the good and bad changes that took place on the Internet, Mozilla says that in a year that was difficult for everyone in the world, the Internet helped us more than ever and also hurt us more than ever.
In 2020, demand for Internet services grew as COVID-19 lockdowns kept people at home around the world. Roughly half of the world’s population has Internet access and is actively using it for remote work and online classes; the other half cannot connect at all. There are also countries that restrict their citizens’ free access to the Internet. In countries with limited connectivity, millions of people try to reach information online using VPNs. However, because anonymous VPN use is also frequently associated with cybercrime, Internet access is routinely blocked in certain parts of the world.
Mozilla points out that reducing the number of people who cannot connect to the Internet will require discussion of ways to provide access inexpensively, such as public investment in infrastructure connecting rural areas with cities and the expansion of public access points such as schools, universities, and libraries. To improve the quality of the Internet, Mozilla also argues for supporting not only facilities owned by governments and large private Internet companies, but also networks operated by local communities. To fix the problems exposed in 2020, Mozilla says, people must look at both the good and the bad of the Internet.
Next is the Internet and racism. Technology has long been assumed to treat people equally regardless of race, but racial inequality built into technology has become increasingly visible. Blackbird, a web browser launched in 2008 under the slogan of a browser for African Americans, used a custom search to display results optimized for Black users. With more than 300,000 downloads in the months after its release, Blackbird showed the demand for such search results.
In addition, Safiya U. Noble, author of Algorithms of Oppression, a best-selling book about the racism embedded in search engine algorithms, has said she was shocked by the amount of pornography that appeared when she searched Google for “black girls.” Google later adjusted those search results but did not comment on changing its search algorithm. A report in June 2020 found that searches for terms such as “Latina girls” and “Asian girls” still produced autocomplete suggestions related to adult content, and critics point out that Google’s algorithms continue to return racist search results in many cases.
Racist technology is not limited to search engine algorithms. Researchers have reported, for example, that AI systems used in the U.S. criminal justice system fail to recognize Black faces, and that speech recognition algorithms perform poorly on Black voices. In June 2020, IBM also announced it was withdrawing from facial recognition development, out of concern that such technology could be used to promote racial discrimination.
Behind this racism in technology, Mozilla asks, is the question of who the assumed user of these systems was in the first place. Ford Foundation technology fellow Matt Mitchell argues that white-centered technology keeps getting built because technology companies fail to create an environment in which people of color and women can work.
Next is workers’ rights. More than 50 million gig economy workers around the world work through online platforms such as Uber. Mozilla reports that these workers are frequently deprived of the right to access information about their own work.
When an Uber driver was assaulted by a passenger in 2015, he asked Uber for information about the passenger but was ignored for weeks. Even though he worked as a self-employed driver, it was Uber that controlled the information about who rode in his vehicle, which infuriated him. He later founded the App Drivers &amp; Couriers Union and filed a lawsuit demanding that Uber and other ride-hailing app operators guarantee minimum wages and holiday pay.
That lawsuit revealed that Uber maintained hidden parameters for evaluating drivers, although how those parameters affected a driver’s work remained unknown. Mozilla points out that in cases like this, the right to obtain digital information and workers’ rights are closely linked.
WeClock, an application developed to support gig economy workers’ right to information, lets workers record their working hours, distance traveled, and wages and store them as data. In October 2020, data collected by 213 WeClock users revealed that their wages had been reduced without the workers being notified.
According to Mozilla, working conditions in the gig economy are improving. In September 2020, a court ruled that the Barcelona-based food delivery app Glovo must treat its couriers as employees rather than freelancers. And on November 3, 2020, voters in California, home to several ride-hailing services, passed a measure requiring operators to expand benefits for drivers who work through their apps. As a result, services such as Uber, Lyft, and DoorDash have been announcing benefits for California drivers.
Next is transparency. The events of 2020 revealed just how much influence social media services wield, and the companies that operate them came under pressure to be more transparent. After the storming of the U.S. Capitol in January 2021, Facebook and Twitter suspended the accounts of former President Trump. Some supported this response, while others expressed concern that social media companies were hindering the free expression of opinion.
In October 2020, Facebook posts about End SARS, the movement protesting Nigeria’s police Special Anti-Robbery Squad, were labeled as false information. Facebook explained that its systems had confused End SARS with SARS-CoV-2, the virus that causes COVID-19. However, the company did not explain any fundamental measures to prevent the situation from recurring.
In addition, a survey using the Citizen Browser Project, a data-collection tool developed by The Markup, revealed that Facebook continues to recommend political groups despite announcing that it had stopped such recommendations. Mozilla says it is difficult to extend this kind of monitoring to every platform and language with such tools, but that monitoring helps in demanding transparency from the companies.