As we move into the 2020s, it is clear that trust – in products and services, in technology, in privacy, and so on – is becoming ever more relevant and business-critical. The Internet industry has a vested interest in making sure that the Internet is a place that people can trust – a place where all stakeholders in both society and business feel safe to be and to interact. As our world becomes more dependent on critical digital services, the industry has become more aware of the importance of going the extra mile to help make the Internet a better place to be. With this in mind, in this issue of dotmagazine we explore some of the industry-based and multi-stakeholder initiatives that are working in various sectors to clean up the neighborhood, as Michele Neylon from Blacknight puts it.
So, what can the industry do to build trust in the Internet? One of the instruments that the industry has at its disposal is self-regulation. In some sectors, it is actively leveraged to improve the quality and legality of services; in others, it is still being discovered and explored. Examples of such initiatives include developing recommendations and best practices, and then bringing industry players – from one sector or from along a particular value chain – together to implement these, and sometimes even to develop them further into binding rules.
Now, almost undoubtedly, everyone reading this will have at one time or another come across some of the less optimal practices from the Internet’s Wild West. Inboxes full of spam and websites with invasive pop-up ads concealing the content beneath them are areas where the industry learnt a perhaps hard, but certainly valuable, lesson: bad user experience does not induce customers to spend money on your products. Recognition of this is perhaps the foundation of good practice: security issues, privacy concerns, and legally questionable – or downright illegal – content and offers lead to a negative user experience and the resulting loss of these users’ trust.
When your user experience is perceived as being bad, you will probably have issues doing business online in the long run. Self-regulatory initiatives can not only help industry players themselves to ensure they are working according to best practices rather than less appropriate ones; they can also help them to guide their customers towards acceptable behavior online.
One example of such an initiative is the Certified Senders Alliance (CSA). It brings together the players along the email marketing value chain and provides guidance on best practices for building and maintaining a good, trusted relationship with customers through email, and on how to comply with technical rules and legal regulations like the General Data Protection Regulation (GDPR). The players that the CSA brings together are the email service providers (who send emails on behalf of brands) and the Internet service providers and mailbox providers, who work hard to protect their customers against spam. The CSA whitelist helps legally and technically compliant brand messages reach customers, providing brands and ESPs with guidance and certification to help them pass increasingly strict spam filters and improve their deliverability through appropriate behavior.
Another example of an industry initiative of this kind is the Acceptable Ads Standard (AAS), which illustrates how the process develops when the industry acknowledges that there is an issue and itself creates mechanisms to address it. In this case, the standard was developed because “unacceptable” ads or pop-ups were perceived as bad user experience, and the industry – at least the good players within this ecosystem – acknowledged that the issue needed to be addressed. One way of doing so is by establishing a standard that works for both users and the industry. The Acceptable Ads Standard was the result, and an ecosystem is now growing up around it. This involves a broad multi-stakeholder approach, which includes not only all the different parts of this industry, but also civil society. This means the average Internet user is represented and can thus contribute to the understanding of what is acceptable and what is not, as well as ideas for how to handle it.
Now, so far, we have been looking at developing trust among human beings – but trust can also be engendered through technology. An example is strong encryption: if I can use it, I don’t need to 'trust' that my data is safe, because I then simply know that it’s safe. The same goes for authentication. When I receive an authenticated email from someone, I don’t have to 'trust' that they are who they say they are, because I know it for sure. In these instances, trust is trumped by knowledge.
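The difference between trusting and knowing can be sketched in a few lines of code. The following minimal Python example is only an illustration of the principle – real email authentication (e.g. DKIM) uses public-key signatures with keys published in DNS, not a shared secret as here – but it shows how a recipient verifies a message rather than trusting the sender's word:

```python
import hmac
import hashlib

def sign(message: bytes, key: bytes) -> str:
    """Compute a message authentication code over the message."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, key: bytes, signature: str) -> bool:
    """Recompute the code and compare in constant time."""
    return hmac.compare_digest(sign(message, key), signature)

key = b"shared-secret"                      # illustrative key only
msg = b"Hello from newsletter@example.com"
sig = sign(msg, key)

# The recipient doesn't have to trust the message is unaltered -
# verification either succeeds or fails, deterministically.
print(verify(msg, key, sig))         # authentic message
print(verify(b"tampered", key, sig)) # altered message is rejected
```

The point is not the specific algorithm but the shift it enables: a verified signature replaces an act of trust with a piece of knowledge.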
Again, looking at an email-related initiative, BIMI (Brand Indicators for Message Identification) allows brands to publish their logos in such a way that the logo will be displayed within the email client (not just in the email itself once opened), as long as the email is an authenticated, legitimate email from that brand (and therefore not, for example, a malicious phishing email). This allows brands to approach customers in a trusted manner, and rewards brands through increased awareness of their logo. Such initiatives incentivize best practices with a reward for the sender – a win/win situation for the industry.
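Technically, BIMI works by having the brand publish a small DNS TXT record (per the BIMI specification, at a name like `default._bimi.<domain>`) containing tag-value pairs such as `v=BIMI1; l=<logo URL>`, which the receiving mail system looks up after the message passes authentication. As a rough sketch of that last step, the following parser (the function name is hypothetical; the tag syntax follows the BIMI draft) splits such a record into its tags:

```python
def parse_bimi_record(record: str) -> dict:
    """Split a BIMI-style TXT record ("v=BIMI1; l=https://...")
    into a dict of tag/value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, value = part.split("=", 1)
            tags[key.strip()] = value.strip()
    return tags

record = "v=BIMI1; l=https://example.com/brand-logo.svg"
parsed = parse_bimi_record(record)
print(parsed["v"])  # version tag
print(parsed["l"])  # location of the brand's logo
```

In a real deployment the mailbox provider would only fetch and display the logo at the `l=` URL once DMARC authentication of the message has succeeded – which is exactly the incentive structure described above.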
On the technical level, the foundation for safe interaction on the Internet is encryption. The eco Association and the Internet Society – who in November 2019 signed an MoU with the intention of collaborating on a range of Internet topics – both strongly advocate the increased use of encryption to make communications and data more secure online. At the same time, governments around the world have been working to weaken encryption technology so that it can be broken for surveillance purposes, something which would have ramifications for the security of communications on the Internet, as Rinalia Abdul Rahim from ISOC explains.
Furthermore, as eco Association Board Member Klaus Landefeld points out, there are also governmental initiatives to circumvent encryption – such as the British secret service’s proposed “Ghost Protocol” – which have the potential to undermine trust in the confidentiality of electronic communication. Rather than allowing such drastic measures, Landefeld argues that the industry needs to find better ways to cooperate with law enforcement agencies to enable more efficient prosecution of the perpetrators of the cybercrime that affects us all at some time or another.
Perhaps to some ears in the industry, this might sound like foreign territory but, in fact, the industry has long been cooperating with law enforcement agencies in the area of illegal content. The handling of illegal content is a topic which has been gaining traction recently in a range of industry contexts, including ICANN and the Internet Governance Forum (IGF). This brings me to two final self-regulatory initiatives: the INHOPE network of complaints hotlines, represented here by the eco Complaints Office, and the Framework to Address Abuse.
The eco Complaints Office advocates for the take-down of illegal content rather than web blocks, in order to get the content offline fast and effectively. To achieve take-down – especially of content like child sexual abuse material (CSAM) – the Complaints Office makes it possible for any Internet user to make a report of offensive or illegal content. It then works closely with hosting providers and partner hotlines in the INHOPE network to have the content, if it is found to be illegal (after a legal assessment by the Complaints Office lawyers) or deemed contrary to the provider’s terms and conditions, taken offline. In the case of illegal content, the relevant law enforcement agency is also brought into the loop to ensure that not only are the offenders brought to justice, but also that a clear message about the consequences of publishing illegal content (e.g. hate speech) is sent.
Finally, we come to the new Framework to Address Abuse. First published in October 2019, it already has more than 50 companies as signatories. Here again, the industry acknowledges that there are issues out there that need to be addressed – issues where, in the past, companies might not have wanted to take responsibility, or might not have seen it as within their remit. But things are changing. Industry players now see themselves as playing a role in addressing and solving the problem of illegal content.
As Michele Neylon explains, there are a number of mechanisms at hand that can be employed in certain cases to stop or prevent abuse. CSAM is a good example, because there is simply no grey area, no room for interpretation or debate. What is illegal here is almost universally illegal – there are similar laws around the world, regardless of jurisdiction. For Michele, it is clear that he has to take action on it, whether as a hosting provider or in his role as a registrar. Coming back to the point that it is all about doing business online, he says – and I think he is absolutely right on this – you have to look after your neighborhood. If you want to do business online, you don’t want a rat-infested, crime-ridden neighborhood where neither you nor your customers feel safe. We all need to be invested in making the Internet a good place and a safe place to be.
Lars Steffen is Director International at eco – Association of the Internet Industry (international.eco.de), the largest Internet industry association in Europe. At eco, he coordinates all international activities of the association and takes care of the members from the domain name industry and the blockchain community. He further represents the industry as Community Outreach Co-Coordinator of the Universal Acceptance Steering Group at the Internet Corporation for Assigned Names and Numbers (icann.org), to facilitate the support of internationalized domain names and email address internationalization.