Wikimedia Conference 2011: Cultural Heritage, Commons and lots of Data. Pt. 2

November 8, 2011

This is part 2 of my report on the Wikimedia Conference 2011. The first part can be found here.

Teun Lucassen – Wikipedia is reliable [Citation needed]

Almost all the sessions in the cultural heritage track were presentations about a specific project. This was interesting, but not much of it was new to me. I therefore chose to go to a presentation by Teun Lucassen (@tlucassen) about how users experience the reliability of Wikipedia. Lucassen is a PhD student at the faculty of Behavioral Sciences at the University of Twente. The first question Lucassen asks in his research is whether Wikipedia users need help deciding if an article is reliable. The problem with Wikipedia is that it is hard to find out who the authors of an article are and whether they can be considered reliable sources. Throughout the history of Wikipedia, several attempts have been made to help the user decide. Lucassen first showed WikiViz, a data visualization tool developed by IBM. The tool adds a bar to the article showing a number of statistics, for example that it has been edited 87 times by 23 users. The problem with this kind of information is: what does it say about reliability? Especially when you realize that most of the edits are made by automated bots. Lucassen said that he always uses this tool as a bad example.
Here, however, I do not fully agree with him. His research reminded me of my own research on Wikipedia in the Digital Methods class last year, in which I analyzed how different articles were built. This showed that most articles have been created by several different users, but that the majority of the text was written by only one or two people; most of the other edits made by human editors were grammatical and linguistic improvements. This is a problem for an encyclopedia whose goal is to present a neutral point of view. Showing how many people are actually responsible for the text can therefore be a useful way to give an indication of an article's reliability. My full report can be found on my blog.
Lucassen studied three methods that could help the user decide whether an article is reliable. The first is a user-based rating system, which is currently implemented in the English-language Wikipedia. The second was a simple algorithm that derives a rating from the number of edits and editors an article has. The third is what Lucassen calls an 'Adaptive Neural Network Rating system', which supposedly uses an algorithm too complex for the user to understand; Lucassen did not tell his test group that this system was in fact complete nonsense. He gave the test group the same articles to read with different ratings in order to see how this would influence their idea of trustworthiness. His results showed that people considered an article less reliable when the user-based rating system was used: people did not trust the opinion of other people, or thought that there were not enough votes. The simple algorithm made people more positive about the article, even though all test users agreed that the mark it produces cannot say much useful about the article. The third, made-up algorithm showed both positive and negative results. This can be explained by a phenomenon called 'over-reliance': people start making their own assumptions about what the algorithm means. It was funny to see how people had started to believe an algorithm that was completely made up.
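Lucassen did not spell out the formula behind the 'simple algorithm', so the snippet below is only a minimal sketch of what an edits-and-editors heuristic of that kind might look like; the caps and weights are illustrative guesses, not his actual formula.

```python
def naive_reliability_score(num_edits, num_editors, max_edits=500, max_editors=100):
    """Toy heuristic in the spirit of the 'simple algorithm':
    scale edit and editor counts to [0, 1] and average them into a 1-10 mark.
    The caps (500 edits, 100 editors) are made-up illustration values."""
    edit_score = min(num_edits / max_edits, 1.0)
    editor_score = min(num_editors / max_editors, 1.0)
    return round(1 + 9 * (edit_score + editor_score) / 2, 1)

# The WikiViz example from the talk: 87 edits by 23 users
print(naive_reliability_score(87, 23))  # -> 2.8
```

Exactly as the test users noted, such a mark mostly reflects how busy an article is, not how trustworthy it is.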
Lucassen concludes that, because of the varying quality of Wikipedia, helping users can be a good strategy to make Wikipedia more reliable, but that there are many pitfalls in how to achieve this. He proposes a user-based rating system in which the voter has to add a short piece of text explaining why he gave that grade. I found Lucassen's presentation extremely interesting and I think this kind of research can definitely be combined with the research done in the Digital Methods course of the MA New Media at the University of Amsterdam. More information about Lucassen's research can be found on his blog.

Ronald Beelaard – The lifecycle of Wikipedia

Ronald Beelaard did extensive research into the lifecycle of Wikipedia editors. The reason for this was an article in the Wall Street Journal which concluded that editors were leaving Wikipedia on a larger scale than ever. Beelaard started his own research to find out how many Wikipedia editors 'die' each month and how many are 'born'. He also took into account the phenomenon of the 'Wikibreak', where editors stop editing Wikipedia for a while and come back later. Beelaard showed a large bulk of numbers, which were not always easy to follow, and concluded that the dropout rate is only a fraction of the figures mentioned in the Wall Street Journal. It is true, however, that fewer people start editing Wikipedia and that young editors 'die' earlier than the old ones. The total community is shrinking, but the seniors are more vital than ever.

Erik Zachte – Wikipedia, still a world to win

The last presentation of the day was given by Erik Zachte (@infodisiac), a data analyst. He researched Wikipedia's mission to embrace the whole world. He showed in a graph (inspired by Hans Rosling's Gapminder) that Wikipedia is growing in all languages, but that some of them are relatively small compared to the number of people who speak the language. The English Wikipedia is of course the biggest, but the Arabic and Hindi Wikipedias are still relatively small, despite the millions of people who speak these languages. This is partly because internet penetration in these countries is not as high as in Western countries; it is also the reason why the Scandinavian Wikipedias are doing so extremely well. But this is not the only reason. Zachte showed, for example, that the English-language Wikipedia is edited by a very high number of people from India. He also showed that most edits come from Europe, which can be explained by the large number of languages spoken there. When a big disaster or worldwide event happens, a Wikipedia page about it appears in all the different languages.
There is a strong correlation between the number of inhabitants of a country and the size of the Wikipedia in that language. By overlaying a population-density map with a map of the size of each Wikipedia, Zachte revealed interesting outliers; a nice piece of data visualization. Zachte ended his presentation by addressing the rise of mobile internet. In African countries, not many people own a desktop computer with an internet connection. There is, however, a big rise in the use of smartphones. Zachte therefore believes that the Wikimedia Foundation should make its site easier to edit from these devices in order to grow Wikipedia further.
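To illustrate the kind of comparison behind that claim, the sketch below computes a Pearson correlation between speaker numbers and article counts. The figures are rough placeholders for a handful of languages, not Zachte's actual dataset.

```python
from math import sqrt

# Placeholder figures: speakers in millions, articles in thousands.
# These are illustrative orders of magnitude, NOT Zachte's data.
speakers = {"English": 1500, "Hindi": 600, "Arabic": 400, "Dutch": 24, "Swedish": 10}
articles = {"English": 3800, "Hindi": 100, "Arabic": 150, "Dutch": 980, "Swedish": 410}

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

langs = sorted(speakers)
r = pearson([speakers[l] for l in langs], [articles[l] for l in langs])
print(f"speakers vs. article count: r = {r:.2f}")
# Languages far off the trend line (many speakers, few articles) are the
# outliers that Zachte's map overlay makes visible.
```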

In the end I can look back on a very well-organized event with lots of interesting presentations. The cultural heritage track gave a nice overview of what is currently happening in the field and how open content and standards can help spread that content. The Wiki-World track was, however, the most fascinating for me. It reminded me of all the research done last year in the Digital Methods and data visualization classes of my MA New Media studies, and of the fact that Wikipedia and all its data are such an interesting object of study. I want to thank the organizers for a great day and I hope to be able to be part of it next year.

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Netherlands license.


Wikimedia Conference 2011: Cultural Heritage, Commons and lots of Data. Pt. 1

November 8, 2011

On Saturday 5 November, the Wikimedia foundation held a conference in Utrecht. I took the opportunity to go there and write this report about it. Because of its size I decided to split it into two parts. The first is mainly about cultural heritage and Creative Commons; the second part is about Wikipedia itself.

The Wikimedia foundation is a non-profit organization behind several open source projects dedicated to bringing free content to the world. Its most famous project is of course Wikipedia itself, but there are several other projects that deserve attention, like Wiktionary and Wikimedia Commons, which was discussed a lot today.

The conference was opened with a speech by a man who introduced himself as the CEO of the Wikimedia foundation and talked about the commercial successes they had achieved: selling ad space on Wikipedia pages and receiving sponsor money from, for example, Neelie Kroes in order to keep her page clean. During his speech it soon became clear that this was all part of a comedy act about exactly everything that Wikimedia is not.

Jill Cousins – Europeana without Walls

After this little piece of comedy theater it was time for Jill Cousins, executive director of the Europeana project, to open the conference with a keynote. Cousins presented the current status of the project and its relation to the Wikimedia foundation. Europeana's goal is to digitize all of Europe's heritage and make it publicly available. Europeana aggregates objects and their metadata from institutions all over Europe. Here Cousins addressed the copyright problem. The goal is to release all the metadata collected by Europeana under a Creative Commons license that allows commercial use by other parties (CC0). The institutions are quite anxious about this, because they fear losing control of their material and losing income if others can use their content for free. However, as Cousins mentioned, without the possibility of commercial use the objects can barely be used, because the material cannot be embedded on sites with commercial activities, for example sites that run ads. It also means that the objects cannot be used in Wikipedia articles, since Wikipedia's regulations prescribe that media content has to be openly available, including for commercial use.
Europeana realizes that most of its objects are not found directly on its own portal website, but on other sites that embed the content, so being able to work together with other sites is vital for the project.
Another issue Europeana has is the lack of good metadata (something I also described in my MA thesis, which can be found here). In order to make full use of the semantic possibilities of the web, especially with more than 15 million objects, good metadata is essential. Europeana has recently launched several projects and a handbook to encourage the different institutions to fill in their metadata in a correct and unified way. Here Cousins also noted that no matter what the status of the objects themselves is, the metadata should always be in the public domain.

After the plenary opening, visitors could choose between three different 'tracks'. The first was purely focused on cultural heritage, the second was about the world and data around the different Wikimedia projects, and the third, the 'Incore Wikimedia track', consisted of various technical sessions for Wikipedia editors. Because of my focus on digital heritage and Europeana, I chose the first.

Maarten Dammers – GLAMWiki Overview

The first speaker was Maarten Dammers (@mdammers), a very active Wikimedia volunteer. He presented the GLAMwiki project, which stands for Galleries, Libraries, Archives, Museums & Wikimedia. The goal of this project is to build a bridge between these cultural institutions and Wikimedia. To achieve this, several different projects were started. The first project Maarten talked about was Wiki loves Art, in which users were asked to go to museums, take pictures of different art objects and upload them to the Wikimedia Commons image bank. Because these pictures are released under a CC-BY-SA license, which means commercial use is allowed, they can be embedded in Wikipedia pages about the artist or the object itself. By crowdsourcing these images and making a contest out of it, the image bank quickly filled with thousands of pictures. Other Wikipedia users started to add metadata to the images and to place them in articles, which greatly enriched the Wikipedia pages.
In the second part of the presentation, Lodewijk Gelauff (@effeietsanders) joined Maarten to talk about Wiki loves Monuments, the successor of the Wiki loves Art project. Where the Art project was limited to the Netherlands, the Monuments project focused on monuments from all over Europe. By the time it finished, the project had resulted in 165,000 photos in the Wikimedia Commons image bank.

Maarten Zeinstra – The value of putting photo collections in the public domain

After a short break, Knowledgeland employee Maarten Zeinstra (@mzeinstra) presented the results of his research into the benefits for institutions of putting their (photo) collections in the public domain. Maarten analyzed 1,200 photos released by the Dutch National Archive, all pictures of Dutch political party members throughout the history of the Netherlands. When these photos were put in the Wikimedia Commons image bank, Wikipedia users quickly started adding them to Wikipedia articles. As a result, the pictures from the National Archive gained a lot more attention and new metadata was added automatically. To analyze this, Maarten made use of different tools created by members of the Wikimedia foundation. Several of these tools can be very helpful when analyzing Wikipedia, also on an academic level.
What was interesting about this presentation was that the analysis actually showed to what extent the materials put in the image bank are used. This information is extremely helpful when institutions are in doubt about whether to put their collection in the public domain. Maarten's research also showed that materials are more likely to be used when a specific, well-defined set is chosen. He compared the collection of the National Archive with a much bigger, uncategorized one from the Deutsche Fotothek. Of the 1,200 photos from the National Archive, 55% were used in Wikipedia articles in various languages; of the Deutsche Fotothek collection, only around 3.5% were used. The main reason for this is that an uncategorized collection requires more effort from Wikipedia editors to sort out. The full report can be found on the website of Images for the Future.
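As a rough illustration of how such usage percentages can be measured, the sketch below asks the Wikimedia Commons GlobalUsage API which wiki pages embed a given file, and then computes the share of a file list that is in use. It is only a sketch under those assumptions; the file list is a placeholder and this is not necessarily the tooling Zeinstra used.

```python
import requests

API = "https://commons.wikimedia.org/w/api.php"

def usage_of(file_title):
    """Return the list of wiki pages that embed a Commons file,
    via the GlobalUsage API."""
    params = {
        "action": "query", "format": "json", "prop": "globalusage",
        "titles": file_title, "gulimit": "500",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("globalusage", [])

# Placeholder list; in a real analysis this would be the titles of a
# whole donated collection, such as the National Archive set.
files = ["File:Example.jpg"]
used = sum(1 for f in files if usage_of(f))
print(f"{used}/{len(files)} files in use ({100 * used / len(files):.1f}%)")
```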

Sebastiaan ter Burg – Creative Commons in practice.

Sebastiaan ter Burg (@ter_burg) is an independent photographer. When he makes a photo report for a company he has one clear condition: all his work becomes, directly or with a delay, freely available under a CC-BY-SA license on his Flickr account. This means that all his work can be freely used and spread, even for commercial purposes. In his presentation, Sebastiaan talked about the benefits this way of working has for him. First of all, it saves him a lot of paperwork. In the 'old' way of making money with photos, an invoice and a contract are created for each picture that is sold. By releasing the material under a Creative Commons license, this is no longer necessary: Sebastiaan sets a fixed price for a photo shoot, so there is only one contract. The more important advantage is that his work is spread and used in all kinds of other media. He noted that he has a better income than most freelance photographers. It has to be noted, however, that Sebastiaan's business model is not better than the old one per se. Quality is still the most important aspect of making money in the creative industry. It will, however, become harder for photographers who are not at the top to generate an income. When more photos are released under a Creative Commons license, fewer photographers are needed to report an event: when a couple of photographers take good pictures, other media can use them. Sebastiaan's presentation showed that a business model based on open content can work, which is a refreshing thought.

Johan Oomen – Open Images

Johan Oomen (@johanoomen) is the head of the research and development department at the Netherlands Institute for Sound and Vision. He presented Open Images, a sub-project of the Images for the Future project. This is a Dutch project with the goal of digitizing the Dutch audiovisual heritage and making it publicly available under an open license. Oomen explained that 'open' has to be understood in its broadest meaning: open source, open media formats (Ogg), open standards (HTML5) and open content (Creative Commons). This way of working stimulates the reuse and remixing of old content. The project will also work together with the Europeana project in order to make the content more easily accessible. The project will continue for two more years and will mainly focus on this reuse and on the possibilities of crowdsourcing new information, metadata and products.

Jan-Bart de Vreede – Wikiwijs, use of open content for educational purposes.

Jan-Bart de Vreede is active in the Wikiwijs project, which has the goal of getting teachers to use more open content in the classroom. This content varies from images and videos to complete lessons. Different objects or parts of lessons can also be combined to create new lessons, as long as these are also shared under a Creative Commons license. In order to guarantee quality, educational institutes and teachers can add their opinion about the material. It was interesting to hear that the number one reason for teachers not to share their content is that they think their material is not good enough, which is kind of strange when they have been using it themselves for years.

This is the end of part 1, which covers presentations from the Cultural Heritage track of the conference. In part 2, presentations from the 'Wiki-World' track are discussed.

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Netherlands license.

MA Thesis: Europeana Building a European Identity

November 8, 2011

Last month I finally received my Master's diploma from the University of Amsterdam. My research on Europeana and the European identity was judged interesting enough to let me pass. However, a lot of questions remain after finishing this research. My supervisor Theo Thomassen commented that in order to really build a complete study, one more year of work would probably be required.

I want to thank Theo Thomassen for supervising me, as well as the many other people I met during this year in the MA New Media at the University of Amsterdam. Beforehand, I did not expect it to be so interesting and fun. The focus on many different skill sets, like blogging and theoretical analysis, but also on more practical research in the Digital Methods class, really gave me a better understanding of so many different fields. The last course, on data visualization, was very demanding, especially working together with so many people from different disciplines, but it was very interesting and it really opened my eyes to the possibilities of this kind of analysis when studying huge datasets.

At this point I can put MA in front of my name, which is a good feeling. I still believe, however, that I am only just starting to delve into the material, and hopefully I will be able to continue exploring a field that has so many interesting aspects.

For anybody who is interested in my MA thesis: it is freely available under a CC-BY license and can be found here.

Categories: Europeana

Wikipedia and the Utopia of Openness: How Wikipedia Becomes Less Open to Improve its Quality

October 15, 2011

I found out today that I never posted my final paper for the Digital Methods of Internet Research course. During my year in the Master New Media at the UvA, this was one of the most interesting research projects I worked on. It received a final grade of 8.5 and I was asked to present it at the Digital Methods Conference. In this blog post I have put down the abstract and the method. If you find it interesting, the full paper can be found here under a CC-BY-SA license.

Abstract

Wikipedia has become an enormous source of information in the last decade. Because of its ubiquitous presence on the internet and the speed with which it is updated, it has become more than a reference work: it has become 'a first rough draft of history'. In this study the changing politics of openness are analyzed. By looking at small articles as well as one extremely popular one, the role of openness and transparency within Wikipedia is discussed. I point out that in order to improve the quality of Wikipedia, it is sometimes necessary to limit the amount of openness, which is not a problem as long as the process remains completely transparent. At the same time, more transparency is needed to improve the smaller articles, which are often created by a single person.

Method

In this paper, I want to take a deeper look inside Wikipedia and the way its articles are created. Who is responsible for the content that can be found on Wikipedia? What is the consequence of the fact that 'anyone can edit' at any time, and how does one deal with a project that has become so incredibly large? In the first part I will point out how Wikipedia works: the basics of Wikipedia will be explained and a more in-depth analysis of the politics of Wikipedia is given. By looking at the rules and regulations of Wikipedia, as well as how they are actually enforced by the community, I will point out how Wikipedia has managed to control such a large group of editors and has created a high-quality encyclopedia instead of anarchistic chaos.

In the second part, a closer look is taken at how an article is created and how it develops. Who creates the article? Is it a dedicated member of the community or an anonymous user who believes he can add something to the encyclopedia? It is also interesting to see what happens after creation: how does the community respond and what kind of edits are made? By taking a couple of articles as case studies, I will show that a user should look at the average Wikipedia article more critically. Since this is hard for the average, not so media-savvy Wikipedia user, Wikipedia should make this creation process more visible.

In the third part, a closer look is taken at articles that are subject to heavy editing. By examining the Wikipedia article about Julian Assange, I will make clear how the community responds to a topic like this and what this means for the ideas of openness and collaboration.

From this analysis, I conclude that the role of Wikipedia has changed: it has become more than an encyclopedia, as it also functions as an up-to-date news source. This has implications for the openness of Wikipedia and other ideals from the early days. To make sure Wikipedia can remain and become an even more reliable source of information, transparency is the key.

Discussion

The fact that Wikipedia is becoming bigger every day, both in size and in its ubiquitous presence, makes it an important object of study. On a daily basis, millions of people use Wikipedia as a source of knowledge. The Wikipedia community is well aware of this and does its utmost to create articles of better quality. This is done not only by having both humans and bots check new edits, but also by creating new policies and guidelines. It seems that in its ten years of existence, the ideology of the early days has been abandoned: rules can in fact be made and changed, and the amount of openness can be reduced, as long as it benefits the quality of the content.

Wikipedia has developed from a small and open project into a huge bureaucracy. This has several implications. It has become harder to start editing Wikipedia: new users are often frustrated by the wall of bureaucracy they run into and are therefore demotivated to become Wikipedians. The consequence is that a declining group of people is building one of the biggest sources of knowledge. At the moment this does not affect the popular articles; as the case study of Julian Assange's page showed, it is checked and discussed more than ever, despite the limited editability. It can, however, affect the quality of smaller articles, since more expertise is required, and it may also lead to more conflicts between editors.

The increasing bureaucracy has two effects. On the one hand it decreases the amount of transparency: because of the enormous growth of the policies and guidelines, it becomes harder to grasp the basic rules of Wikipedia and to see why a decision is made. At the same time, the user can assume that an article is of better quality because the content that is actually in the article complies with all the rules. This, however, does not apply to articles where a single editor created all the content, since most of the rules have to be checked by other users. As this research has shown, the text of less popular articles usually does not change much after creation; the only edits made are formatting changes or the addition of categories and internal links.

Therefore, I suggest that Wikipedia should give more attention to how a specific article was created and make this visible to every visitor. This brings back the transparency that has always been so important and improves the knowledge of the reader. The article should show how many users created it: for example, a percentage at the top indicating how much of the article's content was written by the same person, and how many edits were made altogether. This gives the user a better idea of whether an article is trustworthy and unbiased. By making the creation process even more transparent, it becomes easier for the user to decide how to approach the given information.
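A rough sketch of how such a percentage could be computed from the public revision history is shown below. It attributes each revision's growth in bytes to its author, which is a crude heuristic (it ignores reverts and rewrites), and the article title is just an example.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def top_contributor_share(title):
    """Crude heuristic: attribute each revision's growth in bytes to its
    author and report the share of the largest contributor."""
    params = {
        "action": "query", "format": "json", "prop": "revisions",
        "titles": title, "rvprop": "user|size", "rvlimit": "500",
        "rvdir": "newer",
    }
    revisions = []
    while True:
        data = requests.get(API, params=params).json()
        page = next(iter(data["query"]["pages"].values()))
        revisions.extend(page.get("revisions", []))
        if "continue" not in data:
            break
        params.update(data["continue"])  # follow API pagination

    added, prev_size = {}, 0
    for rev in revisions:
        user = rev.get("user", "?")
        growth = max(rev["size"] - prev_size, 0)  # ignore removals and reverts
        added[user] = added.get(user, 0) + growth
        prev_size = rev["size"]

    total = sum(added.values()) or 1
    user, share = max(added.items(), key=lambda kv: kv[1])
    return user, 100.0 * share / total, len(revisions)

user, share, edits = top_contributor_share("Julian Assange")
print(f"{edits} revisions; top contributor '{user}' added roughly {share:.0f}% of the bytes")
```

Displaying a share like this at the top of an article would give readers exactly the kind of signal argued for above.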

It is up to Wikipedia as well as scholars to study better ways of indicating the quality of articles. With more than 3.5 million articles in the English-language Wikipedia, this cannot be done efficiently by the human contributors, whose numbers are slowly declining. New ways have to be found to automatically identify the quality of an article, as some researchers have already started discussing. This way, Wikipedia can indicate the quality of an article and show this to the user. This not only makes the user more aware of the fact that the content of Wikipedia is not perfect, it also makes it possible to automatically generate lists of articles that Wikipedians need to check for quality. It might even be possible to regulate the edit options automatically, giving more access when an article has proven to be of lower quality, and thereby decreasing the amount of bureaucracy for new editors.

This study has shown that Wikipedia has transformed since it was founded, leading to a more bureaucratic organization. This has several implications, mainly for the openness of Wikipedia. As pointed out, these decisions can benefit the quality of Wikipedia, as long as the process remains completely transparent. By also making less popular articles more transparent, not only will the quality of the content improve, but the reader will also get an indication of how reliable an article is.

Europeana Thesis Conclusions

August 3, 2011

The last few weeks I have been working non-stop on my final thesis. At some points it has been pretty hard to come to a good conclusion; a lot of the expectations I had in advance turned out not to be correct.
In the end, I believe I have made a good analysis of Europeana in relation to a European identity and the cultural policy of the European Union.

In short: here are a few of my conclusions.

1. The European Union must not try to become a 'United States of Europe'. The cultural strategies the EU uses to construct a European identity are similar to the strategies that nation states use, for example the introduction of a European anthem and a standardized passport. In terms of cultural heritage: it is easy to claim that all of German and French history is also part of European history. This assumption, however, will not lead to a European idea of a shared history. The European identity consists of the variety of all the different states within it and is constantly changing. The EU must try to show its 'unity in diversity', and the Europeana project is more than suitable for this since it combines all of Europe's heritage.

2. The current interface of Europeana is not able to show the cultural objects in a European context. A search query leads to thousands of results presented in a 4×3 grid in no particular order. When an object is clicked, it is shown individually on the website of the contributing institution. This way, the European context is not present. In fact, all context has disappeared. Europeana should strive to show this context. This can be done in two ways. The first is to let experts like historians and archivists create a new European story. By combining primary and secondary sources from different countries and institutions, new relations and contexts can be shown. The second option is to create spaces where users can combine, view and discuss their own objects. The recently released Europeana API proves to be an excellent tool for this.

3. Fix the metadata. At the moment, tests with the Europeana API show that a lot of the objects do not have standardized metadata. This makes it hard for programmers and users to use and combine different cultural objects. For example, one institution uses '1867' as a date, while another uses '19th century'. This makes it impossible to combine these objects in, for example, a timeline (a small sketch of the kind of date normalization this requires follows after this list). Luckily, Europeana has also noticed this problem and recently released an 'aggregators handbook' and a dummy space where aggregators can test whether all fields are filled in correctly.

4. The current trend is that the feeling of being European is decreasing; in fact, it is now lower than at the start of the cultural program in 1992. However, it appears that this feeling is very much related to economic and political factors. At the moment, during the economic crisis, fewer people have the idea that their country benefits from the European Union, and at the same time more people tend to feel less European.

5. The Europeana project fits perfectly in the cultural goals of the European Union. Because of its role as an aggregator of aggregators, it can very easily take on new institutions, aggregators and even new member countries. This allows Europeana to show the diversity and commonalities of European culture, as well as new stories and insights into the history of the world. This way Europeana can become the representation of the diversity that unites Europe.
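As mentioned in point 3, here is a minimal sketch of the kind of date normalization that inconsistent metadata forces on developers; the mapping rules are illustrative guesses, not Europeana's actual logic.

```python
import re

def normalize_date(value):
    """Map free-text date fields like '1867' or '19th century' to a
    (start_year, end_year) range, or None if the value is unrecognized.
    The rules below are illustrative, not Europeana's actual approach."""
    value = value.strip().lower()

    match = re.fullmatch(r"(\d{3,4})", value)            # plain year: '1867'
    if match:
        year = int(match.group(1))
        return (year, year)

    match = re.fullmatch(r"(\d{1,2})(?:st|nd|rd|th) century", value)
    if match:                                             # '19th century'
        century = int(match.group(1))
        return ((century - 1) * 100 + 1, century * 100)

    match = re.fullmatch(r"(\d{3,4})\s*-\s*(\d{3,4})", value)
    if match:                                             # range: '1867-1870'
        return (int(match.group(1)), int(match.group(2)))

    return None  # unknown format; needs manual curation

for raw in ["1867", "19th century", "1867-1870", "circa 1900"]:
    print(raw, "->", normalize_date(raw))
```

Only once free-text values like these are mapped to a common representation can objects from different institutions be placed on the same timeline.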

Right now, an English-language peer reviewer is taking a look at my thesis. Once it has been improved, I will put the entire thing on my blog.

Categories: Europeana

e-G8: Governments Acknowledge Importance of Open Internet

June 14, 2011

Image ‘Mark Zuckerberg elysee france Nicolas Sarkozy e-G8‘ by Admond, published under a Creative Commons Attribution 2.0 Generic licence

Two weeks ago, the leaders of several nations gathered to discuss worldwide challenges at the G8 conference. Prior to this event, French president Nicolas Sarkozy organized the first e-G8 conference, which focused on the importance of the internet. Many civil rights organizations were concerned that Sarkozy would use the conference to promote a worldwide controlled and regulated internet. However, now that the report of this conference has been published, it appears that it has yielded some positive outcomes.

Before the conference started, the organization was criticized: digital rights organizations were not invited, while the conference was co-funded by big companies like Google, eBay and Microsoft. Author and internet activist Cory Doctorow declined his invitation with the words:

“I believe it’s a whitewash, an attempt to get people who care about the Internet to lend credibility to regimes that are in all-out war with the free, open ‘Net.”

Because of this private character of the conference, and the fear that Sarkozy would push his wishes for a further regulated internet, several organizations mobilized to let their voices be heard as well. The French digital rights organization ‘La Quadrature du Net‘ held an improvised press conference together with Larry Lessig and Jeff Jarvis, and the organization Access handed over a petition calling on governments to protect the internet as an open and neutral place, without government control.

Whether the participants of the e-G8 listened to the protesters cannot be said. Fact is, however, that in their final report they do emphasize the importance of an open internet:

“The openness, transparency and freedom of the Internet have been key to its development and success”

It continues with:

“Freedom of opinion, expression, information, assembly and association must be safeguarded on the Internet as elsewhere. Arbitrary or indiscriminate censorship or restrictions on access to the Internet are inconsistent with States’ international obligations and are clearly unacceptable.”

They also indicate that the protection of personal data and privacy is essential for a healthy public sphere. The goal is to make internet users more aware of what happens to their personal data when they put it online, and the big internet companies are asked to help achieve this.

On the topic of copyright, the report remains a bit vague. It argues that international laws have to be made to protect intellectual property, but how these would be enforced and what the consequences would be is not made clear. This, while the French government has adopted the three-strikes bill, which makes it possible to cut individual users off from the internet. This law runs directly against the idea of a free and open internet for everybody.

In the end, the published report is only advice; governments do not have to act on these ideas. Still, it is a positive sign that world leaders realize the importance of a free and open internet.

This blog post is a direct translation of the one I wrote for the Dutch digital rights organization ‘Bits of Freedom‘, which can be found here.

Bits of Freedom

June 3, 2011

Oh, by the way:

Last week my first post for the Dutch digital rights organization ‘Bits of Freedom‘ was published.

In it I compare the privacy policies of three different SMS alternatives: WhatsApp, Blackberry Ping and eBuddy.

The conclusion is that a lot of questions remain, like: where is my data stored? How can I remove my data from their servers? Are they saving my friends’ data as well?

For the full article, click here. (Dutch)

Today, I’m going to write an article about the e-G8, the internet conference about regulating the internet that was held prior to the G8. Several digital rights organizations and internet personalities protested beforehand because of the elitist character of the conference, which was sponsored by big companies like Google, Microsoft and eBay.