Why don't I believe most of what I see online?
It's easy enough to make fake text and fake articles. Anyone can write something online. You can make up information out of whole cloth, or you can subtly manipulate the so-called truth to slant it in a desired direction, by omission, by emphasis, or both. Since many people read only the titles of articles, you don't even need to try very hard to make your articles look well-substantiated and well-linked. Sure, somebody's going to "debunk" your text. By then, though, your lie will have made its way around the world and established itself as truth in many minds. The debunking will either be ignored or actively repudiated. You will have established mind-share with your lies.

You can lend gravitas to your article with quotes and citations. Even more effective are videos and images. People trust videos and images more than text. That's why you'll often see stock photos that have nothing to do with the article, or an image that is related but is of a different place or from a different time. If you're writing an article about a natural disaster and the pictures you have available aren't sufficiently tragic, you'll just pluck a stock photo and trust that no-one notices that it's of the wrong island or from ten years ago. This works for people, too: you can lend credence to a fake quote by simply including a photo of the person who supposedly said it.

And now, as shown below, you can include a video.

<media src="https://www.youtube.com/v/o2DDU4g0PRo" href="https://www.youtube.com/watch?v=o2DDU4g0PRo" caption="Fake videos of real people -- and how to spot them" author="Supasorn Suwajanakorn" width="560px">

This video is from April 2018. The presenter describes software that builds fake videos: videos depicting people saying things that they never said. This software is real. Almost no-one is adequately equipped to avoid being duped by such videos, at least temporarily.
He goes on to indicate that we can still detect fakes---at least for now---because the technology for properly emulating teeth and tongues is about ten years away. This is cold comfort, as is the browser plugin he says he is also developing, which will detect these kinds of videos in real time. What is far more likely is that fake videos like this will sweep around the world---and perhaps <i>already have</i>. They will sow discord and unrest until something really, really bad happens. Imagine a video of Putin declaring war on the U.S. If we could be manipulated with text and pictures, imagine what can be done with this video technology.

In another, similar development, a team at Nvidia has developed technology that creates completely new images from existing ones by analyzing and <i>mixing</i> similar images. The paper <a href="https://arxiv.org/abs/1812.04948" source="arXiv.org" author="Tero Karras, Samuli Laine, Timo Aila">A Style-Based Generator Architecture for Generative Adversarial Networks</a> is online. The demonstration below is quite impressive, showing entirely artificial yet utterly convincing human faces. They move on to objects like furniture, entire interior designs, and cars. It's remarkable how smoothly the system mixes images in real time.

<media src="https://www.youtube.com/v/kSLJriaOumA" href="https://www.youtube.com/watch?v=kSLJriaOumA" author="Tero Karras FI " caption="A Style-Based Generator Architecture for Generative Adversarial Networks" width="560px">

This technology, too, could be used to create extremely convincing but false pictures to decorate fake news reports. A picture with the proper characteristics lends a large amount of credibility to a story, at least initially. An interesting place to use this technology would be in police investigations, to replace sketch artists (if they haven't been largely replaced by similar technology already).
The sketch artist could be trained to mix the proper characteristics to create a convincing face until the victim or witness has a eureka moment and recognizes the "photo".

<b>Update (31.12.2018):</b> Even without this face-mapping software, it's possible to manipulate videos with desktop software, as shown in the clip below, from Neo Magazin Royale on ZDFneo in Germany. In it, they made a fake video of Yanis Varoufakis giving Germany the finger. It was convincing enough that Germany's largest news-talk show featured the video (without checking its origin) and then had Varoufakis on the show to explain himself. Hilarious, but definitely a sign of the future. What if no-one had admitted they'd faked it? The video is in German with English subtitles.

<media src="https://www.youtube.com/v/Vx-1LQu6mAE&feature=youtu.be" href="https://www.youtube.com/watch?v=Vx-1LQu6mAE&feature=youtu.be" source="YouTube" author="Jan Böhmermann" width="560px">