This post will be pretty dark so if you just want to have a nice time on an Animal Crossing forum you may not wanna read it!! c: There is nothing "bad" in it per se but it's about realistic, less fun to think about topics! You have been warned!
Maybe right now. But databases, technologies and algorithms are being built, and we have to be careful and conscious about the practical effects they have right now and how they could potentially be used in the future.
Imagine someone hates "prominent person X" and has access to their data and the way "the online experience" shows up to them ( = algorithms for social media feeds, searches, displayed advertisement, ...).
Now imagine "prominent person X" is a group of people, defined by some characteristic that has been analyzed (for example political belief).
Look at China today. Instead of just a company that's trying to maximise profit and engagement with the platform, an authoritarian government decides what - not a person or a group - but EVERYONE in China gets to see or not see.
With all these technologies developing, how do we make sure they aren't used in a bad way, like in China? And even if they aren't used as heavily or as restrictively as in China, and even if the filtering we're talking about is easy to bypass... it still matters. Because this filtering is
already happening in the west! What is already happening is that echo chambers and social bubbles are being created with ways of interaction that have never existed before. Of course social bubbles have always existed, but just think about how differently people interact online versus in real life. Now, of course this is not only done through personalized advertisement but also (especially)
through how social media feeds work (the algorithms that decide what you should see and when - and of course you also filter yourself by deciding who you follow in the first place).
These echo chambers create limited world views that are in large part controlled by algorithms run by corporations. These algorithms are absolutely not transparent, and they aim to show you stuff that is as engaging to you as possible. The days when platforms like Twitter and Facebook let you see the stuff from (at least!) your friends that you might have disagreed with are long gone. You already have to actively search for the stuff that they "don't want to show you". And it's not just social media: YouTube, for example, also hides "controversial" videos, which even includes videos about the coronavirus. So all these websites will just show you more and more of the stuff you like and engage with.
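To make concrete how this kind of engagement-driven ranking narrows what you see, here's a tiny toy sketch in Python. The scoring is completely made up for illustration - it is not any platform's actual algorithm - but it shows the basic feedback loop: topics you already engage with get ranked higher, so you see more of them, so you engage with them more.

```python
# Toy sketch of engagement-based feed ranking (hypothetical scoring,
# NOT any real platform's algorithm). Posts about topics the user has
# engaged with before float to the top, so the feed narrows over time.

from collections import Counter

def rank_feed(posts, engagement_history):
    """Sort posts by how often the user engaged with each post's topic."""
    topic_counts = Counter(engagement_history)
    # Topics never engaged with get a count of 0 and sink to the bottom.
    return sorted(posts, key=lambda post: topic_counts[post["topic"]],
                  reverse=True)

# A user who mostly clicks on cat content...
history = ["cats", "cats", "politics", "cats"]
posts = [
    {"id": 1, "topic": "science"},
    {"id": 2, "topic": "cats"},
    {"id": 3, "topic": "politics"},
]

feed = rank_feed(posts, history)
print([p["topic"] for p in feed])  # → ['cats', 'politics', 'science']
```

The user never asked to stop seeing science posts; the ranking just quietly buried them because they didn't generate engagement. Scale that up with far more sophisticated signals, and you get the bubble effect described above.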
What I think this ultimately leads to is nothing other than radicalized tribalism. You can easily see the effects online and in the real world. Through the way these algorithms work, people are basically fed the same narrative over and over again, and this leads to the illusory truth effect, which basically means that if you hear something often enough, you start believing it. And don't you dare think that you are immune to that, no matter what your beliefs are. I'm not claiming to be immune to it either.
Twitter perfectly illustrates how these tribalized groups interact with each other. They basically play an appalling game of identity politics. They don't consider opposing viewpoints at all, and they don't listen to logical arguments because they don't want their belief systems challenged (who does? it's your understanding of the world). They are very quick to demonize "the other group". They will draw multiple conclusions from one statement (which may or may not be taken out of a complex context) and use any measure to "win" the argument, because that's literally the only thing that matters to them. This is happening daily, and as this act is (re)played often enough, it will become the norm. Kids will grow up seeing this, learning/thinking that this is how discussions work.
So this tribalism that is facilitated leads to a fight of identities, and what happens when this escalates enough, for long enough... we are not so far
yet, but if this keeps getting worse... well... you can see what happens when the people on the "right" "win" when you look at Germany in the 20th century, and you can see what happens when people on the "left" "win" when you look at Russia in the 20th century.
And once it gets that far, no matter who "wins" - don't you dare think that you will be happy if "your group" wins. Because as if what this technology is causing right now wasn't bad enough, it is at that point that all the surveillance technology will be deployed (in ways you really don't want to imagine) to
make sure you are acting "the right way".
Don't kid yourselves, the databases and technologies will be there.
So yeah, I don't mind getting "better advertisement" at the moment. I do not think that Google or Facebook are inherently evil, at all! But I see great potential for what they are building up to be used in a bad way (again, look at China, you can't argue against that). And I see that the current implementation of this technology will bring us closer to a point where it is abused. This is what you could argue against, and if you do, I truly hope that you are right. But I also hope that you recognize my point.
I see very little potential in this technology in making the world better, but I see a lot of potential in it being harmful.
Of course a single person who refuses to use Google won't change anything. That's why these technologies need to be discussed and public awareness needs to be raised. Just as I'm trying to do right now. Even if we keep using those little surveillance boxes called smart phones, at least we can show some resistance by not letting these companies and systems take control of our
homes. We need to find ways of dealing with this and prevent the situation from escalating. We probably need legislation to keep these companies and technologies in check. It is an incredibly complex issue, and I don't think that it's right to brush this off with "chill, they don't care about
you". This is about so much more than just you and me.