Crawford’s piece about listening in on media and messages runs very much in parallel to Lacey’s point a few weeks ago about listening as an imperative factor in the public sphere. Crawford, though, sets out the kinds of people and agents who are listening (individuals, politicians, corporations), as well as the different modes of listening (reciprocal, background, delegated). In order for your voice to have any sort of meaning, you need to have listeners. It seems like people like Crawford and Lacey are taking Carey’s model of communication and addressing the part that has often lain dormant (figuratively and literally). They’re addressing the audience – the people being spoken to, the half of the model of communication that has been seen as passive, as mere consumers – and attempting to see this part of the communication loop as an active one, one that listens in rather than passively takes messages from producers. The act of listening, both argue, is active. In digital media, and in Twitterspace, which is what Crawford concentrates on, the act of listening is indeed an act, one constituted in those different modes and by different forms of agents.
I very much agree with the point that there is a privileging of the voice. If we were to take a gendered look at this, we might make it into a gendered metaphor, and one that works with colonization as well: the voice penetrates into a space and disseminates seeds of information and messages, but without a receptacle it becomes mere masturbation, with no hope at all of reproducing, of letting the seeds, or the message, proceed with any significance, no hope of effecting change or instigating an exchange of ideas and discourse. After all, Western imperialism would not exist without a non-West to receive Western culture and the mentality of Western consumer capitalism. Much like theories of the Other, there needs to be an opposite in order to define the principal. A Slave to define the Master. An oppressed to reinforce and ensure superiority. A listener in order to render the speaker significant.
There is now an expectation to use these services, and to listen through them. Our world has normalized these technologies, and if we want to participate in certain social circles, we must use them too. Think about how many invitations to offline events now arrive on Facebook. If you choose not to use it, you are cut off from many social events whose information circulates only on social media.
David Beer gets into algorithms and the technological unconscious, in which we are being controlled by power structures we cannot see, and in which machines increasingly talk to each other without human intervention or agency. As Hayles, whom Beer references, points out, these systems aren’t infallible. An example from some of my previous research is the Flash Crash of May 2010, when the Dow dropped nearly 1,000 points and, at first, no one knew why; high-frequency trading, which is trading executed by algorithms in fractions of a second, was a suspect in this catastrophe.
Here’s the Kevin Slavin TED talk on algorithms.
I posted yesterday about how Facebook groups status updates and posts by friends according to key themes, the latest, appearing the day after Thanksgiving, being “Christmas.” In a faux-cheerful way, Facebook now informs me every time I sign in that 23 of my friends are talking about “Christmas.” Facebook, through its algorithms, creates a sort of public sphere, a common topic, that only you can see and that, really, only you can participate in, because you are presumably the only person who knows all of the people in your newsfeed that Facebook says are talking about “Christmas.”
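To make that grouping concrete: a minimal sketch of keyword-based theme counting might look like the following. This is purely illustrative and assumes posts arrive as (friend, text) pairs; it is not Facebook’s actual method, and the function name is hypothetical.

```python
# Hypothetical sketch: count distinct friends "talking about" a theme
# by simple keyword matching. Not Facebook's real algorithm.

def friends_talking_about(posts, theme):
    """Return the number of distinct friends whose posts mention the theme."""
    # A set keeps each friend counted once, even with multiple posts.
    return len({friend for friend, text in posts if theme.lower() in text.lower()})

posts = [
    ("alice", "Putting up the Christmas tree!"),
    ("bob", "Christmas shopping already?"),
    ("alice", "More christmas lights"),
    ("carol", "Leftover turkey sandwiches"),
]
# friends_talking_about(posts, "Christmas") -> 2 (alice and bob)
```

Even a toy version like this makes the point visible: the “public” Facebook announces is just an aggregate computed over your private list of friends.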
Facebook also algorithmically determines which posts are most visible and which friends show up at the top of your newsfeed, presumably through some algorithm that weighs how much you click on their profiles or respond to their posts, and vice versa.
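A speculative sketch of that kind of interaction-based ranking, assuming only click and reply counts per friend. The weights and function names are my invention for illustration; Facebook’s actual ranking is far more complex and opaque.

```python
# Hypothetical sketch of interaction-based newsfeed ranking.
# Weights are arbitrary assumptions, not Facebook's real formula.

def rank_friends(interactions):
    """Sort friends by a weighted interaction score, highest first.

    interactions: dict mapping friend name -> {"clicks": int, "replies": int}
    """
    def score(counts):
        # Assume a reply signals more engagement than a profile click.
        return counts["clicks"] + 3 * counts["replies"]

    return sorted(interactions, key=lambda f: score(interactions[f]), reverse=True)

feed_order = rank_friends({
    "alice": {"clicks": 10, "replies": 1},   # score 13
    "bob":   {"clicks": 2,  "replies": 5},   # score 17
    "carol": {"clicks": 1,  "replies": 0},   # score 1
})
# feed_order -> ["bob", "alice", "carol"]
```

The point of the sketch is that even a trivial scoring rule quietly decides whose voices you keep listening to.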
These ideas about Facebook tie into our discussion last week about filtering and customization. In this case, and likely in most cases, algorithms play a huge role in how we perceive the world and how we get information, even when we are not being active. They choose and filter for us; they anticipate what we want to see and feed it to us. What’s the bottom line here? Is it so that more eyeballs stay glued to the screen? So that we continue listening? What role does the algorithm play in our continued and sustained listening, and how much, then, can we say we are active in our acts of listening if algorithms are tied so closely to our experience of the public, of dialogue, discourse, and democracy?
These questions tie into Lash’s idea of post-hegemonic power, in which power and power structures are constituted within us, in our Facebook walls, and in what we have seemingly chosen to do, rather than imposed on us from outside. Power sits within the system itself, and is invisible to us.