On Algorithmic Culture

Image credit: “Algorithmic Culture” by harpsachord

 

by Mark Jewusiak

In August 1998, Sun Microsystems co-founder Andy Bechtolsheim wrote a check for $100,000 to a pair of ambitious college students, Sergey Brin and Larry Page, whose idea for a more efficient Internet search engine would forever change how data is archived and retrieved, while also helping create a mammoth company with innumerable competitive offshoots. Conjured within the confines of Stanford University, the idea and Bechtolsheim’s modest startup funds would help found Google Inc., now Alphabet Inc. The idea was simple enough: Page and Brin knew that if their search engine were to differentiate itself from the inadequate offerings of the day, it would need to improve both the quality and the speed of its results. To make this happen, they proposed collecting all of the Internet’s data for optimal functionality, a notion that, when expressed inside a college classroom, was met with scoffs as a seemingly impossible endeavour. The determined duo and their algorithm, however, would propel their company, and future data-collection services like it, to ubiquity. It is here that Google succeeds, and it is here that the company understands that its value lies in the relentless collection of data (considered, as of 2017, the most valued asset on the planet), often in exchange for a seductively free set of services. For many, Google occupies a space in day-to-day life, if not a complete dependence on its services. Made available to all, it created a community, defined in part by consumer data and by those who subscribed and shared their content for indexing. It became part of a new data-obsessed industry, and a public obsessed with free services, including apps for a then unforeseen disruption in mobile computing: the smartphone.
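The ranking insight at the heart of Page and Brin’s engine, PageRank, can be sketched in a few lines of Python. This is a toy illustration, not Google’s production system; the graph, function name and parameters here are hypothetical. A page’s rank is, roughly, the probability that a “random surfer” following links ends up there.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.

    Repeatedly redistributes each page's rank along its outgoing links
    (power iteration); `damping` models the surfer sometimes jumping
    to a random page instead of following a link.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform rank
    for _ in range(iterations):
        # every page gets a small baseline from random jumps
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share       # pass rank along each link
        rank = new_rank
    return rank

# Toy web of three pages: B and C both point at A, so A ranks highest.
toy_web = {
    "A": ["B"],
    "B": ["A", "C"],
    "C": ["A"],
}
ranks = pagerank(toy_web)
print(max(ranks, key=ranks.get))  # "A"
```

The point of the sketch is the feedback loop: a page is important if important pages link to it, which is what let the engine rank results by the structure of the web rather than by keyword counts alone.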
For the user, the smartphone made the availability and sharing of data mobile and instantaneous, and it ushered the gathering of personal data into an entirely new age, one in which services come to know more about the user than the user knows about themselves. It is here that community and dependence become strong precursors to understanding cultural, if not social, movements.

Graphic art created with algorithms. Image credit: Yeohyun Ahn

It is the power of the algorithm, its optimization and sometimes suggestive nature, that propels, guides and often steers our thinking. At first glance, Google’s seductive power injects an “optimism [which] is a natural response to the arrival of a powerful and mysterious new technology, [but] it can blind us to more troubling portents,” as Nicholas Carr suggests. The company’s extraordinary ability to sort and pluck information resides in its unique algorithm, which provides users with efficient and often very accurate search results. Algorithmic dependence has become habitual, and its daily use suggests an addictive nature: Facebook, YouTube and their peers all incorporate an algorithm tuned and catering to a pile of personal data.

Such dependence can carry addictive qualities. Like any other addiction tied to stimulants, it has a perniciousness that can delve deeply into the psyche. The effects of dopamine and the pleasure principle are well documented, and many are effectively enslaved by the habitual course of its release. Various social media platforms succeed on this principle by providing emotional triggers, letting the user tally “likes” as a way to gain validation. Many Facebook users are emotionally connected to the extent that the platform can induce depression. News feeds, moreover, often deliver biased information to their users as they recycle a history of personal “preferences”; this arguably erodes critical and analytical thinking, and may produce binary environments, collectives and tribal thinking. The reduction of ideological options also reduces language and the extent to which people communicate.

Google’s algorithm is a data-crunching machine, and many like it are designed to continually optimize every underachieving corner of our lives. Digital machines, as Ian Ayres notes, are “not just invading and displacing traditional experts; they’re changing our lives.” As Google continues to expand its services, and as the algorithm is adopted seemingly everywhere, it becomes something people “connect to both electronically and emotionally.” It is the emotional component that can often become a dependency, with obvious and broad implications.

The ability to influence a message or its receiver is not a novel one. In 1964, Marshall McLuhan famously declared in his book Understanding Media: The Extensions of Man that “the medium is the message.” In short, his idea was to convey the medium’s own power to shape the message and to produce social effects. Where McLuhan suggests that media “massage,” one might argue instead that certain elements of the Internet, influenced by the algorithm, constitute a medium that caresses: where massaging lulls the senses, caressing stimulates them. As we subscribe to the Internet and its vast array of data-gathering sites, or to smartphone apps, we ultimately feed the biggest player on the block: Google and its secret sauce, the algorithm. It should be noted that the legion of online providers collecting personal data, Facebook and companies like it, employ their own algorithms, offering services in exchange for the procurement of personal data. Services that provide an emotional connection are the most addictive and influential, if not manipulative, reaching out with the most caresses.

It has become part of our culture to subscribe and to offer a window into our personal lives to a world audience; at what pecuniary and emotional cost is not yet fully known, but what is certain is the degree to which so many subscribe on a whim. The desire to garner likes, along with expressions of self-interest and narcissistic tendencies, is rising to an unhealthy degree, while depression linked to observing the curated, often more-fabulous-seeming lives of others has produced an unforeseen anxiety.

A search engine that offers suggestions beneath its search field can open a fissure, a kind of misfired synapse, between deliberate research and impulsive research. Taken further, when information is acquired for the purpose of relaying refined content to the public, by persons presumably adhering to journalistic practices and standards, a slippery slope emerges. Questionable credibility is disconcerting in a field traditionally equated with the venerable respectability of Cronkite-ian idealism. According to Brian Winston, “over 90 percent of American journalists, for example, use the searchable electronic archives, databases and news sources of the net in their work. The archives include sites established by mainstream media themselves which transfer – ‘shovel’ – original print to abandon the expensive and labour-intensive business of keeping physical archives of clippings – ‘morgues’. Journalists also use ‘comment sites’ such as the reactionary, propagandistic Drudge Report…” To some extent, this helps explain how and why audiences from all sides so often participate in echo-chamber politics.

To what extent, then, will algorithmic optimization lead a user into thinking that what has been found is all that can be found? How reliable will news from Twitter be as a democratic source of day-to-day accounts from conflicted and war-torn areas? Full disclosure: for the purposes of this article, well-researched as it is, minute bits of online information were used to cultivate a small portion of its content. Knowing this, does it affect how one judges the article? If so, how often does one pause to reflect on how ominous this practice is when reading any online article? How much qualitative information will be left in the digital shadows if search engines become so efficient that they overextend themselves as idealized search providers? Are minority perspectives ignored? More importantly, what does it say of future journalism if the norm is to seek data through search engines, and to what extent can that be manipulated? The ability of entities both domestic and foreign to influence presidential elections should be cause for great concern; it should also come as no surprise that “fake news” and “fact checking” are buzzwords in today’s media. The answer in many ways illustrates itself in the current political environment, one that often blurs conventional lines.

It is the opinion of this author that the plight of long-established news agencies is the greatest threat to the intellectual integrity of any given society, if not to society itself, and likely symptomatic of a much greater disease. Outside pressures have also eroded the credibility of these agencies over the past two decades: to remain viable, agencies chase online advertising dollars by following “trending” stories, and as content transitioned to the Internet, historically credible outlets found themselves bowing to the pressures of newer, less credible ones that relay innocuous and inane stories, or embellish ones that should never see the light of day. News agencies are no longer confined to their broadsheet dimensions and distribution channels; the supply of necessary, if not desperate, attention-grabbing, if-it-bleeds-it-leads headlines is limitless, while personal consumption of news has its limits. In this scenario, it becomes increasingly difficult to separate the intellectual and factual wheat from the chaff. The incessant flow of innocuous news detracts from what should be at the fore, dismissing pertinent issues and creating an increasingly apathetic culture. It is not only credibility that becomes a concern in an algorithmically optimized culture, but quality.

News agencies are now pressured into stating, at the top of their articles, how long the average reader will take to finish them, seemingly to cater to a public that would rather get snippets of news than an in-depth perspective grounded in thoroughly researched reportage. Morsels of information, as opposed to meals, can make a public susceptible to manipulation. Headlines seem to be enough for a 140-character-or-less audience, making misreporting to, and manipulation of, that public that much easier. It is an impulsive, addictive, emotion-bound behaviour, one that dispels the perceived need for well-researched articles; it also suggests that the media is acquiescing, feeding an audience now looking for whatever nourishes its deepest emotions and anxieties: in other words, a pleasure principle amplified through caressing.

To use today’s U.S. political climate as an example: if two headlines contradict each other on the same subject, what is the likelihood that the same reader will read both articles? Is it not more likely that bias, an already-ingrained “truth,” will predetermine the choice? The result is a binary public with binary opinions, devoid of critical and analytical thinking, in desperate need of multiple insightful perspectives.

On the surface, there are many obvious positives to take from the algorithm. It has enabled us to communicate like never before; it has produced countless marriages through online dating algorithms; and it has opened countless areas of science and technology with limitless possibilities. But there is an inherent contradiction in the algorithm’s supposed ability to foster a healthy society. Algorithms are, at bottom, designed to optimize; the equation suggests a language more akin to fervent capitalism than to a placating, nourishing socialism, and yet the results are very much intended to provide for our needs, to mollify or at least assuage fears and anxieties, while providing democratic platforms and access to information for all. Its mathematical DNA, if you will, is about efficacy: a production line, a company’s resources, its accounting, and so on.

Our dependence on the algorithm does not stop in its current state. The next stage of this culture will be a growing dependence on machine learning and AI, which many of the world’s great thinkers hint will bring about doomsday scenarios: job losses on a grand scale, mostly in the middle class (including the aforementioned accounting positions), producing wage disparity and ever-growing class hostilities inside an already pressurized geopolitical environment. This notion opposes an idealized society in which futurists see all of life’s mundane labours relegated to the efficacies of robots and the innards of computational machines, allowing for a supposedly more leisurely lifestyle and standard of living, minus the crippling burdens of 20th-century work schedules. In time, some pertinent questions regarding this culture will have to be addressed: Where will humanity choose to place its energies in the future? Will guaranteed income be adopted by the nations of the world, and how much political upheaval will it bring? How will we continue to share and disseminate information that is both factual and, hopefully, inspirational?