But Capitalism...

How we could actually make a better internet (but those making lots of money would probably make less and that would make people mad).

That whole tech anti-trust thing and wtf does it mean?

And maybe it's time we start looking at the impacts tech has across different industries, instead of just within the tech industry.

If you’ve been on the internet and sucked into tech news this week, or have encountered someone who has, you’ve probably heard about the Big Tech antitrust hearings.

Yesterday, the heads of the four Big Tech giants — Apple, Amazon, Google, and Facebook — gathered to answer questions about how their companies operate. With claims that these companies use their power to crush competitors, through tactics like predatory pricing, unfair business negotiations, infringing on the patents of the little guys, and questionable ethics around how data is used, I’m not shocked that any of this was brought up.

Watching the statements yesterday, and reading through what happened and its supposed impacts, I’m left with more questions than answers. While we’re in a time where we’re figuring out how, and if, companies will be held accountable, I’m still wondering how much catch-up some of these congressmen have to play.

Along my journey to where I’m at today, I took a detour and worked for a progressive lobbying firm that advocated for tech. Part of my role was to help educate people on how the technology worked, the good sides and the bad. Imagine explaining the internet, blockchain technology, and the Internet of Things to your local representative.

That’s why this quote from The Atlantic’s antitrust coverage hit me smack dab in the face:

When Facebook’s Mark Zuckerberg testified on Capitol Hill two years ago, the hearings were an embarrassing exercise in congressional cluelessness. They furthered a cliché: The doddering American political elite, who sometimes seemed to confuse Messenger with the passenger pigeon, would never have the savvy to keep up with the dynamism of Big Tech, let alone regulate it.

Franklin Foer for The Atlantic

If we’re talking takeaways from this antitrust hearing, I’ve got a few that make me concerned for the future of tech ethics:

1. How are we including people who actually understand how technology works in the policy behind this technology?

The “doddering American political elite” sticks out in this situation. My husband paid his way through college working as a Geek Squad agent, and the stories I’d hear of people who struggled with technology concern me — and not because technology is “easy” (it’s not; at times it’s purposely confusing), but because of how nuanced this shit is.

Many times, tech is an insiders’ club. It operates in its own lingo and by its own standards, and it leverages this, along with the allure of the future, to draw folks in. I’ve even lived in a community that fell victim to tech’s allure: Reno, Nevada, at the time of Tesla’s arrival, a contested arrival that came with the promise of $1.3 billion in tax incentives. Needless to say, Nevadans weren’t too happy with how that deal turned out.

My ask: let’s start getting people who are great at tech involved in the policymaking process, helping hold tech giants accountable, and educating the public about the impact this technology has on our lives.

One person who’s done this exceptionally well is Bianca Wylie, who has covered Google’s Sidewalk Labs extensively, along with the impact of the data it collects. Earlier this year, she wrote a piece about how we need governments to take a more active voice on tech efficacy. She looks at the framework we’re living in, which often treats tech as a solution, and argues that it needs to be met with questions as well:

Sidewalk Labs is but one of thousands of companies stepping into this narrative void and mixing marketing with crisis. This note is not an effort to malign the intent. But as we know, good intentions do not always lead us to the best places. Without the fuller picture of this conversation, which is also about masks and hospital beds and tests, the creep of techno-solutionism continues along very slowly over time. The fault and pressure resides with governments and public science to step up and step in to frame this conversation properly. I’m not encouraged but it doesn’t mean we should stop asking.

Bianca Wylie

2. What actions will actually be taken from this antitrust hearing? And how does our capitalist system shape its potential impact?

There have been a lot of conversations about these platforms being “too big to moderate effectively,” but does that mean we should just give up? Wired shared a piece about Facebook’s size, arguing that “too big to moderate” is really a case of “moderating would cut significantly into their profit margins.” Isn’t that itself a problem, and an even stronger case for regulation? We don’t accept that excuse from other industries, but we turn a blind eye to what tech is doing and what it could do in the future.

In a time when content is king and the tools to access and create content are in everyone’s hands (not necessarily a bad thing), we’ve got to be conscious of the impact of that content. As Kevin Roose, tech journalist for the NYT, recently shared, misinformation spreads like wildfire.

There’s a difference between moderating terrible hot takes and moderating dangerous misinformation. I, for one, believe that if you have a terrible hot take (cough cough, Andrew Sullivan), the court of public opinion will take care of you — and that’s not cancel culture, that’s consequences. But when it comes to takes that are dangerous and misinforming, that’s when platforms have the responsibility to step in.

It’s true that no site that relies on user-generated content, and has millions or billions of users, can ever perfectly enforce its content rules at scale. But in no industry, save perhaps airlines and nuclear power plants, do we suggest that anything short of perfection is equivalent to failure. No one says there are simply too many people in the world to enforce laws at scale; we just employ a ton of cops. (Of course, the protest movement against police violence has powerfully argued that those funds would be better spent elsewhere—a question for another article.)

Enforcing the rules can be done; it just costs money. Not enforcing the rules has costs, too. They just end up on society’s balance sheet, not Facebook’s.

Gilad Edelman for Wired

Are we okay with the cost of not regulating content online? I’m constantly in awe of what we appear to tolerate in a) the lack of moderation and b) the lack of fair, ethical treatment for those who actually moderate the content. I think you make enough money, Zuck — the human impact on society is a bit more important to me.

3. Tech involves every area of our lives; why don’t we act like it?

Tech has successfully infiltrated every area of our lives, and that’s not a bad thing. We have better health care tech, cool new cures, and new ways of connecting and sharing information with one another. But at the end of the day, that’s all the more reason to make sure we know how this stuff works.

We have tech journalists, tech education, and tech degrees — but when are we going to admit that we need people who can discuss technology and its day-to-day impacts? All Things Considered covered how tech has changed our lives in the last 10 years, and one thing it brought up was the simple use of smartphones: we don’t even communicate in the same way anymore.

And it’s not limited to communication. I experienced many of these tech advances firsthand during this pandemic: I did a surgical follow-up for an appendectomy by sending photos of my scars through an app, got married on Zoom, and discussed the ethical pros and cons of protest photos from a data perspective. None of these are directly in the “tech” industry, but they have significant, lasting impacts on our health, personal lives, and even our criminal justice systems — let’s talk about them like they do. Sure, they’re convenient as hell, but let’s look at how they can change our society in the long run.

Pew Research published a large essay on the positives of digital life and how we can “find our people” and connect with others, but I’m over here thinking about the conversations happening behind the scenes: What do these companies’ roadmaps look like? What is their long-term game? How are they monetizing?

Sound Black Mirror-ish? It’s because some of it probably is.

The reason that show is so damn terrifying is because it’s based in truth.

Oh, and one more tweet on why this is important:


Quick Links


That’s all for this week’s Erin for Tech. I’m looking forward to seeing you next week! As much as the internet can be a dumpster fire, this has been a pleasant place to be. In the meantime, I look forward to seeing you on Media Hackers, a newsletter dedicated to the different ways tech tools can help empower creators, media makers, and journalists without sacrificing a code of ethics.

Join us at Media Hackers
