Last Days Here by Eric Chock

Tutu on Da Curb

Tutu standing on da corna,
she look so nice!
Her hair all pin up in one bun,
one huge red hibiscus hanging out
over her right ear,
her blue Hawaiian print muu muu
blowing in da wind
as one bus driver blows
one huge cloud of smoke
around her,
no wonder her hair so gray!
She squint and wiggle her nose
at da heat
and da thick stink fumes
da bus driver just futted all over her.
You can see her shrivel up
and shrink little bit more.
Bum bye, she going disappear
from da curb
foreva.

Day 10: #TheSealeyChallenge

The Gates by Muriel Rukeyser

DOUBLE ODE

IV

. . .

Black parental mysteries
groan and mingle in the night.
Something will be born of this.

Pay attention to what they tell you to forget
pay attention to what they tell you to forget
pay attention to what they tell you to forget.

Farewell the madness of the guardians
the river, the windows, they are the guardians,
there is no guardian, it is all built into me.

Do I move toward form, do I use all my fears?


Day 8: #TheSealeyChallenge

Continuity by Cynthia Arrieu-King

Deathless

. . .

Mennonites drape dresses over a clothesline, what they wore on the beach.
It takes me a second to recall why my brother wanted to canoe

around a man-made lake. To feel the water push back.
Fish introduced, deer listening. A soil hurled at my feet

in excruciating slowness, the end. We rowed
into sunburn, knew to jab deep and try to pull through.

No one, when I asked them in their hospital sick beds, ever
wanted me to leave the TV on, no one wanted an artificial heart.


Day 6: #TheSealeyChallenge

Poetry and Commitment by Adrienne Rich

The Kind of Poetry I Want

by Hugh MacDiarmid

A poetry the quality of which
Is a stand made against intellectual apathy . . .

But, more than that, its words coming from a mind
Which has experienced the sifted layers on layers
Of human lives---aware of the innumerable dead
And the innumerable to-be-born . . .

A speech, a poetry, to bring to bear upon life
The concentrated strength of all our being . . .

Day 5: #TheSealeyChallenge

#TheSealeyChallenge 2021

I’ve long admired and appreciated the people who participate in #TheSealeyChallenge: An annual “community challenge to read one book of poetry a day for the month of August.”

The dedication!

The expertise!

The generosity!

You know who you are. ✨

Rewind to March 2020: Enter sorrow. Enter death. Enter rage. Enter sobering truths about human nature.

After much denial, I realized that I’d be spending a lot of time alone with my racing mind. No more darting out of the apartment into the distractions of public spaces the moment the opportunity for contemplation arose.

Be still. Be quiet. Feel. Forged in this pandemic crucible—some kind of focus.

August 2021: So, yeah, like, um, I guess I have time to read 31 books of poetry in 31 days? With gratitude, I begin.

Jennifer Hasegawa
The Long-Tail Danger of the Lack of Diversity in Tech: Monochromatic Algorithms

I attended Grace Hopper 2017 with a focus on learning more about what technologists are doing in Artificial Intelligence (A.I.) and the Internet of Things (IoT).

Personally, I’m interested in how A.I. applies to my work as an Information Architect: how chatbots and content recommendation engines can help my users find what they are looking for more easily and naturally. I’m interested in IoT as it applies to my newbie-level work on e-textiles.

With so many impressions and ideas tucked into my head, I left the conference inspired and yet tongue-tied. I knew that something profound needed to be teased out, but what?

***

As I prepared for and traveled to the conference, I monitored the #ghc17 hashtag on Twitter to get a preview of the scene.

The top tweet was a supportive rallying cry from a tireless male ally, and I was inspired. As the day went on, this tweet remained at the top, and it made me wonder.

As I prepared to write this post, I checked Medium itself to see impressions from Grace Hopper.

The top result was a thoughtful post from a male attendee of Grace Hopper 2016. This also made me wonder.

How can it be that representative voices of this conference about women in computing — one rising to the top for the first 24 hours and one rising to the top a year later — are not the voices of women?

As a woman of color, I’m not shocked, as I see an endless queue of folks who don’t know my experience put into positions of being the dominant voices speaking about and around my experience, whether they like it or not.

And, I’m not asking in outrage. Here’s a :-) to prove it. I’m asking because it’s an interesting question.

***

I see three factors at work that produced these results:

  • Content

  • User behavior

  • Algorithms

The content emitted an information scent and evoked user behavior that made algorithms identify it as highly relevant to users looking for information about Grace Hopper.

This is a loaded potion.
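To make this mechanism concrete, here is a minimal sketch of an engagement-weighted ranker in Python. Every signal name and weight is hypothetical, invented for illustration; neither Twitter nor Medium publishes its actual formula. What matters is the feedback loop: whatever ranks first gets seen more, collects more engagement, and entrenches itself.

    # Hypothetical engagement-weighted ranker; the signals and weights
    # are invented for illustration, not any platform's real formula.
    def relevance_score(post, query_terms):
        text = post["text"].lower()
        content_signal = sum(term in text for term in query_terms)  # "information scent"
        engagement_signal = post["likes"] + 2 * post["shares"]      # user behavior
        return content_signal + 0.1 * engagement_signal

    posts = [
        {"text": "Inspired by the women of #ghc17!", "likes": 50, "shares": 10},
        {"text": "My #ghc17 notes on A.I. and bias", "likes": 30, "shares": 5},
    ]

    # Whatever ranks first gets seen more, earns more likes and shares,
    # and is re-ranked even higher next time: early winners stay winners.
    for post in sorted(posts, key=lambda p: relevance_score(p, ["#ghc17"]), reverse=True):
        print(round(relevance_score(post, ["#ghc17"]), 1), post["text"])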

One could make guesses about why a man’s tweet supporting women would rise to the top of Twitter, besting tweets from women supporting women.

One could make guesses about why more women’s voices speaking on Grace Hopper are not surfaced in Medium’s top search results, leaving the top story a man’s account of the conference from a year before.

But I want to talk about the algorithm.

al·go·rithm

a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.
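To keep that definition from staying abstract, here is about the smallest complete example I can offer; the shipping scenario and rates are invented, but it is a full algorithm in the dictionary’s sense: fixed rules a computer follows to turn an input into an answer.

    # A tiny but complete algorithm: fixed rules, followed mechanically.
    # The shipping scenario and rates are invented for illustration.
    def shipping_cost(weight_kg: float) -> float:
        if weight_kg <= 1.0:
            return 5.00                          # flat rate up to 1 kg
        return 5.00 + 2.50 * (weight_kg - 1.0)  # then 2.50 per extra kg

    print(shipping_cost(0.5))  # 5.0
    print(shipping_cost(3.0))  # 10.0

Every system discussed below is, at bottom, a pile of rules like these, written by people.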

Why the algorithm?

The algorithm is the god from the machine powering them all, for good or ill. (from How Algorithms Rule the World)

These Medium and Twitter examples illustrate symptoms of the issue that rose to the top of everything I took away from Grace Hopper: monochromatic algorithms.

The lack of diversity in technology we see today is kindling a humanitarian crisis in which ubiquitous monochromatic algorithms will marginalize any people who were not represented in their design and creation.

At this point in the flow, we’re hearing the issue surfaced at this level:

  • Women are paid less than men doing the same job.

  • Discrimination in the hiring process keeps women and minorities out of the tech workforce.

  • We don’t have enough women and minorities in the tech workforce candidate pipeline.

But to be clear, what I’m talking about is the long-tail problem that will manifest as a result of these initial factors.

If we don’t course-correct, our lives in the future will be shaped by algorithms produced by a monochromatic tech workforce that does not understand, reflect, or empathize with its user population.

Here are just a couple of examples of these monochromatic algorithms at work:

Machine Bias: There’s software used across the country to predict future criminals. And it’s biased against Blacks. (ProPublica)

And Google Translate, trained on crowdsourced human text, has been widely reported to turn gender-neutral pronouns in languages like Turkish into gendered English stereotypes: “he is a doctor,” “she is a nurse.”

As you hear the following buzzwords in your daily life — A.I., machine learning, neural networks, big data, data science, IoT, and “smart” anything — consider how each of these technologies involves algorithms that dictate rules for how decisions are made.

Consider how these algorithms and decision trees are currently being designed and coded by a workforce that may not be able to account for even a tiny sliver of the wide range of human experiences in this world, and how, in effect, this tunnel-visioned workforce is determining outcomes in our healthcare, education, housing, environment, communication, and entertainment.
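To make that concrete, here is a deliberately crude, entirely hypothetical risk-scoring rule of the kind those buzzwords can hide. Every feature, weight, and zip code below is invented. Notice that race never appears in the code, yet a feature chosen by a narrow design team, such as home zip code, can correlate strongly with race and quietly smuggle bias into a “neutral” rule.

    # Hypothetical risk score; all features, weights, and zip codes are
    # invented. "Race" never appears, yet zip code can act as a proxy
    # for it, and a homogeneous team may never think to check.
    HIGH_PRIORITY_ZIPS = {"96801", "96813"}  # arbitrary, chosen by the team

    def risk_score(prior_arrests: int, age: int, home_zip: str) -> int:
        score = 2 * prior_arrests
        if age < 25:
            score += 3   # a "youth penalty" the team considered obvious
        if home_zip in HIGH_PRIORITY_ZIPS:
            score += 4   # the proxy feature doing the quiet damage
        return score

    # Two people with identical records, different neighborhoods:
    print(risk_score(prior_arrests=1, age=30, home_zip="96813"))  # 6: "high risk"
    print(risk_score(prior_arrests=1, age=30, home_zip="96734"))  # 2: "low risk"

A diverse team is simply more likely to have someone in the room who asks what that zip-code term is really measuring.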

Remember that it was not until 1992, when the Americans with Disabilities Act’s public-accommodations requirements took effect, that wheelchair access became legally required for public spaces.

Also, remember that corporations are legally treated as people, that some of these corporations are essentially algorithms, and that, given these rights, these algorithms may be granted great leeway over how they control and shape our lives.

If the predictive-policing algorithm cited in the ProPublica story above were designed by a team that included even a single person of color, would its scoring system use race as a factor?

Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden — who is black — was rated a high risk. Prater — who is white — was rated a low risk.

Two years later, we know the computer algorithm got it exactly backward. Borden has not been charged with any new crimes. Prater is serving an eight-year prison term for subsequently breaking into a warehouse and stealing thousands of dollars’ worth of electronics.

Sure, we’ve been using flawed data for centuries, but the difference now is that these buzzwords are mesmerizing in their promise of a better world, when in reality, they could just be a false veneer over the bad data, convincing us that these algorithms know better.

How can something called Big Data be wrong?

How can an Internet of Things be bad — it’s the internet (love the internet), but with things (love things).

I’m going to overgeneralize by using my mom and dad as examples.

My mom believes that A.I. will destroy humanity by becoming smarter than us and bending us to its will.

My dad believes that A.I. will be benevolent and can’t possibly do harm because it will be like Data on Star Trek and just know how to do the right thing.

As A.I. moves closer and closer to the heart of our lives, we must consider the words of Dr. Fei-Fei Li (@drfeifei) at Grace Hopper 2017:

“There is nothing artificial about A.I.”

A.I. does not spring to life on its own as pure, objective, god-like technology. Human beings with biases, conscious and otherwise, are the ones putting the “intelligence” into A.I. by designing and coding algorithms and by contributing data to crowdsourced models, as with the Google Translate example above.

In other words, A.I. can only be as intelligent, or dumb, as we are.
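A hedged sketch of that point, with entirely fabricated data: feed a model labels that encode a human bias, and it learns the bias and repeats it back with machine confidence.

    # Toy demonstration that a model is only as "intelligent" as its
    # training labels. The hiring data below is fabricated: group "A"
    # was always hired, group "B" never, regardless of skill.
    from collections import defaultdict

    history = [
        ("A", "strong", 1), ("A", "weak", 1),
        ("B", "strong", 0), ("B", "weak", 0),
    ] * 25  # 100 biased examples

    # "Training": estimate P(hired) per group. Skill is ignored because,
    # in this data, group membership explains everything.
    outcomes = defaultdict(list)
    for group, _skill, label in history:
        outcomes[group].append(label)

    def predict(group: str) -> float:
        return sum(outcomes[group]) / len(outcomes[group])

    # Identical candidates, different groups: the bias, faithfully learned.
    print(predict("A"))  # 1.0
    print(predict("B"))  # 0.0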

My technologist hero Jaron Lanier speaks of the myth of A.I. and how a dangerous echo chamber is created when the following factors exist:

  • The root data is bad.

  • We blindly trust A.I. to do the right thing.

  • The user interfaces surfacing the data don’t allow us to question or point out flaws in the data.

See The Myth of AI: A Conversation With Jaron Lanier

The lack of diversity in tech is not just about fair and equal treatment in the workplace; it is also about a much larger crisis: fair and equal treatment in the digital and virtual worlds being built all around us.

As the analog world continues to fade and the virtual world comes more and more to the forefront of our lives, are we going to correct the errors and biases built into the analog world or are we going to bake these travesties in all over again?

I was in awe of the work Chieko Asakawa shared at Grace Hopper around her use of technology to help blind people be more independent and navigate the world.

And at the same time, it highlighted how sight-centric our world is and what an uphill battle it is to retrofit the world to meet your needs when your experience was not considered in its design.

We must evolve our tech workforce to reflect the diversity of the real world we live in and include diverse voices in positions that can influence the technology that will shape our lives.

If we allow these algorithms and the virtual worlds we inhabit to be built by only a narrow range of people, these systems will inevitably perpetuate biases against those not represented.

We will miss out on one of the biggest opportunities in the history of civilization to fix the most fundamental and profound wrongs in how we inhabit this world.

Imagine the end of gender- and race-based profiling and discrimination, and where humanity can go without these limitations perpetuated by flawed stereotypes and data.

Imagine a world in which we are truly seen and valued for who we are and are treated according to our actual behaviors, not according to what we are artificially predicted to do or want.

Imagine a world in which technology promises to make the world a better place and actually succeeds in doing so, not just for a narrow demographic, but for everyone.

To learn more, see the Asilomar AI Principles, one of the earliest and most influential sets of AI governance principles.

Jennifer Hasegawa