When camera movement means turning your head

I went to a VR/360° video installation the other day for my first experience outside of the relatively simple Google Cardboard. What struck me mainly was how low-res it was, to the point where I was never sure I had the headset focused properly, as well as the fact that I was nauseated when the camera dollied.

But I did have one moment, a kind of epiphany that I suppose must be old hat to people who get a chance to experience a lot of these: I was watching a 360° film when I happened to swivel my head away from the characters to check out the surrounding environment and was surprised to see a fire in the distance. It would have felt heavy-handed as a pan in a movie, and it wouldn’t have been important enough to the story to justify that anyway. But the fact that I just as easily could have missed it gave it extra weight, like I had discovered something. In a way analogous to how a movie feels like it’s rewarding you for your cleverness when it trusts you to put the pieces together on your own rather than spoon-feeding you, I felt rewarded for my curiosity.

It’s several days later and I’m still thinking about this not-a-pan. What happens to cinema when you take away camera movement? Is it still cinema? Or is it immersive theater? Or is it something else?

GPT-2 is “like a three-year-old prodigiously gifted with the illusion, at least, of college-level writing ability”

Every once in a while you find an article that seems written just for you. John Seabrook writes in The New Yorker about the group OpenAI and their A.I. called GPT-2, which is so advanced that they are keeping it under lock and key. Not only is the whole thing about the bleeding edge of machine-assisted writing, it’s also a wonderful example of a digital article. It appears in print too, of course, but the online presentation is so subtle yet inventive, with its interactions that reveal AI-written text and a mini-game of “spot the AI” within a paragraph, that I cannot imagine the print version competing with the web one.

I do wish Seabrook had talked to Robin Sloan about his experiments in writing fiction using the same A.I. It might have encouraged him to explore the idea of A.I. as a potential partner or generator of raw material to be re-shaped.

Jake Elliott on game writing formats and tools

A really interesting set of posts on the mechanics of writing Kentucky Route Zero, the experience of rolling your own dialog format, and comparing Twine, Ink, and Yarn.

It’s behind a Patreon paywall, but c’mon—it’s for Cardboard Computer, the creators of Kentucky Route Zero, and their first tier is a dollar a month. Those four blog posts alone are worth more than that.

Notes on The Training Commission

“The Training Commission” is “a serial fiction newsletter by Brendan Byrne and Ingrid Burrington.” A little over 18,000 words in total, plus interviews (about which more below).

The writing

Good writing, good concept. Concepts, really—there are phrases scattered within that could be their own novellas.

Science fiction often entails gesturing toward an event or idea without filling in every little detail, making the reader work a little bit to catch up. Byrne and Burrington do a lot of this, dropping evocative phrases, sometimes elaborating on them later, sometimes not.

They also treat the climax this way: There is a major yes/no decision that is elided, seen only in retrospect. I liked it.

The presentation

I received it in my email inbox, serialized over several weeks. Having it serialized over email is cool in theory but I found it frustrating in practice—mixing my fiction into my metastasizing list of requests and anxieties didn’t make for a focused experience. I suggest reading it on the website.

One danger of doing technically interesting things in the presentation of a story is that you’ll lose the reader. The authors state as much in an introductory email:

As “fun” as it would have been to turn this into an elaborate puzzle where you have to piece together narrative hints to open encrypted files, we know that many of you are busy people who probably have just enough time for a newsletter but not for like, an alternate reality game. (Also, we are not game designers.)

That being said: we are doing something kind of tricky with disseminating the files.

Even that small tricky thing was enough to interrupt me. It involved downloading a special browser that allows peer-to-peer browsing, which in itself is not difficult, but I could only run it on my laptop, so I lost the ability to read the story on my phone during my commute. I finished the story a couple weeks after the final email was sent because that’s how long it took for me to find an uninterrupted stretch of time to sit at my computer.

The interviews

Central to the plot are some documents that turn out to be interview transcripts, and those are real interviews conducted with real people by Ingrid Burrington. The fiction links to these documents, as well as to a real news article, in a way that plays with the concept of a piece of fiction as a bounded work. It is both a way of expanding the fictional universe and of citing one’s sources.

 

Sometimes tech’s greatest benefit to the reader comes when it stops with the publisher.

Craig Mod, in Wired:

We were looking for the Future Book in the wrong place. It’s not the form, necessarily, that needed to evolve—I think we can agree that, in an age of infinite distraction, one of the strongest assets of a “book” as a book is its singular, sustained, distraction-free, blissfully immutable voice. Instead, technology changed everything that enables a book, fomenting a quiet revolution. Funding, printing, fulfillment, community-building—everything leading up to and supporting a book has shifted meaningfully, even if the containers haven’t. Perhaps the form and interactivity of what we consider a “standard book” will change in the future, as screens become as cheap and durable as paper. But the books made today, held in our hands, digital or print, are Future Books, unfuturistic and inert as they may seem.

I’ve been thinking about this as I work on a piece of digital fiction, and realize that it gets better every time I take away an opportunity for interaction.

I was also thinking about this reading Robin Sloan’s print mailings, which you can get if you sign up for his newsletter. Sometimes they come on thin pink paper, printed with a Risograph and folded in thirds. The latest one came in an envelope and on paper that felt very much like a junk mailer, which Sloan explains was of necessity but also part of the fun, since it came from a fictional bureaucracy. In each case, he runs the Ruby scripts, does the care and feeding of the AI, gets ink on his hands, or whatever else needs to be done for you to receive something delightful in your actual physical mailbox with no double-click to install, no log in, not even an ‘on’ button.

I see a parallel here between the technical burden that Sloan shoulders (and that the book printers and distributors Mod refers to shoulder) and a particular trend in web development: doing more work earlier in the process in order to make what you send to your reader lighter and less complex. That can mean server-side rendering: doing most of the computation on the sending end rather than the receiving end, so that the reader receives the simplest bundle of text possible. Or, even before code hits the server, doing more work at compile time. Rich Harris, the maker of one such tool, Svelte, argues that “complexity, like energy, can only really be converted from one form into another,” and he would prefer to take on that complexity himself rather than make his reader, or customer, or whoever is waiting at the end of the process, deal with it. Shifting more of the effort sooner in the assembly line.
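To make the shift concrete, here is a toy sketch of that “do the work earlier” idea: a build step that renders structured content into plain HTML once, ahead of time, so the reader’s browser only has to display static text. (This is purely illustrative and invented for this post; real tools like Svelte’s compiler or static-site generators do far more, but the shape of the bargain is the same.)

```javascript
// "Compile time": we, the publisher, turn structured data into
// finished HTML before anything is sent. The reader's device never
// runs a templating framework -- it just receives the result.

const posts = [
  { title: "Learning to count", body: "Most of the time I don't like writing." },
];

// All the rendering complexity lives here, on our side of the kitchen door.
function renderPage(post) {
  return `<article><h1>${post.title}</h1><p>${post.body}</p></article>`;
}

// At build time we produce the static output; at request time a server
// would simply hand this file over, doing no computation at all.
const html = posts.map(renderPage).join("\n");
console.log(html);
```

The reader-facing payload is just text; every bit of complexity we absorbed at build time is complexity the reader’s browser never has to pay for.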

What you as a creator lose in that bargain is often not a loss at all. You may give up some novel or flashy presentation, but do your readers want that, or do they want to escape from it? And making it easier on the readers might make it harder up front for the people making the thing—but it’s always hard to make something that feels easy, always complicated to make something that feels simple.

Some of the time—maybe most of the time—the fancy tech does not belong in the hands of the reader. Too often it results in irrelevant cognitive load for the reader or too much computational load on the reader’s device—two analogous problems that often go hand in hand. Respectful tech helps get a piece to the reader in a way that feels light and simple and elegant, as with a restaurant that keeps all the complexity hidden behind the door to the kitchen, so that guests might have a simple, quiet, concentrated, respectful experience. (Snow Fall is Benihana.)

It feels like people take as a given that interactive widgets and lots of motion are what it means to use the native tools of the web, when in fact those are the tools of advertising, the problematic funding source for most of the web, which is built around the goal of diverting your attention rather than aiding your concentration. What would a web that prioritized readers over buyers look like? I suspect the difference would be greater than just an absence of ads, or even an absence of clickbait.

How not to say something

From “The Dinner Party” by Joshua Ferris:

On occasion, the two women went to lunch and she came home offended by some pettiness. And he would say, “Why do this to yourself?” He wanted to keep her from being hurt. He also wanted his wife and her friend to drift apart so that he never had to sit through another dinner party with the friend and her husband. But after a few months the rift would inevitably heal and the friendship return to good standing. He couldn’t blame her. They went back a long way and you get only so many old friends.

He leaped four hours ahead of himself. He ruminated on the evening in future retrospect and recalled every gesture, every word. He walked back to the kitchen and stood with a new drink in front of the fridge, out of the way. “I can’t do it,” he said.

Did you catch that? A new drink. Ferris could have had another paragraph or two there, with beautiful and clever language explaining that our narrator had started drinking two hours ago, was on his third, and liked to pair his dry reds with cutting loquaciousness.

Conflict simulation, violence and order

I’m skeptical, if intrigued, about using the tools of evolutionary biology to trace the genealogy of myths and folktales, but the idea that stories that demonstrate good conflict resolution are adaptive (more useful and thus more likely to survive) stood out to me as insightful:

“Little Red Riding Hood,” the tale of Polyphemus, and other ancient tales are all preoccupied with peril. They are populated by predators real and imaginary. They are replete with physical and interpersonal threats—in particular deceit. They confront characters with at least one crisis and force them to either resolve it or meet a terrible fate. Even the folktales of the Agta, which emphasize harmony, only do so through a sharp contrast with discord. When we try to define the qualities of memorable narratives today, we often fall back on clichés and tautologies. Stories need conflict, we say. Why? Because conflict makes for a good story. But maybe there’s a deeper reason.

Not only are ancient myths and folktales almost universally concerned with danger and death; they are blatantly didactic. If we remove their layers of symbolism and subtext—which have been interpreted and reinterpreted for millennia—and focus on their narrative skeletons, we find that they are studded with practical and moral insights: people are not always what they seem; the mind is as much a weapon as the body; sometimes humility is the best path to victory. Modern stories frequently plunge us into lengthy interior monologues, exhaustively describe settings and people’s physical features, delight in the random, absurd, and orthogonal, and end with deliberate ambiguity. The earliest stories were, for all their fantasy, far more pragmatic. Their villains were often thinly veiled analogies for real-world threats, and their conclusions offered useful lessons. They were simulations that allowed our ancestors to develop crucial mental and social skills and to practice overcoming conflict without being in actual danger. Though we may never definitively know what confluence of biological and cultural pressures hatched the first stories—though narrative has far exceeded its preliminary role in human evolution—it seems that our predecessors relied on stories to teach each other how to survive.

The idea of story as practical simulation makes intuitive sense and provides a neat alternative to the received wisdom of “stories need conflict because conflict makes for good stories,” which I had never recognized as a tautology until this article called it out.

Jonathan Gottschall’s The Storytelling Animal: How Stories Make Us Human makes a similar point about stories as conflict simulation, but while I read that book I was thinking only in terms of how stories do their work, and not about where the “rule” for writers came from.

The other book this makes me think of is Lajos Egri’s The Art of Dramatic Writing, which claims that a story needs a dramatic question, and then needs to answer it—another way of describing the didactic function. At the time I read that book, that claim seemed suspect—if you have a lesson to impart, just be direct and write an essay! Dressing up a lesson in the guise of a story felt both dishonest, like hiding medicine in food, and like a good way to waste the unique strengths of fiction.

But now I wonder. Maybe there are questions and answers that are best addressed through fiction—ones that aren’t “lessons” because the questions are ineffable or the answers are multiple or contingent or otherwise complicated. The article above lists lengthy interior monologues, randomness, and ambiguity as evidence that modern stories have abandoned their didactic function. But isn’t that instead evidence that we are facing different sorts of conflict in the modern age?

Man versus ennui. Man versus the algorithm. Man versus his own alienation. These are a long way from the old “man versus man” and “man versus nature” models of conflict you get in high school. Maybe a better framing for modernity is not conflict/resolution or question/answer but disorder/order. Tzvetan Todorov talked about the idea of “violence” rather than “conflict” in his book Introduction to Poetics. There is a system in a certain order, some violence is done that upsets the order, and then work has to be done to put things back into (probably a new) order. (I think I’m remembering my Todorov correctly—I admit to working from one sentence I jotted down in school a decade ago.) This feels to me like a mental tool as well adapted to modern fiction as “versus” is to fairy tales.

Learning to count

Most of the time I don’t like writing, I like having written. As a result, writing can fall prey to productive procrastination: Cleaning the kitchen is easier, and its rewards more immediate. The writing impulse can also die at the hands of excessive modesty: Nobody is asking for my writing, nobody needs it. And it is easily put off by tiny martyrdoms: I should do the thing that other people want from me, rather than the thing I do seemingly only for myself.

Surely if I work long and hard enough on the needs of others and life’s inescapable chores, my karma will accrue and I’ll find myself suddenly in a peaceful office, with a large oak writing desk, its polished top, otherwise devoid of distraction, dwarfing a clacky keyboard and a large mug of coffee. This fantasy office occupies the top of a turret from which descends a spiral staircase, and its door is locked for no reason, as it is obvious to everyone that I am not to be disturbed. The view out the generous window above my desk is lost on me, given how absorbed I am in the work, my fingers not leaving the keyboard for hours at a stretch.

Somehow my conscientious avoidance of writing has failed to materialize that perfect writing setup, and so, like many writers, I need some kind of accountability to get anything done. And accountability, the etymologist will note, requires counting.

Writers have a bunch of numbers to obsess over when our work collides with the world: sales, pageviews, rankings. My favorite, which I mean to emulate, is the writer who encouraged herself to bring her work into the world by setting a goal of a certain number of rejection letters. But during the writing process, our opportunities for counting are scarcer. We have only two things we can count to see if we’re keeping our pace: words and hours.

Is “Proposal for a book to be adapted into a movie starring Dwayne The Rock Johnson” electronic literature?

In describing electronic literature I’ve sometimes used the definition “stories that you can’t print out.” Robin Sloan’s “Proposal for a book to be adapted into a movie starring Dwayne The Rock Johnson” has me reconsidering that.

The story takes the form of an email. More specifically, a printed email: Rather than the scroll bar and cacophonous collection of buttons that would signify an email program, we get a series of 8.5×11″ sheets. But despite mimicking paper, the story is very much of the digital world. It would not work nearly as well as a finely printed book. The date in the header of each page and the URL in the footer function as mise en scène, whereas in a book they would be so out of place as to be confusing and distracting. And the typographic details, like having two hyphens for an em dash, would seem like inattention to detail instead of part of the tone. In a carefully crafted book object, sloppiness is sloppiness; in a simulacrum of an email, sloppiness is verisimilitude.