Wednesday, May 31, 2017

American Epic, part 3: “Out of the Many, One” (PBS/Lo-Max Films, 2017)

by Mark Gabrish Conlan • Copyright © 2017 by Mark Gabrish Conlan • All rights reserved

At 9 p.m. I watched the third episode of the remarkable PBS series American Epic, detailing the discovery and rediscovery of American folk artists who — at least according to the legend this show’s creator, Bernard MacMahon, largely subscribed to — were virtually unknown to the conventional music market until the late 1920’s, when record companies, armed with the newly invented electrical recording equipment, fanned out across the South and places like Appalachia looking for new markets for their products that hadn’t been reached by radio. This meant having to find artists who could record the kinds of music that were already popular in those areas, and MacMahon makes rather more of a to-do than the facts merit of the notion that the music scene in those areas consisted exclusively of amateurs who played only for themselves and their friends, or at casual dances organized in their home villages. Actually a number of the musicians recorded in the late 1920’s, like blues singer Blind Willie McTell and the country-folk-blues-pop artist Jimmie Rodgers (who was literally, in Duke Ellington’s phrase, “beyond category” — it was Rodgers who introduced the throat yodel into country music and also brought in the steel guitar, because he’d recorded with Hawai’ian bands and loved their music), had had careers as professional entertainers well before they recorded — though even during Rodgers’ lifetime (he made his first record in 1927 at age 30 and died of tuberculosis six years later) he was hyped as a railroad brakeman who had turned to music only when his disease made it impossible for him to continue such a strenuous occupation. The first two shows in the American Epic series had focused mostly on country and blues artists, notably the Carter Family, the Memphis Jug Band and Charley Patton, but the third one — which ran an hour and a half, compared to the one-hour time slots of the first two — promised to introduce us to other cultures whose music was also recorded for the first time in the late 1920’s. 

The opening segment covered the Hopi Indians in Arizona and the recording of two of their traditional songs, “Chant of the Eagle Dance” and “Chant of the Snake Dance,” in 1926 for Victor, a session set up by Mike Billingsley, who as a child had been fascinated by tales of the Hopi and as soon as he was old enough literally ran away from home to join the tribe. At the time the Hopi were being threatened with prosecution for performing their old religious rituals, despite that pesky little part of the U.S. Constitution called the First Amendment and its guarantee of the “free exercise” of one’s religion. The “Christian” moralists of the time thought the Hopi snake dance was a pagan or Satanic ritual, and — much to the discontent of some members of the tribe, who thought that if anyone outside the tribe got to see or hear the snake ritual it would lose its efficacy in providing the desert tribe the few inches of rain it needed each year to grow their staple food, corn — Billingsley organized a public relations tour for the Hopi elders, who not only recorded their chants but went to Washington, D.C. and were filmed doing the snake dance for government officials, who duly pronounced it moral and passed an act of Congress telling local authorities to let the Hopi practice their religion. 

Then they did a segment on Hawai’ian music in general and Joseph Kekuku, the inventor of the steel guitar, in particular, which was an interesting inclusion because Hawai’ian music was not only recorded well before America’s other ethnic musics (as early as 1903!) but became such a fad that in the ’teens about 25 percent of all the records sold in the U.S. were of Hawai’ian music. The steel guitar comes with its own legend: it seems Kekuku was walking down a railroad track carrying his guitar when he saw a leftover bolt on the track. He picked it up and it accidentally brushed across his guitar, creating what’s become the familiar glissando of the slide used to fret guitar strings. There’s a quite remarkable scene in which the descendants of Joseph Kekuku hear him clearly for the first time: prior to the making of this film he was only known to have made a few cylinders in the early 20th century in which he’s virtually inaudible, but the filmmakers discovered a 1925 Columbia electrical in which he plays the pop song “I’ll See You in My Dreams” on steel guitar backed by a loud and rather obnoxious banjo — but Kekuku’s own playing is quiet and lyrical and does full justice to the song. (A lot of Hawai’ian bands remodeled the pop songs of the period to fit their style; one of the other records played in this segment was a Hawai’ian cover of George Gershwin’s “Oh, Lady Be Good.”) There’s also a fascinating montage showing how extensively the slide or steel guitar has been used in other sorts of music, including blues (Son House), country (Hank Williams), Afro-pop (King Sunny Adé) and progressive rock (Pink Floyd). 

Then the show took a brief detour to the Texas-Mexico border for the remarkable career of Lydia (sometimes spelled “Lidya”) Mendoza, who recorded for Victor’s Bluebird subsidiary in 1934 and had a star-making hit with a glum song called “Mal Hombre” (rivaling “House of the Rising Sun” and Kurt Weill’s “Surabaya Johnny” as a song about a woman bitterly lamenting her ruination at the hands of a no-good man — the title literally means “bad man” but the show’s subtitles render it as “cruel-hearted man”), and she performed literally for decades, though she took a time-out from her career in the early 1940’s to get married and raise her kids. Luckily her later performances were extensively videotaped by her family — and she was as intense in the 1980’s as she was in the 1930’s; based on the evidence here, if there was a Mexican Edith Piaf, she was it. Then the show cuts to Louisiana for the first recordings of Cajun music — a New Orleans businessman ordered 500 copies of the first Cajun hit, Joseph Falcon’s “Allons à Lafayette,” at a time when record companies didn’t expect to sell that many copies of anything. The show featured interviews with the Breaux family — Falcon’s wife and co-performer Cléoma was a Breaux, and she also performed with three Breaux brothers, one of whom wrote what’s probably the most famous Cajun song ever written, “Jolie Blonde” (a surprisingly infectious melody given that the lyrics are a tale of woe because the singer’s pretty blonde girlfriend has left him), though Harry Choates’ later cover is the one from which most subsequent artists learned the song — including, of all people, Buddy Holly and Waylon Jennings. Jennings was a D.J. on country station KLLL in Holly’s home town, Lubbock, Texas, and Holly wanted to break him as an artist — and for their first record they cut a cover of “Jolie Blonde,” garbling the title as “Jole Blon” and adding R&B saxophonist King Curtis (and more recently Bruce Springsteen has covered the Holly-Jennings version — he’s shown here doing it in a clip from one of his concert videos). 

But the most extensively featured artist on this program — he takes up almost half its total running time — is the blues musician Mississippi John Hurt. He was born in 1892 or 1893 (white Southern authorities weren’t always that meticulous about documenting the births of African-American babies — Louis Armstrong always celebrated his birthday as July 4, 1900 and it was only after his death that a baptismal record came to light establishing the actual date as August 4, 1901) and lived virtually all his life in the tiny town of Avalon, Mississippi. He was scouted in 1928 by Tommy Rockwell and Bob Stephens of Okeh Records (who had produced the classic Bix Beiderbecke-Frank Trumbauer records for the label; later Stephens went to work for Decca and made Count Basie’s records for them from 1937 to 1939; John Hammond was furious that Decca “poached” Basie’s contract from him but paid tribute to Stephens for getting such a good sound on Basie’s records despite Decca’s substandard equipment) and recorded twice, in 1928 in Memphis, Tennessee as part of an Okeh field trip, and in 1929 in New York City. On the latter trip he got so homesick that he wrote and recorded a song called “Avalon Blues.” Then he went back home and for the next three decades made his living as a sharecropper and a cow-sitter for local white landowners, one of whom was a woman Hurt approached because Okeh had sent him a sample copy of one of his records but he had nothing to play it on. So he asked her to play the record for him — only she wouldn’t let him in her house. Instead she let him stand on her front porch as she aimed the horn of her phonograph through the screen door, and when she played the record she was astonished and exclaimed, “That’s you on that record!” (The story was told by the woman’s daughter.) 

Then in the early 1960’s a blues researcher named Dick Spottswood heard “Avalon Blues,” which even then was one of Hurt’s more obscure records (he recorded 20 sides for Okeh but only the 13 originally released are known to exist — there’s a sad little segment with Sony Music archivist Michael Brooks on how many of the metal parts from which records are made were melted down during the Depression and again during World War II in scrap-metal drives — and two of Hurt’s sides were on Harry Smith’s Anthology of American Folk Music from 1952, but “Avalon Blues” remained unknown even to a lot of dedicated blues collectors). While other people who’d heard the record thought it related to the mythical kingdom of Arthurian legend, or was something Hurt had made up, Spottswood wondered whether there might be a real Avalon, Mississippi, and whether John Hurt actually lived there. There was, and he did, and like a number of other old blues musicians Hurt was rediscovered in the early 1960’s and had a second career on the folk-music circuit. Son House and Skip James were likewise rediscovered in the early 1960’s after having made a few records in the late 1920’s or early 1930’s and then disappeared from the music scene. (The fabulous and still underrated Robert Wilkins had quit blues voluntarily in 1936, after a riot broke out at a juke joint where he was performing and he decided his music had had an evil influence and had caused the fight; he became a born-again Christian, worked as a minister and faith healer, and when he returned to music in the 1960’s it was as a gospel singer.) What was remarkable about Hurt’s second career was that he’d survived the years in good health, and though he hadn’t played professionally in decades his guitar chops were as good as ever — unlike Skip James, who already had terminal cancer when he was rediscovered and could afford medical care only because Cream covered one of his songs, “I’m So Glad,” and the royalties paid his bills. What’s also remarkable about Hurt is how little he sounds like the Delta guys — Robert Johnson, Muddy Waters, Howlin’ Wolf, et al. — who’ve set the template for what we expect “Mississippi blues” to sound like. Hurt didn’t use the slide guitar, his voice was soft and lyrical, and he was an excellent picker; indeed, if he hadn’t proclaimed his origins by billing his native state on his records as part of his name, collectors would probably have guessed he was from Virginia, because he comes a lot closer to the softer, subtler, more lyrical Piedmont blues style than the rougher, more openly emotional Delta blues. Like Robert Wilkins, Hurt was able to describe rough-and-tumble scenes in a matter-of-fact style that in some ways holds up better than the more blatant, searing emotionalism of Robert Johnson and company — among his Okeh records (and a song he remade in his second career) is a version of “Stagolee” that comes a lot closer to white singer Frank Hutchison’s version (also for Okeh) from 1926, two years before Hurt’s record debut, than most of the Black records of it. 

The odd coincidence that both Black and white artists recorded versions of “John Henry” and “Stagolee” certainly muddies the waters in terms of trying to figure out who influenced whom: the reverse-racist consensus that dominates jazz historiography today, which essentially asserts that all jazz is a Black invention and that white jazz musicians only stole from their racial betters (and frequently made more money because of the racism inherent in the music business, as in the rest of American life, through most of the 20th century), becomes even harder to support when a show like this documents that even songs considered so quintessentially “Black” as “John Henry” and “Stagolee” may have originated in white, not Black, folk traditions. Overall, American Epic is an excellent presentation of the subject, hampered a bit by the limitations of length (each episode could have been two hours long without feeling stretched) and what film footage has survived (fortunately John Hurt was filmed extensively in the 1960’s, and the show includes clips of him at the 1963 Newport Folk Festival, on a 1965 British TV show and in an amateur audio-visual record that’s the only footage of him in color). Occasionally the show just sounded silly — as when, at the end of episode two, Robert Junior Lockwood (Robert Johnson’s stepson and one of the few musicians who actually can be believed when he claims to have played with Johnson) said that all blues had to be in medium-walking tempo because that was the speed at which a sharecropper walked behind the mule that was pulling the plow in the cotton fields (and it got even sillier when he said the early blues singers learned to sing loud from having to call out instructions to the mules in voices loud enough to get these famously recalcitrant animals’ attention) — but for the most part it was a worthy effort to pay tribute to the great traditions of American grass-roots music.

Tuesday, May 30, 2017

Michael Jackson: Searching for Neverland (Lifetime/Silver Screen Entertainment, 2017)

by Mark Gabrish Conlan • Copyright © 2017 by Mark Gabrish Conlan • All rights reserved

Charles and I watched the TV-movie Michael Jackson: Searching for Neverland on Lifetime. In fact, this time around Lifetime listed itself as a producing company on this one (in partnership with Silver Screen Entertainment) as well as serving as the network outlet. The central characters are not so much Michael Jackson (played by someone or something called “Navi”) as the two bodyguards who signed on and worked for him steadily during the last two years of his life, Bill Whitfield (Chad L. Coleman) and Javon Beard (Sam Adegoke). The story is framed by court depositions Whitfield and Beard are giving in the case against Jackson’s physician, Dr. Conrad Murray (who appears only briefly and is played by Ken Colquitt), who was accused of involuntary manslaughter in Jackson’s death over the repeated doses of the anesthetic propofol he gave the singer in his last days as Jackson prepared for a series of comeback shows at the O2 Arena in London. When we meet Michael Jackson it’s already Christmas 2006; he hasn’t released a new album in five years and he’s been acquitted of charges of child molestation, but the experience has poisoned him against ever living at his Neverland ranch in Santa Barbara County (when his kids complain that they want to go back there, he says they can’t because “it’s been contaminated with evil”) and he’s just returned from a stint in Bahrain. 

Michael’s big concern is that his kids not be photographed — at one point, on his orders, Whitfield and Beard rough up a (white) paparazzo who’s sneaked a photo of the children, and then Michael has to pay $75,000 to avoid prosecution — though he’s also portrayed as a child-man completely unaware of his dire financial condition. He has the humiliating experience of having an F.A.O. Schwarz store opened especially for him, he buys nearly $40,000 worth of merchandise as Christmas presents for his kids, and then his credit card is declined and the rather unctuous white woman at the register says she can’t let him take the stuff on a mere promise to pay or she’ll lose her job. (Eventually Michael has to call his attorney at 4:30 a.m. and get him to cover the charges.) The other issue running through the film is Michael’s desire just to kick back and be a normal father, versus the frustration of being an instantly recognizable celebrity who draws a crowd whenever he appears in public. (In one grimly amusing scene, Whitfield and Beard have Michael wear a motorcycle helmet and take his kids through the streets of Las Vegas so clad — it works in the sense that no one recognizes him, much less mobs him, though one would think someone would have guessed who it was when he addresses his younger son as “Blanket” — who else ever named a kid “Blanket”?) Michael Jackson: Searching for Neverland is an engaging movie in some ways — Chad L. Coleman and Sam Adegoke enact the lead roles with power and authority, Navi looks credible as the later Michael Jackson even though he hardly dances with the élan of the real one (but then, who could?), and the only time we hear him sing is when, after he’s shown his kids a bunch of Charlie Chaplin’s movies, he croons “Smile,” Chaplin’s theme song for his film Modern Times and a bittersweet song about smiling one’s way through adversity that works here almost as well as it did in Chaplin’s masterpiece. 

The film does a good job of showing how precarious working for a celebrity can be, especially with the power shifts around Michael Jackson that rivaled anything that went on in a medieval court — at the start he’s being managed by a Black woman named Raymone Bain (Holly Robinson Peete) and his personal assistant is a white guy, John Feldman (Brian Ibsen), but first Feldman and then Bain fall from power, and at the end Jackson’s manager is a young Black wanna-be filmmaker named Michael Amir Williams (Mykel Shannon Jenkins) whom Michael Jackson bonded with during a photo shoot in New York City (which if nothing else proves that he still has his celebrity chops together; it’s just a still-photo shoot so he doesn’t have to move, but he strikes all the right poses) over a script they were working on for an animated movie about King Tutankhamun. Naturally the film also charts Jackson’s fraught relationships with his biological family; at one point Michael’s brother Randy (Kristofer Gordon) crashes his car through the gates of the Jackson manse and demands money apparently owed him from the Jackson Five’s days as a working group, and this ruins Michael’s mood so much that he abandons plans to attend a birthday party for his friend Elizabeth Taylor (another child star who had a troubled adult career and spent her last years not working but still garnering publicity and tabloid coverage) and spends the night hiding in his bedroom instead. Eventually Michael finds his desire to buy a $55 million estate in Nevada, which he has christened “Wonderland,” thwarted because he has huge assets but is cash-poor (the real Michael Jackson had to sell half of his 50 percent interest in Sony-ATV Music, holder of the Beatles’ song copyrights, just to pay off his mounting debts), and that is what finally forces him to agree to the ill-fated comeback concerts in London. I remember thinking when the announcement was made that those concerts would never happen — though what I reasoned was that Michael would find some B.S. excuse to back out of them the way he had out of previous tours, not that the reason he’d never take the stage at O2 was that he’d croak first. 

The prospect of returning to the stage after eight years in which he’d done virtually nothing professionally must have been frightening as all get-out for Michael Jackson, especially since he couldn’t just stand up and sing his old songs the way Frank Sinatra could in his later years (as long as Sinatra could get up there, remember the words — or read them off teleprompters — and croak out a reasonable approximation of his songs’ original melodies, his audiences would be pleased). No, Michael Jackson’s audiences would have expected him to dance — and to dance in his late 40’s as well as he had in the music videos he’d shot in his early 20’s, and to do the old routines perfectly in real time without the protection of retakes. No wonder it took him so long to agree to do the shows at all, and when he finally started rehearsing for them (his rehearsals were filmed and released post-mortem as part of a documentary called Michael Jackson: This Is It) he was so anxious he literally couldn’t sleep, and Dr. Murray started giving him propofol injections that narcotized him so well that at the end of this movie, he calls Whitfield and is so stoned Whitfield at first doesn’t recognize his voice and thinks it’s a prank caller. Michael Jackson’s career — and his life — seem yet more evidence of my conviction that the only way a child star can have a normal and happy adulthood is if she or he gives up show business altogether the way Shirley Temple and Deanna Durbin did. It’s also an illustration of what, when Charles and I watched the DVD Michael Jackson: HIStory, Part II, I called the Michael Jackson perplex: “[T]he portrait we get from it is of Michael Jackson the child-man who had a great gift for communication and, because of his eccentric background, surprisingly little to communicate: an egomaniac with at least some awareness of his own limitations, a prima-donna star with a willingness to learn from others, and a sad and pathetic figure who professionally projected an aura of excitement and joy.” (I also had a “perplex” about Elvis Presley, another troubled star who lived in a bubble cut off from normal human contact and who died in his 40’s from prescription drugs: the Elvis Perplex was how much potential talent Elvis had, how little of it he actually used, and how hugely successful he became just on that little.)  

Michael Jackson: Searching for Neverland is in some ways a typically clichéd tale of a superstar whose own fame turned him into a recluse and cut him off from human values — though he wasn’t forgotten by his public the way Gloria Swanson’s character was in Sunset Boulevard, there’s a lot of the same feeling here, especially given Michael’s intense loneliness and the fact that, aside from his kids, virtually everyone in his presence was being paid (or at least promised payment — according to Elizabeth Hunter’s script for this film, Jackson was so cash-poor that for the final five months of their tenure Whitfield and Beard were not being paid, and his last manager was clearly trying to drive them out of the entourage) to be there. It’s a sad story that isn’t rendered any less sad simply because we’ve heard it before — though it does make one wonder why he didn’t pull his act together long enough to make another record: Quincy Jones, who produced the three Michael Jackson albums on which his historical reputation will rest — Off the Wall, Thriller and Bad, in chronological order — called him “a workaholic,” a surprising thing to say about an artist who created virtually nothing in the last eight years of his life, and one can’t help but think the distraction of actually having something professional to do would have kept him alive and uplifted his spirits. Just after Michael Jackson: Searching for Neverland Lifetime aired one of those tacky pseudo-documentary clip fests called Michael Jackson: Icon, of which I watched only the first segment, which said that Michael Jackson revolutionized the art of dance and indulged in such a blatant bit of “first-itis” (a term I coined for the tendency among biographers to assert that the person they’re biographing was the first in history to do a particular thing) as to claim that when Michael Jackson first appeared with the Jackson Five at age 10 no one had ever before heard a child sing with the full passion of an adult. Naturally this had me screaming at the TV, “Does the name ‘Judy Garland’ mean anything to you?”

Monday, May 29, 2017

Sinister Minister (Lifetime, 2017)

by Mark Gabrish Conlan • Copyright © 2017 by Mark Gabrish Conlan • All rights reserved

I settled in and watched TV for the rest of the night, including one of the Lifetime “Premiere” movies that they’ve extended to Sunday as well as Saturday nights — and Lifetime’s “Premiere” for Sunday, May 28, 2017 was something with the risible title Sinister Minister (just try to say that without at least chuckling!), though it was filmed under the less silly but also less clear-cut title Brightside — spelled on the film’s imdb.com page as one word even though the actual name of the town where it takes place is “Bright Side” — two words. The film begins with what’s by far its best sequence, a hot sexual encounter between the titular sinister minister, known only by his initials “D. J.” (Ryan Patrick Shanahan), and a woman he’s having an adulterous affair with, though he’s feeding her the usual malarkey about how God wouldn’t be making it possible for them to love (and screw) each other if God didn’t think it was right. Then D. J. receives word that his wife is dead — she was found hanged in their garage and the officials rule her death a suicide — and a typical Lifetime title advances the time frame to “Three Years Later.” Three years later D. J. is the minister in a small town called Bright Side, the woman we saw him adulterously fucking in the prologue is his wife, but he’s already set his sights on her replacement — or rather replacements, since he’s attracted to both Patricia “Trish” Corbett (Nikki Howard) and her daughter Sienna (Angelica Briones). Trish got pregnant with Sienna when she was just 15, though she must have married Sienna’s dad, since he’s discussed in the movie and there’s no indication he’s a step-parent — but the two divorced a year earlier, and after the breakup Sienna started acting out: misbehaving, doing worse at school and smoking marijuana. Determined to keep her away from the big city and the kinds of trouble Sienna could get into there, Trish moves the two of them to Bright Side, where they check out D. J.’s church one Sunday morning. D. J. checks them out as well, much to Sienna’s initial displeasure — “Mom, he’s looking at my boobs!” she complains — and she makes it clear she’s bored by the whole church thing and suspicious of D. J.’s intentions towards her mom as well as her. 

Mom, however, is enthralled by the church in general and D. J. in particular, especially since the sermon he preaches the first day she goes there is about his past as the road manager for a famous rock band (the script’s writer remains unknown to me, since she’s not identified on the imdb.com page and all I remember from the on-screen credit is that the name definitely looked like a woman’s), in which capacity he tried his best to keep up with the drug use and general dissipation of his employers until he found God, left the music business and settled down in Bright Side. The precise denomination of D. J.’s church is unspecified; it’s in an adobe (or faux adobe) building that reminded me of the small church in San Diego’s Old Town which served as the model for the one in which Ramona, the Native American heroine of Helen Hunt Jackson’s 1884 novel Ramona, and her Native partner Alessandro were supposed to have got married, except that was supposed to be a Roman Catholic church and whatever denomination D. J. is in, it must be one that allows its clergy to marry openly. Whatever it is at the start, he’s so obviously drooling over both Trish and Sienna I half-expected him to announce to them, “I’ve decided to leave my church and become a Fundamentalist Mormon, so I can marry both of you.” D. J. first sets his sights on Trish, offering her a job when her previous employer, the owner of the “Friendly Joe’s” restaurant at which she was working as a waitress, fires her for taking calls on her cell phone at work. He’s got a wife already, but a sinister car accident out in the boonies around Bright Side takes care of that little problem; he lives, she dies and the authorities call it an “accident.” Then Sienna comes home a few days after D. J.’s last wife died in the “accident” and finds him and her mom necking on the couch, leaves in disgust and locks herself in her room to smoke pot. When D. J. tries to talk to her, she rather coldly informs him that his youth slang is about two decades out of date — presumably it was what was current when he was still roadie’ing for that mysterious big rock band — and Sienna is put out enough by her mom’s actions with D. J. that when the two actually get married (with the ceremony officiated by the Black assistant minister in his church) Sienna is nowhere to be found, just as she bolted the funeral service for D. J.’s immediately previous wife. 

This being a Lifetime movie, most of Bright Side’s little police force buys that the death of the previous Mrs. D. J. was an accident, but not female detective Leslie Mann (Rachel G. Whittle); she’s already suspicious that the minister has lost two wives in three years, and she gets even more suspicious when Trish’s ex, John Wells (Jeff Marchelletta, who was heftier but also hunkier than Ryan Patrick Shanahan, and I was certainly hoping the screenwriter would end the film with him and Trish reconciling!), turns up in Bright Side. Shortly after he arrives, he disappears and ostensibly takes Sienna with him — Trish finds her room empty and she’s left behind a computer-printed letter saying she’s left Bright Side to live in the city with her dad — but then a couple of hikers in the woods around Bright Side spot a body that turns out to be John’s. We then get a glimpse of a room at the town’s one hostelry, the “Bright Side Inn,” room #2 in a line of pretty shabby motel-like abodes, and Sienna is actually holing up there with D. J., who’s waiting for a chance to get rid of Trish and has promised Sienna that once her mom is out of the way, the two of them can be together. Just what on earth attracted Sienna to a man who initially repulsed her as much as D. J. did is a mystery our anonymous screenwriter never bothers to explain — but the film leads to a typical Lifetime confrontation scene: as mom slowly realizes D. J. is poisoning her (and even finds an ampule of a sickly green fluid that appears to be what he’s using — though instead of doing the obviously sensible thing and taking the ampule to the police for analysis, she pours the stuff down the kitchen sink), she, D. J. and Sienna meet in room #2 of the Bright Side Inn; Sienna threatens to stab D. J. for poisoning her mom; D. J. gets the knife away from her and says he’s going to make it look like Sienna killed her mom and then herself; but fortunately Detective Mann arrives to save the two women — including getting Trish to a hospital in time to save her from the effects of the drug D. J. gave her — and take D. J. into custody (it’s something of a surprise to see a Lifetime movie in which the principal villain is arrested at the end instead of killed).  

Sinister Minister was supposedly based on a true story, the arrest and conviction of Rev. Arthur Schirmer in 2013 for the murder of his wife Betty Jean in 2008, followed by his plea of no contest to a charge that in 1999 he killed his first wife Jewel — though the real Rev. Schirmer let nine years, not just three, pass between his two killings and, as far as the online sources indicate, did not romance both a mother and her daughter the way D. J. does in the film. But it was in connection with the real-life Schirmer case that headline writers apparently coined the phrase “the sinister minister.” What’s weak about Sinister Minister the movie is that the writer and José Montesinos, who directed effectively given what he had to work with, really didn’t offer much insight into What Made D. J. Run — a passing remark he makes towards the end about having had an overprotective mother is as close as we get to an explanation for why he’s the way he is — and it also doesn’t help that the casting person, Scotty Mullen, came up with three women who look pretty interchangeable. When D. J. mistakes Sienna for Trish’s younger sister instead of her daughter, my only thought was, “You, too” — and Rachel G. Whittle as the woman cop who unravels the whole thing has the same body type as the women playing the mother and daughter in distress: they’re all slender, athletic, with long, free-flowing straight black hair, and the only thing that distinguishes Whittle from the other two is that, as part of her playing the cop, she dresses in more butch-cut pants than Nikki Howard or Angelica Briones. Sinister Minister is frustrating because with a little more care, especially in the writing department, it could have been considerably better than the common run of Lifetime movies (where was Christine Conradt that week when they needed her?); instead it’s just another sporadically interesting film in which Ryan Patrick Shanahan’s performance as D. J. is neither subtle and complex enough to be a genuinely convincing seducer/villain nor flaringly psycho enough to make the character scary.

National Memorial Day Concert, May 29, 2017 (PBS/WETA, 2017)

by Mark Gabrish Conlan • Copyright © 2017 by Mark Gabrish Conlan • All rights reserved

After Sinister Minister I put on KPBS for the 28th annual National Memorial Day Concert — though the term “concert” probably should be put in quotes because whatever these annual extravaganzas are, they’re not “concerts” in any but the broadest sense of the term. They’re held annually on the National Mall in Washington, D.C. within eyeshot of all the war memorials in the place (including the preposterous one honoring the veterans of World War II, for which the architect’s renderings looked so over the top they were compared to Albert Speer and it was suggested that the only way this would be an appropriate World War II memorial would be if Germany had won — though frankly to me it looked less like an Albert Speer production than something MGM’s art department head, Cedric Gibbons, would have come up with for an Esther Williams water ballet), and the orchestra is Washington’s regular one, the National Symphony, conducted by Jack Everly, who took over the task when the concerts’ founding conductor, Erich Kunzel, died. What these “concerts” really are is performances by actors presenting real-life stories of heroism among America’s veterans and also tales of recovery from life-threatening injuries sustained in combat, often accompanied by mention of the loved ones who helped take care of the vets and nurse them back to a semblance of normal life. This content has so overwhelmed the musical numbers that after a stentorian opener sung by African-American baritone Christopher Jackson with a full chorus as well as the National Symphony, it was 20 minutes before we got another musical selection that wasn’t being narrated over. It also doesn’t help that the concert seems to get the same people to participate over and over: the hosts are Joe Mantegna (a regular) and Laurence Fishburne; Gary Sinise pre-taped his contribution because he was attending the birth of his first grandchild, but as one of Hollywood’s few “out” Right-wingers he’s long been affiliated with this event; and some of the musical guests, notably singer Ronan Tynan (not a great singer but a genuinely heartfelt one), were also familiar from prior years. 

The first tribute segment was to Col. Richard Cole of what was, when he served, the U.S. Army Air Corps (it spun off into a separate service, the U.S. Air Force, after World War II), who’s 101 years old and the last survivor of the Doolittle Raid on Tokyo in 1942. (Cole had an injury that kept him from the concert stage but we were assured he’s still all right.) Robert Patrick played Cole effectively against a montage of actual World War II clips. Then Fishburne introduced a tribute to the Tuskegee Airmen, and five surviving members of that quite remarkable unit were brought on stage. After that the National Symphony Orchestra played a piece called “Commemoration” by Robert Wendell, and tenor Russell Watson sang a sappy inspirational song called “You Raise Me Up.” The next segment was one of the most compelling stories of the night: Luis Avila from Metairie, Louisiana, who had already done three tours in Iraq when he was sent to Afghanistan in 2011 for a fourth, which ended abruptly when an improvised explosive device (IED) blew up the armored vehicle his team was in and left him comatose for two years and with one less leg than nature’s design for him. The really moving part of his story was the sheer dedication of his wife to looking after him and working first to bring him out of his coma and then to render him mobile and articulate. The Avilas were played by a real-life Latino/a acting couple, John and Ana Ortiz, while the orchestra played such standard “inspirational” works as the Bach-Gounod “Ave Maria” and the “Largo” from Dvořák’s “New World” symphony, whose main theme eventually got turned into a faux spiritual, “Goin’ Home.” Later the Ortizes explained that a key part of Avila’s therapy had been playing him music, and he and his music therapist, Robeson Vadrian (I think I scribbled down her name more or less correctly), made a subsequent appearance on the show. After the producers told the Avilas’ story, Renée Fleming, operatic superstar, was brought on to sing “Wind Beneath My Wings” — a thoroughly pathetic (in both senses) piece of work — and then Ms. Vadrian and Luis Avila came out and joined her for “God Bless America.” Avila’s contributions were pretty toneless but still moving given all that he’s been through and how vividly we’d just seen it dramatized, and Vadrian got swamped by the world-class operatic voice on stage with her, but Fleming herself was audibly a lot more turned on by “God Bless America” than “Wind Beneath My Wings,” and it showed. Afterwards John Ondrasik, the singer who performs as Five for Fighting, came out and did a song that variously seemed to be called “All for One” and “Together We’ll Rise,” which sounded strikingly like U2: the same sort of stentorian lead voice, the same clucking noises on guitar, the same aura of strained seriousness. 

After that we got Mary McCormack, the marvelous lead actress on the woefully short-lived (five seasons) USA Network TV series In Plain Sight, about the federal witness protection program, playing Jacke (that’s how she spells her first name!) Walton, who waited 33 years for her father to return from Viet Nam alive or dead — she even had a letter she had written him when he was in country, which had been returned to her shortly before her mother was officially notified that he and his entire patrol had never returned from an operation — before his remains were finally found and she opened and re-read the letter she’d written him as a child. Next a country singer named Scotty McCreery came out to do a song called “In the Patch Between,” and then retired general and former U.S. Secretary of State Colin Powell, another “regular” on the Memorial Day concert stage, came out for a tribute to the late Jerry Colbert, who thought up the idea for these elaborate “concerts.” Then an unidentified bugler blew “Taps” as a memorial for all America’s war dead, and afterwards they introduced the current chairman and vice chairman of the Joint Chiefs of Staff, Joseph Dunford and Paul Selva, and played the usual medley of the songs of all the U.S. armed services (including not only the Coast Guard but even the National Guard — did you know the National Guard had an official theme song? Neither did I) before there was a sort of token bow to the whole idea of a world without war, with Vanessa Williams joining Patrick Lundy and his gospel choir, the Ministers of Music, for “Let There Be Peace on Earth.” As an anti-war song it’s hardly at the level of, say, John Lennon’s “Imagine,” but it’s a traditional piece and therefore “safe” enough for this context. The show closed with Christopher Jackson and the full forces returning for “America the Beautiful” — an uplifting ending to a show that remains perched uneasily between glorification of the U.S. military and its mission (and the many defense contractors who contribute to keeping this show on the air — Lockheed Martin is the lead sponsor and General Dynamics got a later plug) and an acknowledgment in some of the stories told of the human cost of war and how, while war may at times be a necessary evil, it is still evil.

Sunday, May 28, 2017

2001: A Space Odyssey (Kubrick Productions/MGM, 1968)

by Mark Gabrish Conlan • Copyright © 2017 by Mark Gabrish Conlan • All rights reserved

We also watched a TNT showing of the film 2001: A Space Odyssey (in a letterboxed edition, which was a gain — but with all too many commercial interruptions, which was not a gain; this is one film that cannot survive being chopped, sliced and diced for the financial requirements of commercial television). — 7/27/93

•••••

I ran Charles the tape I’d made from Turner Classic Movies two months ago of a shortened (149 minutes instead of the 171 minutes of the theatrical version and the 185 minutes of Kubrick’s first cut) but, blessedly, letterboxed version of 2001: A Space Odyssey. It remains a haunting film despite the virtual failure of any of its predictions to come true — we’re just 2 1/2 years away from 2001 and the world is singularly bereft of giant space stations, regular passenger service to the moon and intelligent computers that talk and ultimately go psycho. It’s also a great film — I remember thinking in the 1970’s it was the only movie made since Citizen Kane I would put at the level of the all-time best (even in that American Film Institute poll of the 100 greatest movies of all time — already, predictably, controversial because of its blatant omissions[1] — 2001 placed 22nd, which helps make up in retrospect for its appalling omission from the Academy Award nominees for Best Picture of 1968; the Academy is often criticized because Citizen Kane, the top vote-getter in the AFI poll, didn’t win Best Picture for 1941, but at least it was nominated!) — and it holds up surprisingly well, though I noticed Charles was impressed less by the metaphysical aspects of the movie than by the subplot involving the psychotic Hal (Charles was actually talking to the screen during much of the human-computer confrontation, warning the astronauts not to trust the computer — and even I couldn’t resist joking, when Hal boasted that no H.A.L. 9000 computer had ever made a mistake, “At least not since we stopped buying chips from Intel”) — and this despite a certain structural shakiness about the movie, in which the final psychedelic sequence and the rebirth of astronaut Dave Bowman as the Star-Child seem grafted-on flights of purest fantasy that don’t have all that much to do with everything we’ve seen up until then, which has been meticulously plotted science fiction.

Charles said that whenever he watches 2001 he can’t help but read back the details of Arthur C. Clarke’s book version of the story into the film and fill in its ambiguities that way — which is a perfectly justifiable way to handle a film that is made from a book, but is somewhat more problematic in a movie like this in which the book was written after the film, albeit by one of the people who conceived the script (and Clarke’s main motive in writing the book seems to have been less to create a definitive literary version of the story than to write a version encompassing all the differences he’d had with Kubrick over how the story should go). I think it’s best to conceive of the book and the film as two separate versions of the same legend; certainly Clarke’s novel provides one perfectly viable reading of the film, but not the only (or necessarily the most authoritative) one. Many of Kubrick’s directorial decisions — especially the elimination of the narration that was originally going to run through the entire film and explain everything — seemed designed to heighten the ambiguity, to increase the number of viable readings that could be made of the film — and in that sense, as I told Charles at the end, 2001 seems to vindicate director Allan Dwan’s statement that “we write with the camera, not with a pencil or pen, and we’ve got to remember that and not get trapped by the fellow who writes with words.” (Charles replied with the argument that at least words say something specific, whereas images don’t! He was also amused when I mentioned Ray Bradbury’s statement that he hadn’t liked 2001 because he thought Stanley Kubrick was a terrible writer who got in the way of Arthur C. Clarke, a great writer.) Certainly 2001 has been enormously influential, not only in making science-fiction a respectable film genre at last (though there had been important science-fiction films before — not only oldies like the original Metropolis but also estimable works in the 1950’s like the first Invasion of the Body Snatchers, Forbidden Planet and even — despite the ill-advised inclusion of a bug-eyed monster — This Island Earth — and I’ve always been amused by our friend Chris Schneider’s comment that he particularly hates Star Wars because he felt the success of 2001 opened up the market for intellectual science fiction in films and the even greater success of Star Wars closed it again) but also in anticipating the current age of music videos in exalting vivid imagery over intellectual sense (though 2001 is a marvel of clarity compared to a number of recent films which supposedly tell a coherent story but are really excuses to get from one explosive action scene to another), and also technically in terms of actually making spacecraft and other planets that looked convincingly real on screen for the first time. — 6/18/98

•••••

The movie Charles and I selected last night from our DVD backlog was one of my all-time favorites — indeed, when Citizen Kane was finally displaced from its traditional top slot in the 2012 Sight and Sound “10 Best of All Time” poll by, of all things, Hitchcock’s Vertigo (which I really like but I don’t think is the greatest film ever made — indeed, I don’t even think it’s the greatest film Hitchcock ever made: I’d rate Shadow of a Doubt, Notorious and Strangers on a Train ahead of it), I thought that if any film deserved to knock off Kane from the “best film of all time” title it was this one: Stanley Kubrick’s 2001: A Space Odyssey. 2001 was first released in 1968 after Kubrick had spent over three years making it, originally calling it Voyage Beyond the Stars until, according to an imdb.com “Trivia” poster, he saw the 1966 science-fiction film Fantastic Voyage and hated it so much he was determined not to use the word “Voyage” for his film. When it first came out it didn’t draw big audiences — especially not big enough to fill the large theatres needed to run it in its original 70 mm format (it was billed as being in Cinerama but really wasn’t) on a road-show basis, complete with intermission — but a strange thing happened. Maybe not that many people liked it, but the folks who did like it went to see it again. And again. And again. 

And though wags liked to joke that they were seeing it multiple times to figure out what was supposed to be happening in it, 2001 attracted a series of cult followings including hippies turned on by the long psychedelic sequence at the end, in which astronaut Dave Bowman (Keir Dullea), the last survivor of the crew of the interplanetary ship Discovery, crashes through the so-called “Star Gate” to reach the interstellar aliens who have been controlling all human evolution; computer nerds who heard and saw the talking H.A.L. 9000 computer (nicknamed “Hal”) and got into artificial intelligence work because they were inspired to create such a computer in real life; as well as all sorts of other people who just liked the idea that someone had finally created a science-fiction movie with the depth and complexity of the best science-fiction writing. The film began life as a 1951 short story by Arthur C. Clarke called “The Sentinel,” in which early human explorers on the moon dig up an artifact left there by some other civilization — a tall object that looks something like an elongated pyramid — and note that it beams a radio signal back to wherever it came from. The explorers realize it was a sentinel left there by visitors from another planet so that when it was uncovered it would send back a signal that humans had evolved enough to be capable of space travel — and the story ends with the narrator expressing the expectation that there will be some kind of response and writing the last words, “I do not think we will have to wait for long.” Director Stanley Kubrick read “The Sentinel” and after completing Dr. Strangelove in 1964 decided to expand it into the basis for his next film. He hired Clarke to write the screenplay in collaboration with him, and the two worked out an arrangement by which Clarke would publish a novel with the film’s title, over whose content he would have sole control, while Kubrick would have sole control over the content of the film. So with 2001 we have a rare level of documentation of how, at least for this one film, the always contentious relationship between screenwriter and director worked out: the differences between 2001 the novel and 2001 the movie represent the points of contention between the two creators. 

The story opens with a prehistoric sequence (quite convincing even though it, like the entire movie, was shot inside the soundstages at MGM’s studio at Borehamwood,[2] England — later, for his Viet Nam War movie Full Metal Jacket, Kubrick re-created Viet Nam on British soundstages quite convincingly even though one imdb.com contributor noted that the palm trees were not only fake but all looked exactly the same: they had been made from the same mold) called “The Dawn of Man,” in which prehistoric apes are threatened by hunger, drought and predators (including a leopard, which drives them off the carcass of a zebra — actually a dead horse painted to look like a zebra, even though zebras are biologically closer to donkeys than to horses) until inspiration from outer space, represented by a giant monolith (in Clarke’s novel its proportions are 1:4:9, the squares of the first three integers, but in the movie the monolith is more elongated than that), teaches them how to use the bones of dead animals as tools so they can kill the ubiquitous tapirs for food and also attack each other. (It seems curiously against the pro-peace Zeitgeist that Kubrick would release a film in 1968 that argues that violence is a necessary and inextricable part of human evolution, but it was a theme Kubrick would return to in his next film, A Clockwork Orange.) The lead ape, Moon-Watcher (Daniel Richter), throws his bone into the air in triumph — and Kubrick match-cuts to a scene in space in which Earth is ringed by nuclear weapons in orbit, each bearing the logo of one of the world’s three major powers (the U.S., the Soviet Union and China), placed there so any one of them can launch a nuclear attack without having to do so from earth-based missiles, bombers or submarines. (2001 was originally supposed to end with the Star-Child — the reincarnated form of Dave Bowman whom the aliens have sent back to earth — blowing up the nuclear weapons in space and thus rendering them harmless, but Kubrick thought this would look too much like the “We’ll Meet Again” end-of-the-world ending of Dr. Strangelove, so the plot point of atomic weapons in space is almost totally lost in the final film.) 

We then meet Dr. Heywood Floyd (William Sylvester), an Earth scientist on his way to the moon in a craft flown there by Pan American Airways (one of a number of firms shown in this film that met their demise well before the real 2001), which docks on a space station so Floyd can transfer to a landing craft to complete his journey to the moon. (Clarke’s novel has an ironic comment that Floyd has just made a trip humans had been dreaming of for millennia — going to the moon — and it was totally routine. The line became even more ironic when humans actually did go to the moon six times between 1969 and 1972 — and then stopped.) After a whole lot of stiff-upper-lip conversations between Floyd and his Russian colleagues on the space station, and then between him and the people at Clavius base on the moon — which has been quarantined, ostensibly because of an unknown epidemic among the personnel stationed there but really to protect the secrecy around the discovery of another monolith, obviously put there (as per Clarke’s original source story) by members of another civilization to let them know when humans evolved enough to make it to the moon — we finally get to see the monolith, it emits its high-pitched signal — there’s a grimly amusing moment in which the moonbase staff put their hands over the “ears” of their space helmets, acting by instinct and momentarily forgetting that the sound they hear is coming through their own radio receivers because sound waves don’t carry in the moon’s airless environment — and then Kubrick cuts to the spaceship Discovery on a mission to Jupiter 18 months later. Discovery contains five astronauts, though three of them are in hibernation, in which their bodies’ metabolism has been slowed to the absolute minimum to sustain life and they’re unconscious in giant combination refrigerator-sarcophagi (we see ice crystals formed around the windows that allow us to see their eyes and noses but nothing else about them). 

The two astronauts who are “up and running,” as it were, are Dave Bowman (Keir Dullea) and Frank Poole (Gary Lockwood), both attractive young men who make their first appearances on screen in gym shorts as they run up and down the curved interior of the space vehicle (curved, like the space station, to create artificial gravity). The sixth member of the crew is their H.A.L. 9000 computer, who’s represented as a glowing red eye with a yellow pupil — indeed, H.A.L.’s eyes run throughout the ship, giving the astronauts a sort of living-under-Big-Brother feel as the computer can keep track of them wherever they are on board — and the computer, whom (as a BBC reporter in a news segment on the voyage helpfully explains) one addresses as “Hal.” One not only addresses it, it addresses back in the chillingly monotonous tones of actor Douglas Rain, a Canadian who was engaged by Kubrick because he had an even-toned voice that would sound credible as one synthesized by a future machine. (Interestingly, Kubrick originally thought of the computer as having a woman’s voice, like most recent commercial artificial intelligences, including Apple’s Siri and Amazon’s Alexa, which are given female voices and personae.) Unfortunately Hal goes crazy for reasons that, like much about this movie, are kept ambiguous in the film (though Clarke is clearer about them in the novel); he makes a minor error — predicting a flaw in the unit that keeps the ship’s radio antenna aimed at Earth and sending Bowman out to retrieve it — and this snowballs into another extra-vehicular mission in which Poole goes out to reinstall the supposedly faulty unit and let it fail, and while the astronauts are outside Hal goes totally bonkers, severing Poole’s oxygen line so he floats away helplessly in space, killing the hibernating crew members (this is depicted in what one of the original reviewers called “the most chilling modern death scene imaginable”: the lines representing the hibernators’ vital functions gradually become less wavy and finally flatten altogether, and the computer screens flash first “LIFE SYSTEMS CRITICAL” and then “LIFE SYSTEMS TERMINATED”), and refusing to open the ship so Bowman, flying the space pod he took out to recover Poole, can’t get back in.

In a sequence Clarke borrowed from another short story of his, “Take a Deep Breath,” Bowman is obliged to use the manual entrance and, since he’s dressed in his spacesuit but without its helmet, he has only the few seconds a human can survive in a vacuum to scramble through and pull the lever that seals the door and allows air to flow back into the chamber. Then he marches to the central chamber that holds Hal’s higher brain functions and methodically disconnects them — leaving on only the purely autonomic controls that maintain the ship’s operations, including its life support systems — and Hal responds by delivering a slowed-down version of “Bicycle Built for Two” as he’s essentially lobotomized until he expires completely. In Clarke’s novel there’s a long depiction of Bowman’s voyage to Jupiter after that — he discovers the ship’s library of recorded music and works his way backwards through classical-music history, starting with the Romantics and ending with Bach — but in the film it quickly cuts to Bowman reaching Jupiter and taking a space pod out to explore a third monolith, this one orbiting Jupiter. (In Clarke’s original conception the destination planet was Saturn, but it turned out that the special effects required to create a convincing surface of a gas-giant planet and the ones required to create convincing rings around it were so different they were incompatible, so while the astronauts in Clarke’s novel bypassed Jupiter and used a slingshot effect to get to Saturn, the ones in the movie stayed on course for Jupiter.) Then the monolith opens the “Star Gate” (though you wouldn’t know it was called that unless you’d read Clarke’s novel) and Bowman passes through a long series of stunning optical illusions, much like the experimental movies that were popular with student audiences in the 1960’s but of course produced on a far larger budget, including second-unit footage of the Hebrides in Scotland and Monument Valley in Utah, famed as the location of innumerable John Ford Westerns. (But the view of Monument Valley, though including the famous elevated mesa often seen in the background of Ford’s films, is heavily solarized and color-distorted, leading me to joke, “I have a feeling we’re not in John Ford’s Monument Valley anymore.”) 

Finally Bowman ends up in a ridiculously ornate hotel room and ages in a series of steps (looking surprisingly like the real Keir Dullea as he has naturally aged — often actors who were artificially “aged” for a role when they were young don’t look at all like that when they really get to be that old) until, after one more visitation from the monolith, he’s reincarnated as a planet-sized fetus (the “Star Child,” as he’s called in Clarke’s books) and winds up orbiting the earth and moon. In the original draft of the script — and in Clarke’s novel — he sets off all the nuclear weapons the great powers have orbiting in space, but Kubrick thought that would look too much like the ending of his previous film Dr. Strangelove — in which the Russian “Doomsday Device” sets off a series of nuclear explosions that render the whole world’s surface uninhabitable — so at the end the Star Child simply floats in space. “Now that he was master of the world, he didn’t quite know what to do. But he would think of something,” wrote Clarke in the novel (which I’m quoting from memory). Clarke would eventually write three more novels in the 2001 “universe,” and his first sequel was filmed (but extensively changed) in 1984 as 2010 with Roy Scheider as Heywood Floyd, the character William Sylvester played here. Though its prediction that once we got to the moon humans would continue space exploration indefinitely, building permanent space stations, running regular moon flights and ultimately colonizing the rest of the solar system, proved wrong (as did Arthur C. Clarke’s rather optimistic prediction that the moon would contain water-bearing rocks, which human colonists could have crushed to extract water and, from the water, breathable oxygen — a hope dashed when humans actually landed on the moon and found no embedded water in its rocks), some of 2001’s calls proved surprisingly correct: the hand-held screens on which the Discovery astronauts watch radio transmissions from Earth look like modern-day tablet computers, and the electronic devices used to monitor the health status of the hibernating astronauts are in standard medical use today. The term “flatlining” is even used by modern-day doctors and nurses to describe what happens when a monitored patient’s vital functions start to fail and emergency intervention is needed to keep them from croaking completely.

2001: A Space Odyssey remains a magnificent film even though the date in which it was set has come and gone and humans have retreated back to earth after their first forays to the moon; with the arguable exception of Robert Wise’s The Day the Earth Stood Still it was the first science-fiction film made by a “name” director since Fritz Lang’s Woman in the Moon (1929), and Kubrick’s attention to detail made it the most realistic-looking one to date. Instead of the smooth-surfaced toy spaceships that had bounced around on wires in things like the 1930’s Flash Gordon serials, the moon craft and the Discovery look like actual electromechanical devices, with ridges, protuberances, bolts and the other accoutrements of actual human construction instead of the idealized burnished surfaces of previous movie spacecraft. (There’s a nice bit of “planting” in the script when we twice get close-ups of the back of the space pods, with their “CAUTION! EXPLOSIVE BOLTS” warnings, before the explosive bolts themselves feature prominently in the sequence of Bowman breaking back into the Discovery after Hal refuses to let him back in.) Stanley Kubrick took credit for the special effects as well as the overall direction of the film — which incensed Douglas Trumbull, who was really the effects person in charge but got relegated to a long list of assistants. Ironically, 2001 won the Academy Award for Best Achievement in Visual Effects, and the award went to Kubrick, the only Oscar he ever won. The Academy didn’t even nominate 2001 for Best Picture (at least the famously shut-out Citizen Kane got nominated), and the film that did win Best Picture that year, Oliver!, was a rankly sentimental musical whose only saving graces were the shards of genuine darkness and emotion left over from Charles Dickens’ source novel after composer-lyricist Lionel Bart and the filmmakers got through sweetening it. Kubrick’s credit grab in turn led Steven Spielberg to credit not only Trumbull but Trumbull’s entire team on Close Encounters of the Third Kind, leading to the insane inflation of movie credit rolls we’ve seen since.

Midway through the film — at the moment when it cuts from the moon to the Discovery mission to Jupiter — Charles pointed out that 2001 is really more like a Soviet film, especially a Soviet film made after the 1930’s, when Stalin ruled that the high-energy, fast-moving montage style of the Soviet silent classics was no longer politically correct. Its closest successor is the Andrei Tarkovsky film of Solaris, made in 1972 and likewise a long, ponderously paced movie about extraterrestrial intelligence and its power to manipulate human beings — even though Tarkovsky said at the time he was trying to make an anti-2001, and in particular he wanted characters who had rich emotional lives and went through recognizable, identifiable personal conflicts. (The main character in Solaris is a space scientist still devastated by the suicide of his girlfriend; he meets her again when the sentient ocean that covers the planet Solaris sends him a replica of her — but since she’s based on the one source Solaris had, his memories of her, he makes the same mistakes with the replica that he made back on Earth with the original.) When 2001 was released, one of the main criticisms was of the dry, dispassionate depiction of the human characters, who talk to each other in stiff, formal, almost militaristic language and show no signs of any emotional connection. (The closest we come is the videophone call between Dr. Floyd and his daughter back on earth, played by Kubrick’s then five-year-old daughter Vivian, and even that seems more out of parental obligation than any real love.)

Much of the criticism came from people incensed that the H.A.L. 9000 computer seemed to have a deeper and richer emotional life than any of the humans — within the confines of Douglas Rain’s monotone, Hal is able through subtle inflections to convey pride in the H.A.L. series’ flawless operating record and a sort of thinly veiled condescension towards the human members of the crew (indeed, not only did 2001 deserve a Best Picture Academy Award, but arguably Rain deserved one for his incredible performance as the voice of Hal!), as well as genuine grief and fear when he is finally disconnected — while in the disconnection scene Bowman, up until then shown as little more than a cog in the mission, takes on real human qualities and becomes an impressive revenge figure. Criticizing 2001 for making the computer more deeply and richly emotional than the humans is seizing on one of the film’s great strengths and calling it a flaw! Andrew Sarris, one of the main critics who panned the film for the stiff, formal way the astronauts talked, later admitted he’d been wrong when he watched the actual moon landing on TV and noticed the real astronauts were just as stiff and emotionless as the ones in Kubrick’s film. 2001 is also an example of what Sergei Eisenstein in the late 1920’s called “the sound film” as opposed to the talkie: one with a bare minimum of dialogue (or, in Eisenstein’s vision, no dialogue at all) that would use the soundtrack to add music and sound effects so it could tell a story more effectively than a silent film could. 2001 has also been praised for nice little touches, like at least mentioning the seven-minute delay in radio contact between the ship and Earth caused by the sheer distance between them, and using total silence for scenes taking place in space (since sound waves can’t travel in a vacuum, space is silent, and some of Kubrick’s most striking effects are the sound edits between space, which is quiet, and the interior of the space pod, which contains air and therefore sound — reportedly Kubrick himself dubbed in Bowman’s heavy breathing in some of these scenes). It is, to my mind, the greatest science-fiction film ever made (the rest of my list would be the Tarkovsky Solaris, Fritz Lang’s Metropolis and Woman in the Moon, Wise’s The Day the Earth Stood Still and the first version of Invasion of the Body Snatchers, directed by Don Siegel, which is essentially a science-fiction film noir) and the gold standard to which anyone making a science-fiction film now should aspire. — 5/28/17


[1] — Among the omissions noted — and criticized — in yesterday’s Los Angeles Times by critic Kenneth Turan: none of the Fred Astaire-Ginger Rogers musicals — indeed, nothing with Astaire at all, though Gene Kelly made the cut with “Singin’ in the Rain” at #10 — as well as none of Garbo’s films (at least two, Queen Christina and Camille, should be on anybody’s list of the 100 best films ever made in the U.S.!), nothing by directors Ernst Lubitsch or Preston Sturges, and only four silent films: The Birth of a Nation (inevitable because of its historical importance and artistic quality, despite its horribly reactionary and racist politics) and three by Chaplin (The Gold Rush, City Lights and Modern Times) — nothing by Buster Keaton, who made at least two silent masterpieces worthy of inclusion (Sherlock, Jr. and The General). Turan also noticed that none of Busby Berkeley’s films made the cut — though that’s not so surprising because, as spectacular as his numbers were, very few of Berkeley’s films were all that great as complete films — and neither did 1939’s Gunga Din, “without whose example everything from Star Wars to Raiders of the Lost Ark might not have happened.”

[2] — Earlier references to this studio I’ve seen spell its name as two words: “Boreham Wood.”

Halls of Montezuma (20th Century-Fox, 1951)

by Mark Gabrish Conlan • Copyright © 2017 by Mark Gabrish Conlan • All rights reserved

Two nights ago Charles and I watched a considerably less exalted but still interesting film: Halls of Montezuma, made by 20th Century-Fox in 1951 and oddly poised between your typical rah-rah war movie of the time (made six years after the end of World War II, while the Korean War — or, as it was euphemistically called, the “police action” — was still going on and the Cold War was at its peak) and something deeper and richer. Producer Robert Bassler decided to splurge on three-strip Technicolor and managed to get the full cooperation of the U.S. Marine Corps in the making of the film, including supplying the producers with actual documentary footage of the real-life Battle of Okinawa which the movie was recreating. Indeed, the Marines were so taken by the results they actually used the movie as a recruiting tool! At the same time Bassler hired Lewis Milestone as his director, working from a script by Michael Blankfort (though other writers, including an actual Marine officer, were on the project before Blankfort was), and since Milestone’s best-known credit was the 1930 film All Quiet on the Western Front he wasn’t exactly your go-to guy for a pro-war movie. As a result, Halls of Montezuma wavers between a rah-rah war movie and a war-is-hell movie. It’s centered around your typical gung-ho commanding officer, Lt. Carl Anderson (Richard Widmark), who’s about to lead a Marine platoon onto a hotly contested island (the island is carefully unnamed in the film, but the alternate title under which it was released in some countries, Okinawa, gives it away), and in the early parts it flashes back to the pre-service backgrounds of some of his men.

Before the war Anderson was a high-school chemistry teacher, and he remembers one of his current troops, Conroy (Richard Hylton), as a stutterer whom Anderson broke of that habit by sheer force of will. Now Conroy, having already fought in the bloody battles at Guadalcanal and Tarawa, is suffering from what would then have been called “shell shock” and is now known as post-traumatic stress disorder (PTSD), and Anderson is once again taking a no-nonsense position with him, ordering him to participate in the latest landing and once again pushing him by sheer force of will to do what he’s most afraid of. Other men in the outfit include Pigeon Lane (a tough-as-nails Jack Palance, here billed under his original first name, Walter, with Jack in parentheses); Pretty Boy (Skip Homeier, who first attained his 15 minutes of fame fighting World War II in the movies on the other side, as the brainwashed Nazi kid in both the stage and film versions of Tomorrow, the World!); Coffman (Robert Wagner in his first credited role, though he’d appeared unbilled the year before in the film The Happy Years); Lt. Col. Gilfillan (Richard Boone in his first film); medical corpsman Doc (Karl Malden, surprisingly restrained and effective); and the rather preposterous character of Sgt. Johnson (Reginald Gardiner), who looks both too old (middle-aged) and too British to belong in a U.S. Marine Corps unit. (One expects a line of dialogue explaining that he’s on loan from the British military because he can speak Japanese, which none of the U.S. military men can do, but no such explanation ever comes.) At first the Marines think they’re going to be able to establish a beachhead on the island and occupy it with little resistance, but they soon learn better as the Japanese hit them with machine-gun fire from pillboxes, snipers, and ultimately battlefield rockets, which bedevil the Americans because they can’t figure out where the rockets are coming from, and if the launchers aren’t found and destroyed within nine hours they’ll decimate the rest of the invading U.S. troops.

One of the messages the Marine Corps especially wanted this film to communicate, to Marine recruits as well as civilians, was the need to take as many prisoners of war as possible. As Anderson explains in the film, “We used to say, ‘The only good Jap is a dead Jap.’ We were wrong. A dead Jap can’t be interrogated and can’t give us any information.” He has to deal with Coffman, who before his own death in the battle goes crazy and wants to kill all the prisoners out of hatred and revenge. He also has to deal with the Japanese themselves, including Nomura (Richard Loo, the Chinese-American actor who’d found his niche playing dastardly Japanese officers in movies made during World War II and no doubt was not happy about still being typecast this way six years after the war was over!). It turns out that instead of stationing the rockets on the far side of the mountain the U.S. Marines were trying to take, Nomura, a civil engineer before the war, worked out a system of tunnels so the rockets could be kept hidden inside the mountain, safe from enemy attack, brought out whenever needed and quickly hidden away again. (This is historically accurate, and one reason why the assaults on Iwo Jima and Okinawa were so costly in terms of U.S. lives.) Once Anderson and company figure this out, they radio the information to the aircraft carriers stationed offshore, and the carriers launch dive-bomber raids to close up the openings in the mountain so the Japanese can’t bring out their rockets and the Marines can charge ahead with minimal resistance — though the closing shot of the film isn’t the successful conclusion of the battle, but simply one of the Marines marching across the battlefield as the soundtrack belts out the “U.S. Marine Hymn” (the one that begins, “From the halls of Montezuma to the shores of Tripoli,” and which actually took its melody from the operetta Geneviève de Brabant by French composer Jacques Offenbach), and we’re obviously supposed to take this as a stirring patriotic conclusion.

Halls of Montezuma is a schizoid film, reflecting in part the studio’s attempt to make a one-sidedly patriotic film of Cold War inspiration and in part the darker, deeper ideas director Milestone had about war. There are a lot of visual motifs he’d used in his other war movies, including the use of bombed-out farmhouses and other ramshackle buildings as improvised military command centers à la All Quiet, and the long scenes of servicemembers marching into battle through lovely pastoral countryside, the beauty of the landscape ironically contrasting with the brutality of what’s going on in it (something Milestone had done big-time in A Walk in the Sun, and King Vidor before that in the 1925 silent The Big Parade). To Milestone, war was a dirty, nasty business almost no one got out of alive, and through much of the movie he actually tones down the usually shrieking hues of three-strip Technicolor, getting an almost “black-and-white in color” effect much like what became standard in the late 1960’s, when Technicolor introduced a process called “denatured color” that reduced the intensity and vibrancy of its hues and therefore made color seem suitable for more “serious” film stories that had previously been done in black-and-white. Having said that, through much of Halls of Montezuma I found myself wishing it had been shot in black-and-white; the vision of war in Milestone’s direction and Blankfort’s script cries out for the chiaroscuro, almost noir look of classic black-and-white, and instead gets a less vibrant but still not altogether appropriate color scheme. The rousing opening and closing renditions of the U.S. Marine Hymn and John Philip Sousa’s march about the Corps, “Semper Fidelis” (a year before 20th Century-Fox made their Sousa biopic Stars and Stripes Forever), may say war is a wonderful enterprise all red-blooded American boys (not girls, yet!) should want to join, but what we get in between is a dirty, nasty business. The movie is hardly at the level of All Quiet on the Western Front or A Walk in the Sun, but it certainly has its points — even though, quite frankly, I’d rather see the movies in which the studio promoted Widmark as the next James Cagney (Fox head Darryl F. Zanuck had signed Cagney to Warner Bros. in the first place in 1930, then lost him three years later when Zanuck was forced out of that studio, though he landed Cagney as a free-lancer in 1946 for the World War II spy drama 13 Rue Madeleine) than the ones in which, as here, they were trying to make him into the next John Wayne!

Thursday, May 25, 2017

Frontline: “American Patriot” (WGBH/PBS, aired May 16, 2017) and “Bannon’s War” (WGBH/PBS, aired May 23, 2017)

by Mark Gabrish Conlan • Copyright © 2017 by Mark Gabrish Conlan • All rights reserved

Recently the long-running PBS Frontline program — actually produced for the national network by station WGBH in Boston — has run a couple of episodes that perhaps unwittingly formed odd bookends, one showing the extreme “alt-Right” in revolutionary mode, mounting — and so far getting away with — armed insurrections against the U.S. government, while the other shows the “alt-Right” actually winning admission to the halls of official government power with which to promote its white-separatist, nationalist “America First” agenda. The first program, aired May 16, was called American Patriot — an oddly singular title for a show with plural protagonists — and dealt with the antics of the Bundy family of Nevada. Their first 15 minutes of nationwide fame came in 2014, when paterfamilias Cliven Bundy, a cattle rancher in the middle of a 20-year battle with the U.S. Bureau of Land Management (BLM) over when and where his cattle could graze and how much he’d have to pay the government for what was essentially rent for the public lands on which his cattle fed, decided to make his stand in the appropriately named town of Bunkerville, Nevada. Cliven Bundy was at the receiving end of federal policies aimed not only at getting more money from the cattle ranchers in the area but reducing the total amount of area available for grazing so more of the land could be allowed to return to its natural state — and his case became a cause célèbre for the radical-Right militia movement in general and groups with names like the Oath Keepers and the Three Percenters in particular.

Bundy had declared he wasn’t going to pay his grazing fees, and the BLM responded by mounting an operation to seize his cattle and essentially hold them as collateral for the fees he owed. Suddenly the BLM agents were faced with armed resistance by militia groups demanding that the federal government not only give Cliven Bundy back his cattle but get out of the business of land management altogether, handing control of the West’s lands either to the private sector or to state or local governments that would be easier for the ranchers to influence. It wasn’t a new demand: a similar movement had started up in the central West in the late 1970’s that called itself the “Sagebrush Rebellion,” and when Ronald Reagan campaigned for President in those states in 1980 he proudly announced, “I am a Sagebrush Rebel.” It was one of the first signals Reagan sent that as President he was going to abandon the tradition of Republican environmentalism that had begun with Theodore Roosevelt and continued through the administrations of Richard Nixon and Gerald Ford (Nixon had signed into law the big environmental protection bills of 1969-1970 and appointed dedicated environmentalists like William Ruckelshaus and Russell Train to run the newly formed Environmental Protection Agency). In 2014 the Bundys were seen by the Right in general — both the nascent alt-Right and the quasi-“respectable” Right of media outlets like Fox News — as heroes courageously standing up to government overreach. As Oregon militia leader Brandon Rappola told Frontline, he was moved to come to Bunkerville to defend the Bundys when he saw a video on YouTube of the male Bundys getting tased by BLM agents and their elderly aunt knocked to the ground. “To come in as a militarized force against your citizen like this, that’s when we the people, we say no, this is not what the Constitution stands for. And we have to remind our federal government that we are the power.” Eventually the BLM agents realized they were outnumbered and outgunned, and they retreated; the Bundys got all their cattle back and no one was arrested.

Cliven Bundy instantly became a huge hero to the American Right as a man who had courageously stood up to government oppression — he appeared on Sean Hannity’s show on Fox News and Hannity basically stared at him with gushing admiration — until his public credibility nosedived when he made a widely quoted comment that African-Americans had been better off under slavery than they’ve been since. “I’ve often wondered, are they better off as slaves, picking cotton and having a family life and doing things, or are they better off under government subsidy?” Cliven Bundy said publicly, and in 2014, with an African-American (albeit not one who was descended from American slaves) in the White House, most of the “respectable” Right still considered such expressions of open racism beyond the pale. The Bundys emerged again in 2016, when Cliven’s son Ammon — who compares to his dad much the way recently defeated French presidential candidate Marine Le Pen compares to hers (Le Pen père was openly anti-Semitic; Le Pen fille realized that in order to be a serious player in French politics she needed to downplay her party’s traditional anti-Jewish prejudices and recast the racist message in nationalist terms, much as Donald Trump did in his successful campaign for President of the U.S.) — led a seemingly bizarre occupation of the Malheur National Wildlife Refuge in Oregon. The refuge had originally been established in 1908, when Theodore Roosevelt was President (remember the Republican environmentalist tradition that T.R. established?), and Ammon Bundy and his brother Ryan came to the defense of Dwight Hammond, another rebel rancher, who had been accused and convicted of arson by the federal government. The government accused Hammond and his family of deliberately setting fires on federal land that endangered human life; the Hammonds claimed they had merely set the fires so the land would grow back as pasture. They were given light sentences and had actually been released when the government won an appeal and a judge ordered them back to prison on the ground that the original sentences didn’t meet federal mandatory minimums — and, as Ammon Bundy told Frontline, “This urge just filled my whole body. I felt a divine drive, an urge that said you have to get involved.” The Bundys staged an occupation of Malheur under the organizational name “Citizens for Constitutional Freedom” and, as at the Nevada confrontation, attracted plenty of militia activists and other people who not only had guns but had had military or paramilitary training and therefore knew how to use them.

Not all the militia members went along with the Malheur occupation — they saw themselves as self-defense units, and this looked too much like taking the offensive — but among the people who did come was a rancher from Arizona named LaVoy Finicum, who directly confronted law enforcement officials and challenged them to shoot him. They did. Eventually the Malheur occupation ended and the government arrested Ammon and Ryan Bundy and charged them with conspiracy — but an Oregon jury acquitted them on all counts. What was most striking about the Bundy stories was that the government used the same scorched-earth tactics against them it had previously deployed against Left-wing activists, from the 1960’s and 1970’s through more recent cases like the Occupy movement (to which some of the Malheur occupiers actually compared themselves publicly, even though the Left-wing Occupiers targeted urban areas and had a very different set of demands and issue positions). They infiltrated agents, including one who posed as a filmmaker interviewing the Bundys for a documentary but who was really an FBI agent assigned to get the Bundys to make incriminating statements on camera. What’s more, some of the infiltrators deliberately acted as agents provocateurs, encouraging the militias to do something violent that the government could then use either to indict them or just go out and shoot them. And the government used the conspiracy statutes against the Bundys because one of the marvelous things about conspiracy law, if you’re a government prosecutor trying to suppress a popular political movement of either the Left or the Right, is that in order to prove there was a conspiracy and your defendants were part of it, you do not have to prove that they actually did anything illegal. All you have to establish is that they came together for an illegal purpose and committed one or more “overt acts” in furtherance of that purpose — and the “overt acts” do not necessarily have to be illegal in and of themselves. As the legendary Clarence Darrow explained conspiracy law, “If one boy steals candy, that’s a misdemeanor. If two boys talk about stealing candy but don’t do it, that’s a felony.” I left the Frontline “American Patriot” documentary with oddly mixed feelings, hating the Bundys and loathing their cause but also oddly glad that the government’s underhanded tactics against them ultimately failed.


If the “American Patriot” documentary showcased the alt-Right in the years when it was out of power, the May 23, 2017 Frontline episode, “Bannon’s War,” showed what it looks like when it has a President in office who, if not a committed alt-Rightist (Donald Trump doesn’t appear genuinely committed to much of anything beyond what will make Donald Trump richer and more popular), was certainly comfortable with the movement’s philosophy. Like so many of the members of the American ruling class these days, Steve Bannon served his apprenticeship at Goldman Sachs, which is so powerful in its own right on Wall Street and so influential in Washington, D.C. (Trump is the fourth President in a row who has appointed a Secretary of the Treasury who used to work there) that it sometimes seems as if the federal government has simply outsourced its entire economic policy to Goldman Sachs. But instead of going from Goldman either into government service or the hedge-fund business, Bannon took his career on a different track, heading for Hollywood with the intent of mobilizing conservatives both in finance and in the entertainment industry to make movies that would reflect the Right-wing world view and counter what Bannon and his fellow Right-wing ideologues saw as the propaganda being put out by “liberal Hollywood.” Bannon didn’t get his name on any major dramatic feature films — he claimed to have helped develop the show Seinfeld and to have had a profit participation in it, but other people involved with Seinfeld have disputed that — so he started producing documentaries admittedly influenced, at least stylistically, by Leni Riefenstahl’s famous 1935 Nazi propaganda film Triumph of the Will. His first production was called In the Face of Evil: Reagan’s War in Word and Deed, and it was originally intended as a celebration of the 40th President’s single-handedly winning the Cold War — but the attacks on the World Trade Center and the Pentagon on September 11, 2001 led Bannon to add a coda to the Reagan film before he released it in 2004, saying that the Evil Empire still lived, only now the world-threatening enemy was not Communism but Islam. Bannon hooked up with David Bossie, whose group Citizens United produced documentaries trashing Democratic Presidential candidates John Kerry and Hillary Clinton (the Citizens United U.S. Supreme Court decision that opened the floodgates for corporations and rich individuals to buy U.S. elections was centered around a small corporate contribution to Bossie’s film attacking Clinton just before the 2008 election), and he also discovered a book called The Fourth Turning by authors William Strauss and Neil Howe.

The central argument was that U.S. political and social history moves in “saecula,” periods of about 70 to 80 years, and that the American Revolution, the Civil War and the combination of the Great Depression and World War Two were turning points in the succession of “saecula.” Nation author Micah L. Sifry, in a February 8, 2017 article on Bannon (https://www.thenation.com/article/steve-bannon-wants-to-start-world-war-iii/), summed up the theory as follows: “According to Strauss and Howe, roughly every 80 years—a saeculum, or the average life-span of a person—America goes through a cataclysmic crisis. Marked by savagery and genocide, and lasting a decade or more, this crisis ends with a reset of the social order and its survivors all vowing never to let such a catastrophe happen again. Each of these crises, Strauss and Howe posit, have been formative moments in our nation’s history. The Revolution of 1776–83, followed roughly 80 years later by the Civil War, followed 80 years after that by the Great Depression and World War II.” In 2009 Bannon released a film called Generation Zero that was basically a depiction of the U.S. economic crisis of 2008 in terms of the saeculum theory, though he took it considerably farther than Strauss and Howe had: in a profile of Bannon published in the February 2, 2017 Time (http://time.com/4657665/steve-bannon-donald-trump/), and also in the Frontline program, historian David Kaiser recalled that he had been asked for an interview for Generation Zero, and that when it was filmed Bannon wanted a very specific comment out of him. “He wanted to get me to say on camera that I thought it (the so-called “Fourth Turning,” the fourth saeculum in American history) would occur,” Kaiser recalled. “He wasn’t impolite about it, but the thing I remember him saying: ‘Well, look, you know, we have the American revolution. Then we have the Civil War. That’s bigger. Then we have the Second World War. That’s even bigger. So what’s the next one going to be like?’” As part of his belief that the fourth turning was about to happen in the U.S. — and his determination to use his influence as a filmmaker and activist to bring it about — Bannon looked for a politician who could stage a Presidential campaign on his mix of far-Right nationalism, veiled racism and anti-Islam “clash of civilizations” rhetoric. At first he thought he’d found his ideal candidate in Sarah Palin — he even made a film about her, The Undefeated, which was an attempt to launch her candidacy and propel her to the White House — but Palin quickly lost credibility with the Republican Right after she abruptly resigned as governor of Alaska to become a commentator on Fox News, and instead of “undefeated” the general consensus of the Republican Party about Palin became “quitter.”

However, when Donald Trump made his ferocious entry into Presidential politics in June 2015 by denouncing virtually all Mexican immigrants as “rapists and criminals,” a move that vaulted him to the top of the Republican field overnight and ultimately propelled him to the White House, Bannon went along for the ride and got appointed chief White House strategist when Trump won. As proprietor of Breitbart News, the far-Right news Web site he had taken over from its founder, the late Andrew Breitbart, Bannon had turned the site into so aggressively pro-Trump a propaganda outlet that a number of contributors left in protest; indeed, quite a few of Frontline’s sources about Bannon were people who worked for him at Breitbart and quit in disgust over his making it a site to promote all things Trump at the expense of other Right-wing leaders and causes. Bannon and Stephen Miller, whom he’d met when Miller was an aide to Jeff Sessions, then U.S. Senator and now Attorney General, and recruited to the Trump campaign, drafted the controversial first version of the immigration/refugee/travel ban against individuals from seven majority-Muslim countries and deliberately made sure that no one outside Trump’s inner circle got a look at it before Trump issued it. Indeed, it was largely Bannon’s idea to have Trump start his presidency with a flurry of executive orders to make it clear, as Bannon put it, that there was a “new sheriff in town” (a phrase quite a lot of Trump advisers have used to defend his policies and establish him as a transformational leader who seeks a profound and lasting change in American politics and in how Americans relate to their government), which made the Trump administration in its early days look less like the newly elected government of a democratic republic and more like a cabal of generals in a Third World country who had just grabbed power in a coup d’état and whose leader was ruling by decree. Bannon also not only anticipated but actually welcomed the protests that followed the anti-Muslim ban, figuring that most of America would be repelled by them (as they were by similar street actions in the late 1960’s, which paved the way for the election of Richard Nixon and Ronald Reagan as “law and order” Presidents) and thus that anti-Trump protests — the bigger, more unruly and more violent, the better — would only bolster the administration and make Trump and his policies more popular.

It hasn’t quite worked out that way — Trump’s approval rating in opinion polls has hovered between 38 and 42 percent, showing he’s kept the loyalty of most of the 46 percent of the people who voted for him but he hasn’t really expanded his base — but so far the Democrats have proven unable to mount an electoral resistance to him: Trump got all his Cabinet appointees through the U.S. Senate despite a razor-thin Republican majority, he got his American Health Care Act through the House of Representatives and so far the Republicans are 2-for-2 in the special House elections in Kansas and Montana despite much-ballyhooed Democratic challenges — and as the Frontline documentary points out, reports of Bannon’s demise as a Trump adviser have been greatly exaggerated. It’s true that Bannon took such an outsized role in the early days of the Trump presidency he ran the risk of getting himself fired by challenging Trump’s notorious ego — Trump has made it clear over and over again that there’s no room for anyone in his administration (or his business empire before that) with an independent power base: there’s room for only one prima donna in the Trump world, and that’s Trump — and it’s also true that Trump’s other key adviser, his son-in-law Jared Kushner, has made at least some attempts to move the Trump administration closer to mainstream hard-Right Republicanism and away from Bannon’s messianic vision — but Trump took Bannon and White House chief of staff Reince Priebus on his trip to Saudi Arabia, though he sent them home before the White House entourage reached their next stop, Israel. (Stephen Colbert showed a photo of Bannon with some of the Saudi royal family’s entourage and bitterly joked on his late-night talk show, “These aren’t the people in white robes Bannon usually hangs out with.”) 

At times Bannon seems to be a reincarnation of one of the least acknowledged but most important people in Trump’s history, the New York super-attorney Roy Cohn, who began his career as chief counsel for the notorious Red-baiting U.S. Senator Joe McCarthy (R-Wisconsin) and later masterminded Trump’s rise from small-time real-estate developer in the outer boroughs of New York to major player in the sacred precincts of Manhattan. Just as the cadences of McCarthy’s rhetoric live on in Trump (as well as in Rush Limbaugh, Bill O’Reilly, Sean Hannity, Mark Levin, Roger Hedgecock and the other superstars of the Right-wing media), so do Cohn’s take-no-prisoners style and apocalyptic view of the world live on in Bannon. The Frontline documentary on Bannon ends with Washington Post political editor Robert Costa summing up: “Bannon sees an amazing and probably last in his lifetime opportunity to really have his worldview come to the fore in American politics. He wants to see this out as much as he can, to see what can actually be accomplished with a populist president.” Donald Trump is in no way, shape or form a “populist” — he’s actually the sort of 1880’s politician the original Populists of the 1890’s were railing against, the super-rich man who bought his way into political office and blatantly and unashamedly used it to make himself and his rich friends even richer. The populist-sounding rhetoric he threw out during the campaign was as meaningless as the lies he told people to get them to buy his condos, spend money at his casinos or attend Trump University; as President, Trump has governed as an extreme Right-wing Libertarian whose budget and health care proposals show a determination to end the whole concept of a government safety net and tell individuals that when it comes to retirement or health care, they’re on their own. Still, Costa is right that Bannon has an apocalyptic world view and that his promise to make Trump a transformative President feeds Trump’s insatiable ego and his view of himself as a super-person who alone can fix America’s problems.