Lights, Camera, Curate: Understanding the History and Preservation of the Film Industry
From silent films to streaming services, cinema has evolved throughout the past 150 years and continues to shape American culture and technology.
In June 1878, railroad tycoon and former governor of California Leland Stanford wanted to know whether a running horse ever has all four legs off the ground at once. To settle the question, he hired photographer Eadweard Muybridge. Knowing the cameras of the day were too slow to catch such rapid action, Muybridge devised a system of tripwires connected to a row of cameras along the track that fired fractions of a second apart as the horse ran past, a photographic first in capturing quick successions of movement. (And yes, all four legs do leave the ground for a brief moment.)

Eadweard Muybridge’s “Apparatus for Photographing Objects in Motion.” (Courtesy National Museum of American History)

Cyanotype. Photograph by Eadweard Muybridge, 1880s, Series Neg. No. 9.1476, Animal Locomotion Serial No. 628, “Gallop, Bay horse, ‘Daisy’.” PG.003856.0758.
Less than 20 years later, the Lumière brothers held the first public film screening, kickstarting an industry that captured the world and continues to blend technological innovation with human imagination. Today, the Smithsonian Institution is home to a vast collection of cameras, props, costumes, scripts and much more, representing the evolution of film and its lasting impact both nationally and internationally.
The Lumières’ first cinematographe (now located in the National Museum of American History) merged the capabilities of a camera, film processor and projector, laying the groundwork for movies as a source of public entertainment. A wave of competing cameras and exhibition methods quickly followed, transforming motion pictures from a solitary experience into one that could be easily enjoyed by large audiences.

Auguste and Louis Lumière crafted the first cinematographe in 1895, projecting sprocket-wound film onto a screen at 16 frames per second (Smithsonian National Museum of American History).
Edison’s vitascope, currently on display in NMAH’s Entertainment Nation exhibition, was the most notable successor to the cinematographe, adapting his earlier single-viewer peephole kinetoscope so that multiple people could watch a film together. After debuting in Manhattan in 1896, the projector enthralled vaudeville theaters across the nation and deepened Edison’s involvement in cinema. He eventually founded the Motion Picture Patents Company, a monopolistic trust that made him a controversial figure and served as a harbinger of today’s Hollywood.
As the technology evolved, so did the storytelling. The Vitaphone made synchronized sound possible, aligning recorded audio with the action on screen, beginning with “Don Juan” in 1926. The 1932 Technicolor camera made Dorothy’s ruby slippers sparkle when “The Wizard of Oz” was released seven years later, and the film splicer held by the National Museum of the American Indian represents the editing craft that built the tension of sequences like the Odessa Steps scene in “Battleship Potemkin.”
These films gripped audiences and led to Hollywood’s Golden Age, which began toward the end of the 1920s. The novelty of sound, color and highly intricate editing drew tens of millions of people to theaters every year, propelling cinema into a shared cultural experience that offered an escape from the hardships of daily life. In the face of the Great Depression, glitz and glamour were a welcome reprieve.
This was also an era that minted movie stars and perfected a wide range of genres. The Big Five studios (Metro-Goldwyn-Mayer, Paramount, Fox, Warner Bros. and RKO Pictures) were fully vertically integrated, owning everything from production to exhibition. That made films cheaper to produce and let the studios invest more in the stars who all but guaranteed box office success. Undeniable charisma and carefully constructed public personas turned actors into icons, captivating the nation on and off screen with legacies that live on today, in part at the Smithsonian.

A seductive accessory fit for a sex symbol, these kidskin evening gloves were worn by actress Marilyn Monroe. (Courtesy National Museum of American History)
Marilyn Monroe, who signed her first film contract in 1946, catapulted to fame as the quintessential “blonde bombshell.” Perhaps nothing epitomized her coquettish image better than her kidskin evening gloves, symbols of how the studio system cultivated the mystique and allure of its stars. Similarly, Katharine Hepburn fashioned herself into a cultural icon of sharp wit and steely elegance, and the American Film Institute later named her the greatest female star of classic Hollywood. It is not surprising that no actor or actress has ever managed to break her record-setting four Best Actress Oscar wins, all of which are on display at the National Portrait Gallery.
Ultimately, antitrust rulings, censorship and blacklists brought the Golden Age to an end but gave rise to New Hollywood in the 1960s, a period known for auteur directors who infused mainstream cinema with a boldly unconventional spirit. Global new wave movements, led by creatives such as Agnès Varda and Federico Fellini, challenged the limits of cinematic innovation and greatly influenced American directors including Martin Scorsese and Francis Ford Coppola.
Stanley Kubrick pioneered groundbreaking special effects in films such as “2001: A Space Odyssey,” exemplified by the Stargate sequence, which used slit-scan photography and optical printing to craft an immersive, surreal experience. His correspondence with Arthur C. Clarke, the science fiction author behind the story, now resides in the National Air and Space Museum’s collections and offers a fascinating glimpse into the creative and collaborative process that made for such a “subconscious, myth-making” production, as Kubrick wrote.
Counterculture influences, combined with the relative freedom of lighter new 35mm cameras and a weakening Production Code, emboldened many film school-educated directors to pursue their edgier, more experimental concepts to the fullest extent, stretching the bounds of cinematic narrative and structure. That lasted, at least, until the commercial success of George Lucas and Steven Spielberg initiated yet another paradigm shift, this time toward blockbuster epics.
Though neither “Jaws” nor “Star Wars” was predicted to be a big hit, they kickstarted the careers of two of the biggest directors of all time, along with a public appetite for larger-than-life entertainment. These films and all they inspired, including “Indiana Jones” and “Jurassic Park,” ushered in an era in which spectacle, technological innovation and franchise storytelling became the recipe for box office success, reshaping not only what appeared on screen but the entire business model driving the industry.

On May 25, 2007, in Los Angeles, California, the Postal Service issued the 41-cent Star Wars commemorative stamps in fifteen designs. Terrence McCaffrey and William J. Gicker, Jr., of the US Postal Service designed the stamps. This stamp features Han Solo’s Millennium Falcon. (Courtesy National Postal Museum)
The turn of the century ushered in even more changes. “Star Wars Episode I: The Phantom Menace” was the first major film to incorporate a significant amount of digital footage, and the 2002 “Star Wars Episode II: Attack of the Clones” was the first to be shot entirely on digital cameras. This unprecedented shift streamlined production and fundamentally reshaped how stories were told, who could tell them and how audiences experienced them, reducing costs for independent artists and opening the door to nearly any idea or style.
Since the present-day cinema landscape would be virtually unrecognizable without this era, artifacts ranging from the “Raiders of the Lost Ark” screenplay to an R2-D2 postal collections box are preserved across museums as symbols of the era’s cultural reach. After all, these epics weren’t merely a few hours of thrilling entertainment. They permeated everyday life, embedding themselves into American popular culture at the intersection of technology, arts, and culture.

“Raiders of the Lost Ark” sprang from a story by George Lucas and was directed by Steven Spielberg, bringing together two of the most inventive minds in film to launch one of cinema’s most successful franchises (Smithsonian National Museum of American History).
The 2010s brought an emphasis on cultural storytelling, using technological advancements like immersive computer-generated imagery to create rich narratives that resonated with traditionally underrepresented communities and global audiences alike. Marvel, for instance, blended cutting-edge visual effects with powerful representation in 2018’s “Black Panther,” and the costume worn by Chadwick Boseman is now housed in the National Museum of African American History and Culture to honor that convergence. With more accessible production technology and streaming services carrying diversified content, there are fewer barriers to who can create a film that authentically expresses and amplifies new perspectives.
Recent shifts in consumer habits and a coinciding decline of the major studios have left the future of the film industry deeply uncertain. Perhaps streaming services mean moviegoing will once again become an individual viewing experience, or maybe the rise of indie creators will push the creative bounds of the medium.
But if history has proven one thing, it’s that the role of cinema in shaping culture and society will endure. As the lines between film, television and digital media continue to blur, visual storytelling will expand beyond traditional formats and invite audiences to engage with stories in entirely new ways. And the Smithsonian’s collection will expand with it.
Lexi Critchett is an intern in the central Office of Public Affairs. She is currently studying photojournalism at George Washington University in Washington, D.C.
Posted: 30 April 2025
Categories:
American History Museum, Art and Design, Feature Stories, History and Culture