
AI-Driven Personalization in OOH Advertising

James Thompson


In an era of infinite newsfeeds and algorithmically curated content, out-of-home (OOH) advertising has held onto one great advantage: it is unskippable. Now artificial intelligence is adding something the medium has traditionally lacked—the ability to respond to who is nearby and what is happening in the moment, bringing a new level of personalization to the most public of media.

At the heart of AI-driven personalization in OOH is data. Sensors, cameras, mobile signals and connected cars all generate a constant flow of information about audiences and environments. AI systems ingest this data to infer attributes such as approximate age and gender, traffic density, time of day, weather, nearby events and even the kinds of vehicles at an EV charger. Instead of one static creative running for weeks, a digital screen can now choose from dozens of versions in real time, matching content to the people and context directly in front of it.
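In practice, “choosing from dozens of versions” often reduces to rule-based selection over live signals. A minimal sketch in Python, with every signal name, threshold and creative label invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Snapshot of the live signals a screen might receive (fields hypothetical)."""
    hour: int      # local hour, 0-23
    temp_c: float  # current temperature in °C
    weather: str   # e.g. "clear", "rain"

def pick_creative(ctx: Context) -> str:
    """Select one of several pre-approved creatives for the current context."""
    if ctx.weather == "rain":
        return "rainy_day_variant"
    if ctx.hour < 11:
        return "morning_variant"
    if ctx.temp_c >= 25:
        return "hot_weather_variant"
    return "default_variant"

# A 9 a.m. slot on a clear, cool day gets the morning creative
print(pick_creative(Context(hour=9, temp_c=12.0, weather="clear")))
```

Real deployments layer many more signals and pre-approved variants onto the same basic shape, but the core remains a mapping from context to creative.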

Some of the most advanced executions look almost “sentient.” In major hubs from London’s Piccadilly Circus to Tokyo’s Shibuya district, billboards use AI to analyze crowd demographics and adjust messaging accordingly, serving different creative to younger evening commuters than to families at the weekend. Cameras and computer vision identify broad audience profiles, while algorithms decide which of many pre-approved ads is most likely to resonate with that mix. The effect is a public screen that appears to understand its audience without ever becoming one-to-one addressable media.

Weather- and context-triggered campaigns hint at how subtle this personalization can be. McDonald’s has used AI and live data feeds across London to switch creative based on time and temperature—pushing breakfast items in the morning, then pivoting to ice cream when the afternoon heats up. A McDonald’s frozen drinks campaign in the UK only appeared when the temperature hit 22°C, and added city names and live temperature once it crossed 25°C, turning a simple DOOH buy into timely, location-aware messaging. The content feels less like an ad and more like a real-time service.
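The threshold logic of the frozen-drinks campaign can be pictured as a small rule, assuming the 22°C and 25°C triggers described above; the function name and copy strings are illustrative:

```python
from typing import Optional

def frozen_drinks_creative(temp_c: float, city: str) -> Optional[str]:
    """Return the creative to show, or None to skip the ad slot entirely.
    Thresholds follow the campaign described above; copy is illustrative."""
    if temp_c < 22:
        return None  # below the 22°C activation threshold: don't run the ad
    if temp_c >= 25:
        # above 25°C: localize with city name and live temperature
        return f"It's {temp_c:.0f}°C in {city}, time for a frozen drink"
    return "Cool down with a frozen drink"
```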

Mobile OOH is undergoing a similar transformation. AI-enabled “smart” trucks can now generate thousands of localized messages as they move through a city, pulling in signals such as neighborhood, weather and traffic to update copy automatically. A recent campaign for PODS in New York used a generative AI platform to create more than 6,000 unique headlines in 29 hours, each tailored to the specific neighborhood the truck was driving through and conditions like subway delays or temperature. The billboard became part of the urban fabric, speaking the language of each block it passed.
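The generative platform behind such campaigns is proprietary, but a toy template-based sketch gives the flavor of headline localization; all copy, thresholds and signal names here are invented:

```python
def localized_headline(neighborhood: str, temp_c: float, subway_delayed: bool) -> str:
    """Assemble a headline from live local signals.
    All copy, thresholds and signal names are invented for illustration."""
    if subway_delayed:
        return f"Trains stuck near {neighborhood}? At least your move won't be."
    if temp_c >= 30:
        return f"Too hot to haul boxes in {neighborhood}. Let us do it."
    return f"Moving in {neighborhood}? We've got you."
```

A generative model replaces the fixed templates with freshly written copy, but the inputs are the same kind of neighborhood-level signals.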

Personalization is also extending to the types of products and messages served in micro-environments. Kia’s campaign at EV charging stations used vehicle-recognition technology to identify whether a Kia or a competitor’s model was plugged in, then tailored creative accordingly: feature-led conquest messaging for non-Kia drivers, and affirmation-plus-upgrade prompts for existing owners. When no vehicle was charging, evergreen creative played instead. The AI-driven targeting contributed to a significant lift in brand awareness and consideration, along with an 8% sales increase for the EV9. The same principle can apply across petrol stations, malls and transit hubs, wherever a screen can understand context.
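The decision rules described above can be sketched as a simple dispatch on the recognized vehicle; the model strings and creative bucket names are hypothetical, not Kia’s actual taxonomy:

```python
from typing import Optional

def charger_creative(detected_model: Optional[str]) -> str:
    """Map a vehicle-recognition result to a creative bucket.
    Model strings and bucket names are hypothetical illustrations."""
    if detected_model is None:
        return "evergreen"                  # no vehicle charging
    if detected_model.lower().startswith("kia"):
        return "owner_affirmation_upgrade"  # existing owners: affirm + upgrade
    return "conquest_feature_led"           # competitor drivers: feature-led pitch
```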

Behind these high-profile cases is a broader structural shift in how OOH is planned and traded. Programmatic DOOH platforms now combine AI with historical campaign data, geospatial information, mobility insights and advertiser first-party data to optimize placements and creative rotation. Algorithms learn which locations, times and messages drive store visits or online conversions, then adjust future buys and creative weighting automatically. For OOH buyers, this promises a more accountable channel, with optimization loops that look increasingly like those in online display—without losing the brand-building power of a large-format canvas.
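One way to picture that optimization loop is an epsilon-greedy, bandit-style rotation in which creatives that drive more conversions earn more plays. This is a toy sketch of the general idea, not any platform’s actual algorithm:

```python
import random

# Hypothetical per-creative stats: plays and attributed conversions
stats = {"variant_a": {"plays": 0, "conv": 0},
         "variant_b": {"plays": 0, "conv": 0}}

def record(creative: str, converted: bool) -> None:
    """Log a play and whether it was later attributed to a conversion."""
    stats[creative]["plays"] += 1
    stats[creative]["conv"] += int(converted)

def choose(epsilon: float = 0.1) -> str:
    """Epsilon-greedy rotation: mostly play the best performer, sometimes explore."""
    if random.random() < epsilon or all(s["plays"] == 0 for s in stats.values()):
        return random.choice(list(stats))
    return max(stats, key=lambda k: stats[k]["conv"] / max(stats[k]["plays"], 1))
```

Production systems add geospatial and daypart features, attribution windows and budget pacing on top, but the learn-then-reweight loop is the same.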

The creative process itself is being reshaped. AI-powered attention models can generate heat maps showing which parts of an OOH design attract the most gaze, enabling agencies to test and refine layouts before they go live. Generative AI tools help produce multiple versions of a concept for different audience segments, dayparts or weather scenarios, dramatically reducing the marginal cost of personalization. In practice, the “personalized” OOH experience is usually a smart selection between many pre-approved variants rather than a fully bespoke ad for each passerby, but the perceptual effect can be similar.

All of this, however, exists against a backdrop of rising scrutiny on privacy and ethics. While some AI OOH systems rely purely on contextual data like weather, traffic and location, others use cameras and facial analytics to infer demographics. Regulators and civil society are increasingly asking what is being captured, for how long, and whether individuals can be identified or tracked across locations. The most sustainable approaches treat OOH audiences as anonymous aggregates, avoid storing personally identifiable information, and are transparent with landlords and municipalities about the technologies in use.

For media owners and brands, the opportunity lies in balancing intelligence with restraint. AI can ensure that a running-shoe ad switches from “hit the trail” on a sunny afternoon to “stay dry, keep going” when rain is imminent, making the message feel naturally attuned to its surroundings. It can help a digital street furniture network nudge commuters toward hot coffee on a cold morning and iced drinks when a heatwave hits, without crossing the line into invasive surveillance. When done well, personalization in OOH is less about knowing who you are and more about knowing what you need—right here, right now.

As AI capabilities spread from flagship city centers into secondary markets, personalization is likely to become a standard expectation in digital OOH rather than a novelty. Screens will no longer be dumb light boxes but responsive surfaces, orchestrated by models that continuously learn from how people move, gather and respond. In a media landscape shaped by algorithms, the most public medium is quietly, and intelligently, catching up.