On the first day of WebExpo 2025, I listened to a talk by Tejas Kumar titled “From GenAI to GenUI – Codify your UI on the fly”. In his live demo he walked through the history of adding AI to web pages, from 2022 to 2025 and beyond.
Starting with the basics, he showed how AI chatbots were built back when ChatGPT was still new and people didn’t yet know how to best use it. Interfaces were simple: just a text field to type a query into and a search button. Afterwards, one would have to wait (in the demo it was about 15 seconds) until the answer arrived. As we all know, waiting, especially if it’s longer than ten seconds, sucks.
To combat this he implemented streaming: instead of waiting for the whole message before it is displayed, small parts of the AI’s reply are shown as they arrive, which makes the user experience much better. In addition, he parsed the response to display different objects in a list, making not only the wait feel shorter but also the readability better. But wait, there is more! If text can be streamed, so can HTML or CSS, since they are just streams of text that your browser turns into rendered pages.
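To make the streaming idea concrete, here is a minimal sketch of the pattern. It assumes the OpenAI Node SDK and a placeholder model name; it is not the exact stack from the demo, just the general idea of printing chunks as soon as they arrive instead of waiting for the full reply.

```typescript
// Minimal streaming sketch, assuming the OpenAI Node SDK (model name is a placeholder).
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function streamAnswer(question: string): Promise<void> {
  const stream = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: question }],
    stream: true, // deliver the reply in small chunks instead of one blob
  });

  // Print each chunk as it arrives, so the user sees the answer building up
  // instead of staring at a spinner for 15 seconds.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

streamAnswer("Recommend three movies with a strong female lead.");
```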
He proceeded to show how asking an AI to display a list of movies with a strong female lead could change by adding generative UI. Instead of displaying just a list of movies, the AI could show Netflix-like panels that are interactive and take you directly to a page about the movie. The AI could even embed trailers directly into the chat instead of just providing a link. Lastly, he asked the AI to show him where he could watch the movie; the AI asked for his location via a popup and then embedded a map with the correct route right into the chat. Amazing! Additionally, all of the generated UI can be built by designers, which adds a layer of control over what the AI actually produces, since it could otherwise also generate bad things.
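One rough way to think about generative UI under the hood: ask the model for structured JSON instead of prose, and pour that data into components a designer has already built. The sketch below assumes the OpenAI Node SDK; the JSON shape and the movie-panel markup are my own stand-ins, not the components from the talk.

```typescript
// Sketch: structured output rendered into designer-controlled panels instead of plain text.
import OpenAI from "openai";

type Movie = { title: string; year: number; watchUrl: string; trailerUrl: string };

const openai = new OpenAI();

async function moviePanels(query: string): Promise<string> {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    response_format: { type: "json_object" }, // force machine-readable output
    messages: [
      {
        role: "system",
        content:
          'Reply with JSON: {"movies": [{"title", "year", "watchUrl", "trailerUrl"}]}',
      },
      { role: "user", content: query },
    ],
  });

  const data = JSON.parse(completion.choices[0].message.content ?? "{}");
  const movies: Movie[] = data.movies ?? [];

  // The model only fills in data; the markup itself is fixed and designer-approved,
  // which is what keeps the generated UI under control.
  return movies
    .map(
      (m) => `
      <article class="movie-panel">
        <a href="${m.watchUrl}"><h3>${m.title} (${m.year})</h3></a>
        <iframe src="${m.trailerUrl}" title="Trailer for ${m.title}"></iframe>
      </article>`
    )
    .join("\n");
}
```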
He went on to demo how he gave tools to an AI that would fetch information from an API (in this case an API containing all WebExpo talks), understand it and interact with it. “You don’t have to browse the web, it comes to you.” Using this, he can now ask the AI about the schedule of the conference, not by searching for specific things but by asking the AI about certain topics. Lastly, he connected his own Google Calendar to the AI model, enabling it to understand his calendar and even add events. This way he could tell the AI to add an event to his calendar at the time of his friend’s talk at WebExpo, and it did. It even provided additional information about the talk.
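Tool use like this is typically plain function calling: you describe a function to the model, it decides when to call it, and you feed the result back. Here is a hedged sketch, again assuming the OpenAI Node SDK; the schedule URL is a made-up placeholder, not the real WebExpo API from the talk.

```typescript
// Sketch of giving the model one tool: a function that fetches the conference programme.
import OpenAI from "openai";

const openai = new OpenAI();

async function getSchedule(): Promise<string> {
  const res = await fetch("https://example.com/webexpo/talks.json"); // placeholder endpoint
  return res.text();
}

async function askAboutTalks(question: string): Promise<string> {
  const messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[] = [
    { role: "user", content: question },
  ];

  const first = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages,
    tools: [
      {
        type: "function",
        function: {
          name: "get_schedule",
          description: "Returns the full WebExpo talk schedule as JSON",
          parameters: { type: "object", properties: {} },
        },
      },
    ],
  });

  const call = first.choices[0].message.tool_calls?.[0];
  if (!call) return first.choices[0].message.content ?? "";

  // The model asked for the schedule: run the tool and hand the result back
  // so it can answer the original question with real data.
  messages.push(first.choices[0].message);
  messages.push({ role: "tool", tool_call_id: call.id, content: await getSchedule() });

  const second = await openai.chat.completions.create({ model: "gpt-4o-mini", messages });
  return second.choices[0].message.content ?? "";
}

askAboutTalks("Which WebExpo talks cover generative UI?").then(console.log);
```

The Google Calendar integration follows the same pattern, just with tools that read and create calendar events instead of fetching the schedule.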
If this got you interested, here is a recording of the talk:
(Use the slider to adjust the size of the video/screen recording.)
Also, here are all the other talks from this year’s WebExpo:
https://slideslive.com/webexpo