Using Artificial Intelligence For Product Placement
Automation In Place For Placement
As automation continues to transform every industry and the way it functions, there is now new software with the ability to achieve product placement in films, just like our own agency.
Our CEO, Stacy Jones, sat down with the CEO of one such software company to discuss exactly how it works! In this blog, Hollywood Branded examines the new opportunity of using artificial intelligence for product placement, drawing on the expertise of Ryff's Roy Taylor.
A Little Background On Roy
Roy Taylor is the founder and CEO of Ryff, a new tech startup designed to change the way we think of and use images. Roy is not only a vocal advocate for immersive technologies including virtual reality, augmented reality, and artificial intelligence; he also has an extensive film background, serves on the board of directors of the British Academy of Film and Television Arts in Los Angeles, and is a technology advisor to three film schools.
Roy sat down to chat with our CEO, Stacy Jones, about how the technology his team has created can change the future of product placement through artificial intelligence and visual computing.
Interview Transcript
Question: Can you tell us a little bit about your background and what got you to where you are doing what you do today?
Answer: I started my career by deciding that I wanted to work doing what was most fun, and in my case that was playing video games. So in 1998 I had my own business; I was working selling semiconductors. And a friend of mine told me about this company called 3dfx that made a hardware accelerator that could make video games look beautiful. I tried it, I thought it was wonderful, and decided that was how I would spend my life. So from there I went to work for a company called Nvidia, and I was the first vice president, the first senior executive for them, working out of my bedroom in 1998. We started out in a very, very modest way and the business grew and grew and grew, and it was a spectacular success in video graphics processors.
Along the way I had to invent a style of business, because you can't buy graphics processors on their own, you have to buy what was called an add-in card. And I invented a new way of doing business to do that. That was so successful that it brought me to America, and I started to roll out this kind of ecosystem business worldwide, working at Nvidia, then in America, until 2010, when I decided that I wanted to move on and do some thinking in a similar field but somewhat different. So I went from working in 3D in video games to working in 3D in film and television. And as I did that, I started to become aware of some of the advantages, and also potential pitfalls, around the placement of actors and objects and scenes in depth.
That led me to learn about 3D and then VR and augmented reality, because they're all related to this depth issue and opportunity. Through that work I got involved with BAFTA, as you mentioned in the introduction, got to work with Jim Chabin and the fine people over at the Advanced Imaging Society, and started working with film schools advising on how to use depth and technology in storytelling. From there I had a wonderful time working for a company called Master Image, which then became RealD, and then worked for another company called AMD and set up a studio for them in Hollywood advising film directors, producers and technologists on how to use technology for future storytelling, until the middle part of last year. Then I set up my own company, Ryff, which is aimed at taking the next step forward in applying technology to storytelling by allowing you to change the pixels on the screen. So that's a very quick history of my career.
Question: Can you give us a little bit of a deeper dive into what Ryff actually does and what it offers to advertisers?
Answer: One of the spurs for the company came because there is a process taking place right now in television whereby large technology companies are offering to scan back catalogs for TV studios. And the reason for that is because a lot of TV companies are finding out that they have catalog inventory of shows going back to the '40s, '50s, '60s, '70s, the last 50 years plus. And they very often don't actually know what's in the content. So they may know the name of the show and a couple of the stars, but they couldn't tell you what was in the story, or who else was there, what co-stars or extras were there.
And so the scans are taking place to try and detect what is in the back catalog. And there are many millions and millions of hours of this. Well, it seems to me there's a certain irony: when you make a frame of film or television, at the time that you are making it you know everything which is in that frame.
If it's real, you have the manifest file. You know exactly where an actress's shirt came from, or a table, or a prop. And if it's digital, then you have all of the metadata. And not only that, but at the time that you're constructing that frame, you also have absolute edit control. You can change or swap anything in or out. Until such time as you push final render and flatten the video, and then everything's lost. All the data is lost. All the edit control is lost. And so a friend of mine asked me, do you believe that there is a way that one day you could broadcast frames of film and television with the elements in the frame not compressed, not flattened, and the compositing would actually take place in front of the viewer on the fly? And my answer was yes, because that is the video game industry.
If it were any other way, you couldn't blow things up or drive vehicles or interact with them. Now the difference, of course, is that video games look like video games, while film and television, even if it's animated, looks beautiful and the quality is that much higher. On the other hand, the video game industry has to do some really complicated math that the film and TV world does not. You have to send data packets in fractions of a second all the way to Korea and back so you can snipe somebody when you're playing online, you have to refresh the screen 90 times a second instead of 30, and it has to do it at full resolution.
It is very, very difficult to do all of that photo-realistically. And yet, Stacy, if I ask you, do you believe processors will continue to become more powerful in the future? The answer of course is yes. Well, if you believe that, then we must ipso facto agree that those two worlds are on a collision course, because with more powerful cloud-rendered processing, we can make the video game infrastructure deliver photo-realistic imagery that is both editable and traceable. We have all of the data and all of the information, so we'll always know exactly what was in it. But more importantly, it allows us to edit it if we so choose. That's the premise of Ryff.
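To make the idea concrete, here is a minimal, purely illustrative sketch in Python of what keeping a frame's elements unflattened might look like. This is not Ryff's actual system; every class, field, and function name below is a hypothetical stand-in for the kind of manifest metadata and edit control Roy describes:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class SceneElement:
    """One editable element in a frame, with its provenance metadata."""
    name: str                      # e.g. "soda_can" (hypothetical example)
    asset_source: str              # where the prop or digital asset came from
    position: Tuple[int, int]      # placement within the frame
    brand: Optional[str] = None    # brand currently attached, if any

@dataclass
class FrameManifest:
    """Everything known about a frame at creation time, kept unflattened."""
    frame_id: int
    elements: List[SceneElement] = field(default_factory=list)

    def swap_brand(self, element_name: str, new_brand: str) -> None:
        """Replace the brand on an element instead of re-shooting or re-editing."""
        for element in self.elements:
            if element.name == element_name:
                element.brand = new_brand

    def remove_element(self, element_name: str) -> None:
        """Take an offending product out of the scene entirely."""
        self.elements = [e for e in self.elements if e.name != element_name]

# Hypothetical usage: the brand swap happens on the manifest, and a cloud GPU
# renderer would composite (flatten) the frame only at view time.
frame = FrameManifest(frame_id=1042, elements=[
    SceneElement("soda_can", asset_source="props_dept", position=(640, 410), brand="BrandA"),
])
frame.swap_brand("soda_can", "BrandB")
frame.remove_element("billboard")  # no-op here; shows removal as well as insertion
```

The point the sketch illustrates is simply that if the manifest and edit control are preserved rather than discarded at final render, a brand can be swapped or removed long after the shoot, with the final compositing deferred to playback.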
Question: How does your company work with filmmakers? Is it from the very beginning, when you're going into production, or is it something where, after the production has finished and they're in final edit or beyond, you can actually go in and manipulate and add layers into the content to provide the photo-realism of the brands within the content?
Answer: We can go on set and do it as the content is being made, or we can go in after the fact: we can scan existing inventories and then allow the insertion of appropriate brands, working with the content owner. So we can support both.
Question: How are you going to be able to work around the creative issues that always pop up within the world of product placement?
Answer: If you just Google rights management, you will find hundreds and hundreds of companies which exist to do just that. But occasionally, and sometimes sadly more than occasionally, there is an issue, and those issues can be extremely stressful, particularly if a very important piece of talent, an actor, is associated with the issue. Even more stressful is that the issue might mean the project is put on hold or, even worse, has to be taken off air or out of distribution.
When I point out to people that literally within seconds that problem can go away or be solved, that we can just take the offending product out of the scene, then I see a huge change in reaction, because we're not just about putting things into the scene. We can also take things out of the scene.
Question: As you project the future of product placement solutions, do you think there won't be traditional product placement being done anymore?
Answer: Well, first of all, I would never be so arrogant as to presuppose that I could walk in on an existing 50-year-old industry and change things like that. That's just not who I am personally, and it's not who we are as a company. In the same way that there are people who will always want to do things in a particular way, there will always be traditional placement. I'm absolutely certain of it, in the same way that Christopher Nolan still wants to shoot a film using traditional film, 70 millimeter and so on. Traditional placement will always be around, and we will be there for those times when it's appropriate and for those films and TV shows where it's helpful. There may still be times when it isn't, and there will still be some people who will say, "I just don't want your technology in this particular production."
Question: How do you then protect all of that content that has been shot, and that will be shot, where brands have run promotional campaigns knowing that they had exclusivity in the film?
Answer: You only have to add a single paragraph or two to the contract and that's taken care of. When you say that, as you can imagine, in your mind's eye you're thinking about a major film, a James Bond film for example, with a multimillion-dollar deal. We probably would never get involved in the middle of that.
The vast bulk of television and film which is consumed is not what happens in the first 90-day window of a major blockbuster movie. It is when the movie is five years old; it is in very popular TV shows long after they air. And why shouldn't the content owners benefit? They spend so much time and so much effort producing some of the greatest stories and experiences that we ever get to experience.
Question: Where do you see this technology going?
Answer: The future's very exciting. Our roadmap is predicated on cloud-based GPUs, graphics processing units, and cloud-based rendering. The number of GPUs in the cloud today is sufficient to do the kinds of things that we're looking at. In the future the number of those processors will grow exponentially, and the power of those processors will grow exponentially. As they grow, so will our capability, and we'll be able to change more of what you see on the screen and do more interesting things with the content. But that's for the future.
Want to learn more about Ryff and how it works? Take a listen to the full interview in our podcast!
Next Steps
Want to learn more about the ins and outs of product placement in film and television? Check out some of the other blogs we have written on these topics!
- Virtual Reality – The Future Of Product Placement?
- Technology Product Placement And Mistakes To Avoid
- Branded Content, Product Placement And Mistakes To Avoid
- Product Placement Trade Out Opportunities in Hollywood TV and Film
- New Study Proves Effectiveness Of Brand Marketing In Virtual Reality
Want access to even more insights from industry pros and their experience in brand marketing? Subscribe to our Marketing Mistakes podcast and listen to every episode!