Broad's goal was to apply "deep learning," a branch of machine learning that trains many-layered neural networks and underpins much of modern artificial intelligence, to video; he wanted to discover what kinds of creations a rudimentary form of AI might be able to generate when it was "taught" to understand real video data.
An artificial neural network is a machine-built simulacrum of the functions carried out by the brain and the central nervous system. It's essentially a computational form of artificial intelligence that accomplishes complex tasks the way a nervous system does: its many interconnected units each gather information and pass signals on to the system as a whole.
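As a rough illustration (not tied to Broad's project), here is a toy feedforward network in Python with NumPy. The layer sizes, weights, and inputs are made-up values, chosen only to show how each unit aggregates weighted signals and passes the result onward:

```python
import numpy as np

# Illustrative toy network: one hidden layer, fixed (not learned) weights.
# Each "neuron" gathers weighted inputs, applies a nonlinearity, and
# passes its signal onward -- a crude analogue of the nervous-system
# description above.

def relu(x):
    return np.maximum(0.0, x)

def tiny_network(inputs, w_hidden, w_out):
    """Forward pass: inputs -> hidden layer -> single output."""
    hidden = relu(inputs @ w_hidden)   # each hidden unit aggregates inputs
    return hidden @ w_out              # output unit aggregates hidden signals

inputs = np.array([0.5, -0.2, 0.8])           # three input signals
w_hidden = np.array([[0.1, 0.4],
                     [0.3, -0.2],
                     [-0.5, 0.6]])            # 3 inputs -> 2 hidden units
w_out = np.array([1.0, -1.0])                 # 2 hidden units -> 1 output

print(tiny_network(inputs, w_hidden, w_out))
```

Real networks differ mainly in scale and in that the weights are learned from data rather than written by hand.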
Broad decided to use a type of neural network called a convolutional autoencoder. First, he set up what's called a "learned similarity metric" to help the encoder identify Blade Runner data. The metric had the encoder read data from selected frames of the film, as well as "false" data, or data that's not part of the film. By comparing the data from the film to the "outside" data, the encoder "learned" to recognize the similarities among the pieces of data that were actually from Blade Runner. In other words, it now knew what the film "looked" like.
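The encode-and-reconstruct idea can be sketched in a few lines of NumPy. This is a deliberately simplified linear autoencoder trained on random toy data, not Broad's convolutional model or his learned similarity metric; the data, layer sizes, learning rate, and step count are all illustrative assumptions:

```python
import numpy as np

# Minimal autoencoder sketch: squeeze each toy "frame" through a narrow
# bottleneck, then reconstruct it. Training reduces reconstruction error,
# which is how the network comes to "know" what its training data looks like.

rng = np.random.default_rng(0)

frames = rng.normal(size=(100, 8))            # 100 toy "frames", 8 values each
W_enc = rng.normal(scale=0.1, size=(8, 2))    # encoder: 8 values -> 2 (bottleneck)
W_dec = rng.normal(scale=0.1, size=(2, 8))    # decoder: 2 values -> 8

lr = 0.01
for _ in range(5000):
    z = frames @ W_enc                        # encode to 2-number latent codes
    recon = z @ W_dec                         # decode back to 8 values
    err = recon - frames                      # reconstruction error
    # Gradient descent on mean squared reconstruction error:
    W_dec -= lr * z.T @ err / len(frames)
    W_enc -= lr * frames.T @ (err @ W_dec.T) / len(frames)

mse = np.mean((frames @ W_enc @ W_dec - frames) ** 2)
print(f"reconstruction MSE after training: {mse:.3f}")
```

Broad's system worked on the same principle at far larger scale: film frames in, compressed latent codes through the bottleneck, reconstructed frames out.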
Broad told Vox in an email that the neural network's version of the film was entirely unique, created based on what it "sees" in the original footage. "In essence, you are seeing the film through the neural network. So [the reconstruction] is the system's interpretation of the film (and the other films I put through the models), based on its limited representational 'understanding.'"
originally posted by: AboveBoard
Very interesting!
Though why he'd pick a movie where the AI "replicants" kill their "maker" is beyond me. ???
Why not start it on something with a less, shall we say, dark and dystopian manifestation. I mean, how is the AI supposed to differentiate between reality and the movie??
What the heck??
Anyway...I appreciated your OP!
- AB
In other words, using Blade Runner had a deeply symbolic meaning relative to a project involving artificial recreation. "I felt like the first ever film remade by a neural network had to be Blade Runner," Broad told Vox.
Normally, video encoding is an automated electronic process built on a compression standard designed by humans, who decide what the parameters should be: how much data gets compressed into what format, and how to package different kinds of information such as aspect ratio, sound, and metadata.
Broad wanted to teach an artificial neural network how to achieve this video encoding process on its own, without relying on the human factor.
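A back-of-envelope comparison of what both approaches are trying to do. The frame size and bottleneck width below are hypothetical numbers for illustration, not figures from Broad's project:

```python
# A conventional codec and a learned autoencoder both reduce a frame to
# far fewer numbers; the difference is whether humans or the network
# chose the parameters of that reduction. Illustrative values only.

frame_pixels = 256 * 256 * 3          # raw RGB frame: 196,608 values
latent_size = 200                     # hypothetical autoencoder bottleneck

ratio = frame_pixels / latent_size
print(f"each frame squeezed to {latent_size} numbers "
      f"(~{ratio:.0f}x fewer values than raw)")
```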
originally posted by: Unresponsible
Very dream-like. I noticed that most of the scenes with violence became choppier and more kaleidoscopic...I wonder if that's just to do with the processing of more rapid images?
originally posted by: MystikMushroom
a reply to: nightbringr
It's not being creative, it's rendering what it sees. We are seeing what it "sees" by what it is being shown. It is like looking through the eyes of someone else or an animal.
originally posted by: MystikMushroom
a reply to: Gothmog
It's a non-sentient, non-self aware AI.
Siri is an AI. Cortana is an AI. They're not sentient, but they are both still a rudimentary form of artificial intelligence.