AI learns the secret to making the perfect pizza
Artificial intelligence? Pfff, what am I supposed to do with it? It's good for nothing! That's what many people think, but they may soon be proven wrong: AI is now taking on the role of pizza chef, and it has its sights set on higher things.
There are many great things you can do with AI, but that was not enough for the researchers at MIT. In the PizzaGAN project, they have unleashed artificial intelligence on the Italian classic that conquered the world, a dish that has sparked plenty of controversy over methods and toppings along the way. The aim of the project, entitled "How to make a pizza: Learning a compositional layer-based GAN model", is to "teach a machine how to make a pizza by building a generative model that mirrors this step-by-step procedure". The secret to the perfect pizza? It's in the layers, of course.
Generative Adversarial Networks (GANs) are a class of machine-learning models used for unsupervised learning, in which two neural networks are trained against each other. PizzaGAN is based on a dataset of 9,213 pizza images collected from Instagram. Each picture carries a set of tags describing the toppings, though not the dough, sauce, and cheese. The AI was also fed individual images of the possible toppings. From a photo of a pizza, the artificial intelligence can determine which toppings were used and in which order the layers were applied. However, there are currently no plans to pair the algorithm with a robot and have it bake a real pizza.
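To illustrate the layer-by-layer idea, here is a minimal sketch in Python/PyTorch. It is not the MIT team's code: the module names, the tiny convolutional generator, and the mask-based compositing are placeholder assumptions for illustration. What it demonstrates is the core concept: each topping gets its own "add" operator that predicts where the topping lands (a mask) and what it looks like (an appearance map), and a pizza is built by applying these operators one layer at a time.

```python
# Minimal sketch of layer-wise composition in the spirit of PizzaGAN.
# NOT the authors' implementation: architecture, names, and training
# details are hypothetical assumptions made for illustration only.
import torch
import torch.nn as nn


class AddToppingModule(nn.Module):
    """Toy 'add topping' operator: given the current pizza image, predict
    a soft mask (where the topping goes) and an appearance map (what it
    looks like), then composite the two onto the input image."""

    def __init__(self, channels: int = 3, hidden: int = 16):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.mask_head = nn.Sequential(
            nn.Conv2d(hidden, 1, kernel_size=1), nn.Sigmoid()
        )
        self.appearance_head = nn.Sequential(
            nn.Conv2d(hidden, channels, kernel_size=1), nn.Tanh()
        )

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        features = self.backbone(image)
        mask = self.mask_head(features)              # (B, 1, H, W) in [0, 1]
        appearance = self.appearance_head(features)  # (B, C, H, W)
        # Composite: keep the old pixels where mask ~ 0, paint the
        # topping where mask ~ 1.
        return mask * appearance + (1 - mask) * image


def compose_pizza(base: torch.Tensor, layers: list[nn.Module]) -> torch.Tensor:
    """Apply the 'add' operators one after another - the layer ORDER
    matters, which is exactly what PizzaGAN learns to recover from a
    photo of the finished pizza."""
    image = base
    for add_layer in layers:
        image = add_layer(image)
    return image


if __name__ == "__main__":
    # Stand-in for a plain dough + sauce + cheese image.
    base = torch.rand(1, 3, 64, 64)
    # Two hypothetical toppings, e.g. mushrooms and olives.
    toppings = [AddToppingModule(), AddToppingModule()]
    result = compose_pizza(base, toppings)
    print(result.shape)  # torch.Size([1, 3, 64, 64])
```

In the actual paper, such operators are trained adversarially against a discriminator and paired with corresponding "remove" operators, so the model can both build a pizza up and peel it back one layer at a time.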
And pizza isn't the end of the story; the approach could soon take on other tasks. "Although we have evaluated our model only in the context of pizza, we believe that a similar approach is promising for other types of foods that are naturally layered, such as burgers, sandwiches and salads," the study says. "It will be interesting to see how our model behaves in areas like digital fashion shopping assistants, where a key operation is the virtual combination of different layers of clothing."
Would you like to try a pizza recipe made by AI?
Via: CNET | Source: arXiv (PDF)