OpenAI Neural Network Minecraft Video PreTraining VPT
Minecraft isn’t a game most would think AI could master, but OpenAI has other plans. They trained a neural network to play Minecraft using Video PreTraining (VPT): the model learns from a large unlabeled dataset of videos of human Minecraft play, supplemented by only a small amount of labeled contractor data.



With fine-tuning, OpenAI says its model can learn to craft diamond tools, a task that usually takes proficient human players over 20 minutes (around 24,000 actions). How? The model uses the native human interface of key presses and mouse movements, which makes it quite general and represents a step toward general computer-using agents. AI in Minecraft is not entirely new, either: Facebook has already teamed up with MIT to create an artificial intelligence agent for the game.



“Perhaps the most important hypothesis of our work is that it is far more effective to use labeled contractor data to train an IDM [inverse dynamics model] (as part of the VPT pipeline) than it is to directly train a BC [behavioral cloning] foundation model from that same small contractor dataset. To validate this hypothesis we train foundation models on increasing amounts of data from 1 to 70,000 hours,” said the researchers.
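The pipeline the researchers describe can be sketched in miniature. The toy code below is not OpenAI's implementation: frames are stand-in feature vectors, the "hidden action rule" and both models are simple least-squares fits, and all names (`make_frames`, `true_actions`, etc.) are invented for illustration. It only shows the flow: a small labeled set trains the IDM, the IDM pseudo-labels a much larger unlabeled corpus, and the BC foundation model trains on those pseudo-labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins (hypothetical names): each "frame" is an 8-dim feature
# vector; the "action" is produced by a hidden linear rule.
DIM = 8
true_w = rng.normal(size=DIM)

def make_frames(n):
    return rng.normal(size=(n, DIM))

def true_actions(frames):
    return frames @ true_w

# 1) Small labeled "contractor" dataset: frames with ground-truth actions.
labeled_frames = make_frames(200)
labeled_actions = true_actions(labeled_frames)

# 2) Train the inverse dynamics model (IDM) on the labeled data.
#    A least-squares fit stands in for the real non-causal network.
idm_w, *_ = np.linalg.lstsq(labeled_frames, labeled_actions, rcond=None)

# 3) Pseudo-label a much larger unlabeled video corpus with the IDM.
unlabeled_frames = make_frames(20_000)
pseudo_actions = unlabeled_frames @ idm_w

# 4) Train the behavioral-cloning (BC) foundation model on pseudo-labels.
bc_w, *_ = np.linalg.lstsq(unlabeled_frames, pseudo_actions, rcond=None)

# In this noiseless toy, the BC model recovers the hidden action rule.
err = float(np.linalg.norm(bc_w - true_w))
print(f"BC weight error: {err:.6f}")
```

The point of the design, which the sketch preserves, is that the expensive labeled data is spent once on the IDM, after which the BC model can scale with cheap unlabeled video.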
