"World" usually means a planet. When people say "the world", they usually mean the planet Earth. Humans and animals live in the world. Earth, the third planet from the Sun, is the only planet we know of that has life.

Before people discovered that Earth is a planet, they often used "world" to mean "universe". People still sometimes use "world" to mean all humans or all civilization. Sometimes they mean only a part of Earth, such as the Western world or the Islamic world.