A CPU core is a unit that executes instructions for you. Back in the early days of CPU architecture every processor operated as a single unit. That is to say that each CPU could only handle a single instruction at one time (though remember it could still execute millions or billions of instructions per second) whilst the other instructions had to wait in a queue.
Fast forward to today and a single CPU will now have multiple cores. That means that although you still only have one physical CPU, you have many cores that can all process instructions at the same time. So instead of a single queue of instructions waiting to be executed, we can now have multiple queues of instructions executing at the same time.
This means software can take advantage of this architecture and run multiple processes or threads at the same time. You'll learn about these concepts later on, but from a hardware perspective this simply means there are multiple cores doing work at the same time, speeding up most of the software you'll use. This is referred to as parallelism.
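To make this concrete, here's a minimal sketch of parallelism in Python. The `square` function and the numbers fed to it are just illustrative; the point is that the `multiprocessing` module starts several processes, each of which the operating system can schedule on its own core at the same time.

```python
# A sketch of parallelism: the same function run across several
# worker processes, which the OS can place on different cores.
from multiprocessing import Pool

def square(n):
    # Some CPU work; each call can run on a different core.
    return n * n

if __name__ == "__main__":
    # Four worker processes, so up to four cores can be busy at once.
    with Pool(processes=4) as pool:
        results = pool.map(square, [1, 2, 3, 4])
    print(results)  # [1, 4, 9, 16]
```

On a single-core machine this still works, but the processes take turns on the one core instead of genuinely running in parallel.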
Another thing to consider is that the clock speed we covered earlier applies to each core. So each of the cores inside of a CPU runs at that clock speed.
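The arithmetic here is simple but worth seeing once. Using hypothetical numbers (a 4-core CPU at 3.5 GHz), each core ticks 3.5 billion times per second, so across the whole CPU you get the per-core clock speed multiplied by the core count:

```python
# Hypothetical example: a 4-core CPU where each core runs at 3.5 GHz.
cores = 4
clock_hz = 3_500_000_000  # 3.5 GHz, per core

# Total clock cycles per second across all cores combined.
total_cycles_per_second = cores * clock_hz
print(total_cycles_per_second)  # 14_000_000_000
```

This is a rough measure of raw capacity, not real-world speed; how many useful instructions get done per cycle varies a lot between CPU designs.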
That being said, at the time this was being written Apple and other manufacturers have developed and released their own CPU architectures that offer a mixture of some fast cores, some slow cores, and others that have a very specific purpose. This allows the computer the CPU is installed in to do various clever things to save on battery life, among other things.
So it can get complex quickly. All you need to know is that CPUs run instructions, they run them at a certain clock speed which determines how many instructions per second they operate at, and modern CPUs have multiple cores that all run at those clock speeds.
Absolutely every single CPU you'll ever encounter at this point in time is going to have multiple cores. It's extremely unlikely you'll encounter a single-core processor unless you're working with very old hardware.
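You can check this claim on your own machine. Python's standard library exposes the number of cores the operating system reports (note this is the count of logical cores, which can be higher than the physical core count on CPUs with simultaneous multithreading):

```python
# Ask the OS how many logical CPU cores this machine has.
import os

cores = os.cpu_count()  # returns an int, or None if undetermined
print(f"This machine reports {cores} logical CPU cores")
```

On any reasonably modern computer you should see a number greater than one.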
When you're eventually working in the industry you'll see mention of this multi-core principle. In Amazon Web Services, for example, you'll see that some EC2 Instances (which are virtual machines) have a CPU core count associated with them.
Look at this table below:
Notice how the instances get more cores associated with them? Of course you pay more money for this increase in compute power, but simply put: the more money you spend the more cores you get access to, and the more cores you can access the faster you can run software on that particular EC2 Instance.
We'll cover AWS in more detail much further into the book.
This is essentially the "core count" of the EC2 Instance and it's an important metric to understand. When you're building out servers in AWS you'll want to get the right balance of CPU cores and other features such as RAM and networking. This is called "right sizing". It's something we'll cover later on.