Thanks to operant conditioning, any animal can be trained to do just about anything, as long as the individual actions that make up the larger, complex behavior are within the animal's natural abilities. That includes teaching a dog to "drive" a motorcycle.

B. F. Skinner referred to this process as shaping. In 1953, he described it this way (emphasis added):

We first give the bird food when it turns slightly in the direction of the spot from any part of the cage. This increases the frequency of such behavior. We then withhold reinforcement until a slight movement is made toward the spot. This again alters the general distribution of behavior without producing a new unit. We continue by reinforcing positions successively closer to the spot, then by reinforcing only when the head is moved slightly forward, and finally only when the beak actually makes contact with the spot. … The original probability of the response in its final form is very low; in some cases it may even be zero. In this way we can build complicated operants which would never appear in the repertoire of the organism otherwise. By reinforcing a series of successive approximations, we bring a rare response to a very high probability in a short time. … The total act of turning toward the spot from any point in the box, walking toward it, raising the head, and striking the spot may seem to be a functionally coherent unit of behavior; but it is constructed by a continual process of differential reinforcement from undifferentiated behavior, just as the sculptor shapes his figure from a lump of clay.
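The procedure Skinner describes is essentially a loop: reinforce any response closer to the target than the animal's current tendency, let reinforcement pull that tendency toward the response, and repeat until the once-improbable behavior becomes routine. Here is a toy simulation of that idea (a hypothetical sketch for illustration, not Skinner's actual apparatus or data; the `shape` function and all its parameters are invented):

```python
import random

def shape(target=10.0, start=0.0, trials=2000, seed=42):
    """Toy model of shaping by successive approximation.

    The 'animal' emits behaviors scattered randomly around its
    current tendency. Only a behavior closer to the target than
    that tendency gets 'reinforced', and each reinforcement pulls
    the tendency toward the reinforced behavior, so the criterion
    for reinforcement tightens as performance improves.
    """
    rng = random.Random(seed)
    tendency = start
    for _ in range(trials):
        behavior = rng.gauss(tendency, 1.0)  # natural variation in responding
        if abs(target - behavior) < abs(target - tendency):
            # reinforce the closer approximation
            tendency += 0.3 * (behavior - tendency)
    return tendency
```

At the start, a response at the target itself is vanishingly unlikely, just as Skinner notes; but because each reinforced response resets the criterion, the tendency is walked toward the target step by step.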

You can use operant conditioning in other ways, too.

To learn more about operant conditioning, check out this explainer I wrote last year over at my Scientific American blog, The Thoughtful Animal.