I have two teenage daughters who are both getting ready to drive and are looking for cars. When I got my first car, my dad believed it was imperative to teach me not only how to get under the hood and handle basic maintenance, but also how to change a tire and even change the oil. I plan to teach my daughters these same skills. I think it's important for them to know how to take care of a car if they're going to drive, and I think we sometimes fall into the stereotypical idea that only boys should learn these things.