THE PRINCIPLES OF LEARNING AND BEHAVIOR 7TH EDITION MICHAEL DOMJAN – TEST BANK

Complete Test Bank With Answers

Sample Questions Posted Below
1. Your dog is sitting quietly in the front yard when an intruder approaches. As you would hope, the dog begins to bark vigorously and the intruder runs away. The dog’s barking is an example of
a. elicited behavior.
b.
c. goal-directed behavior.
d. It cannot be determined with the given information.

 

ANSWER:           d

REFERENCES:  Page 120

KEYWORDS:      Concept

 

2. The cats in Thorndike’s puzzle boxes were able to escape more quickly over successive trials. Thorndike interpreted this performance change to reflect
a. stimulus-outcome learning.
b. stimulus-stimulus learning.
c. stimulus-response learning.
d. response-outcome learning.

 

ANSWER:           c

REFERENCES:  Pages 121-122

KEYWORDS:      Fact

 

3. Your sister’s hamster keeps escaping from its cage. On the first day, it took the hamster 14 hours to escape, but by the second week, the hamster could only be confined for 30 minutes before it worked its way to freedom. According to Thorndike’s theory,
a. the stimulus of the cage has become associated with the desire to be free.
b. the stimulus of the cage has become associated with the jumping necessary to gain freedom.
c. the hamster jumps in order to gain freedom.
d. the hamster “operates” on its environment to gain freedom.

 

ANSWER:           c

REFERENCES:  Page 122

KEYWORDS:      Concept

 

4. According to the law of effect, which of the following elements is not a component of the conditioned association?
a. stimulus
b. outcome
c. response
d. All of the above are involved.

 

ANSWER:           b

REFERENCES:  Page 122

KEYWORDS:      Fact

 

 

 

5. Which of the following would most likely be used in a discrete trial procedure?
a. licking water from a tube to gain access to food
b. pressing a lever to gain access to food
c. running down a runway to gain access to food
d. pushing a rod to gain access to food

 

ANSWER:           c

REFERENCES:  Pages 123-124

KEYWORDS:      Concept

 

6. In a discrete trial procedure, the researcher can measure all of the following except
a. response rate.
b. running speed.
c. food
d. latency to leave the start box.

 

ANSWER:           a

REFERENCES:  Pages 123-124

KEYWORDS:      Fact

 

7. Which of the following is typical of a discrete trial procedure?
a. A hungry rat makes a choice between plain food and food enhanced with a sweetener in a T-maze.
b. A monkey pushes a lever to watch an electric train.
c. A thirsty pigeon pecks a key to gain access to water.
d. A hungry rat moves a rod to earn a food pellet.

 

ANSWER:           a

REFERENCES:  Pages 123-124

KEYWORDS:      Fact

 

8. Which of the following is true of an operant response?
a. Response speed determines
b. Pushing a lever with a paw and pushing a lever with the snout are equivalent.
c. Licking a water spout and pushing a response lever are equivalent.
d. Licking a water spout and chewing a food pellet are equivalent.

 

ANSWER:           b

REFERENCES:  Page 126

KEYWORDS:      Concept

 

 

 

9. Magazine training involves which of the following?
a. reinforcement of successive approximations
b. non-reinforcement of earlier response forms
c. classical conditioning
d. All of the above

 

ANSWER:           c

REFERENCES:  Pages 126-127

KEYWORDS:      Fact

 

10. The frog jumping contest is fast approaching. Your jumper has a maximum leap of 5 feet, far less than needed for a win. In order to train your frog to jump farther, you should begin by giving it a fly when it jumps ______.
a. any distance
b. 4’11”
c. 5’1”
d. 5’

 

ANSWER:           d

REFERENCES:  Pages 127-128

KEYWORDS:      Concept

 

11. Shaping depends on which of the following?
a. the variability of behavior
b. nonreinforcement of the target response
c. continued reinforcement of early response forms
d. delivering the reinforcer only for responses that exceed any previous response

 

ANSWER:           a

REFERENCES:  Pages 126-128

KEYWORDS:      Fact

 

12. Pigeons have a baseline gape response of 10-15 mm. In order to shape the gape response for a wider opening, the first reinforcers should be delivered when the pigeon opens its mouth ______.
a. every time
b. 10 mm
c. 15 mm
d. 16 mm

 

ANSWER:           c

REFERENCES:  Pages 126-128

KEYWORDS:      Concept

 

 

 

13. When shaping the behavior of an organism, you must
a. reinforce all responses.
b. set each criterion so that at least some of the existing responses are reinforced.
c. set each criterion so that only the response forms that exceed existing responses are reinforced.
d. set each criterion so that most of the existing responses are reinforced.

 

ANSWER:           b

REFERENCES:  Pages 126-128

KEYWORDS:      Fact

 

14. The major advantage of free-operant methods over discrete trial procedures is that
a. the animals learn more.
b. free-operant methods provide the opportunity to observe changes in the likelihood of behavior over time.
c. free-operant methods can reveal an animal’s preferences.
d. free-operant methods involve S-S learning, but discrete trial procedures involve S-R learning.

 

ANSWER:           b

REFERENCES:  Pages 124-128

KEYWORDS:      Concept

 

15. Which of the following is an example of positive reinforcement?
a. receiving a time-out instead of a spanking
b. turning off the radio when the DJ plays a song you dislike
c. going out to dinner after winning an award
d. faking illness to avoid school in the morning

 

ANSWER:           c

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

16. A positive contingency between a response and an appetitive stimulus is also known as
a. positive reinforcement.
b. negative reinforcement.
c. punishment.
d. omission training.

 

ANSWER:           a

REFERENCES:  Pages 129-130

KEYWORDS:      Fact

 

 

 

17. Which of the following is an example of (positive) punishment?
a. Dora is sent to her room without dessert because of her poor
b. Steve has his mouth washed out with soap for swearing.
c. Bobby is not allowed to buy cigarettes because he is too young.
d. All of the above

 

ANSWER:           b

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

OTHER:              WWW

 

18. A positive contingency between a response and an aversive stimulus is also known as
a. positive reinforcement.
b. negative reinforcement.
c. punishment.
d. omission training.

 

ANSWER:           c

REFERENCES:  Pages 129-130

KEYWORDS:      Fact

 

19. Which of the following is an example of negative reinforcement?
a. Stella changes the oil in her car to avoid engine problems.
b. Mark hits his little brother because the brother broke Mark’s bike.
c. Suzie cries after losing the card game.
d. Ed gets a gold star because he didn’t act out.

 

ANSWER:           a

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

20. A negative contingency between a response and an aversive stimulus is also known as
a. positive reinforcement.
b. negative reinforcement.
c. punishment.
d. omission training.

 

ANSWER:           b

REFERENCES:  Pages 129-130

KEYWORDS:      Fact

 

 

 

21. Which of the following is an example of omission training?
a. Wanda cannot play with her friends because she was out too late yesterday.
b. Robert takes out the garbage to stop his roommate’s nagging.
c. Billy sleeps late to avoid taking his history exam.
d. Ellie stops crying when she gets a lollipop.

 

ANSWER:           a

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

22. A negative contingency between a response and an appetitive stimulus is also known as
a. positive reinforcement.
b. negative reinforcement.
c. punishment.
d. omission training.

 

ANSWER:           d

REFERENCES:  Pages 129-130

KEYWORDS:      Fact

 

23. Lyle leaves the theater because the music in the show he is watching is too loud. This is an example of
a. positive reinforcement.
b. negative reinforcement.
c. punishment.
d. omission training.

 

ANSWER:           b

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

24. Brenda steals Kelly’s car because Kelly went to Europe without her. This is an example of
a. positive reinforcement.
b. negative reinforcement.
c. punishment.
d. omission training.

 

ANSWER:           c

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

 

 

25. The difference between (positive) punishment and negative reinforcement is that
a. punishment increases the target response, while negative reinforcement decreases the target response.
b. punishment decreases the target response, while negative reinforcement increases the target response.
c. in punishment, the target response terminates the aversive stimulus.
d. in negative reinforcement, the response increases the likelihood of the aversive stimulus.

 

ANSWER:           b

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

26. Differential reinforcement of other behavior (DRO) is a type of
a.
b. omission training.
c. escape.
d. avoidance.

 

ANSWER:           b

REFERENCES:  Pages 129-130

KEYWORDS:      Fact

 

27. Ralph only gets to watch television in the afternoons if he doesn’t hit his sister. Otherwise, he must spend the afternoon in his room. This is an example of
a. differential reinforcement of other behavior.
b. avoidance training.
c.
d. negative reinforcement.

 

ANSWER:           a

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

28. A rat in a Skinner box receives a food pellet every fifth time it pushes the response lever. This is an example of
a.
b. differential reinforcement of other behavior.
c. positive reinforcement.
d. negative reinforcement.

 

ANSWER:           c

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

 

 

29. According to the text, an unpleasant outcome is technically termed which of the following?
a. punishment
b. aversive stimulus
c. positive reinforcement
d. negative reinforcement

 

ANSWER:           b

REFERENCES:  Page 129

KEYWORDS:      Concept

 

30. Sometimes, removing a stimulus after some response increases the occurrence of that response. This is an example of
a. punishment.
b. omission training.
c. positive reinforcement.
d. negative reinforcement.

 

ANSWER:           d

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

31. In some instances, removing a stimulus after some response decreases the occurrence of that response. This is an example of
a. positive reinforcement.
b. negative reinforcement.
c. omission training.
d. punishment.

 

ANSWER:           c

REFERENCES:  Pages 129-130

KEYWORDS:      Concept

 

32. Which of the following is not true about behavioral variability?
a. Reinforcement inevitably decreases behavioral variability.
b. Behavioral variability can be the basis for instrumental reinforcement.
c. Reinforcement can increase or decrease originality.
d. Pigeons will generate novel pecking patterns if novelty is reinforced.

 

ANSWER:           a

REFERENCES:  Pages 132-133

KEYWORDS:      Fact

 

 

 

33. In a study where pigeons were reinforced only if the pattern of pecks delivered to two keys was different from the previous 50 patterns, researchers determined that
a. reinforcement increases
b. behavioral variability can be the basis for instrumental reinforcement.
c. reinforcement decreases intrinsic motivation.
d. reinforcement decreases originality.

 

ANSWER:           b

REFERENCES:  Pages 132-133

KEYWORDS:      Fact

 

34. An important aspect of instrumental conditioning is that
a. there are no limitations on the types of new response dimensions that may be modified by instrumental conditioning.
b. there are no limitations on the types of new behavioral units that may be modified by instrumental conditioning.
c. relevance relations occur in instrumental conditioning.
d. the type of behavior that develops does not depend on reinforcer characteristics.

 

ANSWER:           c

REFERENCES:  Pages 134-135

KEYWORDS:      Fact

 

35. Thorndike determined that with extensive training, cats will open their mouths in order to gain release from a puzzle box, but will not give a bona fide yawn. This is an example of
a. proactive
b. belongingness.
c. retroactive
d. stereotypy.

 

ANSWER:           b

REFERENCES:  Pages 134-135

KEYWORDS:      Concept

 

36. The competition between natural responses and the responses required by the experimenter sometimes leads to the development of behaviors that interfere with an animal making an instrumental response. The development of these behaviors is called
a.
b. differential variability.
c. instinctive drift.
d. behavioral systems.

 

ANSWER:           c

REFERENCES:  Page 135

KEYWORDS:      Fact

 

 

 

37. With some difficulty, a raccoon was trained to place a single coin in a piggy bank, but when the trainer attempted to train the raccoon to place two coins in the bank, the raccoon rubbed the coins together for minutes on end, and would not drop the coins. This is an example of
a. instinctive drift.
b. stereotypy.
c.
d. differential variability.

 

ANSWER:           a

REFERENCES:  Page 135

KEYWORDS:      Concept

 

38. According to behavioral systems theory, instinctive drift is a product of
a.
b. the components of the system activated by the conditioning procedure.
c. negative reinforcement components.
d. differential reinforcement of other behaviors.

 

ANSWER:           b

REFERENCES:  Pages 135-136

KEYWORDS:      Fact

 

39. Behavioral systems theory assumes which of the following?
a. Because of behavioral variability, the types of responses that develop in a conditioning procedure are unpredictable.
b. Because of stereotypy, the types of responses that develop in a conditioning procedure are predictable.
c. Because we know the system activated, the types of responses that develop in a conditioning procedure are predictable.
d. Because of instrumental constraints, the types of responses that develop in a conditioning procedure are unpredictable.

 

ANSWER:           c

REFERENCES:  Pages 135-136

KEYWORDS:      Fact

 

40. Which of the following is true of the nature of the instrumental reinforcer in conditioning procedures?
a. The quality of the reinforcer is important, but not the quantity.
b. The quantity of the reinforcer is important, but not the quality.
c. Neither the quality nor quantity of the reinforcer is important.
d. Both the quality and quantity of the reinforcer are important.

 

ANSWER:           d

REFERENCES:  Pages 136-137

KEYWORDS:      Fact

 

 

 

41. Two groups of rats were trained to navigate a runway for food. One group earned a single food pellet, the other received three pellets. What will happen when they are both shifted to a situation in which they earn the alternative reward?
a. Rats that initially received the small reward will run faster for the larger reward than the rats that initially received the large reward did.
b. Rats that initially received the large reward will run faster for the small reward than the rats that initially received the small reward did.
c. Rats that initially received the small reward will run more slowly for the large reward than the rats that initially received the large reward did.
d. The two groups will now run at approximately the same speed.

 

ANSWER:           a

REFERENCES:  Pages 137-138

KEYWORDS:      Concept

 

42. The elevated responding for a favorable reward resulting from experience with a less attractive outcome is called
a. proactive
b. positive contrast.
c. negative contrast.
d. retroactive belongingness.

 

ANSWER:           b

REFERENCES:  Page 138

KEYWORDS:      Fact

 

43. The decreased responding for an unfavorable reward because of prior experience with a better outcome is called
a. positive contrast.
b. negative contrast.
c. stereotypy.
d. negative interference.

 

ANSWER:           b

REFERENCES:  Page 138

KEYWORDS:      Fact

 

44. Suzie thought that earning $6.00 an hour for flipping burgers was great money when she was in high school. Now, after she lost her $20,000 a year job as a flight technician, she isn’t even considering returning to her old job at Burgers R Tasty. She is demonstrating
a. positive contrast.
b. negative contrast.
c. instinctive drift.
d. simultaneous contrast.

 

ANSWER:           b

REFERENCES:  Page 138

KEYWORDS:      Concept

 

 

 

45. Graduate students are barely given enough money to buy noodle soup. When they finish their degrees, they jump at the chance to work for a university for pauper’s wages. The universities are able to keep the salaries low and still have plenty of applicants because of
a. positive contrast.
b. negative contrast.
c. simultaneous contrast.
d. stereotypic contrast.

 

ANSWER:           a

REFERENCES:  Page 138

KEYWORDS:      Concept

 

46. Which of the following is an example of a response-reinforcer relationship with good contingency but weak temporal contiguity?
a. sending sweepstakes coupons to the clearinghouse
b. putting a sandwich in the microwave to heat
c. mailing three cereal box tops to receive a plastic toy
d. being burned by a hot stove

 

ANSWER:           c

REFERENCES:  Page 139

KEYWORDS:      Concept

 

47. A delay in the delivery of a reinforcer after the target response is likely to disrupt conditioning because
a. animals have poor memories.
b. animals keep responding during the delay.
c. animals have attentional difficulties.
d. animals expect responses to lead to reinforcers.

 

ANSWER:           b

REFERENCES:  Pages 139-140

KEYWORDS:      Concept

 

48. Which of the following is a conditioned reinforcer?
a. money
b. food
c. shelter
d. saccharin

 

ANSWER:           a

REFERENCES:  Page 140

KEYWORDS:      Concept

 

 

 

49. Which of the following is not a conditioned reinforcer?
a. giving gold stars to someone
b. keeping someone warm
c. telling someone “that’s the way”
d. giving a good grade to someone

 

ANSWER:           b

REFERENCES:  Page 140

KEYWORDS:      Concept

 

50. Rats in a box were reinforced for rearing behavior. One group received a food pellet 60 seconds following each rear. For another group, each rear was followed immediately by a tone, and then 60 seconds after the rearing, a food pellet was delivered. What do you think happened?
a. The tone group’s learning was disrupted in comparison to the non-tone group’s learning.
b. The tone group’s learning was facilitated in comparison to the non-tone group’s learning.
c. Both groups showed rapid and relatively equal acquisition of rearing behaviors.
d. Neither group learned rearing behavior, because of the time delay.

 

ANSWER:           b

REFERENCES:  Page 140

KEYWORDS:      Concept

 

51. Dave the Druid makes a sacrificial offering of wine to the sun every 365 days. Each time he does so, the sun rises over the same stone. Dave believes pouring wine over the stone causes the sun to rise there because he has made a mistake in the ______ component of the response-reinforcer relation.
a. temporal contiguity
b. belongingness
c. contingency
d. timing

 

ANSWER:           c

REFERENCES:  Pages 139-142

KEYWORDS:      Concept

 

52. Jeff always wears red socks on test days because he believes they allow him to earn good grades. Skinner would attribute this behavior to
a. a positive response-reinforcer contingency.
b. adventitious reinforcement.
c. interim reinforcement.
d. terminal reinforcement.

 

ANSWER:           b

REFERENCES:  Page 142

KEYWORDS:      Concept

 

 

 

53. According to Skinner, superstitious behavior is due to
a. an accidental negative response-reinforcer contingency.
b. interim reinforcement.
c. terminal reinforcement.
d. adventitious reinforcement.

 

ANSWER:           d

REFERENCES:  Page 142

KEYWORDS:      Fact

 

54. Closer examination of Skinner’s superstition experiment revealed that what appeared to be idiosyncratic behaviors was/were really
a. instinctive drift.
b. terminal and interim responses.
c. pseudoconditioning.
d. positive and negative reinforcers.

 

ANSWER:           b

REFERENCES:  Pages 142-143

KEYWORDS:      Fact

 

55. The periodicity of terminal responses is best explained by
a. instinctive drift.
b. species-typical responses that reflect the anticipation of reward.
c. species-typical responses that reflect other sources of motivation when food is unlikely.
d. superstitious behavior.

 

ANSWER:           b

REFERENCES:  Pages 143-144

KEYWORDS:      Fact

 

56. According to behavioral systems theory, the periodicity of interim responses is best explained by
a. species-typical responses that reflect other sources of motivation when food is unlikely.
b. early components of foraging behavior.
c. adventitious reinforcement.
d. pseudoconditioning.

 

ANSWER:           b

REFERENCES:  Page 144

KEYWORDS:      Fact

 

 

 

57. Steve anxiously taps his pencil on his desk every day at 11:50. By 11:55 he is licking his lips. Assuming lunch is always served at noon, what, according to behavioral systems theory, best explains his pencil tapping behavior?
a. adventitious reinforcement
b. superstitious behavior
c. species-typical responses that reflect other sources of motivation when food is unlikely
d. early components of foraging behavior

 

ANSWER:           d

REFERENCES:  Page 144

KEYWORDS:      Concept

 

58. In the triadic design of learned helplessness experiments, subjects in group R that are restricted to the apparatus in the exposure phase show ______ avoidance learning in the conditioning phase.
a. slow
b. rapid
c. no
d. unpredictable

 

ANSWER:           b

REFERENCES:  Pages 144-145

KEYWORDS:      Fact

 

59. Which would you expect to show the least avoidance learning?
a. those subjects who had prior escape-avoidance training with escapable shock
b. those subjects who had prior escape-avoidance training with inescapable shock
c. those subjects who were merely restricted to the escape-avoidance training apparatus and received no shocks
d. any of the above, depending on the intensity of the shocks delivered

 

ANSWER:           b

REFERENCES:  Pages 144-145

KEYWORDS:      Concept

 

60. Which of the following is not an alternative explanation to the learned helplessness hypothesis?
a. Animals can perceive the contingency between their behavior and the delivery of a reinforcer.
b. Animals learn to be inactive in response to shock during the exposure phase.
c. Animals pay less attention to their actions due to inescapable shock.
d. All are accepted alternatives.

 

ANSWER:           a

REFERENCES:  Pages 145-147

KEYWORDS:      Fact

 

 

 

61. Subjects exposed to inescapable shock in the exposure phase of a learned helplessness experiment typically show slowed escape learning during a later conditioning phase. However, if during the conditioning phase their escape responses are marked by an external stimulus, they show little disruption of their escape learning. This suggests that
a. animals can perceive the contingency between their behavior and the delivery of a reinforcer.
b. animals learn to be inactive in response to inescapable shock during the exposure phase.
c. animals pay less attention to their actions due to inescapable shock.
d. animals perseverate in their responses following inescapable shock.

 

ANSWER:           c

REFERENCES:  Pages 145-147

KEYWORDS:      Concept

 

62. Which of the following is thought in part to have helplessness as a mechanism of its development?
a. panic attacks
b. depression
c. schizophrenia
d. dissociative disorders

 

ANSWER:           b

REFERENCES:  Page 146

KEYWORDS:      Fact

 

63. Research has suggested that which brain region mediates long-term consequences of uncontrollable aversive stimuli?
a. dorsal raphe nucleus
b. occipital cortex
c. pre posterior nuclei
d. lateral medial nuclei

 

ANSWER:           a

REFERENCES:  Page 148

KEYWORDS:      Fact

 

64. A drug has been discovered that inhibits the dorsal raphe nucleus; you predict its effect will be
a. to enhance the learned helplessness effect.
b. to block the learned helplessness effect.
c. to artificially simulate the learned helplessness effect.
d. none of the above

 

ANSWER:           b

REFERENCES:  Page 148

KEYWORDS:      Concept

 

 

 

65. Activating the ventral medial prefrontal cortex
a. substitutes for behavioral control in learned helplessness paradigms.
b. interferes with behavioral control in learned helplessness paradigms.
c. provides evidence supporting the learned helplessness hypothesis.
d. none of the above

 

ANSWER:           a

REFERENCES:  Page 149

KEYWORDS:      Fact

 

66. Compare discrete-trial and free-operant methods of instrumental conditioning. What are the advantages of each class of procedure? What factors would influence your choice of procedure type if you were to explore instrumental behaviors?

 

ANSWER:  No answer provided

 

67. Describe how you would go about training a dog to open a refrigerator to fetch a can of soda. Make sure to include the details of the magazine training and shaping.

 

ANSWER:  No answer provided

 

68. How can one measure instrumental behaviors? What are the indicators that learning is taking place?

 

ANSWER:  No answer provided

 

69. Compare positive and negative response-reinforcer contingencies. How do these contingencies contribute to the classification of instrumental conditioning procedures?

 

ANSWER:  No answer provided

 

70. What are the differences between negative reinforcement and punishment? Between escape and avoidance?

 

ANSWER:  No answer provided

 

71. Compare the evidence for behavioral variability and stereotypy. What evidence is there that variability can be conditioned?

 

ANSWER:  No answer provided

 

72. What is meant by belongingness in instrumental conditioning? How does belongingness contribute to animal “misbehavior” in learning situations?

 

ANSWER:  No answer provided

 

73. What factors contribute to the effectiveness of an instrumental reinforcer?

 

ANSWER:  No answer provided

 

 

 

74. Imagine that a friend of yours has committed a faux pas at your dinner party. Rather than embarrass your friend with an immediate correction, you wait until the party is over. Why is this not likely to alter your friend’s behavior? What could you have done to improve the chances that a correction administered after the party would change the behavior?

 

ANSWER:  No answer provided

 

75. What is the learned-helplessness effect? Describe two competing explanations of the effect. How is the effect mediated by the dorsal raphe nucleus and the prefrontal cortex?

 

ANSWER:  No answer provided

 

76. Compare and contrast free-operant and discrete-trial methods for the study of instrumental behavior.

 

ANSWER:  No answer provided

 

77. What are the similarities and differences between positive and negative reinforcement?

 

ANSWER:  No answer provided

 

78. What is the current thinking about instrumental reinforcement and creativity, and what is the relevant experimental evidence?

 

ANSWER:  No answer provided

 

79. How does the current status of a reinforcer depend on prior experience with that or other reinforcers?

 

ANSWER:  No answer provided

 

80. What are the effects of a delay of reinforcement on instrumental learning, and what causes these effects?

 

ANSWER:  No answer provided

 

81. What was the purpose of Skinner’s superstition experiment? What were the results, and how have those results been reinterpreted?

 

ANSWER:  No answer provided

 

82. Describe alternative explanations of the learned helplessness effect.

 

ANSWER:  No answer provided
