What is the difference between rhythm and melody?
Melody and rhythm are two fundamental aspects of music. Broadly speaking, melody involves playing notes of different pitches, while rhythm involves beats and time. Melody is a basic element of music: a linear sequence of musical tones unfolding in time that the listener perceives as a single entity. A note is a sound with a particular pitch and duration; when we string a series of notes together, we have a melody.
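To make the distinction concrete, here is a minimal Python sketch (illustrative only; the Note class and the pitch spellings are my own assumptions, not part of any established music library) that represents a melody as an ordered sequence of pitch-duration pairs and recovers its rhythm by discarding pitch:

```python
from dataclasses import dataclass

@dataclass
class Note:
    pitch: str       # e.g. "C4" in scientific pitch notation (assumed convention)
    duration: float  # duration in beats

# A melody: a linear sequence of notes perceived as a single entity.
melody = [Note("E4", 1.0), Note("D4", 1.0), Note("C4", 2.0)]

# The rhythm of the same passage is just the duration pattern,
# independent of which pitches are played.
rhythm = [note.duration for note in melody]
print(rhythm)  # [1.0, 1.0, 2.0]
```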
Example: Mozart, Requiem , "Kyrie eleison". Form— The structure or shape of a musical work, based on repetition, contrast, and variation; the organizing principle of music.
Example: Corelli, Suite for Strings, "Badinerie". Repetition, variation, and contrast are the foundational procedures on which music composition rests. There are a variety of ways to create thematic development: motive, sequence, and ostinato. In this example, a short four-note descending pattern in the bass is heard throughout under the voices. Dynamics: designations for the relative loudness or quietness of music.
This gives further evidence that a pitch or harmonic deviation within musical material activates a neural network comprising the auditory cortices and spreading anterolaterally into inferior frontal and prefrontal areas. The result is in line with the notion that auditory object recognition is processed in the ventral part of the auditory pathway (Rauschecker and Scott; Figure 3).
Figure 4. Axial view of the left hemisphere showing the musically elicited mismatch negativity after the occurrence of a deviant tone. The pronounced activation of the IFC might also be due to musical training, since neurons of the vPMC, which are located adjacent to Brodmann area 44, might have contributed to the musical MMN.
Piano training might have established an internal forward model linking a specific motor movement with an auditory sound (Lee and Noppeney). Such a model may have supported predictions about upcoming events and enabled a more precise representation of upcoming tones.
An internal model involving the prefrontal areas might make auditory prediction violations easier to detect, as manifested by the musical MMN. A dissociation between melodic and rhythmic processing was observed in a study in which musicians played melody-focused and rhythm-focused musical sequences during an MRI scan (Bengtsson and Ullén). In that study, melodic information was processed in the medial occipital lobe, the superior temporal lobe, the rostral cingulate cortex, the putamen, and the cerebellum.
In contrast, rhythmic information was processed in the lateral occipital and inferior temporal cortex, the left supramarginal gyrus, the left inferior and ventral frontal gyri, the caudate nucleus, and the cerebellum. This study, however, was not a mismatch study, and its results reflect a cross-modal effect, since subjects played from a visually displayed score. Lesion studies, on the other hand, found that patients with temporal lobe damage were impaired in melody processing (Alcock et al.).
Imaging studies investigating rhythm perception have also shown activation in the basal ganglia, the insula, and the left inferior parietal lobule (IPL) (Bamiou et al.).
Vuust et al. found bilateral activation in the vicinity of the auditory cortices, with a left-laterality effect in musicians. Investigating the possible contributions of other areas, however, requires a distributed source model. We therefore conducted a beamforming analysis on the post-training data of our rhythm MMN study (Lappe et al.). To introduce this analysis we need to describe the methods and data of that study in more detail (Figure 2). Ten non-musician subjects with normal hearing, between 24 and 38 years old, were trained within 2 weeks to play a rhythm-focused piano exercise (Figure 2A), comprising eight training sessions of 30 min each.
Informed written consent was obtained from all subjects. For the sensorimotor training, a template was used on which the keys, the tempo of the tones, and the finger placement were marked, so that participants did not have to learn musical notation before training (Figure 2B).
Plasticity effects were measured by means of the musical MMN. The stimuli were generated with a digital audio workstation whose integrated on-screen virtual keyboard permitted the generation of realistic tones on a synthesized piano.
The standard stimulus was composed of two identical rhythmic figures, each beginning with an eighth note followed by two sixteenth notes, giving six tones in total. On deviant trials, the sixth tone was presented earlier, shortening the total stimulus length. Successive sequences were separated by a silent interval.
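The construction of the standard and deviant sequences can be sketched in a few lines of Python. The exact note durations and the size of the timing shift were not recoverable from the text above, so the millisecond values below are illustrative assumptions only:

```python
EIGHTH, SIXTEENTH = 250, 125  # ms; hypothetical note durations
SHIFT = 50                    # ms by which the deviant tone arrives early (assumed)

figure = [EIGHTH, SIXTEENTH, SIXTEENTH]  # one rhythmic figure
standard = figure * 2                    # two identical figures: six tones

def onsets(durations):
    """Cumulative onset time of each tone, starting at 0 ms."""
    t, out = 0, []
    for d in durations:
        out.append(t)
        t += d
    return out

deviant_onsets = onsets(standard)
deviant_onsets[5] -= SHIFT  # the sixth tone is presented earlier

print(onsets(standard))  # [0, 250, 375, 500, 750, 875]
print(deviant_onsets)    # [0, 250, 375, 500, 750, 825]
```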
Sequences were presented in two runs, each comprising standard trials and 80 deviant trials in quasi-random order. A whole-head magnetometer system (Omega; CTF Systems) was used to record the magnetic field responses. The recordings were carried out in a magnetically shielded and acoustically silent room. MEG signals were low-pass filtered and sampled continuously. Epochs contaminated by muscle or eye-blink artifacts, i.e., containing field amplitudes that exceeded 3 pT in any channel, were automatically excluded from the data analysis.
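The amplitude-based rejection rule lends itself to a compact sketch. The following Python/NumPy fragment is a minimal illustration, with the epoch and channel counts invented for the example; it drops every epoch whose peak field exceeds 3 pT in any channel:

```python
import numpy as np

THRESHOLD = 3e-12  # 3 pT expressed in tesla

def reject_artifacts(epochs):
    """epochs: array of shape (n_epochs, n_channels, n_samples) in tesla.
    Keeps only epochs whose peak absolute amplitude stays below the
    threshold in every channel."""
    peak = np.abs(epochs).max(axis=(1, 2))  # per-epoch maximum over channels/samples
    keep = peak < THRESHOLD
    return epochs[keep], keep

# Illustrative data: 100 epochs, 275 channels, 600 samples (shapes assumed).
rng = np.random.default_rng(0)
epochs = rng.normal(scale=5e-13, size=(100, 275, 600))
clean, mask = reject_artifacts(epochs)
print(f"kept {mask.sum()} of {len(mask)} epochs")
```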
Subjects were seated upright, as comfortably as possible, while ensuring that they did not move during the measurement. They were instructed to stay in a relaxed waking state and not to pay attention to the stimuli. To distract attention from the auditory stimuli, subjects watched a soundless movie of their choice, which was projected on a screen placed in front of them. The subject's head position was measured at the beginning and at the end of each run by means of three localization coils fixed to the nasion and to the entrances of both ear canals (the fiducial points).
A Turbo Field acquisition was applied to collect contiguous T1-weighted anatomical images. For co-registration with the MEG measurements, the positions of the fiducial points, filled with gadolinium to be visible in the MRI, were used. Epochs of approximately 3 s were extracted for analysis. A multi-sphere head model fitted to each participant's structural MRI was used as the volume conductor. To achieve as much similarity as possible between the background brain activity in the active and control conditions, we contrasted each deviant with its directly preceding standard.
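The deviant-versus-preceding-standard pairing can be expressed directly in code. This Python sketch (the function name and trial labels are mine, purely for illustration) walks the presentation order and pairs each deviant with the standard immediately before it:

```python
def pair_deviants_with_preceding_standards(trial_types):
    """trial_types: sequence of 'standard'/'deviant' labels in presentation
    order. Returns (deviant_index, preceding_standard_index) pairs; deviants
    without a directly preceding standard are skipped."""
    pairs = []
    for i, t in enumerate(trial_types):
        if t == "deviant" and i > 0 and trial_types[i - 1] == "standard":
            pairs.append((i, i - 1))
    return pairs

trials = ["standard", "standard", "deviant", "standard", "deviant"]
print(pair_deviants_with_preceding_standards(trials))  # [(2, 1), (4, 3)]
```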
We conducted the beamforming analysis within a time window chosen according to the corresponding source waveforms obtained from the ERP analysis (Figure 2D). The activation window was contrasted, as described earlier, with the corresponding standard time window. The results of the beamformer analysis can be seen in Figure 5.
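For readers unfamiliar with the technique, the core of such a contrast can be illustrated with a textbook LCMV beamformer in a few lines of NumPy. This is a generic sketch of the method, not the authors' actual pipeline; the leadfield, covariance matrices, and regularization value are all stand-ins:

```python
import numpy as np

def lcmv_power_contrast(leadfield, cov_all, cov_active, cov_control, reg=0.05):
    """LCMV beamformer contrast for one source location.
    leadfield: (n_channels,) forward field of the source.
    cov_*: (n_channels, n_channels) sensor covariance matrices.
    Returns the active/control source-power ratio at that location."""
    n = cov_all.shape[0]
    c_reg = cov_all + reg * np.trace(cov_all) / n * np.eye(n)  # regularization
    c_inv = np.linalg.inv(c_reg)
    # Classic LCMV weights: w = C^-1 l / (l^T C^-1 l)
    w = c_inv @ leadfield / (leadfield @ c_inv @ leadfield)
    return (w @ cov_active @ w) / (w @ cov_control @ w)

# Toy demonstration with random positive-definite covariances.
rng = np.random.default_rng(1)
n = 8
a, b = rng.normal(size=(n, n)), rng.normal(size=(n, n))
cov_active, cov_control = a @ a.T, b @ b.T
cov_all = cov_active + cov_control
print(lcmv_power_contrast(rng.normal(size=n), cov_all, cov_active, cov_control))
```

A ratio above 1 at a given location suggests stronger source power in the deviant (active) window than in the standard (control) window.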
Significant neural activation occurred in temporal and parietal cortex (Figure 5). Comparing the beamformer results between the melody and the rhythm data sets shows that pitch and rhythm deviations are processed in different brain areas. One has to remember, however, that these results were obtained in a between-group rather than a within-subject design.
However, the designs of the two studies were identical except for the musical material that was trained. The results confirm previous music studies showing similar activation areas after a harmonic or rhythmic deviation (Maess et al.). The beamformer analysis indicates that the differential processing of melody and rhythm is reflected in the musical MMN. This differential processing is in accordance with the dual-pathway model of auditory processing (Belin and Zatorre; Zatorre et al.).
According to the dual-pathway model, the antero-ventral stream projects ventrally to anterior, inferior frontal, and prefrontal areas. This processing stream is presumably important for auditory pattern and object recognition. The postero-dorsal pathway, on the other hand, projects to parietal areas and has been associated with the processing of space and time (Bueti and Walsh). These results corroborate the findings of studies investigating musical expectancy violation after a syntactic or pitch-related deviation within a musical sequence, which demonstrated auditory and IFC activation (Maess et al.).
Whereas superior temporal activation in auditory deviance detection is generally associated with sensory processing, the role of the inferior frontal gyrus in this experimental paradigm is still a matter of debate. Studies investigating the role of the IFC in auditory deviance detection with non-musical material have demonstrated that IFC activation increases as auditory deviance decreases, linking the IFC to a contrast-enhancement mechanism; this could indicate that the IFC supports the STG system in discriminating auditory stimuli (Opitz et al.).
The aforementioned studies investigating auditory deviance detection in musical material, on the other hand, have shown that neural activation in that brain area is stronger for harmonically unrelated than for harmonically related tones (Tillmann et al.).
In addition, a previous voxel-based morphometry study demonstrated a reduction in white matter concentration in the right IFC in amusic subjects with severely impaired pitch discrimination ability.
The results of these studies suggest that the IFC, as part of the ventral auditory pathway, is indeed crucial for auditory object recognition. Expectancy violation of a rhythmic progression within a musical sequence, on the other hand, seems to be processed in the posterior part of the STG and in the IPL. The significant neural activation of the IPL after the occurrence of a rhythmic deviation also fits the dual-stream model of auditory processing.
The postero-dorsal pathway connects the posterior part of the auditory cortex with parietal areas and is associated with neural networks processing time-varying events (Rauschecker and Scott). The parietal lobe indeed plays an important role in processing temporal and spatial information, in integrating sensory information from different modalities, and in performing sensorimotor transformations for action. Temporal and spatial information, which is required for planning subsequent motor behavior, is integrated within the parietal lobe (Bueti and Walsh). The posterior part of the auditory cortex and the IPL, as parts of the dorsal stream, are also crucial for auditory sensorimotor transformations (Warren et al.).
The strong connection between auditory and motor systems in the time domain is evident in music (Zatorre et al.). Synchronizing movements to musical beats is a common human behavior.
Music is rhythmically organized and unfolds in a predictable rhythmical structure, enabling the listener to form expectations about when upcoming musical events will occur. Although we did not find specific motor activation, it is conceivable that the short-term rhythm-focused musical training that subjects had received prior to the MEG measurement established an internal forward model linking a piano tone with a specific motor movement.
This internal forward model might have supported and improved predictions about the timing of upcoming tones. Expectation violations were then presumably processed in the parietal lobule, where timing and the motor system are closely linked. The dual-pathway model of auditory processing has also been suggested by Hickok and Poeppel as an underlying mechanism for speech processing. According to that model, the IPL, as part of the dorsal stream, is involved in the sensorimotor mapping of sound onto motor and articulatory networks.
The ventral stream, on the other hand, is responsible for mapping sound onto meaning. The dual-stream model for language processing thereby comprises both speech perception and production, connecting the motor-speech-articulation systems in the parietal lobe with lexical-semantic representations (Hickok and Poeppel; Zaehle et al.).
The beamformer analysis revealed that IFC activation was similar in the left and right hemispheres, a finding consistent with our previous ERP analysis of the data. In contrast to our melody study, where we found a stronger involvement of the right hemisphere after a pitch deviation, the results of the rhythm study suggest that both hemispheres were involved in processing a rhythmic deviation.
Just take a look at a piano and see for yourself. Find C# (the black key to the right of C) and Db (the black key to the left of D): they are one and the same key. There you go. Finally, the intervals just named can be classified into consonances and dissonances. Most consonant: unison and octave.
A little less, but still very consonant: perfect fifths. A little less consonant still: perfect fourths. Less again, but still consonant: thirds and sixths (minor or major). Last of all, in Western cultures minor intervals are usually associated with sadness, thoughtfulness, or introspection, while major ones have been related to feelings of joy, happiness, brilliance, and so on.
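The enharmonic point and the consonance ranking above can be captured in a short Python sketch (the pitch-class table and category labels are my own illustrative encoding, not a standard from any music theory library):

```python
# Enharmonic equivalence: C# and Db name the same piano key,
# i.e. the same pitch class.
PITCH_CLASS = {"C": 0, "C#": 1, "Db": 1, "D": 2, "Eb": 3, "E": 4}
assert PITCH_CLASS["C#"] == PITCH_CLASS["Db"]

# Consonance ranking by interval size in semitones (octave-reduced),
# following the ordering given in the text.
CONSONANCE = {
    0: "most consonant (unison/octave)",
    7: "very consonant (perfect fifth)",
    5: "consonant (perfect fourth)",
    3: "consonant (minor third)", 4: "consonant (major third)",
    8: "consonant (minor sixth)", 9: "consonant (major sixth)",
}

def classify_interval(semitones):
    return CONSONANCE.get(semitones % 12, "dissonant")

print(classify_interval(12))  # octave -> most consonant
print(classify_interval(6))   # tritone -> dissonant
```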
But remember, life is not always the same. Go ahead and play around: it's a matter of combining consonances and dissonances, and sometimes it is better to do something rather 'static' and consonant. Remember, when you play, we are all part of nature, and music is a way of uniting ourselves with Pachamama (Mother Nature).