
Saturday, December 10, 2011

Re: Wesley "Emotions in Music"

Wesley has been talking about metaphorical and physical emotions in music: physical emotions come from the lyrics, and metaphorical emotions come from the tones. This is a switch from our usual talk of absolute music. When lyrics come into play, the emotional distinction becomes blurred; perhaps that is why philosophers tend to be drawn toward absolute music, so that lyrics do not get in the way of any distinctions. So we have to be careful when making any judgment claims about the lyrical and tonal roles of a piece.

Parodies offer a good example of contrast between the emotions elicited by the lyrics of a song; it's kind of like looking at identical twin studies in psychology to tell you how much genes play a role in personality. In these songs we also have to look at the metaphorical emotions in the piece, just as in the twin studies we also have to look at the role of the environment. Wesley used Coolio's "Gangsta's Paradise" and Weird Al's "Amish Paradise" to show the effect of lyrics on emotional responses. Yet if the lyrics stayed the same in these two pieces and the form of the song changed, there would be a difference in emotional response too.
Take Johnny Cash's and Nine Inch Nails' versions of the song "Hurt." Even though there are only slight differences between the two pieces, mainly timbral ones (Cash plays it with just an acoustic guitar and piano, while NIN use drums, keyboard, and an electric guitar), these little differences still have an impact on the emotional responses to the two versions. Both of them accentuate the same rises and falls, have similar dynamic contrasts, and are at about the same tempo.
Johnny Cash's "Hurt"

Nine Inch Nails' "Hurt"

This shows that both physical and metaphorical emotions play a role in our reactions to music. This gets back to an earlier post of mine about Hanslick's and Kivy's distinctions on where the emotion in music lies. Although these two philosophers never talked about non-absolute music (at least in my readings of them), I think that Hanslick would deny that lyrics have an impact, while Kivy would consider them equal because they both relate back to human behaviors.

Question: Do you think that the physical and metaphorical emotions expressed by a piece have an equal effect on our emotional response to it?

Creativity Hierarchy

Last week the Q&A question was "What do you think the relations are between imagination, creativity, improvisation, and music composition?" I used my voucher for not turning in this Q&A because I was very busy that week, but I still think it is a good question and would like to answer it briefly in this blog post.


From chapter IV of "Bridges in Autonomy: Paradoxes in Teaching and Learning," the hierarchy between imagination, creativity, and improvisation is described like this:

[hierarchy diagram from the book; not shown]

When incorporating music composition, I think the hierarchy should be altered to look like this:

[my altered hierarchy diagram; not shown]

I propose that composing music is a form of critical thinking. Also, problem solving should be a subcategory of both improvisation and critical thinking, because critical thinking is essential to problem solving, and not all problems are solved on a whim.
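Since the diagrams don't reproduce here, a bare-bones way to write down just the links I'm proposing is a parent-to-children map. This is my own sketch in Python, covering only the relations stated above, not the book's figure:

```python
# Only the relations proposed above (not the book's diagram, which is
# not reproduced here): music composition falls under critical thinking,
# and problem solving falls under both critical thinking and improvisation.
proposed = {
    "critical thinking": ["music composition", "problem solving"],
    "improvisation": ["problem solving"],
}

# Sanity check: "problem solving" should have two parents.
parents = [node for node, children in proposed.items()
           if "problem solving" in children]
print(parents)  # ['critical thinking', 'improvisation']
```

Notice that giving problem solving two parents makes the structure a directed graph rather than a strict tree, which is really the substance of the question below.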

Question: What do you think of problem solving being a subcategory of both critical thinking and improvisation? Should it be under both? Why or why not?

Tuesday, December 6, 2011

Music and Language

The current article we are reading for class is all about music and language. I'm going to bring up some more research on the correlations between the two. (Sorry, I can't help myself; I'll try to tone down the psycholingo.)

Jackendoff (2009) mentions seven cognitive similarities between how music and language are learned and understood:

1. Both necessarily involve memory.
2. Both assemble units in working memory according to methodical rules and organized schemata.
3. Both involve making predictions.
4. Both involve controlled muscle movements of the mouth or hands (the mouth for talking, singing, and wind instruments; the hands for sign language, percussion instruments, and string instruments).
5. Acquiring both involves the aural reproduction of what others sing, say, or do.
6. We have the capability to create new words and songs.
7. We have the capacity to speak or play together with other people.

Jackendoff notes that these capacities are not unique to music and language.

One cognitive capacity he mentions that applies only to music and language is that both entail patterns of sound organized in time. Yet language relies on syllables while music relies on notes, which can vary widely in length. Intonation in speech is comparatively invariable because it tends to move between two stable pitches; these two pitches, however, are not fixed the way the first and fifth scale degrees are in music.
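To make that "fixed" relation concrete, here is a quick back-of-the-envelope check in Python (my own illustration, not Jackendoff's):

```python
# In equal temperament each semitone multiplies frequency by 2^(1/12),
# so the interval from the 1st to the 5th scale degree (7 semitones)
# is the same ratio in every key, whereas the two anchor pitches of
# speech intonation drift from speaker to speaker.
semitone = 2 ** (1 / 12)
fifth_ratio = semitone ** 7
print(round(fifth_ratio, 3))  # 1.498, close to the just-intonation 3:2
```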
Furthermore, two separate areas of the brain are in charge of prosodic intonation and musical pitch (Jackendoff, 2009). Contrary evidence, however, suggests that prosodic grouping involves brain regions similar to those for musical grouping, that is, the mental gathering and condensing of components into separate hierarchy levels (Patel, 2006).
Jackendoff claims that music processing does not match prosodic processing because music lacks syntax, which is the basis of this mental hierarchy in language; music has no conceptual meaning, so it has no counterpart to that mental organization (Jackendoff, 2009).

Contrary to Jackendoff's claim, Patel, Peretz, Tramo, & Labreque (1998) measured syntactic commonalities between music and language and found that both have syntax.

Slevc, Rosenberg, & Patel (2009) found that musical syntax and linguistic syntax draw on the same cognitive resources, whereas musical syntax and linguistic semantics do not.

Yet even though semantics does not occupy the same cognitive resources as musical syntax, listening to music still primes semantically related words. Koelsch, Kasper, Sammler, Schulze, Gunter, & Friederici (2004) found that words the composers used to describe their pieces semantically primed words in a lexical decision task. (In a lexical decision task, either words or non-words are presented, and participants have to judge as quickly as possible whether each one is a word; "manty," for example, would be a non-word. Both related and unrelated words were used.) So when participants listened to pieces that the composers had labeled happy, they were quicker to respond "yes" to the word "happy" than to "sad." This shows that listening to sad music on some level makes us think of the word sad; it semantically primes it.
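If it helps to see the paradigm spelled out, here is a minimal console sketch of a lexical decision task in Python. The stimulus lists are my own stand-ins (apart from "manty"), and real studies use precisely timed presentation software rather than input():

```python
import random
import time

# Hypothetical stimuli: "manty" is the non-word from the example above;
# the other items are stand-ins, not Koelsch et al.'s actual materials.
WORDS = ["happy", "sad", "river", "table"]
NONWORDS = ["manty", "blick", "forp", "dramble"]

def run_trial(stimulus):
    """Present one stimulus, record the yes/no response and reaction time."""
    start = time.perf_counter()
    response = input(f"Is '{stimulus}' a word? (y/n): ").strip().lower()
    rt = time.perf_counter() - start
    correct = (response == "y") == (stimulus in WORDS)
    return {"stimulus": stimulus, "rt": rt, "correct": correct}

trials = WORDS + NONWORDS
random.shuffle(trials)
results = [run_trial(s) for s in trials]

# In the priming study, the comparison of interest is the reaction time to
# a target like "happy" after a piece the composer labeled happy vs. sad.
for r in results:
    status = "correct" if r["correct"] else "wrong"
    print(f"{r['stimulus']:>8}: {r['rt']:.3f}s, {status}")
```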

I hope I didn't confuse anyone, and hopefully you learned a little bit about music perception!

Question: Jackendoff mentions eight similarities between language and music (seven that are not unique to music and language, and one that is). Can you think of any other correlations/similarities between music and language?

References

Jackendoff, R. (2009). Parallels and nonparallels between language and music. Music Perception, 26(3), 195-204.

Koelsch, S., Kasper, E., Sammler, D., Schulze, K., Gunter, T., & Friederici, A. D. (2004). Music, language and meaning: Brain signatures of semantic processing. Nature Neuroscience, 7(3), 302-307.

Patel, A. D. (2006). Musical rhythm, linguistic rhythm, and human evolution. Music Perception, 24(1), 99-104.

Patel, A. D., Peretz, I., Tramo, M., & Labreque, R. (1998). Processing prosodic and musical patterns: A neuropsychological investigation. Brain and Language, 61(1), 123-144.

Slevc, L. R., Rosenberg, J. C., & Patel, A. D. (2009). Making psycholinguistics musical: Self-paced reading time evidence for shared processing of linguistic and musical syntax. Psychonomic Bulletin & Review, 16(2), 374-381.