I don’t usually speak personally on this blog, but with the end of my junior year and the beginning of the summer before my final year of high school, I figured it might be a good idea to share a little bit about what I have learned over the course of the year.
ONE: People are wonderful.
I'm quite the cynic. That’s not to say that I am a pessimist, but rather that I tend to express impatience outwardly more than I do grace. But this year, through various interactions with the people around me, I learned how wonderful people are. I can’t even count the number of times I have struck up a conversation with someone I would never have imagined talking to and was whisked away into an intriguing discussion about something that person was passionate about. And most of this wasn’t even on my podcast, which was designed for that very purpose!
This year, I found myself (twice) at dinner parties surrounded by a plethora of ballet dancers, artists, and professionals. Assuming me to be in college, many of these people spoke to me about their work and their passions, treating me like a peer, and continued to do so even after they learned of my youth. I spoke to a male costume designer about his travels in Europe, an interior designer about food photography, and a ballerina about what it’s like to grow up so far away from family.
A lot of things distract people from their wonder, but we shouldn’t be afraid to embrace people. If you let yourself get into circumstances you may be uncomfortable with, who’s to say what amazing people you might meet and what amazing things you might learn?
TWO: People are terrible.
I mean it; people really are terrible. When meeting people for the first time, it’s easy to focus only on the wonder of their beings, but once you go deeper in relationships, it’s hard to ignore how messed up people are. We are selfish, cruel, and prideful, and even when we think we are doing right, we have often merely convinced ourselves that the wrong thing is right.
THREE: Contradictions are ok sometimes.
People are wonderful, and people are terrible. This seems a contradiction, and yet it’s true. I had an interesting conversation with a friend where we were trying to come up with oxymorons. My favorite one I came up with was “Roman Catholic” (because “Roman” refers to a specific place while “catholic” means universal), which stuck out to me because there are obviously a lot of people who are Roman Catholic. It cannot be a contradiction in terms, and yet it is. Humanity can be defined as a mess of contradictions, but that's not necessarily a bad thing. Perhaps we see paradoxes because humanity is complex and beyond the comprehension of our small human brains.
FOUR: I am blessed.
People throw around the word “privilege,” which isn’t inaccurate, but in this context I prefer the word blessed. I have been blessed with an amazing family who loves me and raised me on books and God alike, and I live in a safe neighborhood that is honestly quite beautiful. I attend an amazing IB school without having to pay tuition, and I’m going to the UK this summer for vacation. To top it all off, I live in New York City, where I often find myself meeting cool and famous people through opportunities I could once only imagine. It’s increasingly clear to me that I live a life not shared by most of the world, and that I need to be thankful for this, but also to recognize that it is a blessing in the sense that I did nothing to deserve the circumstances in which I have been raised.
FIVE: I need to share my blessings.
I believe there’s little point in recognizing your blessings if you keep them all to yourself. This is definitely something I need to work on. I’m not even talking about big, drastic things like becoming a doctor and moving to a developing country or living for a year as a homeless person. But, as I mentioned, I tend to be a cynic, which makes little sense when I have so much to be happy and smiling about. Especially with the college application season approaching, I know I need not just to withdraw and get work done, but also to love the people around me so that they might be a little more blessed in their own lives.
AND NOW, some things I learned which likely won't help me in life but which I want to share anyway.
1. Orangutans are "semi-solitary," which is really just a fancy way of saying they are introverts who need to be alone sometimes to just chill.
2. The R train is slow but pretty cool. It's pretty much like a café, and if you don't have your headphones in, random people will chat with you and make you feel better about life.
3. There are different kinds of rain. There’s heavy infrequent rain, where drops don’t come very often but land with a huge unexpected splash; light frequent rain, where drops are small but come down constantly; and heavy frequent rain, where all hell breaks loose from the sky. I adore the latter two types but despise the first, which makes me feel like I have no control over anything.
Well, I'm off to summer. Hopefully this summer (and senior year for that matter) will have another whole plethora of things for me to learn.
Have you learned anything this year? Let me know in the comments!
- Alexandra G. Kytka
Despite the desire of many high school juniors to "go against the system," at least when it comes to the college process, the fact is that our world is full of systems that we work in and with in order to make anything meaningful at all. Scientists define isolated systems so that the law of conservation of energy holds perfectly, and even ballet dancers follow a system of moves and patterns in their art form. But why does any of this matter? Can't we all just destroy the "system" of rules and formalities and find the truest science and dance without it?
In his book Gödel, Escher, Bach, Douglas Hofstadter explores the idea of an isomorphism, that is, (in the mathematical sense of the word) a one-to-one correspondence between two sets. In layman's terms, that's a system that helps you understand another system that is more cryptic.
For example, if I give you a pattern of a triangle followed by a square, a pentagon, and a hexagon, you may understand that "system" as 3, 4, 5, 6 in terms of the number of sides on each shape. That system of numbers (3, 4, 5, 6) is not the same as the pattern itself; rather, it is a separate system that helps you understand the pattern because it holds a one-to-one correspondence with the original. This is an isomorphism. Loosely akin to an analogy, an isomorphic system helps you work within another. Just as I know that the next shape will have 7 sides, if given "cat:mouse" as "dog:--," I can know that the -- is cat. So now we know what an isomorphism is in a formal sense, but what does this mean for the "informal" things that (arguably) matter more? In the following lines I will argue that all the systems we have in place are our isomorphic attempts to understand the system that is reality.
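For readers who like things concrete, the shape-and-number correspondence above can be sketched in a few lines of code (a toy illustration of my own, not from Hofstadter):

```python
# Two systems: a pattern of shapes, and a sequence of numbers.
# The dictionary below is the one-to-one correspondence linking them.
sides = {"triangle": 3, "square": 4, "pentagon": 5, "hexagon": 6}

pattern = ["triangle", "square", "pentagon", "hexagon"]
numbers = [sides[shape] for shape in pattern]  # the isomorphic system: [3, 4, 5, 6]

# Reasoning inside the simpler system (numbers) tells us about the original:
next_sides = numbers[-1] + 1  # 7, so the next shape must be a heptagon
print(numbers, next_sides)
```

The point of the sketch is that the arithmetic happens entirely in the number system, yet the conclusion ("the next shape has 7 sides") is about the shape system, because the correspondence carries it back.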
In his theory of language, the linguist Ferdinand de Saussure (and later, in 1956, Roman Jakobson) argued that all language is actually metaphor or metonymy. Calling back to the title of this post, we may use the example of a "cat." The English word "cat," as he argues, although attempting to refer to the same idea as the Egyptian "mau," actually communicates a fundamentally different one, that is, the idea of a graceful being whose value in Egyptian culture was completely different from the Anglo-American conception of a "cat." Postmodern philosophers cite this phenomenon to claim that language is utterly distinct from reality. However, these academics miss one key point: while language as a system is distinct (or rather, separate) from reality, it is not thereby contrary to it. Otherwise, I believe we would see more significant differences between the nature of things and the labels we place upon them. Rather, any given system of language is our attempt to parallel and understand reality. Thus, I cite language as an example of our isomorphic attempts to grasp at the true nature of reality.
Art and literature are also clear examples of this. Throughout history, we see groups of people use stories to better comprehend the principles and truths around them, for example, myths that explain the origins of the universe. But this is true even on a daily basis. The theory of narrative identity in psychology and hermeneutic epistemology proposes that one's identity is formed by integrating individual life experiences into an overarching and evolving story of the "self." This happens very early in childhood development, and it is something I have seen personally while babysitting toddlers, who, when alone and/or going to sleep, often recount to themselves their daily lives and their plans for tomorrow. The same idea is discussed by the popular writer Malcolm Gladwell. The historical and modern use of art forms such as painting and theater does this very thing, either on an individual level or on the scale of a nation, event, or larger idea. Through propagandistic pieces like the Death of Marat or the Aeneid, nations and people groups define their identity via a story of courage and power. But aside from clear attempts to define identity, literature and art often reveal deeper truths about reality. In The Seven Basic Plots, Christopher Booker highlights key themes and characteristics present across a wide selection of books, a connection that suggests something greater than coincidence or plagiarism. Rather, these plot types, which include "Rags to Riches" and "The Quest," are direct reflections of human nature and our connection with the universe.
What I have suggested here sounds like a pretty lofty idea; however, despite the fancy label of an "isomorphic system," you probably already realize this on some level. As humans, we crave understanding and use many forms to satisfy this craving. But my point is larger than this. In drawing this deeply layered connection between all these "areas of knowledge" and our attempt to understand the universe, I have really suggested something greater. What all of these examples have in common is the idea of a system, a set of connected parts comprising a complex whole. If we are to take anything from these isomorphic systems, it is that the universe itself is a system: it is orderly, it has sets of rules, and it can be understood on some level. Whether this conclusion is true, or whether humanity has completely failed in creating accurate isomorphisms that parallel reality... you decide.
Written by Alexandra Kytka.
The following dialogue is an expansion of a brief conversation I had with my biology teacher; however, both voices in the dialogue are constructed by me and do not necessarily fully reflect the thoughts of my teacher or myself.
Edmund: It's so strange to me how people like Aristotle or Dalton are often considered scientists when their theories were so far from the truth.
Peter: Well, science is a process, and part of that process is putting out hypotheses that later prove to be wrong.
Edmund: Even so, it's not only that their answers were wrong- the process they used to reach those answers was hardly a process at all. It was based more on wild guesses than on observable reality. It makes you wonder whether you can consider so-called early scientists to be scientists at all.
Peter: So if we are basing the title of scientist on participation in observation and experimentation, does that mean the first scientist was Galileo? Even Copernicus, who is often credited with starting the heliocentric model of the universe, focused on theoretical math. Galileo is renowned for actually looking at the sky to provide evidence.
Edmund: I'm not convinced even Galileo makes the cut. He did take a step in the right direction, but it wasn't until Darwin that we see the full scientific process.
Peter: I agree that the modern scientific process did not fully develop until around the time of Darwin, but can't you argue that the development of the scientific process through the trial and error of previous scholars can itself be considered an essential part of that process? After all, you can't judge someone according to a formal system that was created apart from their existence.
Edmund: I'm not saying that scholars like Newton and Galileo didn't contribute to science, rather that they can't be considered scientists themselves. You can argue that the development of the scientific process is part of the process itself, but I'm not convinced. The process of writing a song is distinct from the song itself.
Peter: I think it's problematic to define a scientist as you do. Your definition relies on the idea of an end, that is, that there is some end result to the development of the scientific process– that end, you claim, is found in Darwin. However, despite our egotistical claim of having reached the "modern" age, there has been much progress even since Darwin in the development of the scientific method. Peer review, for example, is a long and strenuous process that scientists go through today in order for any of their work to be established as scientific fact. Yet, during the time of Darwin, peer review consisted only of being accepted into a journal. In other words, your definition of a scientist requires a constant re-evaluation of past scientists against each new aspect and innovation of the future scientific method.
Edmund: I believe you are pulling a straw man and misrepresenting what I mean when I refer to the scientific method. Of course the process itself has changed and progressed over time, but the foundation of the scientific method is the idea that big questions should be answered only in a natural way, with no reference to supernatural forces. Even Isaac Newton, the father of the mechanical universe, left his negative space to the elusive power of the Almighty, a significant barrier in the way of true scientific innovation.
Peter: Hmmm. I think we are really getting somewhere now. Let me clarify. Your argument essentially revolves around the relationship between science and religion– you say that science done in tandem with the framework of a theistic worldview will necessarily yield itself to invoking the supernatural instead of seeking the natural. A "God of the Gaps," if you will.
Edmund: Yes! You cannot be a scientist if you are relying on axioms that cannot be proven to fill the gaps where you have yet to find an answer.
Peter: But many argue that indeed all atheistic scientists do this very thing! Thinkers like David Hume and Friedrich Nietzsche point out that the entire institution of science is founded upon theistic axioms, presupposing the consistency and order of the universe as well as the reliability of human observation in order to come to any accurate conclusions about how the world works.
Edmund: This is the difference between a foundational assumption and an interpolating assumption*. Belief in observation and in the order of the universe can be questioned, but scientists take these as fact not because they are proven, but because they are practical for scientific work. Putting "God" in as an explanation does not help scientific work; it prevents a genuine search for natural answers.
Peter: I think we are at the point where we would need to look more closely at specific theistic scientists and their work in order to proceed.
Edmund: But wait, before we wrap up, what can we conclude? Who was the first scientist?
Peter: I'm not convinced we can nail down a particular figure as the first- after all, humans have been trying to understand the world systematically for most of their existence.
Edmund: I have yet to be convinced that pre-Darwinian thinkers can be considered to be using the scientific method. But I think there's a lot more to be said on this issue.
...to be continued
I really enjoyed exploring this new form of thinking. Let me know if you liked this mode of posting. I also fixed the comments section so try it out!
*these are not actual terms used in the professional academic world, but rather of my own creation.
As I was finishing up a biology lab, I came to a sudden halt in my conclusion and evaluation. My Group 4 IB Biology lab had involved changing the wavelengths of light reaching a mesocosm to see how the spectrum of light available to plants affected their photosynthesis. Realistically, this meant that we grew radishes in bottles and changed the colors of the bottles for different conditions- clear, blue, green, and red. However, finding no red bottles available in the marketplace (c'mon, Coca-Cola, you really gotta get a move on), I resorted to wrapping a clear bottle with a (pink) clear wrap.
Flash forward to my sudden halt. I had assumed that pink had to be essentially red- after all, we get pink by mixing red and white, and white is just all the colors together. But someone else suggested pink was closer to violet, and here is where I stopped and stared into space... red and violet are on completely opposite ends of the light spectrum, and yet we consider pink to be pretty close to both? And now that I thought about it, mixing red and blue yields purple, but violet's wavelength lies beyond blue, not between red and blue? What's going on with the world?
It turns out there are pretty rational explanations for these questions- I was mixing up what's called additive and subtractive coloring- but that's beside the point. In asking these questions, I went far beyond what I had to do for any sort of coursework. In fact, the additional research (googling) I did will likely have no effect on my grade in any way. But this process is a perfect depiction of the real beauty of learning and its implications for the education system.
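For the curious, the additive/subtractive distinction I was missing can be sketched in a few lines (an illustrative toy of my own, not a rigorous color model):

```python
# Additive mixing (light): channels add, so red light plus blue light
# brightens toward magenta, a pink-ish purple.
def mix_additive(c1, c2):
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

# Subtractive mixing (pigment): each pigment absorbs part of the light,
# so channels multiply down toward black.
def mix_subtractive(c1, c2):
    return tuple(a * b // 255 for a, b in zip(c1, c2))

red, blue = (255, 0, 0), (0, 0, 255)
print(mix_additive(red, blue))     # (255, 0, 255): magenta
print(mix_subtractive(red, blue))  # (0, 0, 0): pure red and blue pigments cancel out
```

That's why red and blue light can make a purple that seems to sit "outside" the spectrum: magenta is not a spectral color at all, just the brain's reading of red and blue channels firing together.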
Some people make the mistake of assuming that students should love everything they are learning in school. As a result, many progressive schools are opting into 'design your own curriculum' programs where students choose what they want to study on a certain day. This sounds great but in reality has some seriously terrible implications. Given the option of choosing whatever I wanted to study, I would likely not choose color theory- I probably wouldn't even know what it was. I can't say that the intersection of light and agriculture interests me more than anything in the world; but doing this particular lab and taking this particular biology class led me to explore nuances of other subjects, like color theory, that I wouldn't have considered otherwise.
This isn't the only time that this has happened. My math class led me to cryptanalysis; my history class to economics. It is only by being exposed to a variety of disciplines at this age that I am able to really explore niche ideas and concepts that really have and keep a hold of my mind. The best part? No one knows what will fascinate me next.
- Alexandra G. Kytka
Don't forget to check out my podcast! Also, I have added and am continuing to update a "Projects" tab on the site in order to inform you of the various projects I am working on.
Most of us know on some level that playing interactive games is an important part of child development; however, most of us choose to leave that part of our childhood behind, except for the occasional game night or drinking game. I think this is a catastrophic mistake.
We tend to spend our free time in the following ways- watching films and shows, using social media, reading, or listening to podcasts (a great one is called Ergo, available on iTunes and SoundCloud :)). These activities certainly have their place, especially for the knowledge gained through such mediums (especially podcasts; definitely check us out). However, gaining knowledge without being able to process that knowledge is a serious mistake that can render it all for naught.
Playing games can be instrumental in this process. They have the potential to make you use your brain in new ways and develop your critical thinking skills in a way that consuming media cannot. The following are some games I suggest if you are interested in trying this out-
For All Intents and Purposes: Can be played independently during a commute or with other people. Based on a type of creativity IQ test, this game requires you to pick a random object and create as many uses for it as you can. Stipulation: You cannot take apart the object in any way.
Acronym Guess Who: This game needs to be played with at least one other person. Pick an acronym (I have in the past used HSBC, LDS, etc.). Then, take turns choosing adjectives for each letter to create the profile of a person and have the other person guess who it is.
The Question Game: How do you play the question game? If you respond with "Can you teach me?", you've got the right idea. During the question game, which is optimal when played between two people, you need to attempt a conversation comprising only questions that actually connect to each other's words without repeating any questions. (It's much harder than you might think.) Credit to my 8th grade Bible teacher Ms. Smith for teaching me this one.
Domestic Feline: Taken from the award-winning novel "The Help," this game involves taking a simple word/phrase (like house cat) and having another person say the same word in a more complicated, posh way (thus domestic feline). My favorite? "Salutations greetings my dear fraternal relation" as a replacement for "yo bro".
Vigenere Cipher: This last one isn't actually a game, but it is a fun, engaging way to communicate or pass notes to a friend. You can look up better instructions online, but essentially you and a friend agree on a keyword, then use a simple chart (or a little modular arithmetic) to shift each letter of a message by the corresponding letter of the keyword in order to send or receive it.
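If you'd rather let a computer do the chart-lookup for you, here is a minimal sketch of the Vigenère cipher in Python (the keyword "LEMON" and the message are just examples):

```python
import string

ALPHA = string.ascii_uppercase

def vigenere(message, key, decrypt=False):
    """Shift each letter of `message` by the corresponding letter of `key`."""
    out, i = [], 0
    for ch in message.upper():
        if ch in ALPHA:
            shift = ALPHA.index(key.upper()[i % len(key)])
            if decrypt:
                shift = -shift  # undo the shift when decoding
            out.append(ALPHA[(ALPHA.index(ch) + shift) % 26])
            i += 1
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

secret = vigenere("MEET AT NOON", "LEMON")
print(secret)                                    # the scrambled note
print(vigenere(secret, "LEMON", decrypt=True))   # back to "MEET AT NOON"
```

Both of you only need to remember the keyword; anyone who intercepts the note sees gibberish unless they know it too.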
Well, that's all I have for today. Now it's up to you to go from your consumption of this media to processing all your knowledge by playing a game! (But if you still need more media to consume, of course my podcast is still up for grabs).
- Alexandra Kytka
I don't know if this says more about F. Scott Fitzgerald or about me, but one of the most beloved character introductions I have ever read is that of Tom Buchanan, who is described as having peaked in life during his college football days. Now, there's a lot more to criticize about Tom's character (the casual adultery, misogyny, and racism might be a good place to start), but I was struck by this particular description because it made me think about the way we judge a person's life and the way they have chosen to fill up their time.
Occasionally I feel bad for those kids you read about who published a book at age 14 or graduated college at age 16 because their media coverage pretty much ends there- who knows what they have done since or whether their accomplishments will ever surpass their earlier ones?
Let me get straight to the point- the idea of someone reaching a "peak" in their professional life is a total lie- but it's a lie we let ourselves believe. This metaphor makes life into a struggle akin to climbing up a mountain in that there's only one place we hope to reach and once we get there, we simply wait around for others to catch up. The truth is more complicated.
In short, the way to prevent this so-called "peaking" is to make sure you don't climb a mountain- do anything else; go on a hike, a sail, or a walk to the corner store. The truth is that if you have only one goal in life, you're either going to spend your entire life trying to get somewhere you'll never be, or you'll get there and have nothing else to do. Kids who have spent their entire lives dreaming about Harvard will either be disappointed come college acceptance season or they'll get to Harvard and not know what to do next.
Life is incredibly diverse with opportunities to do different things. Your goal was to write a book and you just got one published? Great, now go travel the world and blog about it. You became the prosecution lawyer you've always dreamed of? Terrific, maybe now on Saturdays you can teach an ESL class for local immigrants. The truth is you can't peak if you live in a mountain range.
This has been my personal drive recently, and it's the reason why I have gotten to do so many cool things. Aside from my coursework and my preparation for future academic opportunities, I run this blog, produce a podcast, and have many other projects lined up that you will learn about soon. If you reflect on your own time and realize that you are spending too much time on achieving your 'peak,' I encourage you to explore more of what you love and perhaps you'll find something extraordinary.
- Alexandra Kytka
No sources for this post, but be sure to check out my podcast "Ergo" on iTunes and Soundcloud- the second episode on atheism and theism is being released this Friday!
Animal lovers may be interested to know that in Argentina a judge ruled that the chimpanzee "Cecilia" has rights under the law and could not be confined in a local zoo without a companion to spend her days with, while philosophy lovers may be interested in Judge Mauricio, who explained the position by quoting Immanuel Kant- "We may judge the heart of a man by his treatment of animals."
I'm sure PETA is rejoicing but I'm not as quick to celebrate- for in our desire to protect the wellbeing of animals and in Kant's own words actually lies more understanding of the nature of humanity than that of animals. Kant claims we can see into the morality of a man by looking at how he treats animals, a statement that sounds clear and radical at first. But look at what happens if we switch out animals for other people groups.
"We may judge the heart of a man by his treatment of the poor."
"We may judge the heart of a man by his treatment of the elderly."
"We may judge the heart of a man by his treatment of minorities."
Do you see the problem here? The statement seems to recognize the innate worth of its direct object (animals), but when we change the sentence around, it is clear it does no such thing. Rather, it creates a dichotomy between the (claimed) superior and inferior, the dominant and the submissive, the morally responsible and the helpless. When speaking about animals, this isn't necessarily bad. Because we are human (unless I have a secret animal audience- in that case, hello!), we naturally put the innate worth of our species over that of others. However, many news outlets reporting on this particular case called attention to animal rights beginning to be recognized legally. But granting a population allowances because we believe we are superior to them is not giving them rights. We cannot claim adherence to animal rights as an expression of moral righteousness or integrity.
As much as I love the cute little furry ones, I recognize that when we object to someone abusing a dog we do it not necessarily on a foundation of believing in the inner worth of that dog, but because we find pleasure in the cute little furry ones and we don't want someone to compromise that source of pleasure. Humanity's quest for animal rights is not actually about the animals at all- it's about us- seeking to take care of the vulnerable and simultaneously being placed on a pedestal of ethics.
I love that this chimpanzee is legally entitled to a companion. I love this because I am a human who seeks relationships and community and not because I know the scientific basis for the emotional stability of apes. I give my congratulations to Cecilia and the hopefully full life she will live in companionship. But I caution the supposed heroes of the animal rights movement to understand where they are coming from.
(Also, please check out my podcast "Ergo" available on iTunes and Soundcloud!)
- Alexandra Kytka
The following is a critique of the current state of the Young Adult Fiction genre of literature.
I consider myself a pretty avid reader; however, I think there are a few massive problems that usually dissuade me from the "Young Adult" (henceforth YA) genre of novels. The overwhelming consensus of my mind dictates that Young Adult fiction is faulty by nature, but more on that later.
1. The plots never take risks.
From what I have seen, most YA plots (especially those of the more popular novels) follow a very similar path. Open on a somewhat ordinary girl or boy who doesn't have much going on in their life. Put them through some interesting circumstances; now they are interesting, and it turns out they were born to be special.
Of course, some diverge from this path a bit, but I have been very disappointed by the fact that most of these plots stay on similar lines. Even a novel that doesn't follow this path will still stick to events and climaxes that have been proven to sell and entertain.
2. The characters are caricatures.
Now, I know how difficult it is to create a truly complex character. But it seems like these writers choose not to try at all. I enjoyed reading the Divergent series and The Hunger Games series, but both of those trilogies failed to create characters who were not just amplified versions of one virtue. Katniss was stubborn; that is all. We don't see her come at herself from different angles or question how stubborn she actually is. We don't see her stubbornness coming into conflict with her other traits. Katniss Everdeen is stubborn, and there we stop.
I would love to see a YA novel that creates a REAL, complex character: a teenager who is juggling dozens of spheres of life, not just trying to balance family and school. Real teenagers are working with family, school, friends, college plans, hobbies, life dreams, money problems, insecurities, and politics, among other things, and simplifying life to only two or three priorities is unrealistic.
3. The writers substitute emotional writing for good writing.
If I were a high school writing teacher, I would probably give As to most YA writers. But I'm not a teacher, and they are not in high school. Choosing words, selecting metaphors, and constructing imagery should serve a higher purpose than making a text sound good or attracting a reader. Truly good writing makes small choices that fit into a larger structure or idea to convey to the reader. However, the easy reads of YA tend toward fluff for fluff's sake. This means, of course, that the books are more attractive and perhaps readers are more willing to read them, but at the end of the day the writer has failed to make any lasting impact.
(4.) AND THE REASON WHY ALL OF THIS HAPPENS:
As I said just before, I think these problems are inherent in the YA novel. Good writing should not work like marketed products, with focus groups and demographic pie charts. Good writing should be good writing regardless of the generation an eligible reader belongs to, and writers should strive to write novels that are legitimately good not only for teenagers but beyond.
Especially nowadays, YA writers are churning out novels every two months or so because they rely on them for income or fame. That is, of course, understandable. But these writers, most of whom are adults in their thirties or forties, have a larger mission: to teach youth what makes writing unique. Instead, they are trying to mimic the way television works- pure entertainment for entertainment's sake, quick flashy scenes and dramatic characters. But if writers continue to do this, surely teens will stop reading altogether, as TV shows are making the choice to go the other way, crafting complex characters and unique plots with artistic choices for a higher purpose.
Exceptions to the Rule:
As I always like to say, there are always exceptions to the rule, but you can't make the rule based on the exception. There are some wonderful books usually categorized in the YA genre that I do love very much. These are not all of them.
The Book Thief: This book is one of my favorites from this era. That may be because it was marketed not ONLY as a YA novel. Regardless, I think this book is fascinating in its choice of perspective and motif, and somehow it managed to take a concept written about hundreds of times and turn it into its own unique majesty.
Outlander: There are more books in the series, I am told; however, I have only read the first thus far. I enjoyed Outlander because it brought me into a world that isn't explored very often (clan rivalry in 18th-century Scotland) and yet was able to inspire thinking about themes present in modern pop culture.
The moral of the story is this: If you're a writer, give the youth what we need, not necessarily what you think we want. If you're a reader, perhaps it's time to venture outside the limits of the books handed to you by marketing experts and find out what you like for yourself. Maybe it means we need to find someone willing to write these quality novels.
There are, I'm sure, more novels that do not fall into the errors I have pointed out. Some may even be included in the list below. However, my dear friend Chloe (http://pseudonymouswrites.weebly.com) made some gloriously terrible puns after I told her what I was writing about. So enjoy:
BAD IDEAS FROM MY FRIEND CHLOE
Paper towns more like Paper clowns.
Looking for Alaska more like Looking for a Good YA Novel.
Eleanor and Park more like Eleanor and Park that idea back where it came from.
Divergent more like diverge your way from this genre.
The Hunger Games more like The Hunger Games for a good YA novel.
13 Reasons Why more like 13 Reasons Why you shouldn't read YA novels.
City of Bones more like City of Bad YA novels.
Go Ask Alice more like Go Ask Alex why these are so bad.
Fallen more like fallin' off this YA bandwagon.
It's kind of a funny story more like it's kind of a horrible story.
A Wrinkle in Time more like A Wrinkle in This Plot.
The Maze Runner more like Maze, Run Away from this Book.
The Perks of Being a Wallflower more like the Perks of Being a Wall, because then you don't have to read any of these books.
- Alexandra Kytka
I have yet to meet someone free of this experience: scrolling through your Facebook feed, you see someone you barely know share a link to a Buzzfeed quiz: "What type of flower are you based on your favorite condiment?" or "Which pair of Hunger Games tributes are actually your parents?" Of course you click on it, because even though you have never thought about it before, you just HAVE to know if you belong to Rue and Thresh or if your parents named you after the right flower.
But why? Why have these strange personality quizzes taken ahold of our psyche? Why do we care so much about seemingly pointless labels?
I think in order to explain the presence of pointless labels, we have to explain the presence of labels in the first place. I'm actually very fascinated by labels. They are at fault for many of the negative stereotypes perpetuated across society, and they easily allow us to look down on others. Take the popular cliche of the "jock." If I call a boy a "jock" because he is sporty and hangs out with the popular kids, I may also assume that same boy is foolish because of that categorized group. At the same time, though, labels often help us organize and get through life. Just think back to that moment in "Mean Girls," or any Disney Channel TV show, where a shy teen navigates the cafeteria toward a group of friends through seas of cheerleaders and geeks galore.
I'd like to think about this historically as well. Think back to a 'simpler' time of identity, in which all your expected behavior, career, and position were decided for you by the family you were born into. A young, middle-class, white woman born an only child in the 1800s knows she is supposed to marry up to protect her mother. She knows she is supposed to act polite around those with more money or age. She should be able to play piano reasonably well and read a bit, although this should not get in the way of her courtships. Of course, she must not be better at those hobbies than any man around (yes, most of this comes from Jane Austen, oops).
But things have changed drastically in the past few hundred years, and we have let go of many of these social norms. Of course we still have class distinctions, but there is far less reliance on the family you were born into. When my cousin Sammy was born, no one was able to make any assumptions about what job she might have in the future, whom she would marry, or whether she would make more money than her older brother. Instead, she is building her identity on a blank canvas, exploring what she likes to do and hopefully succeeding based on her own desires.
This, of course, is a win for humanity. But all victories come with negative, unforeseen yet inevitable repercussions. We have lost our identity, which is not to say that we don't have one, but rather that we must work much harder to find it. It is not given to us out of the womb. There may be expectations placed on me as to what I should do with my future, but no one claims to know what will happen based on the position and social ranking of my parents (which is too bad, because my dad is a pretty awesome guy).
But looking for our identity is hard, especially in a consumer society of mass marketing where everywhere we look brands are looking to form our identity into a nice little demographic bubble to check off a list. More than ever, we are forced to ask the question "Who am I?" because we have to somehow forge a self out of a rubble of pebbles, passions, and everything in between.
And this is where we come to clickbait personality quizzes. Or maybe not. Maybe we start with Myers-Briggs and narrow ourselves down to one of sixteen basic personality types, but there are far more than sixteen kinds of people in this world. So when we see that there are so many little ways to define ourselves into sentimental or random categories, we click, because it's easier to take a two-minute quiz than to go out into the scary world to find our identity.
And so there's an explanation but not an answer to the question that is at the core of this all. Why do I keep clicking? When will I finally discover who I am? When will I be secure enough in my identity to avoid the Buzzfeed quizzes and turn myself into an original copy?
I can't give an answer for this but one thing I know for sure:
"One may understand the cosmos, but never the ego; the self is more distant than any star." (G.K. Chesterton)
- Alexandra Kytka
This is part of a series entitled “Examining Evolution” in which I will look at the evidence for and objections against evolution in an attempt to reconcile the Scientific and Religious spheres.
I have often been so inclined to designate Galileo, perhaps in jest, as my “would-be grandfather” in another life. Regardless of the unforeseeable implications of such a musing, there is nevertheless a foundation of reason upon which it is based. For I do believe that this Italian scientist has provided a strong case for the relationship between religion and science that I often cite to this day.
As many have learned in their years of high school history, Galileo marked a turning point in both science and society at the start of the Scientific Revolution. Copernicus had earlier proposed a heliocentric universe; however, he had offered the new model only ex suppositione, as a mere mathematical 'game' with no claim grounded in reality, and as a result the Church did not take direct action against him. By publishing his Dialogue Concerning the Two Chief World Systems in Italian instead of Latin, however, Galileo made clear his disagreement with the conclusions of the Church's scientists. He was adamant that his model of the universe was supported not only by abstract mathematical principles but by the concrete make-up of the universe as perceived by his senses.
In the present day we have arrived at a similar dilemma, not with regard to the sun but with regard to the earth: evolution. Science and religion have yet again come into conflict over this theory; battle lines are being drawn and forts built on each side. Some claim that evolution is irreconcilable with the Genesis account of creation, and a duel ensues over whom to trust more: the Bible or science.
Using Galileo's "Letter to Madame Christina of Lorraine, Grand Duchess of Tuscany," the basic principles of biblical exegesis, and the scientific process, I would like to extinguish this faulty premise altogether, so that both the scientific and the religious worlds may progress past this colloquial spat.
The fallacy lies in the question itself: Whom should we trust more, the Bible or science? Galileo certainly did not see this as the question we should be asking. He states,
Pay close attention to what Galileo is proposing. He explains that the people offering up biblical objections to his scientific ideas are merely using the Bible as a defense for their preconceived notions. Furthermore, these people understand neither the science they are trying to disprove nor the passages from Scripture they are pretending to use.
I once heard a man lecture on this very issue, and he said something that made me want to metaphorically hurl my desk at him: he claimed that the Bible must be trusted more than science because science involves interpretation whereas the Bible does not. What an uneducated statement! If the Bible is straightforward and never interpreted, why do we have so many different denominations in the Church, all claiming to base their mutually exclusive doctrines on the same Scriptures?
Clearly, the Bible may be interpreted in different ways depending on an individual's context, experience, and presuppositions. This is why I have a problem with people claiming various scientific theories to be false merely on biblical authority: in fact, they are not appealing to biblical authority at all, but to their own. Galileo explains that "The holy Bible can never speak untruth" but that "if one were always to confine oneself to the unadorned grammatical meaning, one might fall into error." This is not to say that one should never interpret the Scriptures or use them to support one's reasoning, but rather to make clear the logical distinction and to humble oneself, because (and I apologize profusely if this shocks you) you might be wrong.
But what about science? Is science the missing piece of the puzzle, the perfect method for gaining knowledge free of personal bias? Many today would say yes. This, I believe, is very ignorant, even if you simply consider the scientific process. Science is not nature. Science is our best understanding of nature. There is much we still do not know, and theories we hold now may be destroyed and reassembled when new evidence comes along. Data taken from experiments may easily be misrepresented or misunderstood. Conclusions drawn may have to be retracted if they do not follow logically from results.
So what does all this mean? Again, Galileo summarizes it well: "the holy Bible and the phenomena of nature proceed alike from the divine Word, the former as the dictate of the Holy Ghost and the latter as the observant executrix of God's commands". Not only did Galileo raise sense perception to the same level as the Bible; he said he could do so because he believed in a God who reveals himself through the senses, and because he could trust those senses on account of the good nature of this God.
This is essential because it suggests that the conflict is not between science and Scripture, but between interpretations of the Bible and interpretations of sense perception, both of which may falter and need to be carefully examined.
If (and this is a huge if) you believe the Bible to be true, and there seems to be a conflict between what it says and what science says, the conclusion ought not automatically be that science has gotten it wrong. For both you and the scientist have a limited understanding of God's world; if something seems askew, the two must be able to be reconciled as different revelations from the same Divine Will.
And this is what it comes down to. God created nature and God revealed his words to us. If there is a contradiction between them it means we have to work harder in both fields to ensure we have an accurate interpretation of Scripture and an accurate interpretation of our sense perception as dictated by the Scientific Process.
Over the course of the next few weeks I shall attempt to do this very thing with the very controversial theory of evolution, looking at it from biblical, scientific, and cultural perspectives. I obviously do not claim expertise in any of these fields, but rather wish to document my own exploration of them. If you would like to keep up to date, please subscribe to this blog and you will be notified when new posts are published.
- Alexandra Kytka