Sunday, January 26, 2020

Pedagogy Versus Andragogy

Jarvis's expanded theory of the transformation of the person through learning advances an argument for the andragogical model and the way in which adults learn. Given the quality and extent of experience an adult gains over the life span, these experiences are shaped and molded by societal roles and tasks and, according to Knowles (1984), will be used as a source of learning. A desire for knowledge, or to engage in the learning process, often identified as eagerness to learn, is stimulated by motivating shifts from one developmental task to another or by an important life episode, creating an adjustment in behavior that leads to improvement in some characteristic of our personal being. Laher (2007) indicates that such movement in an individual's development leads to an alteration from a subject-centered academic orientation to a problem-centered orientation to learning, mainly in response to changing life circumstances. This paper will evaluate 1) the role that social change plays in an andragogical approach with adult learners, 2) the responsibility of higher education institutions in facilitating adult education, and 3) a brief comparison between the pedagogical and andragogical approaches.

Social Change and Adult Learners

In terms of social change and the adult learner, several factors must be carefully considered: attending to the social recognition of learners, addressing barriers to the learning process, reaching disadvantaged learners, fostering critical reflection and ensuring experiential learning, and preparing for social action and community development. All of these efforts would contribute to the value of a college education among adult learners. Mason (2003) notes that one assumption to be considered is the readiness of learners to be self-directed, self-motivated, and personally resourceful.
Learners who experience disadvantage, lack social recognition, or face unequal access to educational opportunities may feel insecure or uncertain when approaching self-directed academics, resulting in feelings of inadequacy or low self-esteem borne out by the extent of their disadvantage and their main motivation for learning. Therefore, Merriam et al. (2007) suggest that empowering learners to act involves a number of tasks. Facilitating an environment in which adult learners can create relationships of equality is key to developing the skills needed for meaningful participation in adult curriculum programming at the university level. Having a role in collegiate democracy, such as problem definition, identification of adult student needs, problem solving, and decision-making structures, along with the development of critical reflection, allows adult students to become more invested in the educational process (pp. 23-27). Engagement at this level, and the opportunities it provides, will increase a sense of academic cohesion. Groups of adult learners working to carry out social change, and individual learners moving into opportunities for engagement that address this population's academic needs, will help create learning environments that provide academic experiences through which adult students can achieve valuable successes, build confidence, and reach academic goals. Experiential learning also helps adult learners identify their skills and strengths in order to devise progression options and to become agents of their own learning (Connolly, 2002, p. 7). Such learning, then, is not only the responsibility of the individual learner but must have a conduit by which it is facilitated.
The following section addresses the responsibility of higher education institutions in adult education and toward the individual learner.

Responsibility of Higher Education Institutions

This section proposes the importance of higher education institutions' role in both providing and facilitating adult education with appropriate curricula and strategies to enhance the adult learning experience. Institutional goals and objectives could be adjusted to fit the learner and provide maximum opportunity to synthesize existing knowledge with new information by designing curricula that relate experientially to the learner's developmental stage. The number of adults entering learning situations later in life is growing due to rapid displacement, shifts in the job market, technological demands, and the movement of employment overseas. Merriam et al. (2007) indicate that the two best predictors of adult participation in a state's higher education system were the availability of undergraduate education (number of seats available, public and private) and the educational attainment of the state's adult population (percentage of adults with a high school diploma or higher) (p. 69). Responses to voluntary or involuntary transitions, such as seeking education to maintain current employment or to change careers, have necessitated a return to college for many older adults. It becomes the obligation of higher education institutions to adjust teaching strategies, curricula, goals, and objectives to promote learning success in adult learners. Promoting external social change and providing optimum learning environments for older adult learners requires adjusting curricula and their delivery. The next section addresses the nature of the adult learner and the origins of andragogical principles and theory.
Pedagogy Versus Andragogy

This section briefly reviews the relationship between pedagogical and andragogical principles and the adult learner. The leading form of teaching in America is pedagogy: the didactic, conventional, teacher-directed method. Andragogy is a different method for instructing adult learners. The purpose of this section is to provide the reader with background on both instructional forms.

Pedagogical Assumptions. The term pedagogy derives from the Greek, meaning the art and science of teaching children. In the pedagogical model, the teacher has full responsibility for deciding what will be learned, how it will be learned, when it will be learned, and whether the material has been learned. Pedagogy places the student in a submissive role requiring obedience to the teacher's instructions. It is based on the assumption that learners need to know only what the teacher teaches them. The result is a teaching and learning methodology that promotes dependency on the instructor (Knowles, 1984). The pedagogical model has been the method most often applied equally to the teaching of children and adults, which is something of a contradiction in terms. As Knowles (1984) suggests, the contradiction exists because, as adults mature, they become increasingly independent and responsible for their own actions. They are often motivated to learn by a sincere desire to solve immediate problems in their lives, and they have an increasing need to be self-directing. In many ways the pedagogical model does not account for such developmental changes in adults, and thus produces tension, resentment, and resistance (Knowles, 1984).
According to Ozuah (2005), pedagogical theory emphasizes five major points: lack of experience, dependency (in terms of self-concept), external motivation, content-oriented learning, and readiness to learn. Due to their relatively short lifetimes, children have not had the opportunity to gain much useful experience from life events or developmental tasks. As a result, children rely on teacher and adult guidance to fill the void and supply information through predetermined course content, creating a frame of reference on which to build new learning (Knowles, 1984). Furthermore, what little experience children do have is perceived within their limited cognitive abilities. Other features of pedagogy also stand in opposition to andragogical principles. Children are dependent upon adults for direction and guidance; in terms of learning, the dependent child looks to teachers to define learning needs, and children are largely externally motivated to reach goals set not by them but by teachers and parents. Berk (2004) suggests that youth are concrete operational thinkers who operate in a here-and-now concept of achievement, and notes that until they are capable of thinking more abstractly, they are unable to apply current learning to future experiences. In pedagogical methodology, a child's readiness to learn is driven by measurable achievement goals rather than developmental tasks. Because children's goals are externally predetermined by teachers and parents, their readiness to learn aligns with adult expectations rather than their own. In other words, children's readiness to learn is highly correlated with content achievement, as is their dependency on teachers to know what it is they need to learn. Imel (1989) suggests that Knowles strongly believed that a comparison of pedagogical, teacher-oriented methodology with andragogical methodology would make the differences between adults and pre-adults clearly evident.
Andragogical Assumptions. Andragogy as a system of ideas, concepts, and approaches to adult learning was introduced to adult educators in the United States by Malcolm Knowles. Knowles, a professor of adult education at Boston University, introduced the term andragogy, which he defined as "the art and science of helping adults learn," in 1968. By 1980 he suggested the following: ". . . andragogy is simply another model of assumptions about adult learners to be used alongside the pedagogical model, thereby providing two alternative models for testing out the assumptions as to their fit with particular situations. Furthermore, the models are probably most useful when seen not as dichotomous but rather as two ends of a spectrum, with a realistic assumption (about learners) in a given situation falling in between the two ends" (Knowles, 1980, p. 43). The andragogical model as conceived by Knowles is predicated on four basic assumptions about learners, all of which relate to a learner's ability, need, and desire to take responsibility for learning: their self-concept moves from dependency to independency or self-directedness; they accumulate a reservoir of experiences that can be used as a basis on which to build learning; their readiness to learn becomes increasingly associated with the developmental tasks of social roles; and their time and curricular perspectives change from postponed to immediate application and from subject-centeredness to performance-centeredness (1980, pp. 44-45). The growth and development of andragogy as an alternative model of instruction has helped to improve the teaching of adults. In some usage, the science of education is subdivided into pedagogy (dealing with youth education) and andragogy (concerned with adult education), and there is some variety in the application of related terms.
Some countries use the term adult pedagogy, one (the Soviet Union) used the term autodidactics, among others, to refer to adult education activities, and a few countries use andragology to refer to andragogical science (Knoll, 1981, p. 92). Outside of North America there are actually two dominant viewpoints: ". . . one by which the theoretical framework of adult education is found in pedagogy or its branch, adult pedagogy . . . and the other by which the theoretical framework of adult education is found in andragogy . . . as a relatively independent science that includes a whole system of andragogic disciplines" (Savicevic, 1981, p. 88). Knowles (1975) suggests that adult learners, in contrast to child learners, evolve toward self-directed learning. One immediate reason was the emerging evidence that people who take the initiative in educational activities seem to learn more, and learn it better, than more passive individuals do. A second reason was that self-directed learning appears more in tune with our natural process of psychological development (1975, p. 14); Knowles observed that an essential aspect of the maturation process is the development of an ability to take increasing responsibility for one's life. A third reason was the observation that the many evolving educational innovations (nontraditional programs, the Open University, weekend colleges, and so on) throughout the world require learners to assume heavy responsibility and initiative in their own learning.

Summary

This paper has reviewed the research on approaches to adult learning in theory and practice. Additionally, consideration was given to the role social change has played in adult learning programming and community outreach opportunities for this population, noting successes in reaching both disadvantaged and under-represented learners.
The review of literature also confirms that community education works particularly well for those adult learners who experienced educational success in high school and who have access to college courses and affordable coursework. It is clear that andragogy and Malcolm Knowles have brought considerable attention to adult education as a separate field during the past three decades. Applied correctly, and in the hands of a skilled and dedicated facilitator, the andragogical approach to teaching and learning can make a positive impact on the adult learner. Knowles's introduction of andragogy was predicated on four basic assumptions drawn from the learning differences between adults and children. With maturity and age, an individual's self-concept becomes less dependent and more self-directed, while the individual accumulates a wealth of valuable experience that serves the learner when readiness to learn emerges. Additionally, Knowles notes that adults seek out learning when it is appropriate to fulfilling societal roles, and that orientation to learning represents the skills or knowledge sought to apply to daily problems in fulfilling those roles (Lee, 1998). Finally, learning becomes less subject-oriented and more problem-centered (Lee, 1998). In 1984, Knowles added a fifth assumption, that adults are internally rather than externally motivated, and in 1990 a sixth: the need to know why something must be learned, and its justification, prior to learning it (Fall, 1998).

Saturday, January 18, 2020

Gender Roles and the Perception of Women Essay

There was a time when having a daughter born to a family evoked more pity than congratulations from the community. Sons were valued more, for they were viewed as bringing practical help toward augmenting the family income through physical labor, as well as ensuring that the family name lived on through their progeny ("Feminism"). Daughters were valued only for the potential honor they could bring the family with a good marriage. In olden days, a good marriage was not necessarily defined by the couple's happiness but was deemed as such if both families stood to benefit from the union, with benefits usually measured in wealth, alliance, or business. Marriages then were basically "mergers." Women were not expected to accomplish anything other than the mastery of domestic duties and union with a suitable husband. After marriage, the only duties a woman was supposed to fulfill were to look after the needs of her husband and to give birth to as many children as possible, with preference for the birthing of sons. The 1920s and '30s saw a wave of feminism that sought to overturn the traditional gender role assigned to women. Feminists viewed patriarchy as oppressive to women and advanced the thinking that women are complements of men and therefore should be treated as equals. The 1920s also saw a major victory for women in the United States with the passage of a law that allowed for women's suffrage ("Feminism"). The Second World War in the 1940s also provided women with the opportunity to prove their worth outside their duties as homemakers. They started signing up as army nurses, members of women's corps, and workers in factories that provided supplies and ammunition to the "boys overseas." Even so, women still experienced discrimination at the hands of employers who believed that it was men's role to earn money for their families.
Those who were hired still faced inequality in wages, as their work was deemed easier compared to the men's (Acker 46). It has continually been an uphill climb for women in the assertion of their rights and the fight for identity and equality. Despite the progress made by women since the olden days, some cultures still place more premium on males. Sandra Cisneros's account (Kirszner 96-99) of being born into and living in a traditional, patriarchal society in the 1950s shows that even with the many new freedoms and rights accorded to women, their roles were still defined by marriage and domestic duties. "What I didn't realize was that my father thought college was good for girls – good for finding a husband. After four years of college and two more in graduate school, and still no husband, my father shakes his head even now and says I wasted all that education" (Kirszner 97). The selection goes on to relate Cisneros's attempts to get her father to acknowledge her achievements and herself as more than "only a daughter." She wanted to BE his daughter in every sense of the word and enjoy the same pride her father took in her brothers' achievements. "I often witness the 'hunch posture' from women after dark on the warrenlike streets of Brooklyn where I live. They seem to set their faces on neutral and, with their purse straps strung across their chests bandolier style, they forge ahead as though bracing themselves against being tackled" (Kirszner 242). In Brent Staples's observations on the "black man effect" in altering a public space (Kirszner 240), he presents the image of a woman who is determined to move forward yet remains aware of the possible challenges to her progress.
While in the story the context in which women are defined is couched in terms of potential threats from street violence and crime, one could almost picture the same description as applicable to the grim, set determination of the feminists who steadfastly battle for women's rights and progress. It has been many years since women achieved a major victory in suffrage and set about establishing their identity in society. Yet in some cases, there seem to be women who remain oblivious to, or at least have not benefited from, the new stature and rights women have been able to claim through years of struggle with a male-dominated society. In Anna Deavere Smith's "Four American Characters" monologue (2005), she shares a conversation she had with an elderly philosopher friend, Maxine Greene. In the conversation, Smith asked Greene, "What are two things that you don't know and still want to know?" Greene replied, "Personally I still feel that I have to curtsy when I see the president of our University and I feel that I ought to get coffee for my male colleagues even though I've outlived most of them." Smith follows this up with the characterization of Maryland convict Paulette Jenkins. Paulette Jenkins represents the women in abusive relationships who suffer in silence. She never spoke out because she didn't want people to know that there was something wrong with her family. She took her husband's abuse and allowed him to do the same to her children – children that she had in the belief that they would soften her husband. What would make a man do such a thing? At the same time, what would make a woman stand by helplessly as her husband beats her children and herself? Conflict in relationships between men and women is believed to stem from four main causes: men's jealousy, men's expectations of women and domestic work, men's sense of a "right" to "punish" their women, and the importance to men of asserting and keeping their authority.
Women, on the other hand, are kept silent by feelings of shame and responsibility (Dobash and Dobash 4). More often than not, the women feel that they deserved whatever the husband did to them. This acquiescence may be due to their cultural orientation toward women as subservient wives. Upbringing and cultural orientation can do much to influence a person's understanding and acceptance of gender roles (Dobash and Dobash 4). However, there is always the freedom of choice and personal introspection, which should allow individuals to reason out right and wrong and the applicability and rationale of traditions for themselves. The case of Sandra Cisneros is a perfect illustration of this. Despite being brought up in a highly patriarchal household and culture, she chose to follow her own desire and achieve in her own right. In the end, she managed to earn her father's respect and his acknowledgment that she, as a woman, could accomplish things and gain honor and pride for the family. Regardless of background, doctrine, or culture, everyone, man and woman, has that same choice in how their manhood or womanhood will be defined in their lives.

Works Cited

Acker, Joan. "What Happened to the Women's Movement? An Exchange." Monthly Review Oct. 2001: 46. Questia. 28 Sept. 2007.

Dobash, R. Emerson, and Russell P. Dobash. Women, Violence, and Social Change. New York: Routledge, 1992. Questia. 28 Sept. 2007.

"Feminism." The Columbia Encyclopedia. 6th ed. 2004. Questia. 28 Sept. 2007.

Kirszner, Laurie. Patterns for College Writing. 10th ed. New York: Bedford/St. Martin's, 2006.

McNeill, William H. "Violence & Submission in the Human Past." Daedalus 136.1 (2007): 5+. Questia. 28 Sept. 2007.

Smith, Anna Deavere. Four American Characters. 2005. TED.com. 27 Sept. 2007 <http://www.ted.com/index.php/talks/view/id/60>.

Friday, January 10, 2020

Planting the Seeds of Jealousy in PARADISE LOST

All great works of literature have at their center a strong conflict. After all, if there were no conflict between the protagonist and the antagonist, there would be little of interest in any work. Not all conflict, however, is external. That is, while the protagonist and the antagonist may be in conflict, the characters possess a number of internalized conflicts as well. Often, it is this internal conflict that drives the external conflict. Such is the case with the fourth and fifth books of John Milton's PARADISE LOST, where a number of internal conflicts are born of senses of jealousy and inferiority. When we first look at Book 4 of PARADISE LOST, we are introduced to the pitiful figure of Satan. Please note, the word pitiful is not used here flippantly: the character truly appears pathetic. Part of this is the result of Satan essentially being a stranger in a strange land who no longer feels welcome in heaven, having been cast out. This builds into feelings of resentment, anger, and jealousy toward the Earth in its paradise form. Satan then becomes driven to travel to paradise and disrupt things. It would seem almost as if Satan is a spoiled child who would rather destroy a toy than let anyone else play with it. Satan then redirects his internal conflicts toward the inhabitants of paradise, with Eve as his intended prey. If Satan can destroy Eve, he can destroy paradise. Of course, if Eve were strong she would be able to fend off his tempting, but she is not, because she is also internally conflicted. In Book Five, the angel Raphael informs Adam that there is the possibility that they may rise to a higher state. But in order to do so, they must remain loyal to God. On the surface, this would seem a rather simple task: simply follow orders and all will be well. Of course, human beings have a tendency to be their own worst enemies, and this was not lost upon Adam or Eve.
In this case, it is Eve who falters first. Part of the reason is that Eve also possesses a certain sense of conflict not unlike Satan's (this is foreshadowed in the dream sequence, which we will soon discuss). Satan, as a fallen angel, feels a certain inferiority toward the angels who remain in the good graces of God. In a similar vein, Eve feels equally conflicted, since she feels inferior to Adam. After all, the angel discusses how she and Adam can attain perfection, but he discusses this only with Adam. As such, there is a certain "snub" Eve feels, and this is probably what made her more susceptible to the taunts of Satan. In a way, Eve's eventual jealousy is similar to Satan's. Since the root of Satan's rebuke of God can be traced to his jealousy toward the Son, it would not be out of the question that Eve would feel a similar jealousy. Her jealousy, however, is directed toward Adam, who is seemingly favored by God and the heavens. Much as with Satan, this jealousy proves to be her undoing. This does not occur in the fourth or fifth book, but it is foreshadowed by a sequence in which God acknowledges that his creations, the humans, will eventually fall and falter. It would seem that God understands that such feelings are human nature and cannot be circumvented no matter what. Perhaps God understands that all creatures in heaven and earth are flawed, and these flaws lead to the weak emotions of jealousy, pride, and envy that are their undoing. It is also foreshadowed in the early part of Book Five, where Eve has a dream that prefigures her fall. Adam warns her not to pay mind to thoughts of feeding from the Tree of Knowledge. However, we can see how the seed of such malfeasance is planted. There is a clear understanding here that inferiority exists: since there is a Tree of Knowledge, there must obviously be things that Adam and Eve do not know.
In other words, they are obviously imperfect beings in the shadow of God and the angels. Again, this creates the seeds of jealousy that will later manifest as resentment and defiance. Remember, PARADISE LOST is not so much about physically being cast out of paradise as it is about the sad realization that all beings are flawed. Among these major flaws is the notion that an individual will suffer from envy and jealousy, emotions which can lead to great undoing if not properly held in check. Sadly, the seeds of jealousy in these two books of PARADISE LOST grow until they ultimately deliver what the title of the work suggests: an outright loss of salvation and the birth of the desire to regain it. This, of course, is another tale for another time.

Thursday, January 2, 2020

A Look at the History of Computers

Before the age of electronics, the closest thing to a computer was the abacus, although, strictly speaking, the abacus is actually a calculator, since it requires a human operator. Computers, on the other hand, perform calculations automatically by following a series of built-in commands called software. In the 20th century, breakthroughs in technology allowed for the ever-evolving computing machines that we now depend upon so totally that we practically never give them a second thought. But even prior to the advent of microprocessors and supercomputers, there were certain notable scientists and inventors who helped lay the groundwork for the technology that has since drastically reshaped every facet of modern life.

The Language Before the Hardware

The universal language in which computers carry out processor instructions originated in the 17th century in the form of the binary numerical system. Developed by German philosopher and mathematician Gottfried Wilhelm Leibniz, the system came about as a way to represent decimal numbers using only two digits: zero and one. Leibniz's system was partly inspired by philosophical explanations in the classical Chinese text the "I Ching," which explained the universe in terms of dualities such as light and darkness and male and female. While there was no practical use for his newly codified system at the time, Leibniz believed that it was possible for a machine to someday make use of these long strings of binary numbers. In 1847, English mathematician George Boole introduced a newly devised algebraic language built on Leibniz's work. His "Boolean algebra" was actually a system of logic, with mathematical equations used to represent statements in logic. Equally important was that it employed a binary approach in which the relationship between different mathematical quantities would be either true or false, 0 or 1.
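To make these two ideas concrete, here is a minimal Python sketch (an illustration of the concepts, not drawn from any historical source): a decimal number reduced to Leibniz's two digits by repeated division, and Boole's true/false algebra operating on those digits.

```python
# Leibniz's idea: any decimal number can be written using only 0s and 1s.
def to_binary(n):
    """Convert a non-negative decimal integer to its binary digit string."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

# Boole's idea: statements in logic reduce to operations on true/false (1/0).
def boolean_and(a, b):
    return a & b  # 1 only when both inputs are 1

def boolean_or(a, b):
    return a | b  # 1 when at least one input is 1

print(to_binary(13))      # 1101
print(boolean_and(1, 0))  # 0
print(boolean_or(1, 0))   # 1
```

The repeated division by two is the standard decimal-to-binary conversion; the two Boolean operations are exactly the kind of true/false relations that Peirce later realized could be carried out by electrical switching circuits.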
As with Leibniz, there were no obvious applications for Boole's algebra at the time. However, mathematician Charles Sanders Peirce spent decades expanding the system and, in 1886, determined that the calculations could be carried out with electrical switching circuits. As a result, Boolean logic would eventually become instrumental in the design of electronic computers.

The Earliest Processors

English mathematician Charles Babbage is credited with having assembled the first mechanical computers, at least technically speaking. His early 19th-century machines featured a way to input numbers, memory, and a processor, along with a way to output the results. Babbage called his initial attempt to build the world's first computing machine the "difference engine." The design called for a machine that calculated values and printed the results automatically onto a table. It was to be hand-cranked and would have weighed four tons. But Babbage's baby was a costly endeavor: more than £17,000 was spent on the difference engine's early development. The project was eventually scrapped after the British government cut off Babbage's funding in 1842. This forced Babbage to move on to another idea, an analytical engine, which was more ambitious in scope than its predecessor and was to be used for general-purpose computing rather than just arithmetic. While he was never able to follow through and build a working device, Babbage's design featured essentially the same logical structure as the electronic computers that would come into use in the 20th century. The analytical engine had integrated memory (a form of information storage found in all computers) that allowed for branching, the ability of a computer to execute a set of instructions that deviates from the default sequence order, as well as loops, sequences of instructions carried out repeatedly in succession.
Despite his failure to produce a fully functional computing machine, Babbage remained steadfastly undeterred in pursuing his ideas. Between 1847 and 1849, he drew up designs for a new and improved second version of his difference engine. This time, it calculated decimal numbers up to 30 digits long, performed calculations more quickly, and was simplified to require fewer parts. Still, the British government did not feel it was worth their investment. In the end, the most progress Babbage ever made on a prototype was completing one-seventh of his first design. During this early era of computing, there were a few notable achievements. The tide-predicting machine, invented by Scotch-Irish mathematician, physicist, and engineer Sir William Thomson in 1872, was considered the first modern analog computer. Four years later, his older brother, James Thomson, came up with a concept for a computer that solved mathematical problems known as differential equations. He called his device an "integrating machine," and in later years it would serve as the foundation for systems known as differential analyzers. In 1927, American scientist Vannevar Bush started development on the first machine to be named as such and published a description of his new invention in a scientific journal in 1931.

Dawn of Modern Computers

Up until the early 20th century, the evolution of computing was little more than scientists dabbling in the design of machines capable of efficiently performing various kinds of calculations for various purposes. It wasn't until 1936 that a unified theory of what constitutes a general-purpose computer and how it should function was finally put forth. That year, English mathematician Alan Turing published a paper titled "On Computable Numbers, with an Application to the Entscheidungsproblem," which outlined how a theoretical device called a "Turing machine" could be used to carry out any conceivable mathematical computation by executing instructions.
In theory, the machine would have limitless memory, read data, write results, and store a program of instructions. While Turing's computer was an abstract concept, it was a German engineer named Konrad Zuse who would go on to build the world's first programmable computer. His first attempt at developing an electronic computer, the Z1, was a binary-driven calculator that read instructions from punched 35-millimeter film. The technology was unreliable, however, so he followed it up with the Z2, a similar device that used electromechanical relay circuits. While an improvement, it was in assembling his third model that everything came together for Zuse. Unveiled in 1941, the Z3 was faster, more reliable, and better able to perform complicated calculations. The biggest difference in this third incarnation was that the instructions were stored on an external tape, allowing it to function as a fully operational program-controlled system.

What's perhaps most remarkable is that Zuse did much of his work in isolation. He'd been unaware that the Z3 was Turing complete, or in other words, capable of solving any computable mathematical problem, at least in theory. Nor did he have any knowledge of similar projects underway around the same time in other parts of the world. Among the most notable of these was the IBM-funded Harvard Mark I, which debuted in 1944. Even more promising, though, was the development of electronic systems such as Great Britain's 1943 computing prototype Colossus and ENIAC, the first fully operational electronic general-purpose computer, which was put into service at the University of Pennsylvania in 1946. Out of the ENIAC project came the next big leap in computing technology: John von Neumann, a Hungarian mathematician who had consulted on the ENIAC project, would lay the groundwork for a stored-program computer.
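The "Turing machine" behind that notion of Turing completeness can be illustrated with a toy simulator: a table of (state, symbol) rules driving a read/write head over an unbounded tape. The rule table and names below are invented for illustration; they are a sketch of the idea, not Turing's own formalism:

```python
# A toy Turing machine: rules map (state, symbol) -> (write, move, next_state).
# The tape is unbounded; unvisited cells read as the blank symbol "_".
# This particular rule table simply flips every bit on the tape.
from collections import defaultdict

def run_machine(rules, tape, state="start"):
    cells = defaultdict(lambda: "_", enumerate(tape))
    pos = 0
    while state != "halt":
        write, move, state = rules[(state, cells[pos])]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_machine(flip, "1011"))  # -> 0100
```

Turing's insight was that a single machine of this shape, given the right rule table, can carry out any computation that can be carried out at all.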
Up to this point, computers operated on fixed programs, and altering their function (for example, from performing calculations to word processing) required the time-consuming process of manually rewiring and restructuring them. (It took several days to reprogram ENIAC.) Turing had proposed that, ideally, having a program stored in memory would allow the computer to modify itself at a much faster pace. Von Neumann was intrigued by the concept, and in 1945 he drafted a report that laid out in detail a feasible architecture for stored-program computing. His published paper would be widely circulated among competing teams of researchers working on various computer designs. In 1948, a group in England introduced the Manchester Small-Scale Experimental Machine, the first computer to run a stored program based on the von Neumann architecture. Nicknamed "Baby," the Manchester machine was an experimental computer that served as the predecessor to the Manchester Mark I. The EDVAC, the computer design for which von Neumann's report was originally intended, wasn't completed until 1949.

Transitioning Toward Transistors

The first modern computers were nothing like the commercial products used by consumers today. They were elaborate hulking contraptions that often took up the space of an entire room. They also consumed enormous amounts of energy and were notoriously buggy. And since these early computers ran on bulky vacuum tubes, scientists hoping to improve processing speeds would either have to find bigger rooms or come up with an alternative. Fortunately, that much-needed breakthrough was already in the works. In 1947, a group of scientists at Bell Telephone Laboratories developed a new technology called point-contact transistors. Like vacuum tubes, transistors amplify electrical current and can be used as switches. More importantly, they were much smaller (about the size of an aspirin capsule), more reliable, and used much less power overall.
The co-inventors, John Bardeen, Walter Brattain, and William Shockley, would eventually be awarded the Nobel Prize in Physics in 1956. While Bardeen and Brattain continued doing research work, Shockley moved on to further develop and commercialize transistor technology. One of the first hires at his newly founded company was an electrical engineer named Robert Noyce, who eventually split off and formed his own firm, Fairchild Semiconductor, a division of Fairchild Camera and Instrument. At the time, Noyce was looking into ways to seamlessly combine the transistor and other components into one integrated circuit, eliminating the process in which they had to be pieced together by hand. Thinking along similar lines, Jack Kilby, an engineer at Texas Instruments, ended up filing a patent first. It was Noyce's design, however, that would be widely adopted. Where integrated circuits had the most significant impact was in paving the way for the new era of personal computing. Over time, they opened up the possibility of running processes powered by millions of circuits, all on a microchip the size of a postage stamp. In essence, this is what has enabled the ubiquitous handheld gadgets we use every day, which are, ironically, much more powerful than the earliest computers that took up entire rooms.
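The stored-program principle von Neumann outlined, and which machines from the Manchester "Baby" onward embodied, can be sketched as a single loop that fetches and executes instructions held in the same memory as the data. The tiny four-instruction set below is invented for illustration; real machines differ in every detail, but the shape of the loop is the same:

```python
# A bare-bones sketch of a stored-program machine: instructions and data
# share one memory, and a fetch-decode-execute loop interprets them.
# The instruction set (LOAD/ADD/STORE/HALT) is invented for illustration.

def execute(memory):
    acc, pc = 0, 0                    # accumulator and program counter
    while True:
        op, arg = memory[pc]          # fetch the next instruction
        pc += 1
        if op == "LOAD":              # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program (cells 0-3) and data (cells 5-7) live in the same memory, so
# "reprogramming" means rewriting memory, not rewiring the machine.
mem = [("LOAD", 5), ("ADD", 6), ("STORE", 7), ("HALT", 0), None, 2, 3, 0]
print(execute(mem)[7])  # -> 5
```

That one design decision, keeping code in rewritable memory, is what turned days of rewiring into moments of loading a new program, and it remains the architecture of essentially every computer in use today.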