<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Andrew Komar, Author at The McGill Daily</title>
	<atom:link href="https://www.mcgilldaily.com/author/andrewkomar/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.mcgilldaily.com/author/andrewkomar/</link>
	<description>Montreal I Love since 1911</description>
	<lastBuildDate>Tue, 04 Sep 2012 20:40:13 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	

<image>
	<url>https://www.mcgilldaily.com/wp-content/uploads/2012/08/cropped-logo2-32x32.jpg</url>
	<title>Andrew Komar, Author at The McGill Daily</title>
	<link>https://www.mcgilldaily.com/author/andrewkomar/</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>One man&#8217;s small step, mankind&#8217;s giant leap</title>
		<link>https://www.mcgilldaily.com/2012/08/one-mans-small-step-mankinds-giant-leap/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Thu, 30 Aug 2012 10:00:27 +0000</pubDate>
				<category><![CDATA[Sci + Tech]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=23115</guid>

					<description><![CDATA[<p>Why the future of space travel is – and needs to be – robotic </p>
<p>The post <a href="https://www.mcgilldaily.com/2012/08/one-mans-small-step-mankinds-giant-leap/">One man&#8217;s small step, mankind&#8217;s giant leap</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The 1969 landing of <em>Apollo 11</em>’s <em>Eagle</em> on the surface of the moon was the defining moment of the 20th century and arguably of all of history, turning the swords forged in the hellhole of industrialized warfare into plows that planted the first seeds of human experience beyond Earth. Long after almost every other person living in the 20th century has passed from living memory, the legend of Neil Armstrong will be remembered. Armstrong’s “small step” is credited as inspiring generations of astronauts, scientists, and engineers to pursue their future careers. Since 1972, no one has again set foot on the moon, and the technology required to bring people there and back safely no longer exists.</p>
<p>After this heyday of moon exploration, the emphasis has shifted from moonshots to low Earth orbit missions and unmanned probes. In August of this year, NASA landed <em>Curiosity</em> on Mars, the latest and most ambitious in a series of Martian rovers and landers. <em>Curiosity</em> is a six-wheeled, nuclear-powered, laser-wielding, car-sized robot that will research geological indicators of water and of any life that could once have lived on the planet. NASA engineers developed an entirely new “skycrane” landing technology for the rover: essentially an automated, rocket-powered hovering platform that fired its engines to slow the descent and lower the rover to the surface. Most critically, the rover carries hundreds of pounds of scientific equipment – a virtual laboratory – which will allow scientists to make all the important measurements on site.</p>
<p>While the landing of an atomic space robot science tank on Mars is unquestionably awesome, it is unlikely that <em>Curiosity</em> or any other robotic mission will have anywhere near the cultural impact of a manned mission to Mars. We have unprecedented access to the fruits of this exploration, but more people on Twitter were talking about systems engineer Bobak Ferdowsi (a.k.a. Mohawk Guy) and his star-laden haircut than about any of the scientific and engineering achievements of <em>Curiosity</em>. Even President Obama said in his congratulatory message to the team that “you guys are a lot cooler than you used to be.”</p>
<p>It was the old school of “white socks, pocket protector, nerdy engineer[s],” as Neil Armstrong described himself, that brought us the <em>Apollo</em> triumph. Half a billion people – over a fifth of the world population – tuned in to watch the grainy black-and-white images of the first lunar steps. It was a moment when all of humanity was united in one humble man’s experience.</p>
<p>Coincident with this high water mark of public admiration was the virtually unlimited budget NASA required to ‘win’ the space race, and therein lies the paradox of the path that NASA’s manned and unmanned programs have taken since then. Adjusted for inflation, <em>Apollo</em> would have cost a hefty $109 billion in 2010 dollars; at its height in the sixties, the program took up 3.45 per cent of the American government’s spending. After <em>Apollo 11</em>, though, the program was drastically scaled back, and two planned science-focused moon landings were cut.</p>
<p>Furthermore, after <em>Apollo</em>, NASA was tasked with developing the Space Shuttle as well as a long-term program of robotic exploration of our solar system. With an ever-diminishing budget, successive directors were faced with tough choices on the future of the undeniably inspiring manned space program. When President Obama axed the Constellation program and its next-generation <em>Ares</em> rockets, which would have brought tomorrow’s astronauts back to the moon and perhaps even to Mars, Neil Armstrong was blunt in his criticism. At a congressional hearing he stated that “for The United States, … to be without carriage to low Earth orbit and with no human exploration capability to go beyond Earth orbit &#8230; destines our nation to become one of second or even third rate stature.”</p>
<p>We should keep in mind that the primary motivation for <em>Apollo</em> funding was political rather than scientific – indeed, only one of the twelve astronauts who walked on the moon was a scientist. Once America had won the space race, public awareness of NASA’s continuing scientific missions faded into relative obscurity.</p>
<p>Since the days when the only players in the space game were superpowers engaged in proxy warfare, the aims and accessibility of space flight have changed dramatically. In recent years, private companies have begun ferrying cargo and passengers into orbit. The company SpaceX recently docked an autonomous spacecraft, <em>Dragon</em>, to deliver supplies to the International Space Station, an arrangement that will become increasingly normal as this newly-formed “space economy” begins to grow and thrive. Far from being the only player in the game, NASA now benefits from an industry devoted to making space travel more economical. Indeed, portions of NASA’s budget are earmarked exclusively for investment in these private companies to spur further innovation.</p>
<p>The nascent space tourism sector is also democratizing access to space – to a point – with Virgin Galactic beginning to offer suborbital trips for a tiny fraction of the price once demanded by the former superpowers. Private investors in Planetary Resources have signaled their interest in asteroid mining, which may eventually create the infrastructure necessary for human interplanetary space travel.</p>
<p>The de-emphasis on manned space travel rubbed Neil Armstrong the wrong way, and understandably so. In a post-space race context, NASA has shifted its primary focus away from manned missions and towards purer scientific research. While this has minimized the ‘human interest’ aspect of its missions and thus their public support, the post-space race years have also been the most fruitful in terms of scientific output. <em>Curiosity</em> is set to explore a geologically promising section of Mars for the next two years, and is very likely to keep exploring for much, much longer.</p>
<p>Unfortunately for our pioneering spirit, it is both difficult and expensive to get a fragile human into outer space and keep them alive long enough to do interesting science. Consider that two of the earliest space probes launched just after the heady days of the moonshot, <em>Voyager I</em> and <em>Voyager II</em>, have already begun to leave our solar system altogether. It is simply more practical to extend human consciousness into the cosmos through robotic proxies than it is to go ourselves.</p>
<p>The thought of distant probes may not be as romantic as a man standing on the moon, but these robotic sojourners will remain intact long after the sun itself has died, a permanent testament to human ingenuity and audacity forever adrift among the stars.</p>
<p>The post <a href="https://www.mcgilldaily.com/2012/08/one-mans-small-step-mankinds-giant-leap/">One man&#8217;s small step, mankind&#8217;s giant leap</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Could the internet be shut off?</title>
		<link>https://www.mcgilldaily.com/2012/03/could-the-internet-be-shut-off/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Sat, 31 Mar 2012 05:27:52 +0000</pubDate>
				<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[SideFeatured]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=15727</guid>

					<description><![CDATA[<p>Control over communication, from James Admin to Egypt </p>
<p>The post <a href="https://www.mcgilldaily.com/2012/03/could-the-internet-be-shut-off/">Could the internet be shut off?</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>This past February, occupiers in the James administration used Twitter to brand themselves as #6party, and to communicate with their supporters. They advertised a livestream feed, from which they broadcast the first hours of their time in Deputy Provost (Student Life and Learning) Morton Mendelson’s office. They dispatched press releases on a blog. On the first day of the occupation, after denying them food from supporters, and before shutting off the water in the washrooms, the administration took away their wireless connection.</p>
<p>It is almost banal to point out that the internet is an essential tool for social organization today – from organizing protests, to house parties, to vote mobs, it has become irreplaceable. But the radical new powers granted to the average netizen have proven disruptive to institutions based on more traditional models. For authoritarian forces seeking to maintain control, or for grandiose activists such as Anonymous wishing to overturn the status quo, the internet – or control thereof – can serve as an irresistible tool.</p>
<p>The role of social media is often cited as a key factor in the success of a variety of large-scale revolutions. But for all its power and utility, the internet is still vulnerable due to the nature of its infrastructure.</p>
<p>Last year, in response to the millions of people mobilized in Tahrir Square, the Egyptian government decided to shut off the internet. Ordinarily, the internet functions by allowing individuals to send and receive data from storage locations known as servers anywhere in the world. Data travels in small chunks called packets, and the pathway of an individual packet (chosen hop by hop by routers) can vary from packet to packet, but each one must pass through routers along the way. What the Egyptian government probably did was shut off the critical routers that tell the rest of the internet where Egyptian IP addresses can be found, effectively sending every Egyptian netizen into an internet-free limbo. In Egypt, only four key routers needed to be shut down to stop the internet – the infrastructure was particularly vulnerable to this abuse of power. Canada, in contrast, has hundreds or thousands of these critical routers. In the case of McGill’s campus internet, which is provided through routers and ethernet cables, the infrastructure is controlled entirely by the McGill administration.</p>
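<p>To make the mechanism concrete, here is a minimal sketch (in Python, with entirely made-up addresses and router names) of the idea described above: routers keep tables mapping address prefixes to next hops, and withdrawing a handful of entries makes whole blocks of the network unreachable. This is a toy model, not the actual BGP protocol that real routers speak.</p>
<pre><code># Toy routing table: address prefix -> next hop (hypothetical values).
# Real internet routing uses BGP; this only illustrates the principle.
routes = {
    "41.128": "router-cairo-1",   # an Egyptian prefix (made up)
    "41.129": "router-cairo-2",
    "142.157": "router-mcgill",   # a Canadian prefix (made up)
}

def lookup(address):
    """Find the next hop for an address, or report it unreachable."""
    prefix = ".".join(address.split(".")[:2])
    return routes.get(prefix, "UNREACHABLE")

print(lookup("41.128.7.9"))     # router-cairo-1

# "Shutting off the internet": withdraw the routes that advertise
# where the Egyptian prefixes live. Nothing else needs to change.
for prefix in ["41.128", "41.129"]:
    del routes[prefix]

print(lookup("41.128.7.9"))     # UNREACHABLE -- the address still
                                # exists, but no router knows a path.
</code></pre>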
<p>Controlling the internet runs deeper than just shutting off routers: Bill C-30, introduced by Vic Toews of “Vikileaks” fame, would give the Canadian government new powers to track individuals’ internet activities in real time. According to Minister Toews, if you disagree, you obviously support child predators. Unfortunately for the proponents of the bill, the current architecture of the internet does not allow for this type of tracking, since the routers that pass data along do not actually inspect the content of that data. In order to enforce the bill, the telecom companies (who own the vast majority of the routers, cables, and towers that bring the internet to you) would have to undertake enormous hardware and software retrofits. This would effectively bring the free flow of data – and the speed of your internet activity – to a crawl.</p>
<p>The response from the internet was swift and hilarious: under the Twitter hashtag #TellVicEverything, individuals tweeted the banal details of their lives at the minister’s account in typically funny fashion. The online hacktivist community known as Anonymous also took note. Anonymous has made many powerful enemies in the past with its bottom-up disruption of online infrastructure, made possible by the very same internet technologies.</p>
<p>Anonymous specializes in shutting down the online presence of its ideological opponents using a piece of software known as the “Low Orbit Ion Cannon”. This software coordinates sympathetic individuals’ computers into a so-called ‘botnet’ to send many thousands of requests per second towards the targeted servers. The sheer volume of these incoming data packet requests causes the servers to overload and shut down. Victims of this type of attack by Anonymous have included the FBI, MPAA, Visa, and Mastercard, with the latter attacks even resulting in arrests after an international investigation into the Anonymous hacker group.</p>
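<p>The arithmetic behind such an attack is blunt: a server can complete only so many requests per second, and once the incoming rate exceeds that capacity, the backlog grows without bound and legitimate users are crowded out. The sketch below (again Python, with invented numbers) shows how quickly a queue explodes once demand passes capacity:</p>
<pre><code># Toy model of server overload. The numbers are invented for
# illustration; real capacities vary enormously.
CAPACITY = 1_000          # requests the server can finish per second

def backlog_after(seconds, incoming_rate):
    """Queue length after a sustained flood at incoming_rate req/s."""
    backlog = 0
    for _ in range(seconds):
        backlog += incoming_rate           # new requests arrive
        backlog -= min(backlog, CAPACITY)  # server drains what it can
    return backlog

print(backlog_after(60, 900))     # 0 -- under capacity, queue stays empty
print(backlog_after(60, 5_000))   # 240000 -- a botnet-scale flood buries
                                  # the server in under a minute
</code></pre>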
<p>As our lives and the global economy become ever more intertwined through the seemingly limitless power of the internet, we should reflect that the traditional power struggles that have defined human history will inevitably follow us into our new global playground. The nature of the internet is robust – but not invulnerable. Those willing to bend it to their own ends will do so. Whether they do it for power or “for the lulz” depends on the nature of our society, which is now inextricably a global one.</p>
<p>The post <a href="https://www.mcgilldaily.com/2012/03/could-the-internet-be-shut-off/">Could the internet be shut off?</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Bullshit on your plate</title>
		<link>https://www.mcgilldaily.com/2012/03/bullshit-on-your-plate/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 12 Mar 2012 04:06:37 +0000</pubDate>
				<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=14565</guid>

					<description><![CDATA[<p>Bug burgers, manure foam, factory farming, and the future of meat </p>
<p>The post <a href="https://www.mcgilldaily.com/2012/03/bullshit-on-your-plate/">Bullshit on your plate</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The fact that we can choose to consume animal protein of all varieties at every meal is only made possible by an uncomfortably disgusting supply chain. Many consumers rarely give a single thought to the industrial horror that enables this voracious consumption, simply buying the neatly packaged results of this process. As a red-blooded Albertan carnivore, I participate in this process. In the context of a growing world of seven billion hungry people, we should at least be able to stomach the disturbing reality of the meat on our plate. Among those of us who eat meat – roughly 96 per cent of Canadians – it often seems an unspoken reality to not talk about the process behind its production.</p>
<p dir="ltr">Right now, we &#8220;produce&#8221; over 58 billion animals every year to satisfy our hunger. The issue boils down to one of supply and demand. How can you maximize the amount of meat produced in the shortest amount of time? The answer that the market has come up with is the factory farm. Forget idyllic scenes of frolicking livestock living out happy lives before a humane slaughter, well over 99 per cent of all animals produced for consumption will come from factory farms. To maximize profits, time and space for growing is kept to an absolute minimum. An average factory farmed chicken will spend its full 39 day life (yes, seven weeks) locked in a cage with space less than a quarter of the size of a laid out newspaper. Thousands of chickens will occupy a single building, stacked on top of each other without any light and no ventilation, literally shitting on top of each other for their entire miserable lives. Pig and cow operations aren’t much better, with these social animals being confined in similarly appalling shit-soaked conditions.</p>
<p dir="ltr">To allow a profitable percentage (owners expect 5-10 per cent of animals to die in on the farms) of the animals to reach an age where they can be sent off to slaughter, they are continually pumped full of growth hormone and antibiotics to stave off any infections caused by a constant contact with shit. In the US, between 70 and 80% of all antibiotics produced annually are used in the agriculture industry. <a href="http://blogs.discovermagazine.com/80beats/2012/02/28/yes-antibiotics-used-on-livestock-do-breed-drug-resistant-bacteria-that-infect-humans/">This absurd use of antibiotics has already led to antibiotic resistant bacteria evolving as well as becoming communicable with human workers. </a></p>
<p>Most people are not swayed by humanitarian arguments for the cogs in the meat machine, preferring not to give a shit. It is the shit, however, that is one of the largest issues. A single farm of 2,500 dairy cows produces as much shit as a city of over 450,000 people. Unlike the comparable city, it is virtually certain that this shit will go untreated, with much of the excess from cow, pig, and chicken farms simply stockpiled on site. Some of this concentrated shit will infiltrate the local aquifers. Some factory farm operations simply spray it into the air, where the aerosolized shit can settle on unsuspecting people downwind. On some pig farms, the ‘manure foam’ that develops as a byproduct of all the stockpiled shit has <a href="http://grist.org/factory-farms/sht-happens-mysterious-manure-foam-causes-pig-farms-to-explode/">even started to explode</a>, killing the thousands of pigs inside the barn in a literal explosion of their own shit. With all that decomposing shit comes an enormous production of methane gas – indeed, agricultural methane accounts for over 7 per cent of greenhouse gas emissions worldwide.</p>
<p>Given the sheer volume of shit, the many animals shipped to one of the handful of slaughterhouses will almost certainly be caked in it. The volume of animals regularly processed at these facilities makes a rigorous cleanup of the meat impossible. Slaughterhouses are notorious for their poor record on worker rights, in particular for exploiting vulnerable immigrant communities with little hope of legal protection. The speed of the ‘disassembly line’ necessary to pull a profit makes human error very likely, with the net result that shit from poorly butchered digestive tracts often ends up on other pieces of meat. Thankfully, cooking the meat will usually kill the bacteria associated with all this shit, but that doesn’t negate the fact that the shit is there in the first place. Unfortunately, even the greenest organic farm’s meat will most likely be processed in one of these slaughterhouses, so buying organic alone is not a solution to all that contamination.</p>
<p>This may be leaving a bad taste in your mouth, so you may be asking what the alternatives are. As a devoted meat eater (and frequent Schwartz’s customer), I’m not content to simply accept vegetarianism as the logical conclusion from all this shit (though I rarely eat chicken, based on the amount of shit per unit weight). Some scientists agree that meat should stay on the menu, and have been working on alternatives – such as growing the meat in the lab. <a href="http://www.bbc.co.uk/news/science-environment-16972761">Dutch scientists recently used stem cell technology to produce the first lab-grown burger in history</a>. While the costs are currently prohibitive (it cost $200,000 to produce the burger), the technology has at least been shown to exist. Mark Post, the senior researcher for the team as well as the gourmand lucky enough to eat the burger, is optimistic that many of the technological hurdles will be worked out if research continues.</p>
<p><a href="http://news.discovery.com/human/insects-food-worms-environment-110123.html"> Another alternative, currently shunned because of the ‘ick’ factor, is to eat insects for the protein.</a> From a pure industrial perspective, insect cultivation is 600-800 per cent more efficient than current practices, doesn’t require any antibiotics due to the lack of human-transferable diseases, and barely produces any shit. Instead of cutting down rain forests to make way for ranchland, insects can be grown on sugarwater indoors. Bugs, like mealworms or grasshoppers are already eaten in many countries as a delicacy, and could be produced and processed to resemble our favorite processed food at a fraction of the cost of regular meat production, without the enormous land use and environmental costs we accept.<br />
The prospect of chowing down on a ‘bug mac’ may be unappetizing, but it doesn’t come with the baggage of the shitty industry that produces virtually all the meat we eat anyway. As consumers who are unlikely to shun meat altogether, it is clear that we will have to think outside the lunchbox if we want to keep all our favorite dishes on the menu.</p>
<p>The post <a href="https://www.mcgilldaily.com/2012/03/bullshit-on-your-plate/">Bullshit on your plate</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>A new era for planet Earth</title>
		<link>https://www.mcgilldaily.com/2012/02/a-new-era-for-planet-earth/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 13 Feb 2012 11:00:59 +0000</pubDate>
				<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=13803</guid>

					<description><![CDATA[<p>It's hot, it's here to stay—and it really is all our fault. </p>
<p>The post <a href="https://www.mcgilldaily.com/2012/02/a-new-era-for-planet-earth/">A new era for planet Earth</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Global climate change is a process that takes much more than a single lifetime to play out – or to see the worst effects. On a long-term scale, the scope of the change is so great that geologists have proposed that we are currently living through the dawn of a new geologic era: the Anthropocene, also known as the Age of Humanity.</p>
<p>It was only a few thousand years ago that much of North America was covered in kilometer-thick layers of ice. The ice ebbed and flowed across the continent over the course of millennia, driven by natural cycles of warming and cooling caused by the Earth’s wobbly orbit around the sun. Our current place in this complex ‘wobble’ would ordinarily put us at the beginning of another period of giant ice sheets scraping across the continents. But we’re not only stopping this ice age, we’re going much further: the forecast for the Anthropocene is an age of extreme warming.</p>
<p>In the best-case scenario, we collectively dump only a half-trillion-ton slug of carbon dioxide-equivalent gases, and the world becomes two degrees Celsius warmer. What does a world that is two degrees warmer actually look like? We need only look 130 thousand years back in time to see a world about that much warmer – the Eemian interglacial period.</p>
<p>Based on what we know about the Eemian, we can very likely expect a shifting of natural ecological ranges over the next few thousand years. It is unlikely that today’s animals will be able to move with these ranges, because humans are in the way of any migration. We are living at the beginning of a massive, planet-wide extinction event that could rob our descendants of 50 to 90 per cent of the biodiversity that exists today. The Eemian Arctic was likely seasonally ice-free, an ignoble milestone we will probably see again in our own lifetimes. The Eemian, however, cooled off when the planet’s wobbly orbit caused enough of a temperature drop for the glaciers to flow once again – which won’t happen this time, because the current warming is driven by humans, not the wobble.</p>
<p>To get an idea of the worst-case scenario, we must look much farther back, to 55 million years ago, to the Eocene. This was the last time the planet experienced what paleoclimatologists call a “super-greenhouse”. A super-greenhouse is caused by a quick, enormous dump of greenhouse gases into the atmosphere, on the order of one to five trillion tons of CO2-equivalent gas. Much of this may have come from methane ice known as clathrates, which are very sensitive to ocean temperature. Whatever the initial cause, this gigantic mass of extra carbon caused a global warming event whose extreme changes would be reflected in the climate for millions of years. If our industrialized society continues business as usual – and we do indeed have sufficient reserves to do so – burning our gas, oil, and coal reserves could release trillions of tons of CO2 in the next few hundred years.</p>
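<p>As a rough consistency check – and it is only that, since the climate’s response is not truly linear – we can scale the figures quoted above directly, using nothing but the numbers already given in this article:</p>
<pre><code># Back-of-envelope scaling using only figures quoted in this article.
# Real climate sensitivity is not linear; this is purely illustrative.
BEST_CASE_EMISSIONS = 0.5      # trillion tons of CO2-equivalent
BEST_CASE_WARMING = 2.0        # degrees Celsius

degrees_per_trillion_tons = BEST_CASE_WARMING / BEST_CASE_EMISSIONS

# The Eocene "super-greenhouse" involved one to five trillion tons.
for emissions in (1.0, 5.0):
    print(emissions, "trillion tons ->",
          emissions * degrees_per_trillion_tons, "degrees C")
# Output: 4.0 and 20.0 degrees C -- a range that brackets the ten to
# twelve degrees of Eocene warming described below.
</code></pre>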
<p>Today, around 60 per cent of the carbon we emit annually is absorbed into the oceans, making them much more acidic. The acidification during the Eocene warming was so severe that it actually burned a red mark into seafloor sediments worldwide. About half of all coral species, particularly the deep-water varieties, disappeared from the fossil record. Modern species that may be vulnerable to future acidification of the oceans include the global coral reefs (themselves the base of a staggering amount of biodiversity), lobsters, crabs, oysters, and many of the microscopic deep-sea life forms that form the root of the oceanic food chain.</p>
<p>The excess CO2 in the Eocene was sufficient to raise the global average temperature by as much as ten to twelve degrees Celsius, with more warming at the poles than in the tropics. Fossil evidence from the time indicates that even the northernmost reaches of the Arctic Circle were warm enough to support tropical species of plants and animals, suggesting polar heating of over twelve degrees Celsius.<br />
From the point of view of a human lifespan, the extra carbon that we are putting into the atmosphere will be there essentially forever, as will its effects on the climate and biodiversity. These outcomes might seem apocalyptic and outlandish, but the reality is that extreme greenhouse gas emissions have caused changes this large before. It is irresponsible of us to refuse to understand the very long-term implications of our actions today. So, on behalf of those who will come long after we’re gone, without the benefit of a choice in the matter, we ought to decarbonize our economy now. Welcome to the Anthropocene: let’s make this next geological era a good one.</p>
<p>The post <a href="https://www.mcgilldaily.com/2012/02/a-new-era-for-planet-earth/">A new era for planet Earth</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Kyoto sunset</title>
		<link>https://www.mcgilldaily.com/2012/01/kyoto-sunset/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 30 Jan 2012 11:00:16 +0000</pubDate>
				<category><![CDATA[inside]]></category>
		<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=13163</guid>

					<description><![CDATA[<p>Canada becomes the first country to withdraw from the Kyoto protocol</p>
<p>The post <a href="https://www.mcgilldaily.com/2012/01/kyoto-sunset/">Kyoto sunset</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Prime Minister Stephen Harper, vying for Canada’s 12th “Fossil of the Day” award – given to the most environmentally retroactive country by the Climate Action Network – became the first Kyoto signatory to formally break its commitments to the protocol in mid-December 2011. First put in place in 1997, the Kyoto Protocol was an international agreement to cut down on global carbon emissions in an attempt to mitigate some of the more dire effects expected from anthropogenic climate change.</p>
<p>The Canadian government, a long-time critic of Kyoto – the only legally binding greenhouse gas agreement in existence – cited the fact that neither China nor the United States was bound by the accord, and thus neither would pay the significant costs set to be imposed on Canada. Collectively, those two countries produce around 40 per cent of total global emissions, whereas Canada produces less than 2 per cent. However, since the Chrétien government signed the protocol in 1997, Canadian emissions have risen to over 25 per cent above the specified target of a 6 per cent decrease from 1990 levels.</p>
<p>Under the penalties formerly agreed to in the accord, this overshoot meant Canada would have been legally obligated to buy carbon credits to account for the difference. In announcing the withdrawal, Environment Minister Peter Kent cited a cost of almost 14 billion dollars for these credits, and blamed the previous Liberal government, which was in power for much of the period in question. In reality, the growth rate across all sectors has been about the same regardless of who is in charge, and it is set to increase, since oil sands production now well surpasses one million barrels a day.</p>
<p>Currently, there is no further binding legal agreement in place to take over when Kyoto expires. The last three international conferences, in Copenhagen, Cancun, and Durban, resulted in a few successful policy agreements, but these lacked Kyoto’s legal force. The lack of a universally agreed-upon standard in the face of this global problem is the primary issue with much of the policy discussion of the last decade. This is the troubling reality of climate change: while we dither and bicker about who ought to suffer the most cutbacks, we continue to collectively belch another 10 billion tons of greenhouse gases into the atmosphere every year. As such, some of the nobler goals of the original Kyoto Protocol may already be beyond our grasp.</p>
<p>There are serious concerns that we are well on track to exceed the two degrees of global warming generally agreed upon as the limit for dangerous climate change. A recently released study in the <em>Philosophical Transactions of the Royal Society A</em> entitled “Beyond Dangerous Climate Change” calls into question the very idea that two degrees of warming is an acceptable threshold at all. The paper recasts the two-degree limit as the threshold between “dangerous” and “extremely dangerous” climate change. More troublingly, its authors estimate the probability of exceeding this limit as extremely high.</p>
<p>If we continue business as usual – which seems to be the most likely outcome – then we must understand that mitigation is incompatible with our current model of economic growth. Investing in “green” technologies – such as nuclear, solar, and wind power – could still grow our economy without the negative consequences of fossil fuels. Furthermore, even if we were to stop global emissions today, the lag between our emissions and their effect on the climate means we can already expect at least a half degree of further warming, an increasingly acidified ocean that threatens nearly a quarter of the world’s food supply, a thermally expanding sea submerging low-lying coasts and islands, and increased melting of permanent ice further contributing to sea level rise.</p>
<p>Interestingly, the only time in the last few decades that our annual emissions have actually decreased was during the global recession in 2008, when the rate of emissions declined by about 3 per cent. To have a serious chance of heading off the dangerous climate change to come, we would have to reach peak oil by 2015, and then globally reduce our consumption – every year, until we hit zero – at twice the rate we saw during the Great Recession. Studies on coral bleaching suggest that this process will set in at CO<sub>2</sub> levels as low as 430 parts per million (ppm), whereas even the most conservative estimates of our total emissions usually centre on a final value of 550 ppm.</p>
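<p>To see what that prescription actually implies, a short compound-decline calculation helps (Python, using the round numbers quoted above – 10 billion tons per year, shrinking 6 per cent annually – purely as an illustration):</p>
<pre><code># Illustrative only: compound 6%/year cuts (twice the 2008 recession's
# ~3% dip) applied to the ~10 billion tons emitted per year.
emissions = 10.0   # billion tons per year
total = 0.0
years = 0
while emissions > 0.5:      # run until emissions fall below 5% of today
    total += emissions
    emissions *= 0.94       # a 6 per cent cut, every single year
    years += 1

print(years, "years of uninterrupted cuts")           # 49 years
print(round(total), "billion tons emitted meanwhile") # ~159
# Even this heroic scenario commits us to roughly another
# 160 billion tons before emissions approach zero.
</code></pre>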
<p>We have enough oil, gas, and coal reserves to last for decades – potentially hundreds of years – of growing business as usual. And there will be more of us – potentially 9 billion by 2050 – all vying for the same quality of life we enjoy today, a quality of life made fully possible only by the continued exploitation of the ecosystem.</p>
<p>The choices we need to make today will not be made by governments whose primary purpose is to be re-elected. The scale of the problem extends well beyond the lifetime of one government, which is minuscule compared to how long the effects of climate change will be felt. It is not in a government’s interest to deal with this problem, and it never will be unless we hold them accountable. The fact that governments can use even the most basic carbon tax as a successful scare tactic in election campaigns shows our own recalcitrance.</p>
<p>As Canadians, we are responsible for some of the world’s highest per-capita emissions. Our place in the world ought to be to demonstrate that a more sustainable lifestyle is possible. While our cumulative emissions are not that significant on a global scale, we can show that a “first world” lifestyle is achievable without the exploitation of the environment.</p>
<p>However, part of this new carbon-free lifestyle is acknowledging that there are things we must give up. Air travel as we know it would not be possible – modern technology has no sustainable alternative to airplanes. Eating meat at every meal is not sustainable. Having access to products that must travel hundreds of kilometres to reach us, using current shipping technologies, is not possible if we wish to live an emission-free life. On behalf of the hundreds of generations that will bear the economic cost of our exceedingly selfish behaviour – and on behalf of current generations – we must voluntarily commit ourselves to bearing these costs now.</p>
<p>The post <a href="https://www.mcgilldaily.com/2012/01/kyoto-sunset/">Kyoto sunset</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Our better halves</title>
		<link>https://www.mcgilldaily.com/2011/11/our-better-halves/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 21 Nov 2011 11:00:02 +0000</pubDate>
				<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[SideFeatured]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=12068</guid>

					<description><![CDATA[<p>Montreal’s Steven Pinker on violence</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/11/our-better-halves/">Our better halves</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>In our modern world, riddled with police brutality, assault, murder, genocide, and war that stretch for decades, the mere suggestion that violence is declining may be met with incredulity. Yet, that is the thesis of Steven Pinker’s latest book, The Better Angels of Our Nature: Why Violence has Declined. It is an extensive look into violence as it has been practiced throughout human history, from our existence in hunter-gatherer tribes to us presently in the 21st century.</p>
<p>This undertaking is no easy task, but Steven Pinker is uniquely suited to tackling such a broad subject. Pinker, a Montreal-born McGill graduate who is now a professor of psychology at Harvard, has built a considerable career on understanding the nature of human cognition. His previous books include <em>The Stuff of Thought</em>, an illumination of human language use, and <em>The Blank Slate</em>, an incredible exploration of the scientific understanding of human nature.</p>
<p>Violence, as Pinker reveals, is wired into our brains – an intrinsic part of the human condition, shaping the fabric of society. In excruciating detail, <em>The Better Angels of Our Nature</em> delves into the cruel and unusual ways humans have devised to torture and kill each other. However, the chances that the average person will act on this universal capacity for violence have gradually declined over time.</p>
<p>Pinker outlines six main factors that can potentially explain this decline. The first is the pacification process, whereby warring tribes are brought under a centralizing authority, giving rise to the first civilizations. The second is the civilizing process, in which this centralized authority begins to extract resources through work (albeit often slave labour) and taxes, as opposed to plundering and looting. The third is the humanitarian revolution, in which the communication of ideas begins to radically raise the collective consciousness. The fourth is the long peace, the post-WWII era that has remained remarkably free of nuclear carnage despite nations’ nuclear capabilities; the fifth, the “new peace,” extends that decline to civil wars, genocides, and terrorism since the end of the Cold War. Finally, there are the various human rights revolutions, in which increasing numbers of people are protected by rightful access to certain basic rights.</p>
<p>It seems more feasible to highlight small aspects of Pinker’s observations than to attempt to summarize the behemoth that is <em>The Better Angels of Our Nature</em>. Honour appears as a small but interesting factor, given that it is intangible: there is no quantifying measure of honour, nor is there some sort of “honour particle.” It is a social construct that only exists because we all agree that it does. In the past, people were quick to act violently towards those who had threatened or hurt their honour. Today, people still die for honour, but to a much smaller extent. One factor contributing to the decline of honour killings may be the creation of a legal system. In a society with a legal system, the power to wield violence is given to one – ideally uninvolved – third party. This third party, or Leviathan, as Pinker calls it, reduces violence caused by individuals, so long as its legitimacy is accepted. According to Pinker, this has allowed for the rise of such things as manners, holding your tongue, and other general niceties we extend even to our enemies.</p>
<p>Another important change Pinker outlines is the increase in literacy, made possible by relatively recent technological advancements such as the printing press and the Internet, both of which allow one brain to temporarily inhabit the viewpoint of another. While this observation may appear rather banal to our modern sensibilities, the value of formalized education, and the general increase in intelligence that follows, cannot be overstated. First-hand accounts of WWII left a much larger impact on a population able to access and understand them than accounts of WWI had. Literacy and education gave us the ability to learn, not only in school, but also from our own mistakes in life.</p>
<p>Though <em>The Better Angels of Our Nature</em> was, at times, depressing to read, giving insight as it does into the machinations of serial killers, rapists, and perpetrators of other unspeakable atrocities, it nonetheless left me feeling cautiously optimistic about the future of our species. In an age defined by our seemingly inexhaustible cynicism, this is certainly a welcome development.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/11/our-better-halves/">Our better halves</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The bullshit detector</title>
		<link>https://www.mcgilldaily.com/2011/11/the-bullshit-detector/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 07 Nov 2011 11:00:18 +0000</pubDate>
				<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=11180</guid>

					<description><![CDATA[<p>Why scepticism is a necessary application of the scientific method</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/11/the-bullshit-detector/">The bullshit detector</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The basic idea behind scepticism was perhaps best summarized by the physicist Richard Feynman: “The first principle is that you must not fool yourself, and you are the easiest person to fool.” This principle is exponentially more difficult to uphold in the modern era, where information of all degrees of truthfulness can flow without limit. To further complicate things, there are whole industries based on pseudoscientific nonsense that are more than willing to take your hard-earned cash from your gullible hands.</p>
<p>So how do we protect ourselves from the unscrupulous people who brazenly lie to enrich themselves? We need a bullshit detector to separate truth from their profitable claims. Luckily for us, the scientific method provides exactly the tools for the job. A brief review: any claim must be backed up with clear, inarguable, and reproducible evidence. Furthermore, this evidence ought to be free of the various cognitive biases that humans bring into the picture unintentionally – conflating correlation with causation, for example, or interpreting data in a way that favourably confirms one’s presuppositions. Although this sounds simple in principle, it is nearly impossible to apply consistently.</p>
<p>In fact, in many situations, we actively wish to suspend these principles of scientific methodology. Magicians, for example, perform very convincing illusions to delight and mystify their audiences, all the while obscuring the methods behind their act. This is entertainment, and it is harmless fun – the audience is in on the lie, paying to escape to a fantasy world. But a serious problem arises when one party is not privy to the fact that the illusions are lies.</p>
<p>In 1913, Harry Houdini, arguably the greatest magician of the 20th century, experienced the traumatic death of his mother. Although he wanted to believe in the ability of psychics to contact the dead – who wouldn’t, after losing a loved one? – he realized that self-proclaimed psychics used the same tricks to bluff their customers as he did in his magic shows. The difference was that the psychics’ customers truly believed that they were contacting their loved ones – and were forking over cash for a service they were not getting.</p>
<p>Houdini spent much of his career using his knowledge to debunk psychics. The tradition of magicians using their experience in trickery to expose duplicitous frauds continues to this day, most notably with James Randi. Randi started the James Randi Educational Foundation (JREF), whose stated purpose is “to expose charlatans and help people defend themselves from paranormal and pseudoscientific claims”. The JREF puts its money where its mouth is, offering a million U.S. dollars for any claim that holds up under double-blind testing. The foundation takes every claim at face value, and asks the question: “If this is true, then how can we prove it?”</p>
<p>The JREF has seen hundreds of people attempt to claim the prize, many of whom genuinely believe that they have paranormal powers. Before every test, the JREF and the claimant mutually agree upon exactly what would constitute proof of the claimed ability, such as successfully guessing cards at a rate above chance. This mutually agreed-upon starting point is the true power of the million dollar challenge – it removes the excuse that the sceptics are somehow interfering with the “true” powers of the claimants.</p>
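<p>What “a rate above chance” means can be pinned down precisely before the test begins. As a rough sketch (in Python, with illustrative numbers – a five-symbol deck, so pure guessing succeeds a fifth of the time), one can compute how improbable a given score would be for someone with no powers at all:</p>
<pre><code>from math import comb

# Probability of guessing at least k of n cards correctly by pure
# chance, with 5 equally likely symbols per card (illustrative setup).
def p_at_least(k, n, p=0.2):
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

# 20 hits out of 50 sounds unimpressive, but chance predicts ~10:
print(p_at_least(20, 50))   # roughly 0.001 -- about one in a thousand

# Agreeing on such a threshold in advance makes the outcome
# unambiguous: success or failure is plain arithmetic afterwards.
</code></pre>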
<p>The million dollars remains unclaimed, even after being up for grabs for decades. The fact of the matter is that the claimants are con artists (possibly self-deluded ones) preying on the gullibility of people for their own personal gain. If psychics or any other paranormal phenomena do indeed exist, then the scientific process should be friend rather than foe, as it exists to properly identify them.</p>
<p>Those who falsely claim to have psychic abilities are frauds who can potentially cause great harm. Take the case of Shawn Hornbeck, a child who went missing in 2003. A celebrity psychic named Sylvia Browne told his parents, on national television, that their son was dead. She was wrong, and Hornbeck turned up four years later to parents who had believed their son dead. Browne is also notable for agreeing to the JREF million dollar challenge on the radio, then running away from her agreement. It has been ten years, and she still hasn’t taken the test.</p>
<p>Of course, as the author of dozens of books, videos, and lectures, and owner of an exclusive “Inner Circle” with membership fees of up to $50 per year, maybe Browne is not in need of a million dollars – especially a million dollars that might well expose her money-making “abilities” as a fraud.<br />
Still, her services can offer people comfort. This is the true crux of the issue: is the comfort a person derives from false delusions sufficient justification to allow predators to take advantage of them? James Randi answers this in the frequently asked questions section of his website: “The potential harm is very real, and dangerous. Belief in such obvious flummeries as astrology or fortune-telling can appear – quite incorrectly – to give confirmatory results, and that can lead to the victim pursuing more dangerous, expensive, and often health-related scams. Blind belief can be comforting, but it can easily cripple reason and productivity, and stop intellectual progress.”</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/11/the-bullshit-detector/">The bullshit detector</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>E ≠ m*c^2?</title>
		<link>https://www.mcgilldaily.com/2011/10/e-%e2%89%a0-mc2/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 03 Oct 2011 10:00:19 +0000</pubDate>
				<category><![CDATA[inside]]></category>
		<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=9787</guid>

					<description><![CDATA[<p>Thought that nothing could move faster than light in a vacuum? You might be wrong.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/10/e-%e2%89%a0-mc2/">E ≠ m*c^2?</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>Physicists from the Oscillation Project with Emulsion-tRacking Apparatus (OPERA) program at the European Institute for Nuclear Physics recently released a <a href="http://arxiv.org/pdf/1109.4897v1" target="_blank">paper</a> claiming that they may have discovered particles that might travel faster than the speed of light. This revelation, if true, would mean that the universal speed limit, as laid out in Einstein’s theory of special relativity, would be incorrect. Relativistic physics is the keystone of our modern understanding of the universe, so this announcement has the potential to be one of the most important findings in over a century.</p>
<p>These astonishing conclusions come from an experiment on an elusive type of subatomic particle known as the neutrino. This particle, whose existence was first hypothesized in 1930 by Wolfgang Pauli and whose theory was developed in the 1930s by the Italian physicist Enrico Fermi, is generated during nuclear decay. These minuscule neutral particles barely interact with regular matter, which makes them exceedingly hard to experiment upon. At any given second, about 65 billion neutrinos from the sun alone fly through an area the size of your thumbnail. The vast majority of them will fly right through the earth and out the other side.</p>
<p>Making neutrinos even more peculiar is that they come in three different types: tau, muon, and electron. And in an even more bizarre twist, neutrinos are known to spontaneously change from one type into another. It was this bizarre behavior that the OPERA team was investigating when they made their unexpected discovery. They originally intended to generate a beam of muon neutrinos at their lab in Geneva and aim it at a detector in Italy, underneath the Alps. The neutrinos were to fly 732 kilometers through solid rock, with some spontaneously morphing into tau neutrinos along the way. Though most neutrinos would sail clear through the detector in Italy, a tiny fraction of them would be picked up on the other end.</p>
<p>It was because of this journey that the potentially universe-shattering discovery was made. In addition to measuring the proportion of morphed neutrinos detected, the team also made an extremely precise measurement of the distance these neutrinos travelled and how long it took them to do so. To determine the distance, they used GPS measurements so precise that they were actually able to detect the movement of tectonic plates under the detector. They measured the average travel time to an accuracy of ten billionths of a second (ten nanoseconds) – and they found that the neutrinos arrived 60 nanoseconds sooner than light would have.</p>
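<p>It is worth seeing how small – and therefore how delicate – this anomaly is. A quick back-of-envelope calculation from the figures above (a 732 kilometre baseline, a 60 nanosecond lead) shows the claimed excess over light speed is only a few parts in a hundred thousand:</p>
<pre><code># Back-of-envelope check using the figures quoted in this article.
C = 299_792_458.0          # speed of light in vacuum, m/s
DISTANCE = 732_000.0       # Geneva to Gran Sasso baseline, metres
EARLY = 60e-9              # neutrinos arrived 60 ns early

light_time = DISTANCE / C              # ~0.00244 s (2.44 ms)
fractional_excess = EARLY / light_time

print(light_time)          # ~2.44 milliseconds for light
print(fractional_excess)   # ~2.5e-5, i.e. 0.0025% faster than light
# A 60 ns effect riding on a 2.44 ms measurement is exactly why the
# team asked the wider community to hunt for systematic errors.
</code></pre>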
<p>While the OPERA team, like many other scientists, wished they could dismiss this result as erroneous – and they tried rigorously to do so – they were unable to explain it away. As a result, OPERA published their findings in the hope that the scientific community would be able to find an error in their methods. This is why the scientific process is one of the most powerful intellectual tools ever created: before any claim is accepted, multiple independent researchers must reach the same conclusions. If faster-than-light neutrinos are a reality, it means that Einstein’s theory – and our current understanding of the universe – is fundamentally wrong. Overturning inaccurate models of the universe is the purpose of scientific methodology, because nothing, not even the genius of Einstein, is above the demonstrated facts of reality.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/10/e-%e2%89%a0-mc2/">E ≠ m*c^2?</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>This one is just right (almost)</title>
		<link>https://www.mcgilldaily.com/2011/09/this-one-is-just-right-almost-2/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Tue, 27 Sep 2011 03:40:41 +0000</pubDate>
				<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=9580</guid>

					<description><![CDATA[<p>Recent findings bring the possibility of extraterrestrial life closer than ever before</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/09/this-one-is-just-right-almost-2/">This one is just right (almost)</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The family of confirmed planets that exist outside our solar system gained fifty new members last week when the High Accuracy Radial velocity Planet Searcher (HARPS) team announced new discoveries at the Extreme Solar System Conference in Wyoming. This announcement described a planet that may orbit around its star in a habitable zone, known as a “Goldilocks zone” because it is neither too hot nor too cold, allowing liquid water to form. Liquid water is a necessary precondition for life as we know it, so finding planets that may have it is an obvious first step to finding evidence of extraterrestrial life. This brings the total number of known exoplanets – planets that exist outside our solar system – up to 685, with thousands more potential worlds waiting to be confirmed by various scientific missions.</p>
<p>This plethora of exoplanetary discoveries has seriously challenged scientists’ ideas about planetary formation. The traditional model has planets forming around the same time as their newly born star, accreting from the disc of dust that rotates around the protostar (an early phase in the formation of a star). But with each new discovery, it has become clearer that current models are unable to predict the diversity of planets in the galaxy. “These models are crap. They may be the best we can do, but they are still crap,” said Hal Levison, an astronomer speaking at the annual American Astronomical Society meeting in January.</p>
<p>Scientists use a variety of techniques to actually detect these planets, ranging from directly imaging them with powerful telescopes to more indirect methods. One such method detects the subtle periodic wobble of a star as it and its planet orbit about their common centre of mass in a cosmic dance. This tiny effect can be measured from hundreds of light years away, and it allows astronomers to figure out the mass and orbital period of the planets – even in solar systems with multiple planets. The HARPS team has found dozens of planets this way, including a considerable number of “super earths” (planets with masses of up to ten times that of earth). However, the radial velocity technique doesn’t reveal anything about the density of a planet, which means the “super earths” could range from puffy gas balls, such as those found in our own outer solar system, to more compact, rocky planets like earth.</p>
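<p>The size of that wobble follows from simple momentum balance: the star’s reflex speed is the planet’s orbital speed scaled down by their mass ratio. As a rough sketch (Python, using textbook values for Jupiter and the Sun, and assuming a circular orbit):</p>
<pre><code>import math

# Reflex "wobble" of the Sun due to Jupiter, from momentum balance:
# M_star * v_star = m_planet * v_planet (circular orbit approximation).
M_SUN = 1.989e30           # kg
M_JUPITER = 1.898e27       # kg
ORBIT_RADIUS = 7.78e11     # metres, Jupiter's distance from the Sun
PERIOD = 11.86 * 3.156e7   # Jupiter's year, in seconds

v_planet = 2 * math.pi * ORBIT_RADIUS / PERIOD   # ~13 km/s
v_star = v_planet * M_JUPITER / M_SUN            # ~12.5 m/s

print(v_planet, v_star)
# The Sun sways at roughly walking speed -- and spectrographs like
# HARPS resolve stellar wobbles of about 1 m/s from light years away.
</code></pre>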
<p>The radial velocity technique favours finding very large celestial bodies closely orbiting their suns. Many of the early exoplanet discoveries made with this method were of “hot Jupiter” planets, which orbit their stars far closer than our own Mercury, with masses considerably larger than that of Jupiter (itself about 318 earth masses). These enormous planets have surface temperatures reaching upwards of a few thousand degrees, and ‘years’ that could be as short as a few hours because of their tiny orbits. One planet the HARPS team found gets so hot that it actually acts like a Jupiter-sized comet, leaving a superheated ‘tail’ of planetary debris that gets blown away by the stellar wind at more than 35,000 kilometers per hour.</p>
<p>The transiting technique is a more direct way of measuring planet size. It watches stars for periodic “eclipses” caused by planets moving in front of them. The tiny drop in the intensity of the star’s light can be detected, and from it the physical size of the planet can be deduced. Of course, this only works when our telescopes lie in the same orbital plane as the extrasolar system; the laws of probability tell us this lucky alignment occurs only about 0.5 to 10 per cent of the time, depending on the planet’s distance from its star. To appreciate how minuscule the change is, imagine being able to see a poker chip placed in front of a searchlight in downtown Montreal all the way from Toronto. The transiting technique not only does this, it does it on the scale of trillions of kilometres.</p>
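<p>The depth of the dip is just the ratio of the two discs’ areas, which makes the poker chip comparison easy to check. A quick sketch (Python, using standard radii for the Earth and Sun):</p>
<pre><code># Transit depth: the fraction of starlight blocked equals the ratio
# of the planet's disc area to the star's disc area.
R_EARTH = 6.371e6   # metres
R_SUN = 6.96e8      # metres

depth = (R_EARTH / R_SUN) ** 2
print(depth)                       # ~8.4e-5
print(depth * 1e6, "ppm")          # ~84 parts per million

# An Earth twin dims a Sun-like star by less than 0.01 per cent --
# which is why Kepler must stare at 145,000 stars without blinking.
</code></pre>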
<p>The newly operational Kepler spacecraft uses this method to find planets, constantly watching about 145,000 stars to detect these small drops in brightness. Even though it has only been in operation since 2009, it has already detected over 1,200 candidate planets, including dozens of “super earths” and a few within the habitable zones of their stars. Kepler has even detected a planet that orbits two stars at once. Like the fictional Star Wars planet Tatooine, no two sunsets on this planet would ever be the same, because the two stars orbit each other as the planet orbits the pair!</p>
<p>The transiting technique also offers scientists an opportunity to directly measure the atmosphere and temperature of an exoplanet, because the spectral signature of the starlight changes when a planet obstructs it. With ever-increasing telescopic power comes an increasing chance that we will finally detect a sister planet with a similar size, temperature, and atmospheric composition to Earth. This is one of the explicit missions of the upcoming James Webb Space Telescope (JWST), the successor to the immensely successful Hubble Space Telescope. The JWST will have over twenty times the resolution of the Hubble, which would make it theoretically sensitive enough to directly image a planet of comparable size to Earth. Funding for the JWST was recently called into question over cost overruns and mismanagement, but the project was ultimately allowed to continue. However, the earliest possible launch date for the JWST isn’t until 2019.</p>
<p>As these various detection methods continue to progress, we truly are on the cusp of detecting a planet orbiting another sun that would be capable of supporting life as we know it. With the next generation of telescopes, we will be able to probe the atmospheres of some of these potential Earths. In doing so, we may even find a telltale chemical signature that would be incongruous with a lifeless planet. Considering that scientists now know that planets can be found around up to 40 per cent of sun-like stars, and that there are billions of these stars in our galactic neighbourhood, it is virtually certain that there is extraterrestrial life in our cosmic backyard. And it will almost certainly be weirder and maybe even more wondrous than we can possibly imagine.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/09/this-one-is-just-right-almost-2/">This one is just right (almost)</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>There&#8217;s probably no God (particle)</title>
		<link>https://www.mcgilldaily.com/2011/09/theres-probably-no-god-particle-2/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 12 Sep 2011 11:00:11 +0000</pubDate>
				<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[Uncategorized]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=9032</guid>

					<description><![CDATA[<p>Don't panic, we're just talking about the Higgs boson</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/09/theres-probably-no-god-particle-2/">There&#8217;s probably no God (particle)</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>During the <a href="http://eps-hep2011.eu/" target="_blank">International Europhysics Conference on High Energy Physics</a> hosted in late July at Grenoble, the latest data from the world’s most powerful particle accelerator was presented. After years of waiting for the Large Hadron Collider (<a href="http://press.web.cern.ch/public/en/LHC/LHC-en.html" target="_blank">LHC</a>) to be built and brought up to operational levels, and after numerous frustrating technical setbacks, the European Organization for Nuclear Research (CERN) was ready to present its first tenuous conclusions about the Higgs boson. It was there that they dropped the bombshell: CERN stated that, with 95 per cent confidence, the particle did not exist across most of the mass range searched, effectively ruling out its existence.</p>
<p>The Higgs boson, popularly known as the “god particle” because of its supposed role in endowing everything in the universe with mass, has been furiously searched for since the postulation of its existence in 1964. <a href="http://en.wikipedia.org/wiki/Standard_Model" target="_blank">The Standard Model</a> predicts a menagerie of subatomic particles. Of these, the Higgs boson is the only one yet to be confirmed. As a scientific theory, the Standard Model is the most thoroughly tested in all of human history. It successfully unites electromagnetism, the strong nuclear force that keeps atomic nuclei together, and the weak nuclear force that controls radioactive decay under one theoretical framework.</p>
<p>The Standard Model essentially says that all matter in the universe is composed of varying combinations of fundamental units called fermions, of which there are two types: quarks and leptons. There are six types each of quarks and leptons, and each also has an antiparticle. The combination of these 24 different fermions is what gives rise to the matter in the universe. Particles such as protons are composite particles; those made from different quark combinations are collectively referred to as “hadrons”.</p>
<p>Another important part of the Standard Model is the idea that all of the forces we are familiar with, such as the electromagnetic radiation that makes up visible light and enables wireless internet, arise as the result of interactions between force-carrying bosons. This is where the Higgs boson fits into the picture: it is supposed to be the force-carrying particle that brings mass into being.</p>
<p>Much of the theoretical work that goes into the Standard Model entails predicting the characteristics of these bosons. However, for any scientific prediction to be considered valid, it must survive the process of experimental verification. For the Standard Model, this requires demonstrating that these theoretical particles exist. Under normal conditions, most of these particles cannot be observed; however, given enough energy, these exotic particles can be created in a laboratory.</p>
<p>Unfortunately for Standard Model experimentalists, this requires building ever-larger and more expensive particle accelerators, which smash hadrons together at nearly the speed of light. The LHC is the latest and greatest particle accelerator to date, and it uses $100,000 worth of electricity to get two beams of protons moving in opposite directions around a 27-kilometre ring, just about three metres per second shy of the speed of light. These two beams are then smashed together, and scientists sort through the resulting hadron debris to figure out which particles were generated in conditions that have not existed since microseconds after the big bang. In the LHC, the beams cross tens of millions of times per second, which makes for an awful lot of data to sift through.</p>
<p>Over the course of the past three years, the LHC has created billions of collisions, each of which generates thousands of particles. The sheer volume of the data generated is daunting, to say the least. Even with the most sophisticated detectors ever created, it is extremely difficult to pick out a genuine particle signature, and doing so requires thousands of hours of computation. The scientists at CERN have even started the LHC@home project, which uses the processing power of personal computers around the world to assist with these calculations.</p>
<p>Even with all that data generation and processing, the successful identification of an as-yet-undiscovered particle such as the Higgs boson can require years of continuous collisions to allow the signal to rise above the extraneous data, such as that produced by the collision of other particles. For a new particle to be considered confirmed, it must pass the so-called “5-sigma” rule: the chance that background noise alone produced the observed signal must be smaller than roughly one in 3.5 million.</p>
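<p>To put a number on that threshold, the short sketch below converts sigmas into probabilities using the standard Gaussian tail formula (a generic statistics illustration, not CERN’s analysis code):</p>
<pre><code>import math

def one_sided_p_value(n_sigma):
    """Chance that background noise alone fluctuates up by n_sigma or more."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

p = one_sided_p_value(5)
print(f"p-value at 5 sigma: {p:.2e}")   # about 2.9e-07
print(f"roughly 1 in {1 / p:,.0f}")     # roughly 1 in 3.5 million
</code></pre>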
<p>Even with this commitment to statistical rigour, anomalous signals can sometimes appear within the noise. Earlier this year, a leak from CERN showed data that matched exactly what a Higgs boson signal would look like. However, as more data was collected, those promising results faded into statistical insignificance.</p>
<p>It is in this conservative experimental context that the significance of the conference’s summaries can be fully appreciated. Testing for the Higgs boson is practically the raison d’etre of the LHC: after nearly fifty years of waiting, scientists finally had the ability to probe a central tenet of the Standard Model, and with 95 per cent confidence the particle was found wanting. Even though the Standard Model is the most viable theory thus far, discoveries such as this show us that our fundamental understanding of the universe is incomplete.</p>
<p>In terms of the underlying physics involved, this non-discovery lends credence to other so-called “Higgsless” models. These theories include the idea of technicolour, which creates mass through a different (and more complicated) mechanism than the Higgs boson. Other alternatives include the possibility of inducing mass through an interaction with a fourth spatial dimension, or even a theory called “loop quantum gravity”, which posits that all of space-time is made up of tiny quantized, interwoven, fuzzy loops, with the interaction of these loops giving rise to all the phenomena in the universe, including mass.</p>
<p>Although this is an exciting discovery for physics, it is understandable if the enthusiasm is not contagious. To many, a project of this scale is simply a waste of resources. However, disproving the Higgs boson is just as important as finding evidence for its existence. This new discovery opens the door to other untested and novel theories of the nature of the universe. Although these studies are admittedly esoteric, their findings have far-reaching consequences. The thing to keep in mind about pure research such as this is that it is always unpredictable. The technology behind MRIs arose from research never intended for practical use; its existence was only made possible through pure, esoteric, scientific inquiry.</p>
<p>As we move into the future, the words of Socrates come to mind: “As for me, all I know is that I know nothing.” For all of its successes, the Standard Model is completely incompatible with general relativity, which accurately describes the structure of the universe at its largest scale. Even current “theories of everything” such as String Theory (which has the Standard Model built into it) are incapable of accounting for astronomically observed realities like dark matter and dark energy. The sea of our ignorance is vast; but with each discovery such as this, the shores of our limited knowledge expand ever so slightly.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/09/theres-probably-no-god-particle-2/">There&#8217;s probably no God (particle)</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>The BP oil spill: one year later</title>
		<link>https://www.mcgilldaily.com/2011/04/the-bp-oil-spill-one-year-later/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 04 Apr 2011 00:00:44 +0000</pubDate>
				<category><![CDATA[Blogs]]></category>
		<category><![CDATA[FrontPage]]></category>
		<category><![CDATA[inside]]></category>
		<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[Sections]]></category>
		<category><![CDATA[BP]]></category>
		<category><![CDATA[Corexit]]></category>
		<category><![CDATA[Deepwater Horizon]]></category>
		<category><![CDATA[Gulf Coast Research Laboratory]]></category>
		<category><![CDATA[Gulf of Mexico]]></category>
		<category><![CDATA[IMMS]]></category>
		<category><![CDATA[NOAA]]></category>
		<category><![CDATA[Obama]]></category>
		<category><![CDATA[Oceanus]]></category>
		<category><![CDATA[Paul Montagna]]></category>
		<category><![CDATA[Texas A&M]]></category>
		<category><![CDATA[University of Southern Florida]]></category>
		<category><![CDATA[University of Southern Mississippi]]></category>
		<category><![CDATA[WHMIS]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=8065</guid>

					<description><![CDATA[<p>The continuing consequences of Deepwater Horizon’s mismanagement</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/04/the-bp-oil-spill-one-year-later/">The BP oil spill: one year later</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>The first anniversary of the massive explosion aboard the offshore drilling rig Deepwater Horizon is approaching. The April 20, 2010 explosion, caused by a failure of the blowout preventer, ruptured the oil well at a depth of 1.5 kilometres below the ocean’s surface. The resulting gusher at the well head eventually dumped nearly 5 million barrels (some 700,000 metric tonnes) of crude oil into the Gulf of Mexico. Efforts to stop the flow were hampered by the technical difficulties of working at such an extreme depth, and the well was only capped when a relief well was drilled five months later. Since then, the Gulf oil disaster has largely faded from the public eye, but we are only now seeing the beginning of long-term problems that will affect the region for years to come.</p>
<p>Although oil companies’ technical capacity to drill at ever-increasing depths continues to grow, the technology for cleaning up spills has not fundamentally changed in more than thirty years, not even since the Gulf disaster. The offshore drilling in the Gulf approved by the Obama administration features the same blowout preventers that we now know are prone to failure. The safety report from BP that is cited in these latest applications is dated 2009, before the Deepwater Horizon failure. This failure to enforce standards is unsurprising in an industry where the “impartial” regulators have significant connections to lobbyists.</p>
<p>As for their advertised commitment to safety, BP’s research budget to address oil spill issues in the past twenty years was exactly zero dollars. Nevertheless, the cleanup effort was undertaken by BP under minimal oversight by the U.S. government. From the beginning, troubling patterns of behaviour emerged in BP’s efforts, all seemingly aimed at minimizing the visible impact of the spill. There were instances of journalists being barred from taking photos of public beaches by BP-employed private security, and reports of BP cleanup workers illegally destroying the remains of spill-affected endangered species such as sea turtles.</p>
<p>The greatest effort at masking the true impact of the spill came with the scientifically dubious use of the oil dispersant Corexit. Oil dispersal agents such as Corexit work by breaking up dense clumps of oil and allowing them to dissolve in water, forming an oil sheen. In theory, oil-eating microbes are better able to digest this dissolved oil, but this had never been tested at the temperatures and pressures seen at the Gulf well head. At least 4 million litres of dispersant were used around the gusher to disperse oil before it could reach the surface, with an additional 4 million litres of dispersant used to break up oil at the surface.</p>
<p>Far from being harmlessly eaten or carried away by currents, much of the oil and dispersant remained at depth within a 50-kilometre zone around the spill site. At that depth, most microscopic oil particles remain neutrally buoyant, neither sinking nor floating to the surface. Paul Montagna, a Texas A&amp;M scientist specializing in contamination due to oil spills, remarked, “We can see it. There was a really large plume that stayed mostly in the deepest areas.” Marine scientists on the research vessel Oceanus found a layer of oily material up to 5 centimetres thick coating the entire seafloor and stretching for dozens of kilometres around the site of the well. The vessel took hundreds of samples, and in many instances dead sea creatures were found.</p>
<p>Corexit is notably toxic and carcinogenic, classified by the Workplace Hazardous Materials Information System (WHMIS) as a hazardous material that induces burning and skin irritation. There have been no toxicological or environmental impact studies performed on the dispersants used so heavily in the “cleanup,” but the WHMIS safety data notes that Corexit “is known to ‘bio-accumulate’.” Bio-accumulation is the process of toxins building up in the organic tissue of animals, which can result in increased concentrations of these toxins at higher levels of the food chain. Preliminary research conducted at the University of South Florida on a mix of dispersant and oil similar in composition to the plumes found that bacteria and phytoplankton are particularly susceptible to their combined toxicological effects. The dispersed oil is composed of droplets small enough to be consumed by these microscopic creatures. Bacteria and phytoplankton are the basis of the entire Gulf Coast food chain, which includes over 15,000 species, some of which (such as four species of endangered sea turtles) can be found nowhere else in the world.</p>
<p>Tiny blobs of oil have already been found in “almost all” of the larvae of blue crabs collected on beaches from Texas to Florida since last July, according to the University of Southern Mississippi’s Gulf Coast Research Laboratory. The blue crab catch alone is worth $39 million to the region, and the total financial loss to the fishery in the Gulf is estimated at over $2.5 billion.</p>
<p>The spill has drastically affected the marine mammal population as well. The first birthing season of marine mammals since the spill began this February, and dead dolphins have been appearing on shore at as much as eight times pre-spill rates. Over 100 dolphin, whale, and porpoise bodies have appeared on Gulf shores; over half of them were stillborn or immature babies. The number of recovered carcasses is a gross underestimate of the total death toll: traditional estimates of death rates suggest that the true figure may be fifty times higher than recovered bodies alone can confirm. If this estimate holds, over 5,000 marine mammals have already been killed by the effects of the spill, to say nothing of the effects on the thousands of other species present in the Gulf.</p>
<p>Perhaps most disturbingly, any forensic analysis being performed on the recovered animals is currently the subject of a National Oceanic and Atmospheric Administration (NOAA) gag order due to an impending court case against BP. Such analysis could link the deaths of the animals to the spill through the oil’s chemical footprint. Unfortunately, thousands of samples collected by scientists since the spill have sat untested at the Institute of Marine Mammal Studies (IMMS). A letter from the NOAA to the institute says, “Because of the seriousness of the legal case, no data or findings may be released, presented, or discussed.” Even a year after the spill began, no laboratories have been selected to perform the forensic tests that are so important for understanding the impact of oil spills. What this means is that the results of the investigation may not be available to independent scientists for years, and the long-term consequences of a deepwater drilling accident may never be known.</p>
<p>The offshore drilling business has continued to surge virtually unabated since the disaster, with BP recording profits of $1.7 billion in the third quarter of 2010 alone, despite the nearly $40 billion spent on the cleanup. The Deepwater Horizon was just one of nearly 4,000 rigs operating in the Gulf of Mexico, and new exploratory deepwater wells have been proposed by BP off the coast of Newfoundland and in the Arctic circle. This exploratory drilling is continuing despite the known lack of remediation technologies in case of disaster, and the absence of proven safety technologies. The reason is clear: the nearly 5 million barrels of oil that have caused so much environmental devastation in the Gulf represent only about 75 minutes’ worth of global oil consumption. Unless society can be weaned off of oil, companies such as BP will have clear financial motivation to drill ever deeper, endangering lives, livelihoods, and ecosystems in the process.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/04/the-bp-oil-spill-one-year-later/">The BP oil spill: one year later</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Withstanding seismic risks</title>
		<link>https://www.mcgilldaily.com/2011/03/withstanding-seismic-risks/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 28 Mar 2011 00:00:02 +0000</pubDate>
				<category><![CDATA[Blogs]]></category>
		<category><![CDATA[FrontPage]]></category>
		<category><![CDATA[inside]]></category>
		<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[Sections]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=7876</guid>

					<description><![CDATA[<p>Structural codes keep our buildings safe, but it may not be enough</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/03/withstanding-seismic-risks/">Withstanding seismic risks</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>We often take for granted the idea that the buildings we use every day will remain standing. This becomes most apparent during seismic events, when the structural capacity of the built environment is put to the ultimate test. Yet the damage caused by extreme earthquakes is highly variable, as is best illustrated by the differences in the destruction caused by the recent earthquakes in Haiti and Japan.</p>
<p>Earthquakes will continue to be a natural hazard as long as we continue to live in seismically active areas, so it is worth understanding why some buildings stay standing and others collapse when exposed to the same shaking. Ultimately, that distinction lies in the building codes governing construction practices, as well as in how buildings are maintained. Saeed Mirza, professor emeritus in Civil Engineering at McGill, has spent much of his career working on issues related to construction and society. “The devastation we saw in Haiti is what happens when there is no meaningful structural code,” he said. Even though that earthquake released roughly a thousandth of the energy of the recent Japanese event, over 200,000 people lost their lives and more than 90 per cent of buildings near the epicentre collapsed.</p>
<p>At their core, what structural codes do is allow designers to determine two things: expected loads, and buildings’ expected ability to resist those loads. Inherent in these estimates (which are based on the best available data and probability) is the risk that we will underestimate the applied loads and overestimate the resistance, leading to failure and possible loss of life. In practice, the way that design engineers minimize that risk is by making their resistance estimates as conservative as realistically possible. For earthquake resistance, the Canadian code ensures that structures will survive small to moderate earthquakes with little to no damage. In the case of larger earthquakes, the design philosophy emphasizes multiple layers of redundancy and types of ductile failures that allow maximum survivability (such as beams that stretch and sag instead of suddenly cracking in half, allowing occupants to escape). The code is strictest when it comes to ensuring that structures needed post-disaster – such as fire stations or hospitals – remain intact even when less critical structures collapse.</p>
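<p>The arithmetic behind that philosophy can be sketched in a few lines. In the toy check below, the load and resistance factors are illustrative assumptions chosen for the example, not values quoted from any particular building code:</p>
<pre><code># Limit-states bookkeeping: inflate the loads, discount the resistance,
# then require that factored resistance still exceeds factored load.
def is_safe(dead, live, resistance,
            alpha_dead=1.25, alpha_live=1.5, phi=0.9):
    """Simplified ultimate-limit-state check (illustrative factors)."""
    factored_load = alpha_dead * dead + alpha_live * live
    factored_resistance = phi * resistance
    return factored_resistance >= factored_load

# A member carrying 100 kN dead load and 80 kN live load, with a nominal
# capacity of 300 kN: 0.9 * 300 = 270 versus 1.25 * 100 + 1.5 * 80 = 245.
print(is_safe(dead=100, live=80, resistance=300))   # True, with margin
</code></pre>
<p>The asymmetry is the point: loads are deliberately overestimated and resistance deliberately underestimated, so a structure only “passes” with room to spare.</p>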
<p>The corollary to this philosophy is the fact that less risk necessarily costs more money, and that this relationship is not linear. For post-disaster design, the additional material, design, and construction will increase building costs by 20 to 30 per cent, sometimes even more. Though the question of whether additional safety is worth the money is unpleasant to ask, structural codes exist to ensure that whatever the answer, loss of human life will hopefully be minimized.</p>
<p>How structural codes determine a feasible solution to this problem is highly dependent on location, since the forces that govern structural design are generally region-specific. Snow loads may be particularly troublesome in Eastern Canada – and often control design – but are extremely unlikely in places like Arizona. Likewise, the building code in Japan is notably conservative with respect to earthquake design, since the country is particularly seismically active. All of Japan is potentially at risk, so the Japanese structural code requires measures rarely seen in other countries, such as base-isolated foundations and multiple redundant structural systems. These advanced construction practices come with a higher price tag for each building, but the Japanese government has decided that the higher prices are worth the lives saved.</p>
<p>Even the most conservative building codes are not 100 per cent effective, because there is always a chance that the “worst-case scenario” used in design will be exceeded. In the wake of such events, the lessons learned help improve our codes and make future buildings even safer. Though these changes do much to improve the safety of new buildings, they do not affect the state of existing construction. The importance of these changes can be seen in the fact that earthquakes disproportionately affect older buildings, while their modern counterparts perform as intended.</p>
<p>Aging construction suffers from an additional compounding problem: accumulating deterioration due to deferred maintenance. Just as one must change the oil in one’s car to keep it running, infrastructure requires continual maintenance to perform at the minimum level for which it was designed. The conservative overdesign required by the structural code has masked much of the accumulated damage and diminished capacity that these structures have incurred over lifespans of fifty years or more. The Canadian government has not been spending adequately on the trillions of dollars’ worth of critical assets in its care, which could have potentially catastrophic results.</p>
<p>Mirza has worked extensively on this problem, having issued the first comprehensive report on the state of Canadian infrastructure in 1996. Since then, he estimates, the amount of money required to bring all Canadian infrastructure back up to minimum service levels has grown to upwards of $400 billion. This number has been rising ever since much of Canada’s critical infrastructure was built, as successive governments chose to ignore the ballooning problem. We are seeing the results of this negligence today, with an estimated $212 million being spent to rehabilitate the Champlain bridge, built in 1962, simply to keep it in service for the next ten years. The bridge was originally built for $35 million ($255 million in 2011 dollars), and the ongoing rehabilitation does not even begin to address known deficiencies in the seismic resistance of this post-disaster structure.</p>
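<p>As a quick sanity check on those dollar figures, a couple of lines of arithmetic recover the average inflation rate they imply (back-of-the-envelope only):</p>
<pre><code># What constant annual inflation turns $35 million in 1962 into
# about $255 million in 2011 dollars?
years = 2011 - 1962
rate = (255 / 35) ** (1 / years) - 1
print(f"implied average inflation: {rate:.1%} per year")   # about 4.1%
</code></pre>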
<p>When asked about how Montreal would fare in an earthquake, Mirza was blunt. “Neither the Champlain bridge or Turcot exchange may survive a moderate earthquake,” he said. “Can we afford to lose either? No. Yet we have consistently ignored our minimum responsibility to citizens and society, without even appreciating the danger we face&#8230; If a big one hit, the damage we would see would be catastrophic, and we would expect to lose many lives.”</p>
<p>After Vancouver, Montreal has the second highest level of seismic risk in Canada, and it is only a matter of time before the next “big one” hits. It may be in 2,000 years, or it may happen tomorrow – but it will happen. Even if we never see it in our lifetimes, we still must face the problem of our crumbling bridges and roads. How we as a society choose to respond to these dangers will ultimately determine how we fare, but there will be costs no matter what we do. Whether we pay in a few dollars now or many dollars and lives later is up to us.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/03/withstanding-seismic-risks/">Withstanding seismic risks</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Life after a star&#8217;s death</title>
		<link>https://www.mcgilldaily.com/2011/03/life-after-a-stars-death/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 14 Mar 2011 00:00:42 +0000</pubDate>
				<category><![CDATA[Blogs]]></category>
		<category><![CDATA[FrontPage]]></category>
		<category><![CDATA[inside]]></category>
		<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[Sections]]></category>
		<category><![CDATA[black whole]]></category>
		<category><![CDATA[Carl Sagan]]></category>
		<category><![CDATA[Cassiopeia A]]></category>
		<category><![CDATA[Chandra telescope]]></category>
		<category><![CDATA[cooper pairs]]></category>
		<category><![CDATA[Craig Heinke]]></category>
		<category><![CDATA[Dany Page]]></category>
		<category><![CDATA[Monthly Notices of the Royal Astronomical Society]]></category>
		<category><![CDATA[National Autonomous University of Mexico]]></category>
		<category><![CDATA[neutrinos]]></category>
		<category><![CDATA[neutron star]]></category>
		<category><![CDATA[planetary nebulae]]></category>
		<category><![CDATA[red giant]]></category>
		<category><![CDATA[superconductor]]></category>
		<category><![CDATA[superfluid]]></category>
		<category><![CDATA[supernova]]></category>
		<category><![CDATA[University of Alberta]]></category>
		<category><![CDATA[University of Southampton]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=7290</guid>

					<description><![CDATA[<p>The extraordinary world of neutron stars</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/03/life-after-a-stars-death/">Life after a star&#8217;s death</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>If you have trillions of tonnes of fuel burning at millions of degrees for billions of years and you suddenly run out, what happens?</p>
<p>Stars in the twilight of their lives are doomed to different fates depending on exactly how much stuff is in them. Sun-like stars die by puffing up into red giants (with a radius bigger than Earth’s orbit around the sun) and eventually ejecting much of their stardust to form so-called “planetary nebulae” that can span dozens of light years.</p>
<p>If the star in question is bigger and hotter than the sun, it performs the much more dramatic exit from the cosmic stage known as a supernova. When the last of the available fuel runs out, the fusion reaction providing the counter-force to the ridiculously immense gravitational forces at the core of these stars is cut off. Without any counterbalance to gravity, the entire star falls in on itself, compressing the core and bouncing back out – tearing the entire star apart in a colossal shock wave explosion that can be seen clear across the universe.</p>
<p>The energy provided by supernovae is sufficient for the creation of heavy elements such as the ones that make up rocky planets and the little ape-things that live on them. In the words of Carl Sagan: “We are all stardust.” Not all of the stardust is ejected, though: the compressed cores of the dead stars – which can become either black holes or neutron stars – are among the strangest and most inexplicable objects in the entire universe.</p>
<p>Black holes form when there is simply too much mass in that compressed core – at least ten times the mass of the sun. The remnant of the core collapses into a single point of infinite density, warping space-time so severely that not even light can escape from the area. If the core is below that critical mass, the object that forms may be a neutron star. A neutron star contains about 1.5 times the mass of the sun (about 500,000 Earth masses) compressed into a ball with a radius of just 12 kilometres. That is about the same density as the entire human population squished into the volume of a sugar cube. The gravitational pull is so intense that the electromagnetic repulsion between protons and electrons is no longer sufficient to keep them apart; they are forced together, creating neutrons.</p>
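<p>That sugar-cube comparison survives a rough check. The short calculation below (an order-of-magnitude sketch, assuming seven billion people at about 70 kilograms each and a one-cubic-centimetre cube) lands both densities at a few times 10^17 kilograms per cubic metre:</p>
<pre><code>import math

M_SUN = 1.989e30               # kg
m_ns  = 1.5 * M_SUN            # neutron star mass, kg
r_ns  = 12e3                   # neutron star radius, metres

volume  = (4 / 3) * math.pi * r_ns ** 3
density = m_ns / volume
print(f"neutron star density: {density:.1e} kg/m^3")   # about 4e17

# Seven billion people at roughly 70 kg each, in one cubic centimetre:
human_density = 7e9 * 70 / 1e-6
print(f"humanity in a sugar cube: {human_density:.1e} kg/m^3")   # about 5e17
</code></pre>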
<p>Neutron stars conserve the angular momentum of their progenitor stars, in the same way that a spinning figure skater speeds up by bringing their arms in to their body. Given the extreme difference in radii, neutron stars have been spotted rotating more than 700 times a second! In these conditions of extreme gravity, heat, and spin, astronomers have been stymied in their attempts to adequately model the stars’ interior workings.</p>
<p>In 2009, theoretical astrophysicist Dany Page at the National Autonomous University of Mexico championed a new, if bizarre, model to explain the inner workings of neutron stars. He predicted that conditions in the interior would cause the neutrons present to pair up in the lowest possible quantum energy state, forming so-called Cooper pairs. This process would emit neutrinos (particles that are virtually without mass and can pass freely through just about anything), carrying energy away from the star and thereby lowering its temperature. Matter bound up in Cooper pairs behaves essentially as a macroscopic quantum state known as a “superfluid,” which flows without friction; a charged superfluid acts as a superconductor, carrying electricity without energy loss. If a superfluid were put into a glass back here on Earth, it would climb up over the walls of the glass and escape.</p>
<p>Page’s model was awaiting evidence to support its strange physical predictions. Luckily for him, astrophysicist Craig Heinke at the University of Alberta has found compelling new evidence in data collected by the Chandra X-ray telescope. His team observed a supernova remnant named Cassiopeia A (Cas A), located a mere 11,000 light years away, over a period of years. The first light from the supernova reached Earth 330 years ago, which is exceedingly young in terms of stellar evolution. According to Heinke, we’ve got “ringside seats to studying the life cycle of a neutron star from its collapse to its present, cooling off state.”</p>
<p>Heinke’s team found that Cas A has cooled by 800,000 degrees in just ten years. Both his group and independent researchers at the University of Southampton in the U.K. concluded that this cooling rate is impossible to account for without the mechanism first proposed by Page. The two teams jointly announced their findings in papers that will appear in <em>Monthly Notices of the Royal Astronomical Society</em>.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/03/life-after-a-stars-death/">Life after a star&#8217;s death</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Get smart</title>
		<link>https://www.mcgilldaily.com/2011/03/get-smart/</link>
		
		<dc:creator><![CDATA[Andrew Komar]]></dc:creator>
		<pubDate>Mon, 07 Mar 2011 00:00:16 +0000</pubDate>
				<category><![CDATA[Blogs]]></category>
		<category><![CDATA[FrontPage]]></category>
		<category><![CDATA[inside]]></category>
		<category><![CDATA[Prose Encounters Of The Nerd Kind]]></category>
		<category><![CDATA[Sci + Tech]]></category>
		<category><![CDATA[Sections]]></category>
		<guid isPermaLink="false">http://www.mcgilldaily.com/?p=7026</guid>

					<description><![CDATA[<p>Delivering electricity using a smart grid could save energy and the Earth</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/03/get-smart/">Get smart</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></description>
										<content:encoded><![CDATA[<p>For life in the 21st century, it is impossible to put a dollar value on the necessity of reliable electrical power. Electricity underlies every aspect of our modern lives, and yet the grid used to deliver this crucial commodity has not substantially changed since the 1890s, after its invention by Nikola “Electric Jesus” Tesla.</p>
<p>The current electricity delivery model is a source-sink one: a few giant power plants passively monitor the demand for electricity caused by net usage, and turn generators on or off accordingly. From a consumer perspective, your personal “sink” is monitored with a meter that is checked a couple of times a month, and you are billed at a flat rate.</p>
<p>But the real cost of producing electricity is far from flat. Over the course of the day, as people go about their business, net electricity use rises relative to times when everybody is asleep, producing periods of “peak power” consumption, such as when consumers collectively return home from work and begin making dinner. This peak is particularly pronounced during hot summers, when millions of people turn on their energy-intensive air conditioners.</p>
<p>Base load power plants (like coal and nuclear) are only economical when run around the clock, so additional demand is met using peaker plants, gas-fired variants that are run only when demand spikes. With peaker plants, the actual cost of electricity fluctuates throughout the day, but the source-sink paradigm does not pass these additional costs or savings on to consumers.</p>
<p>The vulnerability of the system was perhaps most apparent during the Northeast blackout of 2003. Hot August days were causing millions of people to ramp up the air conditioning, bringing more peaker plants online. As more power flowed through the already taxed transmission lines, lines in Ohio sagged low enough to be shorted out by tree branches. To pick up the slack, electricity was rerouted through other lines, shorting them out and causing a cascading failure that ultimately plunged 55 million people across the northeastern U.S. and Ontario into darkness, and cost the affected region an estimated $6 billion in losses.</p>
<p>Enter the smart grid: an emerging set of technologies that can make electricity consumption more efficient and economically viable. The smart grid would overturn the source-sink model by making every electricity user into a potential producer through a two-way metering system. Monitors deployed throughout the grid would allow the supply and demand of electricity to be measured in real time, resulting in market-driven pricing of electricity.</p>
<p>With real-time pricing, consumers would be financially rewarded for using electricity in off-peak hours, lowering their electrical bills. A smart grid would also be able to re-route power intelligently in the case of emergencies like ice storms, falling trees, or even terrorist attacks, greatly decreasing the likelihood of a failure like the 2003 blackout.</p>
<p>The smart grid would also be a boon for electric cars. Real-time pricing would allow car owners to recharge their batteries during off-peak hours at low cost. During high demand, car owners could sell excess battery power back to the grid at a profit, simultaneously reducing the need for peaker plants – and consumption in general – through a distributed supply. The batteries in just one million electric cars have a potential electrical capacity of about a gigawatt hour, which amounts to approximately 15 per cent of the average daily energy use by Canadians – potentially enough to take the “peak” off of peak power.</p>
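<p>On the consumer side, the logic can be as simple as a pair of price thresholds. The sketch below is a hypothetical illustration (the prices, thresholds, and battery rule are invented for the example, not drawn from any real smart grid standard):</p>
<pre><code># Toy smart-grid participant: an EV charger that buys power when the
# real-time price is low and sells it back when the price spikes.
BUY_BELOW  = 0.05   # dollars per kWh: charge when power is at least this cheap
SELL_ABOVE = 0.20   # dollars per kWh: sell back when power is at least this dear

def decide(price_per_kwh, battery_fraction):
    """Return the charger's action for the current real-time price."""
    if BUY_BELOW >= price_per_kwh and 1.0 > battery_fraction:
        return "charge"   # off-peak: fill the battery cheaply
    if price_per_kwh >= SELL_ABOVE and battery_fraction >= 0.5:
        return "sell"     # peak: sell surplus back, shaving the peak
    return "idle"

for price in (0.03, 0.12, 0.25):
    print(price, decide(price, battery_fraction=0.8))
# prints: 0.03 charge / 0.12 idle / 0.25 sell
</code></pre>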
<p>Small-scale energy, such as that generated by personal solar panels, wind turbines, or not-yet-commercialized micronukes, would also fare well under a smart grid system. If combined with sufficient battery capacity, these small producers would be able to sell their electricity at market prices for a profit at any scale, lowering investment barriers for individuals and small businesses alike. Greater small-scale power distribution would minimize the need to transmit power over the vast distances it travels today, such as the 11,000 kilometres of lines that provide power to Quebec alone. Losses due to line heating over these long distances amount to about 6 per cent of all electricity produced. Long transmission lines are also particularly vulnerable to space weather phenomena, such as the huge solar flare that caused the Quebec blackout of 1989.</p>
<p>A smart grid would be cheaper, more robust, more reliable, and greener than the one we have today. The Telegestore smart grid project in Italy, which consists of 27 million connected meters, provides net savings of around 500 million euros (about $680 million) every year. Based on an estimate published by Pacific Northwest National Laboratory, the present value of a smart grid to Canadians can be conservatively estimated at $7.5 billion – not including the cultural, environmental, or societal benefits of radically rethinking how we consume power. Electrical demand will only increase in a growing, developing, and heating world, and the very least we can do is supply it in a way that isn’t prone to catastrophic failure. It’s what Electric Jesus would have wanted.</p>
<p>The post <a href="https://www.mcgilldaily.com/2011/03/get-smart/">Get smart</a> appeared first on <a href="https://www.mcgilldaily.com">The McGill Daily</a>.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
