In an interview today on RTÉ Raidió na Gaeltachta, musician Mairéad Ní Mhaonaigh said that the community stood together against the oyster farms planned for the northwest, at An Céideadh and Cruit, and that they must keep up the pressure against the plans.

In total, 3,200 objections to the oyster farms at An Céideadh and An Cruit were received.

Speaking to Aodh Máirtín Ó Fearraigh on RTÉ Raidió na Gaeltachta today, Mairéad Ní Mhaonaigh said: “It took us by surprise when we heard about it (the planning application), because it was almost through already … and it was people from outside the area who would gain. There are a lot of people around the Rosses and Gaoth Dobhair who use that area, because it’s a beautiful place to go walking or to bring your children, a safe place to swim…”

“When you think that we’re trying to entice tourists here with the Wild Atlantic Way, that would ruin the landscape.”

The 3,200 objections were received for the two planning applications in question, and Mairéad said that was a great source of pride for her.

“I’m really proud; it’s great that the community came together and protested strongly against the plans.” Speaking of her own objection, she said that she was totally against the farms: they would ruin the landscape, make it more difficult to entice people to the area, and the local community would not profit from them.

“The community is united and that’s a great thing. I don’t think it’s going to benefit the areas of Céideadh or Cruit to have a farm like that.
The people’s voice is the most important thing here, and I hope we’ll be listened to.”

Mairéad Ní Mhaonaigh lives in An Charraig Fhinn in the Rosses area. She was interviewed today on RTÉ Raidió na Gaeltachta.

Irish musician voices her objections against proposed Oyster Farm at Cruit Island was last modified: October 19th, 2016 by Mark Forker
How do our bodies make sense of the external world? Through our senses, of course; at least they are the entry points of data into the mind. Behind those senses are remarkable mechanisms that we use but do not actively operate. The design in their automatic operations is slowly being revealed with better observing techniques.

Sensing sound with motors: “From grinding heavy metal to soothing ocean waves, the sounds we hear are all perceptible thanks to the vibrations felt by tiny molecular motors in the hair cells of the inner ear,” began an article on PhysOrg. A single mutation – one amino acid change – in a molecular motor protein called myo1c is enough to disrupt the function of the myosin motor in the hair cell and cause hearing loss. The mutation reduces sensitivity, perhaps by making the motor spend less time attached to actin filaments. The amino acid is “highly conserved” (unevolved) throughout the superfamily of myosin motors, the article said.

Sensing light with circuits: A novel microscope technique has allowed scientists at the Max Planck Institute to decode the eye’s complex circuitry, Science Daily reported (03/31/2008). “The properties of optical stimuli need to be conveyed from the eye to the brain,” the article explained. One example of pre-processing accomplished by ganglion cells is responding to light moving in a particular direction. “This direction selectivity is generated by inhibitory interneurons that influence the activity of the ganglion cells through their synapses.” Just as with man-made network protocols, the scientists “discovered that the distribution of the synapses between ganglion cells and interneurons follows highly specific rules.” These ganglion cells intercept and process the visual information before it is received by the brain.
The article described various rules the network of cells follows in activating or inhibiting visual information.

Sensing time with clocks: All living things follow “circadian rhythms,” biological responses to changes in time of day, month, and year. As in other mammals, the human master clock is located in the brain – specifically, in the suprachiasmatic nucleus (SCN), a group of nerve cells in the hypothalamus near the visual cortex. In response to its data inputs, the SCN can direct the brain to produce more or less melatonin, a hormone that induces sleep. Live Science described how the SCN works. There are internal inputs, like genes and proteins produced in the body, and external inputs from the senses. “Biological clocks aren’t made of cogs and wheels, but rather groups of interacting molecules in cells throughout the body,” the article said. One of the proteins is aptly named CLOCK – “an essential component in directing circadian rhythms in humans, fruit flies, mice, fungi and other organisms.” Another is SIRT1, which senses energy use in cells. The balance of these factors affects how the SCN directs the body to respond to light, darkness, and other factors.

Disruption of the biological clock can lead to a host of problems. Jet lag is a common example. Fortunately, clock repair is available for that: “Eventually your body is able to adjust its circadian rhythms to the new environment” by a kind of clock reset. Other dysfunctions, though, can lead to more serious problems, like “obesity, depression and seasonal affective disorder.” That’s because “hormone production, hunger, cell regeneration and body temperature” are some of the processes that rely on accurate circadian rhythms.

All sensory inputs must be processed by the brain. Fortunately, the brain, like good computer systems, has redundancy mechanisms that give it “plasticity” – the ability to change as we learn, or as parts become damaged.
Science Daily described how researchers at the University of Michigan Medical School are testing mice to see how “the plasticity of the brain allowed mice to restore critical functions related to learning and memory after the scientists suppressed the animals’ ability to make certain new brain cells.” Fault-tolerant artificial networks, like the power grid and the internet, provide alternate routes when hubs become unavailable. Similarly, we have “mechanisms by which the brain compensates for disruptions and reroutes neural functioning,” the article said. Part of this is recovering from the loss of the ability to make new brain cells by giving existing cells more activity and longer life spans. “It’s amazing how the brain is capable of reorganizing itself in this manner,” said Geoffrey Murphy, an associate professor of molecular and integrative physiology at the medical school. “Right now, we’re still figuring out exactly how the brain accomplishes all this at the molecular level, but it’s sort of comforting to know that our brains are keeping track of all of this for us.”

It makes sense that readers will sense the wonder of the senses a little more after reading these sensible articles, free as they were of evolutionary nonsense.
Been living under a rock these days? There’s this hip new tablet device from Apple called the iPad. Most are in agreement that the new toy is pretty slick, but they also agree on where the iPad fails – there’s no camera. iPod Touch fans were disappointed last year when Apple announced that the iPod Nano would be getting the much-coveted camera, and now fans of a different sort are feeling the same dejection.

Augmented reality is a technology that allows 2D and 3D objects to be placed onto a live video feed, creating unique user experiences. AR applications entered the mainstream with a few advertisements and installations for automobiles in 2008. Since then the technology has found its way onto our home computers with things like the GE Smart Grid campaign, and onto our cell phones with mobile AR browsers like Layar and Wikitude. Have you seen those videos of people holding their iPhones up in London or New York to find the nearest subway station? That’s augmented reality, and developers and followers of the technology (myself included) were hoping a camera on the iPad would open the door to a larger and more immersive AR experience. No such luck.

In the grand scheme of things, augmented reality represents a drop in the ocean of iPhone app development, and Apple would need more than some petitions and disappointed developers to add a camera or change its API. However, hope may be on the horizon, as MacRumors.com reported this morning on the discovery of an option to take photos in the iPad simulator.

While disappointed, AR fans are still optimistic about the iPad’s future. Claire Boonstra and Maarten Lens-FitzGerald, co-founders of Layar, one of the most popular mobile AR applications to date, expressed their sentiments on Twitter Wednesday when they heard the news about the iPad.
Boonstra noted that we may have to wait for version 2.0 to see Layar on the iPad, while Lens-FitzGerald added that they have plenty of mobile phones to work on for the time being.

Thomas Carpenter at Games Alfresco, the leading augmented reality news blog, may have said it best when he noted Wednesday that Steve Jobs didn’t make the iPad for AR fans – he made it to give Amazon’s Jeff Bezos nightmares. Either way, for those of us eager to have our realities augmented, perhaps we will be pleasantly surprised next year when AR developers like Boonstra and Lens-FitzGerald are the ones on stage with Jobs showing off the next iteration of the iPad. The best thing AR fans can do for now is create and promote amazing AR applications that will captivate the masses and launch AR further into the public eye. We can only hope that Steve Jobs is watching.

Be sure to keep your eye out in the next few weeks, as ReadWriteWeb will be presenting our next premium report, focusing this time on the use of augmented reality in marketing. Photo by Flickr user vlauria.

See also: ReadWriteWeb’s complete coverage and analysis of the iPad on our iPad topic page.

Augmented reality has already gained traction on the iPhone and Android platforms, with dozens of AR apps available for download today. Mark Billinghurst, one of AR’s “founding fathers” and a leading AR researcher since 1994, reached out to me yesterday to express his feelings about the iPad – a device with which he says Apple has missed an opportunity.

“The form factor, CPU and graphics capability make it an ideal platform for a handheld AR experience,” said Billinghurst.
“A camera on the back of the iPad would have enabled the development of vision-based AR applications and created a whole new class of AR applications on the App Store.”

Billinghurst also points out that his company ARToolworks has already provided over 100 iPhone app developers with its ARToolKit SDK, a clear sign of the growing interest in mobile AR. However, one hurdle in the way of these developers is Apple’s reluctance to open the video API on the iPhone to real-time image processing – an impediment which AR proponents have gone as far as to petition Apple to overturn.

Right now, applications can grab only a few frames every couple of seconds to process, but the kind of accuracy needed for AR applications requires real-time, frame-by-frame processing of the video feed. This would allow applications to track objects and motion seen through the camera’s lens, greatly increasing the chances of accurate placement of 2D and 3D objects as well as the interpretation of real-world items.
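To see why frame rate matters so much for AR, consider a toy simulation. This is a minimal sketch in plain Python – not Apple’s video API, and every function name here is hypothetical – comparing an overlay that is re-anchored every frame with one that can only re-detect its target every couple of seconds, as on the iPhone of the day:

```python
# Illustrative sketch: why frame-by-frame processing matters for AR
# tracking. We simulate a 30 fps feed in which a marker drifts across
# the screen, and compare an app that processes every frame against one
# that can only sample a frame roughly every two seconds.

def marker_position(frame_index):
    """True horizontal position (px) of a marker drifting 4 px per frame."""
    return 4 * frame_index

def worst_overlay_error(total_frames, sample_every):
    """Draw the overlay at the marker's last *observed* position and
    return the worst-case gap (px) between overlay and true position."""
    last_seen = marker_position(0)
    worst = 0
    for f in range(total_frames):
        if f % sample_every == 0:           # only these frames are processed
            last_seen = marker_position(f)  # re-detect the marker here
        worst = max(worst, abs(marker_position(f) - last_seen))
    return worst

# Per-frame processing: the overlay never lags behind the marker.
print(worst_overlay_error(total_frames=300, sample_every=1))   # 0 px

# One processed frame every 2 s at 30 fps: the overlay lags by up to
# 59 frames of drift before the next re-detection.
print(worst_overlay_error(total_frames=300, sample_every=60))  # 236 px
```

The numbers are invented, but the shape of the problem is real: any motion between processed frames turns directly into misplacement of the overlaid 2D or 3D objects, which is why AR developers were petitioning for true per-frame access.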